Clothing has long served as a form of privacy protection. We see it in celebrities all the time: from head-to-toe coverage to recent developments in paparazzi-resistant gear, we're constantly pushing our clothing to do more for us. And with so much fear surrounding face-tracking software, many have begun wondering how to protect themselves from this breach of privacy. In a recent paper, Kaidi Xu, Mengshu Sun, Yanzhi Wang, and Xue Lin (all from Northeastern University); Gaoyuan Zhang, Sijia Liu, Quanfu Fan, and Pin-Yu Chen (from MIT-IBM Watson AI Lab, IBM Research); and Hongge Chen (from Massachusetts Institute of Technology) introduce an adversarial T-shirt — a wearable method of evading AI-based human-detection cameras.
In the past, adversarial designs have been created that evade person detectors, but they have been limited to unwearable objects. Eyeglasses, stickers, and patterns fixed to rigid cardboard were important advancements that managed to fool detectors, but they all limited the real-world usability of the design.
[Related Article: Trust, Control, and Personalization Through Human-Centric AI]
In reality, if people are to use such a design, it must move with them and be easy to wear. And so, the team came up with something that did just that: a T-shirt that avoids detection even as the wearer moves around and wrinkles the pattern.
Creating a design that still works on moving fabric wasn’t easy, but the team found a way around this challenge: they “employ TPS [thin-plate spline] mapping to model the cloth deformation caused by human body movement. TPS has been widely used as the non-rigid transformation model in image alignment and shape matching. It consists of an affine component and a non-affine warping component.”
“Figure 2: (a): examples of our T-shirt with printed checkerboard to construct control points for TPS transformation. (b) and (c): two frames with checkerboard detection results. (d): result of applying TPS transformation from (b) to (c).”
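To make the TPS idea concrete, here is a minimal, self-contained NumPy sketch of a thin-plate spline warp. This is not the authors' implementation — the function names and the synthetic control points are illustrative — but it shows the two components the quote mentions: an affine part and a non-affine warping part, fitted so that a set of source control points (such as the checkerboard corners in Figure 2) maps exactly onto their deformed positions.

```python
import numpy as np

def tps_fit(src, dst):
    """Fit a 2-D thin-plate spline sending src control points to dst.

    Returns (w, a): per-point non-affine weights w (n x 2) and
    affine coefficients a (3 x 2, rows = constant, x, y terms).
    """
    n = len(src)
    # Radial kernel U(r) = r^2 log(r^2), with U(0) defined as 0.
    d2 = np.sum((src[:, None, :] - src[None, :, :]) ** 2, axis=-1)
    K = np.where(d2 > 0, d2 * np.log(d2 + 1e-12), 0.0)
    P = np.hstack([np.ones((n, 1)), src])          # affine basis [1, x, y]
    # Standard TPS linear system: [[K, P], [P^T, 0]] [w; a] = [dst; 0]
    L = np.zeros((n + 3, n + 3))
    L[:n, :n] = K
    L[:n, n:] = P
    L[n:, :n] = P.T
    rhs = np.vstack([dst, np.zeros((3, 2))])
    sol = np.linalg.solve(L, rhs)
    return sol[:n], sol[n:]

def tps_apply(pts, src, w, a):
    """Warp arbitrary points: affine term plus kernel-weighted term."""
    d2 = np.sum((pts[:, None, :] - src[None, :, :]) ** 2, axis=-1)
    U = np.where(d2 > 0, d2 * np.log(d2 + 1e-12), 0.0)
    return a[0] + pts @ a[1:] + U @ w

# Hypothetical control points: a flat grid and its "wrinkled" positions.
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
dst = src + np.array([[0.1, 0.0], [0.0, 0.1], [-0.05, 0.05],
                      [0.05, -0.05], [0.0, 0.0]])
w, a = tps_fit(src, dst)
warped = tps_apply(src, src, w, a)   # interpolates control points exactly
```

In the paper's pipeline, the checkerboard printed on the shirt supplies the `src`/`dst` control-point pairs between video frames, and the fitted warp is then applied to the adversarial pattern so it is optimized under realistic cloth deformation.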
The team found that their design achieved 79% and 63% attack success rates in the digital and physical worlds, respectively, against YOLOv2. In contrast, the previous state-of-the-art physical attack method for tricking an AI-based person detector achieves only a 27% attack success rate. That is a huge margin of success, particularly for the first design of its type. We can only look forward to seeing how it is refined and improved upon further.
[Related Article: Create Your First Face Detector in Minutes Using Deep Learning]
The team concludes by discussing how the techniques used in creating the adversarial T-shirt could be applied to other consumer wear, such as other clothing, accessories, or even face paint. There’s no doubt that designers and consumers will be interested in this new technological advancement, especially as fears straight out of Black Mirror become instilled ever deeper in our minds.