The Anti-Surveillance State: Clothes and Gadgets Block Face Recognition Technology, Confuse Drones and Make You (Digitally) Invisible

An entire industry is dedicated to getting your privacy back.

By Janet Burns / AlterNet

April 21, 2015

Last spring, designer Adam Harvey hosted a session on hair and makeup techniques for attendees of the 2015 FutureEverything Festival in Manchester, England. Rather than sharing innovative ways to bring out the audience’s eyes, Harvey’s CV Dazzle Anon session introduced a series of styling methods designed with almost the exact opposite aim of traditional beauty tricks: to turn your face into an anti-face, one that cameras, particularly those of the surveillance variety, will not only fail to love but fail to recognize.

Harvey is one of a growing number of privacy-focused designers and developers “exploring new opportunities that are the result of [heightened] surveillance,” and working to establish lines of defense against it. He’s spent the past several years experimenting with strategies for putting control over people's privacy back in their own hands, in their pockets and on their faces.

Harvey’s goal of “creating a style that [is] functional and aesthetic” has driven several projects and collaborations, including a method for “spoofing” DNA and, via the Privacy Gift Shop, his drone-thwarting Stealth Wear line (clothing he claims "shields against thermal imaging...[which is] used widely by military drones to target people") and the OFF Pocket phone sleeve, which keeps out unwanted wireless signals.

His CV Dazzle designs for hair and makeup obscure the eyes, the bridge of the nose and the shape of the head, and create skin-tone contrasts and asymmetries. Facial-recognition algorithms work by locating the expected layout of facial features and filling in missing information based on assumed facial symmetry. The project demonstrates that a styled “anti-face” can both conceal a person’s identity from facial-recognition software (be it the FBI’s or Facebook’s) and cause the software to doubt the presence of a human face, period.
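
To make the detection side of this concrete, here is a minimal, hypothetical sketch using OpenCV’s stock Haar-cascade frontal-face detector (a classical detector, not the FBI’s or Facebook’s systems). It looks for the characteristic light-and-dark arrangement of eyes, nose bridge and cheeks, so styling that breaks up that layout can leave it finding no face at all; the image filenames below are placeholders.

```python
# Minimal sketch: run OpenCV's bundled frontal-face detector on two photos to
# see whether styling that obscures the eyes and nose bridge keeps a face from
# being detected at all. The image paths are placeholders, not real files.
import cv2

# Load the stock Haar cascade, which matches faces by the characteristic
# light/dark layout of eyes, nose bridge, and cheeks.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def count_faces(path):
    # Read the image in grayscale and return how many face-like regions the
    # detector reports; zero means the detector doubts a face is present.
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces)

print("plain photo:      ", count_faces("plain_face.jpg"))
print("CV Dazzle styling:", count_faces("dazzle_face.jpg"))
```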

Harvey’s work is focused on accessibility in addition to privacy. “Most of the projects I've worked on are analog solutions to digital challenges,” he said. His hair and makeup style tips – a veritable how-to guide for creating “privacy reclaiming” looks at home – are “deliberately low-cost.” His current project – software to “automatically generate camouflage…that can be applied to faces” – will allow a user to “create [their] own look and guide the design towards [their] personal style preferences.”

Other low-tech protections against widespread surveillance have been gaining ground, too. Though initially designed as a tongue-in-cheek solution to prying eyes and cameras, Becky Stern’s Laptop Compubody Sock offers a portable, peek-free zone to laptop users, while the CHBL Jammer Coat and sold-out Phonekerchief use metal-infused fabrics to make personal gadgets unreachable, blocking texts, calls and radio waves. For people willing to sport a bit more hardware in the name of privacy, the Sentient City Survival Kit offers underwear that notifies wearers about real-life phishing and tracking attempts, and its LED umbrella lets users “flirt with object tracking algorithms used in advanced surveillance systems” and even “train these systems to recognize nonhuman shapes.”

Large companies are also getting in on the pushback against increasing surveillance. Earlier this year, antivirus software leader AVG revealed a pair of invisibility glasses developed by its Innovation Labs division. The casual-looking specs use embedded infrared lights “to create noise around the nose and eyes” and a retro-reflective coating on the frames to interfere with camera flashes, “allowing [the wearer] to avoid facial recognition.” In early 2013, Japan’s National Institute of Informatics revealed a bulky pair of goggles it had developed for the same purpose.

A spokesperson for Innovation Labs claims its glasses represent “an important step in the prevention against mass surveillance…whether through the cell phone camera of a passerby, a CCTV camera in a bar, or a drone flying over your head in the street.” Innovation Labs says that, with a person’s picture, facial recognition software “coupled with data from social networking sites can provide instant access to the private information of complete strangers. This can pose a serious threat to our privacy." Though AVG’s glasses are not scheduled for commercial release, Innovation Labs said that individuals can take a number of steps to prevent their images from being "harvested":

“First and foremost, make sure you’re not allowing private corporations to create biometrics profiles about you. When using social networks like Facebook, be aware that they are using facial recognition to give you tag suggestions. Facebook’s DeepFace was already tested and trained on the largest facial dataset to-date (an identity labeled dataset of more than 4 million facial images belonging to thousands of identities).”

Holmes Wilson of nonprofit Fight for the Future, which works to defend online privacy and freedoms on various fronts, is more concerned with other types of privacy invasion than real-life image harvesting. “It’s pretty unlikely in most of the world that you’ll get followed around using a network of street cameras with face recognition,” he said. “It’s probably pretty likely, though, that you’ll get filmed by police at a protest. But [there’s] not much you can do about that other than wearing a mask.” 

Wilson advises people concerned about privacy breaches through surveillance to first focus on the ways in which their gadgets are supplying info to third parties. “The place where it’s easiest to fight back against surveillance is in protecting the security of your messages,” he said, adding that message security “can be a problem for activists, too.” He said apps like TextSecure, Signal, and RedPhone can make it “a lot harder for people to spy on you.” Wilson added:

“Phones are the biggest thing. Lots of people think of smartphones as the big privacy problem, but old-fashioned phones are just as bad, and worse in some ways. All cellphones report on your location to the network as you move around. That’s just how they work, and they need to send that information or the system won’t know where to send your call. There’s no way to turn that off, other than by turning off the phone and, for good measure, taking the battery out.”

In collaboration with the Electronic Frontier Foundation, Fight for the Future recommends a variety of options for encrypting messages, password-protecting accounts and securing a user’s various communication and browsing activities via Reset the Net. Wilson encouraged those with specific privacy concerns to check out the tutorials, resources and breakdowns of privacy issues at Surveillance Self-Defense.

Last year, Facebook announced that its DeepFace facial recognition technology can identify a person from photos with 97.25 percent accuracy, only a hair below the 97.5 percent success rate of humans taking the same test. Meanwhile, lawmakers in Congress are preparing to extend the surveillance powers granted by Section 215 of the Patriot Act (the NSA’s legal foothold of choice for mass collection of US phone records since 2006, and set to expire on June 1) with the light-on-reform USA Freedom Act.

It seems likely that a growing number of both tech-wary and tech-savvy people will continue weighing how best to ensure their personal privacy, whether by putting stark makeup on or by turning their phones off.

Janet Burns is a writer in Brooklyn, NY. Her website is warmlyjanetburns.com.
