Has facial recognition paranoia gotten stupid?
Google Street View is not the Big Bad when it comes to public video surveillance.
French artist Marion Balac has created an art project from Google Street View. The project, called Anonymous Gods, is a collection of photos of famous monuments whose faces have been blurred by Google’s automatic face-recognition technology. The result is an eerie display of “anonymized” Gods, heroes, and other figures, including the Egyptian Sphinx statue in Las Vegas. The pictures have tickled the fancy of bloggers who have remarked that they are “funny and creepy in equal parts.”
Balac presents the collection of images without any political commentary, but it is worth taking a moment to think about this weird side-effect of the intersection of technology and social fear—and what it means for privacy.
The failure of privacy protection
The irony of Anonymous Gods is that even with a blurred face, you know that the statue of the Sphinx is the statue of the Sphinx. You don’t suddenly fail to recognize a giant gold-covered Buddha just because his face is blurry. For these iconic objects, blurring the face literally has no effect on anonymity.
This invites an important question: Why do people think that blurring your face in Google Street View will protect your privacy? What, exactly, is the face-blurring feature supposed to protect you from? To answer this, let’s start with the negative: What are some things that face-blurring on Google Street View does not protect you from?
It does not protect you from having your movements recorded, almost continually, by public CCTV and security cameras any time you step outside in a major city. When you are in public, you are almost always on camera, and none of those cameras is owned by Google. They are private and public security cameras built by high-tech companies such as Cisco and Avigilon and operated by businesses and governments. These cameras can not only record high-definition video of you; often they can analyze the digital feed of your activities in real time.
None of these companies blurs faces, presumably because nobody has asked them to. Everyone is so busy being freaked out by Google Street View (which neither records your movements nor tracks you in real time) that they forget about the companies holding far more data about your day-to-day movements.
But here is the real kicker: You are just like the Anonymous Gods. Even if they did blur out your face, it still wouldn’t stop a really determined investigator from identifying you.
For years, biometric technology has been focused on alternatives to face recognition such as gait analysis: an analysis of the way you walk. Gait analysis is superior to facial recognition in many ways, including the fact that it can operate at a great distance and with blurry images. This is one of the reasons it has been incorporated into the repertoire of biometric measures used by the FBI.
Facial recognition gets a lot of buzz in popular media because of its commercial visibility. Facebook, Google, and others have been using it in very public ways, triggering much-needed conversations about what is appropriate and inappropriate in the commercial use of face recognition technology.
But I have bad news for you: Google Street View is not the Big Bad when it comes to public video surveillance, and if you are truly concerned about a powerful shadowy agency tracking and identifying your movements, then blurring out your face isn’t doing a thing to help you.
Why do people feel like it helps? To answer this, we need to look back at how face-blurring in Google Street View came about in the first place.
Why the Sphinx has no face
Google Street View launched as a public service on May 25, 2007, and less than a week later, the freak-out began. On May 30, Boing Boing reported on a letter from Mary Kalin-Casey, who said she was literally shaking because Google Street View had captured an image of her cat inside her home, sitting in front of a window.
It is worth pointing out that anybody walking down the street could see her cat sitting in the window. Anybody could legally take a photo of the exterior of her house from the street, and it would doubtless include the petite feline face staring out from the windowpane. External views of a house, cat-adorned windows included, are not private per se.
But in the United States, taking photographs through windows is the beginning of a legal gray area. Some laws and court rulings have suggested that as long as you’re using a normal (i.e. non-telephoto, non-elevated) lens, accidentally capturing the interior of a house through a window is not a breach of privacy. However, we do have “peeping tom” and trespass laws, and the boundaries for photography are not always clear.
Over time, more controversial photos appeared: people exiting sex shops, people being arrested. Once again, privacy laws in the United States are often unclear. Most privacy laws are rooted in the notion of the “expectation of privacy.” A clear-cut example is public bathrooms: They are technically a public place, but people have an expectation of privacy in the bathroom. As a result, taking photos of somebody in the bathroom is illegal.
Do you have an expectation of privacy when you enter a sex shop from a public street? What about when you are leaving an AA meeting? In a legal gray area like this, Google sensed the potential for serious trouble. So in 2008, Google Street View began to use automatic face-detection software to blur faces in its Street View photos. This made people feel better, at least for a little while; it also led to the Anonymous Gods.
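The pipeline behind this kind of anonymization is conceptually simple: a detector proposes a bounding box where a face probably is, and then the pixels inside that box are smoothed so the face is unrecognizable. Below is a minimal sketch of the blurring step in plain Python. The detection step is assumed here (the bounding box is supplied by hand), and the `blur_region` helper and the tiny grayscale "photo" are illustrative inventions, not anything from Google's actual system.

```python
# Toy illustration of the "detect, then blur" idea behind automatic face
# anonymization. A real system would get the bounding box from a trained
# face detector; here we hand it a box and box-blur the pixels inside it.

def blur_region(image, box, radius=1):
    """Box-blur the pixels inside box = (top, left, bottom, right), inclusive.

    `image` is a list of rows of grayscale values; a new image is returned
    and the original is left untouched.
    """
    top, left, bottom, right = box
    out = [row[:] for row in image]          # copy so the input is preserved
    h, w = len(image), len(image[0])
    for y in range(top, bottom + 1):
        for x in range(left, right + 1):
            # Replace each pixel with the average of its neighborhood,
            # clipped to the image bounds.
            vals = [
                image[yy][xx]
                for yy in range(max(0, y - radius), min(h, y + radius + 1))
                for xx in range(max(0, x - radius), min(w, x + radius + 1))
            ]
            out[y][x] = sum(vals) // len(vals)
    return out

# A 4x4 grayscale "photo" with a bright 2x2 "face" in the top-left corner.
photo = [
    [255, 255, 0, 0],
    [255, 255, 0, 0],
    [0,   0,   0, 0],
    [0,   0,   0, 0],
]
blurred = blur_region(photo, (0, 0, 1, 1))  # blur only the detected "face"
```

After blurring, the bright region bleeds into gray at its edges while every pixel outside the box is untouched, which is exactly the Anonymous Gods effect: the face is smeared, but the surrounding context that actually identifies the subject is left intact.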
What does face-blurring really accomplish?
When Google agreed to blur faces, it made people feel better. But let's take a closer look at what it accomplishes. Say you are using Google Street View and, lo and behold, you suddenly see yourself in a Street View photo. Imagine, for the moment, that this was before automatic face blurring began. What is the actual risk to you?
Google Street View only takes snapshots—static images—so it can’t track your movements. It can only “catch you in the act,” as it were, when you are in a public place. Maybe you are leaving a sex shop. Maybe you are sunbathing nude and didn’t realize you were visible from the street. Maybe you are wearing a tacky tie. The most often cited “fear” that people express when talking about the invasion of privacy due to Google Street View is the fear of being embarrassed.
In 2007, Kevin Bankston, a staff attorney with the Electronic Frontier Foundation, was reported as saying that “there’s definitely a privacy concern” with Google Street View, but then he added the very telling comment: “We don’t think what Google’s done here is necessarily illegal… it’s more that they’ve done something that’s really irresponsible and rude to people.”
The claim that being embarrassed on camera is a breach of privacy is dubious, at best. There is a dearly held tradition in the United States that it is not the government’s role to protect people from embarrassment. It is part of the freedom of speech protected by the First Amendment: The government cannot punish you for saying something merely because it might make someone feel awkward, and you can’t get the government to force someone to shut up just because their speech is making you uncomfortable. It is one of the reasons that the “right to be forgotten” movement in Europe has not picked up any steam in the United States—and never will.
Google Street View might embarrass you, but that’s not really a “breach of privacy.” No matter how bad it feels, you don’t have a right to not be embarrassed.
Know the real enemy
The public is becoming more sophisticated about the distinction between being photographed on a public street, on the one hand, and true privacy invasions such as wiretapping, on the other. Numerous surveys have shown that a strong majority of people support surveillance cameras in public places even as disapproval of, and concern over, NSA wiretapping increase. People are beginning to favor increased recording of public images, especially when the police are involved.
This doesn’t mean Google Street View is an innocent angel, of course. It still has some serious charges to face. Even after it agreed to blur faces, Google began using elevated camera rigs that could see over fences, in apparent violation of “peeping tom” and trespass laws. Its Street View cars also wiretapped wireless networks, collecting passwords and email as they drove down the street. Google is going to have to answer for those violations of law.
But the practice of blurring your face on Google Street View is useless. It is worse than useless, actually—it gives people a false sense of protection. It allows people to focus on the wrong things when they think about privacy. It allows people to obsess about embarrassment and facial recognition, when they should be worried about wiretapping and real-time video-monitoring. As a practical matter, we would all be better off if Google stopped blurring faces on Google Street View. It’s the wrong battle to fight; it’s time to let it go.
Greg Stevens is a data scientist with over 20 years of hands-on experience with machine learning, predictive analytics, and related statistical methods. His research-driven essays tackle issues in pop culture, politics, and science. He also hosts a YouTube channel.