
Illustration by Max Fleishman

With facial recognition tech on the rise, is it time to delete all your selfies?

There's a privacy nightmare in the making.

 

Ben Dickson

Tech

Posted on Jun 21, 2016   Updated on May 26, 2021, 2:02 pm CDT

Imagine what would happen if the entire social profile of a complete stranger could be obtained just by snapping a photo of their face. 

Well, the possibilities are now being explored with the advent of FindFace, an app recently launched in Russia that allows users to identify people featured in photographs with very high accuracy.

The technology uses artificial neural networks to compare photographs it receives against an image database. It cannot tap into Facebook’s huge image repository because of the site’s privacy settings and profile access policies, but it has had great success with VK.com (previously VKontakte), also known as the “Russian Facebook,” the dominant social media platform in Eastern Europe, which boasts over 200 million accounts.
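
FindFace’s internal pipeline is proprietary, but the general pattern behind neural-network face matching is well understood: a network converts each face into a fixed-length numeric embedding, and recognition becomes a distance comparison between embeddings. The sketch below illustrates that idea with the open-source face_recognition library; the library choice, file paths, and threshold are illustrative assumptions, not FindFace’s actual stack.

```python
# Illustrative sketch only: FindFace's internals are not public. This uses the
# open-source face_recognition library (dlib-based) to show the general idea of
# turning faces into numeric embeddings and matching against a small database.
import face_recognition

# A toy "database" of known faces (file paths are hypothetical).
known_people = {
    "alice": "photos/alice.jpg",
    "bob": "photos/bob.jpg",
}

# Compute a 128-dimensional embedding for each known face.
known_names, known_encodings = [], []
for name, path in known_people.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:                      # skip photos with no detectable face
        known_names.append(name)
        known_encodings.append(encodings[0])

# Embed the query photo and find the closest known face by Euclidean distance.
query = face_recognition.load_image_file("photos/stranger.jpg")
query_encodings = face_recognition.face_encodings(query)
if query_encodings:
    distances = face_recognition.face_distance(known_encodings, query_encodings[0])
    best = distances.argmin()
    if distances[best] < 0.6:          # the library's commonly used tolerance
        print(f"Best match: {known_names[best]} (distance {distances[best]:.2f})")
    else:
        print("No confident match")
```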

The software, created by Artem Kukharenko, 26, and Alexander Kabakov, 29, isn’t the first facial recognition app. Facebook, Google, and IBM, among others, have their own face recognition platforms, which are sophisticated and reliable. But FindFace is surely the most controversial of the flock, and it could very well spell doom for privacy, both online and in the physical world, if its creators have their way.

What sets FindFace apart from its peers is its unfettered access to VKontakte’s image database and its efficiency in searching huge data sets. Kabakov says the platform can search over one billion pictures in less than a second. FindFace also boasts of having bested Google at the MegaFace facial recognition contest.
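
A claim like sub-second search over a billion photos is easier to parse once you note that matching happens over compact embeddings rather than raw images. The toy numpy sketch below shows the core operation, cosine similarity computed as a single matrix-vector product, over a hypothetical database of 100,000 vectors; the sizes and data are made up, and a system at FindFace’s claimed scale would additionally rely on approximate nearest-neighbor indexing, which this sketch does not implement.

```python
# Sketch of why embedding search can be fast: matching is a nearest-neighbor
# lookup over fixed-length vectors, not a pixel-by-pixel image comparison.
# (All numbers here are toy values; FindFace's actual index is not public.)
import numpy as np

rng = np.random.default_rng(0)
dim = 128                                   # typical face-embedding size
database = rng.normal(size=(100_000, dim)).astype(np.float32)
database /= np.linalg.norm(database, axis=1, keepdims=True)   # unit-normalize

query = rng.normal(size=dim).astype(np.float32)
query /= np.linalg.norm(query)

# Cosine similarity against every stored face is one matrix-vector product.
scores = database @ query
top5 = np.argsort(scores)[-5:][::-1]
print("closest profile ids:", top5, "scores:", scores[top5])

# Brute force like this scales linearly with database size; billion-scale,
# sub-second search relies on approximate nearest-neighbor indexes
# (clustering/quantization), trading a little recall for large speedups.
```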

Already, in the past two months, 500,000 users have downloaded the app to get a taste of a technology that had previously been the preserve of huge corporations and security agencies, and more than 3 million searches have been run, putting innumerable use cases to the test.

FindFace has helped Russian law enforcement advance cases that had stalled for years, the Guardian reports, and the startup will soon be providing services to the Moscow city government to complement its network of 150,000 CCTV cameras and improve its crime-fighting capabilities.

But the sinister and controversial applications of the technology far outweigh the positive ones.

A photographer in St. Petersburg used the platform in a striking photo project called “Your face is Big Data,” demonstrating how easy it is to identify strangers. Online vigilantes are using it to identify, stalk, and harass victims. Retailers will want to seize the opportunity to bombard their customers with ever more tailored ads.

Authoritarian regimes will also have a vested interest in FindFace, as accurate facial recognition linked to social media platforms would grant them unprecedented power to identify protesters at street rallies. Kabakov and Kukharenko told the Guardian that they were open to offers from Russia’s FSB security service.

The controversy triggered by FindFace underlines the huge social, legal, and humanitarian implications of disruptive technology such as facial recognition becoming available to the masses. Nor is it the only case in which the rollout of new technology and features has caused a dispute. Facebook is dealing with its own set of legal problems over its facial recognition technology, and an application that mines facial expressions from YouTube videos will bring its own repercussions for user privacy.

Many maintain that such technology should not be made publicly available, or that, at the very least, firms must keep tabs on how their products are being put to use. Likewise, social media companies and other firms that collect user data should be mindful of how they share that data with external parties; one example is Twitter’s recent move to cut spy agencies off from its analytics services.

Others, including the developers of FindFace, believe that technological progress cannot be stopped; therefore, we must work with it and make sure it remains transparent. After all, any new technology can be put to good or bad use, depending on who wields it. An analogy could be drawn with the heated crypto debate that has pitted government agencies against tech firms over the availability of encryption technology.

“A person should understand that in the modern world he is under the spotlight of technology,” Kabakov told the Guardian. “You just have to live with that.”

But whether his own efforts at transparency will actually improve the lives of the people affected by the technology remains doubtful. Any photo or video you post on your social media accounts can become the key to identifying you in the hundreds of other photos of you that may be uploaded to the internet without your consent or awareness, whether by a friend, from a security camera feed, or snapped from a crowd. And after that, you can be bombarded with ads, harassed by haters, or harmed in some other devious way.

Does this mean we should scrub all of our pictures from the internet, or start posing for photos at odd angles, as the experts at Kaspersky Lab suggest? And given the recent MySpace hack, which proves that your data is forever, is it even possible to retrace your steps and remove your photos from the internet?

The one assumption we can make is that privacy as we know it today may soon come to an end, so, as has been said a thousand times, you had better think twice before posting your next photo.
