The end of revenge porn

Facebook's new image recognition technology signals a new era of fighting one of the Web's most pernicious crimes.

Mar 1, 2020, 12:46 am*


Gillian Branstetter

Facebook’s latest and creepiest method of gathering your data might have some unintended benefits. Earlier this month, the company published a paper in the Computer Vision Foundation journal that unveiled its PIPER technology, image-recognition software that can identify people without seeing their faces.

Thus far, most of the technology attempting to identify people in photos and videos has focused on facial recognition software utilizing a database of photos collected from social media. PIPER, however, uses other cues, such as hair, skin tone, or clothing, and even body language or the context of a photo, to identify the same person across different photos. When analyzing over 40,000 photos from Flickr, PIPER recognized individuals 83 percent of the time, regardless of their position in the frame or the camera angle.
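To make the idea concrete, the multi-cue approach described above can be sketched as a weighted similarity over whatever cues happen to be visible in both photos. This is only an illustrative toy, assuming made-up cue names, weights, and two-dimensional feature vectors; it is not Facebook's actual method or model.

```python
import numpy as np

# Hypothetical sketch of multi-cue person matching. The cue names,
# weights, and feature vectors below are illustrative assumptions,
# not anything from the PIPER paper itself.
CUE_WEIGHTS = {"face": 0.4, "hair": 0.2, "clothing": 0.2, "pose": 0.2}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identity_score(cues_a, cues_b):
    """Weighted similarity across the cues visible in both photos.

    If the face is hidden, the remaining cues still contribute, which
    is the key idea behind recognizing someone without seeing a face.
    """
    shared = [c for c in CUE_WEIGHTS if c in cues_a and c in cues_b]
    if not shared:
        return 0.0
    total = sum(CUE_WEIGHTS[c] for c in shared)
    return sum(
        CUE_WEIGHTS[c] * cosine(cues_a[c], cues_b[c]) for c in shared
    ) / total

# Two photos of the same person; the face is hidden in the second,
# yet hair and clothing still yield a high similarity score.
photo_a = {"face": np.array([1.0, 0.0]),
           "hair": np.array([0.9, 0.1]),
           "clothing": np.array([0.5, 0.5])}
photo_b = {"hair": np.array([0.88, 0.12]),
           "clothing": np.array([0.52, 0.48])}

print(round(identity_score(photo_a, photo_b), 3))
```

The design point this toy captures is graceful degradation: the matcher renormalizes over the cues it can see, so a covered face lowers confidence rather than preventing a match outright.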

In many ways, PIPER could be the Holy Grail of photo management in social media. Imagine letting Facebook tag your friends in a photo album for you, as opposed to going through and doing it yourself. The Facebook app Moments, for example, could automatically sort your photos into albums based on where they were taken and who they were taken with, removing much of the work inherent in sharing images on Facebook.

More important than finding yourself in the photos you do want online, however, is identifying when you’re in photos you’d rather nobody see.


The most important implication of PIPER is that the technology could allow Facebook to notify you when any image of you is uploaded to the site, giving you complete control over what images are released into the ether.
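The notify-and-hold workflow that implication suggests could be sketched as an upload hook: match a new photo against users who have opted in, alert any match, and hold the content until they approve it. Everything here, from the function names to the toy lookup standing in for a real matcher, is a hypothetical illustration, not a Facebook API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Upload:
    photo_id: str
    matched_user: Optional[str] = None
    status: str = "pending"

def match_identity(photo_id, enrolled_users):
    # Stand-in for a PIPER-style matcher; here just a toy lookup table
    # mapping photo IDs to the users they depict.
    return enrolled_users.get(photo_id)

def process_upload(photo_id, enrolled_users, notify):
    """Hold a matched photo for the subject's review; publish otherwise."""
    upload = Upload(photo_id)
    user = match_identity(photo_id, enrolled_users)
    if user is not None:
        upload.matched_user = user
        upload.status = "held_for_review"  # not published until approved
        notify(user, photo_id)
    else:
        upload.status = "published"
    return upload

notifications = []
enrolled = {"img_001": "alice"}
u = process_upload("img_001", enrolled,
                   lambda user, pid: notifications.append((user, pid)))
print(u.status, notifications)
```

The crucial design choice is holding flagged content *before* publication rather than removing it after a report, which is exactly the reversal of burden the article argues for.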

Thus, revenge porn would be far harder for creeps to upload to the site without your knowledge, and Facebook could more readily remove the content before it does much damage to the victim’s social or professional life. As CNN reports, not only do many revenge porn victims lose their jobs, but it also hampers their ability to find another one, as “more than 80 percent of employers rely on candidates’ online reputations as an employment screen.”

Currently, most revenge porn policies across the Internet place the onus on the victim to not only report instances but also prove they are the subject of the photos. With PIPER, Facebook could help put the power back into the hands of the victims and make revenge porn nearly impossible to spread, with other sites following suit.

Currently, Facebook’s revenge porn policy consists of banning all nudity on the site, though its methods of finding and removing it leave much to be desired. When one user reported a fake profile depicting her in altered pornographic images, Facebook took months to remove the page, only doing so after police asked for information related to the identity of the user. The woman is now suing Facebook for $123 million, alleging “significant trauma, extreme humiliation, extreme embarrassment, severe emotional disturbances, and severe mental and physical suffering” due to Facebook’s negligence.

The months it took Facebook to respond to this woman’s complaint likely stem from the fact that it fields reports of abuse by hand, hiring thousands of laborers to scour Facebook’s massive catalog for any content that violates its policies. To Facebook’s credit, this is how most sites keep disturbing content off their servers, but PIPER stands to help that mission along in a major way.

Because this new technology can identify people by body type, skin tone, and facial features alike, it could help Facebook manage disturbing content and track serial offenders, as well as empower victims. The technology, however, must be perfected: an 83 percent success rate is not good enough when someone’s livelihood is at stake. But if refined to a near-perfect success rate, PIPER could put Facebook well ahead of where other sites are on this issue.

Last February, Reddit became the first major social media site to ban revenge porn. The former home of /r/creepshots urged users who believe a nude image of themselves was posted without their consent to “please contact us ([email protected]), and we will expedite its removal as quickly as possible.” A month later, Twitter followed suit with its own policy revision banning “intimate photos or videos that were taken or distributed without the subject’s consent.”


And just last week, Google announced it would allow users to submit reports of revenge porn on its Image search. Other apps, like Snapchat and Yik Yak, have done shockingly little to rein in this abusive trend.

Even among the sites that have, none has moved past the format of requiring victims to identify and report revenge porn themselves. Facebook alone has 1.3 billion users who’ve uploaded 250 billion photos—with 350 million more pictures flowing to the site each day. It’s too big a task for any individual user to find nude photos of themselves in that haystack, especially when they likely aren’t even aware such content is lurking on Facebook’s servers in the first place.

If Facebook were to hone PIPER’s skills by unleashing it on that massive trove of images, it could automate this process for its users. The biggest concern with PIPER is not its limitations, however, but its capabilities. Identity recognition technology is eerie enough without the ability to identify people without using their faces, and Facebook already has a rocky history when it comes to respecting its users’ privacy.

The tech industry as a whole is struggling to embrace the standards privacy advocates would like for facial recognition technology, a debate that caused recent talks between industry and privacy representatives to fall apart in a tense standoff. Should, for example, software like PIPER be allowed to identify people in a public photo against the subject’s will? Privacy experts say no, but the industry says yes.

So PIPER might have to arrive alongside improved standards for technology of its kind, but it stands to be the first major innovation out of Silicon Valley that actually protects the privacy the Valley so regularly violates.

If PIPER isn’t the end of revenge porn altogether, getting this technology right will prove an important step in making the Web a better place for potential revenge porn victims—and everyone else.

Gillian Branstetter is a social commentator with a focus on the intersection of technology, security, and politics. Her work has appeared in the Washington Post, Business Insider, Salon, the Week, and xoJane. She attended Pennsylvania State University. Follow her on Twitter @GillBranstetter

Photo via parkerpyne_investigates/Flickr (CC BY ND 2.0)

*First Published: Jun 24, 2015, 1:59 pm