
Deepfakes: How Redditors are using AI to make fake celebrity porn

Forget Photoshopped celebrity nudes—the depraved new trend is fake celebrity porn videos driven by machine learning. On Reddit, they’re calling them “deepfakes.” With an app, a little technical ability, and a couple of high-res videos of a celeb, anyone can transform their favorite porn star into their favorite actress. It’s a trend that’s starting to blow up, and it has implications well beyond pornography.

In a Motherboard post that put deepfakes on the map, Samantha Cole reported this week that a deepfakes subreddit—mostly, but not entirely, porn—has sprung up. One poster there, who goes by “deepfakeapp,” created an app so others could easily make their own altered videos.

“Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button,” deepfakeapp told Motherboard.
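For the technically curious, the approach reportedly behind these swaps pairs two autoencoders that share a single encoder: each identity gets its own decoder, and once trained, running face A’s frames through face B’s decoder produces the swap. Here’s a minimal sketch of that idea in PyTorch; the architecture, sizes, and names are illustrative assumptions, not the actual app’s code.

```python
# Minimal sketch of the shared-encoder face-swap idea (illustrative only,
# not the deepfakes app's actual code). Two autoencoders share one encoder
# but keep separate decoders, one per identity.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses any aligned 64x64 face crop to a latent code."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, 512),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs one specific face from the latent code."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(512, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.L1Loss()

# Stand-ins for aligned face crops of each identity (real training would
# use thousands of frames extracted from video).
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(100):
    # Each decoder learns only to rebuild its own identity's faces.
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    opt.zero_grad()
    loss.backward()
    opt.step()

# The swap: encode identity A's face, then decode it with identity B's decoder.
swapped = decoder_b(encoder(faces_a))
```

The shared encoder is what makes the trick work: because both faces pass through the same compression, the latent code tends to capture pose, lighting, and expression, while each decoder paints its own identity back on top.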

Perhaps the most impressive deepfake so far is this CGI Princess Leia from Rogue One: A Star Wars Story, which looks comparable to the big-budget version that actually appeared in the movie. A user called Derpfake claimed it took him all of 20 minutes.

There are also some popular, amusing Nicolas Cage fakes popping up on the subreddit. The deepfakers have really taken Cage’s iconic role in Face/Off to heart and taken his face … off! Here it is on Sean Connery’s body in a James Bond flick:

https://gfycat.com/ConfusedScratchyCowbird

But mostly, the software is being used to create creepy fake-but-close-enough porn of popular actresses and singers. Emma Watson, Daisy Ridley, Natalie Dormer, and Katy Perry are popular targets. Many of the fakes are obvious, but some of them manage to cross the Uncanny Valley. If you’re a horny dude looking for celebrity porn, it hardly matters that this deepfake isn’t “really” Katy Perry.

[Image: Katy Perry deepfake face swap]

Deepfakers defend their work by arguing that the distinction between real and fake has broken down so completely that using people’s likenesses this way doesn’t matter. Point out that it’s a very short jump from celebrity porn mock-ups to reputation-destroying revenge porn starring any of us, and they’ll argue we’re all safer once fake-video tools are so widespread that no video is considered real or trustworthy. “No one knows if it’s really you” probably isn’t as comforting to potential deepfake victims as these video creators seem to think.

“What we do here isn’t wholesome or honorable, it’s derogatory, vulgar, and blindsiding to the women that deepfakes works on,” one deepfakes poster acknowledged. But he went on to argue that, “If anything can be real, nothing is real. Even legitimate homemade sex movies used as revenge porn can be waved off as fakes as this system becomes more relevant.”

“While it might seem like the LEAST moral thing to be doing with this technology, I think most of us would rather it be in the hands of benign porn creators shaping the technology to become more focused on internet culture,” he added.

That’s a pretty big leap. Porn has always been at the vanguard of technological innovation; there’s hardly been a big invention in history that hasn’t been adapted for masturbation. But that doesn’t mean people are automatically comfortable with having their faces edited onto others’ bodies.

Years of exposure to very convincing Photoshops have trained internet users to be skeptical of every image they see. Even if it looks completely convincing, it could just be a skillful fake. We’re less skeptical of video: the technology for faking moving images is newer, jankier, and less widespread. But that’s starting to change, and we’re going to have to add awareness of this unsettling new tech to our media literacy toolboxes. In the age of fake news, skepticism is a valuable reflex.