
AI-generated fake porn is on the rise—and it has huge implications

Since it first appeared on Reddit, the deeply concerning practice of swapping celebrities’ faces onto adult performers’ bodies in porn videos has quickly spread and is now more convincing than ever.

Created by Redditor deepfakes, these fake videos are made with simple artificial intelligence (AI) tools and open-source code that anyone with basic computer science knowledge could use.
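The underlying approach widely attributed to these tools is a shared-encoder autoencoder: one encoder learns pose and expression features common to both faces, while a separate decoder is trained per identity, so a swap means encoding one person's face and decoding it with the other person's decoder. A minimal illustrative sketch of that architecture, with all weights, sizes, and names hypothetical stand-ins (no real training):

```python
import numpy as np

# Illustrative sketch only -- not any specific tool's implementation.
# One shared encoder, one decoder per identity. Swapping a face means:
# encode person A's frame, then decode with person B's decoder.

rng = np.random.default_rng(0)

LATENT, PIXELS = 64, 64 * 64  # hypothetical 64x64 grayscale crops, flattened

def linear(n_in, n_out):
    # Random stand-in weights; a real system would learn these from footage.
    return rng.standard_normal((n_in, n_out)) * 0.01

W_enc = linear(PIXELS, LATENT)    # shared encoder weights
W_dec_b = linear(LATENT, PIXELS)  # decoder for identity B

def encode(face):
    # Compress a face crop into an identity-agnostic latent code.
    return np.tanh(face @ W_enc)

def decode(latent, w_dec):
    # Reconstruct a face in the decoder's identity.
    return latent @ w_dec

face_a = rng.standard_normal(PIXELS)  # stand-in for an aligned face crop of A

# The "swap": A's pose and expression, rendered as B's face.
swapped = decode(encode(face_a), W_dec_b)
print(swapped.shape)  # (4096,)
```

With untrained random weights the output is noise; the point is only the data flow that makes the shared encoder reusable across identities.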

They’ve become extremely popular on certain subreddits, like r/CelebFakes. One Redditor, deepfakeapp, even created an app that makes it easy for anyone—even those without coding knowledge—to generate their own AI-assisted porn videos, Motherboard reports. Everything that’s needed is free and easily accessible.

“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks,” deepfakeapp (who doesn’t appear to be related to deepfakes) told Motherboard. “Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”

After Motherboard first reported the practice in December, the original deepfakes created a subreddit named after himself. It now has more than 15,000 subscribers, who use the term “deepfake” as a noun for an AI-manipulated porn video. Many posts on the subreddit come from users asking how to create the videos or how long it takes to train the AI, and many uploads are blurry, low-resolution first attempts.

Others are much more convincing, like one video posted by Redditor Unobtrusive, which shows Jessica Alba’s face on the body of porn performer Melanie Rios. “Super quick one—just learning how to retrain my model. Around 5ish hours—decent for what it is,” the creator wrote in a comment.

Look closely at most of the videos and it’s easy to spot small glitches, like a face turned the wrong direction or body parts with lines running through them. In other videos, though, those giveaways are absent.

Obviously, this technology has huge implications for women. With AI, people could effortlessly pass these videos off as sex tapes or revenge porn. As Motherboard points out, creations from the deepfakes subreddit have been reuploaded to websites that post nude photos stolen from hacked celebrity accounts. One video of Emma Watson was tagged as a “never-before-seen video” from her “private collection.”

Even more troubling is that these videos are only getting more convincing as the technology advances and those using it grow more skilled.

“You can make fake videos with neural networks today, but people will be able to tell that you’ve done that if you look closely, and some of the techniques involved remain pretty advanced,” Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation, told Motherboard. “That’s not going to stay true for more than a year or two.”

H/T Motherboard