These videos might be harder to find now, but they’re far from gone.
On Wednesday, Reddit banned “deepfakes,” the new NSFW videos that use machine learning to convincingly put celebrity faces on porn performers’ bodies. Reddit had become the main hub for deepfakes after a user created and shared an app that made the videos easy to create. But despite the ban—and similar action from sites that include Twitter and Pornhub—deepfakes are still in circulation. Their hardcore fans are sharing them on other social media sites, plus a handful of dedicated websites and hosting services, and they have no plans to stop.
On 8chan, which broke from 4chan in 2014 in a search for looser rules on harassment, posters are criticizing Reddit’s decision and sharing their own videos. An anonymous poster offered this list of places where deepfakes are still being shared:
Voat, a conservative Reddit clone that has played host to banned Reddit communities in the past, is a natural home for deepfakes, whose supporters consider the videos a type of free speech.
4chan already had an “adult GIF” forum, and deepfakes are right at home there. Several 4chan posters are incensed about the Reddit ban, and they see the crackdown on deepfakes not as an ethical issue but as Reddit protecting “rich people”—i.e., the famous women whose faces are now appearing in fake porn—from a mild inconvenience. They’re also really annoyed at the “flood” of redditors arriving on 4chan in search of celebrity porn content.
“Meanwhile all sorts of other degenerate subreddits carry on. Just don’t mess with rich people, goyim!” wrote one poster, tossing in a little anti-Semitism for that distinctive 4chan flavor.
Anyone who’s against deepfakes, in their view, is a “beta male crying over mildly reprehensible shit that harms no one.”
There are also a couple of deepfakes-specific websites, some of which use the open-source forum software MyBB. These are tougher to shut down because takedown requests would have to go through their web-hosting companies.
Perhaps more problematically, two of the most prominent sites where people are uploading deepfakes are refusing to take action.
“It’s something new and for the moment we do not see any reason to be against. If the people concerned are of age and the video presented as a fake we take this as a parody. If it hurts someone it’s always possible to send a DMCA request and we remove the content,” porn-hosting site Erome told the Daily Dot via email.
Likewise, SendVid, the other site redditors were using to upload videos, responded to the Daily Dot’s inquiry by linking to its DMCA copyright complaint page, apparently indicating that it won’t take action on deepfakes unless someone files a takedown notice.
As we saw with the 2014 “Fappening” leak of nude celebrity photos, there will always be sites willing to host nude celebrity content, whether it’s real or fake. Anyone truly determined to keep the content up can hop from host to host or find a country with favorable laws.
Perhaps the biggest thing deepfakes still have going for them is that Reddit didn’t ban “safe for work” videos, and it didn’t ban a forum about the Deepfakes app itself. Because the technology has legitimate uses, anyone can still find a download link and a tutorial on Reddit and make their own videos—whether that means porn or just splicing Nicolas Cage’s face into various movie scenes.
Already, enthusiasts have packaged up their collections of deepfakes for hosting on BitTorrent sites and other hard-to-track file-sharing services. On 4chan and 8chan, posters have said they don’t see any reason to stop making and “fapping to” deepfakes unless they’re made explicitly illegal.
Deepfakes may be driven out of brightly lit public spaces like Reddit and into the dark crevices of the web, but they’ll probably never go away completely.