A deepfake video featuring Emma Watson (wakashaka12/Reddit)

This may have crossed the legal line.

For weeks, people have been using a controversial machine-learning app to create a new kind of fake celebrity porn called “deepfakes.” Now custom versions are being sold on corners of the internet.

Users of the social news site Reddit have been offering to buy and sell deepfakes—or to make custom videos in exchange for Bitcoin “donations”—in a new subreddit called r/deepfakeservice. The group has only 250 subscribers so far, but the services on offer are troubling from both an ethical and a legal standpoint.

“Offering redhead deepfakes by donation (or free, as long as they’re redheads),” promised one post by someone with the username “pm-me-redheads-grool.”

“If you don’t want the video released to the subreddit, it’ll have to be covered by a donation and it’ll be added to my personal library,” the user wrote. “I can do non-redheads for payment, but my preference is suuuuper leaning towards redheads and they will be prioritized even if I don’t get paid.”

There are only a few posts offering these services so far. The typical post is just a request for deepfakes of a specific celebrity, sometimes with the name of a porn performer who could wear that celebrity’s face.

Vice’s Motherboard looked into the legality of buying and selling fake celebrity porn and concluded that, although it’s still a grey area, the exchange of money means deepfakes are making “commercial use” of the likenesses of celebrities and porn stars. Profiting from the unauthorized use of someone’s image violates right-of-publicity laws in the U.S., and it makes the victims’ cases much stronger.

That’s why posters at Reddit’s central repository for deepfakes (which now has 90,000 subscribers) frown upon the upstart “services” forum.

“Ok listen up Fuckers, stop trying to sell/buy deepfakes from other people,” reads one popular post on r/deepfakes. “We established this since the sub was created. Now Motherboard (of course) is rightfully calling it out. Cut the crap or you might just bring the rest of us down with you. Make your own deepfakes, or use the request thread.”

Deepfake creators are walking a thin and untested legal line, and they’re wary of adding legal risk to the already fraught proposition of editing famous people’s faces onto pirated porn. Still, the trend is to tiptoe ever closer to what they imagine is the legal line, without stepping over it.

Although deepfakes started with A-list celebrities like Emma Watson, Daisy Ridley, and Taylor Swift, Reddit horndogs are starting to make porn of less-famous women, including YouTubers and Fox News personalities. Recent targets include YouTubers Reina Scully, Anna Akana, and Geo Antoinette; Twitch streamer Pokimane; Fox Business host Maria Bartiromo; alt-right vlogger Lauren Southern; popular ASMR YouTuber Gibi; and Lia Marie Johnson, whom you may remember from the Fine Bros’ “Kids React” videos.

There have also been requests for deepfakes of friends, classmates, and even someone’s cousin. These more brazen videos are still tough for deepfakers to make because it’s easier to find high-quality footage of stars and YouTube personalities than of “civilians.” If these requests are being filled, the creators are being careful not to share them on Reddit.

Update: 4:12pm, Feb. 7: Reddit announced on Wednesday that it is banning fake celebrity porn, including “deepfakes.” All subreddits related to deepfakes have been closed.

H/T Motherboard

Jay Hathaway

Jay Hathaway is a senior writer, specializing in internet memes and weird online culture. He previously served as the Daily Dot’s news editor, was a staff writer at Gawker, and edited the classic websites Urlesque and Download Squad. His work has also appeared on nymag.com, suicidegirls.com, and the Morning News.
