[Image: bootynabber/Reddit (Fair Use)]

One by one, sites are cracking down on fake celebrity porn videos.

Twitter is the latest website to crack down on “deepfakes,” porn videos that use machine learning to put celebrities’ faces onto porn performers’ nude bodies.

In a statement to Mashable, Twitter confirmed that deepfakes violate its policy against posting “intimate photos or videos of someone that were produced or distributed without their consent.” Accounts that violate the policy will be suspended, and the tweets will be removed.

Deepfakes, a phenomenon that started on Reddit and has also gained traction on 4chan, only recently made their way to Twitter. Since GIF-sharing site Gfycat and major porn streaming provider PornHub both banned deepfakes last week, celebrity porn creators have been looking for new places to host their wares. One Redditor announced on Tuesday that he was launching a deepfakes Twitter account. It took less than a day for Twitter to shut it down.

The legality of deepfakes is still unclear and untested. They raise issues around porn studios' copyrights, celebrities' rights to control how their likenesses are used, and "revenge porn" laws that ban the non-consensual posting of sexual images. Until there's legislation or a controlling court case, it's up to individual websites to decide whether deepfakes violate their terms of service.

Twitter has made its decision, but enforcing it is a different matter. The social network has been plagued with Nazis, white supremacists, and other organized harassment groups, and has struggled mightily to shut them down. It might not matter whether Twitter can actually get rid of deepfakes posters, though. As we saw with Gfycat and PornHub, an official ban and a few suspended accounts could be enough to send celebrity porn producers to a website with looser rules.
