
Most deepfakes are nonconsensual porn, not political disinformation

Despite repeated concerns that deepfake videos could be used to spread disinformation and disrupt elections, a new report found that the technology is overwhelmingly used to create porn.

The report, published Monday by Netherlands-based cybersecurity company Deeptrace, found that porn-related content accounted for 96% of all deepfakes online.

The four largest sites hosting what the report describes as “nonconsensual deepfake pornography” have garnered more than 134 million video views since December 2018. In every case examined, the technology was used to superimpose women’s faces onto the bodies of female porn performers.

Deeptrace further notes that the number of deepfake videos online has nearly doubled over the past seven months, reaching 14,678.

While the report notes two incidents, in Gabon and Malaysia, where deepfakes were linked to an “alleged government cover-up and a political smear campaign,” porn remains the major driver of the technology’s expansion.

The emergence of deepfake technology online was tied to porn as well. A now-banned subreddit first began superimposing Hollywood celebrities’ faces onto porn videos in 2017.

Deeptrace attributes the technology’s significant spread since then to “the growing commodification of tools and services that lower the barrier for non-experts.”

A free tool released in June, known as “DeepNude,” used artificial intelligence to digitally remove women’s clothing from photos. Although the tool was eventually pulled offline, copycat versions quickly sprang up in its place.

A Chinese deepfake app known as ZAO also became popular recently by allowing anyone to swap their face with a celebrity’s using just a single selfie.

Even though the majority of deepfakes are porn-related, most public efforts to combat the technology’s misuse have centered on issues related to politics.

H/T TNW