U.K. advocacy group releases deepfakes of Corbyn, Johnson endorsing each other

Two videos that appeared on the internet today came as a surprise to those who follow U.K. politics closely.

One featured Prime Minister Boris Johnson endorsing opposition leader Jeremy Corbyn for the upcoming election.

The second video had Corbyn doing the same thing for the sitting prime minister.

This isn’t some strange kind of quid pro quo between the two party leaders, though you might be forgiven for thinking so due to the current state of U.K. politics.

Instead, they’re deepfakes: AI-generated videos that superimpose one person’s image over another, put out by think tank Future Advocacy to warn us about the dangers this technology poses to democracy.

Concerns about deepfakes being used to interfere with elections and otherwise spread misinformation are as old as the technology itself.

While the vast majority of deepfakes are actually porn right now, as the technology improves, it’s possible there will be attempts at using them for political ends.

At the same time, people are already blaming deepfakes when their preferred politicians say or do something they disagree with, even when the video footage is verifiably real.

Future Advocacy wants to raise awareness of the technology itself and its potential to spread disinformation, interfere with elections, and otherwise prevent people from knowing what’s really happening in the world.

They also want to promote dialogue about regulation.

Having gotten the word “deepfake” trending on Twitter, with people advocating for better education on the subject, they have successfully met at least part of that goal.

However, not everyone is impressed by the quality of Future Advocacy’s deepfakes.

Rather than using technology to imitate Johnson and Corbyn’s voices, they hired impressionists to do the job, and many people found their work far from plausible.

Others find the effect of the videos all too plausible, and—while acknowledging that the voices are far from perfect—point out that this won’t actually stop people from being taken in by them.

Twitter user @elmyra, who studies deepfakes and the way they’re used to create pornography, explained how easy it would be to edit these videos to hide their deepfake origins.

No watermark appears in the first part of the videos, while the fake politicians are endorsing each other, presumably to enhance the impact by making them more plausible prior to the reveal. That choice shows how easy it would be to simply crop the reveal out entirely.

While Future Advocacy has raised an important issue for the future of politics, it’s entirely possible that in the short term they have created a little more confusion.

Siobhan Ball
Siobhan Ball is a historian, archivist, and journalist. She also writes for Autostraddle and bi.org.