Two very surprising videos appeared on the internet today, at least to those who follow U.K. politics closely.
One featured Prime Minister Boris Johnson endorsing opposition leader Jeremy Corbyn for the upcoming election.
The second video had Corbyn doing the same thing for the sitting prime minister.
This isn’t some strange kind of quid pro quo between the two party leaders, though you might be forgiven for thinking so due to the current state of U.K. politics.
Instead, they’re deepfakes: AI-generated videos that superimpose one person’s image over another’s, put out by the think tank Future Advocacy to warn us about the dangers this technology poses to democracy.
A video showing Boris Johnson endorsing Jeremy Corbyn for Prime Minister has just landed online, another shows Corbyn backing Johnson.— Catrin Nye (@CatrinNye) November 12, 2019
Confused? Well they’re deep fakes created by @futureadvocacy & I’ve been behind the scenes for the making of them for @VictoriaLIVE > pic.twitter.com/N5uvwsZAFU
Concerns about deepfakes being used to interfere with elections and otherwise spread misinformation are as old as the technology itself.
While the vast majority of deepfakes are actually porn right now, as the technology improves, it’s possible there will be attempts at using them for political ends.
At the same time, people are already blaming deepfakes when their preferred politician says or does something they disagree with, even when the video footage is verifiably real.
Future Advocacy wants to raise awareness of the technology itself and its potential to spread disinformation, interfere with elections, and otherwise prevent people from knowing what’s really happening in the world.
They also want to promote dialogue about regulation.
Having managed to get the phrase “deepfake” trending on Twitter, with people advocating for better education on the subject, they have successfully met at least part of that goal.
An interesting example of a #deepFake video along which illustrates some of the potential implications. This is the kind of thing we need to be discussing in schools however the challenge is to find a space in the already busy curriculum. https://t.co/IpfqOpKKuJ— gary henderson (@garyhenderson18) November 12, 2019
#Deepfake of UK prime minister Boris Johnson appearing to endorse Labour leader Jeremy Corbyn.— Vivien Boidron (@VivienBoidron) November 12, 2019
Combined with Facebook not fact checking political ads this could lead to a surprising #BrexitVote.
Politicians have yet to address the issue of #disinformation online. https://t.co/zELCd6VdDy
However, not everyone is impressed by the quality of Future Advocacy’s deepfakes.
I've seen better tbh, poor effort.— Soapbox Orator (@SrlUndrchvr) November 12, 2019
Rather than using technology to imitate Johnson and Corbyn’s voices, they hired impressionists to do the job, and many people found their work far from plausible.
The voices are obviously wonky.— Jonathan JK Morris (@Jonathanjk) November 12, 2019
Voice accuracy is already here. Maybe they don't want a version that would be more realistic going viral, thereby making their point for them.
Literally they are pure comedy— Quack Cocaine (@BatmanInit) November 12, 2019
Anyone who has ever heard these two blokes speak will immediately recognise the voices are off.
Others find the videos all too plausible and, while acknowledging that the voices are far from perfect, point out that this won’t actually stop people from being taken in by them.
Publishing deep fake videos of politicians to demonstrate the danger to #democracy is a bit like detonating a nuke to demonstrate the danger of nuclear proliferation #deepfakes https://t.co/1btbl0BbN0— Amir Tocker (@amir_t) November 12, 2019
Twitter user @elmyra, who studies deepfakes and the way they’re used to create pornography, explained how easy it would be to edit these videos to hide their deepfake origins.
Except: Boris Johnson is our Prime Minister, and *he always sounds like that*. He is beyond parody, making the rest of politics beyond parody, and your Boomer mum sharing random videos on Facebook definitely can't tell the difference.— dr elmyra (@elmyra) November 12, 2019
No watermark appears in the first part of either video, while the fake politicians are endorsing each other, presumably to enhance the impact by making the footage more plausible ahead of the reveal. That choice also shows how easy it would be to simply crop any kind of reveal out entirely.
While Future Advocacy has raised an important issue for the future of politics, it’s entirely possible that in the short term it has created a little more confusion.
Siobhan Ball is a historian, archivist, and journalist. She also writes for Autostraddle and bi.org.