It’s easier and harder than you think.
Now more than ever, a certain level of sophistication is needed to navigate the various sources of information on the internet. Compounding the issue is the rise of the “deepfake”: extraordinarily realistic-looking videos that often feature public figures doing and saying things they never actually did. Their uses have been relatively benign so far, as far as we know, but the potential for abuse is alarming, to say the least. Here’s everything you need to know to spot a deepfake video.
> I've gone down a black hole of the latest DeepFakes and this mashup of Steve Buscemi and Jennifer Lawrence is a sight to behold pic.twitter.com/sWnU8SmAcz
>
> — Mikael Thalen (@MikaelThalen) January 29, 2019
What is a deepfake?
A deepfake is a video created using a type of machine learning to pick up the facial movements of one person and graft them onto another person making similar gestures. The term comes from the practice’s affiliation with a popular Reddit user called Deepfakes.
Private citizens have less to worry about because these algorithms require large amounts of footage of a subject to replicate them accurately. But the technology has already been used to create fake celebrity porn, and a fabricated video attributing false statements or deeds to a candidate could influence a great many voters and potentially sway an election.
- Latest deepfakes show how Nicolas Cage and David Schwimmer are nearly identical
- Terrifying deepfake combines Donald Trump and Mr. Bean
Members of Congress have recognized the danger. In September 2018, U.S. Reps. Adam Schiff (D-Calif.), Stephanie Murphy (D-Fla.), and Carlos Curbelo (R-Fla.) sent a letter to the Director of National Intelligence requesting a report to Congress on “the implications of new technologies that allow malicious actors to fabricate audio, video, and still images.”
How to spot a deepfake video
Some of the prominent examples of deepfakes that have made the rounds are not without their giveaways to the discerning eye. Obviously, President Donald Trump doesn’t have Mr. Bean’s eyes.
Still, it helps to have a set of guidelines for those who don’t watch online videos as a career.
One such tell, picked up on by Siwei Lyu, a professor at University at Albany, is blinking.
Because the machine learning depends on the availability of images of a public figure—and famous people are seldom photographed with their eyes shut—deepfakes struggle to depict people convincingly with their eyes closed.
What’s more, subjects depicted in deepfake videos often blink far less often than people do in real life. According to Lyu, this method has a 95 percent detection rate. That said, the technology behind deepfakes and the sophistication of those who employ it will only improve, so the fight against public manipulation will remain an uphill one.
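The blink-rate tell Lyu describes can be reduced to a simple heuristic. The sketch below assumes a facial-landmark detector has already produced a per-frame eye-“openness” score for the subject; the threshold and the baseline blink rate here are illustrative assumptions, not values from Lyu’s research.

```python
def count_blinks(openness, threshold=0.2):
    """Count dips below `threshold` in per-frame eye-openness scores.

    Each contiguous run of low values counts as a single blink.
    """
    blinks, closed = 0, False
    for value in openness:
        if value < threshold and not closed:
            blinks += 1
            closed = True
        elif value >= threshold:
            closed = False
    return blinks


def looks_suspicious(openness, fps, min_blinks_per_minute=5):
    """Flag a clip whose subject blinks far less than a real person.

    Humans blink roughly 15-20 times per minute; early deepfake subjects
    often blinked rarely or not at all. The cutoff here is an assumption.
    """
    minutes = len(openness) / fps / 60
    rate = count_blinks(openness) / minutes if minutes else 0
    return rate < min_blinks_per_minute


# Synthetic example: 30 seconds of video at 30 fps.
fps = 30
real = [0.35] * (fps * 30)      # eyes open...
for i in range(8):              # ...with eight brief blinks (~16/minute)
    real[i * 100] = 0.05
fake = [0.35] * (fps * 30)      # a "deepfake" subject that never blinks
```

Running `looks_suspicious(fake, fps)` flags the never-blinking clip, while the clip with a normal blink rate passes.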
Signs you are watching a deepfake video
Jonathan Hui, a writer who specializes in deep learning, pinpoints a few things to look for in possible deepfakes. Once a viewer slows a video down, they should look for the following things:
- Blurring evident in the face but not elsewhere in the video
- A change of skin tone near the edge of the face
- Double chins, double eyebrows, or double edges to the face
- Whether the face gets blurry when it’s partially obscured by a hand or another object
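Checks like Hui’s lend themselves to automation. The toy sketch below illustrates only the first item on the list, comparing a crude sharpness measure for a “face” patch against the rest of the frame; the metric and the ratio cutoff are stand-in assumptions, and a real tool would pair a face detector with a proper measure such as Laplacian variance.

```python
def sharpness(region):
    """Mean absolute difference between horizontal neighbors.

    `region` is a grayscale patch as a nested list of pixel values;
    low values mean smooth (blurry), high values mean detailed.
    """
    diffs = [abs(row[i + 1] - row[i])
             for row in region for i in range(len(row) - 1)]
    return sum(diffs) / len(diffs)


def face_suspiciously_blurry(face, background, ratio=0.5):
    """Flag a frame whose face region is much smoother than its background."""
    return sharpness(face) < sharpness(background) * ratio


# High-contrast "background" patch vs. a smoothed-over "face" patch.
background = [[0, 255, 10, 240]] * 4
face = [[100, 105, 100, 105]] * 4
```

Here `face_suspiciously_blurry(face, background)` is true: the face patch is far smoother than the rest of the frame, the pattern the checklist warns about.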
Can artificial intelligence spot deepfakes?
Some are attempting to build safeguards against the coming proliferation of deepfakes, using artificial intelligence to combat forgeries online.
For example, the AI Foundation has developed Reality Defender, a program that runs alongside other online applications, identifying potentially fake media.
Researchers at Germany’s Technical University of Munich developed an algorithm called XceptionNet that purports to spot fake videos online so they can be flagged and removed. Of course, given the speed with which damning false reports can spread, it will take a lot of diligence from ordinary internet users to keep damage from being done before one of these programs can out a fake.
- Deepfakes 2.0: The terrifying future of AI and fake news
- Why it’s harder to spot a deepfake once it goes viral
- Jennifer Buscemi is the deepfake that should seriously frighten you
Additionally, because social media uploads tend to compress files, these tactics lose efficacy when clips originate on Twitter or Facebook.
At a panel at South by Southwest in March, Matthew Stamm, an assistant professor at Drexel University, discussed the difficulty. “There’s a lot of image and video authentication techniques that exist but one thing at which they all fail is at social media,” he said.
These techniques look for “really minute digital signatures” that are embedded in video files, Stamm added. But when a video file is shared on social media, it’s shrunken down and compressed. These processes are “forensically destructive and wipe out tons of forensic traces,” he said.
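Stamm’s point can be illustrated with a toy model: treat the forensic signature as a faint nudge in a file’s pixel values, and recompression as coarse quantization that rounds the nudge away. The step size and values below are purely illustrative.

```python
def embed_trace(pixels, trace):
    """Nudge each pixel by +/-1 -- a stand-in for a faint forensic residue."""
    return [p + t for p, t in zip(pixels, trace)]


def quantize(pixels, step=8):
    """Round pixel values to multiples of `step`, as lossy re-encoding does."""
    return [round(p / step) * step for p in pixels]


original = [16, 48, 96, 160, 224]
trace = [1, -1, 1, 1, -1]
tampered = embed_trace(original, trace)

# Before recompression, the manipulated file differs from the original;
# after quantization, the difference is gone and the trace is unrecoverable.
detectable_before = tampered != original
detectable_after = quantize(tampered) != quantize(original)
```

In this model `detectable_before` is true and `detectable_after` is false: one pass of coarse quantization wipes out the low-order signal a forensic detector would need, which is the sense in which social media re-encoding is “forensically destructive.”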
In the end, we might just be forced to rely on our own BS detectors. But if the fake news epidemic of 2016 taught us anything, it’s that we are in for a rough ride.