Now more than ever, a certain level of sophistication is needed to navigate the various sources of information on the internet. Compounding the issue is the rise of the “deepfake”: extraordinarily realistic-looking videos that often feature public figures doing and saying things they never actually did. So far as we know, their uses have been relatively benign, but the potential for abuse is alarming, to say the least. Here’s everything you need to know to spot a deepfake video.
I've gone down a black hole of the latest DeepFakes and this mashup of Steve Buscemi and Jennifer Lawrence is a sight to behold pic.twitter.com/sWnU8SmAcz — Mikael Thalen (@MikaelThalen) January 29, 2019
What is a deepfake?
A deepfake is a video created using a type of machine learning that picks up the facial movements of one person and grafts them onto another person making similar gestures. The term comes from the Reddit user “deepfakes,” who popularized the practice.
Private citizens have less to worry about, because these algorithms require loads of material from a subject to replicate them with accuracy. But the technology has already been used to create fake celebrity porn, and a fabricated statement or deed attributed to a candidate could sway an election, affecting a great many people in the process.
Members of Congress have recognized the danger. In September 2018, U.S. Reps. Adam Schiff (D-Calif.), Stephanie Murphy (D-Fla.), and Carlos Curbelo (R-Fla.) sent a letter to the Director of National Intelligence requesting a report to Congress on “the implications of new technologies that allow malicious actors to fabricate audio, video, and still images.”
How to spot a deepfake video
Some of the prominent examples of deepfakes that have made the rounds are not without their giveaways to the discerning eye. Obviously, President Donald Trump doesn’t have Mr. Bean’s eyes.
Still, it helps to have a set of guidelines for those who don’t watch online videos as a career.
One such tell, picked up on by Siwei Lyu, a professor at University at Albany, is blinking.
Because the machine learning depends on the availability of images of a public figure—and famous people are seldom photographed with their eyes shut—deepfakes struggle to depict convincing images of people with their eyes closed.
What’s more, subjects depicted in deepfake videos often blink far less often than humans do in real life. According to Lyu, this method has a 95 percent detection rate. That said, the technology behind deepfakes, and the sophistication of those who want to employ them, will only improve, so the fight against the forces of public manipulation will remain an uphill one.
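The blinking heuristic Lyu describes can be approximated in code. A minimal sketch, assuming per-frame eye landmarks are already available (for example, from an off-the-shelf 68-point face landmark detector), using the standard eye-aspect-ratio (EAR) measure:

```python
# Sketch of the blink-frequency check, assuming six (x, y) landmark
# points per eye are already extracted for each frame. The EAR drops
# toward 0 when the eye closes; the thresholds here are illustrative.
from math import dist

def eye_aspect_ratio(eye):
    """eye: six (x, y) points around one eye, ordered corner-top-top-
    corner-bottom-bottom. Returns the eye aspect ratio."""
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_per_frame, threshold=0.2, min_frames=2):
    """Count blinks: runs of at least `min_frames` consecutive frames
    where the EAR falls below `threshold`."""
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks
```

A real person blinks roughly 15 to 20 times per minute, so a suspiciously low count over a long clip is one signal that a video may be synthetic.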
Signs you are watching a deepfake video
Jonathan Hui, a writer who specializes in deep learning, pinpoints several telltale signs of a possible deepfake. After slowing a video down, viewers should look for the following:
- Blurring evident in the face but not elsewhere in the video
- A change of skin tone near the edge of the face
- Double chins, double eyebrows, or double edges to the face
- Whether the face gets blurry when it’s partially obscured by a hand or another object
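The face-blur tell above can also be checked programmatically. A toy sketch using variance of a Laplacian filter, a common sharpness proxy; the face-box coordinates are hypothetical and would come from a separate face detector:

```python
# Compare sharpness inside a (hypothetical) face bounding box against
# the whole frame. A ratio well below 1.0 suggests the face region is
# blurrier than its surroundings, one of the tells listed above.
import numpy as np

def laplacian_variance(gray):
    """Variance of a discrete Laplacian; higher means sharper.
    `gray` is a 2-D float array of pixel intensities."""
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return lap.var()

def face_blur_ratio(frame, face_box):
    """face_box: (x, y, width, height) of the detected face."""
    x, y, w, h = face_box
    face = frame[y:y + h, x:x + w]
    return laplacian_variance(face) / laplacian_variance(frame)
```

This is only a heuristic: lighting, focus, and video quality all affect sharpness, so a low ratio is a reason to look closer, not proof of a fake.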
Can artificial intelligence spot deepfakes?
Some are attempting to build safeguards against the coming proliferation of deepfakes, using artificial intelligence to combat forgeries online.
For example, the AI Foundation has developed Reality Defender, a program that runs alongside other online applications, identifying potentially fake media.
Researchers at Germany’s Technical University of Munich developed an algorithm called XceptionNet that purports to spot fake videos online so they can be flagged and removed. Of course, given the speed with which damning false reports can fly, it will take a lot of diligence from laypeople online to keep damage from being done before one of these programs can out a fake.
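The article doesn’t detail how such detectors are deployed, but a common pattern is to score each frame with a classifier and aggregate those scores into a clip-level decision. A hedged sketch of that aggregation step, with the frame-scoring model itself assumed:

```python
# Hypothetical clip-level flagging on top of a per-frame detector
# (such as XceptionNet). `frame_scores` are assumed probabilities,
# one per frame, that the frame is synthetic.

def flag_video(frame_scores, frame_threshold=0.5, clip_threshold=0.3):
    """Flag the clip if more than `clip_threshold` of its frames score
    above `frame_threshold`. Thresholds are illustrative."""
    if not frame_scores:
        return False
    fake_frames = sum(1 for s in frame_scores if s > frame_threshold)
    return fake_frames / len(frame_scores) > clip_threshold
```

Aggregating over many frames makes the decision more robust than trusting any single frame, since deepfake artifacts tend to come and go from frame to frame.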
Additionally, because social media uploads tend to compress files, these tactics lose efficacy when clips originate on Twitter or Facebook.
At a panel at South by Southwest in March, Matthew Stamm, an assistant professor at Drexel University, discussed the difficulty. “There’s a lot of image and video authentication techniques that exist but one thing at which they all fail is at social media,” he said.
These techniques look for “really minute digital signatures” that are embedded in video files, Stamm added. But when a video file is shared on social media, it’s shrunken down and compressed. These processes are “forensically destructive and wipe out tons of forensic traces,” he said.
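Stamm’s point can be illustrated with a toy model. A minimal sketch, assuming a deliberately flat image and a checkerboard stand-in for a forensic trace: a faint embedded signature is easy to measure in the original, but coarse quantization (a crude stand-in for lossy social-media recompression) wipes it out.

```python
# Toy demonstration: a faint periodic "forensic trace" survives exact
# storage but is erased by quantization. Values are illustrative.
import numpy as np

def add_trace(img, amplitude=0.5):
    """Embed a faint +/- `amplitude` checkerboard residual."""
    yy, xx = np.indices(img.shape)
    return img + amplitude * ((xx + yy) % 2 * 2 - 1)

def recompress(img, step=8):
    """Crude stand-in for lossy recompression: quantize intensities."""
    return np.round(img / step) * step

def trace_strength(img):
    """Correlate the image against the checkerboard pattern."""
    yy, xx = np.indices(img.shape)
    pattern = (xx + yy) % 2 * 2 - 1
    return abs((img * pattern).mean())

base = np.full((32, 32), 128.0)   # flat image, so the trace is the only signal
tagged = add_trace(base)
before = trace_strength(tagged)              # ~0.5: trace is measurable
after = trace_strength(recompress(tagged))   # ~0.0: quantization erased it
```

Real forensic signatures are subtler than a checkerboard, but the mechanism is the same: recompression rounds away exactly the low-amplitude detail those signatures live in.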
In the end, we might just be forced to rely on our own BS detectors. But if the fake news epidemic of 2016 taught us anything, it’s that we are in for a rough ride.