Researchers have discovered how to use lasers to control devices equipped with voice-activated smart assistants.
The trick was first uncovered last year after cybersecurity researcher Takeshi Sugawara pointed a laser at his iPad’s microphone and found that the device interpreted the light as sound.
Since then, Sugawara teamed up with researchers from the University of Michigan to see just how far the trick could be taken.
In a recently released paper, the group detailed how they used lasers to send voice commands to devices including the Amazon Echo, the Google Home, and the Facebook Portal. They were also able to control iOS and Android devices.
The attack works because a smart device's microphone responds to the laser's light as though it were sound. By modulating the laser's intensity, the team was able to mimic the frequencies of a human voice.
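The modulation idea can be sketched in a few lines. This is an illustrative simplification, not code from the paper: the function name, the bias and depth parameters, and the use of a simple linear amplitude modulation are all assumptions made for clarity. The key constraint it captures is that optical intensity cannot go negative, so the voice waveform must ride on a constant bias level.

```python
import numpy as np

SAMPLE_RATE = 16_000  # 16 kHz, a typical sampling rate for voice audio


def intensity_from_voice(voice, bias=0.5, depth=0.5):
    """Map a voice waveform in [-1, 1] to laser intensity in [0, 1].

    Hypothetical sketch: light intensity cannot be negative, so the
    audio signal rides on a DC bias. The microphone's diaphragm then
    picks up the intensity fluctuations as if they were sound waves.
    """
    voice = np.clip(np.asarray(voice, dtype=float), -1.0, 1.0)
    return np.clip(bias + depth * voice, 0.0, 1.0)


# Example: a 440 Hz tone standing in for a recorded voice command
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
voice = 0.8 * np.sin(2 * np.pi * 440 * t)
intensity = intensity_from_voice(voice)
```

Because the mapping is linear away from the clipping limits, the intensity fluctuations reaching the microphone preserve the frequencies of the original voice signal, which is why the device transcribes them as a spoken command.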
In one of their first tests, the researchers sent silent commands to 16 different voice-activated devices from as far away as 164 feet. In another test, that range was extended to 361 feet, although the only devices to respond were the Google Home and a first-generation Echo Plus.
Showcasing the trick’s versatility, the team used a laser to convince a Google Home that it had been asked to open a garage door.
The team also says the lasers, which can easily penetrate windows, can even be used to order a smart device to make online purchases.
Given the nature of the attack, a victim would only know they had been targeted if they noticed a laser dot on their device. But the researchers say an infrared laser, which is invisible to the naked eye, could be used instead.
While a smart device will often respond audibly when given a command, potentially alerting the victim that their device had been activated, the team notes that a hacker could first send a command to mute the device before carrying out further attacks.
Although the chances that the average person will be targeted by a laser-wielding hacker are extremely low, both Google and Amazon told Wired that they are taking the team’s research seriously.
It remains unclear, though, whether companies will be able to fix the vulnerability. The researchers suggest that a light-blocking shield around the microphone could be one solution.
For now, just don’t place your Amazon Echo or Google Home next to the living room window.
Mikael Thalen is a tech and security reporter based in Seattle, covering social media, data breaches, hackers, and more.