Researchers have discovered how to use lasers to silently issue commands to devices with voice-activated smart assistants.
The trick was first uncovered last year after cybersecurity researcher Takeshi Sugawara pointed a laser at his iPad’s microphone and found that the device interpreted the light as sound.
Since then, Sugawara teamed up with researchers from the University of Michigan to see just how far the trick could be taken.
In a recently released paper, the group detailed how they used lasers to send voice commands to devices including the Amazon Echo, the Google Home, and the Facebook Portal. They were also able to control iOS and Android devices.
The attack works because a smart device’s microphone responds to light much as it does to sound. By modulating a laser’s intensity, the team could make the microphone’s output match the waveform of a spoken command.
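The core idea, encoding an audio waveform as small variations in a laser’s brightness, is ordinary amplitude modulation. The sketch below is purely illustrative, not the researchers’ actual tooling: the bias and depth values are made-up placeholders for a laser driver’s operating point, and the signal is a simple test tone rather than a real voice command.

```python
import math

def am_modulate(audio, bias=0.5, depth=0.25):
    """Map audio samples in [-1, 1] to normalized laser drive levels.

    The laser's DC bias acts as the carrier; the audio waveform is
    encoded as intensity variations around that bias (amplitude
    modulation). Levels are clamped to the valid drive range [0, 1].
    The bias/depth numbers here are illustrative, not measured.
    """
    return [min(1.0, max(0.0, bias + depth * s)) for s in audio]

# A 1 kHz test tone (roughly in the band of human speech) at 48 kHz.
sample_rate = 48_000
tone = [math.sin(2 * math.pi * 1_000 * n / sample_rate)
        for n in range(sample_rate // 100)]

drive = am_modulate(tone)
```

A microphone struck by light modulated this way produces an electrical signal tracking the same waveform, which the assistant then processes as if it were speech.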
In one of their first tests, the researchers sent silent commands to 16 different voice-activated devices from as far away as 164 feet. In another test, that range was extended to 361 feet, although the only devices to respond were the Google Home and a first-generation Echo Plus.
Showcasing the trick’s versatility, the team showed how a laser could be used to convince a Google Home that it had been asked to open a garage door.
The team also says the lasers, which can easily penetrate windows, can even be used to order a smart device to make online purchases.
Given the nature of the attack, a victim would only know they had been targeted if they noticed a laser dot on their device. But the researchers say an infrared laser, which is invisible to the naked eye, could be used instead.
While a smart device will often respond audibly when given an order, potentially alerting a victim that their device had been activated, the team also notes that a hacker could send a command to mute the device before carrying out further attacks.
Although the chances that the average person will be targeted by a laser-wielding hacker are extremely low, both Google and Amazon told Wired that they are taking the team’s research seriously.
It remains uncertain, though, whether companies will be able to fix the vulnerability. The researchers have suggested that a light-blocking shield around the microphone could be one solution.
For now, just don’t place your Amazon Echo or Google Home next to the living room window.