Lasers Can Take Over Voice Assistant Systems From Long Distances, Research Finds
By pointing a laser, or in some cases even a flashlight, into the microphone of a Google Home or a device running Siri or Alexa, the researchers were able to control the devices and the systems connected to them.
- By Haley Samsel
- Nov 06, 2019
Security researchers in Japan and at the University of Michigan discovered a startling flaw in voice-controlled assistant systems that revealed how easily devices like Siri, Alexa and Google Home could be manipulated.
In a paper published on Monday, cybersecurity experts shared details of how they were able to use readily available laser pointers, and in some cases flashlights, to take over Amazon, Google and Apple digital assistants from hundreds of feet away.
Some examples include opening a garage door by pointing a laser at a voice assistant connected to the system, and even climbing to the top of a bell tower at the University of Michigan to manipulate a Google Home in an office building 230 feet away, The New York Times reported.
The longest distance from which the researchers were able to control a voice assistant was more than 350 feet, showcasing a glaring vulnerability in the systems.
“This opens up an entirely new class of vulnerabilities,” Kevin Fu, a computer science professor at the University of Michigan, told the Times. “It’s difficult to know how many products are affected, because this is so basic.”
All companies affected by the issue, including Tesla, Ford, Amazon, Apple and Google, were alerted to the light-based vulnerability prior to the release of the paper. Each corporation said it was studying the issues detailed in the research.
Perhaps the most concerning aspect of the report is that by taking over the digital assistants, hackers would gain the ability to access and control any systems connected to a Google Home or similar product. The researchers pointed out that they could have unlocked or remotely started cars if those vehicles had been connected to the devices.
To fix the issue, most microphones on the systems would need to be redesigned, because covering the mic with a piece of tape does not address the problem. Even the dirt shields on several microphones were unable to block the lasers and the commands they carried, according to Fu.
There is no indication that lasers or flashlights have been used to carry out cyberattacks or takeovers of the devices, according to the researchers. As tech companies assess the problem, experts advise users of voice-controlled assistants to move their devices away from areas where they can be seen from outside and to limit the number of systems connected to them.
“This is the tip of the iceberg,” Fu said. “There is this wide gap between what computers are supposed to do and what they actually do. With the internet of things, they can do unadvertised behaviors, and this is just one example.”
About the Author
Haley Samsel is an Associate Content Editor for the Infrastructure Solutions Group at 1105 Media.