How to hack Alexa: guidance from researchers
By now everyone knows how voice assistants work, right? You just talk to them and they do what you want. As simple as that. But a group of researchers has tried out a new technique and shown that you can also command them by directing laser light at smart speakers and other gadgets that run virtual assistants like Amazon's Alexa, Apple's Siri, and Google's Assistant.
The experiment was conducted by researchers from the University of Michigan and Japan's University of Electro-Communications. They found that as long as you have a line of sight to a smart gadget, you can control its actions regardless of distance (though it works best within a range of about 100 feet).
What does that mean? It proves that anyone with enough motivation and money can attack your virtual assistant from outside your house and make it do almost anything, including buying stuff from Amazon or opening your garage door.
The researchers then went even further and encoded commands in the light itself. By varying the intensity of the laser beam, they could deliver different commands to the virtual assistant without anyone saying a word. The key was the diaphragm built into the smart speaker's microphone, which the light caused to vibrate just as it would if someone were actually talking to it.
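To make the idea concrete, here is a minimal Python sketch of how an audio command could, in principle, be mapped onto a laser's intensity. The bias and swing currents, the test tone standing in for recorded speech, and the absence of any real driver interface are all illustrative assumptions; this is not the researchers' actual tooling.

```python
# Minimal sketch: amplitude-modulating a laser's intensity with an audio
# waveform. The bias/swing currents are hypothetical placeholders; real
# laser-driver interfaces vary by vendor and are omitted here.
import numpy as np

SAMPLE_RATE = 48_000   # audio samples per second
BIAS_MA = 200.0        # assumed DC operating current of the diode (mA)
SWING_MA = 150.0       # assumed peak modulation depth (mA)

def audio_to_current(audio: np.ndarray) -> np.ndarray:
    """Scale audio in [-1, 1] to a diode current waveform in mA,
    so the emitted light intensity tracks the audio signal."""
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_MA + SWING_MA * audio

# Placeholder "voice command": a 1 kHz test tone standing in for speech.
t = np.linspace(0.0, 1.0, SAMPLE_RATE, endpoint=False)
tone = 0.8 * np.sin(2.0 * np.pi * 1000.0 * t)

current = audio_to_current(tone)
print(f"driving current spans {current.min():.1f} to {current.max():.1f} mA")
```

Because the MEMS diaphragm responds to these intensity changes as if they were sound pressure, the assistant "hears" the command even though no sound was ever made.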
The researchers tested several devices: the Google Home, the Google Nest Cam IQ, multiple Amazon Echo, Echo Dot, and Echo Show devices, Facebook's Portal Mini, the iPhone XR, and the sixth-generation iPad. It turned out that all of them are vulnerable to such light commands.
The Apple devices, though, were a hard nut to crack: the researchers had to put real effort into getting around their security settings. The other devices on the list don't typically ship with user authentication features turned on by default.
Why is this relevant to us?
After the results of the research were published, consumers and companies alike started worrying about the safety of buying, and respectively selling, voice assistant devices. Before that, everybody had perceived them as fun toys: absolutely harmless and incapable of causing any serious consequences. That perception is part of why the market for assistant-enabled smart speakers has ballooned over the past five years.
According to published data, about 26.1 million smart speakers were shipped in the second quarter of 2019 alone. Amazon shipped a quarter of them, 6.6 million units, between April and June 2019.
Anyone wanting to hack someone's voice assistant device can do so for under $400. The calculation is easy: a $20 laser pointer, a $339 laser driver, and a $28 sound amplifier, for $387 in total.
One of the researchers from the University of Michigan, Benjamin Cyr, said:
“If you have a laser that can shine through windows and across long distances — without even alerting anyone in the house that you’re hitting the smart speaker — there’s a big threat in being able to do things a smart speaker can do without permission of the owner.”
The research showed that the Google Home and the first-generation Echo Plus are the easiest to command: they accepted a signal from the longest distance, 110 meters.
Safety measures
Even though the researchers themselves stated that they haven't heard of this security issue being exploited in the wild, it is better to be safe than sorry, right?
The best thing you can do to protect yourself and your home devices is to make sure that no one outside the house can see your smart speaker.
What the device manufacturers should do, though, is redesign the MEMS microphones built into these devices. That, however, is a rather long and complicated process.
One of the researchers said that he has already contacted Google, Apple, Amazon, and the other affected companies to notify them about the security issue.