Smart speakers have long been linked to privacy issues and hacking concerns, but researchers have recently discovered an unexpected vulnerability in these gadgets: lasers. Normally, a user has to speak to a voice assistant - Amazon's Alexa, Apple's Siri or Google Assistant - to get it to perform a task.
A group of researchers at the University of Michigan and Japan's University of Electro-Communications claims the devices can also be commanded by shining a laser at them. The researchers found they could do this silently and from hundreds of feet away, as long as they had a line of sight to the device.
This could let an attacker target a smart speaker from outside a victim's house, making it do anything from playing music to opening a smart garage door to buying items online. According to Daniel Genkin, assistant professor at the University of Michigan, the sound of each command was encoded in the intensity of a light beam. The light hits the diaphragm built into the smart speaker's microphone, making it vibrate just as if someone had spoken that command.
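The encoding Genkin describes is, in effect, amplitude modulation: the audio waveform of a spoken command is mapped onto the brightness of the laser. A minimal sketch of that idea, assuming a hypothetical driver that accepts normalized intensity values (the function and parameter names here are illustrative, not from the researchers' tooling):

```python
import math

SAMPLE_RATE = 44_100  # audio samples per second

def audio_to_laser_intensity(samples, bias=0.5, depth=0.4):
    """Amplitude-modulate a laser: map audio samples in [-1, 1]
    onto light-intensity values in [bias - depth, bias + depth].

    The microphone's diaphragm responds to these intensity
    changes much as it would to sound-pressure waves.
    """
    return [bias + depth * s for s in samples]

# Example: 10 ms of a 440 Hz tone, standing in for the
# waveform of a spoken command.
tone = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE)
        for n in range(SAMPLE_RATE // 100)]

intensity = audio_to_laser_intensity(tone)
assert all(0.0 <= v <= 1.0 for v in intensity)
```

The bias term keeps the laser on at a constant baseline; the audio signal only wiggles the brightness around that baseline, which is what the microphone picks up.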
However, exploiting the vulnerability takes considerable effort from an attacker, who needs specialised equipment such as a laser pointer, a laser driver, a sound amplifier and a telephoto lens.
Devices that were tested and found vulnerable to such light commands include Google Home, Google Nest Cam IQ, multiple Amazon Echo, Echo Dot and Echo Show devices, Facebook's Portal Mini, the iPhone XR, and the sixth-generation iPad.
The computer science and electrical engineering researchers - Takeshi Sugawara at the University of Electro-Communications in Japan, and Kevin Fu, Daniel Genkin, Sara Rampazzi and Benjamin Cyr at the University of Michigan - released their findings in a paper this week. They have notified Tesla, Ford, Amazon, Apple and Google of the light vulnerability. In response, the companies said they were studying the findings.