Researchers whisper inaudible commands to Alexa and Siri – with a laser

It's a scary idea: a smart speaker such as an Amazon Echo, or a smartphone, sits inconspicuously somewhere in the apartment. Suddenly the assistant responds to an inaudible command and opens the garage door, or orders something from a particular Amazon retailer. Researchers at the University of Michigan have now done exactly that with the help of a laser.

Several videos show the experiments. A device with a voice assistant is targeted with a modulated laser beam that carries inaudible commands. Suddenly the respective manufacturer's voice assistant reacts and carries out the command. The trick worked on all smart speakers tested, from Amazon Echo to Google Home. Even the smartphones and tablets they tested could be outsmarted, including fairly current devices such as the iPhone XR and the Samsung Galaxy S9.

Video: https://www.youtube.com/watch?v=EtzP-mCwNAs

The researchers themselves do not fully understand the hack

That the commands work via laser is due to the microphone technology used in almost all devices with small housings. Microelectromechanical systems, MEMS for short, allow a compact design with high audio quality. The newly discovered downside: they react not only to sound, but also to light. "We still don't fully understand the physics behind it," one of the researchers told Ars Technica. Their supposition: the light causes the membrane to move, and that movement is then registered as an audio signal.
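The principle can be illustrated with a minimal sketch, assuming the attack amplitude-modulates the laser's intensity with the audio signal (a common description of such light-injection attacks; the exact encoding used by the researchers may differ). Since the MEMS membrane responds to intensity variations, the assistant effectively "hears" the modulated light:

```python
import numpy as np

# Assumed parameters, for illustration only
fs = 16_000                                  # sample rate in Hz
t = np.arange(0, 0.01, 1 / fs)               # 10 ms of signal
audio = 0.5 * np.sin(2 * np.pi * 440 * t)    # stand-in for a spoken command

# Laser driver: a DC bias keeps the intensity positive (light cannot be
# negative); the audio waveform rides on top as amplitude modulation.
bias, depth = 1.0, 0.4
intensity = bias * (1 + depth * audio)

# The microphone picks up the intensity variations; removing the DC bias
# recovers a signal proportional to the original audio command.
recovered = (intensity - bias) / (bias * depth)

print(np.allclose(recovered, audio))  # prints True
```

The key point is that the membrane never needs to distinguish light from sound: any stimulus that moves it at audio frequencies ends up in the signal chain.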

The hack also works over longer distances: the researchers achieved an effect at up to 110 meters, and they suspect the range could be extended further. In several experiments they show how the discovery could be misused. For example, they open a garage door linked to a voice assistant from another building, through glass panes. Certain cars could also be unlocked this way. Other abuse scenarios they name are online shopping and the opening of doors.

Countermeasures possible

The device manufacturers are alarmed. Amazon and Google told Ars Technica that they have contacted the researchers and are now evaluating their findings. The researchers themselves already outline approaches to fixing the flaw. Since the laser must be aimed directly at one microphone, devices could require a signal on several microphones before a command is executed. In normal use this would almost always be the case, and the function could presumably be delivered via software update. Automatic voice recognition of the user is an additional hurdle; on iPhones and Android smartphones, for example, it is enabled automatically when the assistant is set up.
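The multi-microphone check described above can be sketched in a few lines. This is an illustrative assumption about how such a defense might be structured, not the manufacturers' actual implementation; the function name and thresholds are invented for the example:

```python
def accept_command(mic_levels, threshold=0.1, min_mics=2):
    """Accept a command only if enough microphones picked it up.

    A laser can hit only one microphone at a time, while a genuine
    voice reaches all microphones in the array at similar levels.
    """
    active = sum(level > threshold for level in mic_levels)
    return active >= min_mics

# A spoken command registers on every microphone in the array:
print(accept_command([0.8, 0.7, 0.75, 0.8]))  # True
# A laser spot excites exactly one microphone:
print(accept_command([0.9, 0.0, 0.0, 0.0]))   # False
```

As the article notes, normal speech would almost always pass this check, which is why it could plausibly ship as a software update rather than a hardware change.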

However, the researchers estimate that the danger of hackers using the trick in the wild is low. An attacker needs a suitable laser and must maintain an unbroken line of sight to the targeted device, aiming precisely at the microphone, which is not possible from many angles. Despite the silent commands, it is also difficult to stay undetected: unless infrared is used, the laser spot is visible on the device, and most voice assistants answer audibly. Unlike the command, the execution does not happen quietly.

Sources: the researchers' blog post, Ars Technica
