Light command injection attacks on smart assistant devices

Did you know that your voice-controlled smart assistants like Google Home, Alexa, Portal and Siri could open your door for someone else without ever asking you? In their defence, they heard the command to do so! Or, to be more precise, they saw it, in the form of a laser light.

People nowadays are using more and more IoT devices in their homes and offices to make their lives easier with as little physical interaction as possible.

"Alexa, turn off the lights."
"Ok Google, open my garage door."
"Hey Siri, call my Dad."

But everything comes with a drawback. Smart users are innocently exposing themselves to more threats that also require as little physical interaction as possible.

In a study by researchers from Japan and the University of Michigan, it was found that a laser of adjusted wavelength, modulated with an audio signal, can be used to remotely inject commands into voice-controlled (VC) systems. The researchers tested this on a number of VC devices, including smartphones and tablets, and found some troubling results.


Light command injection attack

It has been found that an ordinary laser pointer (5 mW) is enough to carry out a light command injection attack and gain full access to a target smart home device, and only slightly more power (around 60 mW) is needed to gain the same access to smartphones and tablets. All of the attack equipment is cheaply available to anyone online.


Attackers can focus the laser on the target device with a telephoto lens from a distance of up to 100 metres. It is important to note that this distance was limited only by the conditions of the test, not by the attack itself. A focused laser also passes easily through closed glass windows, so devices can be hit from outside a building.

Threat
The lack of user authentication means attackers can gain access to home appliances and perform tasks like opening a garage door, starting a vehicle (if it is connected to the voice assistant), or buying something online. From single command execution to full control over the compromised device, the implications of the attack can be serious.

The reason behind it
MEMS (micro-electro-mechanical system) microphones respond to light just as they do to sound. This vulnerability, present in many popular voice control systems in use today, can be exploited by attackers injecting a carefully modulated laser light that the device interprets as a voice command.
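In essence, the attack amplitude-modulates the laser's intensity with the audio waveform of the spoken command, and the MEMS microphone converts that fluctuating light back into the same waveform. As a rough illustration only (the function name, DC bias and modulation depth below are hypothetical, not figures from the study), the modulated intensity could be computed like this:

```python
import numpy as np

def modulate_command(audio: np.ndarray, dc_bias: float = 0.5,
                     depth: float = 0.4) -> np.ndarray:
    """Amplitude-modulate a laser intensity signal with an audio command.

    audio   -- command waveform normalised to the range [-1, 1]
    dc_bias -- constant laser intensity (carrier offset); hypothetical value
    depth   -- modulation depth controlling how strongly the audio varies
               the intensity; hypothetical value
    Returns the instantaneous laser intensity, clipped to stay non-negative.
    """
    intensity = dc_bias + depth * audio        # I(t) = I_dc + k * s(t)
    return np.clip(intensity, 0.0, 1.0)        # a laser cannot emit negative power

# Example: a 1 kHz test tone sampled at 16 kHz standing in for a spoken command
t = np.linspace(0, 1, 16000, endpoint=False)
tone = np.sin(2 * np.pi * 1000 * t)
laser_drive = modulate_command(tone)
```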

Undetected attacks
Because no audible sound is used, the victim has no reason to suspect a laser attack is under way. What's more, the attack can be made almost impossible to detect by enabling whisper mode (available on Alexa, for example), so the user cannot even hear that an unauthenticated command is being executed.

Brute forcing PINs
Some smart home devices have weak security protocols: PIN lengths are not enforced strictly, and attempts are not limited. In their tests, the researchers were able to take their time and brute force PINs to access important connected devices.
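To see why the missing rate limits matter, here is a back-of-the-envelope sketch (the attempt rate is an assumption for illustration, not a figure from the study) of how long exhausting a 4-digit PIN space would take:

```python
def brute_force_hours(pin_length: int = 4, seconds_per_attempt: float = 5.0) -> float:
    """Worst-case time (in hours) to try every numeric PIN of the given length,
    assuming the device never locks out or throttles attempts."""
    combinations = 10 ** pin_length            # e.g. 10,000 for a 4-digit PIN
    return combinations * seconds_per_attempt / 3600

# With no lockout, a 4-digit PIN falls in under 14 hours even at one
# spoken attempt every 5 seconds; a lockout after a few tries breaks this.
print(f"{brute_force_hours():.1f} hours")
```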

Vulnerable devices
Almost all devices that use a MEMS microphone and execute commands without any further authentication are vulnerable.


What can be done?

While command injection attacks on voice-controlled smart devices (such as high-frequency sound and other acoustic signal attacks) are not a new discovery, light commands are certainly a shocking revelation. To protect vulnerable devices, companies could impose various software and hardware restrictions to block such light-based input from reaching the appliance. But these countermeasures may not be enough to provide absolute protection against a highly proficient laser injection attack.
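One software-side defence is sensor fusion: a laser spot illuminates only one microphone, whereas real speech reaches all of them, so a device could reject commands that are not heard consistently across its microphone array. Below is a minimal sketch of that idea (the function name, threshold and correlation check are hypothetical illustrations, not any vendor's actual implementation):

```python
import numpy as np

def command_heard_by_all(mic_signals: list[np.ndarray],
                         min_correlation: float = 0.5) -> bool:
    """Return True only if every microphone picked up a similar signal.

    mic_signals     -- one recorded waveform per microphone in the array
    min_correlation -- hypothetical threshold; real speech should correlate
                       strongly across mics, a laser spot on one mic will not
    """
    reference = mic_signals[0]
    for signal in mic_signals[1:]:
        corr = np.corrcoef(reference, signal)[0, 1]
        if np.isnan(corr) or corr < min_correlation:
            return False                       # inconsistent -> likely injected
    return True

# A device would only pass a command to the speech recogniser when
# command_heard_by_all(...) returns True.
```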

For now, all you can do is limit what you allow such devices to access. Better to go and open the garage door yourself, right? Read the full study by the researchers to learn the technical details of these attacks.

Tell us what you think, and share this with anyone who is a little too dependent on Alexa and Google Home for their chores.

Stay tuned, stay safe.
