Hackers can silently control your Google Home, Alexa, and Siri
A group of cybersecurity researchers has discovered a clever technique for remotely injecting commands into voice-controlled devices, all simply by shining a laser at the targeted device instead of using spoken words.
Dubbed 'Light Commands', the hack relies on a flaw in the MEMS microphones embedded in widely used voice-controllable systems, which unintentionally respond to light as if it were sound.
According to experiments conducted by a team of researchers from Japanese and Michigan universities, a remote attacker standing several meters away from a device can covertly trigger the attack simply by modulating the amplitude of the laser light to produce an acoustic pressure wave.
"By adjusting the electrical signal according to the intensity of the beam, an attacker could trick microphones into generating electrical signals as if they were receiving real sound," the researchers said in their paper. PDF ].
Sounds scary? Now read this part carefully.
Smart voice assistants in phones, tablets, and other smart devices, such as Google Home and Nest Cam IQ, Amazon Alexa and Echo, Facebook Portal, and Apple Siri devices, are all vulnerable to this new light-based signal injection attack.
"As such, any system that uses the MEMS microphone and operates on this data without additional user confirmation may be vulnerable to attack," the researchers said.
Because this technique ultimately lets an attacker inject commands as if they were a legitimate user, the impact of such an attack depends on the level of access your voice assistant has to other devices or connected services.
Therefore, with light commands, an attacker can also hijack any digital smart systems attached to the targeted voice-controlled assistant, for example:
Control smart home switches,
Open smart garage doors,
Make online purchases,
Remotely unlock and start certain vehicles,
Unlock smart locks by stealthily brute-forcing the user's PIN (see the back-of-the-envelope sketch after this list).
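The brute-forcing point is worth a quick sanity check. The sketch below shows why enumerating a 4-digit PIN by voice is plausible when there is no lockout; the seconds-per-attempt figure is an assumption for illustration, not a measurement from the paper.

```python
# Back-of-the-envelope: brute-forcing a 4-digit PIN via spoken commands.
# The seconds-per-attempt figure is an assumption for illustration only.
PIN_DIGITS = 4
SECONDS_PER_SPOKEN_ATTEMPT = 5  # assumed time to speak and process one PIN

combinations = 10 ** PIN_DIGITS                      # 10,000 possible PINs
worst_case_hours = combinations * SECONDS_PER_SPOKEN_ATTEMPT / 3600
print(f"{combinations} PINs, worst case ~{worst_case_hours:.0f} hours")
# -> 10000 PINs, worst case ~14 hours (assuming no lockout or rate limiting)
```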
In one of their tests, the researchers simply injected the command "OK Google, open the garage door" into a Google Home by firing a laser beam at it, and successfully opened the garage door connected to it.
In a second experiment, the researchers successfully issued the same command, but this time from a separate building about 230 feet away from the targeted Google Home device, through a glass window.
Besides long-range attacks, the researchers also tested their attack against a variety of smartphones that use voice assistants, including the iPhone XR, Samsung Galaxy S9, and Google Pixel 2, but these worked only at short distances.
The maximum range of this attack depends on the power of the laser, the intensity of the light, and, of course, your aiming ability. Beyond that, physical barriers (e.g., windows) and the absorption of ultrasonic waves in the air can further reduce the range of the attack.
In addition, in cases where speaker recognition is enabled, an attacker can defeat the voice authentication feature by constructing the desired voice command from recordings of the relevant words spoken by the device's legitimate owner.
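The researchers do not publish splicing code, but the idea of assembling a command out of previously captured words can be sketched as follows; the word clips and helper functions here are hypothetical stand-ins.

```python
import numpy as np

# Hypothetical sketch: splicing captured recordings of the owner's words
# into a new command, so a speaker-verification check tuned to the owner's
# voice still passes. `captured_words` stands in for audio clips (NumPy
# arrays of samples) harvested from earlier recordings.

SAMPLE_RATE = 16_000

def fake_clip(seconds):
    """Placeholder for a real captured word; here just silence."""
    return np.zeros(int(SAMPLE_RATE * seconds))

captured_words = {
    "ok": fake_clip(0.4),
    "google": fake_clip(0.5),
    "open": fake_clip(0.4),
    "the": fake_clip(0.2),
    "garage": fake_clip(0.5),
    "door": fake_clip(0.4),
}

def splice_command(words, gap_s=0.05):
    """Concatenate word clips with short gaps into one command waveform."""
    gap = np.zeros(int(SAMPLE_RATE * gap_s))
    parts = []
    for w in words:
        parts.extend([captured_words[w], gap])
    return np.concatenate(parts)

command = splice_command(["ok", "google", "open", "the", "garage", "door"])
print(f"spliced command: {len(command) / SAMPLE_RATE:.2f} s of audio")
```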
According to the researchers, these attacks can be mounted "easily and cheaply" using a simple laser pointer (under $20), a laser driver ($339), and a sound amplifier ($28). For their setup, they also used a telephoto lens ($199.95) to focus the laser for long-range attacks, putting the whole rig at roughly $590.
How can you protect against these vulnerabilities?
To mitigate such attacks, software makers should offer users the option of adding an extra layer of authentication before commands are acted on.
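One way such an extra layer could look is a randomized spoken challenge before sensitive commands are executed: a remote laser attacker who cannot hear the device's reply cannot know what to inject. The flow below is an assumed design sketched in Python, not any vendor's actual implementation.

```python
import secrets

# Illustrative design: before executing a sensitive command, the assistant
# speaks a random challenge word and only proceeds if the user repeats it.
# A remote laser attacker who cannot hear the device's reply cannot know
# which word to inject. Names and flow here are assumptions, not a real API.

SENSITIVE = {"unlock door", "open garage", "buy item"}
CHALLENGE_WORDS = ["apple", "river", "candle", "orange", "window"]

def handle_command(command: str, listen_for_reply) -> str:
    if command not in SENSITIVE:
        return f"executed: {command}"
    challenge = secrets.choice(CHALLENGE_WORDS)
    print(f'assistant: "To confirm, please say the word: {challenge}"')
    reply = listen_for_reply()
    if reply.strip().lower() == challenge:
        return f"confirmed and executed: {command}"
    return "rejected: confirmation failed"

# A laser-injected command cannot answer the spoken challenge correctly:
print(handle_command("open garage", listen_for_reply=lambda: "wrong-word"))
```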
For now, the best and simplest defense is to keep your voice assistant devices physically out of the line of sight from outside, and to avoid giving them access to anything you don't want someone else to reach.