Ars Technica is (or rather, the researchers they’re reporting on are) quietly taking over Alexa and other smart-home devices with inaudible – and sometimes invisible – pulses of laser light that, for reasons not yet fully understood, are being interpreted as sound by the gadgets’ built-in microphones:
Shining a low-powered laser into these voice-activated systems allows attackers to inject commands of their choice from as far away as 360 feet (110m). Because voice-controlled systems often don’t require users to authenticate themselves, the attack can frequently be carried out without the need for a password or PIN. Even when the systems require authentication for certain actions, it may be feasible to brute-force the PIN, since many devices don’t limit the number of guesses a user can make. Among other things, light-based commands can be sent from one building to another and penetrate glass when a vulnerable device is kept near a closed window.
The attack exploits a vulnerability in microphones that use micro-electro-mechanical systems, or MEMS. The microscopic MEMS components of these microphones unintentionally respond to light as if it were sound. While the researchers tested only Siri, Alexa, Google Assistant, Facebook Portal, and a small number of tablets and phones, they believe all devices that use MEMS microphones are susceptible to Light Commands attacks.
“We find that VC systems are often lacking user authentication mechanisms, or if the mechanisms are present, they are incorrectly implemented (e.g., allowing for PIN bruteforcing),” the researchers wrote in a paper titled Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems. “We show how an attacker can use light-injected voice commands to unlock the target’s smart-lock protected front door, open garage doors, shop on e-commerce websites at the target’s expense, or even locate, unlock and start various vehicles (e.g., Tesla and Ford) if the vehicles are connected to the target’s Google account.”
One of the researchers’ attacks successfully injected a command through a glass window 230 feet away. In that experiment, a VC device was positioned next to a window on the fourth floor of a building, about 50 feet above the ground. The attacker’s laser was placed on a platform inside a nearby bell tower, located about 141 feet above ground level. The laser was then aimed at the Google Home device, which has only top-facing microphones.
In a different experiment, the researchers used a telephoto lens to focus the laser and successfully attacked a VC device 360 feet away. That distance was the maximum available in the test environment, suggesting that attacks over even longer distances are possible.
The findings, the researchers wrote, identify a “semantic gap between the physics and specification of MEMS (microelectro-mechanical systems) microphones, where such microphones unintentionally respond to light as if it was sound.” The researchers are still determining precisely what causes MEMS microphones to respond this way. The microphones convert sound into electrical signals, but as the research demonstrates, they also react to light aimed directly at them. By modulating the amplitude of the laser light, attackers can trick microphones into producing electrical signals as if they were receiving a specific audio sound, such as “Alexa, turn volume to zero” or “Siri, visit Ars Technica dot com.”
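The amplitude-modulation idea can be illustrated with a short sketch. This is not the researchers’ actual tooling, just a hedged, minimal model of the principle: an audio waveform (here a synthetic 440 Hz tone standing in for a spoken command) is mapped onto laser drive power, which must stay non-negative, so the signal rides on a DC bias. All names and parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

SAMPLE_RATE = 44_100  # audio samples per second (illustrative choice)

def am_modulate(audio, bias=0.5, depth=0.5):
    """Map an audio signal to normalized laser drive power in [0, 1].

    bias  -- DC operating point of the laser (light intensity can't go negative)
    depth -- modulation depth: how strongly the audio sways the intensity
    """
    audio = np.asarray(audio, dtype=float)
    peak = np.max(np.abs(audio)) or 1.0   # avoid division by zero on silence
    # Center the intensity at `bias` and swing it by depth*bias around that point.
    power = bias + depth * bias * (audio / peak)
    return np.clip(power, 0.0, 1.0)

# One second of a 440 Hz tone standing in for a voice command waveform.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 440 * t)
drive = am_modulate(tone)  # drive stays within [0, 1], averaging around the bias
```

In a real attack, `drive` would feed a laser driver, and the MEMS microphone’s unintended photosensitivity would demodulate the intensity variations back into an electrical signal resembling the original audio.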
You can read the Light Commands paper here.