Siri, Google Assistant and Amazon Echo can be hacked using lasers; Here's how
These attacks called 'Light Commands' can be misused to unlock doors, shop on websites and start vehicles.
A newly discovered flaw allows voice assistants such as Amazon Echo to respond to commands encoded in laser light, reacting to signals that are completely inaudible to the human ear. These commands can be misused to unlock doors, shop on websites and start vehicles.
These attacks, called ‘Light Commands’, were discovered by researchers from Tokyo’s University of Electro-Communications and the University of Michigan. The attacks inject commands into target microphones from as far as 110 meters away, exploiting a vulnerability in MEMS microphones that causes them to respond to light as if it were sound.
“By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio,” the researchers pointed out.
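The core idea is simple amplitude modulation: an audio waveform is used to vary the intensity of a laser beam around a constant bias, and the microphone converts those intensity variations back into an electrical "audio" signal. The sketch below illustrates only that modulation step; the function name and parameters are illustrative and are not taken from the researchers' actual tooling.

```python
import numpy as np

def amplitude_modulate(audio: np.ndarray, dc_bias: float = 0.5,
                       depth: float = 0.4) -> np.ndarray:
    """Map an audio waveform in [-1, 1] to a laser intensity signal.

    The laser is driven at a constant DC bias, and the audio signal
    modulates the intensity around that bias. Light intensity can
    never be negative, so the result is clipped at zero.
    """
    audio = np.clip(audio, -1.0, 1.0)
    intensity = dc_bias + depth * audio
    return np.clip(intensity, 0.0, None)

# A 1 kHz test tone sampled at 44.1 kHz, standing in for a voice command.
sample_rate = 44_100
t = np.arange(sample_rate) / sample_rate
tone = np.sin(2 * np.pi * 1000 * t)

intensity = amplitude_modulate(tone)
```

A MEMS microphone hit by this beam would produce an electrical signal tracking `intensity`, which the assistant's speech recognizer then interprets as an ordinary spoken command.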
While clever, the attack isn’t easy to pull off. It fails when the target microphone is not in the laser’s direct line of sight, or when the device is protected by a lock screen, voice recognition or similar safeguards.
Moreover, the required equipment is expensive to assemble: a potential attacker needs a laser pointer, a laser driver, a sound amplifier and a good telephoto lens.