

Siri, Google Assistant and Amazon Echo can be hacked using lasers; Here's how

THE ASIAN AGE.
Published : Nov 6, 2019, 12:46 pm IST
Updated : Nov 6, 2019, 12:46 pm IST

These attacks called ‘Light Commands’ can be misused to unlock doors, shop on websites and start vehicles.

While clever, it isn’t easy for attackers to successfully pull off. (Photo: 9to5Google)

A newly discovered flaw allows Amazon Echo to pick up commands delivered by laser beam, registering signals that are completely inaudible to the human ear. These can be misused to unlock doors, shop on websites and start vehicles.

These attacks, called ‘Light Commands’, were discovered by researchers from Tokyo’s University of Electro-Communications and the University of Michigan. The attacks inject commands into target microphones using laser light from as far as 110 meters away, taking advantage of a vulnerability that makes MEMS microphones respond to light as if it were sound.

“By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio,” the researchers pointed out.
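To make the mechanism concrete, the sketch below shows what that modulation amounts to in principle: the spoken command is used to vary a laser diode’s drive current, so the light intensity follows the audio waveform. This is only an illustration of the idea described by the researchers; the bias and swing values are assumed for the example, not figures from the study.

```python
# Illustrative sketch (not from the research paper): amplitude-modulating a
# laser's intensity with an audio command, as in the 'Light Commands' attack.
# The current values below are assumptions chosen for the example.
import numpy as np

def audio_to_laser_drive(audio, bias_ma=200.0, swing_ma=150.0):
    """Map an audio waveform to a laser-diode drive current.

    audio    : 1-D array of samples in [-1.0, 1.0] (the spoken command)
    bias_ma  : DC bias (mA) that keeps the diode lasing (assumed value)
    swing_ma : peak current swing (mA) around the bias (assumed value)
    """
    audio = np.clip(np.asarray(audio, dtype=float), -1.0, 1.0)
    # The emitted light intensity follows the drive current, so the MEMS
    # microphone on the target device responds as if it heard the audio.
    return bias_ma + swing_ma * audio

# Example: a 1 kHz test tone sampled at 44.1 kHz standing in for a voice command.
t = np.linspace(0, 1, 44100, endpoint=False)
drive_current = audio_to_laser_drive(0.8 * np.sin(2 * np.pi * 1000 * t))
```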


While clever, the attack isn’t easy for attackers to successfully pull off. It fails when the target microphone is not in the laser’s direct line of sight, or when safeguards such as device locking or trusted voice recognition are in place.

Moreover, simply setting up the required equipment will cost a potential attacker a fair amount of money: the process calls for a laser pointer, a laser driver, a sound amplifier and a good telephoto lens.

Tags: light commands, siri, google assistant, amazon echo