Alexa, Siri, Cortana vulnerable to "silent hacking"

Inaudible voice commands could be used to take over virtual assistants and smart speakers, researchers find

Smart speakers from some of the industry's leading companies have been found to be vulnerable to an exploit that allows hackers to take control of a device by issuing silent commands.

Researchers at Zhejiang University in China recently demonstrated a technique dubbed "DolphinAttack", which involves modulating voice commands so that they are inaudible to human ears but can still be picked up and acted on by a speaker.

The microphones built into some of the most popular home speakers, including the Amazon Echo and Google Home, as well as common consumer devices such as tablets and laptops, are able to pick up these inaudible commands and carry them out, with the device appearing to act at random as far as its owner is concerned.

Specifically, the attack sends commands at ultrasonic frequencies above 20,000Hz, which a human voice cannot produce and human ears cannot hear.

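The principle can be illustrated with a short, hedged sketch: amplitude-modulating an audible command onto a carrier above 20,000Hz. The carrier frequency, sample rate and input file below are illustrative assumptions rather than details from the study, but the maths is the standard AM shift that moves a baseband signal out of the audible range; a non-linear microphone front end can then demodulate it back into the band the assistant's speech recogniser expects.

```python
# Hedged sketch: shifting an audible voice command above 20kHz via amplitude
# modulation. The 25kHz carrier, 96kHz sample rate and "command.wav" input are
# assumptions for illustration, not the researchers' exact tooling.
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000     # ultrasonic carrier, above the ~20kHz limit of human hearing
TARGET_RATE = 96_000    # playback hardware must support a rate this high

rate, voice = wavfile.read("command.wav")   # hypothetical recording of the command
voice = voice.astype(np.float64)
if voice.ndim > 1:
    voice = voice.mean(axis=1)              # fold stereo down to mono
voice /= np.max(np.abs(voice))              # normalise to [-1, 1]

# Resample the command up to the target rate with simple linear interpolation.
t_old = np.arange(len(voice)) / rate
t_new = np.arange(0, t_old[-1], 1 / TARGET_RATE)
baseband = np.interp(t_new, t_old, voice)

# Classic AM: the audible signal rides on an ultrasonic carrier. With a 25kHz
# carrier, speech content below ~5kHz lands entirely above 20kHz in the output.
carrier = np.cos(2 * np.pi * CARRIER_HZ * t_new)
modulated = 0.5 * carrier + 0.5 * baseband * carrier

wavfile.write("ultrasonic_command.wav", TARGET_RATE,
              (modulated * 32767).astype(np.int16))
```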

In lab tests at the university, researchers were able to demonstrate the trick against smart assistants developed by Google, Amazon, Apple, Microsoft and Huawei. The tests allowed researchers not only to activate each device, but also to turn on airplane mode, open websites and, more importantly, interact with any smart home devices linked to the speaker.

Smart devices are becoming increasingly popular additions to homes, but the vulnerability raises concerns that appliances such as smart locks or connected thermostats could be silently controlled by a hacker.

Unfortunately, modulating a voice command and playing it back at ultrasonic frequencies can be done with hardware costing as little as $3, according to the researchers. However, for the attack to work, a hacker needs to be within six feet of the targeted device, and to go entirely undetected the device would need to be unlocked first; otherwise an audible tone would alert the user that a command had been received.

The researchers recommend that device manufacturers and developers ensure commands issued at frequencies a human voice cannot produce are ignored.

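One way to interpret that advice is a simple spectral check on incoming audio: if a meaningful share of a signal's energy sits above the range a human voice can produce, discard the command before it reaches the assistant. The 8,000Hz voice-band cutoff and 10% threshold below are illustrative assumptions, not figures from the research.

```python
# Hedged sketch of a software-side check: reject audio whose energy is
# concentrated at frequencies no human voice can produce. Thresholds are
# illustrative assumptions, not values from the study.
import numpy as np

def looks_like_human_speech(samples: np.ndarray, sample_rate: int,
                            voice_band_hz: float = 8_000,
                            max_out_of_band_ratio: float = 0.1) -> bool:
    """Return False if too much of the signal's energy lies above the voice band."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    total = spectrum.sum()
    if total == 0:
        return False                        # silence: nothing to act on
    out_of_band = spectrum[freqs > voice_band_hz].sum()
    return (out_of_band / total) <= max_out_of_band_ratio

# A pure 25kHz tone sampled at 96kHz should be rejected outright.
t = np.arange(0, 1.0, 1 / 96_000)
tone = np.sin(2 * np.pi * 25_000 * t)
print(looks_like_human_speech(tone, 96_000))    # -> False
```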