Read More: New Scientist
Did you hear that? You might not have, but Alexa did. Voice assistants have been successfully hijacked using sounds above the range of human hearing. Once in, hackers were able to make phone calls, post on social media and disconnect wireless services, among other things. Assistants falling for the ploy included Amazon Alexa, Apple’s Siri, Google Now, Samsung S Voice, Microsoft Cortana and Huawei HiVoice, as well as some voice control systems used in cars.

The hack was created by Guoming Zhang, Chen Yan and their team at Zhejiang University in China. Using ultrasound, an inaudible command can wake the assistant, giving the attacker control of the speaker, smartphone or other device, as well as access to any connected systems.

“If all a voice assistant could do was set an alarm, play some music or tell jokes, then there wouldn’t be much of a security issue,” says Tavish Vaidya of Georgetown University in Washington DC. But voice assistants are connected to a growing number of services, from smart thermostats to internet banking, so any security breach is serious.

The attack works by converting the usual wake-up commands – “OK Google” or “Hey Siri” – into high-pitched analogues. When a voice assistant hears these sounds, it still recognises them as legitimate commands, even though they are imperceptible to the human ear. The team was then able to open a malicious website to download malware, start a video or voice call to spy on the device’s surroundings, send text messages and publish posts online.

The attacker would need to be near the target device to hack it – but it may be possible to play the commands from a hidden speaker as the target walks past. The required range varies from 2cm to 175cm, depending on the sensitivity of the microphone and the level of background noise.
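The article does not detail the signal processing involved, but attacks of this kind are commonly described as amplitude-modulating the spoken command onto an ultrasonic carrier: the microphone's non-linear response demodulates the envelope back into the audible band, so the assistant "hears" a command that humans cannot. The sketch below illustrates that idea under stated assumptions – the 25 kHz carrier, 96 kHz sample rate, and the toy 400 Hz "voice" tone are all illustrative, not details from the research.

```python
import numpy as np

SAMPLE_RATE = 96_000   # Hz; assumed playback hardware that supports ultrasonic output
CARRIER_HZ = 25_000    # illustrative carrier, above the ~20 kHz limit of human hearing


def modulate_ultrasonic(voice: np.ndarray,
                        sample_rate: int = SAMPLE_RATE,
                        carrier_hz: float = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband voice signal onto an ultrasonic carrier.

    Every spectral component of the result sits near the carrier, so the
    emitted sound is inaudible; a non-linear microphone can recover the
    envelope (the original command) in the audible band.
    """
    peak = float(np.max(np.abs(voice)))
    if peak > 0:
        voice = voice / peak  # keep modulation depth <= 1
    t = np.arange(len(voice)) / sample_rate
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    return (1.0 + voice) * carrier / 2.0


# Toy "voice": a 400 Hz tone standing in for a spoken wake-up command
duration = 0.05  # seconds
t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
voice = np.sin(2 * np.pi * 400 * t)
signal = modulate_ultrasonic(voice)

# Inspect the spectrum: the energy should cluster around 25 kHz,
# with essentially nothing left in the audible band below 20 kHz.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / SAMPLE_RATE)
peak_hz = freqs[np.argmax(spectrum)]
```

Because double-sideband AM with a full carrier leaves no baseband component, the 400 Hz tone itself never appears in the output spectrum – only the carrier and its sidebands at roughly 24.6 and 25.4 kHz do, which is what makes the transmission silent to a human ear.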