If you enjoy ordering your voice-activated gadgets to do things, don’t get too smug – researchers claim they have discovered the technology is very vulnerable to an alarmingly simple trick.
A team from China’s Zhejiang University says systems such as Apple’s Siri and Amazon’s Alexa can be commanded to do things with high-frequency voice commands that humans can’t hear.
According to the research team, just a few commands converted into high frequencies allowed them to take control of gadgets such as Amazon’s Echo, Apple’s iPhones and MacBooks, and Samsung’s Galaxy phones.
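To get a feel for the trick, here is a minimal sketch of the underlying idea: a voice command is amplitude-modulated onto an ultrasonic carrier (the published attack relies on microphone non-linearity to demodulate it back to an audible baseband signal). This is an illustrative toy, assuming NumPy, a stand-in tone for the spoken command, and a hypothetical 25 kHz carrier; it is not the researchers’ actual tool.

```python
import numpy as np

SAMPLE_RATE = 96_000  # Hz; high enough to represent an ultrasonic carrier
CARRIER_HZ = 25_000   # above the ~20 kHz upper limit of human hearing

def modulate_ultrasonic(voice: np.ndarray, sample_rate: int = SAMPLE_RATE,
                        carrier_hz: float = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier."""
    t = np.arange(len(voice)) / sample_rate
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Classic AM: offset the voice so the envelope stays non-negative.
    return (1.0 + voice) * carrier

# Stand-in for a spoken command: a 400 Hz tone, one second long.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
voice = 0.5 * np.sin(2 * np.pi * 400 * t)

signal = modulate_ultrasonic(voice)

# The modulated signal's energy now sits around the carrier frequency,
# so a loudspeaker playing it produces nothing a human would notice.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / SAMPLE_RATE)
peak_hz = freqs[np.argmax(spectrum)]
print(f"Dominant frequency: {peak_hz:.0f} Hz")
```

The point of the sketch is just that the dominant frequency of the broadcast signal lands well above what people can hear, while a microphone's imperfect electronics can recover the original command.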
And they didn’t just ask the likes of Siri silly things, like ‘Siri, why does all of Apple’s stuff cost so much?’ – they could also get the digital assistant to dial telephone numbers or open malicious websites (like Breitbart News, perhaps).
The sneaky technique has been dubbed the ‘dolphin attack’, as dolphins can hear ultrasonic frequencies too – though, fortunately, the peaceful sea creatures aren’t connected to the internet, as far as I know.
In a paper, the research team wrote: ‘Inaudible voice commands question the common design assumption that adversaries may at most try to manipulate a [voice assistant] vocally and can be detected by an alert user.’
Worryingly, you could be sauntering down the street staring at your phone to see how many likes your photograph of a trifle got on Facebook, while nearby a real-world cyber crook could be hovering, broadcasting inaudible commands to your mobile. They could then tell Siri or whoever to download malware onto your phone (or maybe just set your alarm for 3.30am for a laugh).
Unsettling stuff, but it has raised an interesting question: is it possible to command dogs to do things with high-frequency commands beyond human perception? That’s my weekend sorted.