Alexa and Siri can be controlled using ‘silent’ messages hidden in music, researchers discover
Popular voice assistants like Alexa and Siri can be controlled using 'silent' commands hidden in music.
In the hands of hackers, the tricks could be used to steal money from your bank account and unlock smart locks to enter your home, the researchers who discovered the exploits told The New York Times.
These subliminal messages are undetectable to the human ear and could be used to buy things on Amazon and elsewhere online using your payment details.
Smart speakers are quickly becoming one of the must-have home gadgets, with almost half of UK households expected to own such a device by 2022 (up from 10 per cent today), according to consulting firm OC&C.
Gadgets like the Amazon Echo, Google Home and Apple HomePod allow users to control their speakers using voice commands.
With Amazon's Alexa digital assistant (which is built into the Echo speaker), you can play music, shop on Amazon, ask about the weather, order a pizza, and even make phone calls.
But new research suggests that subliminal messages embedded into audio recordings can instruct these bots to carry out devious tasks.
So while you may think you're listening to your fave tune, your Amazon Echo might hear a command ordering it to add something to your shopping list.
Audio snippets have tricked Alexa into making online purchases in the past.
An Echo owner previously filed an unsuccessful complaint with the UK's Advertising Standards Authority alleging his device was fooled into ordering cat food online by an Amazon ad.
Consumer watchdogs have also accused the tech of spying, claiming it listens in on users' conversations.
For its part, Amazon claims the Echo only records what you say after you use the wake word "Alexa" to prompt the bot to carry out a task.
But a patent filed by Amazon in June last year revealed a terrifying future concept that would allow the device to constantly snoop on your chats and phone calls.
The latest revelations build on a previous study that demonstrated how hackers could hide commands in white noise played over loudspeakers to get smart devices to open websites.
This time round, the researchers made subtle tweaks to audio files to cancel out the sound that Alexa's speech-recognition system was supposed to hear. In its place, they inserted a sound that the bot would understand differently, while going undetected by humans.
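For the technically curious, the sketch below illustrates the general idea behind such attacks, not the researchers' actual method: a tiny perturbation is optimised by gradient descent so that a toy speech model "hears" a chosen command, while the change is clipped to stay below an assumed audibility threshold. The model, the epsilon bound and all names here are illustrative assumptions.

```python
# Illustrative sketch of an adversarial audio perturbation (toy model,
# NOT the researchers' actual system). All names and values are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in for a speech-recognition model: maps 1 second of 16 kHz
# audio to scores over a small vocabulary of commands.
class ToySpeechModel(nn.Module):
    def __init__(self, n_commands=8, n_samples=16000):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_samples, 64), nn.ReLU(), nn.Linear(64, n_commands)
        )

    def forward(self, waveform):
        return self.net(waveform)

model = ToySpeechModel()
model.eval()

song = torch.randn(1, 16000) * 0.1   # stand-in for the original music clip
target_command = torch.tensor([3])    # e.g. the index of "unlock the door"
epsilon = 0.005                       # assumed bound on an inaudible tweak

# Optimise a small additive perturbation so the model "hears" the target
# command, while clipping it so a human listener (ideally) would not.
delta = torch.zeros_like(song, requires_grad=True)
optimizer = torch.optim.Adam([delta], lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    optimizer.zero_grad()
    logits = model(song + delta)
    loss = loss_fn(logits, target_command)
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        delta.clamp_(-epsilon, epsilon)  # keep the change below the threshold

adversarial_audio = song + delta
print("model now hears command:", model(adversarial_audio).argmax(dim=1).item())
```

Real attacks of this kind reportedly rely on far more sophisticated psychoacoustic models to decide what counts as inaudible, but the optimisation loop above captures the core trick: the machine and the human end up hearing two different things.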
In response to the findings, Amazon said that it's taken steps to keep its Echo speakers secure. But it didn't specify what these measures were.
Apple said that its HomePod smart speaker is designed to prevent commands from doing things like unlocking doors.
It also noted that iPhones and iPads must be unlocked before Siri will act on commands that access sensitive data or open apps and websites, among other measures.
And Google claimed that its Assistant has features to mitigate undetectable audio commands.