Alexa and Siri can hear this hidden command. You can't.

Angelica Greene
May 14, 2018

The exploit makes use of ultrasonic sounds, inaudible to people, to attack the voice-recognition systems in popular digital assistants.
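As an illustration only, here is a minimal NumPy sketch of the principle behind such ultrasonic attacks: a voice command is amplitude-modulated onto a carrier far above human hearing (30 kHz is assumed here), and nonlinearity in a device's microphone can demodulate the envelope back into the audible band. The function name and parameters are hypothetical, not from any published attack code.

```python
import numpy as np

def modulate_ultrasonic(command, fs, carrier_hz=30_000.0):
    """Amplitude-modulate an audible command onto an ultrasonic carrier.

    `command` is a mono waveform in [-1, 1]. The output has energy only
    near the carrier and its sidebands, all above ~20 kHz, yet a
    microphone's nonlinearity can recover the audible envelope.
    """
    t = np.arange(len(command)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Standard AM: offset the command so the envelope stays positive.
    return (1.0 + 0.5 * command) * carrier

# Toy "command": a 1 kHz tone standing in for recorded speech.
fs = 96_000  # a high sample rate is needed to represent a 30 kHz carrier
t = np.arange(int(0.1 * fs)) / fs
command = np.sin(2 * np.pi * 1_000.0 * t)
signal = modulate_ultrasonic(command, fs)

# The spectrum peaks at the carrier, well above human hearing (~20 kHz).
spectrum = np.abs(np.fft.rfft(signal))
peak_hz = np.fft.rfftfreq(len(signal), 1 / fs)[np.argmax(spectrum)]
```

Playing `signal` through a tweeter capable of ultrasonic output would sound like silence to a bystander, which is what makes this class of attack hard to notice.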

Music can be doctored so that a speech-recognition system transcribes it as arbitrary commands, and human beings cannot hear the targeted attacks play out.

Unfortunately, "in the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online - simply with music playing over the radio."

According to a report by The New York Times, researchers in China and the United States have been testing how hidden commands, undetectable to the human ear, can be sent to Alexa, Google Assistant, and Siri.

The latest studies claim that the attack can be amplified and executed from as far away as 25 feet.

Researchers Nicholas Carlini and David Wagner spoke to The New York Times about how they embedded the secret message, "OK Google, browse to evil.com," in seemingly harmless sentences as well as in a short clip of Verdi's "Requiem."
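Carlini and Wagner's approach optimizes a small perturbation so that a speech-recognition model transcribes the altered audio as the attacker's phrase while the change stays too small to notice. Their real attack differentiates through a full neural speech-to-text system; the toy sketch below runs the same kind of optimization loop against a stand-in linear "recognizer," so every name and number in it is an assumption for illustration, not the published attack.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "recognizer": a fixed linear map from waveform to class scores.
# (The real attack differentiates through a full speech-to-text network.)
W = rng.normal(size=(3, 256)) * 0.1
audio = rng.normal(size=256)   # the benign carrier audio
target = 2                     # the transcription class the attacker wants

def scores(x):
    return W @ x

def attack(audio, target, steps=500, lr=0.05, c=0.01):
    """Gradient descent on: cross-entropy to target + c * ||delta||^2."""
    delta = np.zeros_like(audio)
    onehot = np.eye(W.shape[0])[target]
    for _ in range(steps):
        s = scores(audio + delta)
        p = np.exp(s - s.max())
        p /= p.sum()                       # softmax probabilities
        # Gradient w.r.t. the input, plus the perturbation-size penalty.
        grad = W.T @ (p - onehot) + 2 * c * delta
        delta -= lr * grad
    return delta

delta = attack(audio, target)
predicted = int(np.argmax(scores(audio + delta)))
perturbation_ratio = float(np.linalg.norm(delta) / np.linalg.norm(audio))
```

The penalty weight `c` captures the trade-off the article describes: the smaller the allowed perturbation, the less audible the attack, but the harder it is to force the target transcription.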




"We want to demonstrate that it's possible, and then hope that other people will say, 'Okay, this is possible, now let's try and fix it,'" Carlini added. But someone else might be secretly talking to these assistants, too.

A day after Google announced the addition of six new voices to its Artificial Intelligence (AI)-powered Assistant, users in the USA can now change the voices of their assistants, a media report said. Yet despite a heavy focus on AI and voice at its yearly I/O developer conference this week, Google's keynote presentation hardly mentioned security at all.

It might not be your dodgy accent to blame after all: a group of researchers has found ways to manipulate the likes of Siri and Alexa using white noise and a series of commands not audible to us puny humans.

Google points to existing safeguards: "For example, users can enable Voice Match, which is created to prevent the Assistant from responding to requests relating to actions such as shopping, accessing personal information, and similarly sensitive actions unless the device recognizes the user's voice." On Amazon's side, a separate vulnerability, which left the Alexa assistant active even after a session had ended, was fixed after the company received the researchers' report. Voice Match has limits, though: it wouldn't stop an attacker from messing with your smart home gadgets or sending a message to someone.

This is not the first instance of voice AI being vulnerable. Are hackers already using this method to victimize users, and how can you protect yourself against it?

Most users in Canada will need to change their assistant's voice from Canadian English to American English for these options to show up.

In response to the report, Amazon told the newspaper that it has taken steps to ensure its Echo smart speaker is secure, while Google said its Assistant has features to mitigate undetectable audio commands.
