Alexa and Siri can hear this hidden command. You can't.

Angelica Greene
May 14, 2018

The exploit makes use of ultrasonic sounds to attack the voice recognition systems in popular digital assistants.
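The ultrasonic variant of the attack is generally described as amplitude-modulating an ordinary voice command onto a carrier above the range of human hearing; nonlinearities in a device's microphone hardware then demodulate the command back into the audible band. The sketch below is a minimal, hypothetical illustration of that modulation step only; the file names, carrier frequency, and modulation depth are assumptions, not details from the report.

```python
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000   # assumed ultrasonic carrier, above human hearing
OUT_RATE = 96_000     # output sample rate high enough to carry 25 kHz

# Hypothetical input: a plainly audible recorded voice command.
rate, command = wavfile.read("ok_google_command.wav")
command = command.astype(np.float64)
if command.ndim > 1:                  # mix stereo down to mono
    command = command.mean(axis=1)
command /= np.max(np.abs(command))    # normalize to [-1, 1]

# Resample the baseband command to the output rate.
t = np.arange(int(len(command) * OUT_RATE / rate)) / OUT_RATE
baseband = np.interp(t, np.arange(len(command)) / rate, command)

# Classic amplitude modulation: carrier * (1 + m * signal). Played through
# a speaker, the result sits entirely above ~20 kHz, so a human hears
# nothing, while microphone nonlinearity can recover the baseband command.
m = 0.8  # modulation depth (illustrative choice)
modulated = np.sin(2 * np.pi * CARRIER_HZ * t) * (1.0 + m * baseband)
modulated /= np.max(np.abs(modulated))

wavfile.write("ultrasonic_payload.wav", OUT_RATE,
              (modulated * 32767).astype(np.int16))
```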

Music can also be doctored so that a speech recognition system transcribes it as arbitrary speech, while human listeners cannot hear the targeted attack play out.

Unfortunately, "in the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online - simply with music playing over the radio".

According to a report in The New York Times, researchers in China and the U.S. have begun testing how hidden commands, undetectable to the human ear, can be sent to Alexa, Google Assistant, and Siri.

What's more, the latest studies claim that the attack can be amplified and executed even from a distance of 25 feet.

Researchers Nicholas Carlini and David Wagner told The New York Times how they embedded the secret message "OK Google, browse to evil.com" in seemingly harmless spoken sentences as well as in a short recording of Verdi's "Requiem".
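For intuition, this style of attack on a speech-to-text system is usually framed as an optimization: find a perturbation small enough to go unnoticed whose addition to the audio makes the recognizer output an attacker-chosen transcript. The following is a minimal sketch of that loop under stated assumptions; the tiny "ToyASR" network, the random placeholder audio, and the encoded target phrase are all illustrative stand-ins, not the researchers' actual code or model.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-in differentiable "recognizer": maps raw audio to per-frame
# character logits. Purely illustrative, not a real speech-to-text model.
class ToyASR(torch.nn.Module):
    def __init__(self, vocab=29):
        super().__init__()
        self.conv = torch.nn.Conv1d(1, 64, kernel_size=400, stride=160)
        self.head = torch.nn.Linear(64, vocab)

    def forward(self, wav):                            # wav: (batch, samples)
        h = torch.relu(self.conv(wav.unsqueeze(1)))    # (batch, 64, frames)
        return self.head(h.transpose(1, 2))            # (batch, frames, vocab)

model = ToyASR().eval()
audio = torch.randn(1, 16000)           # placeholder one-second clip
target = torch.randint(1, 29, (1, 12))  # assumed encoding of the target phrase

delta = torch.zeros_like(audio, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-3)

for step in range(500):
    logits = model(audio + delta)                               # (1, frames, vocab)
    log_probs = F.log_softmax(logits, dim=-1).transpose(0, 1)   # (frames, 1, vocab)
    # CTC loss pushes the perturbed audio toward the target transcript.
    loss_ctc = F.ctc_loss(
        log_probs,
        target,
        input_lengths=torch.tensor([log_probs.size(0)]),
        target_lengths=torch.tensor([target.size(1)]),
    )
    # Penalize the perturbation's peak amplitude so the audio still
    # sounds unchanged to a human listener.
    loss = loss_ctc + 0.1 * delta.abs().max()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The published attack works the same way at larger scale: it optimizes against a real recognizer (Mozilla's DeepSpeech in the paper) and bounds the perturbation's loudness in decibels so that listeners hear only the original audio.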

"We want to demonstrate that it's possible, and then hope that other people will say, "Okay this is possible, now let's try and fix it", Carlini added. But someone else might be secretly talking to them, too.

A day after Google announced the addition of six new voices to its Artificial Intelligence (AI)-powered Assistant, users in the U.S. can now change the voices of their assistants, a media report said. Yet despite a heavy focus on artificial intelligence and voice at its yearly I/O developer conference this week, Google's keynote presentation hardly mentioned security at all.

If your assistant seems to act on commands you never gave, it might not be your accent to blame: researchers have found ways to manipulate the likes of Siri and Alexa using white noise and a series of commands not audible to humans.

Google points to built-in safeguards: "For example, users can enable Voice Match, which is created to prevent the Assistant from responding to requests relating to actions such as shopping, accessing personal information, and similarly sensitive actions unless the device recognizes the user's voice". Voice Match has limits, though; for instance, it wouldn't stop an attacker from messing with your smart home gadgets or sending a message to someone.

Amazon, for its part, has already fixed an earlier reported vulnerability, which left the Alexa assistant active even after a session had ended, after receiving the researchers' report.

This is not the first time voice AI has proved vulnerable. Are hackers already using this method to victimize users, and how can you protect yourself against it?

Most users in Canada will need to change their assistant's voice from Canadian English to American English for the new options to show up.

In response to the report, Amazon told the newspaper that it has taken steps to ensure its Echo smart speaker is secure, while Google said its Assistant has features to mitigate undetectable audio commands.
