Wake Words or Non-Wake Words – Smart Speakers Are Always Up


Around two-thirds of voice assistant users say their devices wake up to non-wake words. According to recent research conducted by Ruhr University Bochum and the Max Planck Institute for Security and Privacy, voice assistants wake up when they hear a diverse range of words, even when those words are not their wake words.

Apparently, there are more than a thousand terms that can trigger the assistants, apart from their actual wake words.

The research included testing on Siri, Alexa, Google Assistant, Microsoft's Cortana, Deutsche Telekom's voice assistant, and even voice assistants made by Chinese market giants such as Baidu and Xiaomi.

How was the research conducted?

The researchers placed the speakers in front of televisions and radios playing podcasts, news broadcasts, and recorded speeches.

All the words that managed to wake the voice assistants were noted, and more than a thousand such words were found. However, the researchers noticed that the triggering phrases were often very close to the actual wake words.

For example, Google Assistant woke up whenever it heard "OK, cool," and Cortana woke up when it heard "Montana." Alexa even woke up to "unacceptable," Echo to "tobacco," and Siri to "seriously."

Researcher Dorothea Kolossa explained in a statement that the devices are programmed in a forgiving manner because they are supposed to understand their humans, which is why they tend to wake up once too often.
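To illustrate the trade-off Kolossa describes, here is a minimal, hypothetical sketch in Python. It is not how any vendor actually implements wake-word detection (real assistants score audio with acoustic models rather than text similarity), and the wake word, phrases, and threshold below are invented purely for illustration. A "forgiving" acceptance threshold tolerates imperfect pronunciations of the real wake word, but it also lets phonetically similar phrases such as "OK, cool" slip through.

from difflib import SequenceMatcher

WAKE_WORD = "ok google"
# A low (forgiving) threshold tolerates sloppy pronunciations of the real
# wake word, but also accepts near-misses such as "ok cool".
ACCEPT_THRESHOLD = 0.7  # hypothetical value, for illustration only

def wake_score(heard: str, wake_word: str = WAKE_WORD) -> float:
    """Crude text-similarity stand-in for an acoustic wake-word score."""
    return SequenceMatcher(None, heard.lower(), wake_word).ratio()

def should_wake(heard: str) -> bool:
    return wake_score(heard) >= ACCEPT_THRESHOLD

for phrase in ["ok google", "ok gugel", "ok cool", "montana"]:
    print(f"{phrase!r}: score={wake_score(phrase):.2f}, wake={should_wake(phrase)}")

Raising the threshold would cut down on accidental wake-ups but make the device miss genuine ones, which is exactly the balance between usability and privacy the researchers point to.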

Breach of Privacy

However, the research did point to a serious breach of privacy. Some of the accidental trigger words would wake the speakers up, prompting them to record the audio and send it to the cloud.

Others would wake the speakers up without sending any information to the cloud. The researchers documented these accidental triggers in a paper titled "Unacceptable, where is my privacy?"

How frequently does a voice assistant wake up to a non-wake word?

According to a recent survey conducted in January, about two-thirds of voice assistant users said that their smart speakers wake up accidentally several times a day.

According to a study conducted by Northeastern University, smart speakers can wake up about 19 times a day and record up to 43 seconds of audio at a time.

The researchers tested an Apple HomePod, a Harman Kardon Invoke, and second- and third-generation Amazon Echo Dots, exposing the speakers to 125 hours of Netflix content and estimating from that data that smart speakers wake up to 19 times a day.

Why is this a cause for concern?

Users are questioning the security and privacy implications of this behavior, which they find unacceptable.

Many private conversations take place over the course of a day, and users worry about them being leaked to strangers. If smart speakers are waking up to non-wake words, they may well be recording those conversations and sending them to the cloud.

This is a serious privacy concern, especially during the pandemic, when people are advised to stay home and conversing with near and dear ones is one of the few ways to keep mental stress at bay.

If these conversations keep going to the cloud, who knows where they might end up?

What do the researchers have to say about this?

Researcher Thorsten Holz said that, from a privacy point of view, this behavior is completely unacceptable and outrageous; from an engineering point of view, however, it is forgivable.

It is understandable because such data is needed to improve the systems, but he said that manufacturers need to strike a balance between data protection and technical optimization.
