Hey, Alexa, stop listening to everything I say

November 23, 2018
Contact: Laurel Thomas ltgnagey@umich.edu

ANN ARBOR—They look to be one of the top Black Friday bargains this year—those smart speakers like Alexa and Google Home that can give you the weather forecast, turn on your favorite tunes or arm the alarm on your house.

If you haven’t adopted one yet or don’t have one on the holiday list, it’s likely you just don’t see the value of having one. Or, you don’t trust the devices and the companies that make them to keep your personal business private.

If you already are a user or have asked Santa to bring you one this year, chances are you are excited about how handy a voice-controlled smart speaker is and are willing to give up some privacy in your home for the convenience.

These are among the findings of recent research from the University of Michigan School of Information looking at privacy perceptions, concerns and privacy-seeking behaviors with smart speakers.
Florian Schaub, assistant professor and senior author of the study, said some of those who haven’t adopted smart speakers don’t believe they’ll have much use for them, but about half of those interviewed were worried about privacy.

Perhaps of more concern, he said, were the sentiments and behaviors of those who have adopted them but were unconcerned about protecting themselves and were not using the privacy features.

“We found that people were resigned to giving up their privacy, and they rationalize this choice by saying it’s just a little more data that Google is getting or Amazon is getting about me,” Schaub said. “I find that really concerning. It shows a dangerous and creeping erosion of privacy and privacy protections. These technologies are slowly chipping away at people’s privacy expectations.

“Current privacy controls are just not meeting people’s needs,” he said.

And it’s not just what the companies that make the devices do with the data that is at issue. The technology also opens users to attacks from hackers, who would be unlikely to target an individual but might infect speakers and similar devices with malware, as they have done with home routers, smart TVs and other devices.

Schaub also notes potential interest by law enforcement. Just last week a judge ordered Amazon to turn over recordings from an Echo smart speaker that may end up being a “witness” to a New Hampshire murder.

For the study, the team conducted in-depth interviews with 34 participants, divided evenly between those who had adopted smart speakers and those who had chosen not to purchase one for reasons other than price.

In addition to finding out why people did or did not use a smart speaker, the team also asked if users knew about some of the built-in functions that could protect privacy.

For instance, some users knew there was an app that would allow them to go in and delete recorded information, but the researchers found a number of smart-speaker owners used this function instead to take advantage of the recordings—some to capture fun interactions among their children, others to check up on babysitters or housesitters.

Many users knew there was a mute button on the devices, although some were confused and thought it was for muting the speaker, not the microphone. Others who were aware of the feature didn’t think it was very practical to have to press a button to mute a voice-commanded device.

Instead, they relied on the fact that the device records only when they say “Alexa” or “Hey, Google” to activate it. Although recording is only supposed to occur when the keyword is used, there have been accounts of false activations and accidental sharing of private conversations.

Another unknown is what companies are doing with data generated from the questions people ask the devices, which may reveal personal information about their lives—whether they have children, how healthy their food choices are, and what kind of music they like.

“The microphone is always on, so it requires a lot of trust in the company producing that smart speaker to handle your data responsibly,” said Josephine Lau, first author of the study.

She said companies that are serious about providing customers with privacy controls should consider integrating their privacy features into the devices’ voice commands.

“When asked how they could stop their speakers from listening, multiple current users attempted to address this by saying ‘Alexa, stop listening’ or ‘Alexa, stop recording,’ showing how users naturally assume that they would be able to engage with these privacy controls with their voice,” Lau said. “However, current speakers do not support this.

“This, coupled with other participants’ voiced desire for an incognito mode akin to ones commonly found on web browsers, proves that there is user demand for such functionality. For example, users should be able to ask, ‘Hey, Alexa, don’t listen for the next hour’ or ‘Alexa, forget what you heard in the last day.'”

For those considering buying a smart speaker this time of year, when holiday prices are at rock bottom, Schaub recommends that everyone in the family participate in a conversation about what it means to bring one into the home. He also said people should use the privacy-protection features the devices currently provide.

“When you bring a smart speaker into your home or office, you need to understand that you have introduced a device with a live microphone into probably your most intimate space,” he said.


More Information: