Why every user needs a smart speaker security policy

HomePods and other devices may listen to what you say up to 19 times each day, a report claims.


Does your voice assistant wake up randomly when you are engaged in normal conversation, listening to the radio, or watching TV? You’re not alone, and this could have serious implications for enterprise security policy.

All things being equal (they’re not)

“Anyone who has used voice assistants knows that they accidentally wake up and record when the 'wake word' isn't spoken - for example, 'seriously' sounds like the wake word 'Siri' and often causes Apple's Siri-enabled devices to start listening," the Smart Speakers research study says.

In an ideal world, it wouldn’t matter. You’d say what you wanted to say and if your voice assistant accidentally woke in response to random conversation it wouldn’t matter, because no one else would ever know it happened.

This is not the world we’re in.

That's because snippets of accidentally gathered conversations are listened to by people we know nothing about as part of most voice assistant companies' "grading" process, in which what these systems do is checked and the software improved.

This caused some discussion last year, and most players in the space subsequently worked to mitigate the consequences of this system and give users a little more control. However, it seems plausible that some information still leaks.

Enterprise security 101

While it is true that some of the more egregious elements of this have been addressed, particularly by Apple, that’s not going to be a big enough commitment for enterprise security teams seeking to protect confidential information from being accidentally picked up by entities outside their control.

Preventing leaks like this must sit high on most security teams' priority lists.

Apple (at least) does promise that any recordings it keeps “are not associated” with your identity in the form of your Apple ID. All the same, in some cases, stripping the identity from a statement may still not be secure enough, particularly as criminals put more effort into breaking into Echo, Google and HomePod devices.

How many accidents take place each day?

How big a threat is this? It may be greater than many think. The Smart Speakers research study conducted over six months by researchers at Northeastern University and Imperial College London provides insight into how often such systems can be accidentally triggered.

They found that:

  1. Smart speakers are not listening all the time.
  2. Accidental activations may happen up to 19 times each day.
  3. The reasons for such activation are inconsistent.

It is worth observing that some devices remain active for longer than others when accidentally activated:

“Echo Dot 2nd Generation and Invoke devices have the longest activations (20-43 seconds). For the HomePod and the majority of Echo devices, more than half of the activations last 6 seconds or more,” the researchers said.

In other words, some Echo devices may record up to 43 seconds of what you say once you accidentally wake the machine.

A HomePod at least only listens for up to six seconds.

What activates these devices?

The study attempted to figure out which words and sentence constructions are most likely to activate the systems. Unsurprisingly, they found words and phrases that sound like the trigger phrase were the most likely culprits.

In the case of the HomePod, for example:

“Activations occurred with words rhyming with Hi or Hey, followed by something that starts with S+vowel, or when a word includes a syllable that rhymes with “ri” in Siri. Examples include, 'He clearly,' 'They very,' 'Hey sorry,' 'Okay, Yeah,' 'And seriously,' 'Hi Mrs.,' 'Faith’s funeral,' 'Historians,' 'I see,' 'I’m sorry,' 'They say.'”

Who watches the watchmen?

Some users may not be so concerned about snippets of conversation being picked up by accidentally invoked smart speaker systems. At the same time, it does seem reasonable to expect manufacturers to make it possible for consumers to check the frequency of such accidents, and manage those recordings that do exist.

The researchers agree; their next step is to explore whether smart speaker manufacturers “correctly show all cases of audio recording to users.”

That’s important, of course, because any user wanting to conduct a security audit of recordings, accidental or otherwise, made by such devices will want to know this.
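For teams that do get access to activation histories, even a simple summary can help quantify the exposure. The sketch below is a minimal illustration in Python; the device names and durations are hypothetical sample data, and a real audit would instead pull events from whatever recording history each vendor actually exposes.

```python
from statistics import median

# Hypothetical activation log: (device, duration in seconds) pairs.
# In practice these would come from the vendor's recording history.
activations = [
    ("HomePod", 4), ("HomePod", 6), ("Echo Dot", 21),
    ("HomePod", 3), ("Echo Dot", 43), ("HomePod", 6),
]

def summarize(events, threshold=6):
    """Count activations per device and flag recordings at or above threshold."""
    grouped = {}
    for device, duration in events:
        grouped.setdefault(device, []).append(duration)
    return {
        device: {
            "count": len(durations),
            "median_seconds": median(durations),
            "long_recordings": sum(d >= threshold for d in durations),
        }
        for device, durations in grouped.items()
    }

print(summarize(activations))
```

Even this toy report makes the relevant questions concrete: how often each device woke up, and how many recordings were long enough to capture something sensitive.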

To be fair, Apple strips the Apple ID from the recording, which means those snippets of sound are no longer directly tied to you. But any information that does leak in this way may still matter to you or to your business.

How to prevent this happening

The beauty of using Siri on your HomePod is that you can easily ask it to send messages, play music at outstanding quality, set reminders and handle lots of other tasks. But the trade-off in terms of accidentally triggering the device may make it unsuitable for some deployments.

Fortunately, there are ways to stop Siri from listening to your conversations on most devices, including the HomePod. Here are three ways to prevent your HomePod listening in:

1: Ask Siri

Ironically, you can ask Siri itself: just say, “Hey Siri, stop listening.” The device will ask you to confirm.

While in this state, Siri will not listen to you, which also means it won’t take accidental recordings. You will still be able to stream music from another device through the system, or just tap the top of the device to enable Siri again.

2: Use the Home app

  • Open the Home app on your iPhone, then press and hold the HomePod icon.
  • On the next page, tap the gear icon in the bottom-right corner.
  • Scroll down to the Siri section and disable Listen for “Hey Siri.” If you leave Touch and Hold for Siri active, you’ll still be able to use Siri in the normal way, but only by touching the device.

3: Turn it off

Not using the HomePod? Disconnect it from power and there’s no chance it will be listening, though you won’t get to enjoy the fantastic-sounding audio these systems create.

I don’t intend to be alarmist in focusing on smart device security, but I think it is important to do so, in part because this is still a very early-stage industry and mistakes have been and will be made.

This is why it seems reasonable to me that every smart speaker user should audit the security of their systems, just as I advise every iPhone, iPad or Mac user to regularly check their own system security.

Enterprise users should make such security audits a regular part of what they do.

Stay safe out there!

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

Copyright © 2020 IDG Communications, Inc.
