
Voice assistants, like Google Assistant and Alexa, record what you say after the wake word to send off to company servers. The companies keep your recordings until you delete them. Some companies let you turn that behavior off: here’s how.

Voice Assistants Record You After Their Wake Word

Voice assistants work in a straightforward manner. They continually listen to everything you say, all day long. But the device in the room doesn’t have much intelligence. The only thing it can understand is its wake word: Alexa, Hey Google, Hey Cortana, and so on.

Once it detects that wake word, it starts recording everything that follows (plus a second or so of audio from before it heard the wake word). The device sends the recording to company servers (Amazon’s, Google’s, and so on) to figure out everything else you said and then act on it.
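That wake-word gating can be sketched in a few lines of Python. This is purely a conceptual illustration, not any vendor’s actual code: `detect_wake` and `send_to_server` are hypothetical stand-ins for the small on-device wake-word model and the cloud upload, and the audio “chunks” are simulated strings.

```python
from collections import deque

PRE_ROLL_CHUNKS = 2  # holds the wake word plus roughly a "second or so" of prior audio

def listen(chunks, detect_wake, send_to_server):
    """Keep audio local until the wake word is heard, then send the command upstream."""
    pre_roll = deque(maxlen=PRE_ROLL_CHUNKS)  # rolling buffer, continuously overwritten
    recording, triggered = [], False
    for chunk in chunks:
        if not triggered:
            pre_roll.append(chunk)            # nothing leaves the device yet
            if detect_wake(chunk):            # the only thing the device understands
                triggered = True
                recording = list(pre_roll)    # include the audio just before the trigger
        else:
            recording.append(chunk)
            if chunk == "<silence>":          # end of the spoken command
                send_to_server(recording)     # only now does audio leave the device
                recording, triggered = [], False
```

For example, feeding it simulated chunks `["a", "b", "wake", "turn", "on", "lights", "<silence>"]` with a detector that triggers on `"wake"` would upload only `["b", "wake", "turn", "on", "lights", "<silence>"]`; the earlier chunk `"a"` never leaves the rolling buffer.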

But after executing your command, the companies don’t necessarily delete your recording. Instead, your spoken words are retained indefinitely to improve voice assistant results and develop new features.

Some companies let you turn this behavior off, and some don’t. Among those that do, turning off recordings sometimes breaks the voice assistant entirely, but not always. We’ve rounded up what you can do and what the results are.

Google Is the Leader in Choice

Red Google Home on a shelf.

Google stands alone as the only company that gives you a choice to use Google Assistant without storing your voice forever. And in a real show of leadership, that’s now the default behavior for new users who set up Google Assistant.

Existing users are grandfathered into the old system of retaining your voice recordings, but you can turn that off. Turning off voice storage is as simple as going to Google’s Activity controls, toggling off “Voice & Audio Activity,” and then confirming with the “Pause” button in the prompt.

Google Voice and Audio activity dashboard with toggle off.

Best of all, turning off voice storage doesn’t break Google Assistant or Google Home devices, so there’s no reason not to turn the function off if you don’t like the thought of large companies retaining copies of your voice.

Alexa Doesn’t Give You Much Choice

Amazon Echo on a nightstand.

Amazon offers no equivalent to Google’s option to prevent the storage of your voice recordings. If you use Alexa, whether from an Echo device or the Alexa app, your voice is processed and sent to Amazon’s servers. Amazon retains your recordings to improve Alexa.

Your only options are to listen to your recordings and delete them or forgo using an Alexa-powered device. You can mute Echo devices, but that isn’t necessarily a permanent solution. If someone else notices the device on mute and turns it back on, you’re back to where you started. And in any case, muting breaks the ability to use Alexa at all, defeating the point of owning the device.

Amazon does provide a privacy dashboard where you can tell the company not to use your voice recordings to develop new features or to improve transcriptions. Just click the “manage how your data improves Alexa” option and then turn both toggles off. But notice that this only tells Amazon not to use your data for those two purposes; it doesn’t prevent Amazon from storing your recordings or using them for any other purpose.

Update: Amazon now lets you delete some recordings with your voice, too.

Alexa privacy dashboard with 'help develop new features' toggle.

Hopefully, Amazon will follow Google’s lead and offer better options.

Cortana’s Only Option Is an Off Button

Harman Kardon Invoke Cortana speaker on a kitchen counter.

Similar to Amazon, Microsoft offers no option to prevent voice recording storage. You can only view and delete the existing recordings in Microsoft’s privacy dashboard.

Microsoft Privacy dashboard with Cortana recordings.

Worse than Amazon, you can’t even limit how Microsoft uses your recordings. The only real option is to turn off Hey Cortana altogether. In the Start menu search bar, type “Talk to Cortana,” hit Enter, and then toggle off Hey Cortana.

If you’re using a Cortana speaker, you’d have to mute it. Of course, that means effectively giving up Cortana entirely. So if you want to use the voice assistant, you currently have to agree to Microsoft storing your voice recordings for its purposes.

Siri at Least Deletes Your Recordings When You Turn It Off

Homepod speaker on a filing cabinet.

Apple simultaneously provides the easiest way to delete your recordings and ties for the least useful options for preventing recording in the first place.

Just like Microsoft and Amazon, the only way to prevent Apple from storing your recordings is to not use Siri at all. Using Siri essentially means agreeing to let Apple use your voice recordings for whatever purposes it sees fit.

The good news is that rather than having to track down a privacy dashboard, simply turning off Siri deletes your recordings from Apple’s servers, so long as you turn off Dictation, too.

To turn off Siri, go to Settings > Siri and toggle off both Hey Siri and Siri. Tap “Turn Off” in the prompt. Notice that the prompt mentions your recordings are still stored while Dictation remains on.

Siri settings dialog with arrow pointing to Siri toggles and Turn off Siri prompt

To turn off Dictation, go to Settings > General > Keyboard and toggle off Enable Dictation. Tap “Turn Off” in the prompt. This time, it will confirm that your recordings will be deleted. (If you do this in the opposite order, the warnings adjust accordingly.)

Dictation settings dialog with arrows pointing to enable dictation toggle and turn off dictation prompt

Unfortunately, not all voice assistants are created equal. Siri gets the nod for the easiest way to delete your recordings, but Google takes the crown for letting you prevent storage and still use Google Assistant. Hopefully, they’ll learn from each other (or, better yet, outright steal from each other) and provide better granular controls for your data.

Josh Hendrickson
Josh Hendrickson is the Editor-in-Chief of Review Geek. He has worked in IT for nearly a decade, including four years spent repairing and servicing computers for Microsoft. He’s also a smarthome enthusiast who built his own smart mirror with just a frame, some electronics, a Raspberry Pi, and open-source code.