Who cares if voice assistants listen to us?

It's a little-known fact that men's brains are fitted with a 4GB memory chip that's constantly recording background conversations. It's why, when you've got your head buried in The Times crossword and your other half barks: "Are you listening to me?", we're able to respond: "Yes, put the recycling bin out and feed the dog at half one" without having actually registered a single word of the original conversation.

Many people suspect Amazon's Echo speakers are similarly equipped. That they're sat there in the corner, like Darth Vader's Pringles tube, secretly recording every last word we utter: conversations with the dog, the wrong answers to questions on University Challenge, the access codes for the nuclear reactor under the conservatory. I've lost count of the number of people to whom I've recommended the Echo, who give a derisory snort and say: "Put a device in my living room that's recording everything I say? No, ta".

In many ways, the fear is understandable. We've all read the stories about webcams and smart televisions being used as bugging devices. We've all seen the pictures of Facebook founder Mark Zuckerberg with the little piece of black tape over his laptop's webcam. We've all read the Snowden revelations of mass government surveillance. Only the paranoid survive, as Intel's Andy Grove said.

And let's be clear: the notion that Amazon's speakers could be turned into some sort of bugging device isn't ridiculously far-fetched. Any device with a microphone and an internet connection is potentially susceptible to hackers. No device is 100% secure.

However, there are a few reasons why I'm not losing any sleep over Alexa grassing me up to MI5, not least because I'm somewhere between Nanette Newman and the Dalai Lama on the list of terror suspects. Firstly, Amazon's Echo devices only record what you say after you utter the 'wake word', which is normally 'Alexa'. So unless you're asking Alexa how to make a nail bomb, it's unlikely what you say will ever find its way back to GCHQ.

Ah, I hear you yelp, wasn't there a US murder case where the prosecution wanted access to the accused's Alexa recordings? There was, and it is indeed true that Amazon stores recordings of what you say after the 'wake' word. If you go to Settings > History in the Alexa app, you can listen to and delete these recordings. The prosecutors were looking for evidence that might have been captured in the background of these recordings, so if you're ever being murdered in your own home, scream: "Alexa, my wife is strangling me for not putting the bins out", and your other half's looking at a 10-year stretch, at the very least. Still, the chances of these brief audio snippets ever being sufficient to form evidence against you are vanishingly small.

The final reason I'm not bothered by Alexa's eavesdropping abilities is that even the known exploits aren't exactly terrifying. On the day I'm writing this column, news websites are frothing about "Alexa being turned into a snooping device!". Except the exploit itself requires hackers to break into your house, remove the base of the Echo speaker and fiddle with its software. Which raises the question: why not just plant a tiny listening device and bypass the Echo altogether?

If you're still not reassured, lay it on the line with your Echo. Say: "Alexa, I want the truth." Just prepare yourself beforehand: you may not be able to handle the answer.

This article originally appeared in Web User.

Barry Collins

Barry Collins is an experienced IT journalist who specialises in Windows, Mac, broadband and more. He's a former editor of PC Pro magazine, and has contributed to many national newspapers, magazines and websites in a career that has spanned over 20 years. You may have seen Barry as a tech pundit on television and radio, including BBC Newsnight, the Chris Evans Show and ITN News at Ten.