
Hey Alexa, come clean about how much you're really recording us



We are learning an important lesson about cutting-edge voice technology: Amazon's Alexa is always listening. So are Google's Assistant and Apple's Siri.

Putting live microphones into our homes has always been a far-out idea. But technology companies successfully marketed talking speakers such as the Amazon Echo and Google Home to millions of us by assuring that they record us only after we say a "wake word."

That turns out to be a misnomer. These devices are always "awake," passively listening for the command that activates them, such as "Alexa," "O.K. Google" or "Hey Siri." The problem is that they are far from perfect about responding only when we want them to.

The latest and most alarming example to date: Two weeks ago, a family in Portland, Ore., discovered that their Echo had recorded a private conversation and sent it to a random contact. The incident, reported by Washington state TV station KIRO 7, went viral on Thursday among Echo owners and among those who already scorned the idea of letting tech companies put microphones in our homes.

Privacy is the one aspect of Alexa that Amazon cannot afford to get wrong. (Amazon chief executive Jeffrey P. Bezos owns The Washington Post.)

In a statement, Amazon made it sound as if the Portland case involved a sequence of events you might expect in an episode of "Seinfeld." It said the Echo woke up when it heard a word that sounded like "Alexa." The "subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud, 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list."

Amazon also said the incident was rare and that it is "evaluating options to make this case even less likely."

But just how often do these devices go rogue and record more than we would like them to? Neither Google nor Amazon immediately responded to my questions about how frequently their wake words produce false positives. But anyone who lives with one of these devices knows it happens.

As a tech columnist, I have an Echo, a Google Home and an Apple HomePod in my living room, and I find that at least one of them starts recording, at random, at least once a week. It happens when they pick up a sound from the TV, or a snippet of conversation that sounds close enough to one of their wake words.


The Amazon Alexa app will play back saved recordings, including cases like this one, where it began recording because it misheard its wake word.

Separating a command from all the surrounding ambient noise, especially loud music, is no easy task. Amazon's Echo uses seven microphones and noise-canceling technology to listen for its wake word. While doing so, it keeps about a second of ambient sound on the device, which it constantly discards and replaces. But once it thinks it hears its wake word, the Echo's blue light ring activates and it begins sending a recording of what it hears to Amazon's computers.
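For illustration, here is a minimal sketch in Python of how such a rolling on-device buffer might work. The chunk sizes and the capture_chunk, wake_word_detected and stream_to_cloud functions are hypothetical stand-ins for this example, not Amazon's actual implementation:

    import collections
    import random

    # Illustrative parameters only; real devices differ.
    CHUNKS_PER_SECOND = 10              # each chunk represents ~0.1 s of audio
    BUFFER_SECONDS = 1                  # keep roughly one second on the device

    # Ring buffer: appending past maxlen silently discards the oldest chunk,
    # so only about one second of ambient sound ever sits in memory.
    ring = collections.deque(maxlen=CHUNKS_PER_SECOND * BUFFER_SECONDS)

    def capture_chunk():
        # Stand-in for the microphone driver; returns one slice of "audio."
        return random.random()

    def wake_word_detected(chunks):
        # Stand-in for the on-device keyword spotter. A real detector scores
        # the buffered audio with a small model; here a rare random trigger
        # simulates both true detections and the false positives at issue.
        return random.random() < 0.0005

    def stream_to_cloud(chunks):
        # Reached only after a (possibly mistaken) wake-word detection.
        print(f"Wake word heard: uploading {len(chunks)} buffered chunks, then live audio")

    for _ in range(100_000):            # stand-in for the device's forever loop
        ring.append(capture_chunk())
        if wake_word_detected(ring):
            stream_to_cloud(list(ring)) # blue light on; audio leaves the device

The point of this design is that audio is thrown away continuously and leaves the device only after a detection; the privacy question is how often that detection misfires.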

Over-recording is not just an Amazon problem. Last year, Google had to patch a bug that left some of its Home Mini units recording everything. Earlier this month, researchers reported that they were able to get Siri, Alexa and the Google Assistant to obey secret audio commands that were undetectable to the human ear.

So what should you do about it? You can mute these devices, which in the case of the Amazon Echo physically disconnects the microphone, until you are ready to use them. But that partly defeats the usefulness of a computer you can call out to while your hands are otherwise occupied.

Another approach is to deactivate some of the more sensitive functions in the Alexa app, including making purchases by voice. You can also disable the "drop in" function, which allows another Echo to connect automatically and start a conversation.

You can also dig deeper into what is being recorded. Prepare to be a little horrified: Amazon and Google keep a copy of every conversation, as a nod to transparency and to help improve their speech-recognition and artificial-intelligence systems. In the Alexa app and on Google's user-activity site, you can listen to and delete these past recordings. (Apple also saves Siri recordings, but not in a searchable form, and it anonymizes them after six months.)

The nuclear option is to unplug your smart speaker entirely, at least until these companies come clean about how often their voice assistants go rogue and what they are doing to stop it.
