What is the scariest thing you can imagine an Amazon Echo doing? Think realistically. Would it be something simple but sinister, like an artificially intelligent speaker recording a conversation between you and a loved one and then sending that recording to an acquaintance? That seems pretty bad to me. And guess what: it's happening.
Recently, a husband and wife in Portland received a disturbing call from the man's employee. "Disconnect your Alexa device now," said the voice on the line. "You're being hacked." That would have been scary enough, but then the thoughtful employee explained that he had recently received audio files containing a conversation between the couple. When they doubted him, the employee sent the files. Sure enough, the couple's Amazon Echo had shared a recording of a private conversation without their permission, and it wasn't because of hackers. It was Amazon.
Amazon recently admitted that the Portland couple had been the victim of an "… improbable … series of events." Somehow, their Echo had mistakenly interpreted background noise as its wake word, then another sound as a command to send a message, and then another string of words as an instruction to send the recording to the man's employee. Amazon even claims that Alexa said "[contact name], right?" to confirm the action, but the couple denies that the device ever asked for confirmation before sending the message. Hell, they didn't even know they were being recorded in the first place.
To say "this is some Black Mirror shit" would not only be a cliché, it would be an understatement. This incident illustrates the real-life privacy nightmare that voice assistants inevitably bring into our homes. As with any internet-connected technology, smart speakers like the Amazon Echo and Google Home confront consumers with a decision to trade privacy for convenience.
The terms of that trade aren't clear. For now, we know that these devices record your commands in order to train their voice software to understand commands better. We also know that Google and Amazon hold several patents that would allow them to collect voice-command data to do anything from judging a child's level of "mischief" to measuring a person's mood in order to personalize content or target advertisements. Amazon, specifically, has already started experimenting with ads on Alexa-enabled devices in the form of sponsorships, and is reportedly in talks with companies about serving ads based on voice commands. If you ask how to remove a stain, for example, Alexa could respond with a Clorox ad. But at the moment, these are just ideas.
The current reality is, in some ways, far more terrifying. The technology that powers internet-connected, voice-controlled devices is so new that we simply don't know how or when it will fail. And we definitely don't know what the consequences might be when it does. Scenarios like the Alexa incident above don't even represent security issues. They represent a fundamental design flaw in these apparently under-tested systems. If Amazon's Alexa and the Google Assistant are supposed to improve as they collect more data and learn more about human speech, we can only conclude that there is always a chance they will fail and do the wrong thing along the way. Now we know that could mean your Echo recording a private conversation between you and a loved one and sending it to someone on your contact list.
And that's all assuming these devices work as intended. There are other ways voice-controlled assistants can be compromised, including, among others, software bugs, security flaws, and government intervention. A touch-panel bug, for example, turned some Google Home Minis into full-fledged surveillance devices last year. Security researchers, meanwhile, have had a field day hacking Alexa and turning it into an always-listening spy. And let's not forget that Amazon has shown it will hand over Echo data to the authorities if the situation demands it. Meanwhile, the FBI may or may not be wiretapping Echo devices.
If you noticed that I haven't mentioned Apple or Siri in all this creepy surveillance business, you get a gold star. The HomePod and other Siri experiences simply haven't been subject to as much scandal (yet). That could be because Apple insists that all Siri commands are anonymized, encrypted, and stored on the device. Who knows whether, in the long term, this means Siri is a safer assistant than Alexa. But for now, as far as we know, Apple's technology simply hasn't created the kind of atrocious situation in which a couple's private conversation gets sent to a seemingly random person thanks to terrible software.
As for Alexa, a moment of reckoning now seems to be at hand. Amazon admitted to the Echo's error at almost the same time reports emerged that Google Home had outsold Amazon Echo devices for the first time. It's probably a coincidence, but it makes you wonder whether Amazon is in over its head when it comes to artificial intelligence and machine learning.
I have argued in the past that Google's smart devices work better than Amazon's. And now, despite a bug here or there, I'm starting to feel that Alexa could be dangerous in its inferiority. Alexa's errors are frightening. The conversation-recording incident is truly terrifying. It's a nightmare. And it's also one you can avoid. Don't buy an Echo. And definitely don't buy one for a friend.