It’s safe to say that technology, in general, will continue to have growing pains as more innovations and advancements come to be. In some cases, devices may malfunction, stop working entirely, or do something unexpected.
In Alexa's case, the AI assistant powering the Amazon Echo, that means recording a private conversation and sending it to a third party. Oops.
How Alexa Recorded a Private Convo and Involved Someone Totally Unrelated
Alexa mistook Danielle and her husband's conversation for voice commands, then continued to listen and react.
Danielle, from Portland, explained how her Amazon Echo recorded an entire conversation between her and her husband. The recording was then sent to one of her husband’s colleagues.
“We unplugged [our Echo devices] and [the colleague] proceeded to tell us that he had received audio files of recordings from inside our house,” explains Danielle. “At first, my husband was, like, ‘No, you didn’t!’ And the [colleague] said, ‘You sat there talking about hardwood floors.’ And we said, ‘Oh gosh, you really did hear us.’”
Understandably, Danielle now says, “I’m never plugging that device in again, because I can’t trust it.”
An Amazon spokesperson confirmed the incident did happen, but explained that it occurred because Alexa mistook part of the conversation for voice commands. The assistant was not, in fact, spying on the woman and her husband. Of course, that revelation doesn’t make the event any less concerning.
Here’s what the spokesperson told The Verge.
“Echo woke up due to a word [that sounded] like ‘Alexa.’ Then, [she] heard a ‘send message’ request. Alexa [asked], ‘To whom?’
“At [this] point, the background conversation was interpreted as a name in the customer’s contact list. Alexa asked, ‘(contact name), right?’ Alexa then interpreted background conversation as ‘right’ [and sent off the recording].”
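The spokesperson's account amounts to a chain of misheard confirmations: wake word, then a "send message" phrase, then a name that happened to match a contact, then a word taken as "right." The toy Python sketch below illustrates how such a confirmation flow could fall through step by step; the function, contact names, and matching logic are all hypothetical, not Amazon's actual implementation.

```python
# Hypothetical sketch of the confirmation flow described above. Each overheard
# utterance can accidentally advance the session one step closer to sending.
def simulate_session(utterances, contacts):
    """Return the contact a message would be sent to, or None if the
    flow never completes."""
    state = "idle"
    recipient = None
    for text in utterances:
        # crude normalization: lowercase and strip trailing punctuation
        words = [w.strip(",.!?") for w in text.lower().split()]
        if state == "idle":
            if "alexa" in words:                # wake word (or a near-match)
                state = "awake"
        elif state == "awake":
            if "send" in words and "message" in words:
                state = "ask_recipient"         # device asks "To whom?"
        elif state == "ask_recipient":
            # background speech misread as a name in the contact list
            match = next((c for c in contacts if c.lower() in words), None)
            if match:
                recipient = match
                state = "confirm"               # device asks "(name), right?"
        elif state == "confirm":
            if "right" in words:                # stray "right" = confirmation
                return recipient
    return None

# Background chatter that happens to walk through every step:
overheard = [
    "something that sounded like Alexa",
    "we should send a message about the floors",
    "maybe ask Bob what he thinks",
    "right, the hardwood ones",
]
print(simulate_session(overheard, contacts=["Alice", "Bob"]))  # → Bob
```

The point of the sketch is that no single misrecognition is enough; the send only happens when four independent mistakes line up in order, which is why Amazon called the string of events unlikely.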
Could This Happen to Me?
Alexa does sometimes wake up to completely unrelated words, even when not given any direct commands.
For this exact sequence to repeat, though, the conversation in question would have to be highly specific.
The Amazon spokesperson says:
“As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
So, once Amazon does some fine-tuning, it probably won’t happen again, at least not in exactly the same way. But it’s not a stretch to think something similar could happen to you, as Alexa does often wake up or activate on unrelated words and commands. Saying something close to “Alexa” can put her into listening mode.