Amazon confirms report that Alexa recorded a couple's conversation and sent it to a family friend
Many Christians have concerns with voice assistants and smart speakers, such as Amazon's Echo Dot, shown here. - photo by Herb Scribner
A couple from Portland, Oregon, complained to Amazon that their Alexa device recorded a conversation and sent it to the phone of someone they knew.

As KIRO-7 reported, the couple said the recording went to someone on their family's contact list.

The couple said Amazon devices were installed throughout their house so they could control the temperature, lights and security.

"My husband and I would joke and say I'd bet these devices are listening to what we're saying," the woman, named Danielle, told KIRO-7.

But Danielle (who did not want her last name published) said her opinion of Amazon changed when she received a call from one of her husband's employees, who said he had received a recorded conversation from the couple.

"We unplugged all of them (Amazon devices) and he proceeded to tell us that he had received audio files of recordings from inside our house," she said. "At first, my husband was, like, 'no you didn't!' And the (recipient of the message) said 'You sat there talking about hardwood floors.' And we said, 'oh gosh, you really did hear us.'"

Danielle said she felt invaded.

The couple contacted Amazon about the problem.

"They said, 'Our engineers went through your logs, and they saw exactly what you told us, they saw exactly what you said happened, and we're sorry.' He apologized like 15 times in a matter of 30 minutes, and he said, 'We really appreciate you bringing this to our attention; this is something we need to fix,'" Danielle said.

Amazon confirmed the report in a statement to Ars Technica.

"Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud, 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '(contact name), right?' Alexa then interpreted background conversation as 'right.' As unlikely as this string of events is, we are evaluating options to make this case even less likely."

In March, Amazon's Alexa also faced scrutiny for randomly laughing at people, according to Business Insider.

"In rare circumstances, Alexa can mistakenly hear the phrase 'Alexa, laugh.' We are changing that phrase to be 'Alexa, can you laugh?' which is less likely to have false positives, and we are disabling the short utterance 'Alexa, laugh.' We are also changing Alexa's response from simply laughter to 'Sure, I can laugh' followed by laughter," Amazon said in a statement sent to Business Insider.

Both incidents highlight privacy concerns customers face when buying artificial intelligence devices, according to ZDNet. The episodes have also made people question whether such devices are actually listening to them.

Interestingly, Yale privacy scholar Tiffany Li said on Twitter that Amazon doesn't have a specific privacy policy for the Amazon Echo or anything else that could record information.

"Reminder that there is no privacy policy for the Amazon Echo or any Amazon Alexa devices," she tweeted. "The TOS only refers to the http://Amazon.com privacy policy, which does not include details on recording, voice data security, etc."