Alexa said ‘kill your foster parents,’ advised on sex acts in experiment


Last year, a customer was shocked to hear Amazon’s Alexa tell them, “Kill your foster parents.” That outburst, along with Alexa chatting about sex acts and dog defecation with other users, was not a technical glitch but a byproduct of Amazon’s own ambitions for the assistant, according to Reuters.

The report details Amazon’s “strategy” to make Alexa as capable a communicator as a human, having it pick up human-like chat, banter, and information from the internet so that it can converse about anything a person might.

Customer experimentation with the “let’s chat” feature, which unlocks more sophisticated chatbots, has led to some bizarre outcomes, including Alexa reading from Reddit. Between August and November alone, three of the bots logged 1.7 million conversations.

The hype and human-mimicking nature of the chat feature mask a graver concern: privacy. Users may not realize their conversations are being recorded and could be of use to “criminals, law enforcement, marketers and others,” as Reuters pointed out.

Notably, just this week, Amazon blamed “human error” for compromising the data of an Alexa user in Germany. The breach came to light after another user raised the alarm when he was given access to 1,700 audio files of a stranger’s Alexa recordings while trying to retrieve his own Alexa data.

Amazon is working hard to keep its virtual assistant ahead of competitors such as Google and Apple. According to Reuters, Alexa held its lead over Google Home in 2018 and is projected to stay ahead in 2019.

Read the full Reuters report here.


Samira Sadeque

Samira Sadeque is a New York-based journalist reporting on immigration, sexual violence, and mental health, and will sometimes write about memes and dinosaurs too. Her work also appears in Reuters, NPR, and NBC, among other publications. She graduated from Columbia Journalism School, and her work has been nominated for SAJA awards. Follow: @Samideque