This is frightening stuff.
In their never-ending war over user attention and time, tech giants are constantly coming up with new ways to keep customers attached to their platforms and products. In the latest such foray, Amazon’s Alexa team is beginning to analyze user interactions with its Echo smart speakers to understand users’ moods and feelings.
Like every other company that collects and mines tons of user data, Amazon claims that its new effort will help it better serve its customers and provide a more personalized experience. But emotion analysis can also serve Amazon in creepy and privacy-invading ways that won’t necessarily be beneficial to users.
Last year, an unnamed source told MIT Technology Review that Amazon was working on such technology to stay ahead of the competition as Google and Apple were getting ready to ship their own voice-controlled speakers. More recently, Alexa chief scientist Rohit Prasad confirmed in an interview with VentureBeat that the company is working on emotion analysis.
Alexa already knows how to respond to users when they explicitly express sadness or happiness. But Amazon wants its smart assistant to recognize users’ moods without being told. Amazon is already teaching Alexa to mimic human speech habits, so it’s only natural that the assistant should also understand human emotions instead of treating all queries in the same manner.
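At its simplest, recognizing a mood “without being told” means mapping measurable properties of a voice sample onto emotion labels. The sketch below uses a nearest-centroid classifier over three made-up acoustic features; the feature names, centroid values, and labels are illustrative assumptions, not Amazon’s actual model, which would involve far richer features and learned parameters.

```python
import math

# Hypothetical acoustic feature vectors: (mean pitch in Hz, speech rate in
# words/sec, loudness in dB). These centroids are invented for illustration;
# a real system would learn them from labeled voice data.
CENTROIDS = {
    "neutral":    (140.0, 2.5, 60.0),
    "frustrated": (180.0, 3.5, 70.0),
    "sad":        (110.0, 1.8, 55.0),
}

def classify_emotion(features):
    """Return the label whose centroid is nearest to the feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))
```

A fast, high-pitched, loud sample would land nearest the “frustrated” centroid, while a slow, quiet, low-pitched one would read as “sad”; the point is that no explicit statement of feeling is required, only the signal itself.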
To be fair, such technology can help in many positive ways. For instance, it can help Alexa detect and address frustration. Amazon says it might one day enable Alexa to respond to queries based on your emotional state, to scan voice recordings to diagnose disease, or even to help fight loneliness by conducting lengthier conversations.
But that isn’t likely where the company will stop. In a world where user data and attention translate into profit, big tech companies like Amazon, Facebook, and Google will go to great lengths to trick users into continuing to interact with their platforms. In May, a leaked report revealed that Facebook was considering allowing advertisers to target teenagers when they felt “insecure,” “worthless,” and “in need of a confidence boost.” A few years earlier, Facebook conducted a mass experiment in which it manipulated the emotions of hundreds of thousands of users by altering the content of their news feeds.
Amazon would very much like to pull off similar feats with its users, for which it will need a lot of data, part of which it already has. Every time you say the “Alexa” wake word and issue a command to your smart speaker, Amazon saves a recording of your voice, which it uses to improve its ability to recognize you.
Now it will mine that data further to establish a baseline for your tone and detect outliers such as panic, anger, and fear. Being able to understand and respond to your emotions will enable Alexa to engage in lengthier conversations. During these conversations, you’ll reveal even more information about yourself, which Amazon will feed into machine learning algorithms to further complete your digital profile and potentially manipulate you in profitable ways. Imagine having a natural conversation with Alexa in which it subtly steers the discussion toward suggesting you purchase product X or Y based on your emotional state.
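The “baseline plus outliers” idea described above can be sketched with a simple z-score check: compare a new voice sample against the statistics of the user’s past samples and flag large deviations. The single pitch feature and the threshold here are illustrative assumptions; an actual system would track many features and use a learned model rather than a fixed cutoff.

```python
import statistics

def tone_outlier(history, sample, threshold=2.0):
    """Flag a voice sample as an emotional outlier if it deviates more than
    `threshold` standard deviations from the user's personal baseline.

    `history` is a list of past mean-pitch measurements (Hz) for this user;
    `sample` is the latest measurement. The feature choice is illustrative.
    """
    baseline = statistics.mean(history)
    spread = statistics.stdev(history)
    z_score = abs(sample - baseline) / spread
    return z_score > threshold
```

Because the baseline is personal, the same absolute pitch could be unremarkable for one user and a panic signal for another, which is exactly why per-user voice histories are so valuable to the company collecting them.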
While these are the creepy ways Amazon itself might leverage its newfound power, what’s more worrisome is how others might abuse the platform. The memories of how malicious actors used Facebook to spread fake news and disinformation are still fresh. Amazon, which provides tools for developing all kinds of “skills” for Alexa, will become even more open to abuse as the assistant grows smarter and more powerful.
This is something that academics and scientists warned against earlier this year in a piece published in Scientific American. “Some software platforms are moving towards ‘persuasive computing,’” reads the article, aptly titled “Will Democracy Survive Big Data and Artificial Intelligence?” “In the future, using sophisticated manipulation technologies, these platforms will be able to steer us through entire courses of action, be it for the execution of complex work processes or to generate free content for internet platforms, from which corporations earn billions. The trend goes from programming computers to programming people.”
We can only hope that Amazon will use Alexa’s new features to address real problems and help us live healthier, more meaningful lives, not for evil purposes. Yet even though history shows that Amazon and others are more concerned about their bottom line than the welfare of their customers, we keep on calling the magical wake word and pouring our invaluable data into the hungry maws of Alexa.
Ben Dickson is a software engineer and founder of TechTalks. His work has been published by TechCrunch, VentureBeat, the Next Web, PC Magazine, Huffington Post, and Motherboard, among others.