In their never-ending war over user attention and time, tech giants are constantly coming up with new ways to keep customers attached to their platforms and products. In the latest such foray, Amazon’s Alexa team is beginning to analyze user interactions with its Echo smart speakers to understand their moods and feelings.
Like every other company that collects and mines tons of user data, Amazon claims that its new effort helps it better serve its customers and provide a more personalized experience. But emotion analysis can also serve Amazon in creepy, privacy-invading ways that won’t necessarily benefit users.
Last year, an unnamed source told MIT Technology Review that the Alexa team was working on such technology to stay ahead of the competition as Google and Apple were getting ready to ship their own voice-controlled speakers. More recently, Alexa chief scientist Rohit Prasad confirmed in an interview with VentureBeat that the company was working on emotion analysis.
Alexa already knows how to respond when users express sadness or happiness. But Amazon wants its smart assistant to recognize users’ moods without being told. Amazon is already teaching Alexa to mimic human speech habits, so it’s only natural that the assistant should also understand human emotions, instead of treating all queries in the same manner.
To be fair, such technology can help in many positive ways. For instance, it can help Alexa detect and address frustration. Amazon says it might one day enable Alexa to respond to queries based on your emotional state, to scan voice recordings to diagnose disease, or even to help fight loneliness by conducting lengthier conversations.
But that isn’t likely where the company will stop. In a world where user data and attention translate into profit, big tech companies like Amazon, Facebook, and Google will go to great lengths to trick users into continuing to interact with their platforms. In May, a leaked report revealed that Facebook was considering allowing advertisers to target teenagers when they felt “insecure,” “worthless,” and “in need of a confidence boost.” A few years earlier, Facebook conducted a mass experiment in which it manipulated the emotions of hundreds of thousands of users by altering the content of their news feeds.
Amazon would very much like to pull off similar feats with its users, and for that it will need a lot of data, part of which it already has. Every time you say the “Alexa” wake word and issue a command to your smart speaker, Amazon saves a recording of your voice, which it uses to improve its ability to recognize you.
Now it will further mine that data to establish a baseline for your tone and detect outliers such as panic, anger, and fear. Being able to understand and respond to your emotions will enable Alexa to engage in lengthier conversations. During these conversations, you’ll reveal even more information about yourself, which Amazon will feed to machine learning algorithms to further complete your digital profile and potentially manipulate you in profitable ways. Imagine having a natural conversation with Alexa, in which it subtly steers the direction of the conversation toward suggesting you purchase product X or Y based on your emotional state.
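To make the baseline-and-outlier idea concrete, here is a purely illustrative sketch: a z-score test over a hypothetical per-speaker pitch history, flagging utterances that deviate sharply from a user’s typical tone. Amazon has not disclosed how its models work; the feature (pitch in Hz), the threshold, and the sample history below are all assumptions for illustration.

```python
import statistics

def is_emotional_outlier(pitch_hz, history, threshold=2.0):
    """Flag an utterance whose pitch deviates sharply from the speaker's baseline.

    history: hypothetical list of pitch measurements (Hz) from past utterances.
    A real system would use many acoustic features, not pitch alone.
    """
    mean = statistics.mean(history)
    std = statistics.pstdev(history)  # population std dev of the baseline
    if std == 0:
        return False  # no variation recorded; nothing counts as an outlier
    return abs(pitch_hz - mean) / std > threshold

# Hypothetical pitch history for a speaker's typical utterances
history = [118, 121, 119, 122, 120, 117, 123, 119]
print(is_emotional_outlier(121, history))  # False: within the usual range
print(is_emotional_outlier(180, history))  # True: sharp spike, e.g. panic
```

The point of the sketch is simply that a baseline makes deviations measurable: once a service has enough recordings of your normal voice, unusually agitated speech stands out automatically, with no need for you to say how you feel.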
While these are the creepy ways that Amazon might leverage its newfound power, what’s more worrisome is how others might abuse the platform. The memories of how malicious actors used Facebook to spread fake news and disinformation are still fresh. Amazon, which provides tools to develop all kinds of “skills” for Alexa, will be even more open to abuse as the assistant grows smarter and more powerful.
This is something that academics and scientists warned against earlier this year in a piece published in Scientific American. “Some software platforms are moving towards ‘persuasive computing,’” the article, aptly titled “Will Democracy Survive Big Data and Artificial Intelligence?”, reads. “In the future, using sophisticated manipulation technologies, these platforms will be able to steer us through entire courses of action, be it for the execution of complex work processes or to generate free content for internet platforms, from which corporations earn billions. The trend goes from programming computers to programming people.”
We can only hope that Amazon will use Alexa’s new capabilities to address real problems and help us live healthier, more meaningful lives, not for evil purposes. But even though history shows that Amazon and others are more concerned with their bottom line than the welfare of their customers, we keep calling the magical wake word and pouring our invaluable data into the hungry maw of Alexa.
Ben Dickson is a software engineer and founder of TechTalks. His work has been published by TechCrunch, VentureBeat, the Next Web, PC Magazine, Huffington Post, and Motherboard, among others.