The report, which cites a source familiar with Amazon's Alexa-powered Echo project, suggests significant updates to the voice-controlled assistant will improve its language skills and may give it the ability to recognize the emotional tone of a user's voice.
The ability to better identify emotion and react to it is a "key area" of Amazon's ongoing research and development for Alexa, according to the report.
Alexa already digests a considerable amount of personal data about its users, but future versions will remember what a person has said to the virtual assistant so it can hold an ongoing conversation rather than simply answering one or two straightforward questions.
A large part of making that continuous conversation possible will stem from the emotional understanding instilled in the virtual assistant. Some studies have suggested matching a computer’s tone to the human it’s speaking with can make communication more effective and engaging.
More generally, the company is also working on improving upon Alexa’s ability to understand context and interpret ambiguous questions. For example, if a person asks, “What’s the score of the Hawks game?” they will receive a different answer depending on their location; a Seattle-based user is more likely to want to know about the Seahawks, while a user in Atlanta is probably interested in the Atlanta Hawks.
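The location-aware disambiguation described above amounts to resolving an ambiguous name against the user's context. A minimal sketch of the idea, in Python, might look like the following; the lookup table and function names are illustrative assumptions, not anything Amazon has disclosed about Alexa's implementation:

```python
# Hypothetical sketch of location-based disambiguation.
# The table below is an illustrative assumption, not Alexa's actual data.
DISAMBIGUATION = {
    ("hawks", "Seattle"): "Seattle Seahawks",
    ("hawks", "Atlanta"): "Atlanta Hawks",
}

def resolve_team(nickname: str, user_city: str) -> str:
    """Resolve an ambiguous team nickname using the user's location.

    Falls back to the raw nickname when no contextual match exists.
    """
    return DISAMBIGUATION.get((nickname.lower(), user_city), nickname)

print(resolve_team("Hawks", "Seattle"))  # Seattle Seahawks
print(resolve_team("Hawks", "Atlanta"))  # Atlanta Hawks
```

A production assistant would of course weigh many more signals (query history, time of year, active sports seasons), but the core pattern is the same: the ambiguous term plus context keys into a resolver rather than a single fixed answer.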
The Echo has been a successful venture for Amazon thus far, with the Pringles can-shaped speaker working its way into the homes of millions of users. However, remaining ahead of the curve will be a primary challenge for Amazon as the battle of the virtual assistants starts to heat up.
Google recently announced its Google Home speaker, a direct competitor to the Amazon Echo that will make use of Google’s massive amount of data and search tools to answer questions and complete tasks.