Talking to your Amazon Alexa is cool, but sometimes her responses can be robotic. (Just because she’s an AI doesn’t mean she has to sound like an AI, right?) Amazon hopes to change that by giving developers the ability to hone Alexa’s responses with “a wider range of natural expression.”
Amazon recently announced five new tools developers can use to integrate more human-like details into their Alexa responses. Third-party developers can now adjust the volume and pitch of Alexa’s speech. They can also adjust how fast she speaks, or add emphasis to certain words. Developers can instruct Alexa to whisper responses, or “bleep” out expletives. These tools are part of a standardized language Amazon uses called SSML, or Speech Synthesis Markup Language.
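To give a rough sense of what this looks like under the hood, here is a minimal sketch of a skill response using those SSML tags. The tag names (`prosody`, `emphasis`, `amazon:effect`, `say-as`) come from Amazon's SSML reference; the helper function, response text, and exact attribute values are hypothetical, not Amazon's own code.

```python
# Hypothetical sketch of an Alexa skill response carrying SSML.
# Tag names follow Amazon's SSML reference; the helper and the
# spoken text are invented for illustration.

def build_ssml_response(ssml_body: str) -> dict:
    """Wrap an SSML string in an Alexa Skills Kit-style response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "SSML",
                "ssml": f"<speak>{ssml_body}</speak>",
            },
            "shouldEndSession": True,
        },
    }

ssml = (
    # Slower, lower-pitched, quieter delivery via the prosody tag.
    '<prosody rate="slow" pitch="low" volume="soft">'
    "I have some news."
    "</prosody> "
    # Stress a single word.
    '<emphasis level="strong">Listen</emphasis> closely. '
    # Whispered aside.
    '<amazon:effect name="whispered">This part is a secret.</amazon:effect> '
    # Bleep out an expletive.
    'Well, <say-as interpret-as="expletive">dang</say-as>.'
)

response = build_ssml_response(ssml)
print(response["response"]["outputSpeech"]["ssml"])
```

The skill itself just returns this JSON; Alexa's text-to-speech engine reads the markup and performs the whispering, emphasis, and bleeping on the device.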
Amazon already uses many of these SSML abilities in Alexa’s personality-filled responses. Until now, though, third-party app developers were more limited; the responses they concocted had to be more cut-and-dried. Developers still can’t go wild with these markup tools. According to TechCrunch, Amazon will set limits on how much devs can alter Alexa’s speech patterns. The goal is to make her sound more human, not transform her into a squeaking, bleeping monster. (Although maybe every once in a while, that’d be really funny. But that’s just me.)
Amazon also recently expanded Alexa’s understanding of local slang, specifically for its U.K. and German markets. Now, if U.K. app developers include the word “blimey” in a response, for example, Alexa will speak it with the appropriate intonations automatically.
It’s unclear how quickly developers will start integrating these features into their Alexa skills (if at all). If you use your Alexa for a wide variety of tasks, though, it may be worth listening for changes in the coming weeks and months.
Christina Bonnington is a tech reporter who specializes in consumer gadgets, apps, and the trends shaping the technology industry. Her work has also appeared in Gizmodo, Wired, Refinery29, Slate, Bicycling, and Outside Magazine. She is based in the San Francisco Bay Area and has a background in electrical engineering.