
When voice recognition goes very, very bad

Have you ever had a voice recognition fail this epic?


Janet Kornblum

Internet Culture

Posted on Jan 31, 2013   Updated on Jun 2, 2021, 2:21 am CDT

DamnYouAutoCorrect’s got nothing on this one: I almost told a friend to kill herself.

I mean—those exact words: “You need to kill yourself.” She had told me that she was in psychic pain, that she was feeling suicidal. And that’s the response I came back with.

Don’t hate me.

I didn’t mean it. My computer made me do it. Literally. Actually, technically, my computer did it. Or aided and abetted at the very least; at worst, committed the crime on its own.

This is what I call a “speako,” a word that I’ve been trying to popularize (and thanks to Siri, I have hope that I will). Because you know. Voice recognition is awesome. Until it sucks. And when it sucks, it can suck really big.

These days, thanks to the aforementioned Siri chick, most of you can now empathize with my voice troubles. She’s helped introduce the general culture to the concept of speaking into a device that then translates the words onto a screen. When I first started using voice recognition, it had hardly earned the name. I mean, it was beyond bad. You’d say hello and it would type, “Banana.”

Well, maybe not quite that bad. But it felt like it. 

Thankfully, with every new version, voice recognition software has improved remarkably, to the point where you can now actually rely on it. (I basically support Nuance, which produces Dragon, because I’ve bought every new version for both Mac and PC for the past decade or so.)

And that’s the problem.

Before, I knew that Dragon would make mistakes. But I still used it because it was better than my hands hurting, right? Because I knew how fallible it was, I’d check and recheck the results. And then I’d add a disclaimer into my email. Just in case. Because when Dragon screws up, it screws up big.

Some of my favorite speakos: accidentally calling someone a slut (because “select” and “slut” apparently sounded, for quite a long time, exactly the same to Dragon—which is unfortunate, since “select” is a frequent command); calling a thin (thank God she was thin) woman “fat woman,” because “Facklemann” might as well be “fat woman.” I can’t tell you how many emails I’ve signed as “Jim,” and even “damn it,” because, you know, Janet sounds like all those things. And yes, I tried to train Dragon, to no avail. Love, damn it.

These days, though, I’ve gotten a little too comfortable. I don’t check as much, because the technology is so much better. Dragon and Siri are pretty reliable.

And so I almost sent out that fateful mistake. The one where I urged an already depressed person, someone who was looking for love and support and was at the bottom of her personal barrel, to essentially go jump off the nearest bridge.

I don’t know why I looked at the email before I hit send. But thankfully I did—and I caught it out of the corner of my eye.

Oh. My. God.

For a moment I imagined her opening up the email as it was, the stab of pain she would have felt when her BFF suggested this alternative. It was so bad it was funny. But in my experience, suicidal people don’t have a very active sense of humor.

I caught it. I caught it.

Carefully—very carefully—I put my fingers on the keyboard and I typed. Typed, for safety’s sake, because Dragon couldn’t be trusted this time.

“I love you,” I had dictated. That part worked fine. “You need to heal yourself.”

And then I hit send.

As a longtime user of voice recognition software, Janet Kornblum has offended many a netizen with her speakos. If you email her at [email protected] she promises not to be offended by yours.

Photo by Sean MacEntee

*First Published: Jan 31, 2013, 3:09 pm CST