This isn’t the first time technology has shown racial bias.
Richard Lee, a 22-year-old New Zealander of Asian descent, recently had his passport photo rejected when the automated facial recognition software registered his eyes as being closed. Lee posted about the incident on Facebook.
Lee told Reuters it was no big deal: “I’ve always had very small eyes and facial recognition technology is relatively new and unsophisticated.” An Internal Affairs spokesman also said about 20 percent of passport photos are rejected.
However, as much as people love to claim that technology is inherently unbiased, there are many examples of racial bias in facial recognition software. For a while, Google Photos was auto-tagging images of black people as gorillas. HP’s face-tracking webcams could detect white faces but not black ones. And multiple versions of facial recognition software have registered Asian people’s eyes as closed.
As Rose Eveleth wrote for Motherboard, the bias comes not from the technology, but from the programmers. “Algorithms are trained using a set of faces. If the computer has never seen anybody with thin eyes or darker skin, it doesn’t know to see them. It hasn’t been told how. More specifically: The people designing it haven’t told it how.” However, because engineers say they aren’t intentionally programming racial biases, many refuse to admit there is even a problem.
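To make that training-data point concrete, here is a minimal, hypothetical sketch in Python. It is not the software behind any of the systems mentioned above; the numbers stand in for the feature encodings a real system would extract from photos, and the function name `looks_like_a_face` is invented for illustration. The point is only that a recognizer judged against a narrow training set will reject faces it was never shown.

```python
# Toy illustration, assuming made-up "encodings" in place of real face features.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 1,000 face encodings, all drawn from one narrow
# region of feature space (i.e., one demographic group).
training_faces = rng.normal(loc=0.0, scale=1.0, size=(1000, 128))

def looks_like_a_face(encoding, training_set, threshold=20.0):
    """Accept an encoding only if it is close to something seen in training."""
    distances = np.linalg.norm(training_set - encoding, axis=1)
    return distances.min() < threshold

# A face from the same group as the training data is accepted...
in_group_face = rng.normal(loc=0.0, scale=1.0, size=128)
print(looks_like_a_face(in_group_face, training_faces))      # True

# ...while a face from a group the system has never seen is rejected,
# even though it is a perfectly ordinary face.
out_of_group_face = rng.normal(loc=5.0, scale=1.0, size=128)
print(looks_like_a_face(out_of_group_face, training_faces))  # False
```

Nothing in the sketch is malicious; the rejection falls out of what the system was, and wasn’t, shown during training.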
Lee is right that facial recognition technology is unsophisticated. That’s often because the pool of faces used to train it isn’t as diverse as it needs to be. It will always make mistakes, but as it stands now, those mistakes disproportionately affect non-whites.
Jaya Saxena is a lifestyle writer and editor whose work focuses primarily on women's issues and web culture. Her writing has appeared in GQ, ELLE, the Toast, the New Yorker, the Hairpin, BuzzFeed, Racked, Eater, Catapult, and others. She is the co-author of 'Dad Magazine,' the author of 'The Book Of Lost Recipes,' and the co-author of 'Basic Witches.'