Richard Lee, a 22-year-old New Zealander of Asian descent, recently had his passport photo rejected when the automated facial recognition software registered his eyes as being closed. Lee posted about the incident on Facebook.
Lee told Reuters that it was no big deal: “I’ve always had very small eyes and facial recognition technology is relatively new and unsophisticated.” An Internal Affairs spokesman also noted that about 20 percent of passport photos are rejected.
However, as much as people love to claim that technology is inherently unbiased, there are many examples of racial bias in facial recognition software. For a while, Google Photos was auto-tagging images of black people as gorillas. HP’s face-tracking webcams could detect white people but not black people. And many versions of facial recognition software have registered Asian people’s eyes as closed.
As Rose Eveleth wrote for Motherboard, the bias comes not from the technology, but from the programmers. “Algorithms are trained using a set of faces. If the computer has never seen anybody with thin eyes or darker skin, it doesn’t know to see them. It hasn’t been told how. More specifically: The people designing it haven’t told it how.” However, because engineers say they aren’t intentionally programming racial biases, many refuse to admit there is even a problem.
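To make Eveleth’s point concrete, here is a minimal, hypothetical sketch in Python with scikit-learn, using synthetic numbers rather than any real passport or vendor system: an “eyes open?” classifier trained on a pool dominated by one group learns a decision threshold tuned to that group, and ends up flagging many of the other group’s open eyes as closed.

```python
# Toy illustration only: synthetic data, not any real face recognition
# system. Shows how a model trained on a skewed pool of examples makes
# disproportionately more errors on the underrepresented group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, open_mean, closed_mean):
    """Synthetic eye-aperture measurements; half open eyes, half closed."""
    open_ap = rng.normal(open_mean, 0.5, n // 2)
    closed_ap = rng.normal(closed_mean, 0.5, n // 2)
    X = np.concatenate([open_ap, closed_ap]).reshape(-1, 1)
    y = np.array([1] * (n // 2) + [0] * (n // 2))  # 1 = eyes open
    return X, y

# Training pool: group A dominates. Group B's open eyes have narrower
# apertures, close to the threshold the model learns from group A.
Xa, ya = make_group(1900, open_mean=5.0, closed_mean=1.0)
Xb, yb = make_group(100, open_mean=2.5, closed_mean=1.0)

model = LogisticRegression().fit(np.vstack([Xa, Xb]),
                                 np.concatenate([ya, yb]))

# Balanced held-out evaluation: group A's error rate stays low, while
# a large share of group B's open eyes get classified as closed.
for name, (om, cm) in [("group A", (5.0, 1.0)), ("group B", (2.5, 1.0))]:
    Xt, yt = make_group(2000, om, cm)
    print(f"{name} error rate: {1 - model.score(Xt, yt):.2f}")
```

The classifier isn’t “intentionally” biased; it simply optimizes for the faces it was shown, which is exactly the dynamic Eveleth describes.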
Lee is right that facial recognition technology is unsophisticated. That’s often because the pool of faces used to train it isn’t as diverse as it needs to be. It will always make mistakes, but as it stands now, those mistakes disproportionately affect non-whites.
Jaya Saxena is a lifestyle writer and editor whose work focuses primarily on women's issues and web culture. Her writing has appeared in GQ, ELLE, the Toast, the New Yorker, the Hairpin, BuzzFeed, Racked, Eater, Catapult, and others. She is the co-author of 'Dad Magazine,' the author of 'The Book Of Lost Recipes,' and the co-author of 'Basic Witches.'