
Google Photos bug tags black people with racist term

'This is 100% not OK.'


Jam Kotenko

Tech

Posted on Jul 1, 2015   Updated on May 28, 2021, 10:57 am CDT

Google Photos has the makings of a great Web-based photo-editing platform. Unfortunately, its auto-tagging feature might need a serious overhaul, as evidenced by a tweet documenting a highly erroneous—and unwittingly racist—result:

A new update to the app reportedly allows it to automatically tag people in uploaded photos; the photos are then mechanically sorted into categories based on similar images. When Twitter user Jacky Alcine uploaded pictures of himself posing with a friend, the platform filtered the set into an album titled “Gorillas.”

Misfiring auto-tagging features seem to be a common problem for photo-editing platforms; Flickr experienced a strikingly similar misidentification issue in May. While Alcine’s posts immediately elicited over a thousand retweets, the most important response came over an hour later, from Yonatan Zunger, Google’s chief social architect.

https://twitter.com/yonatanzunger/status/615355996114804737

As a remedy for the mishap, developers removed the “gorilla” tag from Google Photos’ database and tweaked searches. Zunger admitted, however, that more work is required to come up with a better fix.

“Really interesting problems in image recognition here,” Zunger explained in his Twitter correspondence with Alcine. “Obscured faces, different contrast processing needed for different skin tones and lighting, etc. We used to have a problem with people (of all races) being tagged as dogs, for similar reasons.”

While this type of situation is understandably frustrating for people of color hoping to use Google Photos to store their selfies, the rapid response from Google’s customer support offers hope for an improved version of the app.

https://twitter.com/yonatanzunger/status/615383011819794432

A Google spokesperson also issued an official statement to Ars Technica:

“We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

H/T Ars Technica | Illustration by Max Fleishman

*First Published: Jul 1, 2015, 8:45 pm CDT