Google Photos bug tags black people with racist term
‘This is 100% not OK.’
Google Photos has the makings of a great web-based photo-editing platform. Unfortunately, its auto-tagging feature might need a serious overhaul, as evidenced by a tweet documenting a highly erroneous and, however unwittingly, racist result:
A new update to the app reportedly allows it to automatically tag people in uploaded photos; the photos are then mechanically sorted into categories based on similar images. When Twitter user Jacky Alcine uploaded pictures of himself posing with a friend, the platform filtered the set into an album titled “Gorillas.”
Erroneous auto-tagging seems to be a common problem for photo platforms; Flickr experienced a strikingly similar misidentification issue in May. While Alcine’s posts immediately elicited over a thousand retweets, the most important response came over an hour later, from Yonatan Zunger, Google’s chief social architect.
As an immediate remedy, developers removed the “gorilla” tag from Google Photos’ database and tweaked its search results. Zunger admitted, however, that more work is required to come up with a lasting fix.
“Really interesting problems in image recognition here,” Zunger explained in his Twitter correspondence with Alcine. “Obscured faces, different contrast processing needed for different skin tones and lighting, etc. We used to have a problem with people (of all races) being tagged as dogs, for similar reasons.”
While this type of situation is understandably frustrating for people of color hoping to use Google Photos to store their selfies, the rapid response from Google offers hope for an improved version of the app.
A Google spokesperson also issued an official statement to Ars Technica:
“We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”
H/T Ars Technica | Illustration by Max Fleishman
Jam Kotenko is a technology reporter who specializes in coverage of Instagram, Facebook, and other social media apps. Her work has been published by Digital Trends, Bustle, and Gotta Be Mobile.