A new website suggests we might not be ready to know what AI thinks of us.
The project, called ImageNet Roulette, allows users to upload their photos and see how their faces are categorized by machine learning software trained to identify humans.
The team behind ImageNet Roulette says the project’s aim is to expose the many issues with such classifications, which are based on datasets with “problematic, offensive and bizarre categories.”
“AI classifications of people are rarely made visible to the people being classified,” the website states. “ImageNet Roulette provides a glimpse into that process—and to show the ways things can go wrong.”
Once users began sharing their results on Twitter, it didn’t take long for those issues to surface.
Many photos of Black people, for example, were labeled with outdated and offensive terms like “Negro” and “Negroid.” The terms were sometimes applied to photos of white people, too.
This the last thing I expected: pic.twitter.com/MVM1j5I8Nk — the Godfrogger (@myuncleisadj) September 16, 2019
Similarly, the software categorized at least one Black person as a “clown” wearing white makeup.
Man who you tellin? pic.twitter.com/RmVo5G2bZG — Adrian T. WOMACK (@weauxmaque) September 16, 2019
The AI described one woman who uploaded her own photo as an unmarried girl and a likely “virgin.”
The AI categorized a photo uploaded by a man of his 16-year-old self as a “rape suspect.”
I uploaded this photo from when I was at a Florida Marlins game as a 16 year old, and uh... pic.twitter.com/7AtTADOrbZ — Will Brown (@WdB11) September 16, 2019
Even Democratic presidential candidate Joe Biden wasn’t safe. The AI labeled the former vice president as a “klansman,” while President Donald Trump was categorized as a “centrist.”
Those interested in getting roasted by AI can visit the site and upload a picture or use their webcam. The site’s creators say they do not store or retain any of the uploaded images.
Mikael Thalen is a tech and security reporter based in Seattle, covering social media, data breaches, hackers, and more.