A new website suggests we might not be ready to know what AI thinks of us.
The project, called ImageNet Roulette, allows users to upload their photos and see how their faces are categorized by machine learning software trained to identify humans.
The team behind ImageNet Roulette says the project’s aim is to expose the many issues with such classifications, which are based on datasets with “problematic, offensive and bizarre categories.”
“AI classifications of people are rarely made visible to the people being classified,” the website states. “ImageNet Roulette provides a glimpse into that process—and to show the ways things can go wrong.”
Once users began sharing their results on Twitter, it didn’t take long for those issues to surface.
Many photos of Black people, for example, were labeled with outdated and offensive terms like “Negro” and “Negroid.” The terms were sometimes applied to photos of white people, too.
“This the last thing I expected.” pic.twitter.com/MVM1j5I8Nk — the Godfrogger (@myuncleisadj), September 16, 2019
Similarly, the software categorized at least one Black person as a “clown” wearing white makeup.
“Man who you tellin?” pic.twitter.com/RmVo5G2bZG — Adrian T. WOMACK (@weauxmaque), September 16, 2019
The AI described one woman who uploaded her own photo as an unmarried girl and a likely “virgin.”
The AI categorized a photo uploaded by a man of his 16-year-old self as a “rape suspect.”
“I uploaded this photo from when I was at a Florida Marlins game as a 16 year old, and uh…” pic.twitter.com/7AtTADOrbZ — Will Brown (@WdB11), September 16, 2019
Even Democratic presidential candidate Joe Biden wasn’t safe. The AI labeled the former vice president as a “klansman,” while President Donald Trump was categorized as a “centrist.”
Those interested in getting roasted by AI can visit the site and upload a picture or use their webcam. The site’s creators state that they do not store any of the uploaded images.