AI ‘gaydar’ is now a thing—and it can tell if you’re gay based on your face
Some see this as dangerous.
In a recently published study, researchers from Stanford University analyzed more than 35,000 faces of men and women from a dating website. Using a deep neural network to analyze the images, the researchers created an algorithm that could correctly tell whether men were straight or gay 81 percent of the time, and whether women were gay or straight 74 percent of the time.
The study, which found that human “gaydar” was much less reliable than the machine’s algorithm, said that gay men and women tended to have “gender-atypical” features.
“The data also identified certain trends, including that gay men had narrower jaws, longer noses, and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women,” the Guardian reported.
The researchers said the study provides “strong support” for the theory that sexual orientation is related to exposure to certain hormones before birth, and that therefore people are born gay.
“The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid,” the Guardian reported.
The study—which only examined white people’s faces and did not consider transgender or bisexual people—raised questions about the ethics of facial recognition technology and how it could violate LGBTQ people’s privacy.
Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of gaydar, told the Guardian that this new AI capability is “certainly unsettling.”
With billions of images of people’s faces stored on social media sites and in government databases, it’s not hard to imagine how something could go wrong.
“If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that’s really bad,” Rule told the Guardian.
But the researchers and Rule both said it’s still important to develop and test this technology so that governments and companies can proactively consider its dangers and implement regulations.
“What the authors have done here is to make a very bold statement about how powerful this can be,” Rule told the Guardian. “Now we know that we need protections.”
H/T the Guardian
Kris Seavers is the Evening Editor for the Daily Dot, where she covers breaking news, politics, and LGBTQ issues. Her work has appeared in Central Texas publications, including Austin Monthly and San Antonio Magazine, and on NPR.