AI ‘gaydar’ is now a thing—and it can tell if you’re gay based on your face
Some see this as dangerous.
In a recently published study, researchers from Stanford University analyzed more than 35,000 faces of men and women from a dating website. Using a deep neural network to extract features from the images, the researchers built an algorithm that could correctly tell whether men were straight or gay 81 percent of the time, and whether women were gay or straight 74 percent of the time.
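To make the reported numbers concrete: the study's pipeline boils down to training a binary classifier on feature vectors derived from face images and scoring it by plain accuracy, which is what the 81 percent and 74 percent figures refer to. Below is a minimal, hypothetical sketch of that setup using synthetic feature vectors in place of real facial data; none of the names or numbers here come from the study itself.

```python
import numpy as np

# Hypothetical sketch: logistic regression over stand-in "facial feature"
# vectors, scored by accuracy. Synthetic data replaces real face images.
rng = np.random.default_rng(0)

n, d = 400, 5
# Two classes drawn from Gaussians with shifted means.
X = np.vstack([rng.normal(-1.0, 1.0, (n // 2, d)),
               rng.normal(+1.0, 1.0, (n // 2, d))])
y = np.array([0] * (n // 2) + [1] * (n // 2))

# Train logistic regression with batch gradient descent.
w, b, lr = np.zeros(d), 0.0, 0.1
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= lr * (X.T @ (p - y)) / n           # gradient step on weights
    b -= lr * np.mean(p - y)                # gradient step on bias

# Accuracy: fraction of predictions matching the labels.
preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) >= 0.5).astype(int)
accuracy = np.mean(preds == y)
print(f"accuracy: {accuracy:.2f}")
```

On real data the hard part is the feature extraction, not the classifier; the sketch sidesteps that entirely, which is exactly why results like the study's depend so heavily on the dataset used.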
The study, which found that human “gaydar” was much less reliable than the machine’s algorithm, said that gay men and women tended to have “gender-atypical” features.
“The data also identified certain trends, including that gay men had narrower jaws, longer noses, and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women,” the Guardian reported.
The researchers said the study provides “strong support” for the theory that sexual orientation is related to exposure to certain hormones before birth, and that therefore people are born gay.
“The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid,” the Guardian reported.
The study—which did not include people of color (only white faces were analyzed) or consider transgender or bisexual people—raised questions about the ethics of facial recognition technology and how it could be used to violate LGBT people's privacy.
Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of gaydar, told the Guardian that this new AI capability is “certainly unsettling.”
With billions of images of people’s faces stored on social media sites and in government databases, it’s not hard to imagine how something could go wrong.
“If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that’s really bad,” Rule told the Guardian.
But both the researchers and Rule said it's still important to develop and test this technology so that governments and companies can proactively consider its dangers and implement regulations.
“What the authors have done here is to make a very bold statement about how powerful this can be,” Rule told the Guardian. “Now we know that we need protections.”
H/T the Guardian
Kris Seavers is the Evening Editor for the Daily Dot, where she covers breaking news, politics, and LGBTQ issues. Her work has appeared in Central Texas publications, including Austin Monthly and San Antonio Magazine, and on NPR.