AI ‘gaydar’ is now a thing—and it can tell if you’re gay based on your face
Artem Oleshko/Shutterstock (Licensed)
Some see this as dangerous.
In a recently published study, researchers from Stanford University analyzed more than 35,000 facial images of men and women posted to a dating website. Using a sophisticated mathematical system to extract features from the images, the researchers built an algorithm that correctly distinguished gay from straight men 81 percent of the time, and gay from straight women 74 percent of the time.
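The article doesn't describe the researchers' actual code or model. As an illustration only, a classifier of this general kind is often built by reducing each face image to a fixed-length numeric embedding and then fitting a simple binary classifier on those embeddings. The sketch below uses synthetic stand-in data and a plain logistic regression trained by gradient descent; the embedding dimensions, data, and training settings are all assumptions for demonstration, not details from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: 2,000 synthetic 64-dimensional "face embeddings" with a
# weak linear signal determining a binary label. Real systems would derive
# embeddings from actual images; this is purely illustrative.
n, d = 2000, 64
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = (X @ true_w + rng.normal(scale=4.0, size=n) > 0).astype(float)

# Simple train/test split.
X_train, y_train = X[:1500], y[:1500]
X_test, y_test = X[1500:], y[1500:]

# Logistic regression fit with plain batch gradient descent.
w = np.zeros(d)
learning_rate = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X_train @ w)))        # predicted probabilities
    grad = X_train.T @ (p - y_train) / len(y_train)  # gradient of log-loss
    w -= learning_rate * grad

# Fraction of held-out examples classified correctly -- the analogue of the
# accuracy figures (81% / 74%) the article reports.
pred = (X_test @ w > 0).astype(float)
accuracy = (pred == y_test).mean()
```

Because the synthetic labels contain noise, the classifier's held-out accuracy lands well above chance but below 100 percent, which mirrors how the study's accuracy numbers should be read: better than random guessing, far from certainty.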
The study, which found that human “gaydar” was much less reliable than the machine’s algorithm, said that gay men and women tended to have “gender-atypical” features.
“The data also identified certain trends, including that gay men had narrower jaws, longer noses, and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women,” the Guardian reported.
The researchers said the study provides “strong support” for the theory that sexual orientation is related to exposure to certain hormones before birth, and that therefore people are born gay.
“The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid,” the Guardian reported.
The study, which examined only white people's faces and did not consider transgender or bisexual people, raised questions about the ethics of facial recognition technology and how it could violate LGBT people's privacy.
Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of gaydar, told the Guardian that this new AI capability is “certainly unsettling.”
With billions of images of people’s faces stored on social media sites and in government databases, it’s not hard to imagine how something could go wrong.
“If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that’s really bad,” Rule told the Guardian.
But the researchers—and Rule—both said it’s still important to develop and test this technology in order for governments and companies to proactively consider its dangers and implement regulations.
“What the authors have done here is to make a very bold statement about how powerful this can be,” Rule told the Guardian. “Now we know that we need protections.”
H/T the Guardian
Kris Seavers is the Evening Editor for the Daily Dot, where she covers breaking news, politics, and LGBTQ issues. Her work has appeared in Central Texas publications, including Austin Monthly and San Antonio Magazine, and on NPR.