In the recently published study, researchers from Stanford University analyzed more than 35,000 faces of men and women from a dating website. Using a sophisticated mathematical system to look at images, the researchers created an algorithm that could correctly tell whether men were straight or gay 81 percent of the time, and whether women were gay or straight 74 percent of the time.
The study, which found that human “gaydar” was much less reliable than the machine’s algorithm, said that gay men and women tended to have “gender-atypical” features.
“The data also identified certain trends, including that gay men had narrower jaws, longer noses, and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women,” the Guardian reported.
The researchers said the study provides “strong support” for the theory that sexual orientation is related to exposure to certain hormones before birth, and that therefore people are born gay.
“The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid,” the Guardian reported.
The study, which analyzed only white faces and did not consider transgender or bisexual people, raised questions about the ethics of facial recognition technology and how it could violate LGBT people’s privacy.
Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of gaydar, told the Guardian that this new AI capability is “certainly unsettling.”
With billions of images of people’s faces stored on social media sites and in government databases, it’s not hard to imagine how something could go wrong.
“If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that’s really bad,” Rule told the Guardian.
But the researchers and Rule both said it’s still important to develop and test this technology so that governments and companies can proactively consider its dangers and implement regulations.
“What the authors have done here is to make a very bold statement about how powerful this can be,” Rule told the Guardian. “Now we know that we need protections.”
H/T the Guardian
Kris Seavers is the IRL editor for the Daily Dot. Her work has appeared in Central Texas publications, including Austin Monthly and San Antonio Magazine, and on NPR.