Google reportedly sent people searching ‘Black girls’ to porn (updated)

Analysis

The Google Ads Keyword Planner helps users optimize their pages for the best search engine results possible. It's also racist. Until this week, searching for girls of color surfaced myriad sexualized keywords and pornography terms as suggestions, a new report from the Markup revealed.

After searching “black girls” in the keyword planner, Google recommended users consider porn-related keywords including “naked black girls,” “big booty black girls,” “ebony cam,” and “ebony nude,” the Markup found. Sexually objectifying terms were also provided for “latina girls” and “asian girls,” whereas the phrases “white girls” and “white boys” both “returned no suggested terms at all.”

Among 435 terms returned by “black girls,” 203 were “adult ideas,” and the remaining 232 still had pornographic language in the search results, the Markup reported. The keywords “black girls sucking dick” and “black chicks white dick” both appeared as suggested search terms outside of the keyword planner’s adult filter. Another keyword, “piper perri blacked,” refers to white porn star Piper Perri on an adult site focused specifically on Black men topping white and light-skinned women.

Racist results for ‘black girls’ as reported by The Markup.

After the Markup sent Google a request for comment, the tech company reportedly blocked keyword planner results for search terms that combine a race or ethnicity with the words "boys" or "girls," the report notes.

“Google’s systems contained a racial bias that equated people of color with objectified sexualization while exempting white people from any associations whatsoever,” the Markup reported. “In addition, by not offering a significant number of non-pornographic suggestions, this system made it more difficult for marketers attempting to reach young Black, Latinx, and Asian people with products and services relating to other aspects of their lives.”

Google said it has since “removed these terms from the tool” and is “looking into how we [can] stop this from happening again.”

“The language that surfaced in the keyword planning tool is offensive and while we use filters to block these kinds of terms from appearing, it did not work as intended in this instance,” spokesperson Suzanne Blackburn told the Markup.

Despite these changes, the Daily Dot confirmed that searching for "black girls," "latina girls," "asian girls," and case-sensitive variations of those terms immediately surfaced suggestions that sexualize the groups on two services that rely on Google Search data.

Mangools' KWFinder gives users keyword suggestions based on search data; Google is "one of [its] main data sources," to the point where "the data that [it uses] are often identical to the ones in Google Keyword Planner," as Mangools notes in a blog post. The Daily Dot discovered that searching "latina girls" on KWFinder returned "hot latinas" and "latina boobs" as the first and 12th related keyword suggestions, respectively. "Black girls" brought up "ebony girls" as the second recommended keyword. For "asian girls," the suggestions "oriental girls," "asian girl on girl," and "hot asian girls" all appeared within the top 10 results.

Meanwhile, searching for “white boys” suggested “black male models” and “black men on white boys” within the top five results. (Among the other two sexually suggestive keywords, “hot white boys” and “hot white men,” only the former ranked in the top five.) A query for “white girls” on KWFinder brought up no adult keywords in the top 10 list.

Google Trends suggested more blatantly adult material. Searching for “black girls” with the tool brought up top five suggestions for “rising” related topics such as “strap-on dildo” and “ethnic pornography.” A search for “Asian girls” suggested related topics like “camel toe,” “strap-on dildo,” and “defecation” within the first five results, along with “webcam model” as the ninth result. Meanwhile, searching “Latina girls” suggested “stripper,” “creampie,” “defecation,” “handjob,” and “tranny.”


A search for "White girls" included fewer sexual topics, all of which fell outside the top five, with one in the top 10. Only one sexual term, "Twink," was returned for a "White boys" search, whereas a search for "Asian boys" brought up topics such as "urination," "sissy," and "semen." (All search results were case-sensitive.)

An info blurb on Google Trends states that these rising topics come from "users searching for your term [who] also searched for these topics" and that the tool displays "related topics with the biggest increase in search frequency" over a specific timeframe (in the Daily Dot's case, the past 12 months). However, the Markup argued that racism is "embedded in Google's algorithms," and such criticism dates back well before 2020.

UCLA professor Safiya Umoja Noble, author of Algorithms of Oppression, wrote for Bitch magazine in 2012 that Google and other search engines use artificial intelligence and algorithms to assign values to specific keywords without taking "social context into account." Those values are shaped by "a variety of commercial advertising and political, social, and economic factors" that push some search results higher than others, Noble wrote.

“When it comes to commercial search engines, it is no longer enough to simply share news and education on the web—we must ask ourselves how the things we want to share are found, and how the things we find have surfaced,” Noble explained. “These shifts are similar to the ways that certain kinds of information are prioritized to the top of the search pile: information, products, and ideas promoted by businesses and sold to industries that can afford to purchase keywords at a premium, or URLs and advertising space online that drive their results and links to the top of the near-infinite pile of information available on the web.”


In other words, when Google Trends or KWFinder suggests a specific topic or keyword, there's more to the story than an algorithm spitting out data. Websites game keyword values and topic suggestions to raise their search rankings, creating a self-fulfilling cycle in which racism and misogyny are regurgitated through the search engines people use every day. The Markup's findings are a symptom of that problem.

The Daily Dot reached out to Google and the Markup for comment.

Update 3:34pm CT, July 24: When reached for comment, a Google spokesperson told the Daily Dot that Google Keyword Planner is primarily used for research and that "very few active ad campaigns are driven by Keyword Planner." The spokesperson also said that suggested keywords would not necessarily be approved for ads. However, the data used in the Keyword Planner is based on Google Search content.

“While the planner tool may reflect real keyword trends from Search, it would not be accurate to say that our Google Search product is reflecting those searches in features like Autocomplete or in refinement features. We have policies that govern those features,” a Google spokesperson said, directing the Daily Dot to an explainer on Google Search.

“Because our systems are surfacing and organizing information and content from the web, search can mirror biases or stereotypes that exist on the web and in the real world,” the spokesperson said. “We understand that this can cause harm to people of all races, genders and other groups who may be affected by such biases or stereotypes, and we share the concern about this. We have worked, and will continue to work, to improve image results for all of our users.”

When reached for comment, the Markup's Aaron Sankin said the Daily Dot's findings appear to confirm that "this issue of the sexualization of young people of color is a problem that's seeped its way into many different systems across the internet."

“It does say something about the intractability of this very specific problem of an association between these terms and porn, at the exclusion of terms that speak to other aspects of people’s lives, that even with all this attention being paid to it, Google (and other companies that draw from Google’s data) aren’t able to solve it,” Sankin said.

Editor’s note: Sankin is a former senior staff writer of the Daily Dot.


H/T the Markup