Most of the 10,000 slurs we tweet every day aren’t intended to be derogatory, but that doesn’t make them any less disturbing.
Just how racist is Twitter? About 0.007 percent racist, according to a new study.
But that still adds up to a disconcerting volume of words used as religious, ethnic, or racial slurs. There’s some good news, however: Most of the language isn’t used with derogatory intent.
Media Bistro reports that Demos, a UK research group, has published a new study finding that about 1 in every 15,000 tweets contains a slur. That’s over 10,000 tweets each day.
The most common phrase that turned up in the study? “White boy.”
The study crowdsourced its list of slurs from Wikipedia and then scraped tweets from the platform during late November of 2012. The result was nearly 127,000 tweets containing what researchers identified as a slur. From there, researchers used machines and humans to analyze how the tweets were being used.
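The collection step described above boils down to matching each tweet against a fixed term list. Here is a minimal sketch of that kind of keyword filter, assuming benign placeholder terms rather than the study’s actual list (which Demos assembled from Wikipedia and does not publish in the article text):

```python
import re

# Placeholder term list standing in for the study's crowdsourced slur
# list; these are invented, benign stand-ins, not the actual terms.
TERM_LIST = ["termA", "termB", "white boy"]

# One word-boundary pattern per term, so a term never matches inside a
# longer, unrelated word.
PATTERNS = [
    re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
    for term in TERM_LIST
]

def matching_terms(tweet: str) -> list:
    """Return every listed term that appears in the tweet's text."""
    return [t for t, p in zip(TERM_LIST, PATTERNS) if p.search(tweet)]

# Toy corpus: one tweet containing a listed phrase, one clean tweet.
tweets = [
    "some white boy said hi",
    "nothing to see here",
]
flagged = [tw for tw in tweets if matching_terms(tw)]
```

Note that a filter like this only finds surface strings; it says nothing about intent, which is exactly why the study needed the later human and machine classification pass.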
The study stresses that it’s a measurement of specific words and phrases only, not hate speech or other kinds of harmful dialogue around race on Twitter. And while praising Twitter as “an unprecedented source of data” for linguists, its authors cautioned that context, which is not always easy to glean from 140 characters, is everything.
But even within the study’s limited constraints, the results are revealing:
- Most uses of slurs are casual, without any violent or abusive intent; only a relatively small number (between 500 and 2,000 daily) were directed at an individual with derogatory intent.
- Casual slurs were most often used to build in-group solidarity or to describe someone. These kinds of non-derogatory uses include examples of humor, satire, and a group appropriating or reclaiming a word used to slur them.
- But the study also noted that “casual use of racial slurs” was significant: “The way in which racist language can seep into broader language use and reflect underlying social attitudes is potentially an area of concern,” the study notes. Between 5 and 10 percent of tweets fell into the category of slurs that the user might view as non-derogatory, but that others could perceive as threatening.
Another aim of the study was to determine how well machine analysis could work on these types of tweets, particularly at determining context and purpose. While the machine methods were useful for sorting the data, they fared less well when it came to parsing whether tweets were derogatory or not.
But then again, so did the humans, who disagreed with each other nearly a third of the time on how to classify the data. The study cited “cultural bias” as one reason for this, pointing out that even the culture around the words themselves isn’t always clear on what’s a slur or not. For example, many people would argue that the phrase “white boy” doesn’t operate on the level of other words in the list, because it doesn’t help to reinforce a social power imbalance that favors white men over basically everyone else.
And plenty of people might disagree with the way the study catalogued certain tweets. Here’s the example the study gives for a “casual racial slur”:
@^^^: Fucking paki shops that charge you for cash withdrawals or paying by card need nuked
The study labeled this tweet as “non-heated,” but to many people, a tweet declaring that owners of Pakistani shops should be bombed and destroyed might come across as very heated. And while the study distinguished between this kind of tweet and “an explicit incitement to do something ‘in the real world,’” similar oblique tweets have been read as threats when they’re directed at, oh, the U.S. president instead.
All in all, the study raises as many questions as it answers, but one thing seems clear.
Whether it’s easy to judge their intent or not, words can and do still hurt you.
Photo via eldh/Flickr (CC-BY-SA 2.0)
Aja Romano is a geek culture reporter and fandom expert. Their reporting at the Daily Dot covered everything from Harry Potter and anime to Tumblr and Gamergate. Romano joined Vox as a staff reporter in 2016.