
Illustration by Max Fleishman

The abuse Leslie Jones endured on Twitter is nothing new for black women

Harassment and hate speech are everyday experiences for black female users of the platform.


Jaya Saxena


Posted on Jul 19, 2016   Updated on May 26, 2021, 10:35 am CDT

It’s well-known that Twitter is, at best, slow when it comes to responding to reports of abuse. But one would think a verified celebrity would get preferential treatment when reporting harassment. 

And yet last night actress and comedian Leslie Jones was on her own, fending off hundreds of racist and abusive tweets, before quitting the platform.


Earlier this year, Twitter formed a Trust and Safety Council, saying it “does not tolerate behavior intended to harass, intimidate, or use fear to silence another user’s voice.” The council of safety advocates and organizations provides input on Twitter’s features and programs, supposedly all in an effort to end abuse. 

However, it doesn’t seem to be working very well. Or if it is, it’s only working for a select few.

There have been many instances of both celebrities and nonfamous people being banned from Twitter for good cause. Last year, George Zimmerman was suspended after tweeting nude photos of his ex-girlfriend. Chuck C. Johnson was suspended after asking his followers to “take out” activist DeRay Mckesson. And Twitter has made it easier to report multiple tweets from a single user, rather than reporting them one at a time.

But the abuse against Jones is, unfortunately, not shocking. Many black women say that what Jones experienced is commonplace on Twitter, and that Twitter is especially slow to respond to racist abuse.


When she saw the abuse Jones was receiving, comedian and YouTuber Akilah Hughes told the Daily Dot, “It felt like I was reading my own mentions on nights when I decide to take a stance or use a trending hashtag. When you’re black, it’s totally common to be called a slur after just existing on this site and using it like everyone does.”

The response was similar for Shireen Mitchell, founder of Digital Sistas and Stop Online Violence Against Women. “My first thought was I’m not surprised. This is very typical,” she said. “And my second thought was if Twitter doesn’t do anything for a celebrity, this is a level of bias we can’t dig into.” Writer Morgan Jerkins agreed. “I was definitely desensitized to it. Nothing surprised me whatsoever about the backlash that she received.”

Hughes said that every time she is abused, reporting it takes a lot of work on her end, and so far it has rarely been successful. There was only “one time ever they actually deactivated a racist account that was harassing me,” she said, “but only because the account had less than 20 tweets and they were all directed at me.”

So what is Twitter’s formal abuse and harassment policy? Well, it explicitly states: “You may not promote violence against or directly attack or threaten other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or disease.”

The Daily Dot has reached out to Twitter to clarify whether targeted racist speech counts as “promoting violence,” but has not heard back as of this posting. 

But often abuse on the platform comes with physical threats—rape threats, death threats, abusers telling users to “kill yourself.” Much of it is targeted toward women, and it’s historically been hard to get Twitter to recognize this kind of talk as harassment. And it seems that racist abuse falls into the “sticks and stones” category for Twitter—it’s hurtful, but nobody is threatening to bomb your house (yet).

“People are used to saying these things to people of color, especially women of color, and getting away with it,” says Mitchell. And that goes for Twitter too. “They don’t see anything wrong with it, so they don’t think anything bad is happening. If you had a woman of color looking at [Jones’s case], they’d see both the gender problems and the race problems.” 

For instance, a tweet saying, “I hope your kid doesn’t become a hashtag” directed at a black woman may not set off any hate speech filters, but the intent is as violent as anything else. 

There’s also the issue of selectiveness in Twitter’s response to abuse. Earlier this year, rapper Azealia Banks was suspended from Twitter after she sent prejudiced tweets to former One Direction member Zayn Malik. Banks had a long history of engaging in hate speech on the platform, but then again, so have many other famous and nonfamous users. People were quick to question why they weren’t being punished equally, and why others who’ve been abused have received more support.



“They got rid of her in an instant,” says Mitchell of Banks. “The reverse is never true. And she never threatened [Zayn’s] life. I do not support anything Banks said or has been saying, but the issue is about the double standards.”

In another example of selective enforcement, Instagram, which is owned by Facebook, appears to be deleting comments from Taylor Swift’s account after users flooded her with snake emoji (inspired by a tweet from Kim Kardashian amid her and Kanye West’s feud with Swift).

Instagram certainly has a better reputation for responding to abuse than Twitter, but according to Jerkins, the response is faster for white women and white celebrities. “White women always get preferential treatment even if they are not being abused. We have to understand that white women and their supposed inherent innocence and purity have been a crucial element of white supremacy,” she said, pointing to the history of lynching black men for engaging with white women. “I bet you if you scroll through Nicki Minaj or any other black female celeb’s Instagrams, you’d probably see a lot of hatred, but it’s not being cleared up.”

Hughes says it’s proof that the perceived harassment of white women is still seen as more important than black women experiencing racism. “I think we handle white women with kid gloves and we ignore black women’s very real problems,” she said. “I think Twitter realized that Leslie Jones isn’t as popular as Taylor Swift, and that her being black certainly wasn’t going to be enough reason to help. It never is.”

A representative for Twitter told BuzzFeed the abuse Jones faced is not allowed on Twitter, and that “we rely on people to report this type of behavior, but we are continuing to invest heavily in improving our tools and enforcement systems to prevent this kind of abuse.” 

However, overall it seems Twitter regularly ignores the inherent violence of racist speech. The abuse may never escalate to physical threats, but it shouldn’t need to. The legacy of racism is painful enough for the victims of it.

Many will argue that victims should get off Twitter to avoid further abuse. Twitter is, on the surface, entirely optional. But such thinking ignores the role social media plays in our lives. Twitter is a space where people find communities, build professional connections, and gather information. And by not responding to abuse, it increasingly becomes a space black women and others can’t use. This abuse directly and disproportionately cuts black women off from important networks and relationships.

According to Mitchell, what happened to Jones is the “epitome of the black woman’s experience in tech.” It takes a lot of work (and money) to monitor abuse on a platform as large as Twitter, and some suspect that cost is what’s driving the company’s reluctance. After all, abuse counts as platform engagement.

Despite all this, Jerkins says she isn’t leaving. “Even though the abuse happens, I am always going to be using Twitter because if I left that would give the racist trolls exactly what they desire: a silencing of women of color’s voices.”

But for many others, any action from Twitter may be too little, too late. “No matter what happens online, women of color get it six times worse with no protections,” says Mitchell. “Black women are the biggest voices on Twitter—but also the most unprotected.”
