Can Twitter’s new ‘Safety Council’ solve the company’s massive harassment problem?

But it's unclear what this group will do.

Selena Larson

Tech

Posted on Feb 9, 2016   Updated on May 27, 2021, 6:12 am CDT

As pressure mounts for Twitter to fix its growing harassment problem, the company has announced its latest attempt at a solution.

The company just launched the Twitter Trust and Safety Council, a group of 40 organizations that will supposedly help Twitter develop better policies and tools for addressing abuse. Among the advocacy groups and researchers is Feminist Frequency, one of the most high-profile targets of threats and harassment on the platform.

There’s not much information about what the Twitter Trust and Safety Council will actually do. Patricia Cartes, Twitter’s head of global policy outreach, wrote: “As we develop products, policies, and programs, our Trust & Safety Council will help us tap into the expertise and input of organizations at the intersection of these issues more efficiently and quickly.” 

The Trust and Safety Council website does not provide comprehensive information, either. We’ve reached out to Twitter for clarification and will update when we hear back. 

Harassment on Twitter is an enormous problem, one the company tries to keep in check while balancing freedom of speech. Current reporting and blocking tools aren’t enough to stem the onslaught of slurs, threats, and abuse that goes unresolved by the company. The onus is on users to report abusive accounts individually, and when they do, Twitter seems to decide arbitrarily what constitutes harassment or a bannable offense, even when reported accounts blatantly break the rules.

https://twitter.com/justkelly_ok/status/694757316248244224

Some user-created blocking tools help, including Block Together and the Block Bot. In response to the automated blocking features built by its community, Twitter last year made it possible to share block lists directly.
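The lists Twitter lets users export and import are CSV files of numeric account IDs, so applying one in bulk takes only a few lines of code. Below is a minimal sketch of that idea, assuming the tweepy 3.x library and Twitter’s v1.1 blocking endpoint; the credentials and the shared_blocklist.csv filename are placeholders, not part of any real tool.

    import csv
    import tweepy

    # Placeholder credentials for a Twitter app; not real keys.
    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
    api = tweepy.API(auth, wait_on_rate_limit=True)

    def import_blocklist(path):
        """Block every numeric account ID listed in a shared blocklist CSV."""
        with open(path) as f:
            for row in csv.reader(f):
                if not row or not row[0].isdigit():
                    continue  # skip header rows and blank lines
                try:
                    api.create_block(user_id=int(row[0]))
                except tweepy.TweepError:
                    pass  # already blocked, suspended, or deleted

    import_blocklist("shared_blocklist.csv")

Tools like Block Together essentially layer sharing and subscription on top of this same blocking endpoint, so one person’s curation can protect many subscribers.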

As comic and writer Alison Leiby recently explained after one of her jokes went viral and drew a flood of violent, misogynistic tweets, harassment and the failure to address it ultimately make people want to stop using these platforms.

Reading a barrage of violent comments and threats doesn’t make me want to retaliate. It doesn’t make me want to fire back at those guys with the same hate and rage that they spewed my direction about me and the rest of my gender. It makes me want to censor myself. It makes me hesitant to write certain jokes. Could this tweet make hundreds of men tell me I belong locked in their closet? Will this idea I’m putting out there also end in threats of rape or murder?

That’s what happens when people don’t feel safe on Twitter: they stop creating and they stop posting. One developer set her account to private this week after receiving so many gendered slurs and threatening tweets that she wound up blocking 700,000 accounts.

She used a mix of tools, including shared blocklists and custom scripts, to block people on Twitter. The developer, who asked not to be named to avoid further harassment, says the problem is compounded by the fact that blocked accounts can tell they have been blocked from viewing someone’s profile.
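Her scripts aren’t public, but a self-defense script of that sort typically scans incoming mentions and blocks senders that match some signal. The sketch below is purely illustrative, again assuming tweepy 3.x; the keyword set is a hypothetical stand-in, not her actual tooling.

    import tweepy

    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
    api = tweepy.API(auth, wait_on_rate_limit=True)

    # Hypothetical trigger list; a real script would use the terms actually being sent.
    HOSTILE_TERMS = {"example_slur", "example_threat"}

    def block_hostile_repliers():
        """Scan recent mentions and block the author of any tweet with a flagged term."""
        for status in api.mentions_timeline(count=200):
            if any(term in status.text.lower() for term in HOSTILE_TERMS):
                api.create_block(user_id=status.user.id)
                print("Blocked @" + status.user.screen_name)

    block_hostile_repliers()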

“It’s clear that Twitter doesn’t care,” she said in an email to the Daily Dot. “The harassment I get is small potatoes compared to some of the higher-profile women on Twitter, and they’ve sat on their hands then too. Twitter’s out here changing favorites to likes and swapping the Moments button with Notifications hoping you’ll accidentally click on it, but they barely seem to acknowledge that harassment is a major problem on their platform and won’t actually listen to the people who have been through it on how to solve it.”   

The new Trust and Safety Council appears to address at least this criticism by working with individuals and organizations that have experienced harassment firsthand.

At first, she explained, she tried to report her harassers to Twitter, but the process of reporting multiple accounts is flawed: Twitter emails follow-up questions about a report without including the account or the tweet in question, making it hard to keep track. And, she said, Twitter never really did anything about the threatening accounts.

“I personally know a handful [of women] who have left the platform, but more often what happens is we just avoid certain topics entirely,” she said. “Don’t speak out about online harassment, because that’s the best way to attract online harassment. Don’t be a vocal feminist—heck, don’t be a woman on the internet—because you’ll get abuse for it.”

The fundamental disconnect between Twitter and the people who use it was perfectly encapsulated by a series of tweets from Brandon Carpenter, one of the company’s iOS developers. After last week’s news that Twitter might start showing timelines out of order, he tweeted several responses, which went viral. It was the first time one of his tweets traveled beyond his “immediate network,” and he got a taste of the abuse and harassment from people unhappy with the reported algorithmic changes.

https://twitter.com/bhcarpenter/status/695824384594808832

https://twitter.com/bhcarpenter/status/695834826432065536

Hopefully, Twitter’s new Trust and Safety Council will help make the platform a safer place where people feel empowered to speak, create, and share without fear of being silenced by harassers.

It will certainly have its share of skeptics at the outset. As the developer told me: “Time will tell if it does any good, or if it’s just a feelgood PR move.”

Illustration via Max Fleishman
