Can Twitter’s new ‘Safety Council’ solve the company’s massive harassment problem?
But it’s unclear what this group will do.
As pressure mounts for Twitter to fix its growing harassment problem, the company has announced its latest attempt at a solution.
The company just launched the Twitter Trust and Safety Council, a group made up of 40 organizations that will supposedly help Twitter develop better policies and tools for addressing abuse. Included among the advocacy groups and researchers is Feminist Frequency, one of the most high-profile targets of threats and harassment on the platform.
There’s not much information about what the Twitter Trust and Safety Council will actually do. Patricia Cartes, Twitter’s head of global policy outreach, wrote: “As we develop products, policies, and programs, our Trust & Safety Council will help us tap into the expertise and input of organizations at the intersection of these issues more efficiently and quickly.”
The Trust and Safety Council website does not provide comprehensive information, either. We’ve reached out to Twitter for clarification and will update when we hear back.
Harassment on Twitter is an enormous problem, one the company tries to keep in check while balancing freedom of speech. Current reporting and blocking tools aren’t enough to stem the onslaught of slurs, threats, and abuse that goes unresolved. The onus is on users to report accounts for abuse individually, and when they do, Twitter seems to decide arbitrarily what constitutes harassment or a bannable offense, often taking no action against reported accounts that appear to blatantly break the rules.
Some user-created blocking tools help, including Block Together and the Block Bot. In response to the automated blocking features built by its community, last year Twitter made it possible to share block lists directly.
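Shared block lists work by exporting the numeric IDs of blocked accounts so another user can import and block them in bulk. As a rough illustration only (the exact export format and any function names here are assumptions, not Twitter’s documented behavior), a community tool might parse such an export like this:

```python
import csv
import io

def parse_block_list(csv_text):
    """Return the numeric account IDs found in an exported block list.

    Assumes the export is a CSV with one numeric ID per row; malformed
    rows are skipped rather than raising an error.
    """
    reader = csv.reader(io.StringIO(csv_text))
    ids = []
    for row in reader:
        if row and row[0].strip().isdigit():
            ids.append(int(row[0].strip()))
    return ids

# A real tool would then block each ID through Twitter's API; that network
# step is omitted here to keep the sketch self-contained and offline.
exported = "12345\n67890\nnot-an-id\n"
print(parse_block_list(exported))  # [12345, 67890]
```

Tools like Block Together layered subscription and automation on top of this basic idea, so users could share and continuously sync lists rather than import them once.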
As comic and writer Alison Leiby recently explained after one of her jokes went viral, causing her to receive an onslaught of violent, misogynistic tweets, harassment and the failure to address it ultimately makes individuals want to stop using these platforms.
Reading a barrage of violent comments and threats doesn’t make me want to retaliate. It doesn’t make me want to fire back at those guys with the same hate and rage that they spewed my direction about me and the rest of my gender. It makes me want to censor myself. It makes me hesitant to write certain jokes. Could this tweet make hundreds of men tell me I belong locked in their closet? Will this idea I’m putting out there also end in threats of rape or murder?
That’s what happens to Twitter when people don’t feel safe. People stop creating and people stop posting. One developer set her account to private this week after receiving so many gendered slurs and threatening tweets that she wound up blocking 700,000 accounts.
She used a mix of tools including shared blocklists and custom scripts to block people on Twitter. The woman, who asked not to be named to avoid further harassment, says that accounts knowing they are blocked from viewing someone’s profile contributes to the problem.
“It’s clear that Twitter doesn’t care,” she said in an email to the Daily Dot. “The harassment I get is small potatoes compared to some of the higher-profile women on Twitter, and they’ve sat on their hands then too. Twitter’s out here changing favorites to likes and swapping the Moments button with Notifications hoping you’ll accidentally click on it, but they barely seem to acknowledge that harassment is a major problem on their platform and won’t actually listen to the people who have been through it on how to solve it.”
The new Trust and Safety Council appears to address this criticism, at least, by bringing in individuals and organizations who have experienced harassment firsthand.
At first, she explained, she tried to report her harassers to Twitter, but the process of reporting multiple accounts is flawed—Twitter emails follow-up questions about a report without including the account information or the tweet in question, making it hard to keep track. And, she said, Twitter never really did anything about the threatening accounts.
“I personally know a handful [of women] who have left the platform, but more often what happens is we just avoid certain topics entirely,” she said. “Don’t speak out about online harassment, because that’s the best way to attract online harassment. Don’t be a vocal feminist—heck, don’t be a woman on the internet—because you’ll get abuse for it.”
The fundamental disconnect between Twitter’s users and the company itself was perfectly encapsulated by a series of tweets from Brandon Carpenter, one of the company’s iOS developers. After last week’s news that Twitter might start showing timelines out of order, his tweets on the subject went viral. It was the first time one of his tweets traveled beyond his “immediate network,” and he experienced firsthand the abuse and harassment from people unhappy with the reported algorithmic changes.
Wow people on Twitter are mean
— Brandon Carpenter (@bhcarpenter) February 6, 2016
Heh, I’ve been on Twitter since 2009, and I think this is the first time one of my Tweets has gone beyond my immediate network.
— Brandon Carpenter (@bhcarpenter) February 6, 2016
Hopefully Twitter’s new Trust and Safety Council will help make Twitter a safer place where people feel empowered to speak, create, and share without fear of being silenced by harassers.
It will certainly have its share of skeptics at the outset. As the developer told me: “Time will tell if it does any good, or if it’s just a feel-good PR move.”
Illustration via Max Fleishman
Selena Larson is a technology reporter based in San Francisco who writes about the intersection of technology and culture. Her work explores new technologies and the way they impact industries, human behavior, and security and privacy. Since leaving the Daily Dot, she's reported for CNN Money and done technical writing for cybersecurity firm Dragos.