Inside TikTokers’ quest to make the platform more transparent

TikTok’s black-box content moderation sparks creator activism.

Viola Stefanello

In late January, a few weeks after people wielding white supremacist imagery stormed the U.S. Capitol, Georgetown University student Zev Burton (@zevulous) uploaded a video to TikTok explaining how to recognize the neo-Nazi propaganda that still runs rampant on the app.

Within 15 minutes, it was taken down. That’s when, partly out of confusion and partly out of spite, he uploaded the first of what would become an unexpectedly long series of videos addressed to the platform’s CEO.

[Embedded TikTok from @zevulous: “@tiktok please explain”]

“My question to you is: How do you plan to combat white supremacy when creators on this app can’t inform people about it?” he asked, looking into his phone’s camera. “Because there’s a white supremacy problem on TikTok! How do you plan on solving it?”

The next day, he sat down and did the same. The day after that, he started counting: “This is day 3 of asking TikTok CEO Vanessa Pappas to explain the content moderation guidelines,” he says in the video. “Even if they don’t want to release the algorithm that does that, just explain to me why sexist, racist, homophobic, Islamophobic, white nationalist, and neo-Nazi content is being kept up on this platform, but the moment I and plenty of other creators make a video trying to teach people about that so that they can be safe, it gets taken down.” Over a hundred days and 139,000 TikTok followers later, Burton is no closer to getting an answer.

TikTok’s silence is particularly jarring when you consider that it has long tried to position itself as a bastion of transparency. On its page dedicated to Transparency and Accountability, the company states that it aspires “to be transparent with our community in order to build and maintain trust.” Its users know the reality is much messier: innocent videos, and even entire profiles with large followings, risk being taken down with little to no explanation.

[Embedded TikTok from @zevulous: “day 100 @tiktok. It’s time for phase two. #releasetheguidelines”]

In theory, TikTok does not allow violent extremism, hateful ideologies, harassment, or hate speech directed at a long list of protected categories. The platform also says it may reduce the discoverability of videos that “could be considered upsetting or depict things that may be shocking to a general audience,” “including by redirecting search results or limiting distribution in the For You feed.” In practice, the reasoning behind its content moderation choices is, at best, nebulous. In just the past month, the platform has been accused of removing videos critical of Russian President Vladimir Putin at the Kremlin’s request and of deleting the accounts of trans creators without explanation.

After past moderation guidelines were leaked, the company admitted to having secretly limited the reach of videos published by LGBTQ+ and disabled creators, as well as by people arbitrarily considered poor or ugly, in what it called a misguided attempt to protect users from cyberbullying. 

And yet the company is investing considerable effort, and money, in transparency initiatives. It launched a new set of Community Guidelines and published a Transparency Report. It also inaugurated two Transparency and Accountability Centers, in Los Angeles and Washington, D.C., with a third set to open in Europe in the next few months. These physical offices are meant to give visitors a hands-on understanding of how the company stores data and moderates content. The rationale behind them, according to former interim CEO Vanessa Pappas, is to give people “a chance to evaluate moderation systems, processes, and policies in a holistic manner.”

The Daily Dot reached out to TikTok, but the company did not reply to a request for comment.

Experts say this commitment to transparency makes sense, considering that relations between the U.S. and China, where TikTok’s parent company ByteDance is based, show no sign of improvement, and that TikTok has been caught in the political crossfire before. “Global politics is out of their control, all they can do is position themselves for the inevitable next storm, prepare to defend themselves legally as need be,” Matthew Brennan, author of Attention Factory: The Story of TikTok and China’s ByteDance, told the Daily Dot. “Part of the preparations to defend themselves is making public efforts to demonstrate a commitment to transparency.”

As policy analyst Spandana Singh wrote in Slate a few months back, some of the strides TikTok has made in this field exceed industry standards, especially for such a young tech company. It must also be noted that neither Facebook, YouTube, nor Twitter has agreed to share the instructions its human moderators are given, or the algorithms behind critical content moderation decisions, despite all three having repeatedly faced serious controversies.

Burton, though, is hoping that TikTok will do just that, and he has resorted to increasingly creative shenanigans in the hope of getting an answer, including wearing handmade crop tops, singing sea shanties, and pretending to be an action hero infiltrating TikTok headquarters to steal the content moderation guidelines, Nicolas Cage-style.

While TikTok has community guidelines anyone can read, they are broad enough to be interpreted in widely different ways and applied inconsistently. Burton is hoping the company will release the concrete instructions that guide how those policies are actually enforced.

His Change.org petition calling on TikTok to release the content moderation guidelines is now very close to reaching 15,000 signatures. In December 2020, Latinx trans creator @rosalynnemontoya launched a similar petition asking for the content moderation guidelines to be changed after her popular account had been heavily targeted by transphobic trolls and, eventually, deleted. In both cases, TikTok kept quiet.

“I haven’t talked to anyone at TikTok, but I know that people who work for their content moderation department have seen my videos,” Burton tells the Daily Dot. “A fair amount of people I know who work in tech have told me they sent my videos to friends at TikTok, although they won’t give me names, for obvious reasons.”

People have also been trying to bring his petition to the company’s attention through other channels: After TikTok’s official Twitter account asked “What’s something u learned from CleanTok that u wish someone told you sooner?” on May 6, one user replied, “I’ve learned that TikTok suppresses anyone who is marginalized (POC, LGBTQ+ folks, activists etc)& we have to create a damn petition for the release of your content moderation guidelines & yall STILL WON’T release them.”

A quick search on Twitter turns up dozens of disgruntled users complaining about joining the “banned for no reason club.” The mounting frustration is palpable: people are kicked out of profiles they have often spent enormous amounts of time and energy cultivating, with no explanation. And it can feel like a joke when you realize how much borderline, if not outright prohibited, content never gets removed.

Abbie Richards (@tofology), another popular creator who is most famous for her hatred of golf and her viral chart about conspiracy theories, has an entire old phone dedicated to finding the darkest sides of TikTok.

“One of my For You pages is full-on fascist,” she told the Daily Dot. “There’s a fascist hype house, the hashtag for #nationalsocialist is still up. Swastikas, secret societies, crazy amounts of homophobia and transphobia, talk of killing gay and trans people, skull masks, literally anything. I haven’t seen any gruesome snuff stuff, but I’ve seen talk of it. And a lot of guns.”

The graduate student, who has been publishing content combating disinformation on TikTok for months, believes the central issue lies in how complicated the app is.

“I wish it could be a simple thing, like ‘They wrote the guidelines to be racist!’ but it’s probably not what’s going on. I would love to see the moderation guidelines too,” she adds. “They need thorough guidelines, they need people working on this, they need a team combating disinfo and extremism and hate speech on their platform. They need a team to lift up marginalized creators and help them spread their content.”

As for Burton, he still maintains hope that, as discussions around content moderation gain momentum in the United States, TikTok will improve its record. “They will probably come out and say they want to be leaders in content moderation where Facebook, Twitter, and others have failed. I don’t think they’re going to say they’re releasing the content moderation guidelines because of @zevulous.” 

The 21-year-old will certainly keep pressing the platform with his request, as will the smaller creators who turn to TikTok’s official Support account on Twitter asking for any kind of explanation.

The hope is that the company will soon live up to the high transparency standards it has set for itself.

Whether it follows through is up to TikTok.
