13-year-olds can be pushed videos about suicide within minutes of joining TikTok

The app served a ‘standard’ teen account a video about body image every 39 seconds.

Jacob Seitz

TikTok actively pushes suicide and eating disorder content to children's accounts, according to a new study.

The study, conducted by the Center for Countering Digital Hate (CCDH), a nonprofit that advocates for the de-platforming of hurtful and hateful content on social media, found that TikTok is quick to push harmful content to children. The organization created fresh accounts in the U.S., United Kingdom, Canada, and Australia and set them to the lowest age TikTok allows: 13. The accounts paused briefly on videos about body image and mental health and liked them. Within 2.6 minutes, the study found, TikTok recommended suicide content. Within eight minutes, the app was recommending content related to eating disorders, and it was serving content about body image and mental health every 39 seconds. Some of those videos were unmarked ads for weight-loss products.

“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” said Imran Ahmed, CEO of the CCDH, in a press release.

The group created two accounts in each country, both set to age 13. One was set up as a “standard” teen account, while the other mimicked a “vulnerable” teen. The “vulnerable” accounts included “loseweight” in their usernames, a detail researchers found influenced the kind of content pushed to the accounts.

The standard teen accounts were shown mental health or body image content every 39 seconds, with 185 videos shown to the accounts in 30 minutes. These accounts were also shown suicide, self-harm, or eating disorder content every 206 seconds, according to the report.

Unsurprisingly, the report said that accounts intentionally set up to mimic vulnerable teens were shown more harmful content, not less. The “vulnerable” accounts were shown content about suicide, self-harm, and eating disorders every 66 seconds, more than three times the rate of the standard accounts. In one case, a “vulnerable” account was shown three videos of users discussing suicide plans in one minute.

The CCDH recommends that countries enact legislation, calling it the most direct way to make an impact.

“Without legislation, TikTok is behaving like other social media platforms and is primarily interested in driving engagement while escaping public scrutiny and accountability. The TikTok algorithm does not care whether it is pro-anorexia content or viral dances that drive the user attention they monetize. TikTok just wants to collect users’ data and keep them on the platform, viewing advertisements,” the report said.

Ahmed said the time for legislation to prevent harmful content on TikTok, especially content shown to teens, is now.

“This report underscores the urgent need for reform of online spaces,” Ahmed said. “Without oversight, TikTok’s opaque algorithm will continue to profit by serving its users—children as young as 13, remember—increasingly intense and distressing content without checks, resources, or support.”

A TikTok spokesperson called the methodology of the study into question, saying in a statement to the Daily Dot that it “does not reflect genuine behavior or viewing experiences of real people.”

TikTok also said it “regularly consults with health experts, removes violations of our policies, and provides access to supportive resources for anyone in need.”

The company said that in Q2 of this year it removed 97% of suicide and self-harm content before anyone reported it.

This post has been updated with comment from TikTok.

For more information about suicide prevention or to speak with someone confidentially, contact the National Suicide Prevention Lifeline (U.S.) or Samaritans (U.K.).

For more information about eating disorders or to speak with someone confidentially, contact the National Eating Disorders Association.
