A hand shown completing a jigsaw puzzle with the text TikTok

Christoph Scholz/Flickr

U.K. launches investigation into TikTok’s handling of children’s data

Officials are concerned that adults are easily able to message children through the app.


Samira Sadeque

Internet Culture

Posted on Jul 3, 2019   Updated on May 20, 2021, 9:22 am CDT

The U.K. is investigating TikTok over its handling, or mishandling, of children's data and security, the Guardian reports.  

The U.K.’s information commissioner, Elizabeth Denham, told a government committee that a team is “looking at the transparency tools for children” on the short-form video app.  

“We’re looking at the messaging system, which is completely open, we’re looking at the kind of videos that are collected and shared by children online,” Denham said, according to the Guardian. “We do have an active investigation into TikTok right now, so watch this space.”

In addition to content shared from children’s accounts, the investigation will look into the messaging feature, which allows adults to message children on the app without any restrictions. 

Denham said TikTok could potentially be in violation of the General Data Protection Regulation (GDPR), an E.U. regulation that has specific rules for children. Article 8 of the GDPR, for example, states that a child’s consent to data processing is valid only if they are at least 16; for those under 16, parental consent is required.

The U.S. equivalent is the Children’s Online Privacy Protection Act (COPPA), which was at the core of a February Federal Trade Commission settlement that Denham said prompted her office’s investigation. COPPA provisions say app operators cannot collect identifying information, such as email addresses, from users under 13 without parental consent. But such information was required to sign up for the app, TechCrunch reported at the time.

The FTC fined TikTok $5.7 million for violating children’s privacy law, a settlement that also prompted changes to the app’s age requirements. All users now have to verify their age, and those under 13 are redirected to a separate, more protected experience and are not allowed to publish videos on the app, according to TechCrunch.

The FTC investigation began when the app was still called Musical.ly; Chinese company ByteDance later acquired it and, in 2018, shut it down and merged its users into TikTok.


*First Published: Jul 3, 2019, 10:51 am CDT