The company is rolling out a new age-restriction policy.
The YouTube Kids app has a problem.
While advertised as a safe place for younger viewers to browse, it's fallen victim to its own algorithms, which push bizarre, disturbing, and often obscene content in front of unsuspecting children. Now YouTube says it's implementing a new policy to restrict that kind of content. According to Juniper Downs, YouTube's director of policy:
Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetization. We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids. The YouTube team is made up of parents who are committed to improving our apps and getting this right.
The policy covers "vulgar language" as well as violent imagery, sexually suggestive content, and the "portrayal of harmful or dangerous activities."
But these questionable videos are often just a string of gamed keywords, which makes them harder to flag and eases more of them into the "Up next" slot. Asked in July about the abundance of disturbing videos placing Frozen's Elsa and various superheroes in violent, adult situations, a YouTube spokesperson said: "We understand that what offends one person may be viewed differently by another. As a platform, we strive to serve these varying interests by asking our community to flag any video that violates our strict community guidelines."
So it looks like YouTube is taking more responsibility for regulating the problem, though it still relies on viewers to flag content. The new policy applies only to the main YouTube app: once a video is flagged, it will be age-restricted to viewers 18 and older and barred from the Kids app. Age-restricted content is also ineligible for monetization, which could cut into this weird and lucrative subgenre.
H/T The Verge