The company is rolling out a new age-restriction policy.
The YouTube Kids app has a problem.
While advertised as a safe place for younger viewers to browse, it's fallen victim to its own algorithms, which push bizarre, disturbing, and often obscene content in front of unassuming children. Now YouTube says it's implementing a new policy to restrict that kind of content. According to Juniper Downs, YouTube's director of policy:
Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetization. We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids. The YouTube team is made up of parents who are committed to improving our apps and getting this right.
The policy covers "vulgar language" as well as violent imagery, sexually suggestive content, and the "portrayal of harmful or dangerous activities."
But often these questionable videos are just a string of gamed keywords, which makes the content harder to flag, and eases more of those videos into the “Up next” slot. Asked in July about the abundance of disturbing videos involving Frozen‘s Elsa and superheroes in violent, adult situations, a YouTube spokesperson said: “We understand that what offends one person may be viewed differently by another. As a platform, we strive to serve these varying interests by asking our community to flag any video that violates our strict community guidelines.”
So it looks like YouTube is taking more responsibility for regulating the problem, though it still relies on viewers to flag content. The new policy applies only to the main YouTube app: once a video is flagged, it will be age-restricted to viewers 18 and older and automatically barred from the Kids app. Age-restricted content is also ineligible for monetization, which could cut into this weird and lucrative subgenre.
H/T the Verge
Audra Schroeder is the Daily Dot’s senior entertainment writer, and she focuses on streaming, comedy, and music. Her work has previously appeared in the Austin Chronicle, the Dallas Observer, NPR, ESPN, Bitch, and the Village Voice. She is based in Austin, Texas.