Unless parents watch YouTube Kids videos all the way through with their children, they may not know what inappropriate material is reaching their child's developing brain.
An anonymous physician mother wrote on the PediMom blog that she happened to be watching a video with her son when, 4 minutes and 45 seconds into the animated clip, a man walked onto the screen, held out his arm, and instructed viewers how to cut their wrists before disappearing.
That particular video was removed by YouTube, but BuzzFeed reported that it was not an isolated incident. According to the report, an old skit from Filthy Frank (a former YouTube star who helped popularize the Harlem Shake meme but who reportedly retired from YouTube in 2018) was repurposed for the YouTube Kids video.
In the eight-second Filthy Frank clip, he was standing in front of a green screen when he said, "Remember, kids, sideways for attention, longways for results. End it." Somebody then apparently spliced that footage into the cartoon video that the anonymous mother on PediMom discovered.
YouTube told BuzzFeed that the creator had received a strike for that video, the first step toward possible deletion of the channel.
According to Dr. Free Hess, who runs the PediMom site, the video had more than 600,000 views before YouTube deleted it.
“I thought YouTube Kids was safe,” the anonymous author wrote. “They sure make it seem like it is … Not much shocks me. I’m a physician, I work in the emergency department. I’ve seen a lot. But this did.”
This isn't the first time the YouTube Kids app has faced controversy over inappropriate material slipping through its filters.
Thanks to an algorithm that could be exploited, a wave of bizarre and obscene videos, including clips that placed Frozen's Elsa and superheroes in disturbing situations, was found on YouTube Kids in 2017. Conspiracy theory videos had also previously been discovered on the app.
Last month, it was discovered that YouTube’s algorithm had been recommending explicit self-harm videos.
YouTube previously said, "No system is perfect and sometimes we miss the mark. When we do, we take immediate action to block the videos or, as necessary, channels from appearing in the app."
A YouTube spokesperson told BuzzFeed that the company removes millions of policy-violating videos and channels every quarter, and that "we remove the majority of these videos before they have any views."
But these kinds of self-harm videos, whether intended as a joke or not, can do real damage. As Hess wrote in a blog post, "Suicide is the SECOND leading cause of death in individuals between the ages of 10 and 34 and the numbers of children exhibiting some form of self-harm is growing rapidly."
So far, it's a battle that hasn't been won. As YouTube works to keep these kinds of flagrant violations off the Kids app and the regular site, there seems to be no shortage of people determined to make sure children on YouTube continue to see disturbing content.