YouTube’s efforts to protect kids from exposure to inappropriate content appear to have been in vain. The company’s “YouTube Kids” platform is filled with conspiracy theory videos claiming the moon landing was fake, the world is flat, the pyramids were built by aliens, and Earth is run by human-reptile hybrids, according to a report by Business Insider.
YouTube confirmed the conspiracy videos had been removed from its platform, stating matter-of-factly that “no system is perfect and sometimes we miss the mark.”
“The YouTube Kids app is home to a wide variety of content that includes enriching and entertaining videos for families,” a YouTube spokesperson told Business Insider. “This content is screened using human-trained systems. That being said, no system is perfect and sometimes we miss the mark. When we do, we take immediate action to block the videos or, as necessary, channels from appearing in the app. We will continue to work to improve the YouTube Kids app experience.”
The revelation comes just days after YouTube CEO Susan Wojcicki outlined solutions for eradicating misinformation and conspiracy theory videos. Speaking at SXSW, Wojcicki said “information cues,” sourced from Wikipedia—a site that can be edited by anybody—would debunk the clips. The links would reportedly be added first to widely spread conspiracies before trickling their way down to the darker corners of the internet. Ironically, Wojcicki used the moon landing, one of the conspiracy theories popping up on YouTube Kids, as an example.
YouTube Kids, launched in 2015, was designed as a safe place for children to view family-friendly content from the largest video database on the internet. More than 11 million people use the platform weekly. A combination of machine learning, algorithms, and community flagging is designed to filter out ads and videos deemed inappropriate for children. But those efforts have “missed the mark” by quite a distance.
In November, the YouTube Kids app came under fire for suggesting obscene and violent content. A worrying report from the New York Times described disturbing videos on the platform, including clips showing animated characters committing suicide, stripping, and urinating on others. Malik Ducard, YouTube’s global head of family and learning content, called these shocking clips “the extreme needle in the haystack.” Following the backlash, YouTube implemented an age-restriction policy to weed them out.
Like other social media giants, YouTube is frantically reacting to the ways bad actors can manipulate its platform. The company redoubled its efforts in February after a conspiracy video claiming that a survivor of the Parkland school shooting was an actor became the site’s number one trending video. YouTube removed the video, but not before it racked up 200,000 views.