Photo via Krista Kennell/Shutterstock (Licensed)
‘They got some real crazy s**t on YouTube.’
Rapper B.o.B. is a man of theories: He’s mused on Snapchat filters and facial recognition and clashed with Neil deGrasse Tyson after saying the Earth is flat. Now he’s fallen into another rabbit hole.
In late June, B.o.B. posted two videos to Instagram in which he points out a disturbing YouTube trend: superhero videos aimed at kids that are rather adult in nature. He specifically points out videos of people cosplaying Spider-Man and Elsa from Frozen, and one troubling video in which a young girl dressed as Elsa is “pregnant.” He added the hashtag #ElsaGate, and he’s trying to get the word out about offensive “kids” videos on YouTube.
“You gotta ask yourself, who is this geared towards?” B.o.B. asks in the clip. “This shit ain’t funny to no adult.”
There are many live-action videos from channels like SuperHero VS SuperHero and SuperHeroes KidsLine (which B.o.B. calls out in the video) that show characters like Elsa, Catwoman, and Maleficent pregnant or giving birth. (But it’s mostly Elsa—search “pregnant Elsa” and more than 1 million results come up.) The video titles are often just strings of keywords, but the brightly colored thumbnails are designed to catch kids’ eyes, and the videos routinely get millions of hits. That has led to claims that these channels are using bots to rack up a staggering number of views or have found a way to game YouTube.
The pregnant Elsa video that B.o.B. references, from the channel Superheroes (which lists Iraq as its country on its YouTube About page), has more than 29 million views and has been up since January. The SuperHeroes KidsLine video he mentions at the end (in the thumbnail, Maleficent’s pregnant belly is being stabbed with what looks like a corn dog) has 6 million views. We’ve reached out to both channels for comment.
This type of “real-life” superhero cosplay is a big business on YouTube, and some creators have been able to make a living off it. But as popular channel h3h3 Productions explained earlier this year, many of these “toy” and “superhero” channels trade in some very bizarre, violent, and adult themes. Other creators have claimed that these kinds of channels are sending subliminal messages to kids.
Animated videos aimed at kids have faced similar criticism. In March the BBC reported on fake Peppa Pig videos. Writing for the Outline, journalist Laura June explained how “suggested” videos are often the source of these more sinister clips and detailed how easy it was for her 3-year-old to encounter a fake version of the Disney series Doc McStuffins in which “people break legs. Bones get exposed. It’s terrifying. One video opens with a man injecting a pumpkin with a hypodermic needle, which somehow results in Doc and her buddies becoming zombies.”
The YouTube Kids app is supposed to provide a more kid-friendly experience and aims to filter out some of that content. YouTube’s advertiser policy states that “Videos depicting family entertainment characters or content, whether animated or live action, engaged in violent, sexual, vile, or otherwise inappropriate behavior, even if done for comedic or satirical purposes, are not eligible for advertising.”
Reached for comment, a YouTube spokesperson said: “We’re always looking to improve the YouTube experience for all our users and that includes ensuring that our platform remains an open place for self-expression and communication. We understand that what offends one person may be viewed differently by another. As a platform we strive to serve these varying interests by asking our community to flag any video that violates our strict community guidelines.”