YouTube hasn’t cleaned up its problem with conspiracy-themed videos. Instead, the issue worsens every time a new mass shooting or terrorist event occurs. That’s the takeaway from a data researcher whose extensive search of “crisis actor” videos led, through YouTube’s recommendations, to as many as 9,000 other conspiracy-themed videos that had been watched nearly 4 billion times.
According to professor and journalist Jonathan Albright, YouTube is unwittingly helping the conspiracy theory industry grow with each new mass shooting because the website incentivizes these disinformation campaigns by giving content creators a platform where they can upload their videos and make money doing so.
“Every time there’s a mass shooting or terror event, due to the subsequent backlash, this YouTube conspiracy genre grows in size and economic value,” Albright wrote in a Medium post on Sunday. “The search and recommendation algorithms will naturally ensure these videos are connected and thus have more reach. In other words, due to the increasing depth of the content offerings and ongoing optimization of YouTube’s algorithms, it’s getting harder to counter these types of campaigns with real, factual information.
“I hate to take the dystopian route, but YouTube’s role in spreading this ‘crisis actor’ content and hosting thousands of false videos is akin to a parasitic relationship with the public.”
YouTube did not respond to a Daily Dot request for comment on Albright’s assertions.
Interest in YouTube’s conspiracy-themed videos was renewed this month in the wake of the Parkland shooting, when a video accusing survivor David Hogg of being a “crisis actor” landed in the top spot on the trending page. That continued a conspiracy-tinged trend that followed the Las Vegas and Sutherland Springs shootings.
If you had searched YouTube for David Hogg on Feb. 21, the top three results came from conspiracy channels.
As a result, YouTube reportedly gave the Alex Jones Infowars channel a strike for the video, which was eventually deleted. If Jones’ channel receives two more strikes in the next three months, YouTube will terminate his account.
But Jones’ channel isn’t the only one making money off these videos.
As Albright explained, 50 of the most-watched mass shooting-related conspiracy videos have been viewed about 50 million times, and if you keep following YouTube’s recommendation algorithm, it leads to content that has been viewed billions of times.
In his study of what YouTube recommends while somebody is watching a conspiracy video—he began by searching for “crisis actor” videos—Albright wrote that 90 percent of the titles are “a mixture of shocking, vile and promotional. Themes include rape game jokes, shock reality social experiments, celebrity pedophilia, ‘false flag’ rants, and terror-related conspiracy theories dating back to the Oklahoma City attack in 1995.”
From Albright’s perspective, no matter how much YouTube tries to clean up these conspiracy-themed videos, it is empowering those who create them. It’s a problem YouTube hasn’t figured out how to solve, and at this point, there’s an argument to be made that it perhaps can’t be solved at all.
Josh Katzowitz is a staff writer at the Daily Dot specializing in YouTube and boxing. His work has appeared in the New York Times, Wall Street Journal, Washington Post, and Los Angeles Times. A longtime sports writer, he's covered the NFL for CBSSports.com and boxing for Forbes. His work has been noted twice in the Best American Sports Writing book series.