Photo via Andrew Perry/Flickr

YouTube cracks down on recommending conspiracy theory videos

But the videos can still remain on the platform.

Eilish O'Sullivan


Posted on Jan 25, 2019, updated on May 20, 2021, 8:40pm CDT

In an effort to limit the spread of misinformation, YouTube announced on Friday that it will recommend fewer conspiracy theory videos to its users, The Verge reports.

“We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—like videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” YouTube said in a statement.

YouTube currently hosts billions of videos; according to its announcement, the change will affect less than 1 percent of them.

YouTube will make the change by adjusting its recommendation algorithm so that less of this content is surfaced to users. The adjustment will also affect the “Up next” sidebar that appears while you’re watching a video.

YouTube said it will not remove any videos as long as the content complies with its Community Guidelines. But it will reduce the spread of content that “comes close” to violating those guidelines.

“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” the company said.

Conspiracy content on YouTube came up in Congress when lawmakers questioned Google CEO Sundar Pichai in December. At the hearing, Rep. Jamie Raskin (D-Md.) raised the issue of how YouTube’s algorithms can be used to spread conspiracy theories, according to Vox.

“The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events,” he said at the hearing.

He was alluding to the Pizzagate conspiracy theory, which spread via the video platform. In 2016, the theory led a gunman to show up at a pizzeria in Washington, D.C., Vox reported.

“YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales,” Zeynep Tufekci wrote in a New York Times op-ed in March 2018.

YouTube said the change will start out affecting only “a very small set” of videos based in the U.S.

The change will rely on YouTube employees, who will help train the site’s recommendation algorithms, according to YouTube. As those systems become more accurate, the company said it will roll out the change in more countries.

YouTube did not immediately respond to the Daily Dot’s request for comment.
