In an effort to limit the spread of misinformation, YouTube announced on Friday that it will recommend fewer conspiracy theory videos to its users, The Verge reports.
“We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—like videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” YouTube said in a statement.
YouTube currently hosts billions of videos. According to the company's announcement, the change will affect less than 1 percent of them.
YouTube will do this by adjusting its recommendation algorithm, reducing how often such content surfaces in users' recommendations, including the “Up next” sidebar that appears alongside the video being watched.
YouTube said it will not be removing any videos as long as content complies with its Community Guidelines. But it will reduce the spread of content that “comes close” to violating those guidelines.
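YouTube has not published details of how this demotion works. Purely as an illustration of the approach described above, a recommender could keep flagged videos on the platform while pushing them down the ranking. The field names, the `borderline_score` classifier output, and the threshold below are all hypothetical:

```python
# Hypothetical sketch: demote "borderline" videos in a recommendation
# ranking without removing them from the platform. The scoring fields
# and threshold are assumptions, not YouTube's actual system.

DEMOTION_THRESHOLD = 0.8  # assumed cutoff for "borderline" content


def rank_recommendations(videos):
    """Sort candidate videos for a recommendation slot.

    Each video is a dict with a `relevance` score and a
    `borderline_score` in [0, 1] from a (hypothetical)
    misinformation classifier.
    """

    def effective_score(v):
        # Flagged videos stay available but are heavily downranked.
        penalty = 0.9 if v["borderline_score"] >= DEMOTION_THRESHOLD else 0.0
        return v["relevance"] * (1.0 - penalty)

    return sorted(videos, key=effective_score, reverse=True)


videos = [
    {"id": "a", "relevance": 0.9, "borderline_score": 0.95},
    {"id": "b", "relevance": 0.6, "borderline_score": 0.10},
]
print([v["id"] for v in rank_recommendations(videos)])  # "b" ranks above "a"
```

The key design point mirrored here is that nothing is deleted: a borderline video's effective score drops, so it is recommended less often, but it remains watchable if a user seeks it out.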
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” the company said.
Conspiracy content on YouTube came up in Congress when lawmakers questioned Google CEO Sundar Pichai in December. At the hearing, Rep. Jamie Raskin (D-Md.) raised the way YouTube’s algorithms can be used to spread conspiracy theories, according to Vox.
“The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events,” he said at the hearing.
He was alluding to the Pizzagate conspiracy theory spread via the video platform. In 2016, the theory led to an armed gunman showing up to a pizzeria located in Washington, D.C., Vox reported.
“YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales,” Zeynep Tufekci wrote in a New York Times op-ed in March.
YouTube said the change will start out affecting only “a very small set” of videos based in the U.S.
The change will rely on YouTube employees and how they train the site’s algorithms to make recommendations, according to YouTube. As those systems become more accurate, YouTube said it will roll out the changes in more countries.
YouTube did not immediately respond to the Daily Dot’s request for comment.
Eilish O'Sullivan is an editorial intern for the Daily Dot studying journalism and government at the University of Texas at Austin. Her work has appeared in the Austin Chronicle and the Daily Texan.