In an effort to limit the spread of misinformation, YouTube announced on Friday that it will recommend fewer conspiracy theory videos to its users, The Verge reports.
“We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—like videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” YouTube said in a statement.
YouTube currently has billions of videos available on its website. The company said the change would affect less than 1 percent of the videos, according to the press release.
YouTube will make the change by adjusting its recommendation algorithm, reducing how often such content appears in users’ recommendations. This will also affect the “Up next” sidebar that appears while you are watching a video.
YouTube said it will not be removing any videos as long as content complies with its Community Guidelines. But it will reduce the spread of content that “comes close” to violating those guidelines.
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” the company said.
Conspiracy content on YouTube came up in Congress when lawmakers convened with Google CEO Sundar Pichai in December. At the hearing, Rep. Jamie Raskin (D-Md.) brought up the way YouTube’s algorithms can be used to spread conspiracy theories, according to Vox.
“The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events,” he said at the hearing.
He was alluding to the Pizzagate conspiracy theory spread via the video platform. In 2016, the theory led to a gunman showing up at a pizzeria in Washington, D.C., Vox reported.
“YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales,” Zeynep Tufekci wrote in a New York Times op-ed in March.
YouTube said the change will start out affecting only “a very small set” of videos based in the U.S.
The change will rely on YouTube employees and how they train the site’s algorithms to make recommendations, according to YouTube. As those systems become more accurate, YouTube said, it will roll out the changes in more countries.
YouTube did not immediately respond to the Daily Dot’s request for comment.
Eilish O'Sullivan is an editorial intern for the Daily Dot studying journalism and government at the University of Texas at Austin. Her work has appeared in the Austin Chronicle and the Daily Texan.