In an effort to limit the spread of misinformation, YouTube announced on Friday that it will recommend fewer conspiracy theory videos to its users, The Verge reports.
“We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—like videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” YouTube said in a statement.
YouTube currently hosts billions of videos. The company said the change would affect less than 1 percent of them, according to the press release.
YouTube will do this by adjusting its recommendation algorithm, reducing how often such content surfaces as a recommendation for the user. This will also affect the “Up next” sidebar that appears while you are watching a video.
YouTube said it will not be removing any videos as long as content complies with its Community Guidelines. But it will reduce the spread of content that “comes close” to violating those guidelines.
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” the company said.
Conspiracy content on YouTube came up in Congress when lawmakers convened with Google CEO Sundar Pichai in December. At the hearing, Rep. Jamie Raskin (D-Md.) brought up the way YouTube’s algorithms can be used to spread conspiracy theories, according to Vox.
“The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events,” he said at the hearing.
He was alluding to the Pizzagate conspiracy theory spread via the video platform. In 2016, the theory led to an armed gunman showing up to a pizzeria located in Washington, D.C., Vox reported.
“YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales,” Zeynep Tufekci wrote in a New York Times op-ed in March.
YouTube said the change will start out affecting only “a very small set” of videos based in the U.S.
The change will rely on YouTube employees training the site’s algorithms to make better recommendations, according to YouTube. As those systems become more accurate, YouTube said it will roll out the change in more countries.
YouTube did not immediately respond to the Daily Dot’s request for comment.
Eilish O’Sullivan is the news wire editor for the Daily Dot. Her work has appeared in the Austin Chronicle and the Daily Texan.