The YouTube logo. (Rego Korosi/Flickr, CC-BY-SA)

Most videos people regretted watching on YouTube came from its recommendation algorithm

Mozilla said 71 percent of 'regret reports' came from recommended videos.


Andrew Wyrich


Published Jul 7, 2021   Updated Jul 7, 2021, 10:18 am CDT

A majority of YouTube videos that internet users regretted watching were recommended to them by the platform, a new report from the Mozilla Foundation has found.


Mozilla, the internet company behind the Firefox browser, released findings today from its “RegretsReporter” browser extension, which it launched last year. The extension allowed users to report videos they regretted watching and allowed Mozilla to crowdsource information about YouTube’s recommendation algorithms.


YouTube’s recommendations have long been criticized for sending users down extremism rabbit holes.

Mozilla said 37,380 people have installed RegretsReporter since it launched last year. In its report released today, Mozilla said it gathered 3,362 reports from 1,662 volunteers in 91 countries.

The reports were submitted between July 2020 and May 2021.

Specifically, Mozilla said 71 percent of the regret reports came from videos that were recommended to users. Meanwhile, recommended videos were 40 percent more likely to be regretted than videos that a user searched for. Additionally, Mozilla said there were “several cases” where YouTube’s recommended videos actually violated the company’s own community guidelines.


Overall, Mozilla said the kinds of videos that were most frequently regretted were misinformation, violent or graphic content, hate speech, and spam and scams.

Finally, Mozilla said non-English speakers are “hit the hardest.” The rate of reported YouTube regrets was 60 percent higher in countries where English is not a primary language.

“Our research suggests that the corporate policies and practices of YouTube, including the design and operation of their recommendation algorithms, is at least partially responsible for the regrettable experiences that our volunteers had on the platform,” Mozilla wrote in its report. “We believe [what] our research has revealed is only the tip of the iceberg, and that each of these findings deserves and requires further scrutiny.”


In a statement to the Daily Dot, YouTube asserted that recent changes have decreased the number of times “borderline content” is viewed by users through recommendations.

“The goal of our recommendation system is to connect viewers with content they love and on any given day, more than 200 million videos are recommended on the homepage alone. Over 80 billion pieces of information is used to help inform our systems, including survey responses from viewers on what they want to watch. We constantly work to improve the experience on YouTube and over the past year alone, we’ve launched over 30 different changes to reduce recommendations of harmful content. Thanks to this change, consumption of borderline content that comes from our recommendations is now significantly below 1%,” a YouTube spokesperson told the Daily Dot in a statement.

This article has been updated with a statement from YouTube.

*First Published: Jul 7, 2021, 9:15 am CDT