Mozilla, the internet company behind the Firefox browser, today released findings from its “RegretsReporter” browser extension, which it launched last year. The extension let users report videos they regretted watching, allowing Mozilla to crowdsource information about YouTube’s recommendation algorithms.
YouTube’s recommendations have long been criticized for sending users down extremism rabbit holes.
Mozilla said 37,380 people have installed RegretsReporter since it launched last year. In its report released today, Mozilla said it gathered 3,362 reports from 1,662 volunteers across 91 countries.
The reports were submitted between July 2020 and May 2021.
Specifically, Mozilla said 71 percent of the regret reports came from videos that were recommended to users. Recommended videos were 40 percent more likely to be regretted than videos a user searched for. Additionally, Mozilla said there were “several cases” where YouTube’s recommended videos actually violated the company’s own community guidelines.
Overall, Mozilla said the kinds of videos that were most frequently regretted were misinformation, violent or graphic content, hate speech, and spam and scams.
Finally, Mozilla said non-English speakers are “hit the hardest.” The rate of reported YouTube regrets was 60 percent higher in countries that don’t have English as a primary language.
“Our research suggests that the corporate policies and practices of YouTube, including the design and operation of their recommendation algorithms, is at least partially responsible for the regrettable experiences that our volunteers had on the platform,” Mozilla wrote in its report. “We believe [what] our research has revealed is only the tip of the iceberg, and that each of these findings deserves and requires further scrutiny.”
In a statement to the Daily Dot, YouTube asserted that recent changes have decreased the number of times “borderline content” is viewed by users through recommendations.
“The goal of our recommendation system is to connect viewers with content they love and on any given day, more than 200 million videos are recommended on the homepage alone. Over 80 billion pieces of information is used to help inform our systems, including survey responses from viewers on what they want to watch. We constantly work to improve the experience on YouTube and over the past year alone, we’ve launched over 30 different changes to reduce recommendations of harmful content. Thanks to this change, consumption of borderline content that comes from our recommendations is now significantly below 1%,” a YouTube spokesperson told the Daily Dot in a statement.
This article has been updated with a statement from YouTube.