Tech

TikTokers are sharing their own experiences in YouTube’s ‘alt-right pipeline’

YouTube’s algorithm has long been criticized, and now TikTokers are sharing their experiences.

Libby Cohen

A man walks on an industrial pipeline.

YouTube has long been criticized for being a gateway to far-right and extremist content, and now TikTokers are highlighting just how the video streaming site exposed them to that kind of content at a young age.

Teens—particularly boys interested in gaming—say YouTube facilitated an “alt-right pipeline.”

Activist Wagatwe Wanjuki first noticed the alt-right pipeline discourse happening on TikTok and documented it in a Twitter thread.

https://twitter.com/wagatwe/status/1368051489256607746

Wanjuki noted that “it’s fascinating to watch various young people explicitly talk about YouTube leading them down the pipeline.”

The pipeline refers to how YouTube suggests the next video for users to watch. The platform does this based on an algorithm that guesses what type of video the user would like to continue watching.

This feature isn’t insignificant. CNET reports that more than 70% of viewing time on YouTube, the second most-visited website in the world, is suggested by the algorithm.

Major internet companies have already been focusing on YouTube’s alt-right pipeline, and now everyday users on social media are doing the same.

Mozilla, the company behind the Firefox browser, is calling on YouTube to be more transparent about its algorithm, and TikTok users, particularly those exposed to harmful content, want answers.

What is the YouTube alt-right pipeline?

The alt-right pipeline pushes unsuspecting viewers, typically gamers, toward pernicious content.

The harmful content regularly consists of anti-women, racist, or homophobic jokes by a conservative creator. Many on TikTok remember talk show host Ben Shapiro appearing in their suggested videos.

The pipeline works like this: A user watches a compilation of wins in “Call of Duty” or any other video game; after the video ends, YouTube might suggest another video about a “feminazi getting owned.” “Feminazi” is a derogatory term adopted by the far right for a radical feminist.

After the user watches one video like this, the feedback loop begins: more and more alt-right videos are suggested.
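To make that dynamic concrete, here is a minimal, hypothetical sketch of how such a feedback loop can form. It is not YouTube’s actual system; the videos, tags, and scoring rule below are invented purely for illustration.

```ts
// Hypothetical sketch of a recommendation feedback loop.
// Nothing here is YouTube's real code; the catalog, tags, and
// update rule are made up to show how recommendations can drift.

type Video = { id: string; tags: string[] };

const catalog: Video[] = [
  { id: "cod-win-compilation", tags: ["gaming"] },
  { id: "sjw-cringe-compilation", tags: ["gaming", "anti-sjw"] },
  { id: "far-right-commentary", tags: ["anti-sjw", "politics"] },
];

// The user starts with a single interest: gaming.
const affinity: Record<string, number> = { gaming: 1 };
const watched = new Set<string>();

// A video scores higher the more its tags overlap with what the user already watches.
const score = (v: Video) =>
  v.tags.reduce((sum, tag) => sum + (affinity[tag] ?? 0), 0);

// Recommend the highest-scoring video the user has not seen yet.
function recommendNext(): Video | undefined {
  const candidates = catalog.filter((v) => !watched.has(v.id));
  return candidates.sort((a, b) => score(b) - score(a))[0];
}

// Watching a video reinforces every tag attached to it, including the
// borderline tags that ride along with the gaming content.
function recordWatch(v: Video): void {
  watched.add(v.id);
  for (const tag of v.tags) affinity[tag] = (affinity[tag] ?? 0) + 1;
}

// Three steps are enough to move from a gaming clip to political content.
for (let step = 0; step < 3; step++) {
  const next = recommendNext();
  if (!next) break;
  console.log(`step ${step}: recommended ${next.id}`);
  recordWatch(next);
}
// step 0: recommended cod-win-compilation
// step 1: recommended sjw-cringe-compilation
// step 2: recommended far-right-commentary
```

Even with a starting interest in nothing but gaming, reinforcing every tag on every watched video is enough to walk the recommendations from a wins compilation to overtly political content in a few steps.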

Other videos common on the pipeline include “cringe” social justice warrior (SJW) videos. “SJW” is another example of far-right lingo, used for someone who supposedly fights for social causes not because they care about them but for the recognition and praise that comes with it.

And these kinds of videos are all over YouTube. This video is a compilation of “triggered SJWs.”

Here are the search results for “femin” after watching one SJW video. Just one view of an SJW clip surfaces more offensive content, like “feminist cringe” or “feminist rekt.”

YouTube search results for “femin” after watching one SJW cringe video

Is YouTube doing anything to prevent alt-right content?

YouTube began banning extremist content in 2019, but by then, many impressionable teens had already been exposed.

In that announcement, YouTube said it would begin to prohibit content that claims “a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.”

Mozilla says the move was only a cop-out, as the content still persists.

“One of YouTube’s most consistent responses is to say that they are making progress on this and have reduced harmful recommendations by 70%,” a Mozilla spokesperson told the Daily Dot. “But there is no way to verify those claims or understand where YouTube still has work to do.”

Instead, Mozilla stepped up to monitor YouTube itself. Last year, the company created a browser extension called RegretsReporter that saves participating users’ suggested video content on YouTube.
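For a rough sense of what an extension like this does, here is a hypothetical content-script sketch. It is not RegretsReporter’s actual code: the CSS selectors and the reporting endpoint are assumptions, and the real extension only collects data from users who opt in.

```ts
// Hypothetical sketch of a content script that records which videos
// YouTube recommends to a participating user. The selector and the
// endpoint below are invented for illustration.

function collectRecommendations(): { title: string; url: string }[] {
  // Recommended videos appear as anchors in the "Up next" sidebar;
  // this selector is a guess and would need to track YouTube's markup.
  const links = document.querySelectorAll<HTMLAnchorElement>(
    "ytd-compact-video-renderer a#thumbnail"
  );
  return Array.from(links).map((a) => ({
    title: a.getAttribute("aria-label") ?? "",
    url: a.href,
  }));
}

// Periodically snapshot the sidebar and send it to a (hypothetical)
// collection endpoint chosen by the study the user opted into.
setInterval(() => {
  const seen = collectRecommendations();
  if (seen.length > 0) {
    void fetch("https://example.org/report", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ page: location.href, recommendations: seen }),
    });
  }
}, 60_000);
```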

RegretsReporter launched after Mozilla had spent over a year and a half pressuring YouTube to explain its algorithm to users. As the election approached and the pandemic persisted, Mozilla decided to work with users to prevent the suggestion of content such as election myths or “plandemic” videos.

The goal is to check just how well YouTube is doing at preventing harmful content from sneaking up on unsuspecting viewers. Mozilla says the findings will be published in late spring of 2021.

“By sharing your experiences, you can help us answer questions like: what kinds of recommended videos do users regret watching? Are there usage patterns that lead to more regrettable content being recommended? What does a YouTube rabbit hole look like, and at what point does it become something you wish you never clicked on?,” the Regrets Reporter website says.

YouTube did not return several requests for comment from the Daily Dot.

The discourse spreads to TikTok

But it’s not just Mozilla that has raised flags about YouTube’s alt-right pipeline. A growing number of users on TikTok have begun sharing their experiences.

Masses of videos are being posted under the hashtag “alt-right pipeline” where users are sharing personal stories about YouTube’s algorithm.

One TikTok details how YouTube showed anti-women content to teens in 2016.

https://www.tiktok.com/@jpolitics/video/6936963508768623877?_d=secCgYIASAHKAESMgow27%2Bj7Jnnnfx9eVl1sYqfXwYTHkj%2BNOSXbxIHVz%2FrXI9xZxKFn7t4TCFRQgn%2FPI%2B8GgA%3D&language=en&preview_pb=0&sec_user_id=MS4wLjABAAAA4x8bdkYtxO9-3rtPXmCIcFhI9PfMqdoWjcAuq2P8nmvatnY_UDIoL8E8029S4RXC&share_item_id=6936963508768623877&share_link_id=8C01389A-3B61-4EBC-B9B8-B292A482ACC7&timestamp=1615407588&tt_from=copy&u_code=d9cbd2db195b2j&user_id=6761617351287292933&utm_campaign=client_share&utm_medium=ios&utm_source=copy&source=h5_m&is_copy_url=1&is_from_webapp=v1

This TikTok shows how easy it was for conservative YouTube creators like Shapiro to grab the attention of young users.

https://www.tiktok.com/@acexsave/video/6921008882458332422?_d=secCgYIASAHKAESMgowxxBSfOjRMWa9KmLFikkcxQ2gskc8C2aMhgV%2BAdar1gMMUm2r3jA7dd049r6dtL3pGgA%3D&language=en&preview_pb=0&sec_user_id=MS4wLjABAAAA4x8bdkYtxO9-3rtPXmCIcFhI9PfMqdoWjcAuq2P8nmvatnY_UDIoL8E8029S4RXC&share_item_id=6921008882458332422&share_link_id=A55AFB52-EB33-45C7-8E04-0EA004B471C0&timestamp=1615407641&tt_from=copy&u_code=d9cbd2db195b2j&user_id=6761617351287292933&utm_campaign=client_share&utm_medium=ios&utm_source=copy&source=h5_m&is_copy_url=1&is_from_webapp=v1

Another video explains how the user saw “feminist getting destroyed” videos following gaming compilations.

Meanwhile, this post recognizes the number of young boys that “go through an alt-right phase or even just an apolitical misogynist phase.”

The TikTok is captioned “check up on your little brothers.”

https://www.tiktok.com/@moomymo/video/6919865602798537990?_d=secCgYIASAHKAESMgowIjGZU8hp%2Fet%2Bp8coAJD681UsDpXQF8VYss1My8W69hbMhn10SkYa7ZRGtbjm0yXCGgA%3D&language=en&preview_pb=0&sec_user_id=MS4wLjABAAAA4x8bdkYtxO9-3rtPXmCIcFhI9PfMqdoWjcAuq2P8nmvatnY_UDIoL8E8029S4RXC&share_item_id=6919865602798537990&share_link_id=88C84C96-FDF9-461F-B54E-DB0347692DCD&timestamp=1615407713&tt_from=copy&u_code=d9cbd2db195b2j&user_id=6761617351287292933&utm_campaign=client_share&utm_medium=ios&utm_source=copy&source=h5_m

Wanjuki noted the commonalities in her thread.

https://twitter.com/wagatwe/status/1368057498335281153

Is TikTok any different?

Like most social media platforms, TikTok has its own problems keeping harmful content off its site.

For example, in TikTok videos that bring the alt-right pipeline to light, some users side with the pipeline. A screenshot from Wanjuki’s thread shows one comment that states “glad I stayed in it.”

https://twitter.com/wagatwe/status/1368052898081337345

There are thousands of videos under the hashtags ‘feminazi’ and ‘SJWcringe.’

https://www.tiktok.com/@redeaglepatriot/video/6920769886406954246?_d=secCgYIASAHKAESMgowtbZMaJgxjm581snpfI%2FIQS5QrjZUi3pOo4eTmqFA1zEmQD%2FHmBUFr0Bdo6bd%2F1idGgA%3D&language=en&preview_pb=0&sec_user_id=MS4wLjABAAAA4x8bdkYtxO9-3rtPXmCIcFhI9PfMqdoWjcAuq2P8nmvatnY_UDIoL8E8029S4RXC&share_item_id=6920769886406954246&share_link_id=1C0FB5AC-4B6A-4F21-9A96-4CD7D17E80CC&timestamp=1615571496&tt_from=copy&u_code=d9cbd2db195b2j&user_id=6761617351287292933&utm_campaign=client_share&utm_medium=ios&utm_source=copy&source=h5_m

This TikTok is just a gaming stream with an anti-feminist, anti-SJW rant in the background.

https://www.tiktok.com/@commentaryclub/video/6880524830744104197?_d=secCgYIASAHKAESMgowoQfIVoc4fGuu5QfnAZv933NdzH5KXeOcIsDK254qI4vs4tKc6XGBh2Wqm59npdn5GgA%3D&language=en&preview_pb=0&sec_user_id=MS4wLjABAAAA4x8bdkYtxO9-3rtPXmCIcFhI9PfMqdoWjcAuq2P8nmvatnY_UDIoL8E8029S4RXC&share_item_id=6880524830744104197&share_link_id=0A654A93-7374-4729-8581-3CCCB2FC1FE1&timestamp=1615571510&tt_from=copy&u_code=d9cbd2db195b2j&user_id=6761617351287292933&utm_campaign=client_share&utm_medium=ios&utm_source=copy&source=h5_m

Despite alt-right content thriving on TikTok, Mozilla has stated in the past that TikTok’s algorithm is more transparent than YouTube’s.

“As platforms like Facebook and YouTube struggle to explain how their News Feed and recommendation algorithms work, TikTok seems to be moving towards greater transparency by saying it will open up its platform to researchers,” Mozilla wrote.

But until researchers can fully understand the algorithms at both YouTube and TikTok, harmful content can still be steered toward unsuspecting users.

“Until researchers have access to comprehensive data about the algorithms used by TikTok and YouTube, researchers cannot identify patterns of harm and abuse, fueling public distrust,” Ashley Boyd, Mozilla’s vice president of advocacy, told the Daily Dot.

This post has been updated.

