Killing Reddit’s most hateful subreddits won’t exterminate its troll problem
They never truly disappear, no matter how hard we try.
Can we reasonably expect a platform like Reddit to rid itself of bigoted trolls and harassers?
In the wake of Ellen Pao’s exit as the site’s CEO, there’s an intensifying battle over how exactly the site will moderate many of its controversial subreddits. There’s no clear answer yet about how Reddit’s leadership will strike a balance between preserving “free speech” and protecting users from offensive content and from people who do harm to others. However, all early signs point to bad news for the Internet’s ignorant bottom feeders.
The latest comes by way of the ban of the anti-Semitic subreddit, r/GasTheKikes. As Gawker’s Ashley Feinberg writes, “at the very least, this means that Reddit’s administrators are finally making good on their word to ban ‘anything that incites harm or violence against an individual or group of people.’ Which is a nice but ultimately meaningless gesture.” Meaningless it is, because just hours later, another subreddit emerged in its place: r/KikeTown.
Although the new subreddit doesn’t have a moniker with an immediate call for Holocaust-era mass murder of Jewish people, it amassed more than 300 followers in its first few hours, a number that will only grow with time. Indeed, r/KikeTown is a despicable spin on r/CoonTown, an anti-black subreddit that’s currently the 55th most popular channel on the site. Let that sit for a moment.
So far, there’s no indication from the site’s leadership about what they’ll do to address the inflammatory, if not dehumanizing, nature of these communities, and how they’ll filter out others like them. But sadly, there’s not much Reddit’s administrators can do to eradicate hateful subreddits.
Banning these hate-filled venues may send a message about community standards—and affirm those values for others—but it doesn’t eradicate the ideologies of the people who participate. It may displace them for a short time, but eventually the disparate elements find a way to reemerge, coalesce, and build another community around their bigotry.
If Reddit user behavior is any indication, there’s a whack-a-mole effect in the process of filtering out offensive content. As Richard Lewis wrote for the Daily Dot about the June removal of r/FatPeopleHate, many of the subreddit’s memes harassed real people, which site administrators felt exposed those targets to physical harm. As the administrators argued, the ban was about actions—not ideas. But it wasn’t enough to keep the trolls away.
Almost immediately after the initial ban, Fat People Hate clones sprouted, and Reddit admins shut them down almost as quickly, seemingly for no reason aside from their names. According to Reddit posts, key Fat People Hate moderators have been shadowbanned, meaning they can still post links and comment but no one can see their submissions. Some have allegedly had their IP addresses banned entirely in a bid to keep them off the site. ‘Sympathy’ subreddits have also disappeared from Reddit, as have multiple users who spoke out against anti-harassment censorship.
As with r/FatPeopleHate and, now, with r/GasTheKikes, subreddits premised on hate find ways to reemerge with a new identity. In a sense, it’s a means of survival, where the trolls figure out yet another new way to test the limits of content moderation policies. Whereas the copycats have names that are slightly less alarming, the content isn’t radically different than before. It’s just the same old bigotry, repackaged and reformatted for a consumption that’s deemed more acceptable.
Eradicating bigotry rooted in online communities, then, mirrors what the process often looks like in real life.
Whereas it was once OK for white supremacists and their sympathizers to call black people “coons” and “n*ggers” openly, there are now dire consequences for most people who get caught using racist language. Now that this level of bigotry has been filtered out and deemed reprehensible by mainstream cultural standards, those who have any deep-seated resentment or strong bias against black people may opt for more insidious (or coded) phrases instead—including “those people” or “thugs.” It’s another way to express similar disregard and contempt for an entire race of people, using the same ideology, but not the same terms or framings.
The Internet affords supremacists, and people like them, a cloak of relative anonymity that a person in real life doesn’t otherwise have. Without a name, face, or identity to attach to problematic behavior—aside from an IP address and a few other online tracking methods—the trolls operate in a much more fluid space than they do offline. Whether site administrators and fellow users know it or not, someone banned on Reddit for harassment could find another computer tomorrow and create another online identity to continue tormenting others.
As Gawker’s Sam Biddle notes, the issue with eradicating Reddit of offensive communities and content relates to both a crisis in leadership, and a culture fostered for sophomoric, if not ignorant, behavior to persist.
The real power, the administrative and bureaucratic power that keeps the site chugging along for its day-to-day users, is a collection of unpaid moderators who wield more power than any office executive. For Reddit to become something resembling a viable business, it has to make money, and that means making the bigots and stalkers and imbeciles feel less welcome—how many firms will do business with the company that pays to keep /r/GasTheKikes running? Any outside CEO is therefore facing an impossible job: fumigate Reddit sufficiently for advertisers while placating a hostile militia of super users that can’t seem to distinguish between mild rules and a prison sentence on Robben Island.
It will take firm, steady leadership to guide Reddit toward maturity as a platform and to change the environment so that trolls, harassers, and bigots no longer consider the site an attractive place to convene. Without a complete overhaul, and considerable resources devoted to community management, Reddit is destined to remain a safe haven for the worst elements of Internet culture.
Derrick Clifton is the Deputy Opinion Editor for the Daily Dot and a New York-based journalist and speaker, primarily covering issues of identity, culture and social justice.
Illustration by Max Fleishman