Killing Reddit’s most hateful subreddits won’t exterminate its troll problem
They never truly disappear, no matter how hard we try.
Can we reasonably expect a platform like Reddit to rid itself of bigoted trolls and harassers?
In the wake of Ellen Pao’s exit as the site’s CEO, there’s an intensifying battle over how exactly the site will moderate many of its controversial subreddits. There’s no clear answer yet about how Reddit’s leadership will strike a balance between preserving “free speech” and protecting users from offensive content and those who harm others. However, all early signs point to bad news for the Internet’s ignorant bottom feeders.
The latest comes by way of the ban of the anti-Semitic subreddit r/GasTheKikes. As Gawker’s Ashley Feinberg writes, “at the very least, this means that Reddit’s administrators are finally making good on their word to ban ‘anything that incites harm or violence against an individual or group of people.’ Which is a nice but ultimately meaningless gesture.” Meaningless it is, because just hours later, another subreddit emerged in its place: r/KikeTown.
Although the new subreddit doesn’t have a moniker with an immediate call for Holocaust-era mass murder of Jewish people, it amassed more than 300 followers in its first few hours, a number that will only grow with time. Indeed, r/KikeTown is a despicable spin on r/CoonTown, an anti-black subreddit that’s currently the 55th most popular channel on the site. Let that sit for a moment.
So far, there’s no indication from the site’s leadership about what they’ll do to address the inflammatory, if not dehumanizing, nature of these communities, and how they’ll filter out others like them. But sadly, there’s not much Reddit’s administrators can do to eradicate hateful subreddits.
Banning these hate-filled venues may send a message about community standards—and affirm those values for others—but it doesn’t eradicate the ideologies of the people who participate. It may displace them for a short time, but eventually the disparate elements find a way to reemerge, coalesce, and build another community around their bigotry.
If Reddit user behavior is any indication, there’s a whack-a-mole effect in the process of filtering out offensive content. When r/FatPeopleHate was removed in June, as Richard Lewis wrote for the Daily Dot, it was because many of the memes harassed real people, which site administrators felt would subject them to physical harm. As they argued, the ban was about actions—and not ideas. But it wasn’t enough to keep the trolls away.
Almost immediately after the initial ban, Fat People Hate clones sprouted, and Reddit admins shut them down almost as quickly, seemingly for no reason aside from their names. According to Reddit posts, key Fat People Hate moderators have been shadowbanned, meaning they can still post links and comment but no one can see their submissions. Some have allegedly had their IP addresses banned entirely in a bid to keep them off the site. ‘Sympathy’ subreddits have also disappeared from Reddit, as have multiple users who spoke out against anti-harassment censorship.
As with r/FatPeopleHate and, now, with r/GasTheKikes, subreddits premised on hate find ways to reemerge with a new identity. In a sense, it’s a means of survival, where the trolls figure out yet another way to test the limits of content moderation policies. The copycats have names that are slightly less alarming, but the content isn’t radically different than before. It’s just the same old bigotry, repackaged and reformatted into a form deemed more acceptable for consumption.
Eradicating bigotry rooted in online communities, then, mirrors what the process often looks like in real life.
Whereas it was once OK for white supremacists and their sympathizers to call black people “coons” and “n*ggers” openly, there are now dire consequences for most people who get caught using racist language. Now that this level of bigotry has been filtered out and deemed reprehensible by mainstream cultural standards, those who have any deep-seated resentment or strong bias against black people may opt for more insidious (or coded) phrases instead—including “those people” or “thugs.” It’s another way to express similar disregard and contempt for an entire race of people, using the same ideology, but not the same terms or framings.
The Internet affords supremacists, and people like them, a cloak of relative anonymity that a person in real life doesn’t otherwise have. Without a name, face, or identity to attach to problematic behavior—aside from an IP address and a few other online tracking methods—the trolls operate in a much more fluid space than in real life. Whether site administrators and fellow users know it or not, someone banned on Reddit for harassment could find another computer tomorrow and create another online identity to continue tormenting others.
As Gawker’s Sam Biddle notes, the issue with ridding Reddit of offensive communities and content stems from both a crisis in leadership and a culture that allows sophomoric, if not ignorant, behavior to persist:
The real power, the administrative and bureaucratic power that keeps the site chugging along for its day-to-day users, is a collection of unpaid moderators who wield more power than any office executive. For Reddit to become something resembling a viable business, it has to make money, and that means making the bigots and stalkers and imbeciles feel less welcome—how many firms will do business with the company that pays to keep /r/GasTheKikes running? Any outside CEO is therefore facing an impossible job: fumigate Reddit sufficiently for advertisers while placating a hostile militia of super users that can’t seem to distinguish between mild rules and a prison sentence on Robben Island.
It will take firm, steady leadership to guide Reddit towards maturity as a platform and change the environment so that trolls, harassers, and bigots won’t consider the site an attractive host for their convening. Without a complete overhaul, and considerable resources devoted to community management, Reddit is destined to remain a safe haven for the worst elements of Internet culture.
Derrick Clifton is the Deputy Opinion Editor for the Daily Dot and a New York-based journalist and speaker, primarily covering issues of identity, culture and social justice.
Illustration by Max Fleishman