Should Reddit’s powerful mods be reined in?

Reddit thrives on the whims of unpaid moderators. Is it time to reel them in?

Simon Owens

The New York Times. The Washington Post. CNN. The Wall Street Journal. They’re all news organizations you’ve heard of and occasionally visit during your lunch break or morning commute. Each employs hundreds of journalists and editors responsible for covering and influencing the news cycle.

But what if I told you there’s a small contingent of people on the Internet who wield as much power and influence as these news outlets but are, for the most part, anonymous and largely unknown to the public at large? What if they oversaw a portal that sends more traffic than a link on the Drudge Report? What if the very news outlets I listed above often clamor to get their content in front of these powerful Internet users with pseudonyms like Greypo, CedarWolf, and rotorcowboy?

Reddit, which bills itself as the “front page of the Internet,” is arguably just that. Roughly 168 million people visited the social news site last month, and many of its most popular subreddits have as many as 8 million subscribers each. Reddit’s reach and influence over the Internet are profound and pervasive. But despite recent structural changes, an in-depth look at the role of the site’s moderators reveals a community nearing crisis, with a constant battle for quality control and power taking place behind the scenes and threatening the site’s democratic ideals.

‘Serial’: A case study

In early December, when the hit podcast Serial was at the height of its fame, a moderator for a Reddit community dedicated to the podcast created an online poll and linked to it from the community. In the weeks leading up to this poll’s posting, the community—or “subreddit” in Reddit lingo—had achieved its own level of fame because of how Serial’s most dedicated fans flocked to it and obsessively pored over every detail of the Adnan Syed case, in some cases unearthing facts not yet mentioned in the podcast. News articles began to regularly reference the subreddit, it was mocked in a viral video, and occasionally characters from the podcast would even show up with freshly created accounts to weigh in or air their grievances.

Understandably, all this attention brought in thousands of new users to what had until then been a relatively obscure subreddit, and those users began to flood it with new posts and comments, many rehashing topics that had already been discussed in the weeks prior. Some posts included Serial-related memes. This increased activity set the stage for the poll that the user PowerOfYes, one of eight moderators of the subreddit, created on Dec. 7. The poll’s one question—“Should we ban memes?”—seemed rather straightforward, but it quickly ignited an argument that has been raging all across Reddit for years, a disagreement between the millions of Reddit users who browse the site every day and the small army of moderators (or mods) who make and enforce the rules that govern every single subreddit.

In the hours following the poll’s release, one could witness the polarity of views in this debate play out in real time. “I don’t think it’s fair to ban memes,” wrote one user. “If you don’t like them, don’t click on them.” Another opined, “While I sympathize with those who suggest keeping them, or keeping them in off-season, I personally find memes just erode community and discussion.”

The poll itself produced no real clarity or consensus. A plurality—31 percent—voted to ban them, but when you added up all the various versions of “allow memes” they equaled a slight majority. The debate over mods’ roles in regulating subreddits didn’t start in r/serialpodcast, and, unsurprisingly, it wouldn’t end there either.

Reading the fine print

Casual browsers of Reddit often assume the platform relies solely on a one-user-one-vote system, that whether a post makes it to the front page of the site rests entirely on whether it receives enough upvotes from logged-in users. They would be wrong. Yes, the voting plays a large part, but there are likely thousands of posts and comments submitted every day that are never seen by more than a handful of people before they’re removed from public view, often because they violate at least one of the thousands of subreddit rules spread out across the entire site.

Technically, anyone can become a mod for a subreddit, either by being invited by one of the already-existing mods or by launching a brand new subreddit of their own. If ABC debuts a new sitcom and I become a devoted fan, I can then go to Reddit and launch a subreddit dedicated to that show, and if I’m the first to create one and the show develops an avid following, then the fan subreddit may begin attracting new subscribers. As the mod, I have near-dictatorial control over the subreddit. This means I can modify the background design, remove posts, and even ban users. Mods work as unpaid volunteers, and Reddit admins—employees who actually work for the company—are reluctant to step in and overrule a mod’s decision, doing so only in extreme cases, such as a subreddit posting sexualized photos of minors or some other similarly outrageous misbehavior. A subreddit can take on as many mods as it wants, and as one grows in popularity its mod list often expands to meet the increased demand for reviewing comments and posts.
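
To make that control concrete, here is a minimal sketch of those powers using the third-party PRAW library, which wraps Reddit's moderation API. The credentials, subreddit name, and usernames are hypothetical placeholders, and the title rule is invented purely for illustration.

```python
# A minimal sketch of the moderator powers described above, using PRAW.
# All credentials, names, and the "title rule" are hypothetical placeholders.
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # hypothetical OAuth credentials
    client_secret="YOUR_CLIENT_SECRET",
    username="hypothetical_mod",
    password="YOUR_PASSWORD",
    user_agent="mod-sketch by u/hypothetical_mod",
)

subreddit = reddit.subreddit("myfavoritesitcom")  # a made-up fan subreddit

# Sweep the newest posts and remove any that break a (made-up) title rule.
# Removed posts simply disappear from public view.
for submission in subreddit.new(limit=5):
    if "[discussion]" not in submission.title.lower():
        submission.mod.remove()

# Ban a user outright -- no appeal required.
subreddit.banned.add("hypothetical_spammer", ban_reason="Repeated spam")
```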

Depending on the focus of a subreddit, a mod’s most basic role is to ensure posts are on-topic and remove any spam or abusive comments. But as a community gains in popularity, resulting in an ever larger deluge of posts, the very nature of the subreddit and the kind of content it promotes begins to change, and it’s at this point you’ll often see mods step in and institute more restrictive rules. Many of those mods argue that at a certain stage in a subreddit’s growth the one-person-one-vote system begins to break down, and tensions arise between the subreddit’s original subscribers and the newcomers who flood its threads with what some believe is inferior content.

This trend is compounded whenever a community is chosen by Reddit’s employees to become a “default” subreddit. A default subreddit—there are at least 50 on any given day and the list often changes—is shown to anyone who visits Reddit and isn’t logged in, and if they create a new account they’re automatically subscribed to all the defaults. The difference between a default and a non-default subreddit is the difference between a few thousand people seeing your top-voted post and millions seeing it.

These default subreddits often have the most restrictive rules. R/science, a default with close to 8 million subscribers, requires that any links go to either peer-reviewed articles or summaries of peer-reviewed articles, and that those articles be less than six months old. The r/bestof subreddit, which highlights the best comments from all across Reddit, requires that you use a special URL for submissions and forbids including the name of the subreddit you’re linking to in your post. As you can probably surmise, not all of these rules are common sense, and if you assume an average of 10 rules for every default subreddit, that’s 500 rules you can trip over while innocently submitting a post to Reddit (not including the thousands upon thousands of rules in non-defaults). But according to the mods who regulate these defaults, the sheer volume of submissions requires a strict level of quality control.
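
To give a sense of how mechanical these rules can be, here is a purely illustrative sketch of an r/science-style check written in Python. It is not Reddit's actual enforcement code, and the domain allowlist and function names are assumptions made up for the example.

```python
# A purely illustrative rule checker for an r/science-style policy:
# links must point to an approved peer-reviewed source and be published
# within the last six months. Not Reddit's actual enforcement code;
# the domain list is a hypothetical allowlist.
from datetime import datetime, timedelta
from urllib.parse import urlparse

APPROVED_DOMAINS = {"nature.com", "sciencemag.org", "pnas.org"}  # hypothetical
MAX_AGE = timedelta(days=182)  # roughly six months

def passes_rules(url: str, published: datetime) -> tuple[bool, str]:
    """Return (allowed, reason) for a would-be submission."""
    domain = urlparse(url).netloc.removeprefix("www.")
    if domain not in APPROVED_DOMAINS:
        return False, f"{domain} is not on the approved source list"
    if datetime.utcnow() - published > MAX_AGE:
        return False, "article is older than six months"
    return True, "ok"

# Example: an eight-month-old paper is rejected before any voting happens.
ok, reason = passes_rules(
    "https://www.nature.com/articles/example",
    datetime.utcnow() - timedelta(days=240),
)
print(ok, reason)  # prints: False article is older than six months
```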

“The biggest complaint we run into is, ‘Why don’t you just let the voters decide?’” said Nathan Allen, a mod for r/science and r/askscience, two of the largest subreddits with millions of subscribers each. “I think that’s been addressed enough to where we know once you get into a large enough community that system just does not work.” Or as one redditor put it in a funny GIF uploaded to the site, when moderating a default sub “there is a vast ocean of shit that you people don’t know shit about.”

Not everyone would agree with Allen, and occasionally these disagreements bubble over into the mainstream. In 2013, the mods at r/politics were widely criticized when they released a list of URLs that were universally banned from the subreddit; the list included the Huffington Post, Salon, Mother Jones, and Gawker. The mods claimed this was because these domains regularly produced subpar content, never mind the fact that Huffington Post had recently won a Pulitzer for its investigative reporting and Mother Jones had broken arguably the biggest story of the 2012 election cycle—Mitt Romney’s 47 percent video. Then, in 2014, a gumshoe redditor, after noticing a paucity of front-page r/technology posts concerning certain topics, published a list of what he suspected were banned keywords within the subreddit. This included the words “NSA” and “Snowden,” meaning any news relating to arguably the most explosive and far-reaching tech story of the year was not allowed on the largest tech forum on the Internet. When another redditor discovered that the word “Tesla” was banned from r/technology, that redditor was then summarily banned from the subreddit entirely by one of the mods, without any reason given.

Sometimes, a mod will be widely excoriated for abusing his power. In 2014, the mod for the World of Warcraft subreddit, which boasts over 200,000 subscribers, completely shut down the community, apparently in a tantrum over his inability to access the game because of server issues. He was widely accused of essentially holding a massive Internet community hostage. That same year, the community for an XKCD fan subreddit tried to drive out their top mod because he used his prominent perch to promote “men’s rights” and Holocaust denialism (they were ultimately successful).

The year prior, it was discovered that a cofounder of Quickmeme, a meme generation website, had been a mod at r/adviceanimals, the most popular meme subreddit, and had been removing posts that linked to competing meme generators. Perhaps the most famous example of a Reddit mod’s misdeeds came in the form of Adrian Chen’s unmasking of Violentacrez, a redditor who wielded vast power on the site and whom Chen described as having “issued an unending fountain of racism, porn, gore, misogyny, incest, and exotic abominations yet unnamed.”

Reading all this, you might be wondering why you should care about the internecine politics of a few Internet forums. But when I was an editor at U.S. News & World Report, we’d see upwards of 300,000 visitors when a story of ours made it to the top of r/politics, more traffic than is typically sent by a link on Drudge. One could argue that, in terms of viewers, the mods of r/politics wield more power than the politics editors at the New York Times. In short, the mods of Reddit deserve the same kind of scrutiny that we’d give to any prominent media executive or news editor.

Debating the upvote

I reached out to several mods in the default subreddits for this story and was mostly rebuffed, but Nathan Allen, the aforementioned mod for r/science and r/askscience who I’ve spoken to for previous stories, agreed to speak to me at length about how mods create and enforce the rules that govern the site. When I brought up the controversies at r/politics and r/technology, Allen, a Ph.D. chemist who works for the Dow Chemical Company, argued that these weren’t abuses of power or editorial overreach, but rather the actions of overworked volunteers who were attempting to deal with a mountainous deluge of poor content.

“From a moderator’s standpoint it makes sense because of the sheer workload they were under,” he said. “They pretty much had to do that at r/technology because they weren’t allowed to expand the mod team and there were only three or four active mods, so what are you going to do?”

Allen’s prescription for solving this problem is simple: add more mods. R/science is unique in that it has over 600 users it has christened with various moderation powers, thereby ensuring there are at least a handful of mods policing the subreddit every hour of the day and that the volume of user activity never becomes overwhelming. This also guarantees a wider consensus within r/science, allows its users to appeal post removals, and ensures more careful consideration of how new rules are enacted.

“We have had some cases where comment moderators have gone rogue and caused lots of problems,” he said. “They were immediately identified to us in an hour or so, and one of the full mods was aware of it and took corrective action.”

I asked Allen what process r/science goes through before instituting a new rule. He described an initial discussion among the full mods in which they identify a perceived problem and discuss ways to handle it, weighing the various trade-offs and risks of enacting a new rule. Then they’ll take the discussion to the less-powerful comment mods and solicit their feedback. Finally, they’ll upload what’s called a “meta” post to the entire community describing the proposed rule, giving the subreddit’s millions of subscribers a chance to weigh in. Allen said the mods try to keep an open mind when reviewing the community feedback. “That being said, I don’t recall a situation in which we found we needed to change something based on that. Because once we explain the problem that we’re trying to solve, how many people do you have to go through before most valid points of view have been presented?”

Anytime one of these meta posts is published, whether it’s on r/science or r/serialpodcast, one of the most consistent responses is: Why not let the upvotes decide? Why ban an entire category of submission when the community, in aggregate, has the ability to separate the wheat from the chaff? But none of the mods I spoke to bought this argument.

“Once a community gets big enough, the low quality, low effort submissions and comments come to dominate everything, driving everything to low quality and low effort,” Allen said. “For example, images and memes always do well. Why? It takes two seconds to see an image and upvote it. A long-written thought piece takes a long time to generate and a long time to process, and most users just aren’t going to read it. Popular voting always appeals to the lowest common denominator and will always favor low effort content.”

This idea that increased user participation leads to a degradation of quality was consistently brought up by mods I spoke to. Davidreiss666, a mod for subreddits like r/history and r/food, argued that if you don’t like a subreddit’s rules, then you can usually find a more permissive community or start one of your own. He pointed to the controversy over r/politics banning domains as an example.

“There are other subreddits within the political space,” he told me. “There’s also r/libertarian or r/socialism or r/worldpolitics and r/news and stuff like that. R/politics was just one of them. Sure, it’s one of the biggest subreddits, but to a certain extent you could say it’s bigger because” it adheres to a higher standard of quality.

And the entire notion that Reddit is built upon a one-person-one-vote ideal, said Allen, is a false one. If Reddit’s admins wanted so much power to rest with the community, then why did they hand over so much of it to mods?

“The basis of what Reddit is is a person who creates a subreddit is the person who dictates the rule—what is allowed to be there is determined by the moderators,” he said. Essentially, trying to dictate what a mod can or cannot do is like trying to tell someone what they can post to their own personal Facebook or Twitter account. Sure, other people can comment on your Facebook post, post to your wall, and “upvote” your posts with likes, but ultimately you decide what goes on your Facebook profile.

Not everyone agrees with him on this point. “I feel like when you get a larger subreddit, it feels like the users, the subscribers, end up sort of owning it so to speak,” said Andrew Conkling, a redditor of five years. “I feel like Reddit is a very democratic place and that should extend to the moderation control as well.”

What he meant by that is there should be some way for a community to dislodge a mod. Another user named Chris Berry had a proposal for how this could be done. “You would need a super majority of subscribers or super majority of the average number of active people over the last month. There’s some number out there where it would be difficult to dislodge a mod but possible. There should be a mechanism for that.”

Alienating new users

Regardless of whether you think some form of quality control is needed, there’s another issue to consider, and that’s whether too many rules spanning the entire site will eventually begin alienating newer, casual users of Reddit. I myself have given up submitting to certain default subreddits because it’s become too easy to trip over a rule that wasn’t immediately obvious. At what point does the learning curve become too steep before users give up?

Before you dismiss this as a baseless hypothetical, consider one of the largest problems plaguing Wikipedia at the moment. For years now, that community has faced a steady loss of contributors; a 2013 study found the trend dates back to 2007, when the community put in place stricter quality control and rules. As Mashable pointed out, “rejection rates by established editors have increased sharply. Only 6% of new editors’ changes were reverted in 2006, compared with 25% in 2010.” The Wikimedia Foundation has recognized this as a potential problem for the long-term health of the encyclopedia. “We are not replenishing our ranks,” warned Wikipedia cofounder Jimmy Wales in 2011.

Every redditor I spoke to had stories of how their submissions were removed for non-obvious reasons. Chris Berry recalled a time he accidentally ran afoul of a rule in r/relationships. “I ended up getting banned for it, which seemed like a pretty severe penalty for a rule that was buried in the sidebar. So I actually had to create a second account and basically lost a year-old account because of that.”

Allen didn’t dismiss the idea that, handled in the wrong way, moderation could frustrate and ultimately alienate new users. The primary goal for rulemaking, he explained, is that any new guideline should be constructed so as to remove any ambiguity. Take, for instance, r/science’s rule that any submission must reference a peer-reviewed article less than six months old; regardless of whether you agree with it, it’s worded so that it’s not open to interpretation, thereby eliminating any potential confusion.

“Headlines are the one thing we still struggle with because it’s a problem that there’s no clear solution for,” he said. “We have to be accessible to the general public, but at the same time we don’t want massively misleading headlines.”

Just as important as the rulemaking is how those rules are enforced. Many subreddits simply send a message stating a post has been removed for violating a rule or, even worse, remove a post without any notification at all. Allen and the mods at r/science recognized that there should be a place for many of the stories that get submitted to the subreddit but don’t fit its guidelines.

“What we’ve done to counter that problem is set up another subreddit that has looser rules,” he explained. “We have r/everythingscience, and so instead of saying your submission is in violation, go away, we say, ‘Hey this doesn’t adhere to our rules, please submit it here instead.’”
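
Mechanically, that remove-and-redirect step is simple enough to script. The sketch below, again using PRAW, is not r/science's actual bot; the wording of the notice and the assumption that a submission object arrives from a mod's review queue are made up for illustration.

```python
# A sketch of the "remove and redirect" practice Allen describes, using PRAW.
# Not r/science's actual moderation bot; the notice text is hypothetical and
# `submission` is assumed to come from a mod's review of the subreddit.
def remove_with_redirect(submission):
    """Remove a rule-breaking post and point the author somewhere it fits."""
    submission.mod.remove()
    notice = submission.reply(
        "This post doesn't meet r/science's submission rules, but it may be "
        "welcome in r/everythingscience -- please consider submitting it there."
    )
    notice.mod.distinguish(how="yes", sticky=True)  # mark it as an official mod comment
```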

The path forward

Though Reddit’s admins have been reluctant to step in and regulate any particular subreddit, it seems clear that they’re conscious of the risk inherent in any system that gives too much power to too few people, and so they’ve embarked on a number of initiatives to spread that power out. In 2013, after quite a long time bestowing default status on a small group of subreddits, admins announced several new communities they were adding to the default list, as well as the removal of r/politics and r/atheism from default status.

“We could give you a canned corporate answer or a diplomatic answer that is carefully crafted for the situation,” they wrote on the company blog. “But since this is reddit, we’re going to try things a bit differently and give you the real answer: they just weren’t up to snuff.”

That same year, Reddit instituted a rule that a single redditor could moderate a maximum of four defaults (though I’m not sure what would keep that redditor from simply opening a second account to moderate more). Then, last year, Reddit announced it was expanding the default list from 25 subreddits to 50 and that it would start swapping out the list more regularly.

I don’t envy the admins. Though the Reddit community receives its fair share of bad press, it also hosts some of the most vibrant and thoughtful discussion on the Internet. The same community that fingered the wrong suspects after the Boston bombing and hosts a subreddit called r/picsofniggers also sends pizza to random strangers in need, provided support to a bodybuilder who came out as transgender, and hosts interviews with the world’s most important scientists. Providing a welcoming ecosystem for the millions of casual users on Reddit without estranging the small army of mods who donate several hours each week to weed out spam and abusive behavior is no easy task.

It’s also not a unique one. In fact, every Internet platform, once it reaches a certain level of popularity, struggles to provide a framework that’s welcoming to new users without disaffecting its early adopters, those who devoted significant time and effort to build out a community and often resist any attempts at degrading their influence. Getting this dynamic wrong can prove fatal.

Take Digg, for example, which is widely considered to be Reddit’s predecessor. At its most powerful, the site could deliver server-crushing traffic and boasted millions of users. But there emerged a constant struggle between the “power diggers”—users who spent their days trading diggs of each other’s content, thereby gaming the system so they could reach the front page—and the millions of more casual diggers who complained the system was rigged against them. Digg’s administrative staff was openly wary of the power diggers and consistently introduced new reforms to the system in an attempt to level the playing field. But the reforms backfired, eventually causing a mass exodus from the site.

At its height, in 2009, Digg was among the most popular destinations on the Internet. Comscore estimated its monthly audience at 32 million unique visitors.

Its monthly audience today? 342,000, and falling.

Simon Owens is a technology and media journalist living in Washington, D.C. Follow him on Twitter, Facebook, or Google+. Email him at simonowens@gmail.com.

Illustration by Jason Reed

 