Social news site Reddit is on the verge of a robot takeover. But don’t fear, human redditors. That’s a very good thing.
The AutoModerator bot takes many of the monotonous chores of Reddit moderation out of the hands of real people. Shortly after its release, the bot is already in use in more than 54 subreddits. And once it spreads to more of the site’s top communities, it will vastly improve the Reddit experience for millions of people. And they’ll probably never even know it.
The gears that turn the Reddit machine are pretty much entirely human—volunteer moderators who spend countless hours of their free time on the site. Their main tasks are simple but time-consuming. First, they make sure posts to their subreddits stay on topic and don’t violate the rules. (You shouldn’t post a Skyrim screen capture in r/mylittlepony, for instance.) Second, they fix posts that have been incorrectly labeled as spam by Reddit’s filter.
It’s that second responsibility that affects the most people. Reddit’s spam filter isn’t so smart. It frequently marks things as spam that clearly aren’t—if your legitimate submission has simply disappeared, it’s probably because the spam filter wrongly removed it.
The only way to fix this pretty big problem is to message the moderators. Did we mention they’re all volunteers? It’s not like they’re getting paid to hang out on Reddit all day. If no moderator is around to catch your wrongly filtered submission, you’re pretty much out of luck.
“In an active subreddit, having your submission filtered for more than about 20 minutes can practically guarantee that it won’t be seen by many people,” Deimorz, the Reddit moderator who created the bot, explained to the Daily Dot.
This happens all the time. And the victims aren’t just the posters. If a really good submission gets incorrectly labeled as spam, the subreddit’s subscribers miss out on great content, and finding great content is really the whole point of Reddit.
So Deimorz, who helps moderate some of the site’s biggest sections, including the million-subscriber r/gaming, decided to do something about it. His AutoModerator bot addresses all of these problems at once. It acts as a moderator who is always on duty, correcting the spam filter’s errors and automating the simple, monotonous, time-consuming moderation tasks.
Deimorz gave a couple of examples:
“/r/Music uses the bot to automatically approve any submissions from YouTube. In the two weeks they’ve been using the bot, this has approved almost 1,000 submissions that were initially filtered. Unless manually fixed by the moderators, none of these submissions would have ever been visible to visitors to the subreddit. Some smaller subreddits even use the bot to approve anything that gets filtered, because they don’t have issues with spam, and the subscribers voting can handle any inappropriate submissions, so it’s unnecessary to have anything filtered by default.”
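A rule like the /r/Music example above boils down to a simple condition check: if the spam filter removed a submission, and its link points to a whitelisted domain, re-approve it. Here’s a minimal Python sketch of that idea—the function names, fields, and domain list are all illustrative assumptions, not Deimorz’s released code:

```python
# Hypothetical sketch of an auto-approval rule like the /r/Music example:
# re-approve spam-filtered submissions whose link domain is whitelisted.
# Field names ("domain", "filtered") and the whitelist are assumptions
# for illustration, not the actual AutoModerator code or Reddit API.

APPROVE_DOMAINS = {"youtube.com", "youtu.be"}

def should_approve(submission):
    """Return True if a filtered submission should be re-approved."""
    # Only act on posts the spam filter removed; leave everything
    # else to the subreddit's human moderators.
    if not submission.get("filtered"):
        return False
    return submission.get("domain") in APPROVE_DOMAINS

# A moderator bot would poll the subreddit's spam queue on a timer
# and approve whatever matches the rule:
spam_queue = [
    {"domain": "youtube.com", "filtered": True},   # legit music video
    {"domain": "spamsite.biz", "filtered": True},  # left for humans
]
approved = [s for s in spam_queue if should_approve(s)]
```

Small subreddits that “approve anything that gets filtered” would just have `should_approve` return True for every filtered submission, trusting subscriber votes to bury anything inappropriate.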
The potential benefits are practically limitless. Deimorz isn’t some amateur computer programmer; programming is his day job, and he made sure the bot’s functions are as customizable as possible. (He publicly released its code yesterday.)
That means that as the bot proliferates across Reddit, it can be tinkered with and tweaked, refined into a thousand different iterations to help make Reddit run more smoothly.
Bots aren’t entirely new to Reddit. The site releases its source code publicly, so ambitious or just bored programmers can play around as much as they want. One popular bot posts plain-text transcriptions of linked tweets; another searches for image reposts and links back to the original submission.
Those generally provide a service in Reddit’s comment section; Deimorz’s bot, by targeting moderation, has a potentially much bigger effect on the Reddit user experience.
And he’s not nearly finished.
“There are still a lot of features I’d like to add to it,” Deimorz wrote.
The Reddit robot invasion is just beginning.