Reddit moderation has long been a complicated issue. Within the site’s darkest recesses are subreddits that many would find distasteful. Removing them, however, is no easy task, particularly if Reddit wants to be perceived as a bastion of free and unfiltered discussion. Offense, after all, is fundamentally subjective, and the company is wary of being seen as partial to a particular viewpoint.
That is primarily why today’s change to Reddit’s policies against harassment and bullying is a landmark. In a post to /r/announcements, Reddit administrator landoflobsters explained that abusive behavior will no longer need to be “continued” or “systematic” in order to become actionable by the company.
“Chiefly, Reddit is a place for conversation,” they said. “Thus, behavior whose core effect is to shut people out of that conversation through intimidation or abuse has no place on our platform.”
For the first time, Reddit also plans to accept reports from “bystanders” who witnessed abuse but were not its target. Previously, the company only accepted reports from those who had received inappropriate comments firsthand.
Hoping to assuage the fears of users wary of heavy-handed enforcement, the Reddit representative explained that the company will attempt to weigh context. The site plans to use machine-learning tools to prioritize reports, but these will play no role in actual enforcement. That job will remain in the hands of human moderators.
By lowering the threshold at which a post or subreddit becomes actionable, and by allowing anyone to file a report, Reddit will inevitably receive more reports. The question remains whether the so-called “front page of the internet” can cope.
According to Amazon’s Alexa ranking service, Reddit is the 18th most visited website in the world. It has an army of moderators, but these are largely unpaid and, according to a 2018 Engadget report, routinely face harassment and threats. Without finding more people to perform this thankless job, it’s not clear how Reddit will process these new reports in a timely and effective manner.
And then there’s the fact that moderating any platform that is primarily text-based, and drenched in context, is inherently hard. It’s safe to assume that mistakes will be made, drawing the ire of the broader Reddit community, which is intensely sensitive to any perceived encroachment on the unfiltered and frank speech the site is known for.
When Reddit crosses that line, it’ll be interesting to see how it responds. Will it water down these new rules, or will it stand firm?