The cost of absolute free speech? Less speech
On the Internet, bigots and trolls may feel absolutely free to spew hatred. But what about the people they attack?
“I disapprove of what you say, but I will defend to the death your right to say it.”
That statement, often misattributed to Voltaire, gives perfect voice to a core belief that has defined democratic values since the Enlightenment.
It is also a sentiment at the core of how we handle comments on the Internet.
The most glaring example is perhaps Reddit, which hews so closely to the principle of free speech that it features user-maintained sections of the site (subreddits) that are openly racist or dedicated to the fetishization of sex crimes.
This problem is in no sense limited to Reddit, however.
Vlogger Anita Sarkeesian, a woman who dares to talk about sexism in video games, was recently forced from her own home because of threats to her and her family that arrived on Twitter and numerous other social networks.
It’s not even limited to sites that primarily rely on user-generated content. Even sites offering professionally produced content can go full-on “Lord of the Flies” in their comment sections. Gawker Media arguably leads the industry in commenting systems with its home-grown platform, Kinja. Yet recently, the news site Jezebel called out Gawker, its own parent company, for maintaining a commenting system that allowed abusive behavior.
Ultimately, Gawker updated Kinja to discourage abusive comments, requiring users to establish a track record before they’re allowed to post openly to the site without review from moderators—we’ll see if it’s effective.
On the Internet, we allow comments we vehemently disapprove of, that verge on hate speech, harassment, and threats because, we say, we will not curb free speech. We may hate these comments, but we will defend to the death your right to post them anonymously on an open commenting system.
And yet, is that stance actually encouraging free speech? Or, by being so uncompromising on the rules of civic engagement online (i.e., a commenting system’s Terms of Service agreement), have we in fact compromised free speech?
We are confusing anarchy with freedom.
Consider this: On Wednesday, a group of moderators from Reddit, arguably the worst hellhole of trolldom on the Internet, signed an open letter asking the site to take a more proactive stance against hateful and racist messages. One signer moderates the r/rape forum, which is intended to be a support space for people who have suffered sexual assault. Yes, the moderator has the right to remove offensive posts in the subreddit. However, the moderator often has to delete entire threads spawned by trolls, which also removes the victims’ valid and important responses.
In that moderator’s own words:
“Personally, there is nothing that I enjoy less than not only having to remove the messages that violate our rules but the messages of the person they are attacking. Not only were they forced to defend themselves and their account of events as if they were on trial, but I then remove their brave statements of defense. This is often the first time these survivors mentioned their rape to another person only to be told that they are lying or just a slut. And then I have to follow and take their words from them, usually because their half of the conversation can be triggering to other users. It doesn’t feel good.”
On top of that, r/rape’s previous top moderator was herself driven off Reddit by harassment and doxing, the practice of revealing information that can personally identify a user (doxing is against Reddit’s rules, but the tools to ban users are largely ineffective at enforcing those rules in any meaningful way).
In other words, the laissez-faire system of comments on Reddit manages to allow many users (the trolls) to silence others, just as effectively as any government apparatchiks ever have.
Many of us would like to believe that the Internet is a chance to build a more perfect society. And in some sense, it is that impulse that underlies our almost mindless cries of “free speech.” But as we shout, we also must ask ourselves, are we actually creating a society that lives the values we so loudly espouse?
Because this is the current state of the Internet, at least in the comments: While those who would spew racist, hateful remarks with all the discrimination of a grenade feel their speech now enjoys near-perfect freedom, women seeking empathy for traumatic experiences they’ve endured feel anything but free to speak.
Illustration by Jason Reed
Nicholas White is the founder and editor in chief of the Daily Dot. His work has appeared in Wired, PBS, the Associated Press and elsewhere, and his reporting has been honored for excellence in journalism by the Associated Press.