
As followers get more violent, should 8chan ban QAnon?

A Q supporter wound up in court for a murder. Is it time 8chan stepped in?


Mike Rothschild

Tech

Posted on Mar 20, 2019   Updated on May 20, 2021, 4:42 pm CDT

When it was revealed that the alleged killer in the Christchurch massacre first posted his racist manifesto on the anarchic image board 8chan, the news brought renewed attention to the anything-goes, no-censorship forum that’s become a haven for some of the worst ideas on the internet.

And unsurprisingly, all that attention has been extremely negative.

NPR’s Jasmine Garsd highlighted the role 8chan had in letting the shooter put up the link to his Facebook stream of the attack, writing that he “seems to have first advertised the attack on the online forum 8chan, a message board known for right-wing extremist users,” while other outlets have wondered if it’s time for 8chan to be pulled down entirely.

And while internet hosting companies in several countries, including Australia and New Zealand, have responded to the Christchurch massacre by banning 8chan on their own servers, such a wholesale banning of an entire website is unlikely in the United States.

Forbes’ Thomas Brewster posited that “even though today it’s easy to find users promoting violence and links to possible child-exploitation material,” the site isn’t likely to be wiped out, as its infrastructure and security provider Cloudflare has generally refused to delete other extremist sites, making only a few exceptions.

While 8chan itself likely isn’t going anywhere, is there a possibility of the site policing itself? In particular, will 8chan take action about the increasingly violent rhetoric of the QAnon conspiracy, which features cryptic posts by a supposed Trump administration insider—all of which originate on 8chan?

Over the past week, Q’s normal mix of patriotic glurge, unsolvable puzzles, and broken promises of great events to come has gotten decidedly more focused on acts of violence, including a series of posts earlier this week highlighting the words “target,” “ammunition,” and “KILL” [sic].

The poster followed those with a series of official portraits of Obama administration officials like Sally Yates, James Clapper, and James Comey. All have been featured in Q drops before, and all were highlighted right after Q broke yet another deadline, claiming the 21-day countdown to mass arrests that started in late February would be delayed because the Robert Mueller report wasn’t ready.

At the same time, the violence that’s at the center of QAnon’s “great awakening” of purged Democrats and deep staters is spilling out into real life. The alleged Christchurch killer didn’t mention QAnon in his manifesto, but they share similar obsessions with child sex rings and un-evidenced crimes by the great leaders of the world.

Then, there’s also the bizarre case of Anthony Comello, the alleged shooter who murdered Gambino family boss Francesco Cali, who drew several QAnon slogans on his hand during an extradition hearing.

It’s clear that the fantasy being spun by QAnon is colliding with the reality of internet self-radicalization. So would 8chan deleting or banning QAnon content forestall another massacre?

8chan is known primarily for its ethos of allowing anything other than child pornography to stay up without what it deems “censorship.” In fact, many movements that have been banned from other image boards, including 8chan predecessor 4chan, have found a willing home on 8chan.

The initial QAnon posts were made on 4chan; after a few weeks, the Q poster claimed 4chan had been compromised by “infiltration” and migrated to their now-familiar forum on 8chan.

And while more mainstream sites like Reddit have banned virtually any discussion of QAnon, the /qresearch/ board on 8chan continues to be the most active hub of Q-related conversation, though much of it is anarchic and unrelated to Q itself.

On only a few occasions has the site taken down content that wasn’t illegal but violated the boundaries of good taste that hem in virtually every other social media site.

In 2015, the 8chan board /Baphomet/ had its contents wiped by its moderator after a user posted the contact information and Social Security number of a federal judge, along with credit card numbers and death threats. But even that was in response to the illegality of posting identifying and financial information. Beyond that, the Christchurch manifesto was posted on /pol/, one of the most active fora on the site. There’s been no talk of the site removing that board, so it’s not reasonable to expect 8chan to start enforcing posting rules it has never enforced in the first place.

That’s especially true for QAnon, since the two murders that have allegedly been committed by followers haven’t been directly related to the conspiracy theory.

If 8chan (which didn’t respond to a request for comment) isn’t going to take action to deplatform QAnon, it’s up to other sites to make sure that what starts on 8chan doesn’t spread. Reddit has already done so with its ban on Q fora, driving Q decoders from the r/greatawakening forum, which had over 70,000 followers, to one on the alt-right-leaning site Voat with just 15,000.

And Twitter is starting to take action against major QAnon accounts, banning prominent acolytes with large followings for making violent threats. YouTube and Facebook are also finally addressing the problem of self-radicalization, with YouTube serving strikes against several QAnon video makers, and both sites banning content from one-time QAnon supporter Alex Jones.

But given that QAnon posts and discussion originate on 8chan, and that site has made no effort to police its content, it’s likely that the movement is here to stay. 

And even more violent acts inspired by the poster aren’t going to change that.
