
Social media companies destroyed the history of QAnon

The viral spread of QAnon has been lost to blanket bans by big tech.


Viola Stefanello





In 2019, the FBI listed QAnon as a dangerous conspiracy theory with the potential to fuel domestic terrorism in the United States. But by then, the conspiracy theory had already made itself comfortable on mainstream social media platforms.

And in their efforts to eliminate its spread, social media companies managed to destroy its entire history.

The conspiracy theory, centered on the belief that the liberal media and political elites are actually Satan-worshiping pedophiles, easily crawled out of the 4chan hole it hatched in and found homes on Facebook, Twitter, and other platforms.

After spreading across mainstream social media, the conspiracy theory exploded into the offline world through candidates running for Congress, violent crimes, and assassination plots, culminating in the attack on the U.S. Capitol on Jan. 6.


Journalists and researchers have studied the conspiracy theory and reported on the real-life dangers stemming from it since 2018. Yet social media companies, whose platforms served as ripe recruitment grounds for new QAnon believers, didn’t start doing anything about it until 2020.

It was too little, too late: QAnon theories crept into wellness and health groups as much as they did anti-vax and conservative pages, ultimately merging into a conspiracy singularity.

Removing QAnon content, however late, was tech companies’ only way to save face after finally acknowledging that something so dangerous was happening under their noses. But it also meant deleting a considerable chunk of internet history, depriving researchers and experts of valuable insight into how the biggest conspiracy theory of our generation spread like wildfire, even to the least expected online communities.

It’s not the first time platforms have been accused of sweeping dangerous or illegal content under the rug, crushing the ability to shine light on misdeeds. Researchers have been speaking out against Facebook’s practice of deleting content representing “critical evidence for repatriation efforts and war crimes,” and YouTube was also accused of erasing crucial evidence of war crimes as it enforced its ban on Islamic State videos. In those instances, Syrian Archive researcher Jeff Deutch explained that “it’s important that this content is archived so it’s accessible to researchers, human rights groups, academics, lawyers, for use in some kind of legal accountability.” 


The same worries apply to QAnon deplatforming. After letting QAnon fester on their platforms for years, the companies cracked down on it, deleting all evidence of what had happened and leaving researchers to scramble for screenshots and archived posts.

The memes; the convoluted, made-up stories about adrenochrome and international pedophilic sex rings; the death wishes—basically the whole scaffolding that had been holding up the immense web of delusions at the heart of QAnon—disappeared behind suspended accounts and removed posts.

Now, scrolling through old articles about the conspiracy theory feels like walking through a content cemetery.  

Most Twitter profiles included in a Rolling Stone investigation into the Pizzagate conspiracy theory, an early harbinger of QAnon, are now gone forever. The deletion of those posts removes precious insight into how Pizzagate beliefs were quickly absorbed by QAnon. One account, @NIVIsa4031, tweeted constantly and specifically about a “hot rumor” concerning the FBI. The account was deemed “part of a bot network” that shared Pizzagate content, but it is now gone. Instead of showing what else it amplified, the account just displays a message that it has been suspended for violating Twitter’s rules.


With the account gone, it’s impossible to see how it interacted with others, where it may have gotten its content, who followed it, or who its mutual followers were—all information that is vital for understanding the genesis of a conspiracy theory spreading online.

The same happened with a November 2017 Daily Dot article explaining the clearest precursor to QAnon, the “follow the white rabbit” craze, which showed how even the most harmless-looking symbols can be read as evidence of transnational crime happening under our noses. Those first few posts were among the QAnon conspiracy’s earliest crossovers from 4chan to Twitter.

The Daily Dot spoke with some of the people who played a role in that initial crossover, who claimed that the people behind QAnon took their live-action role-playing concept from them. The group pushed the “follow the white rabbit” slogan, which worked its way into some early QAnon language.

A prominent Twitter account at the time, @THEWHI17ERABB17, which made some very early posts about Q, is now gone. Like many others, the account was suspended, making it difficult today to research it further or see what the connections between the two conspiracies actually were in their early days.


Besides the missing content from the conspiracy theory’s early days, primary sources—major accounts that spread QAnon content—were systematically erased by the social media companies.

The first big platform to enforce a ban on all QAnon content—after banning accounts that actively promoted violence earlier that year—was Facebook, in October 2020, ahead of the U.S. presidential election. The ban dissolved 1,500 pages, groups, and profiles, all of which—if examined thoroughly—could have given researchers context about how the conspiracy theory spread and evolved after it moved from 4chan to more mainstream platforms.

YouTube deleted videos touching on QAnon theories as part of a wider crackdown on white supremacist videos in 2019, and in October 2020 it said it would prohibit QAnon content threatening violence against a group or individual, but it never announced a sweeping ban. Still, Q-Tubers who were instrumental in pushing the theory have now disappeared.

Twitter also removed a few thousand accounts spreading QAnon theories in July 2020, adding that it would block QAnon-related trends from appearing in its trending topics and that it wouldn’t allow people to post related links on its platform.


Major accounts were taken offline, like “Inevitable ET,” a user with a quarter-million followers. Tommy “Tommy G.” Gelati, a QAnon promoter and podcaster, was banned as well. The platform also banned another promoter, “Joe M,” whose earlier handle, @StormIsUponUs, had been banned before he rebuilt a following as @SheepKnowMore. Together, the three accounts had more than half a million followers. How many people they brought into the conspiracy, and how they did it, has essentially been erased.

Amazon, TikTok, Pinterest, Etsy, and other platforms big and small all have taken steps to smother QAnon-related content on their websites.

The big wake-up call came after the attack on Congress on Jan. 6: Twitter removed 70,000 QAnon-related accounts, saying it was cracking down on content that had the potential to cause offline harm.

For researchers trying to shine a light on how a fringe conspiracy theory turned into a national security threat, radicalizing thousands of people and building a myth around former President Donald Trump in the process, watching as this happened has been frustrating.


“From a moderation standpoint, Twitter, Facebook and the other major platforms taking down a lot of QAnon content is probably the right call,” Jared Holt, a resident fellow specializing in political extremism at the Digital Forensic Research Lab, told the Daily Dot. “But for people who research things like QAnon, one of the frustrations that come with the job is that when the content goes down, unless you thought to archive it ahead of time, it’s gone forever.”

Holt added: “As a researcher, it can get frustrating: not being able to access the content makes it harder to tell what happened before the content was taken down. Accessing that content could promote and develop a public understanding of how these platforms work, and the kind of movements and disinformation that can develop on them.”

Holt is not the only researcher who has expressed this kind of frustration. 

Shruti Phadke, a Ph.D. candidate at the University of Washington, recently worked on a thorough paper on cognitive dissonance in QAnon communities alongside Professor Tanushree Mitra.


“Content posted by QAnon believers is the most important resource for understanding their ideology, belief system, and other social processes,” she told the Daily Dot. 

Her research ended up relying on content from Reddit, 4chan, and 8chan because QAnon-related content and accounts on other big platforms were no longer accessible, even for research purposes. But Reddit, too, has stripped away the conspiracy’s history: in March 2018, it banned r/CBTS_Stream, a prominent hub of QAnon content started by early adherents. The subreddit was one of the first crossovers from 4chan to more mainstream social media, and it had more than 20,000 subscribers before it was banned.

“On the one hand, it is important to curate and remove content that has disinformation and harmful ideas,” Phadke said. “On the other hand, it is also important for researchers to study that content to design systems resilient to future harms. We initially set out to study all 19 QAnon communities that were banned but we found data for only 12. We were not able to gather Twitter and Facebook data as the QAnon related communities and accounts were getting banned.”

Despite the bans, QAnon marches relentlessly forward. And social media companies chose saving face over helping the world understand why it does.
