Man painting over a giant Q (Super1973/Shutterstock, Licensed; remix by Jason Reed)

Facebook shut down massive QAnon groups right before the 2020 election—did it swing the vote?

Imagine a whole political party being banned just weeks before a vote.

 

Alex Thomas

Tech

Posted on Oct 28, 2021




As misinformation wreaks havoc in the United States, big tech is still struggling to get control of a problem that spun out of hand years ago. Anti-vaccine and election misinformation now runs rampant on every platform, and much of it has coalesced under the mantle of QAnon, the conspiracy theory that the world is run by a network of Satan-worshipping pedophile elites that former President Donald Trump is fighting.

While the general consensus is that banning these movements from the major platforms was the correct decision, when the tech giants did it raises almost as many questions as why. It was just a month before the 2020 election that two of the largest online platforms banned anything related to QAnon.

Whether you believe in the conspiracy or not, or think the movement should have a platform or not, there is no denying that it has millions of supporters. Almost all of them are diehard Trump fanatics. And as they fought in the final days to re-elect their God-Emperor, they lost the biggest pulpits for spreading their gospel.

For nearly three years, Facebook allowed QAnon groups to grow and mutate on its platform, helping take the conspiracy from obscurity to the mainstream. It is no longer a fringe movement. A poll from early this year found that as many as 15% of Americans believe in aspects of it.

On Facebook, QAnon surged in popularity during the summer of 2020, as conspiracy theorists drew in new followers through the #SaveTheChildren movement and COVID-19 conspiracy groups took hold.

Over the summer, the bans began, with Twitter taking the lead in kicking QAnon influencers off its site. But the movement had already spread to every crevice of the web.

In September 2020, wildfires swept across the West Coast. QAnon conspiracy theorists used their social media reach to spread the false rumor that antifa had started them. Emergency phone lines were flooded with false reports, and the FBI issued a statement pleading that “reports that extremists are setting wildfires in Oregon are untrue. Help us stop the spread of misinformation by only sharing information from trusted, official sources.”

On Oct. 6, Facebook took action, saying it would “remove any Facebook Pages, Groups and Instagram accounts representing QAnon.” The platform cited the wildfire misinformation as one of the reasons behind its decision. But the blanket ban also came just 27 days before an election that was on a razor’s edge.

Behind the scenes, the concern about QAnon and the election was palpable. In August 2020, a researcher warned of the size of the conspiracy on the site.

“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” a researcher wrote, according to the New York Times, citing recently leaked documents. “In the meantime, the fringe group/set of beliefs has grown to national prominence with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream.”

A candidate in Florida even admitted that a QAnon Facebook group inspired her to seek a seat in Congress. Darlene Swaffar said that the “QAnon Great Awakening” page “inspired me to explore my run for Congress in 2020,” according to Media Matters.

The concern about Q on the platform and its role in the election wasn’t without merit. A study after the November vote found a correlation between large pockets of belief in QAnon and higher-than-expected vote totals for Trump.

“The higher the support for QAnon in each state, the more the polls underestimated the support for Trump,” a researcher told the Times.

That October statement from Facebook was an update to an August release in which the company said it had begun removing some QAnon groups and ads. A few major QAnon figures, like misinformation superspreader Liz Crokin, were banned before Facebook’s new push against misinformation. But QAnon content was still rampant on the platform up until October.

And Facebook’s decision was the biggest barrier yet in Silicon Valley’s slowly rising dam against the flood of QAnon.

Twitter was the first social media platform to take action against the QAnon conspiracy theory, a few months ahead of Facebook. In July 2020, Twitter began blocking URLs to QAnon sites from being shared. At the time, QAnon accounts were hawking coronavirus misinformation, blaming the pandemic on their usual scapegoats: billionaires and blood-guzzling celebrities. A few days after Twitter’s crackdown, TikTok announced it was taking action by hiding QAnon search results, though it allowed QAnon videos to remain on the platform.

Dr. Samuel Woolley, the program director for propaganda research at the University of Texas Center for Media Engagement, told the Daily Dot: “What’s really clear is that none of these social media firms want to be the first one to make the big, difficult calls about bans or censorship.”

With Facebook making its decision, others followed suit. 

On Oct. 7, Etsy made a similar announcement, saying it would remove “items related to QAnon.” Searches for “QAnon” or the QAnon slogan “WWG1WGA” now yield no results.

On Oct. 15, YouTube followed Facebook and Etsy with its own vague announcement mentioning QAnon. The video hosting site claimed its existing policies were already effective but said it was now “removing more conspiracy theory content used to justify real-world violence.”

It’s a cascading effect that may have mattered more than people can imagine. The Pizzagate conspiracy, a precursor to QAnon, first appeared a little over a week before the 2016 election and metastasized online in a viral, unchecked spread. As voters went to the polls, vicious rumors about Hillary Clinton, who had already seen conspiratorial nonsense spread about her all through the campaign season, kept popping up in corners of the internet.

Just 11 days after a Pizzagate believer took a gun into a Washington, D.C., pizza parlor, Facebook issued a statement outlining how it was fighting hoaxes and fake news. While the statement did not mention Pizzagate by name, the conspiracy theory had cropped up on Facebook’s platform, and the reasoning behind the statement was obvious.

But in that case, the election had already passed.

Whether that affected the outcome is unclear, but with margins so tight in 2016, any little push could have made the difference. In 2020, in the final weeks before the election, QAnon supporters were no longer allowed to use Facebook and YouTube to smear Joe Biden. Whether you believe in the conspiracy or not, there’s no doubting that a large swath of Trump supporters was prevented from pushing their views and concerns right before people went to the polls.

And the extent of the similarities between Q drops and the political attacks marshaled by Trump’s team ahead of the election is striking. On the same day the New York Post published a story with compromising photos of Hunter Biden, a week after Facebook’s QAnon ban, Q posted those same photos and wrote of Joe Biden, “does having early stage dementia help you re: previous deniability?” At a moment when the Hunter Biden photos would have overtaken the major QAnon groups on Facebook, those channels had been completely shuttered.

Would believers have been able to push the story to a larger audience, casting greater aspersions on Biden in the weeks before the vote? The answer is unknowable.

Over the next few weeks, Q was relentlessly obsessed with the Hunter Biden story, posting about it multiple times a day. But whereas just a month earlier it could have been memed into the national consciousness, believers had lost their biggest avenues for posting about it.

While the decision may have ostensibly been about wildfires, Facebook’s move to clamp down on QAnon silenced almost 3 million people, all of whom would no doubt have been trying to sway the national consciousness toward Trump.

It’s very possible that, without the ban, the election could have gone an entirely different way.

