
YouTube announces changes to election misinformation policy after pro-Trump Capitol riot

YouTube announced the new policy 'due to the disturbing events that transpired yesterday.'


Andrew Wyrich

Tech

Posted on Jan 7, 2021

YouTube announced on Thursday that it is changing how it will handle channels that make false claims about the 2020 election in the wake of Wednesday's pro-Trump riot at the U.S. Capitol.

The website removed a video from President Donald Trump on Wednesday in which he referred to the supporters who stormed the U.S. Capitol, got inside, and ransacked offices as “very special” and added, “We love you.”

On Thursday, YouTube said any channel that posts new videos containing false election claims will now immediately receive a strike, which temporarily prevents the channel from uploading videos. Channels that receive three strikes are banned from the platform.

The company announced last month that it would remove videos that included false claims about the 2020 election. Now that removal will also come with a strike.

YouTube’s note that only new videos would be subject to the policy indicates that Trump would need to violate the company’s policy three times before being removed from the platform.

“Due to the disturbing events that transpired yesterday, and given that the election results have now been certified, starting today *any* channels posting new videos with false claims in violation of our policies will now receive a strike,” the company tweeted. “Over the last month, we’ve removed thousands of videos which spread misinformation claiming widespread voter fraud changed the result of the 2020 election, including several videos President Trump posted to his channel.”

The company added:

“Channels that receive a strike are temporarily suspended from posting or live streaming. Channels that receive three strikes in the same 90-day period will be permanently removed from YouTube. We apply our policies and penalties consistently, regardless of who uploads it.”

YouTube’s new policy announcement comes as social media companies have taken action against Trump in the wake of the riot at the Capitol. During the chaos, pressure mounted on the companies to suspend the president’s accounts.

Besides YouTube, Facebook and Twitter also removed the video.

Twitter put a 12-hour block on Trump’s account after deleting the video, and Facebook blocked access to Trump’s account for 24 hours before CEO Mark Zuckerberg announced on Thursday that the ban would extend “indefinitely,” or at least for the two weeks until Inauguration Day.


First Published: Jan 7, 2021, 12:28 pm CST