On the afternoon of Aug. 12, 2017, a few colleagues and I arrived back at our hotel room in Charlottesville, Virginia. Drenched in sweat, I removed my bulletproof vest and began looking through images I had captured on my camera.
I had just witnessed the death of Heather Heyer and was resolute in my goal of showing exactly what had happened that day.
One of my friends turned on the television, and I felt an instant disconnect between what I had seen and what was being explained on CNN. It was analysis of what was happening, not reporting of what had happened. And right now, I’m being presented with that same impossible choice by YouTube, where I host my footage: analyze what happened on the ground, or lose the ability to report what actually did.
I run a news agency called News2Share where I film and live stream political activism, and put the raw footage online. My footage from Charlottesville was featured on several major news networks and dozens of films, including two Emmy winners and an Oscar winner.
My objective with this approach—raw, minimally edited footage and live streams—is to be a true primary source. I co-founded my outlet, News2Share, in 2014 with another then-student at American University to do exactly that, starting with footage of the early Black Lives Matter movement.
But my agency is at a crossroads thanks to social media crackdowns on “disinformation.” Right now, YouTube has begun penalizing and removing raw video documentation of events if people being recorded make disallowed claims such as “the election was stolen” without the presenter offering “countervailing views” to repudiate it. This policy is enforced regardless of the content’s journalistic value or intent of the uploader.
The site will no longer let me show what happened. Instead I have to tell.
My YouTube channel is filled with years of raw footage of political activism which major networks have relied on in their reporting. Since the Capitol insurrection, my work has been licensed by CNN, NBC, the Washington Post, the New York Times, Business Insider, BBC, ITV, PBS, and countless others. The footage is indisputably a valuable source in serving the broader journalism community and the public.
I use YouTube as the definitive destination for my work. My archive spans six years of the social movements shaping America, and it’s been an incredible platform to put my work in front of the eyes of audiences, news producers, and filmmakers alike. But that’s changing.
On Jan. 6, I began my coverage of the day at the Capitol itself. At 9:17 a.m., I began a live feed showing Trump supporters already assembling at the Capitol.
My footage of the deadly violence that followed has garnered drastically more attention and played countless times on news networks, but it was two other videos that the platform took issue with.
On Jan. 24, right before the House delivered the article of impeachment charging former President Donald Trump with “Incitement of Insurrection,” YouTube removed my uncut, raw video of his Jan. 6 speech, where he riled up his supporters, because, it said, the video “advances false claims that widespread fraud, errors, or glitches changed the outcome of the U.S. 2020 presidential election.”
As I struggled to get in touch with the platform, it demonetized my entire channel, a move it promptly reversed after Fox News reported sympathetically on it.
In the fallout of the demonetization, YouTube stood by the original takedown, noting that the video lacks “countervailing views or sufficient context of the claims made in the footage.”
In a follow-up email, YouTube told me: “If content contains misinformation that violates our policies, in order to comply it must contain countervailing views, and/or additional news/reporting context must be present in the images or audio of the video itself. Providing this in the title or description may be insufficient. More tactically, this could look like a graphical overlay, host voiceover, etc but would need to be incorporated into the video itself by the creator.”
In effect, YouTube is saying that primary source journalism that covers contentious issues—video of a historical moment or speech—is only acceptable if interpretation or analysis is added.
Begrudgingly, I re-uploaded the video with a two-minute warning at the beginning directing viewers to sources verifying the election’s integrity. This was apparently deemed a sufficient “countervailing view,” and the video stayed online.
At its heart, the regulation forces primary source journalists like myself to make an impossible choice: Behave like a secondary source—that is, offer commentary—or be no source at all.
YouTube doubled down on this position in the weeks following the incident, broadening the scope of enforcement.
On Feb. 24, YouTube deleted my live stream that recorded the beginning of the day of the Capitol riot, using the same logic.
YouTube generally uses a three-strike system. In some cases, it removes videos that it sees as violative without penalizing the account otherwise. If it does want to penalize a content creator, it can first put a “warning” on the account. The warning does not expire.
Three strikes within 90 days yields a total account deletion.
The following day, it removed raw footage of a congressman speaking at an early so-called “Stop the Steal” rally.
On March 3, YouTube deleted three further videos and applied a strike to one of them, thus suspending my account.
The video I received a strike for (backed up here) was taken outside President Joe Biden’s inauguration, where a religious hate group protested. Their demonstration had nothing to do with the 2020 presidential election, but one apparent Trump supporter confronted the group and shouted, “I still believe that the election was stolen, but you sir, all of you, have been promoting the wrong form of Christianity.”
The video was removed for election disinformation.
The other two videos were footage of far-right self-proclaimed “white majoritarian” Nick Fuentes giving a speech at a December Million MAGA March, and a conservative rally filmed by a freelancer of mine in Oregon.
My Fuentes speech was used at the impeachment trial itself, and the protest video was licensed by NowThis. In neither of those contexts was the content censored by YouTube, nor should it have been, but the discrepancy highlights the arbitrary nature of the enforcement.
YouTube acknowledges in its policies that an exception to absolute rules about hate speech and election disinformation exists for the purposes of documentary.
“Videos that might otherwise violate our policies may be allowed to stay on YouTube if the content offers a compelling reason with visible context for viewers,” it writes. “We often refer to this exception as ‘EDSA,’ which stands for ‘Educational, Documentary, Scientific or Artistic.’”
My documentary footage is intended to be a primary source library of the movements shaping America. The journalistic value is unquestionable; virtually all of the videos removed have been licensed by major networks or documentary films. I agree with YouTube’s EDSA premise, but unfortunately, I don’t believe its hardline stance makes sense, nor is it applied consistently.
To post a truthful, raw video account of a situation, one must now include commentary repudiating the content of the documentation, effectively stripping it of unvarnished authenticity.
It demands that pure truth be replaced with a sprinkle of editorialization.
For me and other freelancers, this runs completely against our business model. I don’t sell my opinions about what I film. I sell licenses to raw footage. Imagine for a moment the absurdity if Getty Images required photographers to include on every picture of Trump a “this person is bad and wrong” watermark.
Beyond the stated policies of the platform, the crucial issue is that a fair hearing on these decisions can be virtually impossible to obtain. Back in 2019, my entire channel was erroneously demonetized amid a wave of content moderation meant to tackle hate speech.
It took a full seven months of advocacy including several major media profiles before the platform apologized and fixed it. I suspect that other users with smaller followings find it impossible to seek resolution.
As a private company, YouTube is entitled to do as it pleases. It has no legal obligation toward fairness, and calls for laws or regulations to tackle the problem may have incredible unintended consequences.
However, YouTube should recognize the incredible power it has and the responsibility that comes with it.
As of now, content moderation decisions are made without the user’s input. The user is accused and penalized at once. Appeals are allowed but generally rejected without any written explanation.
When historians look back on the Trump era, they’ll be inundated with commentary. Hopefully, when they go looking for raw footage on the internet, they’ll still be able to find it.