Membership gives users access to “advanced flagging tools.”
With the volume of videos being uploaded to YouTube every day, it’s impossible for the company to police and screen every single piece of content in-house. That’s where YouTube’s Trusted Flagger program comes in.
The program has existed since October 2012, created to root out videos containing pornography, hate speech, animal abuse, or copyright infringement. YouTube hasn’t been very public about recruitment, which is invite-only, though interested parties can submit their names for consideration. According to the program’s official page, “Membership in the Trusted Flagger Program gives users access to more advanced flagging tools as well as periodic feedback, making flagging more effective and efficient. As always, the policy team at YouTube makes the final determination of whether content should be removed.”
This process came under scrutiny last week, when the Financial Times reported that Google, YouTube’s owner, allegedly gave expedited “super-flagging” privileges to certain users, including British security officials, as a means to root out extremist material “at scale,” meaning up to 20 videos at a time. James Brokenshire, the U.K.’s security and immigration minister, told the Financial Times that his office is trying to do more to address Internet content “that may not be illegal but certainly is unsavoury.”
YouTube maintains it has final say over whether a video is removed. A Google spokesperson told the Wall Street Journal, “Any suggestion that a government or any other group can use these flagging tools to remove YouTube content themselves is wrong.”
A Wall Street Journal source familiar with the program further explained that a “vast majority of the 200 participants in the super flagger program are individuals who spend a lot of time flagging videos that may violate YouTube’s community guidelines. Fewer than 10 participants are government agencies or non-governmental organizations such as anti-hate and child-safety groups.”
This situation is especially timely, as questions about the ethical boundaries of U.K. intelligence agencies abound. Google might just be extending more power to trusted flaggers, but that power allegedly yields more results than a casual viewer flagging a video: the Wall Street Journal’s source reported that 90 percent of content marked by super flaggers is either removed or restricted.
An email to YouTube about the program was not returned.
Illustration by Jason Reed
Audra Schroeder is the Daily Dot’s senior entertainment writer, and she focuses on streaming, comedy, and music. Her work has previously appeared in the Austin Chronicle, the Dallas Observer, NPR, ESPN, Bitch, and the Village Voice. She is based in Austin, Texas.