Perhaps only human flaggers can help.
According to Motherboard, YouTube videos made by Nazi groups have been left on the platform for months and, in some cases, years. But YouTube was much quicker to delete the content of Islamic extremists; those videos were often taken down within hours of being uploaded.
Late last month, YouTube said it wouldn’t censor white nationalist channels like Atomwaffen—which has been implicated in five murders in the last 10 months—or the Traditionalist Worker Party, despite the fact that YouTube’s terms of service state it will ban “content that promotes violence against or has the primary purpose of inciting hatred against individuals or groups based on certain attributes, such as: race or ethnic origin, religion, disability, gender, age, veteran status, sexual orientation/gender identity.”
Since then, YouTube has deleted the Atomwaffen channel. But Motherboard reported that copies of the group’s videos remain on the site.
Google, which owns YouTube, said last year it would “increas[e] our use of technology” and find capable human flaggers to fight terrorism online.
“We tightened our policies on what content can appear on our platform, or earn revenue for creators,” YouTube said in a blog post in December. “We increased our enforcement teams. And we invested in powerful new machine learning technology to scale the efforts of our human moderators to take down videos and comments that violate our policies … 98 percent of the videos we remove for violent extremism are flagged by our machine-learning algorithms.”
While that appears to be working for pro-ISIS content, YouTube needs more help in determining whether pro-Nazi videos are actually hate speech and should be removed from the platform.
“The hard part is actually joining that up with a sort-of context in order to make a judgment on whether the image that you’re looking at is being used for a white supremacist purpose or not,” ex-NSA hacker Emily Crose told Motherboard.
Perhaps that will be the job of the more than 10,000 human flaggers YouTube said it will employ in 2018.
Click here to read Motherboard’s entire report.
Josh Katzowitz is a staff writer at the Daily Dot specializing in YouTube and boxing. His work has appeared in the New York Times, Wall Street Journal, Washington Post, and Los Angeles Times. A longtime sports writer, he's covered the NFL for CBSSports.com and boxing for Forbes. His work has been noted twice in the Best American Sports Writing book series.