Perhaps only human flaggers can help.
According to Motherboard, YouTube videos made by Nazi groups have been left on the platform for months and, in some cases, years. By contrast, YouTube was much quicker to delete content from Islamic extremists, often taking those videos down within hours of upload.
Late last month, YouTube said it wouldn’t censor white nationalist channels like Atomwaffen, which has been implicated in five murders in the last 10 months, or the Traditionalist Worker Party. That’s despite YouTube’s terms of service, which say the platform will ban “content that promotes violence against or has the primary purpose of inciting hatred against individuals or groups based on certain attributes, such as: race or ethnic origin, religion, disability, gender, age, veteran status, sexual orientation/gender identity.”
Since then, YouTube has deleted the Atomwaffen channel. But Motherboard reported that copies of the group’s videos still exist on the site.
Google, which owns YouTube, said last year that it would “increas[e] our use of technology” and find capable human flaggers to fight terrorism online.
“We tightened our policies on what content can appear on our platform, or earn revenue for creators,” YouTube said in a blog post in December. “We increased our enforcement teams. And we invested in powerful new machine learning technology to scale the efforts of our human moderators to take down videos and comments that violate our policies … 98 percent of the videos we remove for violent extremism are flagged by our machine-learning algorithms.”
While that appears to be working for pro-ISIS content, YouTube needs more help in determining whether pro-Nazi videos are actually hate speech and should be removed from the platform.
“The hard part is actually joining that up with a sort-of context in order to make a judgment on whether the image that you’re looking at is being used for a white supremacist purpose or not,” ex-NSA hacker Emily Crose told Motherboard.
Perhaps that will be the job of the more than 10,000 human flaggers YouTube said it will employ in 2018.
Click here to read Motherboard’s entire report.
Josh Katzowitz is a staff writer at the Daily Dot specializing in YouTube and boxing. His work has appeared in the New York Times, Wall Street Journal, Washington Post, and Los Angeles Times. A longtime sports writer, he's covered the NFL for CBSSports.com and boxing for Forbes. His work has been noted twice in the Best American Sports Writing book series.