YouTube plans to have 10,000 moderators battling deluge of awful videos
YouTube is seriously beefing up its moderation team.
In an effort to battle the ever-growing issue of extremist content on its site, YouTube plans to grow its moderation team to 10,000 people in 2018. Algorithms can only go so far, YouTube CEO Susan Wojcicki said.
“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content,” Wojcicki wrote in a blog post detailing the announcement.
YouTube’s moderators have manually examined more than 2 million videos since June alone, but the company is looking to expand this human team even further in the coming year in order to identify and remove content that violates its guidelines more quickly and efficiently. It also aims to be more transparent about how it handles such “problematic content.”
Right now, YouTube uses machine learning to initially flag videos for review by moderators. Since deploying this method in June, YouTube has removed more than 150,000 violent extremist videos. Its machine learning algorithms were able to spot 98 percent of such videos on the site, helping moderators remove five times more videos than they previously could. Staffers are also able to remove content more quickly than before: Half of such content is taken down within two hours of upload, and 70 percent within eight hours.
At the moment, YouTube’s machine learning technology is fine-tuned specifically to identify violent extremist videos. However, the company is working to adapt the algorithms to other problem areas, such as hate speech and content that endangers child safety. The latter has proved a serious issue for the video-watching platform.
Beginning in 2018, YouTube will regularly publish reports about how it’s enforcing its community guidelines. The report will discuss the type of flags YouTube sees, as well as what actions the company takes to remove inappropriate comments and video uploads.
After discovering ads were placed against more than 2 million inappropriate videos, YouTube also says it will take a new approach to advertising. The company plans to expand its team of ad reviewers to perform more manual curation and pair ads with videos more appropriately. This should benefit advertisers, who shouldn’t find their products paired with unsavory content, as well as creators, who aim to make money off their uploads. It should benefit viewers, too, who should see more appropriate advertising on videos and less unsavory content on the site.
H/T The Hill
Christina Bonnington is a tech reporter who specializes in consumer gadgets, apps, and the trends shaping the technology industry. Her work has also appeared in Gizmodo, Wired, Refinery29, Slate, Bicycling, and Outside Magazine. She is based in the San Francisco Bay Area and has a background in electrical engineering.