
Facebook gives ’13 Reasons Why’ posts extra moderation to prevent suicides

It's a precaution.


Christine Friar


Posted on May 23, 2017   Updated on May 24, 2021, 1:29 pm CDT

If you’re engaging with Netflix’s 13 Reasons Why on Facebook, you’re being moderated more closely than elsewhere on the site.

At least that’s what a “massive leak” of the social network’s internal documents this week suggests. The Guardian got its hands on “over 100 internal training manuals” detailing Facebook’s policies for moderating content involving violence, hate speech, terrorism, pornography, racism, and self-harm, and they indicate the company stepped up moderation around the teen show in particular after its premiere in March.

The popular Selena Gomez-backed series tells the story of a teen who takes her own life and the aftermath at her high school in the months that follow. Facebook was apparently so concerned the show would inspire copycats that it advised all moderators to “escalate any content related to the show to senior managers.” This doesn’t mean that every person posting about 13 Reasons Why was put on watch. (Facebook moderators only come into play once other users have flagged a post for review.) It does mean, however, that the show’s page is being gone over with a finer-tooth comb than the rest of the site.

According to Variety, moderators reviewed “around 10,400 posts about self-harm during a four-week period of this year.” Only a small percentage of those posts, though, result in Facebook contacting law enforcement: during one two-week period in 2016, the site saw 4,531 such reports and alerted law enforcement in just 63 of those instances. In other words, a bump in suicide-related posting doesn’t necessarily mean a bump in troubling content, but it does call for a larger moderation team all the same.

The moderation seems precautionary, which is welcome given Facebook’s spotty track record of removing graphic posts in a timely manner. Earlier this month, founder and CEO Mark Zuckerberg announced plans to hire 3,000 more moderators specifically to comb through potentially graphic imagery, a move that would nearly double the current team. And now, perhaps, we know why.

H/T Variety

First Published: May 23, 2017, 11:30 am CDT