Facebook plans to hire 3,000 additional employees to remove videos of crimes and suicide on its platform, adding to the 4,500 it currently employs.
CEO Mark Zuckerberg confirmed the hires in a Facebook post Wednesday. “We’re working to make these videos easier to report so we can take the right action sooner—whether that’s responding quickly when someone needs help or taking a post down,” Zuckerberg wrote.
In the post, he added: “Over the last few weeks, we've seen people hurting themselves and others on Facebook — either live or in video posted…”
Facebook Live launched a year ago but has already been used by dozens of people to broadcast murders and suicides. As Business Insider points out, the most popular Google searches for Facebook Live include “murder,” “death,” and “torture.” Zuckerberg says Facebook reviews “millions of reports” every week.
“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” Zuckerberg said. “And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it—either because they’re about to harm themselves, or because they’re in danger from someone else.”
Users were angered last month when a video of a murder uploaded to Facebook remained online for two hours before it was deleted. The company admitted it was “too slow” to respond, and Zuckerberg vowed to “keep doing all we can to prevent tragedies like this from happening” at the company’s F8 developer conference in April.
Facebook came under similar scrutiny earlier this year, when a 14-year-old girl in Miami took her own life on Facebook Live in January. The video lasted two hours and was viewed by a number of the teen’s friends, according to the Miami Herald.
On Monday, the social network announced it would add suicide prevention tools to its Live platform, including new options for viewers to report a broadcast in which someone is hurting themselves. Flagged broadcasts can now receive an escalated response from Facebook, which will also contact emergency workers if someone is in imminent danger. The person broadcasting will see a series of pop-ups on their screen pointing them to resources for help.
There are obvious concerns for the people who police the web and are required to watch videos of suicide, murder, and pornography. In 2010, the New York Times published a report on a psychologist’s study of the job’s psychological toll on workers. It concluded that they were likely to become depressed or angry, have trouble forming relationships, and suffer from decreased sexual appetites. A small percentage said they reacted to images by vomiting or crying.
“I remember the times working at MySpace having to flag suicidal posts and in rare cases forward them to law enforcement, this is 100X worse,” tweeted Southey Blanton (@southey) on May 3, 2017.
An industry group established by Congress has recommended that the government provide incentives for companies to account for the job’s psychological impact on employees.
Zuckerberg says that thanks to the social network’s new tools and its video-monitoring employees, Facebook was able to stop someone from committing suicide earlier this week.