Facebook to hire 3,000 people to review suicide and murder videos
The social network has been scrutinized for its slow response to taking down graphic images.
Facebook plans to hire 3,000 additional employees to remove videos of crimes and suicide on its platform, adding to the 4,500 it currently employs.
CEO Mark Zuckerberg confirmed the hires in a Facebook post Wednesday. “We’re working to make these videos easier to report so we can take the right action sooner—whether that’s responding quickly when someone needs help or taking a post down,” Zuckerberg wrote.
“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook—either live or in video posted...” Zuckerberg wrote in the post, dated Wednesday, May 3, 2017.
Facebook Live launched a year ago but has already been used dozens of times to broadcast murders and suicides. As Business Insider points out, the most popular Google searches for Facebook Live include the words “murder,” “death,” and “torture.” Zuckerberg claims Facebook reviews “millions of reports” every week.
“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” Zuckerberg said. “And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it—either because they’re about to harm themselves, or because they’re in danger from someone else.”
Users were angered after the video of a murder uploaded to Facebook last month took two hours to delete. The company admitted it was “too slow” to respond, and Zuckerberg vowed to “keep doing all we can to prevent tragedies like this from happening” at the company’s F8 developer conference in April.
Facebook came under scrutiny after a similar incident earlier this year. A 14-year-old girl in Miami took her own life on Facebook Live in January. The video lasted two hours and was viewed by a number of the teen’s friends, according to the Miami Herald.
On Monday, the social network announced it would add suicide prevention tools to its Live platform, including new options to report that someone is hurting themselves during a broadcast. Viewers can now request an escalated response from Facebook, which will also contact emergency workers if someone is in imminent danger. Facebook will additionally display resource pop-ups on the broadcaster’s screen so they can reach out for help.
There are obvious concerns for those who police the web and are required to watch videos of suicide, murder, and pornography. In 2010, the New York Times published a report on a psychologist’s study of the job’s psychological impact on workers. It concluded that workers were likely to become depressed or angry, have trouble forming relationships, and suffer from decreased sexual appetites. A small percentage said they reacted to images by vomiting or crying.
@cwarzel I remember the times working at MySpace having to flag suicidal posts and in rare cases forward them to law enforcement, this is 100X worse— Southey Blanton (@southey) May 3, 2017
An industry group established by Congress recommended that the government provide incentives to account for the psychological impact on employees.
Zuckerberg said that thanks to the social network’s new tools and video-monitoring employees, Facebook was able to stop someone from committing suicide earlier this week.
Phillip Tracy is a former technology staff writer at the Daily Dot. He's an expert on smartphones, social media trends, and gadgets. He previously reported on IoT and telecom for RCR Wireless News and contributed to NewBay Media magazine. He now writes for Laptop magazine.