In time for World Suicide Prevention Day, Facebook implemented new policies meant to prevent the spread of posts promoting suicide and self-harm.
The new rules include bans on graphic images depicting cutting or unhealthy weight loss, TechCrunch reports. In addition to the content restrictions, Facebook created a Safety Policy Manager position to oversee enforcement of these policies.
Facebook's approach to preventing the promotion of suicide on its site comes down to a balancing act. The company says it doesn't want to provide a platform that encourages self-harm, but it also intends the platform to foster a community capable of discussing suicide and recognizing when someone needs help.
“We do, however, leave up content posted by individuals who express an intent to engage in suicidal or self-harm behavior as there is both therapeutic value in sharing these thoughts, and an opportunity for someone to reach out and respond to what may be a ‘cry for help,’” a Facebook spokesperson told the Daily Dot.
Facebook said it added Orygen’s #chatsafe guidelines to both Facebook and Instagram, TechCrunch reports. In Facebook’s Help Center, users have access to suicide prevention resources. And in case of emergency, Facebook says it’s ready to act.
“If the person is in immediate danger, we will work with first responders to try to get that person help on the ground and conduct a wellness check,” the Facebook spokesperson said. “Speed is critical.”
In the past, Facebook has been scrutinized for not acting fast enough on banned content. Bloomberg reported earlier this year on a hate speech post that received 30 likes before Facebook removed it. The Daily Dot found in June that a Facebook Page called “Black Face” remained online for months despite user reports.
“While suicide prevention work and dealing with self-harm can be some of the most challenging policy work we do, it’s also some of the most important work we do,” Antigone Davis, Facebook’s global head of safety, told Axios.
Someone dies from suicide every 40 seconds, Axios reports.
Libby Cohen is a third-year University of Texas student originally from New Jersey. She has written for ORANGE Magazine, the Daily Texan, and most recently interned for 1010 WINS in NYC. She's now back in Austin writing for the Texas Standard and the Daily Dot.