In time for World Suicide Prevention Day, Facebook implemented new policies meant to prevent the spread of posts promoting suicide and self-harm.
Some of the new rules include bans on graphic images depicting cutting or unhealthy weight loss, TechCrunch reports. In addition to the content restrictions, Facebook created a Safety Policy Manager position to oversee enforcement of these policies.
How Facebook intends to prevent the promotion of suicide on its site comes down to a balancing act. Facebook says it doesn’t want to provide a platform that encourages self-harm, but it also wants to foster a community capable of discussing suicide and recognizing when someone needs help.
“We do, however, leave up content posted by individuals who express an intent to engage in suicidal or self-harm behavior as there is both therapeutic value in sharing these thoughts, and an opportunity for someone to reach out and respond to what may be a ‘cry for help,’” a Facebook spokesperson told the Daily Dot.
Facebook said it added Orygen’s #chatsafe guidelines to both Facebook and Instagram, TechCrunch reports. In Facebook’s Help Center, users have access to suicide prevention resources. And in case of emergency, Facebook says it’s ready to act.
“If the person is in immediate danger, we will work with first responders to try to get that person help on the ground and conduct a wellness check,” the Facebook spokesperson said. “Speed is critical.”
In the past, Facebook has been scrutinized for not acting quickly enough on banned content. Bloomberg reported earlier this year on hate speech that received 30 likes before Facebook removed it. The Daily Dot found in June that a Facebook Page called “Black Face” remained online for months despite user reports.
“While suicide prevention work and dealing with self-harm can be some of the most challenging policy work we do, it’s also some of the most important work we do,” Antigone Davis, Facebook’s global head of safety, told Axios.
Someone dies from suicide every 40 seconds, Axios reports.
Libby Cohen is a third-year University of Texas student originally from New Jersey. She has written for ORANGE Magazine, the Daily Texan, and most recently interned for 1010 WINS in NYC. She's now back in Austin writing for the Texas Standard and the Daily Dot.