In time for World Suicide Prevention Day, Facebook implemented new policies meant to prevent the spread of posts promoting suicide and self-harm.
Some of the new rules include bans on graphic images depicting self-harm, such as cutting, and on content promoting unhealthy weight loss, TechCrunch reports. In addition to the content restrictions, Facebook created a Safety Policy Manager position to oversee these efforts.
How Facebook intends to prevent the promotion of suicide on its site comes down to a balancing act. Facebook says it doesn’t want to provide a platform that encourages self-harm, but the platform is also intended to foster a community capable of discussing suicide and recognizing when someone needs help.
“We do, however, leave up content posted by individuals who express an intent to engage in suicidal or self-harm behavior as there is both therapeutic value in sharing these thoughts, and an opportunity for someone to reach out and respond to what may be a ‘cry for help,'” a Facebook spokesperson told the Daily Dot.
Facebook said it added Orygen’s #chatsafe guidelines to both Facebook and Instagram, TechCrunch reports. In Facebook’s Help Center, users have access to suicide prevention resources. And in case of emergency, Facebook says it’s ready to act.
“If the person is in immediate danger, we will work with first responders to try to get that person help on the ground and conduct a wellness check,” the Facebook spokesperson said. “Speed is critical.”
In the past, Facebook has been scrutinized for not acting fast enough on banned content. Bloomberg reported earlier this year on a hate speech post that received 30 likes before Facebook removed it. The Daily Dot found in June that a Facebook page called “Black Face” remained online for months despite user reports.
“While suicide prevention work and dealing with self-harm can be some of the most challenging policy work we do, it’s also some of the most important work we do,” Antigone Davis, Facebook’s global head of safety, told Axios.
Someone dies from suicide every 40 seconds, Axios reports.
Libby Cohen is a third-year University of Texas student originally from New Jersey. She has written for ORANGE Magazine, the Daily Texan, and most recently interned for 1010 WINS in NYC. She's now back in Austin writing for the Texas Standard and the Daily Dot.