Instagram will roll out “sensitivity screens” to blur suicide and self-harm images that appear on people’s feeds in the U.S. and globally, the Facebook-owned social media company confirmed to NBC News and the Hill.
“Starting this week we will be applying sensitivity screens to all posts we review that contain cutting self-harm imagery that is not in violation of our guidelines,” an Instagram spokesperson said in an email to the Hill. “Our guidelines allow for people to share that they are struggling, but we remove posts that promote such activity.” The spokesperson did not respond to the newspaper’s request for more information on how it will flag images that require a “sensitivity screen.”
“The sensitivity screens will mean certain images, for example of cutting, will appear blurred and carry a warning to users that clicking on them will open up sensitive content which could be offensive or disturbing,” the BBC explained.
This sensitivity filter is among the steps Instagram recently took to address concerns that self-harm content is contributing to rising teen suicide rates across the globe. In an op-ed for British newspaper the Telegraph on Monday, Instagram head Adam Mosseri outlined the company’s efforts to combat posts that could trigger suicide or self-harm. “To be very clear, we do not allow posts that promote or encourage suicide or self-harm. We rely heavily on our community to report this content,” he wrote.
“We have engineers and trained content reviewers working around the clock to make it harder for people to find self-harm images,” he added. “We have put in place measures to stop recommending related images, hashtags, accounts, and typeahead suggestions.”
These measures come after a British teen’s suicide in 2017, which her parents partly blamed on the photo-sharing app. According to the BBC, the family found that the girl had used it to browse graphic images of self-harm and suicide. The report noted that Instagram’s algorithm allowed teens already viewing accounts with such sensitive content to find similar accounts.
The case prompted the U.K. government to call on social media companies to create better ways of protecting young users from such distressing material. In a letter to Facebook last week, British Health Secretary Matt Hancock said social media companies need to “purge this content once and for all.” He also wrote that he would use the power of his office to prosecute companies that fail to do so.
Mosseri is scheduled to meet Hancock on Thursday to discuss how to more successfully remove content that promotes suicide and self-harm, the same way the platform has been able to do with terrorist content, according to a Telegraph report.
The Instagram boss admitted in his op-ed that the platform still has more to do on these issues. “We are not yet where we need to be on the issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe,” Mosseri wrote.
It must be noted that while the “screens” will blur self-harm images at first glance, the posts will still appear in people’s feeds, and users can still explicitly choose to view them.
NBC News points out that “Instagram has decided to not ban such content outright, saying people use the platform in a productive way to speak about their struggles with the issue.” A spokesperson for Instagram also told Yahoo that “for many young people, discussing their mental health journey or connecting with others who have battled similar issues is an important part of their recovery. This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most.”
H/T NBC News