Sensitive content will remain searchable.
Instagram will roll out “sensitivity screens” to blur suicide and self-harm images that appear on people’s feeds in the U.S. and globally, the Facebook-owned social media company confirmed to NBC News and the Hill.
“Starting this week we will be applying sensitivity screens to all posts we review that contain cutting self-harm imagery that is not in violation of our guidelines,” an Instagram spokesperson said in an email to the Hill. “Our guidelines allow for people to share that they are struggling, but we remove posts that promote such activity.” The spokesperson did not respond to the newspaper’s request for more information on how it will flag images that require a “sensitivity screen.”
“The sensitivity screens will mean certain images, for example of cutting, will appear blurred and carry a warning to users that clicking on them will open up sensitive content which could be offensive or disturbing,” the BBC explained.
This sensitivity filter is among the steps Instagram recently took to directly address concerns that content showing self-harm partly causes the rising rate of teen suicides across the globe. In an op-ed for British newspaper the Telegraph on Monday, Instagram head Adam Mosseri outlined the company’s efforts against triggering suicide-related posts. “To be very clear, we do not allow posts that promote or encourage suicide or self-harm. We rely heavily on our community to report this content,” he wrote.
“We have engineers and trained content reviewers working around the clock to make it harder for people to find self-harm images,” he added. “We have put in place measures to stop recommending related images, hashtags, accounts, and typeahead suggestions.”
These measures come after a British teen’s suicide in 2017 was partly blamed on the photo-sharing app by her parents. According to the BBC, the family found that the girl used it to browse graphic images of self-harm and suicide. The report noted that Instagram’s algorithm allowed for teens already looking at accounts depicting such sensitive content to find other similar accounts.
The case prompted the U.K. government to call on social media companies to create better ways of protecting young users from such distressing material. In a letter to Facebook last week, British Health Secretary Matt Hancock said social media companies need to “purge this content once and for all.” He also wrote that he would use the power of his office to prosecute companies that fail to do so.
Mosseri is scheduled to meet Hancock on Thursday to discuss how to more successfully remove content that promotes suicide and self-harm, the same way the platform has been able to do with terrorist content, according to a Telegraph report.
The Instagram boss admitted in his op-ed that the platform still has more to do on these issues. “We are not yet where we need to be on the issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe,” Mosseri wrote.
It must be noted that while the “screens” will hide images depicting self-harm at first glance, the posts will still appear on users’ feeds, and people can still explicitly choose to view them.
NBC News points out that “Instagram has decided to not ban such content outright, saying people use the platform in a productive way to speak about their struggles with the issue.” A spokesperson for Instagram also told Yahoo that “for many young people, discussing their mental health journey or connecting with others who have battled similar issues is an important part of their recovery. This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most.”
H/T NBC News
Trixie Reyna-Benedicto is a lifestyle editor and writer based in the Philippines. Previously, she helmed Cosmopolitan Philippines’ website, Cosmo.ph, as its founding editor. She later served as editor-in-chief of lifestyle and entertainment portals for Manila-based media company TV5. Her work has appeared in several print and online publications in her country, and she contributes to Speed Magazine, DG Traveler, and Connected Women, among others. Visit her website, trixiereyna.com.