Sensitive content will remain searchable.
Instagram will roll out “sensitivity screens” to blur suicide and self-harm images that appear on people’s feeds in the U.S. and globally, the Facebook-owned social media company confirmed with NBC News and the Hill.
“Starting this week we will be applying sensitivity screens to all posts we review that contain cutting self-harm imagery that is not in violation of our guidelines,” an Instagram spokesperson said in an email to the Hill. “Our guidelines allow for people to share that they are struggling, but we remove posts that promote such activity.” The spokesperson did not respond to the newspaper’s request for more information on how it will flag images that require a “sensitivity screen.”
“The sensitivity screens will mean certain images, for example of cutting, will appear blurred and carry a warning to users that clicking on them will open up sensitive content which could be offensive or disturbing,” the BBC explained.
The sensitivity filter is among the steps Instagram has recently taken to address concerns that self-harm content contributes to rising teen suicide rates worldwide. In an op-ed for the British newspaper the Telegraph on Monday, Instagram head Adam Mosseri outlined the company's efforts against triggering suicide-related posts. "To be very clear, we do not allow posts that promote or encourage suicide or self-harm. We rely heavily on our community to report this content," he wrote.
“We have engineers and trained content reviewers working around the clock to make it harder for people to find self-harm images,” he added. “We have put in place measures to stop recommending related images, hashtags, accounts, and typeahead suggestions.”
These measures come after the parents of a British teen who died by suicide in 2017 partly blamed the photo-sharing app for her death. According to the BBC, the family found that the girl had used it to browse graphic images of self-harm and suicide. The report noted that Instagram's algorithm allowed teens already viewing accounts with such sensitive content to find other similar accounts.
The case prompted the U.K. government to call on social media companies to create better ways of protecting young users from such distressing material. In a letter to Facebook last week, British Health Secretary Matt Hancock said social media companies need to “purge this content once and for all.” He also wrote that he would use the power of his office to prosecute companies that fail to do so.
Mosseri is scheduled to meet Hancock on Thursday to discuss how to more successfully remove content that promotes suicide and self-harm, the same way the platform has been able to do with terrorist content, according to a Telegraph report.
The Instagram boss admitted in his op-ed that the platform still has work to do on these issues. "We are not yet where we need to be on the issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe," Mosseri wrote.
It's worth noting that while the "screens" will hide images depicting self-harm at first glance, such posts will still appear on users' feeds, and people can still explicitly choose to view them.
NBC News points out that “Instagram has decided to not ban such content outright, saying people use the platform in a productive way to speak about their struggles with the issue.” A spokesperson for Instagram also told Yahoo that “for many young people, discussing their mental health journey or connecting with others who have battled similar issues is an important part of their recovery. This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most.”
H/T NBC News
Trixie Reyna-Benedicto is a lifestyle editor and writer based in the Philippines. Previously, she helmed Cosmopolitan Philippines’ website, Cosmo.ph, as its founding editor. She later served as editor-in-chief of lifestyle and entertainment portals for Manila-based media company TV5. Her work has appeared in several print and online publications in her country, and she contributes to Speed Magazine, DG Traveler, and Connected Women, among others. Visit her website, trixiereyna.com.