
Dragana Gordic/Shutterstock (Licensed)

Instagram is cracking down on posts depicting self-harm and suicide

The platform is grappling with the aftermath of the tragic suicide of British teenager Molly Russell.


Matthew Hughes


Posted on Oct 28, 2019   Updated on May 20, 2021, 12:23 am CDT

Instagram has pledged to crack down on self-harm-related content as the platform grapples with the aftermath of the tragic suicide of British teenager Molly Russell.

According to the BBC, the popular image-sharing platform, which is owned by social media giant Facebook, plans to prohibit all material showing methods of self-harm. This will include photographs, cartoons, drawings, and even memes.

In addition, Instagram plans to remove content perceived as promoting suicide or self-harm. This follows measures introduced in February which prohibited “graphic images of self-harm” and content with “suicidal themes.”

Molly Russell was a 14-year-old from London who took her own life in 2017. Shortly after her death, her father, Ian Russell, encountered disturbing self-harm and suicide-related posts on her Instagram and Pinterest pages. Mr. Russell believes that the platform partially contributed to his daughter’s death.

Speaking to the BBC, Adam Mosseri, who heads Instagram, promised that the changes will be rolled out as part of a long-term strategy for addressing this type of content.

“It will take time to fully implement… but it’s not going to be the last step we take,” Mosseri told BBC News.

That slow pace of change will likely frustrate campaigners, who have long criticized Instagram for its hands-off approach. In February of this year, Peter Wanless, CEO of the NSPCC (National Society for the Prevention of Cruelty to Children), told the Guardian: “It should never have taken the death of Molly Russell for Instagram to act. Over the last decade, social networks have proven over and over that they won’t do enough.”

Instagram notes that it has doubled the amount of self-harm-related content removed since the first quarter of this year. Between April and June, it removed 834,000 offending posts, of which only 23 percent were reported by users.

Despite that, Mosseri acknowledged the limits of these measures. “There is still very clearly more work to do; this work never ends,” he told the BBC.

This is not the first time Instagram has attracted criticism for how it safeguards younger users against potentially harmful content. In 2017, a Royal Society for Public Health survey rated Instagram as the most harmful platform for younger users’ mental health, with Snapchat trailing slightly behind.

*First Published: Oct 28, 2019, 4:37 pm CDT