Instagram has pledged to crack down on self-harm-related content as the platform grapples with the aftermath of the suicide of British teenager Molly Russell.
According to the BBC, the popular image-sharing platform, which is owned by social media giant Facebook, plans to prohibit all material depicting methods of self-harm. This will include photographs, cartoons, drawings, and even memes.
In addition, Instagram plans to remove content perceived as promoting suicide or self-harm. This follows measures introduced in February which prohibited “graphic images of self-harm” and content with “suicidal themes.”
Molly Russell was a 14-year-old from London who took her own life in 2017. Shortly after her death, her father, Ian Russell, discovered disturbing self-harm and suicide-related posts on her Instagram and Pinterest accounts. Mr. Russell believes the platforms partially contributed to his daughter’s death.
Speaking to the BBC, Adam Mosseri, who heads Instagram, promised that the changes will be rolled out as part of a long-term strategy for addressing this type of content.
“It will take time to fully implement… but it’s not going to be the last step we take,” Mosseri told BBC News.
That slow pace of change will likely frustrate campaigners, who have long criticized Instagram for its hands-off approach. In February of this year, Peter Wanless, CEO of the NSPCC (National Society for the Prevention of Cruelty to Children), told the Guardian: “It should never have taken the death of Molly Russell for Instagram to act. Over the last decade, social networks have proven over and over that they won’t do enough.”
Instagram notes that it has doubled the amount of self-harm-related content it removes since the first quarter of this year. Between April and June, it took down 834,000 offending posts, of which only 23 percent were reported by users.
Even so, Mosseri acknowledges the job is far from finished. “There is still very clearly more work to do, this work never ends,” he told the BBC.
This is not the first time Instagram has attracted criticism over how it safeguards younger users from potentially harmful content. In 2017, a Royal Society for Public Health survey rated Instagram as the most harmful platform for young people’s mental health, with Snapchat close behind.