Instagram will now blur out sensitive content and ask users if they would still like to view specific images or videos before making them visible again.
Instagram’s previous policy didn’t really account for posts that walk a fine line with the company’s community guidelines. There is plenty of content on the platform that complies with the app’s “don’t spam or post nudity” policy but still isn’t something people want their children to see. That is where the new feature comes in. If a user stumbles upon a sensitive post, they will be greeted with a warning: “This photo contains sensitive content which some people may find offensive or disturbing.” They then have the chance to consider this warning and decide whether they still want to view the photo or video.
This doesn’t change what can or cannot be posted on the platform. Instagram still won’t allow posts that overstep the many boundaries outlined in its guidelines. It will, however, begin screening each post to determine whether it contains sensitive content. What qualifies as sensitive is up for interpretation, but Instagram told The Verge it is primarily concerned with posts that depict violence.
“Examples include animal rights groups that share content to expose animal testing conditions or animal abuse, or content that raises awareness of humanitarian crises around the world (famine, impact of war on local communities).”
Instagram says the new warnings will start showing up immediately.