Photo via Santeri Viinamäki/Wikimedia (CC-BY-SA)
The new feature will allow all users to enable a filter that automatically blocks comments containing terms found on a default list created by Instagram. Users can also build a custom keyword list to block additional words of their choosing from appearing in their comments.
Both features, which were initially tested on business and celebrity accounts, will now be available to all users on the platform. The filter tools can be enabled in the settings menu.
"We know tools aren’t the only solution for this complex problem, but together, we can work towards keeping Instagram a safe place for self-expression," Kevin Systrom, Instagram co-founder and CEO, said in a blog post.
"My commitment to you is that we will keep building features that safeguard the community and maintain what makes Instagram a positive and creative place for everyone."
The filters are the latest tools from Instagram designed to help curb abuse that takes place on the platform. The Facebook-owned social network previously added the ability to block user accounts, delete comments with a swipe, and report abusive content.
The decision to add user-generated filters comes as Facebook has struggled to define its own policies for content on its platform. The social network defended its decision to leave a video of a dog being beaten on the service, because the post itself disapproved of the action in the video.
However, Facebook temporarily removed several posts containing the infamous “Napalm Girl” photograph that revealed the human toll of the Vietnam War.
Instagram's anti-abuse tools give more control to the user than many platforms offer, including Twitter, which has regularly come under fire for failing to provide its users with adequate tools to protect themselves from abuse and harassment. Earlier this year, Twitter introduced a Safety Council to help address issues regarding user safety.