Instagram to allow users to filter hateful comments

MENLO PARK, CA - JUNE 20: An attendee takes a photo of the Instagram logo during a press event at Facebook headquarters on June 20, 2013 in Menlo Park, California. Facebook announced that its photo-sharing subsidiary Instagram will now allow users to take and share video. (Photo by Justin Sullivan/Getty Images)

Credit: Justin Sullivan


Instagram will soon launch a new comment filter to help its estimated 500 million monthly active users combat online harassment.

The feature will reportedly allow users to screen the comments that appear on their photos and videos, and even to disable comments entirely.

Instagram already has policies in place to flag certain words and phrases, but the new feature will let users control what appears on their individual accounts.

An Instagram executive told The Washington Post, "Our goal is to make Instagram a friendly, fun and, most importantly, safe place for self expression."

The feature isn't available to everyone yet. The Post reported that Instagram is rolling out the measure first for "high volume comment threads," with plans to eventually introduce it to the broader public.

The Pew Research Center estimates that 73 percent of adult internet users have seen someone harassed online, and that 40 percent have experienced harassment personally.