Instagram will begin automatically hiding potentially offensive comments as part of its ongoing attempt to address online bullying.

The company said the hidden comments will be similar to ones users have reported in the past. Instagram said it's using its existing artificial intelligence systems to identify bullying or harassing language in comments.

Instagram announced on Tuesday, which also marks the app's tenth birthday, that it would begin testing the feature.

Users will still be able to tap "View Hidden Comments" to see those remarks.

Adam Mosseri, who took the helm of Instagram two years ago, has pledged to fight online bullying. Last year, the Facebook-owned company rolled out a tool called "Restrict," which lets you restrict another user so that person's comments on your posts are visible only to them, not to other people. Instagram also previously added a feature that alerts people when their comment may be considered offensive before it's posted. The idea is to give people a chance to pause and reflect.

Instagram said that since introducing comment warnings, it has seen "significant improvement" in people editing or not posting the comment, although it did not elaborate further.

On Tuesday, Instagram also said it's adding an additional warning for people who have posted several comments that could be offensive. The notification prompts them to go back and edit their comments, warning that they otherwise risk consequences such as having a comment hidden or even their account deleted.

Twitter has conducted similar tests. Earlier this year, it began prompting users to consider rewriting a reply to a tweet before publishing it if the reply contained potentially harmful language.

CNN