Instagram Aims To Prevent Abusive Comments
This week, Instagram announced the launch of a new feature that will use AI to judge whether a comment being left is abusive. If it is, the user will be asked whether they really want to say that: a warning appears alongside a timer of roughly 30 seconds, and if they still want to post the comment, it goes out when the timer runs out.
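As a rough illustration of that flow, here is a minimal sketch in Python. The classifier, function names, and return values are all assumptions for demonstration; Instagram's real system uses a trained AI model, not a word list.

```python
import time

# Hypothetical stand-in for Instagram's AI classifier; a real system
# would call a trained toxicity model. Here we just flag a tiny word list.
ABUSIVE_WORDS = {"ugly", "stupid"}

def is_abusive(comment: str) -> bool:
    return any(word in comment.lower().split() for word in ABUSIVE_WORDS)

def submit_comment(comment: str, delay_seconds: float = 30.0,
                   confirm=lambda: True, sleep=time.sleep):
    """Post immediately if the comment looks fine; otherwise show the
    warning, wait out the timer, and post only if the user doesn't cancel."""
    if not is_abusive(comment):
        return "posted"
    # Warning shown here; the timer gives the user a chance to reconsider.
    sleep(delay_seconds)
    return "posted" if confirm() else "withdrawn"
```

The `confirm` and `sleep` parameters are injected so the reconsideration step can be simulated without actually waiting 30 seconds.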
It’s unclear exactly what triggers this warning – we tested with the exact wording Instagram used in their announcement to demonstrate it, and the warning appeared as expected. Instagram said that in their tests this delay led some users to reconsider what they were putting out there.
Another feature that Instagram are looking to bring in is the ability to restrict users. Instead of outright blocking a user, this seems to work more like muting them: you and the restricted user will both be able to see their comments, but you’ll have to approve each one before anyone else can see it. Testing for this feature will be rolled out to certain accounts over the next few weeks, with the aim of the feature going live before the end of the year.
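The visibility rule described above can be sketched as a tiny data model. All class and field names here are assumptions for illustration, not Instagram's actual API: comments from restricted users are held for approval, yet remain visible to the post owner and to the commenter themselves.

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str
    approved: bool = True

@dataclass
class Post:
    owner: str
    restricted: set = field(default_factory=set)
    comments: list = field(default_factory=list)

    def add_comment(self, author: str, text: str) -> Comment:
        # Comments from restricted users start unapproved.
        c = Comment(author, text, approved=author not in self.restricted)
        self.comments.append(c)
        return c

    def visible_to(self, viewer: str):
        # The post owner and the comment's own author always see it;
        # everyone else only sees approved comments.
        return [c for c in self.comments
                if c.approved or viewer in (self.owner, c.author)]
```

So if `alice` restricts `bob`, a comment from `bob` shows up for `alice` and `bob` but stays hidden from other viewers until `alice` approves it.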