Instagram will now reduce the visibility of ‘potentially harmful’ content

Instagram is taking new steps to make potentially harmful content less visible in its app. The company says that the algorithm powering the way posts are ordered in users’ feeds and in Stories will now de-prioritize content that “may contain bullying, hate speech or may incite violence.”

While Instagram’s rules already prohibit much of this type of content, the change could affect borderline posts, or content that hasn’t yet reached the app’s moderators. “To understand if something may break our rules, we’ll look at things like if a caption is similar to a caption that previously broke our rules,” the company explains in an update.
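Instagram hasn’t published how that similarity check works. As a rough illustration only, the sketch below compares a new caption against a small set of captions that previously broke the rules and demotes the post’s ranking score when the match is close. The similarity measure, threshold and demotion factor are all assumptions for the sake of the example, not Meta’s actual system.

```python
import re
from collections import Counter
from math import sqrt


def tokens(text: str) -> list[str]:
    """Very simple word tokenizer used for the bag-of-words comparison."""
    return re.findall(r"[a-z']+", text.lower())


def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two captions."""
    va, vb = Counter(tokens(a)), Counter(tokens(b))
    dot = sum(va[w] * vb[w] for w in va.keys() & vb.keys())
    norm = sqrt(sum(c * c for c in va.values())) * sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0


def demote_if_similar(caption: str, removed_captions: list[str],
                      base_score: float, threshold: float = 0.8,
                      demotion: float = 0.5) -> float:
    """Lower a post's feed-ranking score when its caption closely resembles
    captions that previously broke the rules (hypothetical logic)."""
    if any(cosine_similarity(caption, prior) >= threshold for prior in removed_captions):
        return base_score * demotion  # shown lower in the feed, not removed
    return base_score


# Example: a near-duplicate of a previously removed caption gets demoted.
removed = ["you people are worthless get out"]
print(demote_if_similar("you people are worthless, get out now", removed, base_score=1.0))
```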

Up until now, Instagram has hidden potentially objectionable content from public-facing parts of the app, like Explore, but hasn’t changed how it appears to users who follow the accounts posting this type of content. The latest change means that posts deemed “similar” to those that have been previously removed will be much less visible even to followers. A spokesperson for Meta confirmed that “potentially harmful” posts could still be removed outright if they are found to break its community guidelines.

The update follows an earlier, similar change, when Instagram began down-ranking accounts that shared misinformation debunked by fact-checkers. Unlike that change, however, Instagram says that the latest policy will only affect individual posts and “not accounts overall.”

Additionally, Instagram says it will now factor each individual user’s reporting history into how it orders their feed. “If our systems predict you’re likely to report a post based on your history of reporting content, we will show the post lower in your Feed,” Instagram says.
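Instagram doesn’t say how that prediction is made. Purely as an illustration, the sketch below assumes a per-user probability that the viewer would report a post of a given category, derived from their past reports, and uses it to push such posts down the feed. The category labels, scoring formula and numbers are all hypothetical.

```python
from collections import Counter


def predicted_report_probability(user_reports: list[str], post_category: str) -> float:
    """Hypothetical estimate: share of this user's past reports in the post's category."""
    if not user_reports:
        return 0.0
    counts = Counter(user_reports)
    return counts[post_category] / len(user_reports)


def rank_feed(posts: list[dict], user_reports: list[str]) -> list[dict]:
    """Order posts by base relevance, discounted by the chance the user would report them."""
    def adjusted(post: dict) -> float:
        p = predicted_report_probability(user_reports, post["category"])
        return post["relevance"] * (1.0 - p)  # likely-to-be-reported posts sink lower
    return sorted(posts, key=adjusted, reverse=True)


# Example: a user who mostly reports "harassment" sees such posts ranked lower.
feed = [
    {"id": 1, "category": "pets", "relevance": 0.7},
    {"id": 2, "category": "harassment", "relevance": 0.9},
]
print([p["id"] for p in rank_feed(feed, user_reports=["harassment", "harassment", "spam"])])
```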
