Twitter recently announced that images or videos of private individuals shared on the platform could be removed upon request.

Before removing an image or video, the company said it must have been alerted by the “individuals depicted, or by an authorized representative, that they did not consent to have their private image or video shared.”

The decision was met with sharp criticism, with many saying the rule was too far-reaching. In response, Twitter gave further details on the policy and said it would weigh a number of factors before taking such action.

These factors include whether the picture is already publicly available, or “if a particular image and the accompanying tweet text adds value to the public discourse, is being shared in the public interest, or is relevant to the community.”

The company explained that the rule is intended to end the use of videos and images to “harass, intimidate and reveal the identities of private individuals, which disproportionately impacts women, activists, dissidents, and members of minority communities.”

Images and videos that would not violate the rule include those showing public figures or people participating in public events.