Earlier this month, Meta’s independent oversight board announced that it can now add warning screens and labels to posts on Facebook and Instagram, marking content as “disturbing” or “sensitive.”
The body, which has the power to consider removal requests, said the labels would be used when “leaving up or restoring qualifying content,” including photos and videos of human rights violations from Sudan.
“Since we started accepting appeals two years ago, we have received nearly two million appeals from users around the world,” the board report said.
“This demonstrates the ongoing demand from users to appeal Meta’s content moderation decisions to an independent body.”
Meta formed the oversight board, which includes academics, rights experts and lawyers, to rule on a small portion of content moderation appeals; the board can also advise on platform guidelines, which Meta says it will follow. The board is indirectly supported by a $130 million trust fund established by Meta.
Although Meta and the oversight board have repeatedly been at odds, such as over the board’s reversal of Facebook’s removal of a positive newspaper report about the Taliban last month, Meta says it will continue to abide by the board’s decisions.
Sources for this piece include reporting by Reuters.