Facebook’s Oversight Board is criticizing the company’s management for not implementing important rules that have a clear bearing on the worldwide discussion about what can be said on social media.
The board has overturned management’s removal of an Instagram post that encouraged people to discuss the solitary confinement of Abdullah Öcalan, one of the founders of the Kurdistan Workers’ Party (PKK). The board also criticized management for failing, for three years, to implement an important exemption to its rules on dangerous individuals and organizations.
The board says the post should never have been removed, and the company has found that a piece of internal guidance was inadvertently not transferred to its new review system in 2018.
NUMBER OF MISTAKES
The board believes that the failure to implement the exemption has led to a number of posts being wrongly removed over the last three years.
The lost guidance created an exemption to the rules that prohibit praise and support of individuals or organizations deemed dangerous, allowing discussion of the conditions of their confinement.
The Board says it “is concerned that Facebook lost specific guidance on an important policy exception for three years. Facebook’s policy of defaulting towards removing content showing “support” for designated individuals, while keeping key exceptions hidden from the public, allowed this mistake to go unnoticed for an extended period. Facebook only learned that this policy was not being applied because of the user who decided to appeal the company’s decision to the Board.”
“NOT TECHNICALLY FEASIBLE”
“While Facebook told the Board that it is conducting a review of how it failed to transfer this guidance to its new review system, it also stated ‘it is not technically feasible to determine how many pieces of content were removed when this policy guidance was not available to reviewers.’”
“The Board believes that Facebook’s mistake may have led to many other posts being wrongly removed and that Facebook’s transparency reporting is not sufficient to assess whether this type of error reflects a systemic problem. Facebook’s actions in this case indicate that the company is failing to respect the right to remedy, contravening its Corporate Human Rights Policy.”
SHOULD NEVER HAVE BEEN REMOVED
“Even without the discovery of the misplaced guidance, the content should never have been removed. The user did not advocate violence in their post and did not express support for Öcalan’s ideology or the PKK. Instead, they sought to highlight human rights concerns about Öcalan’s prolonged solitary confinement which have also been raised by international bodies. As the post was unlikely to result in harm, its removal was not necessary or proportionate under international human rights standards.”
After this sharp criticism, the Oversight Board lists a number of actions that should be taken, among them:
- Ensure the Dangerous Individuals and Organizations “policy rationale” reflects that respect for human rights and freedom of expression can advance the value of “Safety.” The policy rationale should specify in greater detail the “real-world harms” the policy seeks to prevent and disrupt when “Voice” is suppressed.
- Add to the policy a clear explanation of what “support” excludes. Users should be free to discuss alleged abuses of the human rights of members of designated organizations.
- Explain in the Community Standards how users can make the intent behind their posts clear to Facebook.
- Ensure meaningful stakeholder engagement on the proposed changes to its Dangerous Individuals and Organizations policy through Facebook’s Product Policy Forum, including through a public call for inputs.
- Ensure internal guidance and training is provided to content moderators on any proposed policy changes.
- Ensure that users are notified when their content is removed. The notification should note whether the removal is due to a government request or due to a violation of the Community Standards, or due to a government claiming a national law has been violated (and the jurisdictional reach of any removal).
- Clarify to Instagram users that Facebook’s Community Standards apply to Instagram in the same way they apply to Facebook.
- Include information in its transparency reporting on the number of requests received for content removals from governments based on Community Standards violations (as opposed to violations of national law), and the outcomes of those requests.
- Include more comprehensive information in its transparency reporting on error rates for enforcing rules on “praise” and “support” of dangerous individuals and organizations, broken down by region and language.