Meta’s Oversight Board is strongly criticizing the company’s cross-check programme, which allows VIPs and other powerful users to post content that violates moderation rules. According to the board, Meta’s management has said the programme aims to protect human rights, but the board found it is used for commercial reasons.
Information about the cross-check programme surfaced in whistleblower leaks to the Wall Street Journal last year, which alleged that Facebook prioritized traffic-driving content from celebrity users over ethics. The scandal later prompted the company to rebrand itself, including changing its name a year ago to Meta, stressing its future plans for the Metaverse.
“In our review, we found several shortcomings in Meta’s cross-check programme. While Meta told the Board that cross-check aims to advance Meta’s human rights commitments, we found that the programme appears more directly structured to satisfy business concerns”, the board says.
“The Board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm.”
“We also found that Meta has failed to track data on whether cross-check results in more accurate decisions, and we expressed concern about the lack of transparency around the programme.”
Among the board’s recommendations:
- To comply with Meta’s human rights commitments and address these problems, a programme that corrects the most high-impact errors on Facebook and Instagram should be structured substantially differently.
- As Meta seeks to improve its content moderation for all users, it should prioritize expression that is important for human rights, including expression of special public importance. Users likely to produce this kind of expression should be prioritized for inclusion in lists of entities receiving additional review, ahead of Meta’s business partners.
- Posts from these users should be reviewed in a separate workflow, so they do not compete with Meta’s business partners for limited resources. While the number of followers can indicate public interest in a user’s expression, a user’s celebrity or follower count should not be the sole criterion for receiving additional protection. If users included due to their commercial importance frequently post violating content, they should no longer benefit from special protection.
- Radically increase transparency around cross-check and how it operates. Meta should measure, audit and publish key metrics around its cross-check programme so that it can tell whether the programme is working effectively.
- The company should set out clear, public criteria for inclusion in its cross-check lists, and users who meet these criteria should be able to apply to be added to them. Some categories of entities protected by cross-check, including state actors, political candidates and business partners, should also have their accounts publicly marked.
- Reduce harm caused by content left up during enhanced review. Content identified as violating during Meta’s first assessment that is high severity should be removed or hidden while further review is taking place.
The board cannot force Meta’s management to change the policy, but management must respond to the board’s recommendations within a couple of months.