Cross Check, a Facebook and Instagram program giving special treatment to celebrities and other high-profile users, is opaque, “flawed” and needs to be overhauled, according to recommendations released Tuesday by parent company Meta’s semi-independent Oversight Board. The findings offer rare insight into the controversial content moderation initiative and underscore the gap between the tech giant’s actions and its stated values.
Cross Check, which separates the moderation of high-profile or sensitive accounts from other accounts and adds an extra layer of scrutiny before content decisions are made, is designed to “satisfy business concerns” rather than protect the public or implement Meta’s stated values, the Oversight Board said.
The program appears to be driven more by Meta’s desire to avoid “provoking” VIPs and facing accusations of censorship than by its professed aim of safeguarding the public, the board added, and Meta’s numerous public statements that it applies platform rules evenly were misleading.
The board called for “significant improvements” to the program and offered 32 recommendations for changing the process, such as publishing key metrics about the program, conducting audits to make sure it is working effectively and hiding posts while they are evaluated (presently, they remain public pending review).
Suggestions also included radically increasing its transparency, publicly labeling accounts, removing repeat violators from the Cross Check program and beefing up resources for content moderation.
Nick Clegg, Meta’s president of global affairs, said the company will fully address the board’s recommendations and respond to the report within 90 days.
Clegg said the company has already changed some aspects of the program, including introducing more formal criteria for adding users to the scheme, expanding eligibility and establishing annual reviews.
What We Don’t Know
Meta is not obligated to implement or accept the Oversight Board’s recommendations, though it must respond to them. Some recommendations are likely quite easy and cheap for Meta to implement, such as publicly labeling accounts covered by the scheme. Others could prove more challenging, particularly those requiring significant expenditure or an expansion of staff focused on moderation. Meta, in line with many tech firms, recently announced deep cuts and layoffs in light of gloomy economic forecasts.
Cross Check is a way of managing the challenges of moderating the vast quantities of content posted on Meta’s platforms every day, the board said. Though the task is difficult, the board said it is not fair for Meta to address the problems of falsely flagging, or failing to flag, rule-breaking content only for the most powerful users. “Meta has a responsibility to address its content moderation challenges in ways that benefit all users and not just a select few,” the report said.
Facebook, Meta and executives like Mark Zuckerberg have long insisted the public and high-profile figures are treated as equals. That position was explosively dismantled last September, when the Wall Street Journal revealed the secretive set of rules and procedures shielding VIP users—including celebrities, politicians, journalists and advertisers—from the normal moderation process. The program covers millions of users and has allowed rule-breaking material, including one instance of non-consensual pornography, to remain up far longer than would normally be expected, the Journal reported. The Oversight Board was asked to look into the issue following the report, and at the time it castigated the firm for concealing the true scope and scale of the scheme from it.