Meta Oversight Board demands changes to ‘cross-check’ program that protected Donald Trump

Meta’s Oversight Board has released an in-depth report on Facebook and Instagram’s controversial cross-check system, calling on Meta to make the program “radically” more transparent and beef up its resources.

The semi-independent Oversight Board cited “several shortcomings” in cross-check, which provides a special moderation queue for high-profile public figures, including former president Donald Trump before his suspension from Facebook. It singled out a failure to make clear when accounts are protected by special cross-check status, as well as cases where rule-breaking material — particularly one case of non-consensual pornography — was left up for a prolonged period. And it criticized Meta for not keeping moderation statistics that would let it assess the accuracy of the program’s results.

“While Meta told the board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” the report says. “The board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm.”


The report comes more than a year after The Wall Street Journal revealed details about cross-check publicly. Following its revelations, Meta asked the Oversight Board to evaluate the program, but the board complained that Meta had failed to provide important information about it, like details about its role in moderating Trump’s posts. Today’s announcement apparently follows months of back-and-forth between Meta and the Oversight Board, including the review of “thousands” of pages of internal documents, four briefings from the company, and a request for answers to 74 questions. The resulting document includes diagrams, statistics, and statements from Meta that help illuminate how it organized a multi-layered review program.

“It’s a small part of what Meta does, but I think that by spending this amount of time and looking into this [much] detail, it exposed something that’s a bit more systemic within the company,” Oversight Board member Alan Rusbridger tells The Verge. “I sincerely believe that there are a lot of people at Meta who do believe in the values of free speech and the values of protecting journalism and protecting people working in civil society. But the program that they had crafted wasn’t doing those things. It was protecting a limited number of people who didn’t even know that they were on the list.”

Cross-check is designed to prevent inappropriate takedowns of posts from a subset of users, sending those decisions through a set of human reviews instead of the normal AI-heavy moderation process. Its members (who, as Rusbridger notes, aren’t told they’re protected) include journalists reporting from conflict zones and civic leaders whose statements are particularly newsworthy. It also covers “business partners” that include publishers, entertainers, companies, and charitable organizations.

According to statements from Meta that are quoted in the report, the program favors under-enforcing the company’s rules to avoid a “perception of censorship” or a bad experience for people who bring significant money and users to Facebook and Instagram. Meta says that on average it can take more than five days to make a call on a piece of content. A moderation backlog sometimes delays the decisions even further — at the longest, one piece of content remained in the queue for over seven months.

The Oversight Board has frequently criticized Meta for overzealously removing posts, particularly ones with political or artistic expression. But in this case, it expressed concern that Meta was allowing its business partnerships to overshadow real harm. A cross-check backlog, for instance, delayed a decision when Brazilian soccer player Neymar posted nude pictures of a woman who accused him of rape — and after the post, which was a clear violation of Meta’s rules, Neymar didn’t suffer the typical penalty of having his account deleted. The board notes that Neymar later signed an exclusive streaming deal with Meta.

Conversely, part of the problem is that ordinary users don’t get the same hands-on moderation, thanks to Facebook and Instagram’s massive scale. Meta told the Oversight Board that in October of 2021, it was performing 100 million enforcement actions on content every day. Many of these decisions are automated or given very cursory human review, a volume that would be difficult or impossible to handle with a purely human-powered moderation system. But the board says it’s not clear that Meta tracks or attempts to analyze the accuracy of the cross-check system compared with ordinary content moderation. If it did, the results could indicate that a lot of ordinary users’ content was being inaccurately flagged as rule-breaking, or that Meta was under-enforcing its policies for high-profile users.


The board made 32 recommendations to Meta. (As usual, Meta must respond to the recommendations within 60 days but is not bound to adopt them.) The recommendations include hiding posts that are marked as “high severity” violations while a review is underway, even when they’re posted by business partners. The board asks Meta to prioritize improving content moderation for “expression that is important for human rights,” adopting a special queue for this content that is separate from Meta’s business partners. It asks Meta to set out “clear, public criteria” for who is included on cross-check lists — and in some cases, like state actors and business partners, to publicly mark that status.

Some of these recommendations, like the public marking of accounts, are policy decisions that likely wouldn’t require significant extra resources. But Rusbridger acknowledges that others — like eliminating the backlog for cross-check — would require a “substantial” expansion of Meta’s moderation force. And the report arrives amid a period of austerity for Meta; last month, the company laid off around 13 percent of its workforce.

Rusbridger expresses hope that Meta will still prioritize content moderation alongside “harder” technical programs, even as it tightens its belt. “My hope is that Meta will hold its nerve,” he says. “Tempting as it is to sort of cut the ‘soft’ areas, I think in the long term, they must realize that’s not a very wise thing to do.”
