Most of us who've been watching and reporting on Facebook for the last decade have assumed that the company had a different policy for handling the accounts of public figures than it did for average users, and a recent Wall Street Journal exposé — likely fed by leaked documents from whistleblower Frances Haugen — confirmed that. Now, Facebook's Oversight Board is saying that the company misled it when it was given the case of former President Trump's account suspension, and the board will be reviewing the internal process known as "cross check," by which celebrity accounts are given a pass to break the rules.
This latest shoe to drop from Haugen's trove of internal documents, copied surreptitiously before she left her employment at Facebook earlier this year, came over a month ago in this Wall Street Journal piece — the first in the paper's "Facebook Files" series. However, the revelations about "cross check," or "XCheck" as it became known internally, were overshadowed by other stories about the company ignoring data on Instagram's harm to teens, inciting rage and division with its news-feed algorithm, and hurting the nation's vaccination campaign despite public proclamations that it was doing everything it could to push people to get vaccinated.
Now, the ostensibly independent Oversight Board is pointing back to the story and noting — with a definite undertone of anger — that Facebook was not transparent with them about "XCheck," and that the board only found out about this internal program when they began asking specific questions in the Trump case.
"When Facebook referred the case related to former US President Trump to the Board, it did not mention the cross-check system," the Oversight Board writes in a new blog post. "Given that the referral included a specific policy question about account-level enforcement for political leaders, many of whom the Board believes were covered by cross-check, this omission is not acceptable. Facebook only mentioned cross-check to the Board when we asked whether Mr. Trump’s page or account had been subject to ordinary content moderation processes."
The Oversight Board went on to tell Facebook that it could not categorically ban a user — even one as egregiously terrible as Trump — and that there needed to be a policy by which major, repeated violations result in a suspension of a specific length of time. In Trump's case, the board ruled in May that his account suspension could not be indefinite, and Facebook returned in June with a decision that would keep Trump banned for two years, until January 2023, but he could be re-banned from the platform if he continued to pose a "risk to public safety."
Trump's risk to public safety, and his violations of the platform's rules against hate speech and incitements or celebrations of violence, far predate his posted statements on January 6th. And now we know that Trump, along with millions of other high-profile users, was whitelisted and largely given a pass to post what he wanted.
As the Journal reported, "Despite attempts to rein it in, XCheck grew to include at least 5.8 million users in 2020, documents show. In its struggle to accurately moderate a torrent of content and avoid negative attention, Facebook created invisible elite tiers within the social network."
An internal review of XCheck, documents from which Haugen appears to have procured, included an admission by Facebook employees that "We are not actually doing what we say we do publicly." The review called the program a "breach of trust," and it said, "Unlike the rest of our community, these people can violate our standards without any consequences."
Facebook responded to the Journal's inquiries by saying that it was already addressing the issues with XCheck. But is that really possible?
Facebook spokesperson Andy Stone gave a statement saying, "A lot of this internal material is outdated information stitched together to create a narrative that glosses over the most important point: Facebook itself identified the issues with cross check and has been working to address them."
But the Oversight Board doesn't sound like it thinks any of these issues have been adequately addressed.
"The credibility of the Oversight Board, our working relationship with Facebook, and our ability to render sound judgments on cases all depend on being able to trust that information provided to us by Facebook is accurate, comprehensive, and paints a full picture of the topic at hand," the board writes. "We will continue to track and report on information provided by Facebook to ensure it is as comprehensive and complete as possible."
The board says that in the wake of the Journal article, "it has accepted a request from Facebook, in the form of a policy advisory opinion, to review the company’s cross-check system and make recommendations on how it can be changed."
They're talking about trying to ensure that the users put on the XCheck whitelist are chosen "equitably," but how is that even possible, when the entire system creates an elite tier exempt from normal rules?
The Oversight Board says that it will continue to push Facebook toward more transparency, but in the 10 or so months that the board has been operational, the company has remained as opaque as ever. And the kinds of corporate cultural problems — like prioritizing profits over the moral good — that Haugen pointed to in her congressional testimony two weeks ago aren't likely to be easily solved.