The 20 legal and governance experts paid by Facebook to deliver “independent” judgments on its often controversial behaviour showed they have some teeth in their first major ruling.
When asked by Facebook whether it was right to block Donald Trump from its platforms in the wake of the US presidential election, the oversight board supported the ban. Its members agreed that the former president had breached the social network’s rules by praising people engaged in violence and creating “an environment where a serious risk of violence was possible”.
But the board — a Supreme Court-style body appointed in 2020 — also poured scorn on Facebook for making up a new penalty for Trump in the form of an indefinite ban, and for the lack of due process around its decisions on moderation. It passed the buck back to the company to decide when and how to allow Trump back.
“In applying a vague, standardless penalty and then referring this case to the board to resolve, Facebook seeks to avoid its responsibilities,” the board said, questioning the legitimacy of the case itself. “The board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.”
“They cannot invent new unwritten rules when it suits them,” said Helle Thorning-Schmidt, former Danish prime minister and an oversight board member.
The creation of the board came after Mark Zuckerberg, Facebook’s founder, decided the social network should not be the “arbiter of truth” about “everything that people say online”. Instead, the company outsourced its most difficult questions about what to take down and what to leave up.
Nate Persily, a professor at Stanford Law School, said the oversight board was the best solution outside of government intervention. “Facebook has taken on the burden of this experiment,” he said.
In an interview at the Financial Times Global Boardroom conference on Wednesday, Nick Clegg, Facebook’s vice-president of global affairs and communications, acknowledged that the board had “made pretty trenchant criticisms of the standards, the policies and the proportionality” surrounding Facebook’s decision to ban Trump.
He declined to respond directly to the criticism, but said the company would “now go away and consider how we can evolve our approach” and hoped to do so “considerably faster” than the six-month deadline set by the board. “It is not a perfect answer, but it’s the best answer that we can come up with in an imperfect world,” he added.
But the full decision from the board also revealed the limits of its power as it pushed up against Facebook’s business model.
The board said Facebook had declined to answer seven questions it had posed, and gave only partial answers to two others. These included how Facebook’s news feed had affected how many people saw Trump’s posts, and whether Facebook had considered changing the way its news feed amplified or reduced contentious posts in the wake of the storming of the US Capitol on January 6.
Facebook also declined to discuss whether followers of Trump’s accounts had violated its rules too, or whether any politicians had leaned on the company over the suspension of Trump’s accounts.
Jesse Lehrich, at the campaign group Accountable Tech, said Facebook “probably didn’t want the board to be such a pain in the ass” and had done “a good job of kneecapping them” by withholding information.
While the board’s ruling that Facebook should make a decision about Trump within six months is binding, its recommendations about other policies that Facebook should enact are not.
Many of its recommendations are likely to make the company uneasy.
These include calling on Facebook to publicly state its process for suspending or banning the accounts of influential figures, after widespread anger over the opaque way in which it hands out penalties. Facebook has previously argued that being transparent about its rules would help bad actors game the system.
It also suggested that Facebook create a team to handle influential politicians that “should be insulated from political and economic interference, as well as undue influence”.
The company has faced accusations, which it denies, that it has pandered to both the left and right, and concerns that its right-leaning Washington lobbyists such as Joel Kaplan, Facebook’s vice-president of global public policy, have been involved in moderation decisions.
Evelyn Douek, a lecturer at Harvard Law School, described the Trump decision as “meaty and educational” but added that the board “steadfastly refused to give Facebook any concrete guidance on what it should do going forward. It left many, many questions unanswered and ambiguous.”