After a very long build-up, Facebook has finally announced the make-up of its Oversight Board.
This board is an important step by Facebook towards recognizing that the decisions it makes, as a dominant platform, have more public significance than editorial and curation decisions made by others. But what many people have called a “Supreme Court” is not the solution to the day-to-day process concerns people have when dealing with major platforms. We’ve thought about due process for a long time. The metaphor of the justice system is a good way to think about what Facebook’s Oversight Board does and does not accomplish. To extend the legal analogy, users need more district courts, magistrate judges, and administrative hearings, not just the long-shot opportunity to have their dispute heard by a part-time board of eminences.
As in the actual court system, a “Supreme Court” hears extraordinary cases. In any system based on rules, there are going to be cases that fall through the cracks, and can’t easily be resolved by just routinely applying existing standards. Most of the work that the Supreme Court and appellate courts in general do involves situations of this kind.
There are also going to be wholly new kinds of issues — in the Facebook case, categories of content — where new rules and standards altogether have to be crafted, by reference to broad principles of one kind or another. This is broadly similar to constitutional law in our legal system. It’s important, but most disputes don’t rise to that level.
According to the co-chairs of the Oversight Board, it will review:
Cases that examine the line between satire and hate speech, the spread of graphic content after tragic events, and whether manipulated content posted by public figures should be treated differently from other content[.]
But they offer this proviso:
We will not be able to offer a ruling on every one of the many thousands of cases that we expect to be shared with us each year. We will focus on identifying cases that have a real-world impact, are important for public discourse and raise questions about current Facebook policies.
In our paper calling for due process, the emphasis is not on the extraordinary cases, but the day-to-day cases. The concerns most people have with major platforms don’t have to do with situations that fall through the cracks, or situations that call for new policies to deal with new situations, but the inconsistent or incorrect application of existing rules. Due process isn’t just about filtering tough cases to people who can make tough judgment calls (and set precedent for the future), but about the routine ability to correct mistakes.
Due process is a flexible concept, and the greater the stakes, the more process is required. In its fullest form, it can include:
- notice of a proposed action and the grounds asserted for it
- the right to be heard by an unbiased decision-maker
- an opportunity to present reasons why the proposed action should not be taken
- the right to present evidence
- the right to know the opposing evidence
- the right to a decision based only on the evidence presented
- the right to written findings of fact including the basis for a decision
For most decisions, due process does not mean that every user who has a piece of content removed is entitled to a video conference with a panel of lawyers making arguments on both sides. But it does mean that dominant platforms should not simply take down content and refuse to provide more information than a bare statement that the content violates the rules. It means that they should tell users which rules were violated. It means that platforms should not make decisions that affect the livelihood or free expression of users while providing them no means to ask questions or to contest the decision. It means that if one piece of content is taken down, and another similar piece of content is not, platforms should be required to explain the reasoning, if any, behind the disparate treatment.
The Facebook Oversight Board (again, like the Supreme Court) mixes policy-making with a more straightforward appeal process. But most of the actual content policies that Facebook puts forward will still be set by the existing methods. For dominant platforms at least, we may want to think about the extent to which we want different platforms to set different policies, to what extent there should be some oversight of this process, and to what extent policies should be uniform across different platforms. These are important and deep questions, but the basics of due process apply regardless. There may be due process concerns with how policies are set, but many people’s problems would be resolved by just ensuring that existing policies are applied fairly.
Ensuring a basic level of due process is not just about having a final arbiter for the hardest issues. Instead, it’s a matter of providing more front-line resources and intermediate appeal procedures, and establishing processes to ensure consistency, transparency, and promptness in handling disputes and appeals. (As has been highlighted more recently, too, it’s a matter of making sure that human content moderators can work remotely in a safe and privacy-respecting way.) These things are costly, and would be difficult to implement, but would likely do more to establish fairness and create more of a sense of legitimacy in users’ daily interactions with major platforms.
About John Bergmayer
John Bergmayer is Legal Director at Public Knowledge, specializing in telecommunications, media, internet, and intellectual property issues. He advocates for the public interest before courts and policymakers, and works to make sure that all stakeholders — including ordinary citizens, artists, and technological innovators — have a say in shaping emerging digital policies.