Our Thoughts on Facebook’s Oversight Board for Content Decisions

February 11, 2019

Last month, Facebook announced a draft charter for a future Oversight Board for Content Decisions. When implemented, the Oversight Board, composed of independent experts, would serve as the final reviewer of important and disputed content moderation cases on the platform.

Facebook will be seeking feedback on the proposal from a variety of stakeholders through to-be-announced mechanisms and in-person workshops around the globe. Public Knowledge will use those mechanisms to provide more detailed feedback on the proposal. Our final assessment of Facebook’s initiative will depend on the final details of the proposed charter and how it is implemented.

We think that in adopting a semi-independent Oversight Board for Content Decisions, Facebook is taking a step in the right direction. Last year, we published a paper arguing precisely that users “should have notice of and an opportunity to challenge actions that are proposed to be taken against them, and to have their challenge heard by a truly impartial tribunal.” We note that Facebook is partially moving in the direction we suggested. Like other social media platforms, Facebook has become a key hub of political debate and socialization. For better or worse, Facebook bears the enormous responsibility of being one of the key public squares of our era. It’s positive that Facebook at least acknowledges that it cannot do justice to all its users alone and is seeking the advice and help of independent experts. It’s also positive that Facebook promises to staff and compensate Board members, as it may be a sign that Facebook is taking the initiative seriously.

These are some of our early high-level thoughts on Facebook’s proposal:

1) It would be better if the Oversight Board could review and comment on Facebook’s content policies. The draft charter says that Facebook “can incorporate the board’s decision in the policy development process” and “may also request policy guidance from the board.” That’s a good starting point, but it isn’t enough.

We would prefer Facebook to directly allow the Board to review and comment on all its policies and policy changes. Facebook could proactively minimize conflict around the enforcement of content rules by leveraging the Oversight Board’s expertise when drafting those rules and their implementation guidelines. At a minimum, Facebook should respond to the Board’s comments, explaining whether, why, and how it decided to adopt the Board’s recommendations.

2) We would prefer a truly independent multiplatform and multistakeholder board. In an ideal world, all the dominant social media platforms would subject themselves to a single independent multistakeholder oversight body that would also draft a baseline set of minimum standards, agreed to by platforms, relating to online content regulation. An ambitious idea? Yes. And one very well explained by our friends at Global Partners Digital.

But, while the devil is in the details, we are optimistic. In its suggested approach for the selection and removal of the members of the panel, Facebook promises to select the first cohort of experts through a transparent and independent process, proposes that each successive cohort will be chosen by its predecessors, and states that no member will be removed by Facebook unless “the member has violated the terms of his or her appointment.” We need to know more. But even though Facebook’s proposal doesn’t go as far as we would like, it would lay the groundwork for an eventual evolution toward a more ambitious oversight model. And we should not let the perfect be the enemy of the good. In the meantime, it will be interesting to see whether Facebook lets the public know if it supports an independent, industry-wide oversight board.

3) Former Facebook employees should not be banned from serving on the Board. Facebook suggests that “the board will not include current or former employees or contingent workers of Facebook.” We understand that Facebook is trying to ensure the credibility of its proposal, and that some former Facebook employees may have signed non-disclosure agreements that could prevent them from sharing some information with the rest of the Board, were they selected to serve there. It’s also true that we would be very skeptical if the Board were composed entirely or mostly of former Facebook employees. But there can be value in incorporating into the Board the perspective of, for example, former employees with technical expertise. This would be significantly less problematic after the first iteration of the Board, once the Board itself, and not Facebook, selects its new members. In any case, former Facebook employees should never constitute more than a small minority of the Board’s members. Facebook should also consider waiving all or the relevant parts of existing non-disclosure agreements should a former employee join the Board.

Conclusion

This is a promising first step by Facebook to address the problems in its content moderation policies. Getting online content moderation right is fundamental to ensuring that the internet doesn’t contribute to radicalization and division. Facebook’s proposal to create an independent Oversight Board is an unusual gesture of humility that we, cautiously, welcome.

However, we expect Facebook to do much more to regain consumers’ confidence regarding content issues. Facebook needs to keep working to stop the promotion of sensationalized content and to find a way to apply the “Diversity Principle” in its content promotion algorithms. Facebook can find ways to do that by extending the prerogatives of the proposed Oversight Board.