Recent Change in Rules for Facebook’s Oversight Board – Does it Help or Hurt?

The Facebook Oversight Board is popular.

This Board, made up of 20 part-time members (well-known academics, activists, and attorneys spread around the globe), has reportedly received 300,000 appeals from users contesting Facebook enforcement actions.

As of April 14, the Board has rendered only 8 decisions. By comparison, the US Supreme Court hears approximately 100 cases per year out of more than 7,000 petitions asking it to take a case. However, the Supreme Court sits as the ultimate appeals court after cases are adjudicated in 94 Federal District Courts (where approximately 400,000 cases are heard annually) and 13 Federal Courts of Appeal (where approximately 45,000 appeals are heard annually). There are no comparable independent Facebook boards to hear user appeals before they are escalated to this Oversight Board.
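To make the scale mismatch concrete, here is a back-of-envelope comparison using only the approximate figures cited above:

```python
# Rough comparison of decision rates. All inputs are the reported
# approximations cited in this article, not official statistics.

oversight_appeals = 300_000   # appeals reportedly received by the Oversight Board
oversight_decisions = 8       # decisions rendered as of April 14

scotus_petitions = 7_000      # petitions to the US Supreme Court per year (approx.)
scotus_cases = 100            # cases the Supreme Court hears per year (approx.)

print(f"Oversight Board: {oversight_decisions / oversight_appeals:.4%} of appeals decided")
print(f"US Supreme Court: {scotus_cases / scotus_petitions:.1%} of petitions heard")
# Oversight Board: 0.0027% of appeals decided
# US Supreme Court: 1.4% of petitions heard
```

Even the famously selective Supreme Court takes up roughly 500 times the share of cases that the Board has managed, and the Court sits atop two full layers of lower courts.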

Facebook recently announced that this Board will now also consider appeals from users seeking to have other users' content removed when Facebook has declined to remove it. This will dramatically increase the number of appeals sent to the Board and further reduce its capacity to handle appeals from users trying to restore their own content or account status.

The Board is in danger of becoming further clogged with appeals and being used as a tool to expand censorship if influential users can simply appeal to have content from less-popular users blocked. Faced with a massive and growing number of appeals, and with no intermediate mechanism for adjudicating censorship disputes between users and Facebook, the Board cannot offer an effective appeals mechanism against censorship without changes.

Facebook and other platforms can dramatically reduce the controversy and the scale of enforcement actions by committing to transparently published speech content rules and enforcement practices, as outlined in the five guidelines below:

1. Transparency – Both the speech content rules and the related enforcement actions need to be published in a manner that is clear to all users, with adequate examples and explanation. Enforcement actions, including hidden shadow bans and limits on how content can be shared, need to be communicated to the affected user. Appeal rulings should also be transparently published.

2. Equal Standards for All – Speech content rules and enforcement actions need to be applied to all users, not just some. If a user can point to examples of similar content that was treated differently, the platform should respond by treating the similar content and the user's content the same way.

3. Commensurate Enforcement Actions – Enforcement actions should be reasonable and commensurate with the violation of the speech rules. The published rules should spell out how cases such as first-time offenders, content similar to commonly found content, or content that warrants only a warning are handled.

4. Speedy and Fair Due Process to Adjudicate Disputes – Users need immediate access to a written explanation of which specific speech rules their content violated, and a simple method to dispute the enforcement action taken by the platform. The dispute mechanism needs the resources and capability to respond to the user's dispute quickly (within days). The platform needs to provide an adequate explanation of its decision, tied specifically to the speech content rules violated and the enforcement actions taken.

5. Independent Appeals Process – If a user is not satisfied with the outcome of their dispute, Facebook needs to provide a mechanism to appeal the decision to an authority that is independent of the company itself. The Oversight Board is a good start, but this appeals mechanism needs to be multi-layered, with the resources and capability to resolve disputes quickly and fairly. One part-time Board dealing with 300,000 appeals and issuing 8 decisions in 6 months is clearly not an answer by itself. It may be reasonable for some appeals (for example, based on the number of appeals already filed by the user or the type of enforcement action being appealed) to require a fee to discourage excessive appeals, with the fee refunded if the user ultimately succeeds in having the enforcement action reversed (a rough sketch of such a rule follows this list).
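As a thought experiment, guidelines 1, 4, and 5 could be expressed as data the platform publishes with every enforcement action. The Python sketch below is purely illustrative: the record fields, fee threshold, and action names are assumptions made for this article, not anything Facebook has announced.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class EnforcementNotice:
    """Hypothetical record sent to the affected user (guidelines 1 and 4)."""
    rule_violated: str     # the specific published rule, cited by name/number
    action_taken: str      # e.g. "warning", "content_removed", "shadow_ban"
    explanation: str       # why the content was judged to violate that rule
    appeal_deadline: date  # when the user's dispute window closes

# Hypothetical fee rule for guideline 5: frequent appellants pay a refundable
# fee; first-time or warning-level appeals stay free. Thresholds are invented.
FREE_APPEALS_PER_YEAR = 3
APPEAL_FEE_USD = 25

def appeal_fee(prior_appeals_this_year: int, action_taken: str) -> int:
    if action_taken == "warning":
        return 0  # warnings should always be appealable at no cost
    if prior_appeals_this_year < FREE_APPEALS_PER_YEAR:
        return 0
    return APPEAL_FEE_USD  # refunded in full if the enforcement is reversed
```

The point of such a rule is that the deterrent falls only on habitual filers; an ordinary user disputing their first takedown pays nothing, and a successful appellant pays nothing in the end.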

Facebook and other platforms can reduce the controversy and scale of user censorship disputes even more dramatically by focusing enforcement actions on speech that is criminal or dangerous, rather than trying to fact-check which information is true or false. This distinction needs to be spelled out in the published speech content rules.

Facebook, YouTube, and others use a combination of algorithms and third parties in an effort to fact-check which information is true or false. However, the bias pervading mainstream newspapers, TV news channels, non-expert employees, and so-called fact-checkers today means that reliance on these parties inevitably leads to bias and accuracy problems.

Fact-checking efforts are simply not workable at the scale of billions of messages and millions of new topics a day. These efforts will overwhelm the mechanisms for adjudicating and appealing enforcement actions against actual criminal speech or explicitly dangerous speech, as the snail's pace and meager output of the Oversight Board already demonstrate.
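A rough sketch makes the scale problem visible. Every number below is an illustrative assumption, not a measured figure:

```python
# Illustrative assumptions only, not measured figures.
posts_per_day = 3_000_000_000   # order of magnitude for a large platform
share_needing_review = 0.001    # suppose just 0.1% of posts need a human fact-check
reviews_per_person_day = 200    # suppose one reviewer can check 200 items a day

reviewers_needed = posts_per_day * share_needing_review / reviews_per_person_day
print(f"{reviewers_needed:,.0f} full-time reviewers needed")  # 15,000
```

Even under these generous assumptions, the review workforce dwarfs a 20-member part-time Board, and every disputed verdict feeds back into the same adjudication and appeals pipeline.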
