Facebook’s “Supreme Court”

January 26, 2022

Who should decide what Americans can say online?

Social media platforms like Facebook, Instagram, and Twitter have standards for what kinds of posts they will and won't allow. Others, like Gab and Parler, permit speech that the larger platforms won't. And online speech that may be offensive is still protected by the First Amendment: it might violate a platform's guidelines, but it isn't illegal.

This puts tech companies in a pickle. Their moderators comb through controversial content all day, whether it’s a group inciting violence in Ethiopia or an American user spreading misinformation about Covid vaccines. They have to weigh in on difficult matters constantly and can never make everyone happy.

Some Americans believe that tech companies should do far more to take down false or problematic content. Others believe that it shouldn’t be up to companies to decide what speech gets to stay online and what speech comes down. Some want the government to step in and set guidelines, but they don’t all agree on what those guidelines should be.

Perhaps no social media platform has taken more heat for enabling the spread of misinformation than Facebook. In October 2021, Facebook changed its name to Meta. In this article, we’ll refer to Facebook when talking about the app and Meta when talking about the company and its policies. To make it even more complicated, when we’re talking about decisions made before the name change, we’ll call the company Facebook.

In 2018, Facebook announced that it was working on a new approach to misinformation and problematic content, one that some likened to a sort of Supreme Court for the platform. It began operating in October 2020 as the Content Oversight Board. For the 55% of Americans who think that tech companies, rather than the government, should make content decisions, the Facebook approach might be an interesting one.

How the Content Oversight Board Works

Day-to-day decisions about what content can and can’t stay on Facebook or Instagram are made by Meta’s moderators. If you post something that moderators take down, or complain about a post that remains up, you can go through an appeals process to get a second look at your case. And if you aren’t happy with the outcome of that process? You can take your concern to the Content Oversight Board.

The Board doesn’t hear every single case that comes its way, though. Instead, it chooses a tiny selection of cases that it sees as emblematic of content issues that arise on the platform. Meta itself can also ask the Board for help with “significant and difficult” cases.

Once the Board makes a ruling, Meta must implement it within 90 days.

Is the Board Independent?

Meta says it is, explaining that “both the board and its administration are funded by an independent trust and supported by an independent company that is separate from the Facebook company.”

Who’s On the Board?

The Board currently has 20 members and will eventually have 40. They span the globe and a range of fields; members include a Columbia Law School professor, a vice president of a libertarian think tank, a senior editor of Indonesia’s Jakarta Post, and the founder of Pakistan’s Digital Rights Foundation.

What Decisions Does the Board Make?

The Board has received hundreds of thousands of cases since it began in October 2020. Of those, it has ruled on fewer than two dozen. The cases it does hear crop up around the globe, in a multitude of languages. Here are a few:

  • France: Facebook asked the Board for a ruling on an October 2020 post in which a user shared a video and text alleging a scandal at a French government agency. The post claimed the agency refused to authorize hydroxychloroquine combined with azithromycin to treat Covid-19 but did authorize remdesivir. Facebook took the post down, saying that it contributed “to the risk of imminent… physical harm.” The Board overturned Facebook’s decision, saying that the company’s rule was too vague, and recommended that the company create a new standard on health misinformation.
  • Ethiopia: In July 2021, a Facebook user posted in Amharic (the country’s official language) that the Tigray People’s Liberation Front committed acts of violence and looted the properties of civilians, and that ethnic Tigrayan civilians participated in the atrocities. Facebook had left the post up, but the Board determined that it violated Facebook’s standard on violence and incitement.
  • Brazil: In March 2021, a government medical council posted that lockdowns are ineffective and go against constitutional rights. The Board determined that Facebook was right to leave the post up, because although it contained some inaccurate information, it did not create a risk of imminent harm.
  • United States: On January 6, 2021, President Donald Trump made two posts related to the violent events at the Capitol. The next day, Facebook blocked Trump “indefinitely and for at least the next two weeks until the peaceful transition of power is complete.” The Board upheld the decision, saying that some of Trump’s remarks violated Facebook’s rules prohibiting praise or support of people engaged in violence, but found that the company was wrong to impose an indefinite suspension. The company later said it would suspend Trump for two years and would reinstate him “if the risk to public safety has receded.”

Does the Board Make Facebook and Instagram Better?

It’s a matter of opinion. People across the political spectrum have concerns about how Meta moderates content. For example:

  • Many conservatives say tech companies have a left-leaning bias and that the Board is packed with liberals. Some outspoken members of Congress allege that the company’s decision to block President Trump is proof that Facebook is intent on silencing conservative voices.
  • Others worry that Meta is far too slow and ineffective when it comes to keeping misinformation, hate speech, and other objectionable content off its platforms.
  • Still others say that the Board is much too small and limited in scope to match the true magnitude of the problem. It chooses very few cases and takes a slow and methodical approach when it does act.

It’s also worth remembering that while the Board’s decisions are supposed to be binding, it’s not an actual court of law. If Mark Zuckerberg ever did decide to overrule the Board, he would face intense public backlash, but it’s not like he would go to jail.

Do We Need More Boards?

It’s an idea! Facebook and Instagram are only two social media platforms. Some people have suggested that tech companies need to get together and agree on one set of standards that they’ll all enforce. Content Oversight Boards could be a way to make online content moderation more transparent, though critics argue that Meta’s current Board isn’t all that transparent.

Do you think Content Oversight Boards can be part of the solution to online misinformation? Let us know on Instagram, Facebook, or Twitter.

