Facebook's Oversight Board Must Uphold the Ban on Trump 

It's not just about penalizing the former president. It's about protecting democracy—in the US and around the world. 
Photograph: Jabin Botsford/The Washington Post/Getty Images

In the coming weeks, the Facebook Oversight Board will rule on Donald Trump’s indefinite suspension from the platform. This will surely be the board’s most important ruling to date. The board’s decisions on specific cases are binding on Facebook, and in this case the ruling will likely reach far beyond Trump and set a global precedent for the policies and enforcement actions the company takes going forward. The stakes could not be higher—not only for American democracy, but for countries around the world that have come, and will come, under threat from undemocratic political leaders. Like it or not, Facebook has a crucial role to play in safeguarding democracy. And the board’s decision will help determine whether the company can fulfill this responsibility or will wash its hands of its democratic obligations.

Our decade of research on how politicians use social media has made it clear that there is only one correct way forward. Together with researchers at UNC’s Center for Information, Technology, and Public Life, we believe the board should uphold Facebook’s ban on Trump’s account. The former president clearly, repeatedly, and flagrantly violated Facebook’s Community Standards in his attempt to deny the American public’s right to vote him out of office. Permanently banning Trump from the platform would be consistent with the company’s history of suspending users who repeatedly violate its policies. More importantly, it would affirm Facebook’s responsibility to protect democracies around the world by taking a strong stance against expression that undermines democratic accountability, especially free and fair elections.

On the most fundamental level, Trump’s use of Facebook repeatedly violated the company’s policies. While there is debate over whether Trump directly incited the attempted coup on January 6, that is the wrong question on which to focus. The bigger and clearer violation of Facebook’s policies is the former president’s use of the platform to undermine free and fair elections—the public’s essential democratic voice. While Facebook maintains that its “commitment to expression is paramount,” its Community Standards have long (rightly) balanced that commitment against the risk of harm, including threats to safety, dignity, and electoral integrity. This includes the company’s extensive stated policies that protect what Mark Zuckerberg referred to as the public’s voice at the ballot box.

There is perhaps no more flagrant attempt in recent US history to silence the people than former President Trump’s months-long campaign of lies about mail-in ballots, illegal voting, and voter fraud and his statements that the election was “fraudulent” and “stolen.” Facebook’s Community Standards require evaluating both accounts and content, as well as the broader “circumstances” that provide context for what appears on the platform. In this case, the president’s election disinformation came in the context of his anti-democratic recognition of hate groups, failure to condemn extrajudicial violence, and work to have federal agencies downplay the threats of armed paramilitary groups.

Based on these facts alone, Facebook’s permanent suspension of Trump is more than justified. In fact, Facebook’s failure, until January 7, to enforce its existing policies in the face of Trump’s repeated violations has been deeply problematic. For too long, the company erred on the side of allowing Trump’s electoral disinformation to stand, arguing that the public should be able to hear from its leaders—the company’s “newsworthiness” exemption. But Facebook should have enforced its policies more consistently, or developed innovative solutions to proactively ward off the threats the president posed to the election, such as putting his account on a delay to screen for violations.

That said, it is crucial for the Oversight Board to honor the fact that the company eventually did enforce its policies in order to protect democracy. And while Facebook finally acted in the US, its failure to enforce its stated policies extends far beyond our borders. President Trump is not the only example of a world leader who has used Facebook to undermine electoral accountability, delegitimize political opposition, and subvert democratic institutions designed to act as a check on their power. Facebook must draw a bright red line at the attempts of any political leader, or those vying to become one, to undermine democratic processes, including those institutions that represent the people’s voice, like elections. We see it as promising that this week Facebook took (overdue) action in Myanmar, banning the military from its platforms in the wake of a coup that overthrew the democratically elected government.

Threats to democracies across the world are growing, and rejecting anti-democratic behavior will require strong institutions. While the public has a stake in hearing from those representing it (or seeking to), democracies across the globe also face clear and present dangers when disinformation and hateful or inciting speech undermine the accountability of leaders. Preserving democratic accountability, especially free and fair elections, should be the standard by which Facebook judges the expression of the politicians who use its platform.

By upholding the ban on Trump, the Oversight Board would endorse Facebook’s recent, though long deferred, effort to safeguard democracy in the US. It would set a clear precedent that can be applied around the world. It might prod other social media companies to make safeguarding democracy a central pillar of their content policies as well. Facebook is sure to face even more difficult cases in the future, and it should have the flexibility to interpret them against the standard of preserving democratic accountability, especially in consultation with other institutions that operate in the public’s interest, such as journalists, academics, and non-governmental organizations. And when it is a matter of policy enforcement, Facebook should focus its efforts on those holding or running for office and leaders in their parties, rather than exempting them—what elites in particular say and do matters most for peaceful democratic transitions and the preservation of democratic institutions. Platform companies are shaped by the politics of the societies in which they operate—societies which are in turn shaped by platform policies. In this way, the Oversight Board’s decision is not just about Facebook, but about whether the company will honor the obligations it has to democratic societies.

To those who worry about the private power of companies to regulate political speech, we hear you. But it’s important to remember all it took to get to this point. As Facebook’s Civil Rights Audit demonstrated, the company often failed to fulfill its obligations to protect the democratic voices of its most vulnerable users, favoring instead the anti-democratic speech of the powerful, such as Trump. Indeed, Facebook has long erred on the side of valuing freedom of expression, especially for world leaders, and it took a months-long campaign by a sitting president to undermine the legitimacy of the election, subvert ballot accountability, and create the context in which his supporters attacked the US Capitol before the company took decisive action to enforce its own policies.

The danger here is that if the Oversight Board does not uphold Trump’s ban, it will set a precedent of valuing political elites’ expression over the right of the public to self-govern in countries across the world.

