On January 7, a day after a violent mob overran the U.S. Capitol, Facebook chief Mark Zuckerberg placed an indefinite suspension on then-President Donald Trump’s account. Facebook’s ban, together with the decision by other tech companies to boot Trump and his far-right allies from their platforms, ignited a global debate. Mexican President Andrés Manuel López Obrador compared Facebook’s decision to the “Spanish Inquisition.” Other leaders, such as German Chancellor Angela Merkel, expressed more measured unease about the power of private companies to regulate political leaders’ speech. But many people applauded Facebook’s decision. EU digital czar Margrethe Vestager observed, “I can’t say I would have done it differently if I was in their shoes.” The British comedian Sacha Baron Cohen—who gained attention in 2019 for asserting that, under Facebook’s logic, it would have allowed Hitler to post 30-second ads—tweeted, “This is the most important moment in the history of social media. The world’s largest platforms have banned the world’s biggest purveyor of lies.”
In the ensuing days, the controversy over the Trump ban continued to fester. Accordingly, on January 21, Facebook referred its decision for review by the Oversight Board—an independent content moderation body created by Facebook and composed of about twenty experts, academics, and former policymakers from around the world. The board’s decisions are binding on Facebook, meaning that if its members vote to reverse Trump’s suspension, Facebook must reinstate him on the platform. Five board members will review the case. Disappointingly, these members will remain anonymous, meaning we may never know who presided over this particular case or their individual reasons for reaching their decision.
The board’s review of Trump’s suspension boils down to two main questions: Were Trump’s posts responsible in whole or in part for inciting violence on January 6? Has Trump shown a pattern of behavior that makes it likely that his future posts will continue to encourage violence and undermine democratic institutions in the United States? If the answer to either question is yes, then the board is obligated to uphold Facebook’s ban.
While the board will focus primarily on the Trump ban, it may also opine on related policy issues: Should Facebook give political leaders more latitude in what they post—under a policy described as the “newsworthiness exemption”—even if their speech violates the platform’s community standards (including those on incitement)? Has Facebook established consistent procedures for considering bans on political leaders like Trump? What precedent will be set for hate speech and online harm as a result of Trump’s suspension and the board’s subsequent ruling?
Trump’s online behavior leading up to the January 6 violence, his actions on the day of the riots, and the likelihood of future harm if Facebook restores Trump’s account should compel the Oversight Board to maintain his ban and to make it permanent. Here’s why.
Facebook’s Community Standards and International Human Rights Law
Freedom of expression is foundational to international human rights law. But its application to private internet platforms is still evolving. The starting point for determining an individual’s free speech rights on Facebook lies with its community standards. So, the first determination the board will make is whether Facebook’s ban matched its own community standards.
What are these community standards? Facebook prohibits violence and incitement, including threats or calls for high-severity violence, mid-severity violence leading to serious injury, or threats bringing serious harm to private individuals. The platform also has certain prohibitions related to election content, specifically banning “statements of intent, calls for action, conditional or aspirational statements, or advocating for violence due to voting, voter registration or the administration or outcome of an election.” Facebook prohibits hate speech—defined as a direct attack on the basis of protected characteristics—which incorporates “violent or dehumanizing speech, harmful stereotypes, statements of inferiority, expressions of contempt, disgust or dismissal, cursing, and calls for exclusion or segregation.” Finally, Facebook balances its priority of protecting “voice” with ensuring “safety” (prohibiting content that threatens, intimidates, excludes, or silences others) and “dignity” (banning content that harasses or degrades others).
The Oversight Board will also review whether Facebook’s community standards, as applied in the Trump ban, are consistent with international human rights law. The UN Guiding Principles on Business and Human Rights (UNGPs) constitute a voluntary framework that sets out the human rights responsibilities of private businesses. Using the UNGPs as a template, the board will examine whether Facebook’s ban conforms to Article 19 of the International Covenant on Civil and Political Rights under a well-established three-part test. First, is the restriction provided for in law or stipulated in the platform’s terms of service or community standards? Second, is the restriction necessary and proportionate? Does it impose a minimal burden on the exercise of free expression while protecting the legitimate interest at hand? Third, is the restriction legitimate? Is it grounded in one of the narrow exceptions to free expression—protecting the rights or reputations of others or ensuring national security, public order, or public health interests?
Inciting violence is one exception recognized under international law as a legitimate reason to limit an individual’s free speech rights. To decide whether speech meets the threshold of violent incitement, experts identify six relevant factors: the social and political context, the speaker’s status, the speaker’s intent, the speech’s content and form, the speech’s extent or reach, and the likelihood that the speech will result in harm. These factors suggest that speakers with higher status and reach—such as political leaders with large followings—bear a greater responsibility to exercise care so that their words do not provoke violence or harm.
Did Trump’s Facebook Posts Incite Violence?
From the moment Trump began his presidential campaign, he exploited Facebook to bully, intimidate, and threaten critics and political opponents. Throughout his four-year term in office, he made ample use of dog whistles aimed at minority and vulnerable populations. He repeatedly amplified violent right-wing groups, for example by reposting inflammatory videos from anti-Muslim fanatics, retweeting white nationalist videos, and promoting conspiracy theorists. Before voting began in the 2020 general election, he refused to commit to the peaceful transfer of power. After the election concluded, he used Facebook to promote dangerous conspiracy theories claiming that Joe Biden’s win was illegitimate. Hours before a mob of his supporters ransacked the Capitol, he pressed his followers to “fight like hell” and to “walk down Pennsylvania Avenue” to “take back our country.” They followed his instructions, with terrible consequences.
While his supporters were storming the Capitol, Trump used Facebook to attack Vice President Mike Pence and further endanger Pence’s life: “Mike Pence didn’t have the courage to do what should have been done to protect our Country and our Constitution.” He exhorted his followers, who by then were running amok in the Capitol: “USA demands the truth!” In other words, he continued to cheer on the violence carried out by his supporters with full knowledge of the damage they were causing. Had Trump stopped posting before the Capitol was breached, or had he urged his followers to stop once the extent of their violence became apparent, one could plausibly argue that he intended to back away from his earlier incitement.
But he did not. The facts show the opposite: in the midst of the violence, Trump continued to use social media to spur even greater destruction.
Some critics argue that Trump’s words were not to be taken literally—that his utterance of the line “I know that everyone here will soon be marching over to the Capitol building to peacefully and patriotically make your voices heard” absolves him of culpability. Yet his own supporters say otherwise. As BuzzFeed News reports, court records filed in 175 criminal cases indicate that many of his supporters believed he was directing them to attack the Capitol. One follower said, “Trump just needs to fire the bat signal . . . deputize patriots . . . and then the pain comes.”
One of the questions posed by the Oversight Board for public comment is whether Facebook should assess the broader off-Facebook context when enforcing its community standards. In the Trump case, not only is off-Facebook context relevant, but it is nearly impossible to disentangle his online posts from his public statements. Even before January 6, Trump’s use of Facebook and Twitter—blended with his public utterances—fanned the flames of violence, risking public safety and endangering his political opponents. His online and off-line tirades encouraged hostility against officials like Michigan Governor Gretchen Whitmer, Dr. Anthony Fauci, Georgia Secretary of State Brad Raffensperger, and cybersecurity official Christopher Krebs. In a 2020 investigation, ABC News reporters identified at least fifty-four cases of violence or bigotry tied to Trump’s incendiary rhetoric.
Leading up to the events of January 6, Trump deliberately interwove online missives to his followers (“Big protest in D.C. on January 6th. Be there, will be wild!”) with public remarks on the Mall (“You don’t concede when there’s theft involved. Our country has had enough. We will not take it anymore”), followed by more online posts goading his supporters in the Capitol (“USA demands the truth”). The only way to get a comprehensive and accurate picture of Trump’s cumulative impact is to consider the full context of his speech—online and off-line. All the evidence points to a clear conclusion: Trump’s actions leading up to and taking place on January 6 played a primary role in inciting violence in the Capitol, resulting in the deaths of five individuals.
The Danger of Restoring Trump’s Account
The second big question is whether Trump’s pattern of behavior makes it likely that he will use future posts to encourage violence and undermine America’s democratic institutions. His actions in the weeks following the insurrection are telling. Far from showing remorse, he pushed his original legal team to argue in his Senate impeachment trial that the election was “stolen” and that the January 6 events were therefore justified. Trump has not learned any lessons about why such combustible rhetoric represents a reckless abuse of presidential authority. He remains a dangerous, destabilizing force in American politics. His inflammatory language, embrace of violent conspiracies, and cumulative falsehoods (estimated by the Washington Post to have totaled 30,573 false or misleading claims) have exacerbated societal tensions to the point where scholars warn of a new age of “political sectarianism” in the United States. By contrast, the consequences of Trump’s suspension have been unexpectedly positive. According to Zignal Labs, online misinformation about electoral fraud declined by 73 percent in the week following Trump’s ban, dropping from 2.5 million to 688,000 mentions across social media.
The damage that will result from restoring Trump’s bully pulpit is very real. The necessity of protecting Trump’s “voice” on Facebook must be balanced against the potential threats to public safety, not to mention the harm resulting from the silencing or suppression of people who dare to challenge his rhetoric or ideas. One recollects Trump’s Facebook response to Black Lives Matter protesters: “Any difficulty and we will assume control but, when the looting starts, the shooting starts.”
In short, Trump’s pattern of past behavior, his disdain for the harmful consequences of his actions, and the likelihood that he will foment more violence in future online postings are compelling reasons for the Oversight Board not to overturn his suspension and restore his user access.
We should also remember that even if Trump is prevented from returning to Facebook, this will not stop him from communicating publicly. A myriad of outlets, from Fox News to Newsmax, will continue to offer Trump an ample bullhorn. Denying him access to Facebook’s platform will not cut off his means of communication. Rather, it will make it just a little more difficult for him to activate his followers to commit violent and harmful acts.
A Double Standard for Political Leaders?
A final consideration for the Oversight Board relates to how Facebook should treat political leaders. Until recently, Facebook’s policy was to give political figures more leeway, justified under the newsworthiness exemption. There are several problems with this policy.
First, Facebook failed to enforce the very conditions it laid out for itself when it came to Trump. It claimed that the standard rested on weighing the public interest against the potential for harm. Yet it doesn’t appear that Facebook applied this balancing test in any meaningful way to Trump’s conduct. Despite numerous violations—such as stoking protests against state stay-at-home orders (“LIBERATE MICHIGAN!; LIBERATE MINNESOTA!; LIBERATE VIRGINIA, and save your great 2nd Amendment. It is under siege!”) or threatening violence against Black Lives Matter protesters—Facebook took no action against his account. Until the January 6 chaos forced Facebook’s hand, the company was unwilling to enforce its own rules.
Second, Facebook’s newsworthiness exemption is a slippery slope. As Trump has shown, politicians who lie directly to the public, unmediated by normal journalistic questioning and context, are exceedingly dangerous. Philippine journalist Maria Ressa—herself a target of government persecution—warns: “A lie told 1,000 times becomes the truth. If you want to rip the heart out of a democracy, you go after the facts.” Trump created a disinformation playbook that weaponizes Facebook, Twitter, and other platforms to strengthen his power and suppress his critics. The argument that Facebook should permit political figures to post content even if it violates its community standards creates a loophole that illiberal leaders are all too willing to exploit. In truth, it should be the opposite: public figures with a megaphone bear a greater responsibility to refrain from harmful speech. A speaker with elevated political status and a wide reach has a far greater capacity to incite violence and harm through their words. It is also worth noting that Trump is no longer a sitting head of state. Thus, any determination of whether his account should remain suspended ought to conform to the platform’s normal terms of use. The board should not use the public leader exception as an excuse to restore Trump’s Facebook account.
Ultimately, the board must consider the real-world consequences of its decision. Upholding Facebook’s ban may bring charges of censorship from certain corners. But overturning the suspension will potentially cost lives and lead to serious damage to U.S. democracy. The board must confront what giving Trump a pulpit from which to spread lies and stir up violence means for the future regulation of online speech and for democracy globally.
Board members who decide the case should strongly consider signing their names to the decision. The public has a right to know who is behind the ruling on such a critical issue. Let us hope that Trump’s pattern of past behavior, his disdain for the consequences of his actions, and the likelihood that he will incite further violence through his online postings will make a compelling case for the board to uphold his suspension.
This piece expands upon a public comment submitted by the author to the Oversight Board in response to case number 2021-001-FB-FBR on “whether Facebook’s decision to suspend President Trump’s accounts for an indefinite period complied with the company’s responsibilities to respect freedom of expression and human rights, if alternative measures should have been taken, and what measures should be taken for these accounts going forward.”