
One Strategy Democracies Should Use to Counter Disinformation

Authoritarian governments are leading the push at the UN to develop international norms. Democracies should deploy existing UN codes to provide alternatives.

Published on March 28, 2022

While democracies struggle to develop policies for countering influence operations within and outside their borders, authoritarian governments are taking the lead at the UN in framing international approaches to threats such as disinformation. Russia’s invasion of Ukraine last month has only heightened the urgency with which democracies must respond at this level.

Some critics might argue that the UN cannot influence the behavior of authoritarian governments or that seeking common ground with those states to articulate international norms is futile, yet not attempting to do so is tantamount to allowing authoritarians to dictate which norms will emerge. Democracies must quickly provide an alternative to authoritarian framings to ensure that human rights and individual freedoms are not abused by state powers. To achieve this, democracies must determine which principles should guide international approaches for countering influence operations and which tradeoffs they are willing to make to develop international norms.

Democracies Can No Longer Admire the Problem

How should democracies respond to disinformation? That’s the billion-dollar question. In mid-2021, the U.S. Department of Defense announced a vague, nearly ten-figure deal with a contractor “to counter misinformation from U.S. adversaries,” despite scant evidence that any strategy reliably counters such activities. Meanwhile, pressure on democracies to act keeps mounting. In late 2021, a group of authoritarian regimes co-sponsored a UN resolution on disinformation, which concluded by calling on the secretary general to solicit submissions on best practices for countering the phenomenon.

Democracies have been admiring the problem of disinformation for some time. Driven by a proliferation of case studies, democracies have tended to focus on examples of disinformation or the actors behind it. While digital platforms have increased their interventions and many governments have introduced national laws to counter hostile activities like disinformation, little is known about the efficacy of such laws. Most of these national approaches are focused on something specific and undesirable, like blocking coronavirus misinformation or deplatforming disinformation superspreaders. This emphasis on the undesirable weeds, so to speak, shouldn’t be surprising.

Perhaps policymakers need to pivot toward emphasizing what democracies want to see in a healthy information environment. Ethicist Simon Blackburn notes that the best way to frame a moral or political demand is as a question of the right to, or freedom from, something. What might some of those principles include? How could they lead to normative behaviors that would help counter misinformation? What would democratic governments, industry, and civil society (including academics) need to do to achieve them? Some answers might be found in existing UN codes and in efforts to envision and articulate a more desirable state for addressing the global problem of disinformation.

Freedom to Access Reliable Information

One principle for ensuring the integrity of the information environment could be that citizens have the freedom to access reliable information. The UN already marks the importance of access to information through a designated annual day. Yet, given declining levels of global trust in institutions and in sources of information, having democracies provide access to information alone is not enough. Ensuring that reliable information is widely available, while also rebuilding trust in a heavily polluted and contested information environment, is key.

Standard transparency and access-to-information regimes that inform citizens should already be in place in every democracy. Looking beyond those, democratic governments should introduce, at the national level, mandatory transparency reporting by companies that control crucial infrastructure within the information environment, such as social media platforms. This reporting would provide insights into how these companies operate, develop policies, and enforce their rules. To achieve a tangible effect globally, democratic governments should collaborate to apply these mechanisms more coherently across jurisdictions. At the same time, industry need not wait for government regulation: digital platforms should implement comprehensive transparency reporting, moving beyond self-selected and ad hoc disclosures and including independent auditing.

Both governments and industry should also support public education programs and media literacy campaigns that help people understand what it means to live in the Information Age, while also enhancing access to reliable information by funding independent journalism and fact checking. People have a right to be given the basic cognitive tools to navigate the information environment, lest they be left as vulnerable prey for manipulators. For its part, civil society needs to take a leading role in guiding governments and platforms on regulating industry, identifying the values driving interventions in the information environment and educating the wider public on how information is processed and distributed to audiences so as to rebuild trust in sources.

Freedom From Information Pollution, Including Disinformation

In a 2021 report called “Our Common Agenda,” UN Secretary General António Guterres identified twelve areas for action based on input from member states that specifically raised the need to address disinformation, stating, “We must make lying wrong again.”

Effectively countering disinformation is challenging, but some common first steps could help. For example, Guterres suggested that the UN explore “a global code of conduct that promotes integrity in public information,” a move toward setting normative behaviors. The odds of governments committing to never engaging in disinformation are extremely low, but headway could perhaps be made in more narrowly defined circumstances. For example, governments could commit to not targeting public goods and services, such as health and education, with disinformation. A precedent for such a move can be found in the 2014 U.S. commitment not to use vaccination programs as a means for intelligence gathering. Or states could commit to not spreading disinformation about UN agencies or international nongovernmental organizations, such as the International Red Cross.

Democracies also could come together to massively enrich understanding of disinformation and the information environment through strategic funding of a multistakeholder, multinational research center. Such an independent research institute could bridge the gap between policymakers, industry, and civil society, enabling more research that measures the effects of disinformation and the impact of interventions. A multistakeholder structure could also facilitate much-needed work on shared definitions, providing a center of gravity for conceptualizing aspects of the information environment and the threats within it in a manner that works for practical policymaking. Governments could also urge industry to financially support the center, share data, and collaborate with a wider community of researchers, working with civil society to develop data-sharing rules to govern the process. Only with a deeper understanding of how the information environment works can a more meaningful approach to countering disinformation be developed to support the principle of freedom from information pollution.

Freedom From Online Abuse

Article 20 of the UN International Covenant on Civil and Political Rights already prohibits “any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence.” A third principle could be that people should have the right to freedom from abuse, both physically and online.

In a 2015 report, the UN Educational, Scientific and Cultural Organization noted that a universal definition of hate speech is unlikely but encouraged governments to develop definitions, taking into account different viewpoints through a multistakeholder approach. And in a 2019 document, the UN called on itself to “support a new generation of digital citizens, empowered to recognize, reject and stand up to hate speech.” Beyond developing policies for addressing hate speech and supporting further research in this space, industry should invest in adapting its products to include better safety features, giving at-risk users more tools to control and improve their online experience. For example, Twitter has experimented with a safety mode designed to “automatically block abusive behavior,” with letting users remove followers, and with prompting users to rethink the nature of their own posts to curb abuse.

Both industry and government should work with civil society and communities that are targeted online—including journalists, activists, minorities, and women—to better inform regulation and tech solutions to curb abuse. Often, those working in civil society are targeted by abuse themselves and can provide first-hand insights into what would make their experience in the information environment better. 

Freedom From Covert External Manipulation

A 1981 UN declaration, adopted after several attempts to curtail the use of propaganda in wars of aggression, asserts that states and people have the right “to have free access to information and to develop fully, without interference, their system of information and mass media.” The declaration goes on to declare that states will not conduct “any defamatory campaign, vilification or hostile propaganda for the purpose of intervening or interfering in the internal affairs of other states.”

On the surface, freedom from external manipulation might seem straightforward. In practice, this principle requires that democracies have a frank conversation about the trade-offs between needing to promote democracy abroad and the desire to limit foreign interference at home. While policymakers in democracies can cite their values to justify democracy promotion, the same argument rings hollow for the authoritarian regimes often targeted with such campaigns. In the ensuing deployment of offensive influence operations by all sides, it is ultimately democratic values and the people they aim to empower and protect who will suffer the most.

Coming to a détente on this escalating issue will entail articulating clear lines about what are and are not acceptable levels of influence, both for domestic purposes and in international relations. These lines could revolve around transparency of origin in persuasive communications: unacceptable influence operations would be those that seek to hide or misrepresent the source of information. Drawing on the two previous principles, engaging in information pollution and online abuse would also cross the line.

Democracies should engage and support civil society, including those in academia and NGOs, to help determine these normative behaviors. Such a multistakeholder process should entail having an open discussion on the role of influence in society, which actors can legitimately engage in public debate, and what line marks the point beyond which the agency of democratic citizens has been so eroded that they are no longer making free and informed decisions—a bedrock of legitimacy for democracies.

Ultimately, as communication technologies continue to evolve, the choice democracies will face is either to adopt greater controls over their information spaces, becoming more like the authoritarians they oppose, or to offer the world a new vision for democratic renewal built on greater information integrity as a global public good. Failing to articulate such principles hands authoritarian states an advantage, allowing them to offer the only model for other states, including both flawed and full democracies, to follow. This could be a pathway to a splintering of the global information environment, as the world is already seeing in many of Russia’s responses to government and industry efforts to curtail disinformation originating from Russia amid the Kremlin’s invasion of Ukraine. In the long term, if authoritarian states continue to block platforms mostly headquartered in the West from operating in their countries, Chinese and Russian firms will acquire a significant advantage in controlling communications to those audiences.

Democratic states must act now to help guide the UN on best practices for countering disinformation, soliciting additional proposals from civil society more broadly along the way, and ultimately articulating guiding principles for how the information environment ought to be governed.

Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.