
Commentary

On Disinformation

Even when evidence reveals the work of an influence operation, its effects are hard to measure and harder still to counteract.

By Alicia Wanless
Published on Sep 9, 2020

Influence operations are a source of growing public concern, yet it remains difficult to evaluate their impact with any precision. Current understanding has been shaped largely by data from social media companies and from the operatives themselves, both of which can exaggerate efficacy. Those behind a campaign have an understandable interest in claiming effectiveness, while social media companies often share statistics such as reach or engagement that do not directly reflect the effect on the audience.

According to the confessions of Andrés Sepúlveda, a now-imprisoned Colombian influence operative, campaigns across Latin America used fake social media accounts to create the illusion of popular support or dissent, installed spyware to gain blackmail material, defaced websites, and spread disinformation. Sepúlveda claimed that his team was paid through a chain of intermediaries to work on presidential elections in Colombia, Costa Rica, El Salvador, Guatemala, Honduras, Mexico, Nicaragua, Panama, and Venezuela. Their goal was to help the client candidate win by sowing discord and discrediting opponents. His measurements of success, however, included the rate at which rumors he planted were picked up and spread. Given the many factors that determine how people decide to cast their votes, this metric does not say much about actual influence.

More recently, following the 2016 U.S. elections and the UK’s Brexit referendum, former staff of the disgraced firm Cambridge Analytica insisted that their data-harvesting efforts swayed votes but have yet to provide measurements to back up their claims.

To understand and counter influence operations, analysts must uncover who is behind a campaign and identify their motives and goals. Only once they establish the desired effects can they attempt to examine whether or how they were achieved. In most instances, much of this information is not available. Nor is it easy to isolate the activities of a single campaign to determine what was the sole or greatest factor that influenced an audience’s behavior. Evidence of activity is not the same as proof of effect.

It may also turn out that focusing on single campaigns is not the best way to analyze the effects of such activity. Instead, stepping back to take a more systemic view of the information environment and of patterns in user engagement over time could prove more revealing. Regardless of approach, assessing the effects of influence activity requires a baseline to measure against, a rationale to connect an action to a specific change, and a means for tracking that change within an audience.

To make progress on these questions and begin translating the answers into policy, many stakeholders will have to cooperate. The burgeoning research community focused on influence operations can gather data on how and by whom such operations are conducted and help to standardize terminology and metrics of efficacy. This will only be possible if social media platforms provide data (with appropriate privacy and proprietary safeguards). Collaboration across academic disciplines—from computer science to psychology and linguistics—will add essential insights. For their part, governments and civil society organizations should provide parameters for acceptable measures to counter influence operations, which platforms would be asked or ordered to implement.

Understanding the complexity and magnitude of influence operations may seem like a daunting goal. But the stakes are too high not to try. The alternative will be further waves of alarm that prompt uninformed responses and leave every corner of society at risk.

Alicia Wanless
Senior Fellow, Technology and International Affairs; Director, Information Environment Project

Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.
