Citizens, governments, and tech platforms around the world increasingly struggle to counter influence operations.
We believe that little progress will be made without a spirit of partnership between governments, the tech industry, media, academia, and civil society. Such collaborations are challenging but necessary to accomplish the three aims that PCIO believes are vital: answering difficult policy problems related to influence operations; finding ways to measure the effects of adversarial influence operations; and developing methods to measure and evaluate countermeasures.
Engaging a community of multidisciplinary, cross-sector expertise on countering influence operations.
Framing and answering difficult policy problems that require a multidisciplinary, cross-sector community to address.
As the world continues to weather the coronavirus pandemic, reliable information from public health experts will continue to be a necessity. At the same time, these experts will still face headwinds in getting their message out to a weary or even disenchanted public.
In 2018, Twitter released a large archive of tweets and media from Russian and Iranian troll farms. This archive of information operations has since been expanded to include activity originating from more than 15 countries and offers researchers unique insight into how influence operations unfold on the service.
As fears rise over disinformation and influence operations, stakeholders from industry to policymakers need to better understand the effects of such activity. This demands increased research collaboration. What can tech companies learn from defense-academia partnerships to promote long-term, independent research on influence operations?
A Filipino American journalist has been convicted of “cyber libel.” The troubling case should ring alarm bells in the West too.
Social media companies are better positioned than governments to meet the enforcement challenges posed by influence operations that aren't aligned with hostile states but still cause harm.
Training influential people in how to wield that influence, perhaps through a licensing system, could raise digital literacy and establish grounds for deplatforming violators.
The world’s influence operators are exploiting fear and uncertainty around the coronavirus. It will take discipline and discernment to dodge their traps.
The EU Code of Practice on Disinformation was an important experiment that has now come to an end. But what should follow? Without a renewed focus on stakeholder engagement, efforts could stall, putting everyone at risk of disinformation attacks.
The 2020 U.S. presidential election is playing out in the shadow of disinformation, but few candidates are promising to take action against it.
While media coverage increasingly focuses on how information is used to influence target audiences, a common terminology for describing these activities is lacking.
Carnegie’s Partnership for Countering Influence Operations (PCIO) is grateful for funding provided by the William and Flora Hewlett Foundation, Craig Newmark Philanthropies, Facebook, Twitter, and WhatsApp. PCIO is wholly and solely responsible for the contents of its products, written or otherwise. We welcome conversations with new donors. All donations are subject to Carnegie’s donor policy review. We do not allow donors prior approval of drafts, influence on selection of project participants, or any influence over the findings and recommendations of work they may support.