Citizens, governments, and tech platforms around the world increasingly struggle to counter influence operations.
We believe that little progress will be made without a spirit of partnership between governments, the tech industry, media, academia, and civil society. Such collaborations are challenging but necessary to accomplish the three aims that PCIO believes are vital: answering difficult policy problems related to influence operations; finding ways to measure the effects of adversarial influence operations; and developing methods to measure and evaluate countermeasures.
PCIO engages a community of multidisciplinary, cross-sector expertise on countering influence operations.
PCIO frames and answers difficult policy problems that require such a multidisciplinary, cross-sector community to address.
Influence campaigns have long targeted journalists, but a recent operation lays bare the Russians’ plan to exploit the media and sow disinformation in a complex information environment.
The EU needs a disinformation strategy that is adaptable and built to last.
EU officials must coordinate better to mount an effective collective response to disinformation campaigns and influence operations throughout Europe.
Disinformation is disrupting democracies. Yet responses across social media platforms lack formal coordination, and investments in counter-disinformation approaches are scarce.
Artificial intelligence (AI) is enabling new, more sophisticated forms of digital impersonation. The next big financial crime might involve deepfakes: video or audio clips that use AI to create false depictions of real people.
As the world continues to weather the coronavirus pandemic, reliable information from public health experts will continue to be a necessity. At the same time, these experts will still face headwinds in getting their message out to a weary or even disenchanted public.
Amid the coronavirus pandemic, Europe and the West are grappling with a host of thorny dilemmas posed by disinformation and foreign influence operations.
In 2018, Twitter released a large archive of tweets and media from Russian and Iranian troll farms. This archive of information operations has since been expanded to include activity originating from more than 15 countries, offering researchers unique insight into how influence operations unfold on the platform.
Bad actors could use deepfakes—synthetic video or audio—to commit a range of financial crimes. Here are ten feasible scenarios and what the financial sector should do to protect itself.
As fears rise over disinformation and influence operations, stakeholders from industry to policymakers need to better understand the effects of such activity. This demands increased research collaboration. What can tech companies learn from defense-academia partnerships to promote long-term, independent research on influence operations?
Carnegie’s Partnership for Countering Influence Operations (PCIO) is grateful for funding provided by the William and Flora Hewlett Foundation, Craig Newmark Philanthropies, Facebook, Twitter, and WhatsApp. PCIO is wholly and solely responsible for the contents of its products, written or otherwise. We welcome conversations with new donors. All donations are subject to Carnegie’s donor policy review. We do not give donors prior approval of drafts, influence over the selection of project participants, or any influence over the findings and recommendations of work they may support.