Influence operations are a complex threat, and the community combating them—academics, social platforms, think tanks, governments—is broad. The goal of the Partnership for Countering Influence Operations (PCIO) is to grow this community and equip it to fight influence operations worldwide.
Disinformation is disrupting democracies. Yet responses across social media platforms lack formal coordination, and investments in counter-disinformation approaches remain scarce.
Artificial intelligence (AI) is enabling new, more sophisticated forms of digital impersonation. The next big financial crime might involve deepfakes—video or audio clips that use AI to create false depictions of real people.
As the world continues to weather the coronavirus pandemic, reliable information from public health experts will continue to be a necessity. At the same time, these experts will still face headwinds in getting their message out to a weary or even disenchanted public.
Amid the coronavirus pandemic, Europe and the West are grappling with a host of thorny dilemmas posed by disinformation and foreign influence operations.
In 2018, Twitter released a large archive of tweets and media from Russian and Iranian troll farms. This archive of information operations has since been expanded to include activity originating from more than fifteen countries, offering researchers unique insight into how such operations unfold on the service.
Bad actors could use deepfakes—synthetic video or audio—to commit a range of financial crimes. Here are ten feasible scenarios and what the financial sector should do to protect itself.
As fears rise over disinformation and influence operations, stakeholders from industry to policymakers need to better understand the effects of such activity. This demands increased research collaboration. What can tech companies learn from defense-academia partnerships to promote long-term, independent research on influence operations?
A Filipino American journalist has been convicted of “cyber libel.” The troubling case should ring alarm bells in the West too.
Social media companies are better positioned than governments to meet the enforcement challenges posed by influence operations that aren't aligned with hostile states but still cause harm.
Training influential people in how to wield their influence, perhaps through a system of licensing, could raise digital literacy and establish grounds for de-platforming violators.