Katherine Charlet, Danielle Citron
It is only a matter of time before maliciously manipulated or fabricated content surfaces of a major presidential candidate in 2020. Here is what every campaign needs to do in advance of a deepfake emergency.
It is only a matter of time before maliciously manipulated or fabricated content surfaces of a major presidential candidate in 2020. The video manipulation of House Speaker Nancy Pelosi in May demonstrates the speed with which even a “cheap fake” can spread. But the technology is quickly getting more sophisticated, and we must prepare for “deepfakes”—fully synthesized audio or video of someone saying or doing something they did not say or do. Soon (meaning months, not years), it may be impossible to tell real videos from fake ones. The truth will have a tough time emerging in a deepfake-ridden marketplace of ideas.
Doctored media, typically in the form of short videos or audio clips, could be used to embarrass, defame, or otherwise damage candidates for office. Recent advances in artificial intelligence have increased the realism of deepfakes and substantially cut the resources necessary to create them. On August 9, a deepfake of Democratic National Committee Chair Tom Perez was presented to a conference room of hackers, who largely failed to realize that anything was amiss.
The key is in the timing. Imagine that the night before an election, a deepfake is posted showing a candidate making controversial remarks. The deepfake could tip the outcome and undermine people’s faith in elections. This is not hypothetical. In the past six months, manipulated media has targeted a senior Malaysian minister, Donald Trump, and others.
It does not matter that digital fakery can, for now, be detected fairly easily. People have a visceral reaction to video and audio. They believe what their eyes and ears are telling them, even when all signs suggest the content is fake. If the content is provocative, it will surely go viral. Studies show that people are far more likely to spread false stories than accurate ones because fakery evokes a stronger emotional reaction. So no matter how unbelievable deepfakes are, the damage will still be real.
Even if a deepfake appears weeks before an election, it can spread far and wide. Thus, campaigns have to act immediately to combat the spread and influence of deepfakes.
Every campaign needs a plan in place before a deepfake emergency strikes.
And what should a campaign do once a deepfake has been released? Though it is impossible to predict exactly what steps will be necessary, the campaign will need to first assess the situation, then counter the falsehood, and finally repair the damage and guard against future attacks.
First, campaigns have to assess the potential damage. How harmful is the digital impersonation and how fast is it spreading? A fake video of a candidate saying she prefers Coke to Pepsi is no big deal, but one where the candidate falsely appears saying or doing something despicable could endanger the candidacy and the democratic process. Digital impersonations undermine people’s ability to make informed choices about candidates for office. Voters would be misled.
Countering the video will require quick action. Social media platforms should remove, block, demonetize, or reduce the visibility of digital impersonations and shut down any bots spreading them. Campaigns should be ready to issue statements, post authentic content, and share other evidence that counters the false narrative.
Repairing and preventing future damage means tackling the political impact of the video, especially if it lingers in key voter groups or demographics. Campaigns should go to those groups to conduct dedicated outreach dispelling the falsehood. They should take stock of—and share—the lessons learned for, sadly, the next attack.
Disruptive digital impersonations are coming, whether via hostile state actors or individuals. Every campaign should start preparing now.
Special thanks to Miles R. McCain for his contributions to this article.
Katherine Charlet
Former Director, Technology and International Affairs Program
Katherine Charlet was the inaugural director of Carnegie’s Technology and International Affairs Program.
Danielle Citron
Danielle Citron is vice president of the Cyber Civil Rights Initiative and a professor of law at Boston University School of Law, where she teaches and writes about privacy, free speech, and civil procedure.
Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.