Survey on Countering Influence Operations Highlights Steep Challenges, Great Opportunities

To achieve broad and enduring benefits, disparate groups working to combat influence operations need to build a more unified, professional field.

Published on December 7, 2020

While influence operations are not a new phenomenon, their scale has grown dramatically in recent years. As a result, what was once a niche topic has become front-page news. This surge in interest has led to a corresponding surge in the number of academics, NGOs, journalists, and other actors attempting to track, analyze, fact-check, or counter the narratives and spread of these operations. Yet there has been little assessment of this emerging field of study—how well it functions at a systemic level and what common challenges exist. This type of meta-analysis is critical for helping donors, government, and industry leaders target resources toward initiatives with broad and enduring benefit. It also helps individual researchers and practitioners understand their place in the larger community so they can set agendas that break new ground and avoid known pitfalls.

With this in mind, in early 2020, Carnegie’s Partnership for Countering Influence Operations (PCIO) embarked on a project to survey individuals engaged in researching or countering influence operations. The objectives were to broaden knowledge of the organizations working in this field and the types of work being undertaken, as well as to deepen understanding of their work, the issues that matter most to them, and their common challenges.

Two surveys were conducted: one open to the whole community and one targeted to community leaders. Though the audiences differed, similar themes emerged in both, and they offer valuable insights into a handful of systemic challenges, including a lack of funding, poor collaboration, limited access to data, and the absence of standardized definitions and research practices. While these challenges will be difficult to overcome, a close analysis of the survey results reveals numerous opportunities to make significant headway.

Surveying the Community and Its Leaders

The PCIO first conducted a general community survey that was open to anyone who self-identified as working in areas related to researching or countering influence operations. Implemented between January and March 2020, the survey was publicly advertised on Twitter and via direct email invitations to PCIO partners and contacts in the field.1 Fifty-three individuals responded, representing academia, civil society, government, the private sector, and the media.2 This survey asked eight questions about the terminology used to describe influence operations, the challenges encountered during research and response efforts, the issues not being adequately addressed, and possible solutions. It also asked respondents to identify influential leaders in the field so the PCIO could follow up with a more targeted survey.

Seventeen community leaders completed this second survey, conducted between July and August 2020. These individuals were either repeatedly mentioned in responses to the first survey or appeared prominently in the PCIO’s analysis of the field’s network.3 As leaders, these respondents have more power to influence expert discourse, governmental policy, or tech platform policy. The survey asked fourteen open-ended questions about the size and work of the respondents’ organizations, their collaboration efforts with partners and donors, their challenges, important trends and overlooked aspects of countering influence operations, and key and emerging players in the field.4

Difficulties Facing a Nascent Field

Results of the two surveys portray a field in its infancy, facing many of the challenges one might expect when processes, institutions, and relationships are still in development. Undefined terms, disparate practices, and a lack of collaboration have created a multitude of challenges. While funding has increased in recent years, organizations still perceive intense competition for resources to support research and activities to counter influence operations. Some challenges stem from a lack of trust, particularly across academia, industry, government, and civil society.

In addition to naming their own challenges, respondents in the community survey were asked to rank six preselected, frequently cited challenges in order of importance.

Respondents in the targeted survey emphasized many of the same challenges, indicating that the obstacles to improving efforts in the field are indeed largely structural and systemic.

Limited Data Access

In the community survey, 59 percent of respondents cited lack of access to corporate or government data as the most or second-most important challenge, making data access the most frequently cited concern. The lack of access to data was also cited as a concern by five respondents in the targeted survey. Both groups of respondents cited increased access to corporate data as a greater need than access to government data.

Two academics in the targeted survey were particularly concerned about the impact of limited data access on their research.5 One argued that, without sufficient data, it’s impossible to determine the extent of disinformation or hate speech problems and to craft appropriate policy responses. A third respondent agreed that access to data was a significant problem but emphasized the need for higher-quality data rather than larger quantities of data. Under a separate initiative, the PCIO is researching how to better measure the effects of influence operations and their countermeasures, which may help identify the type of data needed.

One community leader was concerned that the EU’s General Data Protection Regulation (GDPR) is a barrier to data sharing. He stated that the EU has refused to immunize platforms from compliance risk, creating a disincentive for platforms to share their data (though the EU does not acknowledge this problem).

Lack of Funding

Almost 40 percent of respondents in the community survey cited a lack of funding as the most or second-most important challenge. In the targeted survey, six respondents also emphasized the precariousness of funding. The PCIO’s research has found that many organizations working to research or counter influence operations are nonprofits that rely heavily on donations and grants. Given the growth of new initiatives in this field and the increasing competition for the same pool of resources, it is not surprising that funding is a concern.

One community leader was concerned that long-term funding insecurities are having a detrimental impact on output; without funding security, decisions about recruitment and training cannot be made. Another leader was concerned that financial challenges might incentivize research organizations to compromise their independence or methodological rigor in an effort to attract resources and attention. A third respondent suggested that donors want reassurances of results in return for their money, but in a field with so many unknowns, it can be hard to promise results. The respondent had the impression that some donors are more drawn to headline-grabbing claims of successful interference in political campaigns or persuasive conspiracy theories and that this leaves fewer resources for research on under-the-radar issues and long-term investigations that have no guarantee of success. Another respondent noted that investigations tend to be skewed toward retrospective analysis of known influence operations instead of prospective experiments. The latter approach is more expensive and uncertain but could reveal deeper insights.

Tech platforms have significant resources available to fund research, but this can present conflicts for researchers. One community leader was concerned that accepting money from tech companies could open organizations to criticism regarding the independence of their research. He suggested the establishment of a central fund for platform money and other donations, which could then be distributed through an independent process.

Looking ahead, some respondents suggested that the funding landscape could change dramatically depending on the outcome of the 2020 U.S. presidential election (unknown at the time of the survey).6 They feared that if President Donald Trump were to lose, some of the largest private donors to the field might conclude that influence operations were no longer a major problem and that it would be safe to prioritize other areas for their donations. Another respondent suggested that, if funding did decline, the competition for funder interest could further intensify, potentially contributing to a decline in research standards.7

Poor Cross-Sector Collaboration

More than one-third of respondents in the community survey cited a lack of cross-sector collaboration in the field as an important challenge. The concern was also raised by five respondents in the targeted survey. For most respondents, cross-sector collaboration means cooperative efforts that span academia, industry (platforms or cybersecurity companies), governments, and civil society. Some respondents also used the term to describe collaboration between different academic fields, such as the social sciences and technology.

Several respondents suggested that challenges in accessing data are rooted in cross-sector collaboration problems, as social media companies refuse to provide sufficient access and the U.S. government seems uninterested in countering influence operations. One community leader described mutual mistrust between social media companies and researchers. He argued that platforms need to trust researchers in order to release data for analysis, and researchers need to trust the data provided by the platforms in order to rely on it in their work.8 He also argued that researchers and academics have a limited understanding of social media platforms’ underlying technology because relatively few researchers have experience serving as senior leaders at social media companies. This makes it difficult for the two sectors to collaborate, as there are disconnects between what is realistic for companies to deliver and what the research community wants or needs.

Within academia, some respondents expressed concern that researchers are focusing more on competition than collaboration and that many in the community lack awareness of who else is working on the same issues. One academic was concerned that researchers are not well connected to others in complementary fields of research, that universities lack departments that focus on information and influence studies, and that government departments struggle to synchronize and coordinate responses.

Another academic was concerned that governments do not seek research to develop their understanding of influence operations or inform their policies.9 And, in fact, two respondents who work in European governments echoed this perspective. One called for academics and researchers to increase cooperation with practitioners working to counter influence operations. The other cited mistrust as a barrier to effective collaboration and communication. This sentiment was echoed by a respondent working in advocacy who cited a lack of trust and misaligned incentives across industry, civil society, academia, and government.

Collaboration is not entirely absent, however. Respondents in the targeted survey—who are among the most connected in the field—all said that they or their organizations partner or collaborate with others, either formally or informally. Formal collaborations include co-hosting events, co-authoring research papers, running training programs, partnering with industry, and submitting joint applications for funding. Informal collaborations often occur between personal contacts, which highlights the importance of having a good network. Some respondents in the targeted survey nonetheless recognized that collaborations based on personal contacts are inadequate substitutes for systematic and institutionalized partnerships. One respondent recommended the creation of an inventory of the capabilities and interests of different organizations to help encourage partnerships; otherwise, it could be difficult for organizations to assess partnership requests from initiatives previously unknown to them.10

An obvious problem associated with the lack of cooperation is a duplication of effort. The less interconnectivity and mutual awareness that exists in the field, the greater the chance that more than one initiative is duplicating the work of another. Other research by the PCIO has found evidence of this problem. A recently published literature review of policy proposals suggests that analysts frequently repeat recommendations already made by others yet often fail to cite these prior, similar proposals. Unsurprisingly, about half of the policy recommendations call for further collaboration.

Respondents offered different perspectives on how organizations could improve cooperation.11 One respondent in the targeted survey suggested that there should be an intermediary organization, operating between platforms and researchers, to facilitate cooperation.12 Others proposed the establishment of organizations that work to pool and share information, coordinate efforts, or identify synergies between initiatives.13 Another respondent wants the field to better integrate a diversity of skill sets, so that social science–oriented research draws more heavily on STEM techniques and vice versa.14

Improving collaboration is likely to be a difficult and long-term process due to the time required to develop high levels of mutual trust and awareness. It is no surprise that an emerging field such as countering influence operations has yet to achieve these levels. Among other steps to facilitate the process, funders might want to consider creating financial incentives for collaboration. The PCIO has been researching how cross-sector collaboration was successfully fostered in comparable fields and how similar practices could be adapted and applied to influence operations.15

Different Terminology

One-third of respondents in the community survey cited the lack of agreed-upon definitions as the most or second-most important challenge in the field. The community survey asked respondents to specify which of eight terms they used, if any, to describe problems associated with using information to influence target audiences.

In total, including under the “other” category, respondents used thirty-one different terms to describe concepts or techniques related to influence operations. Some of the “other” terms include psychological operations, strategic communication, and political warfare.

The PCIO has documented this profusion of terms in its other research. The aforementioned review of eighty-four policy proposals for countering influence operations has so far identified at least nineteen terms. Keyword searches indicate that different communities of authors tend to coalesce around different terms. For example, “information operations” and “hybrid warfare” are frequently used by the military, whereas “propaganda” is more often used in research related to political communications.
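
To make the idea of such a keyword search concrete, the sketch below counts case-insensitive occurrences of a handful of candidate terms across a set of document texts. It is only an illustration of the general technique, not the PCIO’s actual tooling, and the term list and sample documents are hypothetical.

```python
# Minimal keyword-frequency sketch (hypothetical terms and sample texts),
# assuming the policy proposals are available as plain-text strings.
import re
from collections import Counter

CANDIDATE_TERMS = [
    "influence operations",
    "information operations",
    "disinformation",
    "hybrid warfare",
    "propaganda",
]


def count_term_mentions(documents):
    """Return a Counter of case-insensitive term occurrences across all documents."""
    counts = Counter()
    for text in documents:
        lowered = text.lower()
        for term in CANDIDATE_TERMS:
            counts[term] += len(re.findall(re.escape(term), lowered))
    return counts


if __name__ == "__main__":
    sample_documents = [
        "This proposal recommends platform transparency to counter disinformation.",
        "Hybrid warfare doctrine treats information operations as a core capability.",
    ]
    for term, n in count_term_mentions(sample_documents).most_common():
        print(f"{term}: {n}")
```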

The variety of terms creates potential for misunderstanding and confusion, particularly when terms are not defined. The same phenomena can be described in different ways, and different phenomena can be described in the same way. For example, one community survey respondent said they use “disinformation” as an umbrella term to cover misinformation, disinformation, fake news, propaganda, information operations, and influence operations—a definition that notably diverges from how many others use this term. If practitioners in the field use terms with different meanings interchangeably, then the general public will struggle even more to grasp this body of knowledge.

Community leaders also highlighted the subjectivity involved when describing influence operations. One respondent suggested that the same activity can be labeled “disinformation” by those who disagree with it but be called “strategic communication” by those who support it. Another community leader stated that “good” influence operations exist—citing those by Voice of America as an example—but that there is a need to define “good” and “bad.” One community survey respondent stated that cognitive bias leads people to label anything they don’t agree with as an influence operation.

The confusing array of terms and lack of standard definitions also affects meetings and conferences on influence operations. Many discussions must begin with clarifying the terms of discourse, which can be a distracting impediment to developing policy solutions.

Absence of Common Research Standards

One-third of respondents in the community survey cited a lack of common research standards as an important concern. Respondents to the targeted survey also stressed the need to pool best practices and trade ideas. They warned that siloed information may protect the short-term interests of individuals but can impede the goals of the wider community.

It can be hard to assess the quality of research across the field when researchers develop different practices and there is no common rubric available to judge them. Compounding the problem, researchers don’t always clearly or transparently explain their practices. For example, many investigations into online influence operations conclude by attributing the operation to a specific culprit, but the confidence level for this judgment is not always clearly articulated and assumptions or gaps may be left unstated.
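
As a purely hypothetical sketch of what a more transparent attribution write-up could capture, the structure below records the attributed actor alongside an explicit confidence level, stated assumptions, and known gaps. The field names and three-level confidence scale are illustrative assumptions, not an existing community standard.

```python
# Hypothetical structure for reporting an attribution judgment transparently;
# the fields and confidence scale are illustrative, not a community standard.
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class Confidence(Enum):
    LOW = "low"
    MODERATE = "moderate"
    HIGH = "high"


@dataclass
class AttributionRecord:
    operation_name: str
    attributed_actor: str
    confidence: Confidence
    key_evidence: List[str] = field(default_factory=list)
    stated_assumptions: List[str] = field(default_factory=list)
    known_gaps: List[str] = field(default_factory=list)

    def summary(self) -> str:
        """One-line summary that always surfaces the confidence level and gaps."""
        return (
            f"{self.operation_name}: attributed to {self.attributed_actor} "
            f"({self.confidence.value} confidence, {len(self.known_gaps)} known gaps)"
        )


if __name__ == "__main__":
    record = AttributionRecord(
        operation_name="Example coordinated network",
        attributed_actor="unidentified commercial actor",
        confidence=Confidence.MODERATE,
        key_evidence=["shared infrastructure", "coordinated posting times"],
        stated_assumptions=["platform-provided account metadata is accurate"],
        known_gaps=["no off-platform corroboration"],
    )
    print(record.summary())
```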

Relatedly, four respondents in the targeted survey and six respondents in the community survey highlighted a lack of methods to measure the effects of influence operations. Two respondents in the targeted survey specifically cited the challenges of assessing both the harms caused by an influence operation and the effectiveness of countermeasures.16 Such assessments are particularly difficult in the absence of standard research methods for doing so.

Survey respondents were aware of efforts to establish best practices but lacked confidence that these efforts had so far produced results. Two community leaders suggested that agreeing on standards would eventually help to professionalize the field. In October 2020, the PCIO launched the Influence Operations Researchers’ Guild to address the lack of common standards and share best practices across the community. The PCIO has also recently launched a program to address current weaknesses in how the effects of influence operations are measured.

Narrow Research Remits

Community leaders worried that current research is often too narrow and repetitive. For example, one respondent said there is an abundance of research into Twitter bots and fake Facebook accounts but much less focus on how influence operations play out across platforms.

Respondents in both surveys worried that researchers seem more drawn to describing influence operations—what platforms they use and what methods they employ—than answering some of the underlying questions: What are the motives behind a disinformation campaign? Why do people engage with problematic content? What does an effective response look like? What legal loopholes do malicious actors regularly exploit?

Other areas that community leaders said are overlooked or underresourced include how governments use influence operations on their own citizens, the role of media in these campaigns, influence operations in Africa,17 and non–English language influence operations. Many of these gaps relate at least partly to the lack of data access and are further compounded by a lack of experienced and interested researchers with the requisite cultural knowledge and by difficulties in attracting funder attention.

Other Challenges

Five respondents in the community survey expressed a desire to invest more in studying domestic influence operations instead of concentrating mostly on foreign operations. The current imbalance may stem in part from researchers’ pessimism that studies of domestic operations could help produce significant real-world change. One respondent in the targeted survey was concerned about the lack of institutions responsible for taking action against domestic influence actors. Another suggested that less can be done to combat influence operations perpetrated by a government or leader against the domestic populace. While tech platforms have carried out takedowns in places such as Spain and the United States, one community leader believed that these cases likely represent only a fraction of the phenomenon worldwide.

Lack of diversity in the field was raised in both the community and targeted surveys. Respondents believed that a more demographically diverse field of researchers with more diverse skill sets could identify new questions or areas for research.18 One community leader reported that it is largely the same group of people who repeatedly appear in literature and dialogue about influence operations. The general community survey also highlighted a desire for more research on the gendered aspects of influence operations and the impact of operations on minority communities.

Community survey respondents also recommended conducting more research on the long-term aspects of influence operations. They want to better understand the enduring effects of influence operations on individuals and culture and how influence operations evolve over time.

Conclusion

There are no quick fixes for addressing these challenges. However, the survey results do suggest that a number of tangible actions could build a framework for improvements. First, there should be a dedicated effort to establish agreed-upon standard definitions for terms that are used across the field.

Next, organizations should work to share best practices in the research, investigation, attribution, and reporting of influence operations and the measurement of their effects. The PCIO’s Influence Operations Researchers’ Guild aims to build consensus on best practices, and the project is also beginning to investigate ways to research, measure, and report on effects. Demystifying some of the tradecraft could also reduce barriers to entry and thereby improve diversity in the field. A more diverse group of community members could ask novel and important research questions.

Drawing on lessons from other fields, organizations should also establish new institutional processes to secure long-term funding and build regular channels for sharing data. For example, the U.S. Defense Department has overcome many of the same challenges that platforms now face—such as protecting data and maintaining the credibility of research—by using financial, contractual, and reputational mechanisms to engage external researchers.19 The PCIO is helping to explore how these mechanisms might be adapted for research efforts on influence operations.

Though the challenges are daunting, there are ample opportunities to develop a stronger structure for researching and countering influence operations. Now is the time to move from a loose confederation of disparate individuals and organizations toward a more unified and professionalized field. Addressing the aforementioned gaps will create solid foundations on which to build a robust response to the common enemy—disruptive and damaging influence operations.

Notes

1 This is not a scientific survey. The number of respondents was relatively low compared to the large number of people working in this field. At the time of the survey, the PCIO and its associated Twitter account were less than a year old, potentially limiting the reach of the invitation or discouraging some respondents from participating. Also, because the survey did not have a geographical focus, the respondents were not globally representative; most came from North and South America, Europe, and Australia, and none were from Africa or Asia. Lastly, the survey was written and promoted in English, which may have excluded non-English speakers from participation.

2 The fifty-three respondents all self-identified as working in areas related to researching or countering influence operations; they came from academia (nineteen), advocacy or civil society (eighteen), governmental or intergovernmental organizations (eight), industry (five), and the media (three).

3 The seventeen respondents came from initiatives focused on policy, research, or analysis (nine); academia (five); the media (one); donor organizations (one); and intergovernmental organizations (one).

4 Respondents in the targeted survey were asked whether their answers could be attributed to them or their organizations. Seven respondents wished to remain anonymous; therefore, some statements in this article are not attributed or only one person is named.

5 Adam Joinson, Professor of Information Systems at the University of Bath.

6 Emerson Brooking, Resident Fellow at the Atlantic Council’s Digital Forensic Research Lab.

7 Emerson Brooking, Resident Fellow at the Atlantic Council’s Digital Forensic Research Lab.

8 Also see Jacob Shapiro, Michelle Nedashkovskaya, and Jan Oledan, “Collaborative Models for Understanding Influence Operations: Lessons From Defense Research,” Partnership for Countering Influence Operations, Carnegie Endowment for International Peace, June 25, 2020, https://carnegieendowment.org/2020/06/25/collaborative-models-for-understanding-influence-operations-lessons-from-defense-research-pub-82150.

9 This respondent’s expectation was unclear—for example, whether he expected governments to actively engage with the academic community in response to specific issues or whether he expected governments to directly sponsor more academics to focus on the subject.

10 Emerson Brooking, Resident Fellow at the Atlantic Council’s Digital Forensic Research Lab.

11 Also see Shapiro, Nedashkovskaya, and Oledan, “Collaborative Models for Understanding Influence Operations.”

12 Keir Giles, Senior Consulting Fellow at the Russia and Eurasia Programme at Chatham House.

13 Ibid.

14 Adam Joinson, Professor of Information Systems at the University of Bath.

15 See, for example, Shapiro, Nedashkovskaya, and Oledan, “Collaborative Models for Understanding Influence Operations.”

16 Adam Joinson, Professor of Information Systems at the University of Bath.

17 Emerson Brooking, Resident Fellow at the Atlantic Council’s Digital Forensic Research Lab.

18 Geysha González, Senior Director for Programs and Strategy at the Center for European Policy Analysis.

19 Shapiro, Nedashkovskaya, and Oledan, “Collaborative Models for Understanding Influence Operations.”