Global Perspectives on Influence Operations Investigations: Shared Challenges, Unequal Resources

Despite the shared nature of these challenges, investigators in the Global South face the greatest shortfalls of capacity, funding, attention, and other support. And those in unstable or authoritarian countries face unique threats to their safety and freedom.

Published on February 9, 2022

Most of the established research on influence operations comes from a loose international network of investigators working to expose campaigns, identify their origins, and assess their impact. Within this network, a small subset has the largest megaphone: Western investigators protecting Western publics against the operations of a few non-Western adversarial states. This narrowness neglects the vast and varied influence operations happening elsewhere in the world. As a result, Western governments, platforms, and funders lack the knowledge they need to craft globally relevant policy or offer targeted assistance. Additionally, stakeholders in all countries are missing opportunities to learn from their counterparts in other regions. A more inclusive, transnational conversation could help investigators spot global trends in influence operations, give early warnings of new influence tactics or narratives before they spread further, and disseminate best practices.

To better understand the global landscape for influence operations investigations, the Partnership for Countering Influence Operations (PCIO) convened private, small group discussions during summer 2021. Each discussion involved experts on, and from, a different region: the Asia-Pacific, Africa, the Middle East, Europe, North America, South America, the Caucasus, and South Asia. In total, these discussions involved more than seventy participants, including representatives from civil society, academia, social media companies, media organizations, the donor community, and other sectors.1

Participants from multiple regions converged on six key themes:

  1. This field needs more investment in capacity building, especially for investigations focused on the Global South and on minority populations worldwide. In some countries, this may mean training new investigators. In others, the primary obstacles are overreliance on project-based funding with high administrative burdens or the inability to gain the attention of platform staff when reporting suspicious activity or requesting technical assistance.
  2. While Russian interference in the 2016 U.S. election continues to shape common understandings of influence operations, many investigators—especially those in South America, Africa, and Europe—believe the most common perpetrators are domestic actors drawing from a much more diverse operational playbook. Participants called for more financial support for actor-agnostic research.
  3. Lack of access to social media data held by platforms continues to stymie researchers and investigators. Existing arrangements should be expanded to include more consistent access to data from multiple platforms. Critically, U.S. and European platforms and policymakers must consider how proposed reforms in these jurisdictions might affect researchers in other regions.
  4. Too many investigations are limited to a single platform or a single country, even though disinformation narratives and influence operations move fluidly across platforms and borders. Improved methodologies and increased support for transregional collaboration could help.
  5. Private messaging applications such as WhatsApp, Telegram, and Signal are increasingly common channels for influence operations, even outside of the Global South’s “mobile-first” societies where this issue first gained notoriety. Researchers have limited means of studying the spread and effects of influence operations on these platforms and face ethical questions about studying private communications. The community should continue to develop best practices and methodological options for this context.
  6. While for-profit actors have long been involved in influence operations, their business models and approaches are becoming more varied and sophisticated. Their role in executing and supporting operations deserves more attention from policymakers, technology companies, and investigative journalists.

Capacity and Funding Remain Inadequate

Many participants were concerned about capacity and sustainability in the professional field of influence operations investigations. This was especially true of participants outside the United States and Europe, who frequently noted that few individuals possess the combination of regional, contextual expertise and investigatory skills necessary to expose online influence operations in their regions. Those who do possess such skills face significant hurdles including, in some contexts, threats to their physical safety and security.

Participants from multiple regions called for investment in a deeper pool of professional investigators. Those who focused on influence operations in Asia, Africa, South America, and the Caucasus noted that the relative lack of data, funding, training, and access to personnel at major social media companies constrains investigative capacity in many countries, especially smaller ones. Platform trust and safety personnel are deployed very unevenly across the world, and many researchers—especially those not working in Western countries—reported difficulty identifying and connecting with the relevant staff.2

Even in well-resourced countries, operations targeting discrete minority populations require special attention as those groups may face different threats and vulnerabilities than the general public. Academic researchers in North America were particularly concerned that Indigenous and Spanish-speaking communities often suffer from “data voids,” or lack of credible information on certain topics available online in their languages, which allows influence operators to easily fill the gap. These researchers also noted that influence operations attributed to the Chinese government have targeted Asian diasporas (both Chinese and non-Chinese) in the United States. Because of Asian Americans’ significant diversity, understanding the reach and impact of influence operations targeting them can require competence in multiple languages and knowledge of multiple ethnic and political contexts. It can also require familiarity with key applications: for example, WeChat is a primary platform used by the Chinese diaspora.

Many participants said their work is impeded by the limited availability and restrictive structure of funding. While this problem has already been documented in the United States and Europe, the PCIO’s group discussions confirmed even greater funding challenges in other parts of the world, where responses to influence operations are only now beginning to receive more sustained attention.3 In South Asia, for instance, investigators worry that a lack of financial support for training opportunities is stunting the growth of their field, in which work is already difficult due to the risk of government reprisals. In the Caucasus, participants said that funding from Washington and Brussels often comes with bureaucratic burdens.

Another capacity challenge frequently cited by independent investigators, as well as platform representatives, is their inability to keep up with the volume of potential investigatory leads. Investigators said that social media companies struggle to build scalable tools capable of accounting for the plethora of languages and communities online—but language is not the only limiting factor. Even in South America, where most countries share a lingua franca (alongside subregional Indigenous languages), investigators expressed difficulty keeping pace with the high volume of influence activity online. More support for training and sustaining independent investigative efforts would help, as would an increase in platform staff, resources, and attention—especially in underserved countries outside Europe and the United States.

Industry, government, foundations, and other funders should invest more in the pipeline for producing talented researchers, particularly in places where the need is highest. It is incumbent on platforms—especially the largest platforms, which possess immense financial resources—to seek out, and in some cases cultivate, independent investigators who can help protect the integrity of their services. More funding earmarked for capacity-building and collaboration is another worthwhile idea. Groups that compete for the same project-based funding have diminished incentives to collaborate, but properly incentivized, they can build collaborations like the Election Integrity Partnership that boost their combined impact.

Stakeholders Overemphasize Operations by Foreign Actors

Across most regions, participants cited a tendency by donors, governments, media, and other stakeholders to overemphasize the scale, reach, and impact of foreign (and especially Russian) influence operations in comparison to domestic operations. During the discussion on Europe, observers noted that this is true even of Ukraine, where ongoing Russian information warfare and the conflict in the Donbas region are acute challenges; in the words of one participant, there is a tendency to “attribute all ills” to Russia even though “a lot of damage is done internally” by bad actors and persistent factionalism in Ukraine’s polarized media ecosystem. This suggests that funders should increase support for actor-agnostic research and investigations so that limited capacity can be concentrated on the most troubling operations.

Investigators stressed that influence operations come in various shapes and sizes, draw on many different models, and are carried out by a wide array of actors—foreign and domestic, state and non-state. But some worried that the precedent of Russian operations targeting foreign elections plays too large a role in shaping public understanding, government priorities, and societal responses. Participants focused on Africa were especially wary that an overemphasis on foreign operations might create a “free pass” for domestic bad actors. South American investigators, meanwhile, said that activities attributed to both Russia and Venezuela are overemphasized relative to their actual sway and impact. They also said it is difficult to secure funding for investigations focused on domestic operations, as compared to operations carried out by foreign state actors.4

One academic researcher in North America raised significant questions about unequal treatment of foreign and domestic operations. Do operations attributed to foreign actors receive harsher enforcement from platforms than those conducted by domestic groups? Are misleading claims and narratives from domestic operations covered more often or more credulously by journalists? The discussion did not yield concrete answers, and this may be an area for future analysis.

Although participants generally felt that operations by foreign state actors received outsized attention, such operations are a serious and growing problem in some regions. In India and Europe, participants noted an increase in operations attributed to the Chinese government or to Beijing-linked actors. Meanwhile, in the Asia-Pacific, participants stressed how different kinds of messaging tools—ranging from state media and diplomatic statements to covert influence operations—are aligned to portray Beijing as a better international partner and more stable society than the United States.5 One researcher observed an increase in businessmen from China contributing to local political campaigns in the Philippines—indicating how traditional forms of political and economic influence can overlap with informational activities as part of a broader strategic effort. Participants also cited Taiwan, which faces significant influence threats from Beijing, as a clear exception to the general rule that foreign state-run operations are overemphasized by policymakers and major donors.

Nevertheless, many participants said that Beijing, much like Moscow, receives attention disproportionate to its significance in the overall threat landscape. European researchers characterized Chinese-state-run influence operations as just one component of a complex information environment where disinformation mixes freely from a variety of sources: the far right, foreign states and state-supported actors, and non-state actors in foreign, same-language environments. They believed that studies “too focused” on specific actors fail to capture this dynamic. Africa-focused participants expressed similar views. Participants focused on South America and the Middle East mentioned Chinese operations in passing or not at all.

As with other influence operations, attribution continues to be a challenge for analysts focused on potential Chinese state activities. A Taiwanese participant said a common misperception is that all influence operations emanating from China are masterminded by the Chinese Communist Party, whereas in some cases they have been attributed to pro-Beijing business figures who this participant believes create pro-Beijing or anti-U.S. content “without orders or instructions,” or to nationalist internet users within China. Observers should be careful to avoid misattributing such activity to the Chinese government.

Crucial Social Media Data Is Inaccessible

Participants frequently cited the lack of access to data from social media platforms as a challenge. This theme reinforces earlier surveys of the field, which have repeatedly identified data access as a top challenge for influence operations research. While data access challenges have lately received heightened attention from Western policymakers, our group discussions highlighted the unique and often overlooked problems faced in other regions.

Most researchers today rely on publicly available data, which comes with various limitations. For example, if a prominent account is banned, its posts may vanish (depending on the specific platform) along with any insights into who interacted with that content, and how. Some researchers also have ad hoc data access limited to specific datasets from specific platforms, but these arrangements are far from ideal. Datasets may be narrowly suited to certain types of questions and not at all suited to others: for example, it is challenging to research the reach and impact of most operations if given only a list of banned accounts and a subset of their posts. Data from one platform yields limited insights into operations that may take place across multiple platforms—and, as several investigators told us, cross-platform operations are a common and complex challenge. Ad hoc arrangements also allow industry to pick and choose partners, inhibiting external oversight and raising questions about research independence.
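
To make the public-data baseline concrete, the following is a minimal sketch of collecting currently public posts through Twitter’s API v2 recent-search endpoint. The endpoint and parameters are real, but the bearer token and query are placeholder assumptions rather than details drawn from the discussions; the comments note how suspensions and deletions silently erase this data.

```python
# Minimal sketch: collecting currently public posts via Twitter's
# API v2 recent-search endpoint. The bearer token and query below
# are hypothetical placeholders.
import requests

BEARER_TOKEN = "YOUR_BEARER_TOKEN"  # assumption: issued to a developer account
SEARCH_URL = "https://api.twitter.com/2/tweets/search/recent"

params = {
    "query": '"ballot fraud" lang:es -is:retweet',  # hypothetical query
    "max_results": 100,  # API maximum per request
    "tweet.fields": "created_at,author_id,public_metrics",
}
headers = {"Authorization": f"Bearer {BEARER_TOKEN}"}

response = requests.get(SEARCH_URL, headers=headers, params=params)
response.raise_for_status()

for tweet in response.json().get("data", []):
    print(tweet["created_at"], tweet["author_id"], tweet["text"][:80])

# Key limitation, as noted above: once an account is suspended or a
# post deleted, it vanishes from these results, along with any record
# of who interacted with the content and how.
```

A researcher limited to this kind of collection sees only what is public at the moment of the query, which is why retrospective investigation of removed operations depends on platform-provided datasets.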

Even as U.S. and European researchers campaign for better access to platform data, many participants in the Global South described their struggle to obtain even the limited access that their Northern counterparts have today. For example, when Facebook built an archive of historical CrowdTangle data for external researchers, the company provided initial access to a small group of investigators based in the United States, United Kingdom, and Australia. Twitter, for its part, has made at least twelve datasets widely available—but they all cover state-run operations that the platform has taken enforcement action against, leaving out non-state operations all over the world. While both of these disclosures are valuable, more inclusive and comprehensive data sharing is crucial.

Recognizing that today’s ad hoc data-sharing arrangements are inadequate and inequitable, some participants offered practical suggestions for improving such arrangements while more robust long-term solutions remain under development. First, platforms should proactively encourage the development of independent investigative efforts by increasing the number of staff dedicated to countering influence operations in underserved countries, encouraging those staff to build relationships with independent investigators there, and providing datasets to a wider and more international circle of trusted partners. This last point was especially salient to a South American participant who noted that new researchers in smaller countries often lack access to training data. The Disinfodex (a joint database of investigations and platform takedowns related to influence operations) contains publicly available reports on operations in many smaller South American countries, but these reports do not contain complete raw data and they are authored by platforms themselves or by a small set of U.S.-based independent entities. If investigators in South America do not have the opportunity to access private platform data on influence operations or partner in these investigations, this could indeed limit the growth and potential of the investigative community there—and, it seems likely, wherever else similar conditions prevail.

Second, some investigators asked for access to location data for accounts and pages involved in suspected operations. While accounts and pages do self-report location data in some instances, this information can be unreliable and difficult (if not impossible) to verify independently. Platforms have access to other, more useful signals that could aid investigators in determining the location of account owners. One potential step forward would be for platforms to develop specific tools or processes allowing investigators to request assistance verifying the location of suspicious accounts.

While ad hoc data-sharing could be improved, it should ultimately be replaced with more comprehensive solutions. In the United States, the United Kingdom, and the European Union, legislators are considering rules requiring platforms to share data with researchers. (The specific details—what type of data must be shared, how researchers should be vetted, the most appropriate mechanism for facilitating data access—have not yet been finalized.) While promising, these conversations primarily focus on the immediate needs of U.S. and EU researchers, policymakers, and other stakeholders. Meanwhile, most other governments around the world have so far been unable or unwilling to create similar arrangements, and in some countries political conditions make government-supervised data-sharing arrangements inadvisable. Western policymakers and platforms should consider potential mechanisms for researchers in other jurisdictions to take advantage of new data-sharing legislation and platform initiatives.6

Cross-Platform and Cross-Country Research Remains Difficult

Participants from several regions highlighted how influence operations frequently move from country to country. For example, participants observed that information spreads quickly among Spanish-speaking communities in both North and South America as well as Portuguese-speaking Brazil. They cited how far-right narratives originating in the United States have spread into Latin America, Canada, and Eastern Europe. Yet participants worried that too few analyses focus on the transnational spread of disinformation narratives, unless these narratives are somehow tied to foreign state influence operations. Transnational analysis can require navigating multiple languages, political contexts, and legal frameworks around personal data. All of these challenges can be eased through cross-country collaborations, though such partnerships require time and investment to nurture.

At the same time, influence operations often cross multiple online and offline platforms in complex, dynamic patterns. Participants described the importance—and difficulty—of monitoring all these channels with a combination of methodologies while navigating multiple platform policies. One investigator raised the example of a South American social media influencer who avoided platform content moderation by using coded language in videos, then messaging followers over Telegram to explain their strategy. This individual also directed their followers to payment platforms to solicit funding for their channel, indicating a need to involve payment service providers in discussions about influence operations.

Other operations jump between encrypted messaging apps (like WhatsApp, Signal, and Telegram), more open platforms like Facebook and Twitter, and video platforms like YouTube and TikTok. Researchers and investigators explained that in some respects, operations on primarily text-based platforms like Facebook and Twitter are easier to scrutinize: video content requires more time and resources to analyze and is less readily machine readable than text-based content, while encrypted messages are largely inaccessible to outside observers. Local context matters too: as one researcher noted, the messaging app LINE is a popular alternative to WhatsApp in parts of Asia, but it receives less attention from investigators and policymakers.

Visibility Into Private Messaging Is Important but Lacking

Influence operations conducted over private, encrypted messaging apps such as WhatsApp, Telegram, and Signal have become a common challenge for investigators around the globe. Once seen as a problem mainly for mobile-first societies in the Global South, this issue was raised during our group discussions by participants from nearly every region—including the United States and Europe—along with calls for new investigative methodologies to address it.7

Participants worried that these operations are becoming increasingly sophisticated, shifting from broad, “impersonal” campaigns (using bots and spam techniques) to more relational approaches that target smaller audiences in relatively private, close-knit spaces (such as WhatsApp group chats). Participants feared that these newer tactics might be more persuasive and harder to counteract, especially if messages containing disinformation are forwarded from within users’ own social circles.

Encryption leaves researchers, investigators, and even platforms with limited insight into influence operations activity. One participant posited a scenario in which an especially effective disinformation narrative, perhaps a deepfake video, spread across encrypted messaging apps in the days before an election. How would researchers identify it and assess its scale and potential impact quickly enough to inform countermeasures?
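
If copies of a suspect video or image can be obtained at all—for example, through the tip lines discussed below—perceptual hashing is one building block for matching near-identical copies quickly. The sketch below uses the open-source imagehash library on extracted keyframes; the filenames and distance threshold are illustrative assumptions, not a method attributed to any participant.

```python
# Sketch: matching near-duplicate copies of a suspect image or video
# keyframe with perceptual hashing. Filenames and the distance
# threshold are hypothetical placeholders.
import imagehash
from PIL import Image

reference = imagehash.phash(Image.open("suspect_keyframe.png"))

submissions = ["tip_001.png", "tip_002.png", "tip_003.png"]
for path in submissions:
    candidate = imagehash.phash(Image.open(path))
    distance = reference - candidate  # Hamming distance between hashes
    # Small distances survive recompression, resizing, and re-uploads;
    # a cutoff of ~10 is a common rule of thumb, not a fixed standard.
    if distance <= 10:
        print(f"{path}: likely a copy (distance {distance})")
```

Even with such tooling, researchers still need a way to receive copies in the first place—which is where the workarounds described below come in.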

Workarounds exist, but they have substantial limitations. For example, some fact-checking services operate tip lines where users can forward messages for verification; researchers can ask these fact-checkers to share data on user submissions.8 Investigators can also join public groups on encrypted messaging apps, or infiltrate closed (permission-only) groups. However, these approaches are difficult to scale and may produce non-representative data. They can also raise ethical questions about the use of deception to access closed groups and the treatment of data collected without users’ knowledge. There is at least one proposed ethical framework to guide such research, but more work remains to develop consensus and share best practices.9
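
As a concrete illustration of the public-group workaround, the sketch below uses the Telethon library to read messages from a public Telegram channel. The API credentials and channel name are placeholder assumptions; encrypted private chats remain inaccessible this way, and the hashing step reflects the anonymization norms described in note 9.

```python
# Sketch: collecting messages from a *public* Telegram channel with
# Telethon. Credentials and channel name are hypothetical; private,
# end-to-end encrypted chats cannot be read this way.
import hashlib
from telethon import TelegramClient

API_ID = 12345                      # assumption: from my.telegram.org
API_HASH = "your_api_hash"          # assumption: from my.telegram.org
CHANNEL = "example_public_channel"  # hypothetical public channel

client = TelegramClient("research_session", API_ID, API_HASH)

async def collect():
    async for msg in client.iter_messages(CHANNEL, limit=200):
        if not msg.message:  # skip media-only posts in this sketch
            continue
        # Anonymize sender IDs before storage, per standard practice.
        sender = hashlib.sha256(str(msg.sender_id).encode()).hexdigest()[:12]
        print(msg.date, sender, msg.message[:80])

with client:
    client.loop.run_until_complete(collect())
```

Even this approach reaches only channels that choose to be public; it cannot see the forwarded, close-knit group chats where participants feared the most persuasive activity occurs.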

The Economics of Influence Operations Are Evolving

Participants described increasing variation in the economic incentives that drive some influence operations and the business models that bad actors employ.

First, many participants across multiple regions highlighted the use of public relations firms and other private vendors to provide “disinformation for hire.” While unscrupulous public voices have always employed falsehoods and other dirty tricks, these companies use modern digital tactics—such as running networks of fake accounts, distributing false or misleading information via online groups and pages, and harassing government critics—often associated with state actors.10 This practice is not new, but its pervasiveness is concerning.11 Participants noted that these service providers vary widely in operational sophistication. At the low end, some operations have little reach or impact and are easily detected due to their use of low-quality throwaway accounts or simple URL spoofing. At the high end, foreign actors sometimes employ local individuals to craft appropriate content that uses local slang and dialects. African researchers called this a “franchise model” and described cases where foreign actors created local shells, such as nonprofits that exist in name only, to create a veneer of credibility and legitimacy.12

Second, participants from around the world said that bad actors are increasingly using social media “influencers” to gain the trust and attention of audiences while masking their own culpability. Co-opted influencers have often had huge followings on platforms like YouTube; however, many participants cited a recent trend toward the use of “nano-influencers” whose follower counts may be in the thousands—mirroring trends in legitimate social media marketing. For example, influence operations attributed to the administration of Philippine President Rodrigo Duterte previously paid internet celebrities with large followings to spread pro-government messages, but Duterte is now shifting toward a larger number of niche influencers. Participants identified similar trends in the United States. While the efficacy of nano-influencers has not been well studied, researchers feared that narrowly tailored messages sent to smaller audiences could be more persuasive.

Third, clickbait remains an enduring challenge. One participant focused on South America pointed out that clickbait farms of the type made famous in Macedonia are still a major vector for disinformation but receive less attention than other influence operations techniques. They said these kinds of pages remain attractive opportunities for bad actors located in countries with lower purchasing power, where ad revenue provides more meaningful income. Like their Macedonian counterparts, these pages can also reach audiences across national borders.

Participants explained that these varied commercial influence models and tactics have further complicated the perennial challenge of attributing influence operations to their source. It is hard enough to identify an influence operation and determine who carried it out. If the operator was a private vendor-for-hire, a paid influencer, or a freelancer working for a clickbait farm, investigators must then try to learn who ordered or sponsored the operation. This may not always be possible with publicly available data. While some platforms and jurisdictions require public disclosure of paid social media posts, bad actors may evade detection by making payments through side channels. Journalists and governmental authorities thus have an important role to play in scrutinizing off-platform or offline activity—for example, by obtaining documents and interviewing sources. Moreover, the economic aspects of influence operations are themselves worthy of additional investigation.

Conclusion

The PCIO’s group discussions with dozens of counter-influence operations professionals from around the world shed new light on a still-maturing field that, having grown quickly over the past five years, is now encountering obstacles to its continued development. Access to critical platform data is spotty at best. Resources and attention from platforms and international media are largely concentrated in Western countries. Many investigators are hampered by inadequate capacity and narrowly focused funding. What’s more, the funding and attention they do receive excessively emphasize Russian and other foreign state operations. Collectively, these trends inhibit the quantity and quality of global investigations and focus scarce investigative resources on less urgent priorities.

The influence operations challenge is also quickly evolving, requiring constant innovation by already overstretched professionals. Investigators continue to have difficulty monitoring how information moves across national borders and a growing number of channels. At the same time, the rise of encrypted messaging apps and the diversification of commercial models for influence operations are creating new problems for investigators around the world.

Despite the shared nature of these challenges, investigators in the Global South face the greatest shortfalls of capacity, funding, attention, and other support. And those in unstable or authoritarian countries face unique threats to their safety and freedom. As Western governments, platforms, and funders consider how to address influence operations in their own societies, they should know that other regions face parallel needs of even greater magnitude. It is imperative that stakeholders continue to build bridges of support and collaboration across regions, and to work toward solutions with global impact.

Notes

1 The discussions took place between May and August of 2021. The Caucasus and South Asia discussions were combined into a single session. Participants included twenty-eight representatives from civil society organizations, twenty-five from academia, five from the social media industry, three from media organizations, two from multilateral organizations, one from a donor organization, and six that did not fit into these categories.

2 For example, Facebook reportedly allocates resources for policy enforcement around elections by sorting countries into “tiers.” Most of the world’s countries fall into the lowest tier, which receives significantly less attention and fewer resources than higher tiers.

3 In particular, the authors draw here on two reports: “The Many Faces Fighting Disinformation,” produced by the EU DisinfoLab, and “The Road Ahead: Mapping Civil Society Responses to Disinformation,” produced by Samantha Bradshaw and Lisa-Maria Neudert for the International Forum for Democratic Studies at the National Endowment for Democracy. Both draw on interview-based research to collect insights on how counter-disinformation work is conducted and supported. One of the authors of this essay served as editor of the second report.

Similar findings were reported from a PCIO survey of fifty-three professionals in this field. According to a December 2020 writeup of the results, “almost 40 percent of respondents in the community survey cited a lack of funding as the most or second-most important challenge. . . . One community leader was concerned that long-term funding insecurities are having a detrimental impact on output; without funding security, decisions about recruitment and training cannot be made.” Like the above reports, these findings skew toward Europe and North America; no respondents were from Africa or Asia.

Regarding the recent increased focus on the Global South: in January 2021, the International Development Research Centre in Canada issued a grant for a global study on disinformation in the Global South.

4 Attribution to specific actors is one of the most challenging parts of influence operations research, made more complicated by the use of contractors and third parties. However, the claims made here about the prevalence of domestic influence operations are borne out by other research. Consider, for instance, the Oxford Internet Institute’s annual inventory of organized social media manipulation, which finds dozens of governments engaged in domestic influence operations online.

5 Influence operations need not involve provably false claims. In this case, participants were concerned by Chinese tactics—such as the use of state-controlled media and networks of inauthentic social media accounts—that go beyond typical public diplomacy.

6 The risk that researchers outside of the United States and the European Union will not benefit from legislative data-sharing requirements has led PCIO staff to support a multilateral approach.

7 The sole exception was the discussion with participants focused on the Middle East; presumably, this was a quirk of the conversation, as investigators have found operations linked to Iran and other state actors in the region.

8 For an example of a fact-checking project run over a tip line, consider Verificado 2018, a collaborative effort between fact-checkers during Mexico’s 2018 elections.

9 Anonymizing personal data is standard practice, and many researchers disclose their affiliation and research intent when joining messaging groups. There is less clear agreement on best practices for closed messaging groups whose members are engaged in coordinated, sometimes paid efforts to spread disinformation or political astroturfing, or on how to monitor radical extremists who might threaten researchers’ physical security if they became aware of the researchers’ identities.

10 Consider also “Disinformation for Hire, a Shadow Industry, Is Quietly Booming,” by Max Fisher and “Running A Disinformation Campaign Is Risky. So Governments Are Paying Others To Do It,” by the Washington Post editorial board. Jonathan Corpus Ong and Jason Vincent A. Cabañes offer readers a close-up view of how these firms operate in their report, “Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines.”

11 Participants did not address whether the problem is expanding or merely detected more often as the field of investigators grows larger and broadens its aperture. Conceivably, growing awareness of and interest in influence operations could be driving growth in the commercial sector offering these services. Consider also “State Sponsored Trolling: How Governments Are Deploying Disinformation as Part of Broader Harassment Campaigns,” by Carly Nyst and Nick Monaco; “The Reputation-Laundering Firm That Ruined Its Own Reputation,” by Ed Caesar; and “Meet the 29-Year-Old Trying to Become the King of Mexican Fake News,” by Ryan Broderick and Íñigo Arredondo.

12 Perhaps the most prominent example is the Russian Internet Research Agency’s hiring of Ghanaian and Nigerian individuals to run operations targeting the 2020 elections in the United States, but this model has also been deployed by Russian actors targeting African audiences: for example, the Stanford Internet Observatory uncovered Russian operations in Sudan that subcontracted with local Sudanese actors.

The creation of government-owned or -controlled nonprofits, as well as shell nonprofits with no physical presence and limited or no staff or operations, is a long-standing problem in the political influence space. For more information on this problem in Nigeria, consider “Fake Civil Society: The Rise of Pro-Government NGOs in Nigeria,” by Matthew T. Page.

Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.