How Journalists Become an Unwitting Cog in the Influence Machine

Influence campaigns have long targeted journalists, but a recent operation lays bare the Russians’ plan to exploit the media and sow disinformation in a complex information environment.

by Alicia Wanless and Laura Walters
Published on October 13, 2020

On September 1, 2020, more than twenty journalists worldwide learned they had unwittingly joined a Russian influence operation.

The revelation came shortly after Facebook and Twitter announced the removal of accounts linked to the Russian Internet Research Agency—the same “troll farm” that conducted influence operations during the 2016 U.S. presidential election.

As in past operations, the agency created Facebook and Twitter accounts purporting to be news entities. But this time, to appear more legitimate, the campaign “tricked unwitting freelance journalists into writing stories on its behalf” by posing as a nonprofit news organization, Peace Data, and creating editor profiles complete with AI-generated photos.

After the shock of discovering that Peace Data was actually part of a Russian influence operation, some of the affected journalists spoke about their experiences and why they had agreed to work with the organization. Some were happy to be publicly named, while others requested anonymity to avoid possible career repercussions.

One New York–based writer had just lost his hospitality job due to the COVID-19 pandemic. “I was in need of income and an outlet to build my portfolio. The opportunity to write a column could be the break I was hoping for,” he said. Another journalist was struggling to find work at mainstream media outlets after moving to London during the pandemic and thought it would be a relatively easy way to earn money. One writer said this was her first published article on an independent news source and asked her social media followers to “like” her article so she could prove she deserved a full-time position.

All of the reporters demonstrated a vulnerability—one that the influence operation readily exploited. Even before the pandemic, seasoned journalists were struggling to find work in a crowded field. Now, with mounting layoffs across the media industry, it has become nearly impossible for inexperienced freelancers to catch a break. It is understandable, then, that aspiring and out-of-work writers would jump at the chance to do paid work for an organization that describes itself as independent, without questioning who is behind it or its editorial objectives.

Coupled with limited knowledge of influence operations and little skepticism, the writers’ varying degrees of vulnerability, naiveté, and desperation made them soft targets.

Those posing as so-called Peace Data editors or communications staff fabricated social media accounts with profile pictures created by a generative adversarial network (GAN). They then approached writers and bloggers on Facebook and LinkedIn, as well as on the freelance job site Guru. Claiming to be a new nonprofit covering geopolitical and human rights issues, they offered up to $250 per article via PayPal.

This evolution of foreign influence operations may be concerning, but it shouldn’t be surprising. While the technology used in these digital age operations may be new, using proxies to influence a target audience is as old as time. Great military strategists such as Sun Tzu, Niccolò Machiavelli, and Carl von Clausewitz all emphasized using information and disinformation to outmaneuver an adversary. Indeed, Machiavelli counseled his sixteenth-century readers to use those “men who are discontented and desirous of change . . . to open the way to you for the invasion of their country and to render its conquest easy.”

The use of journalists and trusted figures to spread, amplify, and add credibility to influence operations is also not a new tactic. According to political scientist and Johns Hopkins University professor Thomas Rid, what Soviet and Russian intelligence agencies refer to as “active measures” have been around since the Bolshevik Revolution in 1917. These operations have continually evolved to capitalize on transformations in journalism and the media, but the idea that disinformation and the crisis currently facing Western democracies are novel or unprecedented “is a fallacy, a trap.” This is simply the latest version of information laundering.

Soviet intelligence honed information laundering during the Cold War. In the 1980s, the KGB ran an operation to implicate the United States in the emergence of the AIDS pandemic. Fears born of recent revelations about U.S. biological warfare experiments gave active measures specialists the perfect opening. The KGB planted the story in a small Indian publication, The Patriot, in 1983. Although Soviet news outlets did not run the story until 1985, it spread rapidly over the next few years—and the conspiracy theory still has legs today.

During this time, the KGB focused not just on recruiting people who had access to secrets but also on cultivating so-called agents of influence. These agents developed contacts with journalists and others, who would sometimes unknowingly be used as platforms to launch or spread fake or leaked stories. Sometimes documents were simply mailed to journalists anonymously.

Since 2016, influence actors have moved away from trying to generate the illusion of grassroots communities using fake accounts toward co-opting existing authentic communities to spread messaging. And, again, this evolution is not surprising: why build audiences from scratch, when existing communities can be easily manipulated or their messaging can simply be amplified as evidence of social discontent?

Many operators “micro-target potential amplifiers via email, direct message, @-mentions, and other forms of direct outreach,” with journalists being a favorite target. They know people are more likely to trust those who are like them or those who they perceive to have influence. This is one reason why countering influence operations is extremely challenging; they occur in a hyperconnected information ecosystem where domestic and foreign actors intersect.

Among other instances, activities targeting the White Helmets, the Syrian emergency first-responder organization, illustrate the difficulty of disentangling foreign and domestic actors caught up in influence operations. In the West, legitimate domestic voices, including politicians, antiwar activists, independent journalists, and academics, picked up or echoed Russian officials’ narratives discrediting the White Helmets. Far from orchestrating these responses, Russian influence operators largely surfaced counterarguments to claims made by the White Helmets, such as on the origins of a gas attack, or amplified—in international forums—supporting narratives found in Western reports critical of the group.

The Peace Data campaign similarly used unwitting but legitimate writers to create credibility. They were “the camouflage” rather than the vehicle used to deliver the payload. The focus on using local authors with domestic reach and credibility was articulated in emails between Peace Data’s so-called communications manager, Alice Schultz, and one of the journalists the organization commissioned:

We are interested in original and honest content concerning the least discussed but important themes. We prefer to ask our authors to write about the countries they live in to reach maxim [sic] objectivity and authenticity.1

Peace Data advertised for writers to cover examples of “anti-war, corruption, abuse of power [and] human rights violations” around the world. Articles relating to the United States painted the picture of “war-mongering and law-breaking abroad while being wracked by racism, COVID-19, and cutthroat capitalism at home,” with the aim of appealing to left-wing voters and steering them away from the campaign of Democratic presidential candidate Joe Biden.

Peace Data asked some writers to share the commissioned work among their own networks and on social media. Its so-called editors also encouraged writers to ask any mainstream media outlets they had relationships with to republish the material. Such efforts help magnify campaigns, especially when the messaging is adopted by well-known reporters, politicians, and celebrities. By compromising key hubs of influence or networks, influence operators can quickly reach a bigger audience and appear more believable—thereby increasing the potential impact of an operation that might otherwise languish in the recesses of social media.

Peace Data itself is not thought to have had a big impact; it was deemed newsworthy largely because of its links to the Russian Internet Research Agency. Only 14,000 people followed one or more of the suspended accounts linked to the operation. Of the 323 articles analyzed with BuzzSumo—a tool for assessing web page engagement—96 percent received fewer than one hundred shares on social media, and 66 percent received under ten.2
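Those percentages are simple threshold counts over per-article share totals. As a minimal sketch, assuming the share counts have already been exported from a tool like BuzzSumo (the sample numbers below are hypothetical, not the actual Peace Data figures), the calculation looks like this:

```python
# Minimal sketch: summarize per-article social share counts,
# e.g., numbers exported from an engagement tool such as BuzzSumo.
# The sample counts below are illustrative, not the real Peace Data data.

def share_distribution(shares, thresholds=(10, 100)):
    """Percentage of articles whose share count falls below each threshold."""
    total = len(shares)
    return {t: 100 * sum(1 for s in shares if s < t) / total for t in thresholds}

sample = [0, 2, 3, 5, 8, 14, 40, 95, 150, 480]  # hypothetical share counts
for threshold, pct in share_distribution(sample).items():
    print(f"{pct:.0f}% of articles received fewer than {threshold} shares")
```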

However, a new framework for assessing the spread of influence operations stresses the importance of factoring in the movement of messaging through high-profile targets such as journalists. Ben Nimmo’s Breakout Scale argues that these targets have the greatest potential to move barely noticed campaigns into prominent positions among substantial new audiences—and influence operators are fully aware of this.

By contrast, many journalists—aspiring or otherwise—have little understanding of these operations and can easily become the carriers, or targets, of influence operations cloaked within unverified news organizations. “Such influencers can make the difference between a weaponized leak or false story staying in the shadows and reaching a nationwide audience,” says Nimmo.

In the lead-up to the 2020 U.S. presidential election, and beyond, foreign actors will continue to run influence operations. Within weeks of removing the accounts linked to Peace Data, Facebook identified a similar Russian-backed operation, which was attempting to hire contributors and seed their stories with news organizations. And Russians are not alone in using such tactics; earlier this year, tech platforms reported that the American Herald Tribune “was linked to Iranian state media.”

To become harder targets to compromise, journalists need to be aware of this history, the hallmarks of these types of operations, and their own vulnerabilities. Influence operations exist within a complex information ecosystem, of which journalists are an integral part. It’s vital that journalists understand the pivotal role they play in this ecosystem, so they don’t unwittingly become part of the problem. The power to reach many people carries the responsibility to use that power with care.

Five Ways to Be a Hard Target

  1. Be skeptical: If a job offer seems too good to be true, it probably is. Ask questions and put your investigative skills to the test. Be wary of cold-call commissions and direct outreach, and verify any information being leaked to you. If asked to contribute to a non-mainstream publication, research the ownership model, editorial line, and editors’ backgrounds. If unsure, ask for help.
  2. Look for warning signs: In the past, language errors have given away Russian influence operations. In the case of Peace Data, those posing as its staff had new, unpopulated social media profiles, used GAN-generated avatars, offered payment via PayPal, and contacted authors out of the blue through social media. Some reporters were encouraged to change the editorial line or angle of their articles, and Peace Data staff asked writers to help get articles republished on mainstream news sites. (A sketch after this list turns these signs into a simple checklist.)
  3. Be knowledgeable: Learn the history and hallmarks of influence operations, with books like Thomas Rid’s Active Measures.
  4. Upskill: The European Journalism Centre’s latest edition of the Verification Handbook for Disinformation and Media Manipulation aims to equip journalists with the knowledge to investigate social media accounts, bots, private messaging apps, information operations, deepfakes, and other forms of disinformation and media manipulation.
  5. Stay informed on current influence campaigns: Seek out journalist and newsroom training and real-time information from reputable organizations, such as First Draft.
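
To make the second tip concrete, here is a minimal sketch that codifies those warning signs as a checklist. The fields and thresholds are hypothetical illustrations, not criteria from the Peace Data investigation or from any real vetting tool:

```python
# Minimal sketch: check an inbound commission offer against the warning
# signs above. Field names and thresholds are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Outreach:
    account_age_days: int          # how old is the contact's profile?
    follower_count: int            # is the profile populated at all?
    avatar_looks_generated: bool   # e.g., telltale GAN artifacts
    cold_contact: bool             # unsolicited approach via social media?
    paypal_only: bool              # payment offered only via PayPal?
    pushes_editorial_angle: bool   # pressure to change the article's line?
    asks_for_republication: bool   # push to place copy with mainstream sites?

def warning_signs(o: Outreach) -> list:
    """List the warning signs an offer triggers."""
    checks = [
        (o.account_age_days < 90, "newly created profile"),
        (o.follower_count < 50, "unpopulated profile"),
        (o.avatar_looks_generated, "possibly GAN-generated avatar"),
        (o.cold_contact, "cold outreach"),
        (o.paypal_only, "PayPal-only payment"),
        (o.pushes_editorial_angle, "pressure on the editorial angle"),
        (o.asks_for_republication, "push to republish on mainstream sites"),
    ]
    return [label for hit, label in checks if hit]

offer = Outreach(30, 12, True, True, True, False, True)  # hypothetical case
print(warning_signs(offer))
```

No single flag proves bad faith; the point is that several flags together should trigger the skepticism the first tip calls for.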

Notes

1 The email, with personal information redacted, is available from the authors upon request.

2 BuzzSumo data are available from the authors upon request.