Influence Operations Researchers’ Guild
The community working to counter influence operations is disparate, and few common standards exist to guide and assess investigators’ work. The Influence Operations Researchers’ Guild attempts to address these challenges by supporting the development of a more cohesive community around the investigation of these issues. It is our hope that the Guild will serve as a place to define and share best practices, to improve research standards, and to strengthen the research community’s response to influence operations on the basis of shared values and norms. Members pledge to support the motivating principles behind these goals as embodied in the Guild Charter, which defines the values toward which they should strive:
We foster and promote the work of a diverse research community and mentor its researchers.
We work together as a community, coordinating around the release of information when working on the same case study.
We make public attributions of actors behind influence operations only when absolutely certain.
We follow platform policies and relevant legislation in the collection of data.
We protect the privacy of those people targeted by influence operations and identified in research.
We make judgements based on facts and facts alone.
We aim for our work to be of the highest quality.
PCIO acts as an impartial administrator of the Guild, managing its public-facing components and the initial process of establishment.
Guild members and other affiliates are expected to adhere to the Guild’s bylaws, which govern eligibility and standards for membership. Click here to read them.
The application process is designed to bolster and add depth to a community of influence operations investigators who have committed to high research standards. The network will focus on improving those standards by convening leading practitioners and transparently sharing lessons learned with the wider community.
Applications are open to both individual researchers and organizations exploring influence operations, specifically those conducting:
All applicants will be asked to demonstrate:
Duties of Guild members include:
Once completed, the application, follow-up communications, and interviews will be compiled into a dossier for review by leading experts in the field. This Review Committee will assess applications case by case against transparent and objective criteria outlined in the application form. New applications will be accepted only from organizations referred by an existing Guild member and will be limited to 10 per calendar year.
The Guild application process is currently closed and will reopen later this year. In the meantime, if you have any questions, or would like to be notified when the next review round opens, please contact email@example.com.
Curious if you or your organization are a good fit for the Guild? Download one of our prompt sheets for a clear set of guidelines to support the assessment of applications from organizations and individuals for the Influence Operations Researchers’ Guild.
Benjamin T Decker is the founder and CEO of Memetica, a digital investigations consultancy. He specializes in the behavioral dynamics of online radicalization, providing intelligence and risk advisory services on a variety of strategic issues relating to coordinated harassment, disinformation, and violent extremism.
Claudia Flores-Saviaga is a Computer Science Ph.D. candidate at Northeastern University. She uses data science to study disinformation campaigns and design large-scale solutions for Latinx communities in partnership with governments, industry, and universities in Latin America and the United States.
João Guilherme Bastos dos Santos is currently the Data Analyst of Rooted in Trust Brazil (Internews/RiT 2.0) and a researcher at the Brazilian National Institute of Science and Technology for Digital Democracy (INCT.DD). His current work focuses on COVID-19 rumors and indigenous and Quilombo communities; previously, he has studied vaccine disinformation, attacks on journalists and democratic institutions, climate change denialism, and new approaches to the use of bots.
Nick Monaco is a disinformation researcher, linguist, and OSINT practitioner. He currently serves as Chief Innovation Officer and Director of China Research at Miburo Solutions. His primary research focus is Chinese disinformation, particularly as it relates to Taiwan and the cross-Strait context.
ASPI provides expert advice on a wide range of issues, including influence operations, through its International Cyber Policy Centre. Its diverse operations range from policy-informing research to capacity-building trainings and workshops, all of which have made ASPI an influential voice in global policy debates.
Cardiff’s OSCAR (Open Source Communications Analytics Research) programme has been designed to deliver conceptual and methodological advances that enhance understanding of the strategies, tactics and impacts of digital (dis)information operations.
The Atlantic Council’s Digital Forensic Research Lab (DFRLab) has operationalized the study of disinformation, exposing falsehoods and fake news, documenting human rights abuses, and building digital resilience worldwide.
Part of the School of Communication, Media and Information (FGV ECMI), a pioneer in training a new profile of professionals working at this intersection, the DAPP Lab is a center for applied social research. Its goal is to promote innovation in public policy using technology, transparency, and data analytics.
The Global Disinformation Index (GDI) is a data-driven not-for-profit that operates on the three principles of neutrality, independence, and transparency. GDI's mission is to catalyse industry and government to defund disinformation.
GLOBSEC Centre for Democracy & Resilience’s mission is to help protect democracies against the efforts and actors undermining them. The Centre conducts pioneering research, capacity-building, and awareness-raising activities in Central and Eastern Europe and works with policy-makers, civil society, and media across the transatlantic democratic space.
Graphika leverages the power of artificial intelligence to create detailed maps of social media landscapes, using new analytical methods and tools to help partners navigate complex online networks.
The Media Forensics Hub builds society's capacity to understand the context, origins, and impact of information operations by connecting scientific expertise with practical application. The Hub is part of the Watt Family Innovation Center at Clemson University.
MediaLab is part of Lisbon University’s Communication Science Laboratory (CIES). We research and analyze how communication processes have been shaped by the emergence of the internet and social media, addressing issues such as how disinformation and counter-narratives circulate online, how they are built and shared, and how information disorders reflect other social issues like trust and digital literacy.
ProBox Digital Observatory analyzes digital socio-political hashtags in Latin America (mostly Venezuela, Cuba, and Nicaragua) through trends on Twitter, using a self-developed tool that detects whether a narrative is being manipulated. Its team conducts research and awareness campaigns about online disinformation and propaganda and how these behaviors attempt to influence public opinion.
The Stanford Internet Observatory is a cross-disciplinary program of research, teaching, and policy engagement for the study of abuse in current information technologies, with a focus on social media. The Observatory is part of the Cyber Policy Center at Stanford University.
Chris Beall is the policy lead, platform governance, at the Centre for International Governance Innovation, where he leads the Global Platform Governance Network (GPGN). The GPGN brings together civil servants, regulators, and legislative staff from around the world working on aspects of digital platform governance (e.g., anti-trust, countering foreign interference, mitigating online harms). The goal is to harmonize efforts and meaningfully advance key issues, including digital platform transparency, performance measurement, and research.
Beall was most recently the founding director of the Digital Citizen Initiative with the Government of Canada. Previously, Beall held positions within the Government of Canada in national security, border management, and strategic finance and oversight. Beall holds a doctorate from the University of Oxford and is a college fellow at the Arthur Kroeger College of Public Affairs at Carleton University.
Mike Caulfield is currently the director of blended and networked learning at Washington State University Vancouver. An early believer in the idea of civic digital literacies, his work in this area intensified in spring of 2016. His February 2017 work, Web Literacy for Student Fact-Checkers, won the Merlot Award for best open learning resource in the ICT category. He was a runner-up in the Rita Allen/RTI International Misinformation Solutions Award (2018). His SIFT model, a practical approach to quick source and claim investigation, encourages readers to take a minute or two to seek out basic information about sources and claims before they engage more deeply with media, and, if necessary, to move on to better material. It is based on the research of Sam Wineburg and his own experience helping faculty teach critical consumption in the classroom.
Louise Marie Hurel is Special Digital Security Policy Advisor at the Igarapé Institute’s Digital Security Programme, where she leads Igarapé’s efforts on cyber and digital policy engagement at the national, regional, and international levels. Louise is also a PhD researcher in Data, Networks, and Society at the London School of Economics’ (LSE) Department of Media and Communications. Her research focuses on risk, cybersecurity governance, and incident response. Louise Marie has experience working at the intersections of cybersecurity, Internet governance, technology and policy, cyber norms, and private governance. Her previous work includes consultancy for technical bodies and research on Internet governance, privacy, intelligence, and security at the Center for Technology and Society at the Getúlio Vargas Foundation (CTS-FGV). Louise Marie is a member of the advisory boards of the Global Forum on Cyber Expertise (GFCE), the Carnegie Endowment’s Partnership for Countering Influence Operations (PCIO), and the Centre for Information Resilience (CIR). She has published with outlets such as the Council on Foreign Relations, Americas Quarterly, and Open Democracy, as well as in journals such as the Journal of Cyber Policy. Her recent publications include a co-authored book chapter, “Putting the technical community back into cyber (policy),” in the Routledge Handbook of International Cybersecurity and another, “Cyber-Norms Entrepreneurship? Understanding Microsoft’s advocacy on cybersecurity,” in Rowman & Littlefield’s Governing Cyberspace: Behaviour, Power and Diplomacy.
Jonathan Corpus Ong is associate professor of global digital media at the University of Massachusetts Amherst. His research on the shadowy political trolling industries in Southeast Asia uses ethnography to understand the identities and motivations of disinformation producers. His engagement with the Philippines’ election commission led to policy change in social media political advertising in the 2019 Philippine elections. He is currently a research fellow at the Shorenstein Center of the Harvard Kennedy School, where he studies 1) COVID-19 racism and disinformation, 2) conspiracy theory in tarot and astrology online communities, and 3) the human costs of targeted harassment from the perspective of communications and tech workers in human rights organizations.
Tarunima Prabhakar is the co-founder and research lead at Tattle Civic Tech and a non-resident fellow at Carnegie India. At Tattle, she coordinates work on community-driven approaches to misinformation response in India. Her broader research interests concern the implications of prediction algorithms for development imperatives and democratic processes. As a practitioner, she has worked on ICTD and data-driven development projects with non-profits and tech companies in Asia and the United States.
Joanna Rohozinska is the resident program director for Europe with the International Republican Institute. Based in Brussels, she oversees IRI’s Beacon Project, which focuses on building resilient democracies. Joanna has been engaged in programs in the post-Communist space for over 20 years, living and working in several countries in the region, and joined IRI after more than a decade with the National Endowment for Democracy’s Europe programme. She holds a graduate degree in Russian and European history from the University of Toronto, where she focused on nationalism and foreign policy issues within the Russian and Soviet Empires.
Eneken Tikk (dr.iur) is executive producer of the Cyber Policy Institute (CPI) in Lieksa, Finland, and associate researcher at the Erik Castrén Institute of Helsinki University. She began her career as a lawyer with an interest in ICTs and public international law, and she has been part of developing Estonian data protection, public e-services, and cybersecurity legislation. Dr. Tikk was a member of the team that started the NATO CCD COE, where she established and led the legal and policy branch. During her term as senior fellow for cyber security at the International Institute for Strategic Studies (IISS, 2012–2016), Eneken published the Strategic Dossier on the Evolution of the Cyber Domain. She advised the Estonian experts in the UN GGE (2012–2013, 2014–2015, and 2016–2017) on international law, international cyber policy, and cyber diplomacy. Eneken leads the Cyber Conflict Prevention project at CPI and heads the 1nternat10nal Law project focused on applying international law to state uses of ICTs. She is co-editor of the Routledge Handbook on International Cybersecurity (2020).