commentary

Knowing How Your Adversary Thinks: Influence in International Confrontations

Decision- and policy-makers need a set of revised influence and deterrence tools and approaches that are applicable to the modern security environment.

Published by the Leveraging Neuroscientific and Neurotechnological Developments Workshop on October 18, 2013


The interactive nature of deterrence and influence has been assessed for decades by students of political science, criminal justice, social and behavioral science, marketing, and psychology.

Their findings have been largely shaped by the nature of the challenges to be deterred or influenced. Those challenges have changed drastically in the last two decades, demanding novel approaches to a rapidly evolving threat environment.

Decision- and policy-makers need a set of revised influence and deterrence tools and approaches that are applicable to the 21st century security environment.

The workshop Leveraging Neuroscientific and Neurotechnological Developments with a Focus on Influence and Deterrence in a Networked World introduced an added layer of novel scientific insights from neuroscience and neurotechnology to complement and extend earlier assessments.

It focused on the ways that neuroscience could be leveraged to influence individual actors toward deterring possible interpersonal aggression and social violence.

Knowing How Your Adversary Thinks: Influence in International Confrontations

To manage crises and escalation, or to conduct deterrence operations, it is necessary to predict how an adversary will decide to respond to our actions. Effective deterrence and escalation management thus crucially depend on an understanding of psychology. My work seeks to apply new insights from the modern brain sciences to international security.

One core insight from neuroscience is that when we take an action, its impact on the adversary’s decision-making is crucially modulated by the action’s associated “prediction error.” The prediction error is simply the difference between what actually occurred and what the adversary expected. The bigger the prediction error, the bigger the psychological impact of the action.
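This definition can be written down as a tiny numeric sketch. The function names, magnitudes, and the linear scaling are illustrative assumptions, not part of the talk, which does not commit to any particular functional form:

```python
# Illustrative sketch of the prediction-error idea: impact is modulated by
# the gap between what occurred and what was expected. All names and
# numbers here are hypothetical.

def prediction_error(actual: float, expected: float) -> float:
    """Prediction error = what actually occurred minus what was expected."""
    return actual - expected

def psychological_impact(actual: float, expected: float, scale: float = 1.0) -> float:
    """Assumed model: impact grows with the magnitude of the prediction error."""
    return scale * abs(prediction_error(actual, expected))

# The same action (magnitude 5) hits harder when it was less expected.
impact_if_anticipated = psychological_impact(actual=5.0, expected=4.0)
impact_if_surprising = psychological_impact(actual=5.0, expected=1.0)
print(impact_if_anticipated, impact_if_surprising)  # 1.0 4.0
```

The point the sketch makes is that impact is a property of the gap, not of the action itself: an identical action lands very differently depending on the observer's expectations.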

One reason prediction errors matter is that they can cause inadvertent escalation in a crisis. When we take an action, we know in advance when, where, and how we will take it. The adversary has no such insider knowledge. To the adversary, then, the action is more unexpected, carries a larger prediction error, and therefore has a stronger psychological impact than we ourselves appreciate. Because this occurs with the actions of both sides, it can produce a spiral of inadvertent escalation.
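The spiral dynamic described above can be sketched as a toy model: because each side expects its own action but is partly surprised by the adversary's, the perceived impact on the receiving end is amplified, and the response exceeds the action that provoked it. The amplification factor and all numbers are purely illustrative assumptions:

```python
# Hypothetical toy model of an escalation spiral driven by prediction errors.
# Each response is proportional to the impact the receiver perceives; the
# receiver's prediction error inflates that perceived impact by a fixed
# (assumed) amplification factor, so actions grow round after round.

def spiral(initial_action: float, amplification: float, rounds: int) -> list[float]:
    actions = [initial_action]
    for _ in range(rounds):
        # Response exceeds the provoking action because the receiver
        # perceived it as larger than the sender intended.
        actions.append(actions[-1] * amplification)
    return actions

print(spiral(1.0, 1.5, 4))  # [1.0, 1.5, 2.25, 3.375, 5.0625]
```

Even a modest per-round amplification compounds quickly, which is the essence of the inadvertent-escalation argument: neither side intends the spiral, yet each response is locally "proportionate" to what that side perceived.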

There are many historical examples of prediction errors leading to inadvertent escalation, including the Soviet placement of nuclear-armed missiles in Cuba in 1962. Soviet decision-makers believed that this deployment was not markedly out of keeping with previous actions, and they underestimated the impact it would have on the United States. An example of a serious “near miss” caused by a prediction error is the Israeli reaction to the opening of the Yom Kippur War. Egyptian and Syrian forces had limited war aims in 1973. But to Israeli decision-makers the highly unexpected attack engendered a large prediction error, making them fear for the existence of the State of Israel. As a result, they discussed, and may have ordered, a nuclear alert, which would have been a potentially dangerous escalation of the conflict.

The preceding instance of prediction errors and inadvertent escalation is just one example of the widespread effects that prediction errors exert throughout military and diplomatic confrontations. Yet whilst their impacts are far-reaching, they can be captured by a simple framework. Further, a prediction-error framework subsumes many important existing strategic concepts: the psychological impact of surprise, for example, is simply a special case of prediction error. Together, these features make prediction errors attractive and feasible to operationalize for escalation and deterrence analysis.

I illustrate the potential role of prediction errors using a near-term Sino-U.S. escalation scenario over the Taiwan Strait. Prediction errors are a simple yet powerful tool, which can help U.S. decision-makers manage escalation and influence an adversary’s decision-making.

This speech can be found on page 11 of the full workshop transcript. It was presented at the Leveraging Neuroscientific and Neurotechnological Developments with a Focus on Influence and Deterrence in a Networked World Workshop.

Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.