Comparative Strategy
ISSN: 0149-5933 (Print) 1521-0448 (Online) Journal homepage: https://www.tandfonline.com/loi/ucst20
Thinking about Nuclear Deterrence Theory: Why
Evolutionary Psychology Undermines Its Rational
Actor Assumptions
Bradley A. Thayer
To cite this article: Bradley A. Thayer (2007) Thinking about Nuclear Deterrence Theory: Why
Evolutionary Psychology Undermines Its Rational Actor Assumptions, Comparative Strategy, 26:4,
311-323, DOI: 10.1080/01495930701598573
To link to this article: https://doi.org/10.1080/01495930701598573
Published online: 24 Oct 2007.
BRADLEY A. THAYER
Missouri State University
Department of Defense and Strategic Studies
Fairfax, Virginia, USA
For too long, nuclear deterrence theorists have remained apart from the revolution in
the life sciences, and particularly evolutionary psychology, which has fundamentally
changed the scientific understanding of the human mind. As a result of advances in
evolutionary psychology, we now know that how the brain interprets actions and makes
decisions is complicated, imperfect, greatly dependent upon emotions, and varied among
humans. Consequently, it is fundamentally naïve and dangerous to assume a similar
outcome in deterrent situations when there is variation in cognition among leaders.
The rational deterrence model’s assumption of a universal rationality is irredeemably
flawed, and students of nuclear deterrence must replace it with a gradated understanding
of rationality.
One of the major findings of the 9/11 Commission Report was that some knowledge of the
attackers and their plot was held by some elements of the intelligence community but not
by all, and no single governmental entity held all the pieces necessary to stop the attack.1
The commission berated the intelligence community for permitting key intelligence to be
stovepiped, held by one community and known within it, even widely known, but not shared
with others.
The problem of stovepiped knowledge is a concern not only for the intelligence community, or more broadly the government; it is also present in academic communities,
where it is equally hard to break down. As science advances in one area, it takes time for related disciplines to take note and incorporate new scientific knowledge, and more remotely
related disciplines may not notice at all.
We have such a situation today, where great advances in the life sciences, particularly
genetics and cognitive neuroscience, have made possible the rise of evolutionary psychology
and the related fields of biological and cognitive psychology.2 Through their combined
efforts, these sciences are revolutionizing our knowledge of the causes of human behavior
at genetic and somatic levels. For the first time, we are able to perceive how the brain
functions and to understand, in equal parts, what a wonderful and imperfect
organ it is. But few scholars in the social sciences notice.3
Most often, the difficulty in removing academic stovepipes is not an urgent matter,
and those wishing change may take heart in Paul Samuelson’s famous quip about theories advancing one funeral at a time. Nonetheless, there are rare instances where academic stovepiping is a critical and time-urgent problem. The great progress in evolutionary
psychology is such a problem because of its implications for deterrence theory. (The author
thanks Abigail Marsh for her key insights, and John Friend and Jason Wood for their
comments and research assistance.) This brief article explains how evolutionary psychology
undermines rational deterrence theory. My
argument is important because the key assumption of rational deterrence theory, that nuclear
decision makers will make rational decisions about nuclear use due to their fundamental
rationality, is so influential. This belief is widely shared among governmental decision
makers, the arms control community and major media, and in academic circles. And it is
wrong.
Evolutionary psychology is causing a revolution in our understanding of the human
brain. Comprehending the human brain is now possible due to an understanding of genetics,
neural processing, and technology like the functional MRI (fMRI), which allows scientists
for the first time to be able to understand how the human brain functions by identifying
brain regions with increased blood flow corresponding to increased neuronal activity, after
a stimulus (such as the word “cat”) is provided to a patient. Much work remains to be done
by evolutionary psychologists, but the results are already impressive. The data are troubling
for any discipline or theory that assumes a rational actor.
As a result of advances in evolutionary psychology, we now know that the human
mind is heavily influenced by the environment and body. How the brain interprets actions and makes decisions is complicated, imperfect, greatly dependent upon emotions,
and varied among humans. There is tremendous variation in the human brain, with the
result that threats that work in most circumstances will not work in all and that the appreciation of consequences, upon which rational deterrence theorists depend, cannot be
assumed.
Accordingly, it is fundamentally naïve and dangerous to assume a similar outcome
(e.g., that nuclear deterrence will obtain) in all situations when there is variation in people
(e.g., leaders), even when the consequences are great, as they are when nuclear weapons are
involved. This finding has enormous implications for nuclear deterrence theory: the rational
deterrence model’s assumption of a universal rationality in the face of a nuclear deterrent
threat is irredeemably flawed.
The Rational Actor Assumption of Nuclear Deterrence Theory
The assumption of the rational actor is the anchor for much of nuclear deterrence theory,
and the granite to which the anchor holds is the belief that an opponent will behave as the
actor would in the same conditions. This is the assumption of the “Cartesian mind,” dating
to René Descartes’ philosophical conception that mind and body are discrete entities, that
there is a sharp and resolute distinction between body and mind, that rationality is fully
a function of the mind, and so mind is superior to body.4 It is the belief in the Cartesian
mind that the science of evolutionary psychology is destroying, and replacing with a more
complicated and more sophisticated understanding of the human brain based on the somatic
and neural underpinnings of reason.
The assumptions of the rational actor and the Cartesian mind are reasonable and understandable for several reasons, not the least of which is that the development of nuclear deterrence theory was heavily influenced by game theorists, for whom the Cartesian
mind was a prerequisite. Of course, the assumption of the Cartesian mind is essential to
more than deterrence theory; it is often critical for game-theoretic and formal modeling
analyses and, consequently, for economic theorizing. Thus, the revolution in evolutionary
theory may be applied far beyond nuclear deterrence theory, and
as science advances, other disciplines and methodologies will have to acknowledge its
implications.
The arguments of the rational deterrence theory school of thought depend on the Cartesian mind. The essence of the argument that concerns us here is that the threat of unacceptable retaliation will be sufficient to deter aggression. It is easy for all cultures, economic
systems, ideologies, and leaders to understand the consequences of nuclear destruction,
and that understanding is a powerful deterrent to aggression. Proponents of this school
include such eminent nuclear deterrence theorists as Bernard Brodie, Thomas Schelling,
Glenn Snyder, and Robert Jervis.5
Thomas Schelling captured the essence of rational deterrence theory’s position on
rationality when he wrote: “you can sit in your armchair and try to predict how people will
behave by asking how you would behave if you had your wits about you. You get, free of
charge, a lot of vicarious, empirical behavior.”6
The proponents of rational deterrence theory may be correct in the U.S.-Soviet case.
The history of the Cold War is debatable. But, even if so, generalizing that outcome from the
U.S.-Soviet case is laden with risk. Keith Payne’s scholarship has demonstrated, first, many
of the difficulties and near-run failures of deterrence during the Cold War; and, second,
why deterrence in the post–Cold War era is fraught with multiple and varied perils
and hazards, some of which are similar to deterrence in the Cold War but many of which
are not.7
The insights of evolutionary psychology into human rationality further weaken rational
deterrence theory in circumstances where a single leader, or small groups of individuals,
makes decisions concerning whether to employ nuclear weapons. Accordingly, its insights
are applicable to decision making in North Korea or Iran, and other states where small
numbers of individuals control the decision to use nuclear weapons, and so where the psychology of these few individuals is more salient. The force of my argument is significantly
reduced when we consider those decisions in the context of an established nuclear power
and democracy like the United States.
The First Insight for Deterrence Theory: Cognitive Abilities Vary Greatly
among Humans Due to Human Evolution, Biology, and Trauma
Evolutionary psychology’s first contribution is to explain scientifically what we already
know intuitively and through personal experience. People’s minds can differ in many important respects, including in the ability to reason, in judgment, in decision making, and
thus in what is considered “rational” or “rational behavior” to the individual.8 Accordingly,
how people react in any situation will vary.
Of course, the causes of differences in rationality are multiple. Some of these causes
have been well studied in the deterrence literature. For example, psychological causes of
rationality failures such as misperception are analyzed by Robert Jervis, Richard Ned Lebow,
and Janice Gross Stein.9 In a similar vein, Jerrold Post’s scholarship on political-personality
profiling is important due to its insights into the leadership of states like Cuba, Iran, and
North Korea, and of terrorist organizations like al Qaeda.10 Psychologists would now call
these insights on misperception or leadership personality the work of social psychology
rather than evolutionary psychology.
What evolutionary psychology permits us to grasp is that these differences often stem
from physical differences, social or environmental forces, or problems that are not sui
generis, but the result of the evolutionary process that has produced significant variation
in the human brain, and thus mind.11 There are three major sources of
variation.
First, these physical differences may be natural; for example, caused by variations in
genotype, which, in turn, is a result of evolution by natural selection.12 It is important to bear
in mind that evolution does not guarantee what we would deem perfectly rational behavior.
This is because what is in the genes’ best interest may not be in the vehicle’s best interest.
An understanding of evolution allows us to grasp that humans are rational enough to survive
in the environments of evolutionary adaptation. Equally certain, evolution also produced
mental pathways or shortcuts that lead to irrational outcomes. Evolutionary psychology is
only now beginning to document these mental shortcuts or problems with perception that
lead to irrational outcomes. Phobias may be of this category, arising out of the recurrent threats
faced by humans in the environments of evolutionary adaptation.
Second, people’s brains, and thus minds, vary as a result of other biological causes.13
The biological causes of variation are being explored by evolutionary psychologists in some
detail. This work is fascinating and allows us to understand how complicated the brain is
and how poorly we understand its evolution and functioning.
Keith Stanovich’s work has been particularly important in this instance. Stanovich and
his colleagues have popularized the theory held by many evolutionary psychologists that
there are two types of cognition that operate simultaneously, each with its own strengths
and weaknesses. This is called the “dual-process” or “two-process” theory of cognitive
functioning.14 The importance of this is to recognize that humans have more than one
thought process operating at the same time, an autonomous set of systems and an analytic
one. The autonomous set evolved first and the analytic one was built on top of it, but the
thought processes interact and influence each other. Such an imperfect structure came about
due to evolution. Each system solved problems that humans encountered over evolutionary
time, but with the result that the brain is not an ideal, seamlessly operating system; it is
one with a host of computational biases that promoted efficacy of response in its evolution
rather than the pursuit of reason.
After all, an antelope runs from a lion not because it has been bitten before by a lion
and does not want to repeat the experience. It runs from any large object that intrudes into
its comfort zone, even animals meaning it no harm. Mental heuristics or tools like “run from
any large object approaching,” or “watch out for what might be lurking in the river when
you are near it” or a deep and persistent unease about what might be near you when you are
swimming in the ocean remain in our mental architecture as threat detection devices and
therefore influence thought.
Additionally, Stanovich and Richard West confirm that human behavior often departs
from normative models or the expectations of behavior that rational choice theorists, and
thus, rational deterrence theorists, postulate.15 They find that there is such variability in human reasoning that assumptions of rational competence across populations are problematic.
People make decisions based on internal mechanisms that differ even if threat or other
environmental factors are held constant. Importantly, this is the case for some people even
if the threat is very high, as would be the case with nuclear threats. While most people will
be deterred by nuclear weapons most of the time, it is equally true that some people, in
some situations, will not be because their brains differ in critical ways. Thus, how they
reason in critical situations is dissimilar, and so how they act will be different as well when
compared with a normal or control population of people.
Third, the physical differences may be temporary or permanent as a result of infection,
intoxication, or emotional or physical trauma. The classic example of spectacular trauma
is the 1848 tragedy of Phineas Gage, a Vermont railroad worker who had a three-foot
metal rod pass through his brain in an explosion, which did not kill him, but removed
the limbic areas of his brain—the part of the brain concerned with moral reasoning.16
Gage survived and lived until 1861, but he was a completely different person. The rod that
passed through the left side of his skull “compromised the prefrontal cortices in the ventral
and inner surfaces of both hemispheres while preserving the lateral, or external aspects of
the prefrontal cortices.”17 The part of the brain critical for normal decision making, “the
ventromedial prefrontal region,” was greatly injured or removed, hindering or eliminating
Gage’s “ability to plan for the future, to conduct himself according to the social rules he
previously had learned, and to decide on the course of action that ultimately would be most
advantageous to his survival.”18
Gage’s story is remarkable and vivid, but not unique. The medical literature contains
many other examples of less dramatic injuries to the brain that reveal results
as interesting as Gage’s saga. These cases have allowed physicians to recognize how complicated trauma to the brain is, how it affects moral choices, and how emotions influence
moral judgment. A study of patients with focal damage to the ventromedial prefrontal cortex
(VMPC) is particularly important. The VMPC “projects to basal forebrain and brainstem
regions that execute bodily components of emotional responses, and neurons within the
VMPC encode the emotional value of sensory stimuli.”19 The study reveals that patients
“exhibit generally diminished emotional responsivity and markedly reduced social emotions
(for example, compassion, shame, and guilt) that are closely associated with moral values,
and also exhibit poorly regulated anger and frustration tolerance” in other circumstances.20
At the same time, there was no impairment of their ability to reason logically, general
intelligence, or declarative knowledge of social and moral norms.
In essence, these patients differ from the rest of the population only when it comes to
high-conflict personal moral dilemmas, such as sacrificing some lives for a greater good.
In life-or-death situations, these patients have no emotional reaction, and so “people with
this rare injury expressed increased willingness to kill or harm another person if doing so
would save others’ lives,” and they might also be willing to sacrifice a population for the
greater good.21
The prefrontal cortex is the “guardian angel” of human behavior. In most cases, it
forces us to take a step back from initial thoughts of rage. In these individuals, it is not
normal, and as we move further away from the mean, we enter the realm of the extreme
male mind and, further still, the dominion of the psychopath: a person who does not
feel fear and has little or no empathy, as revealed by patient interviews and tests such as fMRI
and Positron Emission Tomography analyses, which document deficits in the prefrontal
cortex for convicted murderers.22 The extreme male mind might have evolved due to
the pressures of hunting, navigation, leading small bands of males (some of whom were
potential rivals), defense against predation from animals or conspecifics, or battle.
The power of these findings may be appreciated when we apply them to political decision
makers. Leaders of states are far from a typical group of people. They are older than
average, overwhelmingly male, with an overrepresentation of the characteristics of the
extreme male mind including great ambition, a great sensitivity to threats, less sensitivity
to fear, aggression, a high tolerance for risk and solitude or isolation, and low empathy for
other humans.23 These behaviors are so common among political leaders or scientists that
they are ordinary.
Think of Isaac Newton’s ability to enter into seclusion for days while working on a
theoretical problem, or Albert Einstein’s intense focus while doing the same. Consider as
well Stalin’s reactions to mass killings (one death is a tragedy, a million deaths is a statistic)
or, as a man, his reactions to his mother’s or son’s death.24 Mao’s mind must have worked
remarkably like Stalin’s. Reflect on Mao’s similar reactions to mass killings during the
Revolution, the Great Leap Forward, and the Cultural Revolution, or, again at a personal
level, as a man, to his son’s death in the Korean War.25
More recently, there have been numerous stories in the British press about the behavior
of Gordon Brown, who has been described as “a control freak,” “taciturn,” “uncooperative,”
but, tellingly, really no different from most of those who are at the top of politics. According
to one former British minister, who worked with Brown: “If being psychologically flawed
means he focuses on details, almost obsessively so about the detail, then he is guilty as
charged. He is a workaholic. He is difficult. But tell me which prime minister hasn’t been.”26
F. Scott Fitzgerald’s comment to Ernest Hemingway that the very rich really are different
also applies to leaders’ brains.
Clearly, the leader of a nuclear state with VMPC trauma or VMPC symptoms due to
genetic causes would be a particularly difficult case to deter. He might be very willing to
sacrifice a population or a segment of a population for the general will or an ideological
goal. If we ponder the great leaders of history, we cannot know, of course, precisely how
their brains functioned, but if we consider their behavior through this lens, some of their
actions are more understandable.
Ultimately, the behaviors associated with these variations in the male brain would have
paid dividends for humans in the environments of evolutionary adaptation of the human
past.27 These individuals are the most likely to be ruthless, to be in positions of
leadership, and to engage in bold and risk-accepting behavior that, if unchecked by other
factors, will lead to deterrence failure.
The Second Insight for Deterrence Theory: The Mind is the Product
of the Brain and Somatic Inputs Such as Emotion
Evolutionary psychology now recognizes the importance of emotion and the body for
rational thought.28 This finding results in the “death of Descartes,” since René Descartes
argued for a sharp and resolute distinction between body and mind, and fully located
rationality in the mind. Descartes’ position was overwhelmingly influential for centuries, but
is now rejected by science. Psychologists submit that a mind/body bifurcation is impossible,
and information provided by the emotions is a key part of good decision making.29 Indeed,
emotions may have evolved to help humans make decisions that would advance fitness over
evolutionary time.30
The abandonment of Descartes’ position is required because evolutionary psychology
has revealed how somatic based signals to the brain influence and can impair reasoning.
Modern psychology has revealed the importance of each of the components of the brain,
including the amygdala and nucleus accumbens. The amygdala triggers social and emotional reactions, particularly those related to fear, aggression, and to faces that humans
deem “unapproachable and untrustworthy,” all of which is consistent “with the amygdala’s
demonstrated role in processing threatening and aversive stimuli.”31 The nucleus accumbens
integrates cognitive information and emotional information.32
The discipline has shown that failures of rationality often occur due to the influence of
biological drives that can be manifest as emotion or as mood, a period of emotion that
lasts longer than a few hours. But there is much to be done to reveal the mysteries
of the human brain’s components.
While each of the components of the brain needs to be studied in more detail, the work
of neuroscientist Antonio Damasio is particularly important because he has
shown how the brain integrates with the rest of the body. Damasio’s scholarship reveals two
critically important elements: first, the human brain and body form an indissociable organism,
integrated by means of interacting biochemical and neural regulatory circuits; and second,
the organism interacts with the environment as an ensemble—neither body nor mind is
alone.33 In sum, emotions are essential for rational thinking. They are part of the neural
underpinnings of reason. Feeling is a product of the body as well as the brain, and, again,
somatic sources of feeling vary among humans due to evolution.
A major implication of this argument is that our biological drives, and the automated somatic-marker mechanisms that interpret them in the brain, influence humans at
subconscious levels. These are emotions, and emotions are considered by psychologists to
be a mental state of readiness for action, or a change in readiness, which may not be made
consciously.34 We may think of this as our gut feelings that have considerable attentional
properties, such as: “Danger!,” “She is really hot!,” “Wonderful!,” or “Go for it!” They
are essential for some rational behaviors, especially in the personal and social domains. It
is no exaggeration to conclude that we could not function without them. As Keith Oatley
explains, “an emotion tends to specify a range of options for action. When frightened, we
evaluate a situation in relation to a concern for safety and become ready to freeze, fight,
or flee. We stop what we are doing and check for signs of danger.”35 In such emotional
states “we are pressed toward a small range of actions in a compulsive way,” including a
preoccupying, even compulsive, inner dialogue.36
Yet, it is equally true that emotions as causes of a change in action readiness can be
pernicious to rational decision making in certain circumstances. This is because research
has shown that events “encoded in a certain state of affect or mood are more retrievable than
in a different” state; that is, the encoding and retrieval of memory can be mood-dependent.37
In addition, they create an overriding bias against objective facts, or interfere with support
mechanisms of decision making such as one’s working memory, or the lessons we draw
from past experience.
There are numerous examples from history of this influence on decision making. Think
of Hitler acting on his gut reactions to events. Saddam Hussein did. Today, we worry that
Kim Jong Il may act this way as well. A more prosaic manifestation is the fear of flying.
Many people fear flying more than driving in spite of the fact that the rational
calculation of risk unequivocally demonstrates that we are far more likely to survive a flight
between two cities than a drive between them. Yet, the fear is still there because the somatic
marker is more powerful than countervailing reason.
Interestingly, Chandra Sripada and Stephen Stich postulate not only that emotions
are critically important in specific aspects of decision making, specifically, the mental
representation of norms, values, and goals, but that cultural aspects are important as well.38 As
Sripada and Stich explain, some cultures, “cultures of honor,” will have more aggressive
responses to a challenge than others. These are cultures that have developed in many
parts of the world where:
males in these cultures are prepared to protect their reputation for strength
and probity by resorting to violence. The importance placed on reputation for
strength leads to a corresponding importance placed on insult and the necessity
to respond to it. An individual who insults someone must be forced to retract.
If the instigator refuses, he must be punished with violence or even death.39
They suggest that cultures of honor have arisen at many different places in the world in
conditions “where resources are liable to theft and where the state’s coercive apparatus
cannot be relied upon to prevent or punish theft. These conditions often occur in relatively
remote areas where herding” is important and the “‘portability’ of animals makes them
prone to theft”; and consequently a reputation for strength and a willingness to display
aggression would have major benefits.40
Their research has documented that males who are insulted will have much higher
physiological responses, significantly greater increases in cortisol and testosterone, than
males from other cultures who are exposed to the same stimuli in tests.41 We can imagine
that similar factors might be at play in matters of prestige.
Thus, somatic factors heavily influence the mind, as do social forces such as one’s
culture. If these influences are unrestrained by other forces, as is the case for a leader in a
dictatorship, they can defeat the outcome predicted by rational decision making and result
in deterrence failure. In consequence, rational deterrence theory must be modified so that
practitioners acknowledge the importance of this variation.
The Third Insight for Deterrence Theory: Kahneman and Tversky’s
Central Arguments are Supported
Briefly, I want to touch upon the important arguments of Daniel Kahneman and Amos
Tversky. My objective is not to review their arguments, made over twenty years, in detail,
but rather to note how evolutionary psychology confirms the thrust of their arguments.
Evolutionary psychology allows us to understand that the cognitive errors identified
by many psychologists, including most importantly Kahneman and Tversky’s arguments
on heuristics and biases, are indeed the product of the human brain.42 When they first
advanced their arguments in the 1970s, Kahneman and Tversky were widely attacked, but
their research has since transformed psychology, and the insights of evolutionary psychology
support their arguments, so deterrence theorists should have no hesitation drawing upon
their research.43
Under the standard view of rationality used by economists and most deterrence theorists,
it is assumed that people have stable, underlying preferences for each of the options presented
in a decision situation. That is, a person’s preferences for the options available for choice
are complete, well ordered, and ranked in accordance with the axioms of rational choice, including transitivity.
Kahneman and Tversky have shown that this conception of rational decision
making with well-ordered preferences is not true—people do not have stable, well-ordered
preferences, and humans have multiple factors operating upon them that lead to nonrational
outcomes. These can be social pressures, as many psychologists have identified. But they
also can be the cognitive biases, the “faulty” wiring of the brain, and a great dependence
on judgmental heuristics, or mental short cuts.
The key findings of their work, called “cumulative prospect theory,” are that humans
have a “value function that is concave for gains, convex for losses, and steeper for losses
than for gains,” and “a nonlinear transformation of the probability scale, which overweights
small probabilities and underweights moderate and high probabilities.”44
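The asymmetric value function and the nonlinear probability weighting that Kahneman and Tversky describe can be made concrete. The sketch below is illustrative only: it uses the functional forms and median parameter estimates (α ≈ β ≈ 0.88, λ ≈ 2.25, γ ≈ 0.61) reported in Tversky and Kahneman’s 1992 statement of cumulative prospect theory; the function names themselves are mine, not theirs.

```python
# Illustrative sketch of cumulative prospect theory's two key functions,
# using Tversky and Kahneman's (1992) median parameter estimates.

ALPHA = BETA = 0.88   # diminishing sensitivity for gains and losses
LAM = 2.25            # loss-aversion coefficient: losses loom larger
GAMMA = 0.61          # probability-weighting curvature (gains)

def value(x):
    """Value function: concave for gains, convex and steeper for losses."""
    return x ** ALPHA if x >= 0 else -LAM * (-x) ** BETA

def weight(p, gamma=GAMMA):
    """Decision weight: overweights small probabilities,
    underweights moderate and high ones."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Loss aversion: a loss of 100 hurts more than a gain of 100 pleases.
assert abs(value(-100)) > value(100)

# A 1% chance is treated as if it were larger than 1%...
assert weight(0.01) > 0.01
# ...while a 95% chance is treated as if it were smaller than 95%.
assert weight(0.95) < 0.95
```

With these parameters a loss of 100 is valued at roughly −129 against roughly +58 for an equal gain, and a 1% probability carries a decision weight of about 5.5%: exactly the asymmetries that an expected-utility model of the deterred actor assumes away.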
The implications of cumulative prospect theory for my purposes are three.45 Each adversely
affects decision making. The first, called framing effects, is that people’s choices—even
choices about very important decisions—can be altered by irrelevant changes in how the
alternatives are presented and in how they respond to the alternatives presented to them.
Many cognitive biases, including the endowment effect, result from different, nonrational
weights that people put on risks versus benefits, or losses versus gains.
Human minds frame information in specific ways, often using heuristics. Each of these
can introduce failures in rationality. When humans frame, the process itself can affect their
preferences. A classic example is querying subjects in a psychological test: “Which cancer
therapy do you choose? The first has a 90% survival rate, the second a 10% mortality rate.”
Alternative descriptions of a decision problem give rise to different preferences. Indeed, how
we frame can cause an outright reversal of our actual preferences, violating the principle of
invariance so important to rational choice.
Second, in certain circumstances humans will be risk seeking. Tversky and Kahneman
report two interesting points. First, that people “often prefer a small probability of winning a
large prize over the expected value of that prospect”; and second, “risk seeking is prevalent
when people must choose between a sure loss and a substantial probability of a larger
loss.”46
The third is loss aversion. For the human brain, there is a strong asymmetry between
gain and loss.47 Losses are always perceived more acutely, and humans are risk accepting
in the domain of losses. Humans have a strong attachment to possessions, and material losses
are felt more powerfully than nonmaterial ones.
Kahneman and Tversky's findings mean that bargaining and negotiation are more
difficult than rational choice theorists recognize, because a concession that results in a
loss will weigh more heavily than a concession that results in a gain. Humans rely widely on
heuristics, which adversely impact rational decision making. One of these weaknesses is a
devastating ignorance and defective use of probability theory and statistics. A
vivid story, even a fictive one, seems more real to humans than a statistical table. Another is that
humans tend to make up their minds quickly and change them only with reluctance, and to
attribute behavior to dispositions rather than impersonal causes. We may prefer to think of
the British military leadership in World War I as stupid “donkeys,” rather than recognizing
that no one, not the British, French, or Germans, could solve the problem of stalemate on
the Western Front.
In sum, prospect theory not only confirms what we already know, that our choices can
be fallible, but also shows that the patterns of fallibility can be identified, and thus demonstrates the
contingency of assumptions of rationality. For deterrence theory, this means that decision
making will diverge from the expectations of rational deterrence theory in the domain
of losses, and theorists must also pay close attention to how events are framed by the
participants on all sides of a deterrence equation.
These sources of failure for rationality cannot be turned off; they can only be overridden
if individuals are sensitive to them, or if there are countervailing pressures on the individual.
Ultimately, they arise because of the manner in which the human brain evolved. The brain
evolved not seamlessly but modularly. Each module was created by evolutionary pressures,
building over evolutionary time upon the others and connecting to the rest imperfectly.
Students of psychology and nuclear deterrence theory must always keep in mind that the
human brain did not evolve for the conditions in which humans in modern societies have
found themselves over the last few generations. Rather, humans evolved over
hundreds of thousands of generations in radically different conditions marked by resource
scarcity and omnipresent threats from the natural world and from conspecifics.
Conclusion
Humans do not have a Cartesian brain. They have a human brain. Its cognitive capabilities
are at once impressive and imperfect, as is everything produced by evolution. Drawing
on evolutionary psychology allows deterrence theorists to understand that there is great
variation in the human brain and that it is influenced by heretofore unappreciated factors,
and these insights allow deterrence theorists to grasp that deterrence may fail for solid
biological reasons (cognitive, somatic), as well as for environmental (political/cultural,
etc.) or other causes.
That recognition, in turn, should force deterrence theorists to abandon sanguine assumptions about rationality based on the Cartesian brain that economists have used for
generations and upon which rational deterrence theory is anchored. This recognition also
compels us to question the rationality of political decision makers, particularly those in
authoritarian and totalitarian states. Evolutionary psychology allows us to understand what
most deterrence theorists would label as flawed decisions taken by imperfect leaders, and
to understand that imperfect leaders are inevitable since they are the product of human
evolution, which has produced variations among people in many physical respects. And
it allows students of nuclear deterrence to appreciate the importance of having checks on
political decision making, as democracies do.
The threat of annihilation due to nuclear weapons is real and potent. But it is equally true
that minds vary, and that threat is not sufficient to deter all people. Threat is a product
of the mind; the mind depends on the brain; and the brain on neural events and somatic inputs such as
hormones. Good health always helps the functioning of each. Each of these elements
varies across populations. In leaders, certain traits are going to be overrepresented—many of
these traits are necessary for good leadership. But they may also lead to deterrence failures
since leaders may react rashly or precipitously, or in defiance of reality, as did Adolf Hitler
or Saddam Hussein.
The greatest impact is where decisions are taken by a dictatorial leader. Understanding
that his mind may interpret data differently than you would is the beginning of comprehension. Not everyone who sits in Thomas Schelling’s armchair and reflects on a particular
situation will come to Schelling’s conclusions. He will. Others will concur with him. But
those who matter—leaders of dictatorial states—certainly may not. What is salient for
nuclear deterrence is not the mind of Thomas Schelling but the mind of Kim Jong Il.
Notes
1. The 9/11 Commission Report: Final Report of the National Commission on Terrorist Attacks
Upon the United States (New York: Norton, n.d.), p. 403. For a more detailed analysis of the problem
of stovepiping in the intelligence community, see William J. Lahneman, “Knowledge-Sharing in the
Intelligence Community After 9/11,” International Journal of Intelligence and Counterintelligence,
vol. 17 (October–December 2004): 614–633.
2. Each of these fields is closely related, of the same psychological family if not twins. For
brevity’s sake, I will refer to evolutionary psychology throughout this article. As with all evolutionary explanations, evolutionary psychology provides ultimate explanations of human cognition, the
evolutionary foundations of human behaviors and the cognitive processes associated with them. The
types of explanations focus on determining that a certain behavior is widespread or even universal, and so evolved or innate, and on showing why a particular behavior would be adaptive.
Cognitive psychology focuses on proximate explanations and is centered on how humans think. It
includes topics like memory, learning, attention, language, and motor control. Biological psychology
also is concerned with proximate explanations of what the different components of the brain do. I am
grateful to Dr. Abigail Marsh, Mood and Anxiety Program, National Institute of Mental Health for
these distinctions. Dr. Abigail Marsh, personal communication, April 17, 2007.
3. Certainly there are some in the social sciences who recognize the revolution in the life
sciences. The relatively new discipline of evolutionary psychology is perhaps the most important.
Steven Pinker, John Tooby, and Leda Cosmides, among others, deserve great credit for helping to
bring this about through their scholarship, and for training the next generation of scholars who will
build on their work. See John Tooby and Leda Cosmides, “The Psychological Foundations of Culture,”
in Jerome H. Barkow, Leda Cosmides, and John Tooby, The Adapted Mind: Evolutionary Psychology
and the Generation of Culture (New York: Oxford University Press, 1992), pp. 19–136; and Steven
Pinker, The Blank Slate: The Modern Denial of Human Nature (New York: Viking, 2002). A few
scholars in political science are aware as well. The Association for Politics and the Life Sciences
and its journal Politics and the Life Sciences are the focal points for research. I have used the life
science approach to reveal the origins of war and ethnic conflict in Bradley A. Thayer, Darwin
and International Relations: On the Evolutionary Origins of War and Ethnic Conflict (Lexington:
University Press of Kentucky, 2004).
4. This argument is advanced most forcefully in René Descartes, Discourse on the Method (Je
pense donc je suis); Principles of Philosophy, and Meditations on First Philosophy (Cogito ergo sum).
See The Philosophical Writings of Descartes, vol. 1, trans. by John Cottingham, Robert Stoothoff,
and Dugald Murdoch (New York: Cambridge University Press, 1984), pp. 111–175, 179–291; and
The Philosophical Writings of Descartes, vol. 2, trans. by John Cottingham, Robert Stoothoff, and
Dugald Murdoch (New York: Cambridge University Press, 1984), pp. 1–60.
5. See Bernard Brodie, Escalation and the Nuclear Option (Princeton, NJ: Princeton University
Press, 1966); Thomas Schelling, The Strategy of Conflict (Cambridge, MA: Harvard University Press,
1960); and Schelling, Arms and Influence (New Haven: Yale University Press, 1966); Glenn H. Snyder,
Deterrence and Defense: Toward a Theory of National Security (Princeton, NJ: Princeton University
Press, 1961); and Robert Jervis, The Meaning of the Nuclear Revolution: Statecraft and the Prospect
of Armageddon (Ithaca, NY: Cornell University Press, 1989).
6. Thomas Schelling, quoted in Kathleen Archibald, ed., Strategic Interaction and Conflict:
Original Papers and Discussion (Berkeley, CA: Institute of International Studies, University of
California, Berkeley, 1966), p. 150. I thank Keith Payne for bringing this quote to my attention.
7. Keith B. Payne, Deterrence in the Second Nuclear Age (Lexington: University Press of
Kentucky, 1996); Payne, The Fallacies of Cold War Deterrence and a New Direction (Lexington:
University Press of Kentucky, 2001); and Payne, On Deterrence and Defense: After the Cold War
(Fairfax, VA: National Institute Press, 2007).
8. Keith E. Stanovich, Who Is Rational? Studies of Individual Differences in Reasoning (Mahwah, NJ: Lawrence Erlbaum, 1999). As Stephen Stich argues: “The mere fact that your cognitive
processes and mine are innate would not establish that they are the same” and even were we to “assume that all cognitive systems are innate and that all cognitive systems are optimal from the point
of view of natural selection, it still would not follow that all normal cognitive systems are the same.”
See Stephen P. Stich, The Fragmentation of Reason: Preface to a Pragmatic Theory of Cognitive
Evaluation (Cambridge, MA: MIT Press, 1990), pp. 73–74 (emphasis original).
9. Robert Jervis, Richard Ned Lebow, and Janice Gross Stein, Psychology and Deterrence
(Baltimore: Johns Hopkins University Press, 1985).
10. Jerrold M. Post, Leaders and Their Followers in a Dangerous World: The Psychology of
Political Behavior (Ithaca, NY: Cornell University Press, 2004).
11. John R. Anderson, Daniel Bothell, et al., “An Integrated Theory of the Mind,” Psychological
Review, vol. 111 (October 2004): 1036–1060.
12. Variation might be caused as well by genetic drift or genetic mutation, but would be
maintained by natural selection.
13. Naturally, there can be other sources of difference and different modes of reasoning. See
Richard E. Nisbett, The Geography of Thought: How Asians and Westerners Think Differently. . . and
Why (New York: Free Press, 2003).
14. This theory is advanced in Keith E. Stanovich, The Robot’s Rebellion: Finding Meaning in
the Age of Darwin (Chicago: University of Chicago Press, 2004).
15. Keith E. Stanovich and Richard F. West, “Individual Differences in Reasoning: Implications
for the Rationality Debate?” Behavioral and Brain Sciences, vol. 23 (October 2000): 645–726.
16. The details of Phineas Gage’s unfortunate story are found in Antonio R. Damasio,
Descartes’ Error: Emotion, Reason, and the Human Brain (New York: Avon Books, 1994), pp.
3–33.
17. Ibid. p. 32.
18. Ibid. p. 33.
19. Michael Koenigs, Liane Young, Ralph Adolphs, Daniel Tranel, Fiery Cushman, Marc
Hauser, and Antonio Damasio, “Damage to the Prefrontal Cortex Increases Utilitarian Moral Judgements,” Nature, vol. 446 (21 March 2007), published online edition, p. 1.
20. Ibid., p. 1.
21. Benedict Carey, “Brain Injury Said to Affect Moral Choices,” The New York Times, March
22, 2007, available at: http://www.nytimes.com/2007/03/22/science/22brain.html?ref=science (accessed March 22, 2007).
22. Here the work of Adrian Raine and his colleagues has been particularly important. See
Adrian Raine, Monte Buchsbaum, and Lori LaCasse, “Brain Abnormalities in Murderers Indicated
by Positron Emission Tomography,” Biological Psychiatry, vol. 42 (1997), 495–508; and Sharon
Ishikawa and Adrian Raine, “The Neuropsychiatry of Aggression,” in Randolph B. Schiffer, Stephen
M. Rao, and Barry S. Fogel, eds., Neuropsychiatry (2nd ed., Philadelphia: Lippincott Williams and
Wilkins, 2003), pp. 660–679.
23. The literature on this topic is expanding rapidly. An excellent place to start is Simon Baron-Cohen, The Essential Difference: The Truth about the Male and Female Brain (New York: Basic Books,
2003). Also see Linda Mealey, Sex Differences: Developmental and Evolutionary Strategies (San
Diego: Academic Press, 2000); Steven E. Rhoads, Taking Sex Differences Seriously (San Francisco:
Encounter Books, 2004); and Irwin Silverman and Marion Eals, “Sex Differences in Spatial Abilities:
Evolutionary Theory and Data,” in Barkow, Cosmides, and Tooby, eds., The Adapted Mind, pp.
533–549.
24. Stalin’s reaction to each of these events is documented in Simon Sebag Montefiore, Stalin:
The Court of the Red Tsar (New York: Knopf, 2004).
25. On the death of Mao’s son see Jung Chang and Jon Halliday, Mao: The Unknown Story
(New York: Knopf, 2005), pp. 378–379. This work is a valuable contribution for understanding why
Mao would wage democide against the Chinese people.
26. Andrew Pierce, "If You're Not On His Side, You Don't Figure," Daily Telegraph, March 14, 2007, available at: http://www.telegraph.co.uk/news/main.jhtml?xml=/news/2007/03/14/nbrown14.xml, accessed on March 14, 2007.
27. For a discussion of the environments of evolutionary adaptation and its influence on human
behavior see Thayer, Darwin and International Relations, pp. 22–59, and 96–218.
28. A good introduction is Michael S. Gazzaniga, Nature’s Mind: The Biological Roots of
Thinking, Emotions, Sexuality, Language, and Intelligence (New York: Basic Books, 1992).
29. For an introduction into emotion see Keith Oatley, “The Structure of Emotions,” in Paul
Thagard, ed., Mind Readings: Introductory Selections on Cognitive Science (Cambridge, MA: MIT
Press, 1998).
30. There is a growing body of literature advancing this argument. Excellent and easily approachable studies are Marc D. Hauser, Moral Minds: How Nature Designed Our Universal Sense of
Right and Wrong (New York: HarperCollins, 2006); and Frans de Waal, Primates and Philosophers:
How Morality Evolved (Princeton, N.J.: Princeton University Press, 2006).
31. Ralph Adolphs, Daniel Tranel, and Antonio R. Damasio, “The Human Amygdala in Social
Judgment,” Nature, vol. 393 (4 June 1998): 472.
32. Brandon M. Wagar and Paul Thagard, “Spiking Phineas Gage: A Neurocomputational
Theory of Cognitive-Affective Integration in Decision Making,” Psychological Review, vol. 111
(January 2004): 67–79.
33. In addition to Damasio, Descartes’ Error; see Damasio, Looking for Spinoza: Joy, Sorrow,
and the Feeling Brain (New York: Harcourt, 2003).
34. Oatley, “The Structure of Emotions,” p. 244.
35. Ibid., pp. 244–245.
36. Ibid., p. 245.
37. Eric Eich and Jonathan W. Schooler, “Cognition/Emotion Interactions,” in Cognition and
Emotion (New York: Oxford University Press, 2000), p. 5.
38. Chandra Sekhar Sripada and Stephen Stich, “Evolution, Culture, and the Irrationality of
the Emotions,” in Dylan Evans and Pierre Cruse, eds., Emotion, Evolution, and Rationality (New
York: Oxford University Press, 2004), pp. 133–158. For their conception of “culture of honor,” they
reference Richard E. Nisbett and Dov Cohen, Culture of Honor: The Psychology of Violence in the
South (Boulder, CO: Westview Press, 1996).
39. Sripada and Stich, “Evolution, Culture, and the Irrationality of the Emotions,” p. 147.
40. Ibid., p. 147.
41. Ibid., pp. 147–149.
42. Their views were first advanced in Amos Tversky and Daniel Kahneman, “Judgment under
Uncertainty: Heuristics and Biases,” Science, vol. 185, no. 4157 (September 1974): 1124–1131; and
further developed in Daniel Kahneman and Amos Tversky, “Judgment Under Uncertainty: Heuristics
and Biases,” in Daniel Kahneman, Paul Slovic, and Amos Tversky, eds., Judgment Under Uncertainty:
Heuristics and Biases (London: Cambridge University Press, 1982), pp. 3–20.
43. The last jointly written response is Daniel Kahneman and Amos Tversky, “On the Reality
of Cognitive Illusions,” Psychological Review, vol. 103 (July 1996): 582–591. Amos Tversky died in
1996.
44. Amos Tversky and Daniel Kahneman, “Advances in Prospect Theory: Cumulative Representation of Uncertainty,” Journal of Risk and Uncertainty, vol. 5 (October 1992): 297–298.
45. The others discussed by Tversky and Kahneman are nonlinear preferences and source
dependence. Tversky and Kahneman, “Advances in Prospect Theory,” p. 298.
46. Ibid., p. 298.
47. Ibid., p. 298. This is also discussed in Daniel Kahneman and Dan Lovallo, "Timid Choices
and Bold Forecasts: A Cognitive Perspective on Risk Taking,” Management Science, vol. 39 (January
1993): 17–31; and Amos Tversky and Daniel Kahneman, “Loss Aversion in Riskless Choice: A
Reference-Dependent Model,” The Quarterly Journal of Economics, vol. 106 (November 1991):
1039–1061.