Reasoning and Emotion, in the light of the Dual Processing Model of Cognition
Ronald de Sousa
Philosophy, University of Toronto (http://www.chass.utoronto.ca/~sousa)
sousa@chass.utoronto.ca
ABSTRACT:
I begin, in §1, with some distinctions and an attempt at a working characterization of rationality that is intended to be
usable across the domains of action, belief, desire, and even feeling. In §2 I sketch the ambiguous role that emotions
play in our capacity to reason. I suggest that emotions span the two tracks or “systems” posited by “dual process”
theories of reasoning, or what Daniel Kahneman (2011) has recently called “thinking, fast and slow”. In §3, the main
features of that hypothesis are described, and some questions raised about its significance. In §4, I briefly characterise
emotions and describe some of the ways in which they seem to contribute both indirectly and directly to our capacity
for reasoning, and straddle the two systems. In §5 I compare the learned and the evolutionarily more primitive
components of S1. In §6 I turn specifically to the contributions that epistemic feelings make to our epistemic ends. §7
summarises my main conclusions.
§ 1. Preliminaries: What is rationality?
To describe someone as rational is generally held to be a form of praise. It suggests that they
reason soundly, take appropriate notice of evidence in forming their opinions, and are willing to
change their minds when confronted with a good argument for doing so; it implies that their
attitudes are not grounded in superstition, swayed by prejudice, or driven by blind passions.
Although it is difficult to see how one might find fault with such characteristics, it is also true that
both laypersons and philosophers can be found to complain that one can be “too rational”. The
accusation can stem from a number of concerns. Before trying to sort these out, it might be useful
to fix the boundaries of my own usage. To that end, I begin with two stipulations about the word
‘rationality’, and make bold to offer a definition.
First, it is important to remember that the term can be used in either a normative or a
categorial sense. These are distinguished by their antonyms. In the categorial sense, the opposite of
‘rational’ is ‘arational’, which applies to inanimate things and lower animals. (Whether it applies
to higher nonhuman animals is contested). In the normative sense its antonym is ‘irrational’, which
is usually taken as pejorative. In the phrase ‘rational animal’ the word must, of course, be
understood in the categorial sense: it is precisely because human beings are capable of irrationality
that they are said to be rational animals.
Second, the word ‘rationality’ covers more than ‘reasoning’. The latter concept belongs
exclusively in the category of what is rational/irrational. There is no such thing as arational
reasoning. Furthermore, reasoning aims at rationality in transitions between mental states
(typically but not exclusively propositional states), to the exclusion of questions about the
acquisition of such states. But although my title mentions ‘reasoning’ rather than rationality, I shall
not adhere closely to this restriction. For my concern is with the role of emotions in epistemology
more generally, and emotions relate to intuitions as well as transitions. I shall be interested in both.
Now for my definition. Despite the disputes and the vast technical literature to which the
concept has given rise, I think it is possible to cut a fairly clean swath through those debates and
provide a generic definition of normative rationality and irrationality. I suggest that normative
rationality consists in the efficiency of means used in the pursuit of any given goal. Thus baldly
stated, the definition must appear simplistic. Most of the complexities of the notion are packed into
the questions that arise about the “goals” in question.
Our nature as human agents comprises four basic faculties. We experience the world, we
have desires and form beliefs about it, and we act to change it. In acting, we pursue goals in the
most obvious ordinary sense. So far, my simple definition works well enough: for any intentional
action, we can identify a goal and assess for efficiency the means we choose to attain it. Any putative
counterexample will, I surmise, rest on the fact that when pursuing a given goal, we necessarily
have a welter of other goals and concerns that must also be taken into account. This can cause
indefinitely many complications, but doesn’t impugn the general definition.
The specific form of rationality pertaining to each of the other faculties will be relative to the
characteristic ends of that faculty. Although the notion of a practical goal is the most intuitively
easy to grasp, it is not the most fundamental. More fundamental is the question of what goals are
worth having: we could call this the goal of correctness in desire, or simply of valuing what is
valuable. Similarly, we can criticise the rationality of our beliefs, in terms of the epistemic ends
that govern what we believe and how we acquire beliefs. We can also, though more
controversially, speak of the rationality of what we feel, providing we can identify the “ends” of
emotion. In that spirit, we can tabulate the main forms of rationality in terms of the distinct goals to
which they tend, their characteristic Direction of Fit (DoF), the intentional states typically
concerned, and the processes that are assessed for rationality, as in Table 1.
[TABLE 1 ABOUT HERE]
Note that there is a certain symmetry between our practical and evaluative goals on the one
hand and our epistemic goals on the other. For the pragmatic tradition, which goes back to
Protagoras, truth is a tool of success: one needs to know how the world is in order to act
effectively. Conversely, if one holds, with Socrates, that truth is the fundamental value, practical
success is just a consequence of correct belief: its corollaries are that “no one does wrong
willingly" and "virtue is knowledge" (see Meno and Protagoras in Plato 1997). In more recent
history, William Clifford (1886) was on the side of Socrates: one ought to care more about truth than
advantage – practical rationality be damned. William James (1979) was in the tradition of
Protagoras, and so perhaps was Richard Rorty (1979): you should care about real consequences
and not abstract truth – and your epistemic scruples be damned.
In the light of these proposals, we can interpret and rebut the suggestion that we can be too
rational. The reproach might stem from a number of concerns. One might be accused of being too
rational because one fails to acknowledge the emotional reality of so-called “irrational beliefs”. If
you have a strong feeling that so-and-so is not to be trusted, but are quite unable to articulate any
reasons for that judgment, you might insist that those who deride your hunch are being “too
rational”. Since hunches of this sort not infrequently turn out to be correct, however, you might
retort that actually those who dismiss them outright are not being rational enough. The same might
be said of someone whose idea of comforting a grieving friend is confined to urging a “stiff upper
lip”, or pointing out that “life must go on”.
In a more theoretical vein, some philosophers have attacked the very idea that one can
reason one’s way to solutions for life’s deepest problems. Telling examples of this last attitude are
to be found in the attacks sustained by Richard Dawkins from thinkers who, while themselves
acknowledged atheists, charge Dawkins’s dismissal of all religious faith with being simplistic.
(Terry Eagleton’s reviews of Dawkins’s God Delusion, in which he describes him as a
“card-carrying rationalist”, took this line, opining that “even Richard Dawkins lives more by faith
than by reason” (Eagleton 2006).1)
It might also be suggested that the very idea of positing an all-encompassing life goal such
as happiness or the good life, which one then undertakes to pursue, is misguided hyper-rationality.
Chance plays an ineliminable role in our lives, and any life plan drawn up so carefully as to leave
nothing to chance is bound to fail. Yet in the light of my definition such a plan fails because it is
delusive: it simply is not the most efficient way to secure happiness. Trusting oneself to respond
————————————
1. I know of no better counter to Eagleton's trite accusation than Tim Minchin's "Storm", which can be
viewed at http://www.youtube.com/watch?v=KtYkyB35zkk&feature=related: "Science adjusts its views
/ Based on what's observed / Faith is the denial of observation / So that belief can be preserved."
spontaneously to serendipity just might be the better way. Again, then, the over-anxious planner is
not being excessively rational, but not rational enough.
And then there are some wilder and grander rejections of rationality, as in Nietzsche’s
“Why truth? Why not untruth?”, or the contention attributed to Sartre that even the acceptance of
the law of non-contradiction is an arbitrary subjective free choice.
§ 2. Emotions and Reason
Emotions, it is widely believed, are the enemies of reason. While many of our emotions are
swiftly triggered by and incorporate beliefs, they are not as easily extinguished if evidence is
produced against the belief in which they are grounded. Emotions typically control the salience of
perceptible or conceivable features of our environment. When in the grip of fear, or hope, or
jealousy, or anger, we apprehend the world in very different ways; emotion highlights certain
features and blocks out others. “Emotion skews the epistemic landscape” (Goldie 2008, 159):
when we are in the grip of passion, we are blinkered.
Blinkers have obvious disadvantages, but they also have their use. From the point of view
of adaptive evolution, it is easy to see that the drawbacks of emotions are simply the reverse of
their crucial function. By narrowing the range of facts to those most relevant to an urgent response,
they spare us the need to deal with the “Frame problem” (Pylyshyn 1987). The Frame Problem
consists in the fact that we cannot consider an indefinitely large amount of information concerning
the potential consequences of any action we might take. Even if we could list all potentially
relevant consequences of an action, the moment would have passed by the time we had done so.
We must therefore, without reflection, choose what to ignore as currently irrelevant – but we must
do so without deliberating about what to consider and what to ignore, lest we get trapped in an
endless regress. So it seems we must either act rashly or get lost in precautionary reflections. The
blinkers imposed by emotion control salience for us, and spare us that dilemma. In this way, they
help to prepare the body for a quick response to a large range of situations likely to present
themselves in most of our lives.2
Emotions are Janus-faced in yet another sense. Because they can be triggered by
cognitions, and specifically by beliefs conveyed in explicit language, and because they involve
feelings, emotions seem to form an important part of our conscious life. At the same time,
however, they escape conscious control: “experience shows that those who are most agitated by
their passions are not those who know them best" (Descartes 1984, §28).
In what follows, I propose to trace some of the ambiguity in the role that emotions play in
our capacity to reason to the way in which they bridge what Daniel Kahneman (2011) has recently
called “Thinking fast and slow”, or what is more commonly referred to as the hypothesis of “dual
processing”. In the next section, I sketch the importance of that hypothesis.
§ 3. The Two-Track Mind
A moment's reflection attests that we know many things without knowing how we know them.
Retrieving trivial pieces of knowledge, such as your mother's maiden name, is an obvious example.
More interesting is the disconcerting evidence suggesting that when faced with a moderately
complex problem, we are sometimes better off not thinking about it explicitly, and allowing our
unconscious thinking to decide the issue without the help of explicit calculations and reasoning.
This has led some researchers, notably Ap Dijksterhuis, to suggest that there are two kinds of
thinking, one of which is conscious and the other unconscious (Dijksterhuis and Nordgren 2006).
This is one of many forms taken by the hypothesis of the Two Track Mind, or Dual Processing.
————————————
2. This Janus-like character of emotions was noted by Descartes: "The utility of the passions consists alone in
their fortifying and perpetuating in the soul thoughts that it is good that it should preserve, and which
without that might easily be effaced from it. Similarly, the harm they do is that they fortify these thoughts
more than necessary, or that they can serve others on which it is not good to dwell." (Descartes 1984, §74).
Keith Stanovich listed over 20 variants of this view in (Stanovich 2004, 30), and I shall follow him
in referring to the two tracks as the Intuitive and the Analytic, or simply S1 and S2.
The most important supporting observations for the Two Track Mind derive from the
discovery of systematic modes of irrationality in reasoning: I shall turn to these in §4. But first, I
note that the apparent power of unconscious thought, just noted, in itself constitutes a compelling
motivating observation, particularly when joined to the realization of the surprising limitations of
conscious awareness. In a classic paper, George Miller (1956) pointed out that our capacity for
simultaneous attention to distinct items of thought or perception is extremely limited:
Concentrating on seven unrelated items at once stretches most people’s powers. We deal with this
limitation by “chunking”, which re-encodes complexes of related information into a single item – a
regional code for a phone number, for example, may consist of 3 digits but is encoded as one
“chunk”. While the “magical” character of the number seven (presumably offered tongue-in-cheek
in the first place) has not proved robust (Baddeley 1994), the limitations of conscious memory and
attention have been well confirmed. It has therefore become apparent that any reasoning about
even moderately complex matters inevitably relies on much processing that is inaccessible to
consciousness.3
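To make the gain from chunking concrete, here is a trivial sketch of my own (the phone number is an arbitrary example, not drawn from Miller's paper):

```python
# Toy illustration of chunking: ten separate digits strain the span of immediate
# attention, but re-encoded as three familiar groups they count as only three items.
digits = "4165551234"
chunks = [digits[:3], digits[3:6], digits[6:]]   # area code, exchange, line number
print(len(digits), "items ->", len(chunks), "chunks:", "-".join(chunks))
# prints: 10 items -> 3 chunks: 416-555-1234
```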
The narrowness of the immediate memory "channel" in "Global Workspace theory"
(Baars 1997; 2002) confirms this first characteristic of one processing system and the requirement
that something else be able to process information without being subject to similar limitations.
Other contrasting characteristics of the two systems are handily summarised by Jonathan Evans, in
an article that presents an authoritative summary of the state of the art on dual processes. (Evans
notes, however, that “the attributes listed in Table 2 do not include emotion, the discussion of
which is generally beyond the scope of this review.” (Evans 2008, 257).)
————————————
3. This is not the "dynamic", "repressed" Unconscious of Freud; but it shares with it a good claim to being a
mental, as opposed to simply a neural or physiological state, thus dissociating the issue of mentality from
that of consciousness.
[TABLE 2 ABOUT HERE]
The contrasts listed in this table form a dauntingly large set, and it is far from obvious that
the two columns really represent two unified systems, each containing all the features listed.
Further, a single one of the contrasting pairs on the list may conceal a number of different
distinctions. To take but the case of automaticity: Agnes Moors and Jan De Houwer have argued
that this term and its contrast collect a number of relatively independent traits that don’t
necessarily belong to a single “system” either in function or brain circuitry: “unintentional,
uncontrolled/uncontrollable, goal independent, autonomous, purely stimulus driven, unconscious,
efficient, and fast” (Moors and De Houwer 2006). Some of these items already appear separately
on Evans’s table. Moors rightly worries about the tangle of conceptual and empirical assumptions
that underlie the assimilation of so many contrasts to two “systems”.
Nevertheless, when we consider all these contrasts in the light of evolution, there is
considerable heuristic value in the idea of a Two Track Mind. The contrasts in Evans’s list reflect
three very general facts about human beings: First, we are mammals, and share with other
mammals adaptations that have established themselves over hundreds of millions of years.
Mammals, including humans, are superb at multi-tasking and solving everyday problems of living
without explicit or deliberate thought, and many of these capacities come about in the course of maturation.
Second, among those mammalian adaptations is the capacity to learn complex routines that begin
with effortful conscious practice, become “overlearned”, and come to look and feel like reflexive
routines.4 Third, we have something that other mammals do not have, namely language. This
involves both maturation and learning. Among other unique features, language enables us to make
————————————
4. Instinctual and learned intuitions or responses can look very similar, but are distinguished by their origins.
Our fear of spiders belongs in the first category; our fear of guns, in the second.
goals, beliefs and desires explicit, and to argue about them in such a way as to generate entirely new
goals, beliefs and desires which could not have existed without the intervention of linguistic
processing. The new potentialities that arise from our use of language interact with, and give rise
to, a vast new repertoire of overlearned skills, beliefs, and values (de Sousa 2007).
Given these three characteristics, and given the importance of emotions in all aspects of our
lives, it is worth asking what specific role emotions might play in our pursuit of epistemic goals.
§ 4. Emotions and Reasoning
What is an emotion? Most philosophers and psychologists would endorse something like
the following definition of central cases of emotion, due to a leading psychologist: “an episode of
interrelated, synchronised changes in the states of all or most of the five organismic subsystems in
response to the evaluation of an external or internal stimulus event as relevant to major concerns of
the organism.” (Scherer 2005, 695). The five subsystems are 1) a cognitive component; 2) a
motivational component; 3) a subjective phenomenological component, or feeling; 4) a
physiological component, which works to prepare the body for a response in accordance with (2); and
5) an expressive component.
Note that the first three components are generally available to consciousness. The fourth
belongs rather to the sub-personal level of organization. As for the last, while generally available
to consciousness, it functions to communicate, in a way that is only partly under the subject’s
control. We might look at emotional expression as setting up a sort of arms race between the
sender’s ability to control what is communicated and the receiver’s ability to detect states that the
sender would prefer to conceal. Non-human mimicry in nature contains many examples of
deceptive messages; and it has been argued that human intelligence, with the large brain that
supports it, evolved as an essentially Machiavellian tool destined to facilitate the manipulation of
others’ responses (Dunbar 2003). Since it is a familiar cliché that the emotions play a dominant
role in the sort of rhetorical art that aspires to carry conviction without regard to truth, the point
might be sharpened. That seems to be the burden of the contention in (Mercier and Sperber 2011)
that the real evolutionary drive behind the honing of our capacity for argument lies in the need to
persuade others (and perhaps oneself) of one’s existing convictions, rather than in the
establishment of truth. But that is not a line of argument I shall have time to pursue.
I turn now to some of the classic forms of systematic irrationality that seem to call for the
Dual Process Hypothesis. Some of these, but not all, seem to implicate emotional causes. What
follows is a small sampling of cases, many of which are expounded in (Kahneman 2011).5
My first example might be thought to follow almost logically from the characterisation of
S1 as comparatively automatic, fast and effortless: if an S1 solution is available, we can expect it to
be preferred to any S2 solution in view of a sort of principle of least action. Kahneman sums up a
number of these effects in a single diagram:
FIGURE 1: [COGNITIVE EASE] ABOUT HERE
Kahneman comments:
The various causes of ease or strain have interchangeable effects. When you are in a
state of cognitive ease, you are probably in a good mood, like what you see, believe
what you hear, trust your intuitions, and feel that the current situation is
comfortably familiar. You are also likely to be relatively casual and superficial in
your thinking. When you feel strained, you are more likely to be vigilant and
suspicious, invest more effort in what you are doing, feel less comfortable, and
make fewer errors, but you also are less intuitive and less creative than usual (ibid.).
————————————
5. Kahneman's book collects a lifetime of research into a rich intellectual autobiography. Most of this
research was not originally reported within the framework of Dual Processing. In the book, however, it has
been recast in those terms; that is my first reason for referring to Thinking, Fast and Slow for several
examples rather than to the original papers. The other is that it is a fascinating read.
While taking the easy way out might be motivated by mood or emotion, some types of
systematic mistake don’t involve any other specific emotion. Such types often come from our
inability to reason statistically. We are good at apprehending that something is quite frequent, or
more frequent than something else (as attested by a gambling friend of Blaise Pascal, who had
detected, but couldn't explain, that he lost more often when betting on 10 than on 9 when casting two
dice).6 Noticing differences in frequencies seems to be an S1 process. But understanding why
requires explicit mathematical calculation, a typical S2 process.
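The calculation that eluded Pascal's friend is short enough to spell out. The sketch below is my own illustration (assuming fair dice, as in the story): it enumerates the thirty-six ordered outcomes and shows that 9 and 10 each admit two combinations but not the same number of permutations, so the two bets are not equally good.

```python
from itertools import product
from fractions import Fraction

# Enumerate all 36 ordered outcomes (permutations) of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

for target in (9, 10):
    ways = [o for o in outcomes if sum(o) == target]     # ordered ways (permutations)
    combos = {tuple(sorted(o)) for o in ways}            # unordered pairs (combinations)
    print(target, len(combos), "combinations,", len(ways), "permutations,",
          "P =", Fraction(len(ways), len(outcomes)))

# 9 : 2 combinations {3,6},{4,5} but 4 permutations, P = 1/9
# 10: 2 combinations {4,6},{5,5} but 3 permutations, P = 1/12
```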
Here is another striking example, again from Kahneman, of a tricky fallacy concerning
probability:
A study of the incidence of kidney cancer in the 3,141 counties of the United States
reveals a remarkable pattern. The counties in which the incidence of kidney cancer
is lowest are mostly rural, sparsely populated, and located in traditionally
Republican states in the Midwest, the South, and the West. What do you make of
this?.... It is both easy and tempting to infer that their low cancer rates are directly
due to the clean living of the rural lifestyle—no air pollution, no water pollution,
access to fresh food without additives….
Now consider the counties in which the
incidence of kidney cancer is highest. These ailing counties tend to be mostly rural,
sparsely populated, and located in traditionally Republican states in the Midwest,
the South, and the West.... It is easy to infer that their high cancer rates might be
directly due to the poverty of the rural lifestyle—no access to good medical care, a
————————————
6. In the (possibly apocryphal: I can recover no source for it) story I am relying on, the Chevalier de Méré
was puzzled by his observation of the difference in frequency, because he reasoned that since there are two
ways of getting 9 and two ways of getting 10 with two dice, the probabilities should be the same. Pascal
diagnosed his friend's mistake, which lay in the confusion between combinations and permutations.
high-fat diet, and too much alcohol, too much tobacco.... Something is wrong, of
course. The rural lifestyle cannot explain both very high and very low incidence of
kidney cancer. (Kahneman 2011, 110).
The answer is that variance is inversely related to sample size. So if kidney cancer
is distributed in a strictly random way, you should expect exactly this result: “sparsely populated”
counties – small samples – will exemplify the most extreme deviations from the average for
strictly mathematical reasons. To become aware of this, we need S2.
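A small simulation brings the point out. In the sketch below (my own construction; the base rate and county sizes are arbitrary stand-ins, not Kahneman's data), every county has exactly the same true incidence and only the sample size differs; the most extreme observed rates, high and low alike, come from the small counties.

```python
import numpy as np

rng = np.random.default_rng(0)
BASE_RATE = 1e-4                     # hypothetical true incidence, identical everywhere

def observed_rates(population, n_counties=500):
    """Observed incidence per county under a purely random (binomial) model."""
    cases = rng.binomial(population, BASE_RATE, size=n_counties)
    return cases / population

small = observed_rates(2_000)        # sparsely populated counties
large = observed_rates(500_000)      # populous counties

print("small counties: min %.5f  max %.5f" % (small.min(), small.max()))
print("large counties: min %.5f  max %.5f" % (large.min(), large.max()))
# The small-county rates scatter far more widely around 0.0001: both the highest
# and the lowest observed rates come from the small samples.
```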
In that example, emotion is involved only by dint of our preference for intellectual laziness.
The same is true of some other classic cases involving probability, such as the base rate fallacy, or
the conjunction fallacy (Kahneman 2011, chapters 15 and 16). But here is an example that arguably involves
more than merely intellectual laziness, concerning the effect of familiarity. The word ‘familiarity’
here doesn’t actually need to refer to any awareness of prior acquaintance. Mere “priming”, an
important effect the workings of which have been demonstrated in countless experimental
situations, acts in a way that requires no consciousness of the priming stimulus. Exposure to some
visual or verbal stimulus for a time so short as to produce no conscious awareness whatever can
have significant effects on subsequent interpretations of a situation or a sentence. Mere familiarity,
in that weak sense, induces both the boosting of preference and an illusion of truth. What is not
clear is whether it does the latter because it has done the former, or whether the two effects are both
produced in parallel by the same cause.
A classic experiment illustrates the boosting of preference on the basis of unconscious
familiarity. Robert Zajonc flashed on a screen a number of patterns (e.g. Chinese characters) to
subjects who were not readers of Chinese, for a time too short to register in any subject's awareness
or memory. He then presented the subjects with a number of patterns, and asked them to rate them
for attractiveness. What he found was that those patterns that the subjects had been exposed to (but
had not “seen”) were regarded as more aesthetically pleasing than comparable patterns to which
they had not been exposed. Zajonc concluded, in the words of his famous title, that “preferences
need no inferences” (Zajonc 1980).
A second experiment is more disturbing, and more relevant to reasoning. It shows that to be
familiar with a random phrase is enough to make you inclined to believe any statement that
contains that phrase. “People who were repeatedly exposed to the phrase “the body temperature of
a chicken” were more likely to accept as true the statement that “the body temperature of a chicken
is 144°” (or any other arbitrary number). The familiarity of one phrase in the statement sufficed to
make the whole statement feel familiar, and therefore true.” (Kahneman 2011, 62).
Mere perceptual salience, like familiarity, will also promote both credibility and
attractiveness. Kahneman cites another experiment in which proverbs couched in a catchy
rhyming formula were thought more insightful than formulations identical in import but lacking
the rhyme. Thus, "woes unite foes" was judged "more insightful" than "woes unite enemies", and
"a fault confessed is half redressed" more insightful than "a fault admitted is half redressed"
(Kahneman 2011, 63). Kahneman is citing McGlone and Tofighbakhsh (2000), who conclude that "rhyme, like
repetition, affords statements an enhancement in processing fluency that can be misattributed to
heightened conviction" (424).7
One more effect of effort is worth mentioning. Subjects were asked to solve a simple
problem: In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48
days for the patch to cover the entire lake, how long would it take for the patch to cover half of the
lake? Note that this problem is extremely easy, and that the subjects were all Princeton
undergraduates whose IQ can therefore be assumed to have been unusually high. Here is what was
surprising: to half of the subjects, the puzzle was presented “in a small font in washed-out gray
print. The puzzles were legible, but the font induced cognitive strain. The results tell a clear story:
————————————
7. The authors don't discuss an alternative to the hypothesis that the effect on belief is mediated by the effect
on fluency. Familiarity may have caused both fluency and conviction independently.
90% of the students who saw the CRT in normal font made at least one mistake in the test, but the
proportion dropped to 35% when the font was barely legible." (Kahneman 2011, 66).
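For completeness: the lily-pad puzzle is one of the items of the Cognitive Reflection Test (the "CRT" of the quotation). The intuitive S1 answer is 24 days; the correct S2 answer is 47, since the patch must stand at half coverage exactly one doubling before day 48. A toy check of my own, using the puzzle's stated parameters:

```python
# Step back from full coverage on day 48, one doubling at a time,
# until the patch is at half the lake.
coverage, day = 1.0, 48
while coverage > 0.5:
    coverage /= 2
    day -= 1
print(day)      # 47 -- not the tempting "half the time" answer of 24
```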
From these various experiments Kahneman concludes: “These findings add to the growing
evidence that good mood, intuition, creativity, gullibility, and increased reliance on System 1 form
a cluster. At the other pole, sadness, vigilance, suspicion, an analytic approach, and increased
effort also go together.” (Kahneman 2011, 69).
§ 5. Evolution and Learning
One item on Evans’s table that is puzzling is the contrast between “Evolutionarily Old” and
“Evolutionarily Recent”. Although the capacity for learning is one we share with other mammals,
as I noted above, the outcome of a process of overlearning can seem just as automatic, fast and
effortless — in short, typically S1 — as any reflexive behaviour. And that should remind us that in
the normal process of development we learn many things that are difficult at the beginning and
become easy with practice. Operant conditioning is the best known mechanism. It works according
to a logic that has essentially the same structure as natural selection: random operant behaviour
provides the original variety corresponding to random mutation, memory plays the role of
inheritance, and reinforcement, by increasing the probability of recurrence, takes on the role of
selection. In the process of learning, a number of other emotions may intervene: anxiety,
emulation, frustration, and so forth. Phoebe Ellsworth describes the change in emotional response
that we can expect when a situation is encountered over and over:
Appraisals of a truly novel situation, except for the few biologically built-in
stimuli, are slower, less certain, and more conscious than they will be the 30th time
the situation is encountered, and the emotion less well-defined. Babies, who
encounter novel situations every day, look to their parents for information about
what to feel... By the time a person has experienced a situation several times, and it
is more familiar, the emotional response is more automatic, and the person will
immediately experience the full-blown emotion — anger, for example, at the
person who has cut her off and taken her parking place – with little or no awareness
of the component appraisals (Frijda, 1986; Ellsworth & Scherer, 2003) (Ellsworth 2013).
In short, we might say, normal human development consists in making the transition from
S2 to S1 performances.
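The selection-like logic behind that transition can be put in a toy model (entirely my own construction, with arbitrary parameters): three candidate responses start off equally probable; only one is reinforced, and its probability of recurrence, the analogue of selective advantage, grows until producing it is effectively automatic.

```python
import random

random.seed(0)

# Three candidate responses with equal initial "weights" (propensities to recur).
weights = {"a": 1.0, "b": 1.0, "c": 1.0}
REWARDED = "b"                        # only this response is reinforced

for trial in range(200):
    total = sum(weights.values())
    r = random.uniform(0, total)
    for response, w in weights.items():       # sample a response in proportion to its weight
        r -= w
        if r <= 0:
            break
    if response == REWARDED:
        weights[response] *= 1.05              # reinforcement raises the probability of recurrence

probs = {k: round(w / sum(weights.values()), 3) for k, w in weights.items()}
print(probs)   # "b" now dominates: the initially arbitrary choice has become near-automatic
```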
§ 6. Epistemic Feelings
I turn now to consider emotions that are specifically involved in the process of reasoning.
Such emotions have come under scrutiny quite recently, in particular with the work of Christopher
Hookway, Catherine Elgin, and Paul Thagard (Hookway 2003, 2008; Elgin 2008; Thagard 2006,
2008), and other philosophers whose essays are collected in a volume devoted to Epistemology and
Emotions (Brun, Doğuoğlu, and Kuenzle 2008). But contemporary philosophers were not the
first to notice them. They have important antecedents, including Plato, Descartes, Hume and
Nietzsche. To recall just the first two: in the Meno, Socrates celebrates the despair of which Meno
complains, by pointing out that it is, together with doubt, an essential stage in the rejection of
falsehood and the acquisition of knowledge. Later in the same dialogue, the slave-boy, when
confronted with a number of erroneous answers to the geometrical problem set by Socrates, finally
experiences the feeling of rightness that goes with recognition of the correct answer. As for
Descartes, he declares that wonder is the “first of the passions”; doubt is what drives his
investigation, and the feeling of certainty, which he labels "clarity and distinctness", becomes the
very criterion of truth. It doesn’t work out, alas; and others, including Hume and Nietzsche, are
less sanguine about the connection of feelings to truth, but emphatic about the primacy of feeling
over reason.
A wide range of feelings may be triggered in the pursuit of knowledge. They include hope,
frustration, anger, envy, fear, greed, and many others. But for the most part these are not triggered
by anything specific to reasoning or inquiry as such: they are associated with the perils involved in
any sort of undertaking. In the rest of this essay, I shall be concentrating on a range of states I shall
refer to as “epistemic feelings”. I choose that term in preference to two others that might stake a
claim: ‘intuitions’, and ‘emotions’. I avoid the latter, because to speak of emotions may not sound
altogether natural in the light of common usage, simply because some might contend (though they
would not be obviously right) that our epistemic feelings never involve the sort of physiological
upheavals typical of genuine emotions. And yet simply to speak of ‘intuitions’ would seem too
weak. The terminology, however, is of minor importance: it remains that these are indeed affective
phenomena essentially involved in the pursuit of epistemic aims.
Feelings specifically involved in reasoning can be divided into four types, classified in
terms of their object, and of the epistemic operation to which they contribute.
(1) Wonder motivates inquiry, but presupposes no specific prior belief, and need not target
any existing supposition. While it may be evoked by the contemplation of a particular statement or
state of affairs, it can function as a completely general spur to seeking knowledge.
(2) Doubt also motivates inquiry but bears on hypotheses already entertained.
(3) Certainty bears on specific beliefs; it is, in a sense, antithetical to inquiry, in that it
freezes any further quest for evidence or argument. On the other hand, it frees us for action by
stamping certain facts or values as appropriate ones to be acting upon. The feeling of rightness
seems to belong in the same general category.
(4) The Feeling of Knowing bears on specific propositions, but is unable to specify them: it
is a kind of indication that it is worth the time and effort to keep trying to recall something that is in
fact “somewhere in my head”. There is evidence that the Feeling of Knowing is a fairly reliable
indicator of something’s being available to memory even if it is not currently retrievable; but like
the other three, it can be manipulated to fall into error (Koriat 2000).
I have discussed these at greater length in (de Sousa 2008); their role is fairly obvious, and
I will say little more. Instead, I will end by noting some more unusual connections between
emotions and reasoning. These concern the sort of affective states that we might be tempted to call moods
rather than emotions.
One such link is suggested by experiments that provide evidence that perceptual estimates,
e.g., of size, are made more accurately by people who are depressed than by those who are not
(Alloy and Abramson 1979; Carson 2001). Although this affects very concrete judgments of very
specific quantities, this seems to lend some support to the intuition, reported by many depressives,
that their condition affords a deeper insight into the nature of life (Dollimore 2001). On the other
hand, as Keith Oatley notes, citing Alice Isen (1990), when they are happy “people are more
generous to others, they make more creative word associations, they more easily solve certain
kinds of problems, etc. When they are sad or depressed people tend to have previous sad episodes
from their lives coming to mind” (Oatley 1996). Such effects of emotional states are manifestly not
directly influencing conclusions arrived at from given premises; they appear, we might say, rather
to grease the wheels of inference.
Something similar, but more difficult to interpret and rather more arcane, was also brought
out in research conducted by Keith Oatley and his students. Here mood appears to affect neither
the disposition to accept a conclusion, nor the choice of premises, nor even the feeling of rightness
in the process of inference. Rather it seems to influence the order in which a piece of reasoning is
presented. It seems that emotions influence our style rather than the specifics of our reasoning.
After giving subjects a story to read, describing a caddish man behaving badly, the
experimenters asked the subjects a number of interpretative questions. They also assessed the
readers’ emotional state. In the following quotation, “forward chaining” describes reasoning
which is close to the standard presentation, in which reasons are placed before the conclusion; "backward
chaining" refers to statements in which the conclusion is laid out first and then followed by the
reasons for it (the AI usage from which these terms are borrowed is sketched at the end of this section).
Among our participants we found anger, sadness and, less frequently, disgust.
Participants who became sad engaged predominantly in backward chaining.
Participants who became angry engaged predominantly in forward chaining. This
difference was significant at the p < 0.02 level for each of the three interpretive
questions. In other words, each reader was emotionally affected by the story, and
his or her mode of thinking became different depending on what emotion was
experienced. Sadness is a mode in which one starts from the current state and
reasons backwards to try and understand its causes. Anger is a state in which one
reasons forwards from the current state about what to do next. Our result is from a
single study. We nevertheless think it suggestive that distinctive modes of
reasoning were associated with specific emotions (Oatley 2002, 53–4).
I find this particularly intriguing, because the way that the emotion is affecting the
reasoning in this case is quite different from what one has been led to expect. It affects a
style of presentation, rather than what is accepted or what is inferred. And precisely
because it seems to affect a level in the process of argument that has little or no epistemic
importance, it provides one more indication of the depth of the involvement of emotion in
reason.
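The labels are borrowed from rule-based reasoning in AI, where the same directional contrast appears: forward chaining runs from what is known towards whatever follows; backward chaining starts from an outcome and works back to what would account for it. The sketch below (my own, with made-up rules) illustrates only that borrowed sense, not Oatley's materials or method.

```python
# Rules are (premises, conclusion) pairs over simple string "facts" (made-up example).
RULES = [
    ({"he was insulted"}, "he is angry"),
    ({"he is angry"}, "he confronts the offender"),
]

def forward_chain(facts):
    """From current facts, keep adding whatever follows (the 'what next?' direction)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def backward_chain(goal, facts):
    """From an outcome, work back to facts that would account for it (the 'why?' direction)."""
    if goal in facts:
        return True
    return any(conclusion == goal and all(backward_chain(p, facts) for p in premises)
               for premises, conclusion in RULES)

print(forward_chain({"he was insulted"}))                                # adds anger, then confrontation
print(backward_chain("he confronts the offender", {"he was insulted"}))  # True: traced back to the insult
```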
§ 7. Summary and Conclusion
It is generally accepted that both the virtues and the drawbacks of the emotions stem from
their basic function: to facilitate the body’s preparation when it needs to respond efficiently to
some relatively common life situation, affecting core concerns, in which the combination of
evolutionary adaptation and learning has found it worthwhile to set up a relatively stereotyped
strategy. That basic function requires them to straddle whatever wobbly line divides S1 from S2
processes. But it is not easy to say precisely how this general principle applies to the different ways
in which emotions are involved in reasoning. Both moods and emotions seem to be causally
influential in generating some of the systematic mistakes that have been uncovered by the work of
Tversky, Kahneman, and their associates: sometimes, the only emotion that appears to be involved
– if indeed it can be called an emotion at all – is a preference for the lazy option. But some more
specific influences, such as the curious power of familiarity to breed liking and conviction, do
seem to fall in with the general principle that emotions dispose us to respond in the most efficient
way, when a quick and dirty (or "fast and frugal")8 response is required. I argued that specific
epistemic feelings, including wonder, doubt, certainty, the feeling of rightness and the feeling of
knowing, have very specific roles, either in stamping a kind of seal of approval on the steps of an
argument or the conclusion of an inference, or, on the contrary, in spurring further enquiry. In some
cases, such as the different styles of presentations of arguments favoured by sadness and anger, the
role of emotions is both more subtle and more difficult to fathom.
————————————
8. The former expression is preferred by those who, like Kahneman and Tversky, stress our proclivity to systematic
error; the latter is used by those, notably Gigerenzer et al. (1999) and Dijksterhuis and Nordgren (2006), who insist
that intuitive thinking is generally adaptive, and almost always more efficient in attaining correct decisions than S2
processes of explicit deliberation.
References
Alloy, Lauren B., and Lyn Y. Abramson. 1979. “Judgment of Contingency in Depressed
and Nondepressed Students: Sadder but Wiser?” Journal of Experimental
Psychology: General 108(4): 441–85.
Baars, Bernard J. 1997. “In the Theatre of Consciousness: Global Workspace Theory, a
Rigorous Scientific Theory of Consciousness.” Journal of Consciousness
Studies 4(4): 292–309.
Baars, Bernard J. 2002. “The Conscious Access Hypothesis: Origins and Recent
Evidence.” Trends in Cognitive Sciences 6(1): 47–52.
Baddeley, Alan. 1994. "The Magical Number Seven: Still Magic After All These Years?"
Psychological Review 101 (April): 353–56.
Brun, Georg, Ulvi Doğuoğlu, and Dominique Kuenzle, eds. 2008. Epistemology and
Emotions. Aldershot: Ashgate.
Carson, Richard Courtney. 2001. “Depressive Realism: Continuous Monitoring of
Contingency Judgments Among Depressed Outpatients and Non-Depressed
Controls." Vanderbilt University.
Clifford, William K. 1886. “The Ethics of Belief.” In Lectures and Essays (2nd. Ed), ed.
Leslie Stephen and Frederick Pollock. London: Macmillan.
de Sousa, Ronald. 2007. Why Think? Evolution and the Rational Mind. New York: Oxford
University Press.
de Sousa, Ronald. 2008. "Epistemic Feelings." In Epistemology and Emotions, eds Georg Brun,
Ulvi Doğuoğlu, and Dominique Kuenzle, 185–204. Aldershot: Ashgate.
Descartes, René. [1649] 1984. The Passions of the Soul. Vol. 1 of The Philosophical
Writings of Descartes. Trans. John Cottingham, Robert Stoothoff, and Dugald
Murdoch. Cambridge: Cambridge University Press.
Dijksterhuis, Ap, and Loran F. Nordgren. 2006. “A Theory of Unconscious Thought.”
Perspectives on Psychological Science 1:95–109.
Dollimore, Jonathan. 2001. “Diary.” London Review of Books 23(16): 32–33.
Dunbar, Robin I. M. 2003. “The Social Brain: Mind, Language, and Society in
Evolutionary Perspective.” Annual Review of Anthropology 32:163–81.
Elgin, Catherine Z. 2008. “Emotion and Understanding.” In Epistemology and the
Emotions, eds Georg Brun, Ulvi Doguoglu, and Dominique Kuenzle. Aldershot:
Ashgate.
Ellsworth, Phoebe. 2013. “Appraisal Theory: Old and New Questions.” Emotion
Review 5:forthcoming.
Evans, Jonathan St.B.T. 2008. "Dual-Processing Accounts of Reasoning, Judgment and
Social Cognition." Annual Review of Psychology 59: 255–78.
Gigerenzer, Gerd, Peter Todd, and ABC Research Group. 1999. Simple Heuristics That
Make Us Smart. New York: Oxford University Press.
Goldie, Peter. 2008. “Misleading Emotions.” In Epistemology and the Emotions, eds
Georg Brun, Ulvi Doguoglu, and Dominique Kuenzle. Aldershot: Ashgate.
Hookway, Christopher. 2003. “Affective States and Epistemic Immediacy.”
Metaphilosophy 34:78–96.
Hookway, Christopher. 2008. “Epistemic Immediacy, Doubt and Anxiety: On a Role for
Affective States in Epistemic Evaluation.” In Epistemology and Emotions, eds
Georg Brun, Ulvi Doğuoğlu, and Dominique Kuenzle. Aldershot: Ashgate.
Isen, A. 1990. “The Influence of Positive and Negative Affect on Cognitive Organization.”
In Psychological and Biological Processes in the Development of Emotion, eds B.
Leventhal, T. Trabasso, and N. Stein. Hillsdale: Erlbaum.
James, William. [1896] 1979. “The Will to Believe,” ed. Frederick H. Burkhardt. In The
Will to Believe: And Other Essays in Popular Philosophy. Cambridge, MA:
Harvard University Press.
Kahneman, Daniel. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux;
Toronto: Doubleday Canada.
Koriat, Asher. 2000. “The Feeling of Knowing: Some Metatheoretical Implications for
Consciousness and Control.” Consciousness and Cognition 9:149–71.
McGlone, Matthew S., and Jessica Tofighbakhsh. 2000. “Birds of a Feather Flock
Conjointly (?): Rhyme as Reason in Aphorisms.” Psychological
Science 11(3): 424–28.
Mercier, Hugo, and Daniel Sperber. 2011. “Why Do Humans Reason? Arguments for an
Argumentative Theory.” Behavioral and Brain Sciences 34(2): 57–74.
Miller, George A. 1956. “The Magical Number Seven, Plus or Minus Two: Some Limits
on Our Capacity for Processing Information.” Psychological Review 63:81–97.
Moors, A., and J. De Houwer. 2006. “Automaticity: A Theoretical and Conceptual
Analysis." Psychological Bulletin 132: 297–326.
Oatley, Keith. 1996. “Inference in Narrative and Science.” In Modes of Thought, eds D.R.
Olson and N. Torrance. New York: Cambridge University Press.
Oatley, Keith. 2002. “Emotions and the Story Worlds of Fiction.” In Narrative Impact:
Social and Cognitive Foundations, eds. M.C. Green, J.J. Strange, and T.C. Brock.
Mahwah: Erlbaum.
Plato. 1997. Complete Works. Ed. John M. Cooper. Indianapolis: Hackett.
Pylyshyn, Zenon, ed. 1987. The Robot’s Dilemma: The Frame Problem and Other
Problems of Holism in Artificial Intelligence. Norwood, NJ: Ablex.
Rorty, Richard. 1979. Philosophy and the Mirror of Nature. Princeton, New Jersey:
Princeton University Press.
Scherer, Klaus R. 2005. “What Are Emotions? And How Can They Be Measured?” Social
Science Information 44(4): 695–729.
Stanovich, Keith E. 2004. The Robot’s Rebellion: Finding Meaning in the Age of Darwin.
Chicago: University of Chicago Press.
Thagard, Paul. 2006. Hot Thought: Mechanisms and Applications of Emotional Cognition.
Cambridge, Mass.: MIT Press.
Thagard, Paul. 2008. "How Cognition Meets Emotion: Beliefs, Desires and Feelings as
Neural Activity.” In Epistemology and Emotions, eds Georg Brun, Ulvi Doğuoğlu,
and Dominique Kuenzle. Aldershot: Ashgate.
Zajonc, Robert B. 1980. “Feeling and Thinking: Preferences Need No Inferences.”
American Psychologist 35:151–75.
Type          Goal               DoF    Intentional state   Processes inducing change
Epistemic     Truth              M→W    Belief              Intuition, perception, reasoning
Practical     State of affairs   M←W    Action              Deliberation
Evaluative    Good               M←W    Desire              Intuition, practical reason
Emotional     Appropriate        M→W    Emotion             Imagination, perception, reasoning

TABLE 1: Domains of Rationality
TABLE 2 (from Evans 2008, p. 257)
FIGURE 1: Kahneman's picture of the causes and consequences of Cognitive Ease