KNOWLEDGE AS SUFFICIENT INFORMATION

(DRAFT --- NOT FOR QUOTATION)
Richard Foley
New York University
PART ONE: THE THEORY
1. The basic idea
Someone glances at a clock that, unbeknownst to her, is not working and as a result
comes to believe it is quarter past seven. In fact, it is quarter past seven. Her belief is
true, but it is not knowledge. Out of this classic example comes a classic philosophical
problem, that of determining what must be added to a true belief in order to make it into a
plausible candidate for knowledge.
The solution to this problem, I will be arguing, is to be found in the observation that
whenever an individual S has a true belief P but does not know P, there is significant
information about P that S lacks. This is a seemingly modest point, but it has the
capacity to reorient the theory of knowledge.
Developing this observation into a theory requires that the reference to information be
unpacked without making use of the concept of knowledge. I will do so with the notion
of true belief. Every philosophical theory must start somewhere, helping itself to some
starting assumptions that can be revisited if they lead to difficulties. For purposes here, I
am taking for granted that belief is a psychological state and is a kind of state that can be
true or false. I am thus setting aside any suggestions to the effect that the concepts of
belief or truth be abandoned.1
Making use of the notion of true belief, the core observation becomes that whenever S
has a true belief P but does not know P, there is some significant aspect of the situation in
which P is true about which S lacks true beliefs. Knowledge is thus a matter of having
sufficient information, where the test of what is sufficient is negative. One must not lack
important true beliefs in the neighborhood of P. S knows that a red ball is on the mat in
the hallway if she believes that a red ball is on the mat in the hallway, her belief is true, and
there is no important gap in her information about the situation that makes her belief true.
Information comes in many sizes and shapes, however. The red ball that is on the
1 See Paul Boghossian’s discussion of the view that there are no truths, only truths relative to a
particular way of thinking, in Fear of Knowledge (Oxford: Oxford University Press, 2006), especially
Chapters 3 and 4. See Richard Foley, Working Without a Net (Oxford: Oxford University Press, 1993),
especially Chapter 4, for a discussion of the idea that the concept of belief might be discarded in favor of
a concept of degrees of belief.
mat in the hallway has a precise circumference. It also has a definite weight. It is
made of rubber. The rubber is a certain shade of red. The mat likewise has its own set of
specific characteristics. So does the hallway. Its ceiling is of a certain height. Its walls
are covered with black walnut paneling. It has a mahogany door leading to the outside.
There are historical truths about the situation as well. The black walnut paneling was
installed last year. It cost $25 per square foot. The ball was bought two months ago in a
Target store in Brooklyn. These historical truths are connected to still others. The rubber
making up the ball came from a tree grown on a rubber plantation in Kerala, India. The
rubber plantation belongs to Harrisons Malayalam Ltd., which has operated in India since
the middle of the 19th Century. The HML corporation also grows tea. Negative
information is also information. There is not a bicycle in the hallway. Nor is there a
truck or an oak tree. The hallway does not have a linoleum floor. The ball is not made of
steel or wood or wool. It is not purple or yellow, and is not larger than a standard
basketball.
There is no end to the truths associated with there being a red ball on the mat in the
hallway. They radiate out in all directions. Nor is this situation atypical. Every situation
is metaphysically lush, brimming over with truths.
Information is by comparison arid. No one, no matter how well informed, can be in
possession of all truths about a situation. Even being in possession of most is too much
to expect. If the number of such truths is not infinite, it is at least mind numbingly vast.
Our grasps of situations are thus inevitably partial. Not all partial grasps are equal,
however. Sometimes the information we lack is important. Sometimes it is not. If the
latter, we know.
Whether a true belief constitutes knowledge thus hinges on the significance of the
information one has and lacks about the matter at issue. This means that questions of
knowledge cannot be separated from pragmatic considerations.
This is getting ahead of the story, however. The best way to understand the view
that knowledge is sufficient information and appreciate how it reorients the theory of
knowledge is to contrast it with received views.
2. Post-Gettier accounts of knowledge
Before leaving her office, Joan always puts her laptop on the corner of her
desk. Unbeknownst to her, her laptop has just been stolen and placed by the thief on the
corner of the desk in his apartment. Joan believes that her laptop is on the corner of a
desk, and in fact it is, but she does not know this.
On Tuesday evening, Mary went to sleep at 11 PM as is her habit, unaware that she
had been given a sleeping potion that would cause her to sleep thirty-two hours instead of
her usual eight. When she awakes in her heavily curtained and clock-less bedroom on
Thursday, she believes it is about 7 AM, because this is the hour at which she regularly
wakes. It is in fact 7 AM, but she nonetheless does not know this to be true.
Jim has bought a ticket in a lottery of a million tickets. The winning ticket has
been chosen but not yet announced. Jim believes that his ticket is not the winner, and he
is correct, but he does not know this to be the case.
Examples such as these, which are familiar from the contemporary literature and
can be multiplied indefinitely, illustrate two points about the concept of knowledge,
which in turn create an agenda for the theory of knowledge. The first point is that one
can have a true belief P and yet not know P. The second is that one can even have a
justified true belief P and yet not know P. The agenda thereby established is to identify
what has to be added to true belief in order to get knowledge. If the assumption is made
that knowledge requires justified belief, the more specific agenda is to identify what has
to be added to justified true belief to get knowledge.
This agenda has dominated epistemology since Edmund Gettier’s influential 1963
article.2 Gettier used a pair of examples to illustrate that one can be justified in
believing a falsehood P from which one deduces a truth Q, in which case one has a
justified true belief Q but does not know Q. Gettier’s article inspired a host of similar
examples, and the search was on for a condition that could be added to justified true
belief to produce knowledge.
There has been no lack of proposals. Some have suggested that a special kind of
justification is needed. The justification has to be non-defective in the sense that it must
not justify any falsehood,3 or it has to be indefeasible in the sense that it cannot be
2 Edmund Gettier, “Is Justified True Belief Knowledge?” Analysis XXIII (1963), 121-123.
3 Roderick Chisholm, Theory of Knowledge, 2nd ed. (Englewood Cliffs, NJ: Prentice-Hall,
1977), 102-118; Ernest Sosa, “Epistemic Presupposition,” in G. Pappas, ed., Justification and
Knowledge (Dordrecht: Reidel, 1979), 79-92; and Ernest Sosa, “How do you know?” in E. Sosa,
defeated by the addition of any truth.4
Other epistemologists, however, inspired by the movement to “naturalize”
epistemology, came to think that justification based approaches were misdirected. They
argued that we cannot intellectually defend much of what we know, and hence something
less explicitly intellectual than justification is needed to understand knowledge. One
early idea was that knowledge requires there to be an appropriate causal connection
between the fact that makes a belief true and the person’s having that belief.5 This idea
handled the original cases described by Gettier, but it ran into problems with
knowledge of mathematics, general facts, and the future.
Nevertheless, in the eyes of many, whatever its defects, the causal theory of
knowledge had the virtue of shifting the focus away from questions of being able to
justify one’s beliefs intellectually and towards questions of being in an appropriate causal
or causal-like relation with one’s environment. The task was to identify the precise
character of this relation. If a simple causal connection between the fact that makes a
belief true and the belief itself won’t do, some other relation needed to be found.
Again, there has been no lack of proposals. One idea is that for a true belief
to count as knowledge, it must be reliably generated.6 A second idea is that in
close counterfactual situations the subject’s beliefs about the matter in question
must track the truth.7 A third is that the belief must be the product of properly
functioning cognitive faculties.8 And, there are important variants of each of these
proposals.9
Knowledge in Perspective (Cambridge: Cambridge University Press, 1991), 19-34.
4 Robert Audi, The Structure of Justification (Cambridge: Cambridge University Press, 1993);
Peter Klein, Certainty: A Refutation of Scepticism (Minneapolis: University of Minnesota Press,
1981); Keith Lehrer, Knowledge (Oxford: Oxford University Press, 1974); John Pollock,
Contemporary Theories of Knowledge (Totowa, NJ: Rowman and Littlefield, 1986); and
Marshall Swain, Reasons and Knowledge (Ithaca: Cornell University Press, 1981).
5 Alvin Goldman, “A Causal Theory of Knowing,” The Journal of Philosophy 64 (1967), 357-372.
6 Alvin Goldman, Epistemology and Cognition (Cambridge: Harvard University Press, 1986).
7 Robert Nozick, Philosophical Explanations (Cambridge: Harvard University Press, 1981).
8 Alvin Plantinga, Warrant: The Current Debate (Oxford: Oxford University Press, 1993).
9 For example, see D.M. Armstrong, Belief, Truth, and Knowledge (Cambridge: Cambridge
University Press, 1973); Fred Dretske, Knowledge and the Flow of Information (Cambridge:
MIT Press, 1981); David Lewis, “Elusive Knowledge,” Australasian Journal of Philosophy 74
(1996), 549-567; and Ernest Sosa, Knowledge in Perspective, especially Chapters 13-16.
All these proposals, however, assume that what needs to be added to true
belief in order to get knowledge is something related to true belief but distinct from
it --- non-defective justification, indefeasible justification, reliability, truth tracking
in close counterfactual situations, proper functioning, or whatever. My
suggestion, by contrast, is that whenever S has a true belief P but does not know P,
there is some significant piece of information that she lacks. So, what has to be
added to a true belief P in order to get knowledge? More true beliefs, especially
more true beliefs in the neighborhood of P. In particular, there must not be
important truths in the neighborhood of P of which S is unaware. A fortiori, there
must not be important truths that S positively disbelieves.
A merit of this view is that there is a concrete way to test it. If S has a true
belief P but nonetheless does not know P, then according to the above view, it
ought to be possible to identify a proposition Q such that (i) Q is a significant truth
about the situation in which P is true, and (ii) S does not believe Q. I claim that
this test can always be successfully met.
Why does Joan not know that her laptop is on the corner of a desk, and Mary
not know that it is 7 AM, and Jim not know that his lottery ticket is not the winner,
even though their beliefs are true? They lack significant true beliefs about the
situation. Joan isn’t aware that her laptop has been stolen and that the desk on
which it now sits is that of the thief; Mary isn’t aware that she is just waking up
from a drug induced sleep; and Jim isn’t aware which ticket has won the lottery.
3. Knowledge Stories
Contemporary theory of knowledge is driven by stories. A common practice
is to tell a tiny story, use it to elicit an intuition as to whether the subject has or
lacks knowledge, and then draw a moral for the theory of knowledge. Some of
these stories are stripped down versions of familiar situations. Others depict
unusual circumstances, for example, my story about Mary and her drug induced
sleep. Still others are beyond the merely unusual, for example, stories about brains
in vats. A superficial risk of this methodology is that of turning epistemology into
a playground for amateur storytellers. A deeper issue is that every story is of
necessity incomplete.
Sartre once remarked that in writing philosophy the aim is to eliminate ambiguity
and vagueness and thereby fix upon a meaning that discourages multiple interpretations,
whereas in writing fiction the aim is the opposite, to create different levels of meaning
that resist a single interpretation. Whether Sartre’s fiction or his philosophy succeeds on
these terms is debatable, but his remark is instructive nonetheless. The story telling of
contemporary epistemology creates small fictions in hopes of fixing upon philosophical
points about knowledge, but the stories are always susceptible to multiple interpretations,
if only because they are incomplete, with ways of filling in the unsupplied details that
potentially blunt the intended lesson. Sparely told stories are especially susceptible to
reversals. The fewer the details, the more room there is for retelling.
The moral is not that stories cannot be useful for epistemology. They can be if
used judiciously. More on this later, but for now the relevant points are that
contemporary theory of knowledge is driven by stories, especially ones in which the
subject has a true belief but seems to lack knowledge; the scenarios I sketch above (of
Joan, Mary, and Jim) are themselves examples of such stories; and these stories, mine
included, have a similar structure. In particular, they make use of the common literary
device of providing the audience with information that the subject does not have.
Moreover, the stories are told in a way that suggests that this information is important.
Consider the much discussed barn case. George, who is a sightseer touring in his
automobile through farm country, is charmed by the picturesque old barns he is seeing.
He pulls his car over to the side of the road to spend a moment gazing at the latest barn he
has happened across. Unbeknownst to him, the local tourist board has populated the
region with highly realistic facades of old barns, and up until now he has been seeing
these facades, not real barns. By chance, however, he has stopped in front of one of the few
genuine barns left in the area. As he stares out of his car window, he thus has a true belief
that he is looking at an old barn, and yet there is a pull upon us, the audience, to concede
that his true belief does not rise to the level of knowledge.10
Why is this? Is it because the justification he has for his belief is defeasible, or
because his normal perceptual processes are unreliable in this locale, or because his belief
would not track the truth in close counterfactual situations? All these may well be true of
him, but the most obvious explanation is the simplest. He lacks important true beliefs
10 See Carl Ginet, “The Fourth Condition,” in Philosophical Analysis, ed. D. Austin (Dordrecht:
Kluwer Academic Publishers, 1988); and Alvin Goldman, “Discrimination and Perceptual
Knowledge,” The Journal of Philosophy 73 (1976), 771-791.
about the situation. He is unaware that there are numerous, highly realistic barn
facades in the area and unaware as well that on his tour up until now he has been seeing
these facades instead of real barns.
All the stories in the vast post-Gettier literature that describe a subject who has a
true belief but seems to lack knowledge can be given a similar treatment. Each can be
seen as an attempt to draw attention to some aspect of the situation about which the
subject S lacks true beliefs and to suggest that this aspect is in some way significant. To
the degree that we the audience are convinced that the gap in S’s information is indeed
significant, we conclude that S has a true belief but does not know.
4. Intuitions about knowledge stories
Intuitions about whether someone knows something are varied, frequently sloppy,
and at times in conflict with one another. Epistemologists react differently to this
complexity. Some ignore it. Some question whether the ordinary concept of knowledge
is coherent.11 Others try to shrink the complexity to a manageable level by introducing
qualifiers such as “real” or “genuine,” with recalcitrant cases dismissed as ones in which
the subject lacks real or genuine knowledge.12 Still others try to impose uniformity by
insisting on such high standards of knowledge that most claims of knowledge turn out to
be false.13
My approach lies in between. It assumes that we know a good deal and that it would be
folly to claim otherwise,14 but it concedes that intuitions about knowledge can be puzzling,
even jumbled. These intuitions can nonetheless be useful in constructing a theory of
11 See Noam Chomsky, Rules and Representations (New York: Columbia University Press,
1980), p. 82.
12 See David Lewis, “Scorekeeping in a Language Game,” in Philosophical Papers, vol.1 (New
York: Oxford University Press, 1983).
13 Contrast with Peter Unger, Ignorance: A Case for Scepticism (Oxford: Oxford University
Press, 1975 and 2002); Richard Fumerton, Epistemology (Oxford: Blackwell, 2006), especially
Chapter 2; and Richard Fumerton, Meta-epistemology and Skepticism (Boston: Rowman and
Littlefield, 1996).
14 Compare G.E. Moore, “Proof of an External World,” in T. Baldwin (ed.), G.E. Moore:
Selected Writings (London: Routledge, 1993), 147-170. For an especially good discussion of
Moore’s “proof,” see James Pryor, “What’s Wrong with Moore’s Argument?” Philosophical
Issues 14 (2004), 349-378.
knowledge, but as starting points, not rigid constraints. They vary too much from
person to person and occasion to occasion, and when stories are used to elicit them, the
stories themselves are too underdescribed.
Besides, regarding knowledge intuitions as hard data to which theories must be
adequate makes for a feeble-hearted epistemology. It invites the question of whose
intuitions are supposed to matter. Those with the concept of knowledge? Speakers of
English? Philosophers? In addition, if the point of a theory of knowledge were to be
adequate to some group’s intuitions about knowledge, the working procedure ought to be
to survey in a systematic way the intuitions of the relevant group, thereby collecting data
as clean and representative as possible. No philosopher actually does this, however, and
with good reason. A philosophical theory of knowledge ought to be made of loftier stuff.
It is more important for a theory of knowledge to explain how and why intuitions
about knowledge arise than to conform to them in a rote way, and more important also
for the theory to provide a general framework for engaging the various philosophical
questions and puzzles that arise about knowledge, from those about why knowledge is
valuable and what its relationship is with justified belief to whether it can be acquired by
luck and what kind of knowledge, if any, is possible in lottery and preface cases.
There are limitations as to what a theory of knowledge should be expected to
accomplish, however, and not just because the intuitions are complex and the questions
and puzzles difficult. There are also distinct kinds of knowledge, which might well
require separate theories. There is knowledge of people, places, and things; S knows
Mike Bloomberg, knows Greenwich Village, and knows various Internet sites. There is
also knowledge of concepts; S knows what valid and sound arguments are and what an
irrational number is. In addition, there is knowledge of subject areas; S knows baseball
and knows American politics. Knowledge how is yet something different; S knows how
to fix a faulty light switch and how to telephone New York from Paris. S has knowledge
of various facts as well; some are general (the boiling point of water is 212°F), some
specific (Accra is the capital of Ghana), and some personal (her mother’s family name
was Hass).
There is also non-human knowledge, for example, dogs that know when their
owners are approaching and birds when a big storm is due. Body parts are even
sometimes said to know. The liver knows when to excrete bile and the pituitary gland
how and when to regulate the thyroid. Knowledge is also attributed to machines; my
laptop knows when to shut itself down to prevent overheating.
Some of these attributions are metaphorical, no doubt. No matter. My focus is
upon human knowledge, and specifically on human knowledge of facts. Such knowledge
may well be linked with other kinds of knowledge. It may even be the ground for some.
But even setting aside questions of its relationships with other kinds of knowledge, there
are difficulties enough in providing an account of it, because even within this category,
there appears to be great variation, especially with respect to the amount of neighboring
information required.
Often the expectation is that one needs broad and deep information about the
matter in question in order to have knowledge of it, but sometimes only scanty
information seems required. S is a contestant on a quiz show who is asked the date of the
Battle of Marathon. She is able to recall from her high school world history course that
the battle took place in 490 BCE, but she may not remember that the Greeks won the
battle or even that the adversaries were the Greeks and Persians. Still, she at least knows
the date, it seems.
Stories can be helpful in illuminating what kinds of associated information are
required for knowledge, but for the reasons already cited, they need to be treated with
care: the stories are incomplete; there are different concepts of knowledge; and our uses
of these concepts are not precise. Intuitions about knowledge stories are correspondingly
malleable, and the stories themselves can be told to exploit this malleability.
All stories are structured to elicit reactions --- laughter, sympathy, outrage, or
whatever --- and there are standard techniques available to the storyteller for generating
these reactions. The most basic is selectivity. Real situations are metaphysically lush,
but stories are selectively incomplete. The storyteller decides which of a potentially
limitless set of details about characters and settings to omit and which to include, and
what kind of emphasis to put on those that are included. The ways in which these
storytelling decisions are made pull the listener’s reactions in one direction or another.
It is no different with knowledge stories. Stories in which S has a true belief P can
be told to make gaps in her information about P seem important enough that they
preclude her from knowing, but they can also be told to make whatever gaps there are in
her information seem inconsequential. Call the latter a “narrow telling” and the former a
“broad telling.”
One way to tell a narrow knowledge story is to dwell upon the importance of P
itself being true and thereby diminish the significance of those aspects of the
situation of which the subject is unaware. The barn story, for example, can be retold to
lower the significance of George’s being unaware of barn facades at other nearby
locations. Imagine that the location of the barn where he has stopped his car is the site of
a meaningful event of his childhood, and it was his memory of this event that motivated
him to return to the region and seek out this specific location. He thus has little interest in
the other barns he has apparently been seeing. It is the barn at this particular location that
is the purpose of his trip. Moreover, he has vivid memories from his childhood of
specific characteristics of the barn: the sharper than normal pitch of the roof, the
placement of the barn’s openings, its position on the landscape, the crowbar shaped
handle on the main door, and so on. All these details match what he now sees in front of
him.
As the story has been retold, it may no longer seem as important that George is
unaware of the barn facades in the region. Correspondingly, in hearing the story, the
audience may be more ready to concede that he knows he is looking at an old barn,
indeed, the very barn he remembers from his childhood.
Can a knowledge story ever be told so narrowly that all that matters is whether S
has a true belief P, without any surrounding information? Return to the story of S who
when asked the date of the Battle of Marathon is able to reply immediately that it
occurred in 490 BCE but is not aware of such adjacent information as who the opponents
were, who won the Battle, and so on. Still, we may judge, she knows the year of the
battle.15
It is not so easy to have a single true belief in isolation, however. Beliefs come in
clusters. Larger or smaller clusters perhaps, but clusters nonetheless. This is to be
expected given that truths themselves are clustered. The truth that the Battle of Marathon
occurred in 490 BCE is conceptually intertwined with truths about location (Marathon is
the name of a location in Greece; Greece is on the Mediterranean Sea; the Mediterranean
Sea is a body of water on the planet Earth; etc.) as well as ones about time (BCE stands
for “before the common era”; the date 490 makes use of a dating system that has a year as
its core unit of time; a year is the period of time it takes the Earth to make a complete
revolution around the sun; etc.) and battles (a battle is a large-scale fight between armed
15 For other such cases, see John Hawthorne, Knowledge and Lotteries (Oxford: Oxford
University Press, 2004), pp. 68-9; Crispin Sartwell, “Why Knowledge is Merely True Belief,”
The Journal of Philosophy LXXXIX (1992), 167-180; and Sartwell, “Knowledge as Merely True
Belief,” American Philosophical Quarterly 28 (1991), 157-164.
forces; an armed force is equipped with weapons; weapons have the capacity to
inflict harms; etc.). To believe that the Battle of Marathon occurred in 490 BCE, S need
not be aware of all conceptually connected truths, but she does have to be aware of many.
Otherwise, it would be questionable whether she understands, much less believes, the
proposition that the Battle of Marathon occurred in 490 BCE.
In addition, background information typically plays a role in the acquisition of
beliefs, even simple beliefs. S enters a room, sees a ball on the floor, and comes to
believe that the ball is red, but she is also in possession not only of general information
about how vision allows her (and other humans) to determine the colors of objects and
how it is not unusual for balls to be colored red, but also of such specific information as
that the lighting in the room is not such as to make everything appear red and that she is
not wearing red-tinted sunglasses. If the proposition that the ball is red did not cohere with this
background information, a fortiori if it were in conflict with it, she might well have not
believed the proposition.16
There are, in other words, built-in limitations on just how scanty S’s information
can be while still knowing P. This having been said, the Marathon story and others like it
do succeed in pulling intuitions in the direction of at least extremely narrow knowledge.
They do so by focusing attention on a key truth around which the story revolves and by
making it clear that the subject of the story is aware of this truth, whatever her other
deficiencies may be.17
As the Marathon story also illustrates, one way of narrowing the focus is to have
the story revolve around a specific question that the subject knows how to answer. When
asked on the quiz show the date of the Battle of Marathon, S immediately responds “490
BCE,” but does not remember that the Greeks won the battle or that the adversaries were
the Greeks and Persians. Still, it may seem she at least knows the date. Why is it that
these gaps in her information may not strike us as all that important? Because the story
16 This is a point that coherence theorists emphasize in order to show how their theory is
compatible with simple observational and memory beliefs being justified. See, for example,
Gilbert Harman, Change in View (Cambridge: MIT Press, 1986), and Keith Lehrer, Knowledge
(Oxford: Oxford University Press, 1974).
17 Here is another such story. When asked to provide the decimal expansion of pi to 5 decimal
places, S immediately responds, “3.14159.” S understands that pi is the ratio of a circle’s
circumference to its diameter, but she is otherwise largely ignorant of its properties (that pi is an
irrational number, etc.) and its uses in geometry (that pi can be used to calculate the area of a
circle via the formula A = πr², etc.). Nevertheless, it may seem as if she at least knows this
much about pi: that its decimal expansion to 5 decimal places is 3.14159.
has been told to suggest that what matters is S’s knowing how to answer the
specific question: what is the date of the Battle of Marathon? Knowing how seems to be
towing knowing that behind it in its wake.18
5. Significant gaps
To divide true beliefs that potentially rise to the level of knowledge from those that
do not, theories of knowledge identify a dimension for making the division. Other
conditions may also need to be satisfied, but this dimension does the initial work in
qualifying a true belief as a prima facie candidate for knowledge.
According to justification-based theories, the relevant dimension is the strength of
the person’s evidence. According to reliability theories, it is the degree of reliability of
the processes generating the belief. For tracking theories, it is the extent to which the
person’s belief would track the truth in close counterfactual situations.
Whatever the proposed dimension, there is no non-arbitrary way of specifying a
precise point along it at which a true belief becomes a candidate for knowledge.
Proponents of justification-based theories may insist that for S to know P, the proposition
has to be more probable than not on her evidence, or if they are willing to concede there
is little we know, they may try maintaining that the probability has to be 1, but between
these extremes, it is not possible to stipulate a specific degree of probability greater than
.5 and less than 1 as the exact knowledge point.
Similarly, reliabilists must be content with the observation that the process
producing a true belief has to be reliable enough to make a claim of knowledge prima
facie credible. It is not possible to identify a precise degree of reliability greater than .5
and less than 1 as the knowledge point. Likewise for tracking theories. How extensive
must be the range of close counterfactual situations in which S’s belief would track the
truth? The most that can be said is that it has to be extensive enough to make S’s belief P
a good candidate for knowledge.
18 More exactly, it does so on the presupposition that S understands the proposition P about
which the question is being asked. If S does not understand P, she cannot believe P and
accordingly cannot know P (even if she does know how to answer a question about it).
I have been arguing, however, that the relevant dimension is not evidence,
reliability, or truth tracking, but rather information about the situation in which P is true.
On this view too, there is no way to quantify a precise knowledge point. How much
information must S have in order for her true belief P to be a good candidate for
knowledge? Enough.
On the other hand, there is a negative test of what constitutes enough. When S has a true belief P and does know P, there cannot be significant truths in the neighborhood of
P that S lacks. When one has a true belief P, there are always gaps in one’s information
about P, but if one has knowledge, the missing information is not significant.
When there is a disagreement about whether someone knows something, this is not
a test that will always allow the disputants to resolve their difference. Then again, it is not
the job of a theory of knowledge to provide a decision mechanism for hard cases. A
theory, however, should be able to explain what it is that makes these cases hard, and in
doing so should also be able to sort out the kinds of issues at stake.
Disputes over whether S knows P are sometimes simply disagreements about what
is true. If A and A* agree that S believes P but disagree whether P is true, it should not
be surprising if they also disagree whether she knows P. Alternatively, if they agree that
P is true but disagree whether S believes P, it again would not be surprising if they were
to be at odds over whether she knows P. Disputes can also be the result of disagreements
about what other information S has. If A and A* agree that P is true and that S believes P
but disagree about what additional information she has about the situation that makes P
true, this too can lead them to disagree over whether she knows P. In addition, according
to the sufficient information account, a dispute can be about whether or not the gaps in
S’s information about P are significant. Indeed, often it is this that makes hard cases
hard.
Suppose A and A* agree that S has a true belief P, agree also on what else is true
about the situation, agree on which of these other truths S believes, but nonetheless
disagree whether S knows P. According to A, S knows P, whereas according to A*, she
does not. Contextualists are willing to grant that it is possible for both A’s and A*’s
claims to be true, since the contexts of their claims may differ.19 My concern, however,
19 For some discussions of contextualism and related issues (e.g., closure), see Stewart Cohen,
“Contextualist Solutions to Epistemological Problems: Skepticism, Gettier, and the Lottery,”
Australasian Journal of Philosophy (1998), 76: 289-306; Cohen, “Contextualism Defended,”
Philosophical Studies (2001), 103: 87-98; Keith DeRose, “Contextualism and Knowledge
is not with contextualism but rather with whether there need be anything
distinctively epistemological at stake in this kind of disagreement between A and A*. I
think not.
Consider an analogy. I am told that a measurement done by a workman is in error
by an inch. If I am asked whether I regard this as a significant error, I should not answer
until I know what is being measured and for what purposes. If the measurement is of a
wood slat that is to be used for the back of a chair, an inch error may lead to a gap in the
joint and hence be of consequence. On the other hand, if the measurement is of a two-hundred-foot lot and is to be used to guide the framing of a house, an error of an inch is
probably not consequential. The point, and it is an altogether familiar one, is that there is
no single standard to turn to in answering questions about what counts as a significant
error of measurement. I need rather to know or presuppose something about the purposes
and aims of the measurement and have this inform whatever answer I make.
Return now to the case in which A and A* agree that S has a true belief P, agree on
what else is true about the situation in which P is true and upon which of these other
truths S believes, but disagree about whether S knows P. According to the sufficient
information approach, whenever S has a true belief P but does not know P, there is a
significant truth about the situation that S does not believe. So, in saying that S lacks
knowledge, A* is committed, on this view, to there being some such truth. Suppose A*
thinks that the significant truth that S lacks is T. Since by hypothesis A and A* agree on
what S believes and agree also on what else is true about the situation, A is committed to
T’s not being a significant gap.
At this point, insofar as I want to understand whether and how A and A* disagree,
I need to ask A*, “Why is not being aware of T a significant gap?” and correspondingly
ask A, “Why isn’t it a significant gap?” On the other hand, as with significant errors of
measurement, it should not be assumed there is a simple standard for them to appeal to in
answering such questions. Various considerations can legitimately be cited as reasons for
thinking that a gap is significant enough to preclude knowledge, including pragmatic
Attributions,” Philosophy and Phenomenological Research (1992), 52: 913-29; Richard
Feldman, “Skeptical Problems, Contextualist Solutions,” Philosophical Studies (2001); John
Hawthorne, Knowledge and Lotteries (Oxford: Oxford University Press, 2004); David Lewis,
“Elusive Knowledge,” Australasian Journal of Philosophy (1996), 74: 549-67; James Pryor,
“Highlights of Recent Epistemology,” British Journal for the Philosophy of Science (2001), 52: 95-124; and Peter Unger, “The Cone Model of Knowledge,” Philosophical Topics 14 (1986), 125-78.
considerations.
Consider a foreman who is responsible for the air purification equipment in a
factory that makes highly noxious chemicals. The equipment is exceedingly well
engineered, and there has never been any problem with it. Accordingly, the foreman has
strong inductive evidence that it is once again in good working order today. His job
nonetheless requires him to test the equipment daily, which he does on his morning
rounds. He is unaware, however, that the testing mechanism has just malfunctioned in
such a way that even if there were a problem with the air purification equipment, it would
not register on the testing device. When he uses the device, it registers negatively as it
always has in the past, and hence he believes, again as he always has, that the air
purification equipment is working properly. Moreover, he is correct. Nevertheless,
because his ignorance of the faulty testing device seems to represent a significant gap in
his information, there is a pull away from regarding his true belief as a case of
knowledge. Why does this gap seem significant? The most obvious answer invokes
pragmatic considerations. In other circumstances, a gap of this sort in his information
might well lead to a loss of life.
In Knowledge and Practical Interests, Jason Stanley presents a variety of examples
designed to show that the standards of knowledge tend to go up as the practical stakes go
up.20 The sufficient information approach explains why this should be so. Whenever S
believes P and P is true, there are inevitably gaps in S’s information about P. The
relevant question is whether the gaps are significant. If a gap in S’s information could
lead to a major harm, there is a presumption that it is significant. Indeed, the greater the
harm, the stronger the presumption.
In Stanley’s examples and in the above case involving the foreman, pragmatic
considerations play a direct role in determining whether S’s true belief P is an instance of
knowledge, but they also play a role, albeit indirect, even when the relevant gap
immediately concerns an intellectual defect or difficulty of some sort.
One of the core insights of pragmatist epistemology is that there is no purely
intellectual test for how important a piece of information is. Information about the
atomic numbers of chemical elements is more significant than information about the
20 Jason Stanley, Knowledge and Practical Interests (Oxford: Oxford University Press, 2005).
See also Jeremy Fantl and Matthew McGrath, “Evidence, Pragmatics, and Justification,” The
Philosophical Review (2002), 67-94; and John Hawthorne, Knowledge and Lotteries (Oxford:
Oxford University Press, 2004), Chapter 4.
number of grains of salt in a saltshaker, but not because the former is intrinsically
more important than the latter, and not solely for intellectual reasons either. Its
importance, rather, derives from complex links with human interests, needs, and projects,
in other words, from connections with our lives in all their details and purposes. With
enough imagination, it might be possible to conceive a world in which the number of
grains of salt in saltshakers is as significant as the atomic numbers of chemical elements,
but this is not our world.
Keeping this point in mind, recall stories in which S has a true belief P but is
unaware that the processes and faculties that produced her belief are unreliable, or
unaware that they would not track the truth in close counterfactual situations, or that her
evidence for P also justifies falsehoods, or that there is a truth commonly possessed by
others in her community that would undermine her evidence for P. When considering
whether S knows P, gaps of these sorts can seem significant even if P itself is about
something so unimportant that the gap is unlikely to bring about harms of any magnitude.
Why is this? Because of broader pragmatic considerations. Even though the gap may not
create difficulties in the situation at hand, were she to employ this same unreliable
process on other occasions, it would produce a widening circle of error, which in turn
would adversely affect her overall decision-making, with attendant harmful effects on the
quality of her life and the lives of others as well. Similarly, ignorance of relevant and
commonly available information, if generalized, would be likely to multiply the number
of mistaken beliefs that S has, and this in turn would likely result in multiple harmful
effects.21
Return to the story of the foreman and air purification equipment, and contrast it
with a second story about the same foreman. Each morning, before going to work, he
uses an electric toaster to toast a piece of bread. The toaster has a small red light that
comes on when the handle is pressed down, indicating that the machine’s coils are
heating. The make of the toaster has a very high reliability rating, and the foreman has
21 Suppose, however, that S is aware that the processes and faculties that produced her belief P are unreliable (or
that there is a truth that undermines her evidence for P) and yet still believes P. For example, retell the barn story so
that George is aware that there are many barn facades in the region that when viewed from the road look like real
barns. As he sits in his car gazing out the window, he nonetheless believes that it is a real barn he is seeing, and he is
correct. By chance he has stopped in front of one of the few remaining real barns in the area. Still, he lacks knowledge, it
would seem. But why? By hypothesis, he is aware that there are barn facades in the area. So, this isn’t the
significant gap in his information. On the other hand, other gaps in his information now take on more importance.
In particular, as the story has been retold, he should no longer be content with the information available to him from
his car window. A careful believer under these circumstances would gather additional information. Before
accepting that it is a barn he is seeing, not simply a barn façade, a careful believer would get out of the car, walk up
to the barn for a closer look, examine it from the rear, and so on.
never had problems with his. He thus has strong inductive evidence that it will
toast his bread as usual again this morning. Unbeknownst to him, however, the circuitry
in the toaster has just shorted in such a way that the red indicator light will come on even
if the coils are not heating. He pushes the handle down, the red light illuminates, and as
he goes to prepare his coffee, he believes that his bread is toasting. And indeed it is. The
toaster’s heating coils are working just as they always do.
In most ways, this story is analogous to the one about the air purification system,
and yet the initial pull against his true belief (that his bread is toasting) being a case of
knowledge may not be as strong. If so, why? The most ready explanation is that the gap
in his information here (that the warning light has malfunctioned) is not as important as
the gap in the previous story. The stakes of being wrong are not as high, and he has no
job-related responsibilities to ensure that the toaster is in good working order.
On the other hand, to the degree that there still is a pull against his true belief being
an instance of knowledge, that too is explainable. There is a flaw in his evidence (his
mistaken belief that the red light is a reliable indicator that the toaster’s coils are
heating) and a corresponding unreliability in the processes that led to his belief, and
although his lack of awareness of this flaw and unreliability is unlikely to cause major
problems in the immediate situation since so little is at stake, they are the kind of defects
that if generalized would be likely to lead to harmful consequences in other situations.
Indeed, telling the stories together (the one about the air purification equipment and the
one about the toaster) encourages this worry.
According to the sufficient information approach, whenever S has a true belief P
but does not know P, there is a significant gap in her information about P. The view thus
invites questions about what makes a gap significant. In the face of such questions, one
might try to formulate a set of necessary and sufficient conditions of significant gaps,
but even if such a project is doable, my aim is more modest. It is to illustrate that
assessments of whether S knows P are partially a function of whether one thinks there is a
significant gap in S’s information about P, and to illustrate as well that a complex set of
considerations can be invoked in making such assessments. Complex for two reasons.
First, every situation brims over with truths, so much so that every inquirer, no matter
how well informed, has numerous gaps in her information about the situation that in turn
create numerous opportunities for denying that she has knowledge. But in addition,
complex because the full range of purposes, aims, and needs that characterize human
lives are potentially relevant for these assessments.22
6. Maximally accurate and comprehensive beliefs
It can be difficult to resolve questions of whether a gap in S’s information
about P is significant enough to preclude her from knowing, but the room for such
questions to arise tends to shrink as S’s understanding of the situation in which P is
true becomes more and more complete. Accumulating truths does not always
produce understanding, however. One can acquire lots of information about a
situation and still not be in a position to have knowledge of it if the information is
unimportant. Moreover, how bits of information are connected is itself
information, indeed, often enough, crucial information.
Two-thirds of the way through the mystery novel, it may be obvious that the
detective has a true belief that the victim’s childhood friend is the guilty party and
obvious as well that the detective has uncovered all the important clues --- the
murder weapon, the jimmied lock, the empty safe, and so on. Nonetheless, if the
detective is unable to piece these clues together to form an accurate picture of how,
why and when the childhood friend committed the crime, readers are likely to
think that he does not yet know that the childhood friend is guilty. The significant
gap in his information is information about how the pieces of information in his
possession fit together.23
22 Is the sufficient information approach committed to contextualism? Not necessarily. It depends
whether a non-contextualist treatment can be given of what makes a gap in S’s information significant.
There are challenges of providing such a treatment, but nothing in the sufficient information approach
per se precludes it.
23 A passage from the Aeneid, Book III, lines 576-591 (trans. Allen Mandelbaum (New York: Bantam, 1971)), describing the cavern of the prophet Sibyl makes this point in a splendidly
imaginative way:
When on your way you reach the town of Cumae,
the sacred lakes, the loud wood of Avernus,
there you will see the frenzied prophetess.
Deep in her cave of rock she charts the fates,
Consigning to the leaves her words and symbols.
Whatever verses she has written down
Upon the leaves, she puts them in place and order
And then abandons them inside her cavern.
So, sheer quantity of information is not what makes it hard to deny that S
has knowledge. The point, rather, is that as S comes to have an increasingly
complete grasp of the situation in which P is true, where this involves not only
acquiring the important pieces of information but also seeing how they are linked,
it becomes increasingly difficult to deny that she knows P.
Consider a limiting case. Imagine that Sally’s beliefs are as accurate and
comprehensive as it is humanly possible for beliefs to be. She has true beliefs
about the basic laws of the universe, and in terms of these, she can explain virtually
everything that has happened, is happening, and will happen. She can accurately
explain the origin of the universe, the origin of the earth, the origin of life on earth,
the mechanisms by which cells age, the year when the sun will die, and so on. She
can even explain how it is that she came to have all this information.
Consider a proposition, Pcells, which concerns the ageing mechanisms in
cells. Because Sally’s beliefs about how cells age are maximally accurate and
comprehensive, there are few gaps of any sort in her information about Pcells, much
less important ones. Thus, she knows Pcells.
According to some theories of knowledge, one knows only if one’s true
belief has an appropriate ancestry. Call these “ancestral accounts.” Different
theorists have different views about the precise ancestry required. Some maintain
that the belief must have been caused in an appropriate way by the facts that make
it true; others say that the belief must be the product of cognitive processes that are
in general reliable about matters of the kind in question; still others say that the
belief must be the product of cognitive faculties functioning as they were designed
to function.
When all is still, that order is not troubled;
But when soft winds are stirring and the door,
Turning upon its hinge, disturbs the tender
Leaves, then she never cares to catch the verses
That flutter through the hollow grotto, never
Recalls their place and joins them all together.
Her visitors, when they have had no counsel,
Depart, and then detest the Sibyl’s cavern.
In the story about Sally, however, there is nothing about her beliefs being
caused by facts that make Pcells true or being the products of reliable processes or
being the products of properly functioning cognitive faculties. It is consistent with
the story that it was some combination of strange processes and unusual events that
against all odds generated her beliefs about how cells age. However strange these
processes and unlikely these events may be, Sally is fully aware of how in her case
they led to maximally accurate and comprehensive beliefs.
Yet, ancestral accounts seem committed to denying that Sally knows Pcells, or
for that matter much of anything else. Indeed, on the assumption that the cognitive
faculties of the rest of us sometimes operate reliably in producing true beliefs,
these accounts imply that the rest of us know more about the world than Sally
does. This is implausible. Sally has far greater knowledge than do the rest of us.
The apparent lesson here is that there is no single, privileged kind of causal
history that true beliefs need to have in order to count as knowledge. I say
“apparent,” because it might be maintained that it is not really possible for Sally’s
causal interactions with the world to have produced maximally accurate and
comprehensive beliefs and yet for these beliefs not to have had an appropriate
ancestry, for example, not to have been reliably generated.
This would be a difficult argument to make, however, if the reliability being
claimed is supposed to be the same kind that reliability theorists use to characterize
knowledge. Indeed, reliability theorists typically go out of their way to insist that a
process that in fact produces only true beliefs is not thereby necessarily the kind of
process that makes a true belief into a plausible candidate for knowledge.24 On
the other hand, it is a different matter if the claim is merely that Sally’s beliefs
must be reliable in some minimal sense. Every story is incomplete. The story
about Sally is no exception, and it may be that the details of this story cannot be
fully filled in without assuming an orderly universe that Sally has interacted with
in law-like and hence at least minimally reliable ways.
So be it. The key point for purposes here remains: the simplest and
best explanation of why Sally knows is not that her beliefs are the products of
processes that are reliable in some minimalist sense, but rather that they are
accurate and comprehensive.
24 See, for example, Alvin Goldman, Epistemology and Cognition, pp. 106-109.
Tracking accounts of knowledge face similar problems. The statement that
Sally’s beliefs are maximally accurate and comprehensive is not inconsistent with
the statement that they would not track the truth in close counterfactual situations.
But according to tracking accounts, if in close counterfactual situations Sally
would not have accurate beliefs about the origin of the universe, how life came into
existence on earth, and the mechanisms by which cells age, she does not have
knowledge.
Once again, this is implausible. By hypothesis, Sally has maximally
accurate and comprehensive beliefs about these matters, and also maximally
accurate and comprehensive beliefs about the fortunate circumstances that led her
to have such beliefs. Thus, if it is the case that her beliefs would not track the truth
in close counterfactual situations, she is aware of this and understands why they
would not do so. She is fully aware, in other words, of the fragility of her
information. Fragility, however, is not incompatible with knowledge. Think by
analogy of powerfully precise scientific instruments whose workings are so
delicate that if the operating conditions were changed in minor ways, they would
no longer be accurate. Although even minor vibrations would create inaccurate
readings, as long as those using the instruments are aware that there are no such
vibrations, these counterfactual inaccuracies are beside the point. So it may be with
Sally’s beliefs.25
What about justification based theories? It might seem easier to argue that
Sally’s beliefs must of necessity be justified, since by hypothesis she can explain
the origin of the universe, the origin of the earth, the origin of life on earth, the
mechanisms by which cells age, etc. Even so, at least some theories of justified
belief are such that it is not a foregone conclusion that Sally’s beliefs meet their
requirements.
Consider explanatory coherence accounts that use intellectual virtues such as
simplicity and conservatism to distinguish among equally coherent but competing
sets of beliefs. There are no assurances that Sally’s beliefs have such virtues. A set
of beliefs that is maximally accurate and comprehensive is not necessarily the
simplest or most conservative.
25 See the related discussion of stability in Chapter 15.
Or consider foundationalist theories, according to which a belief is
justified only if it is self-justifying or can be adequately defended in terms of one’s
self-justifying beliefs, where not just any kind of proposition can be believed in a
self-justifying way. Traditional foundationalists, for example, insist that only
beliefs about relatively simple necessary truths and about what one is currently
experiencing can be self-justifying. There are no assurances, however, that Sally’s
beliefs about the origin of the universe, how life came into existence on earth, and
the mechanisms by which cells age can be adequately defended in terms of her
beliefs about simple necessary truths and what she is experiencing.
So, at least on some familiar theories of justification, Sally’s beliefs, despite
being maximally accurate and comprehensive, are not necessarily justified. Which
is not to say that these theories fail as accounts of justified belief. If it were
stipulated that justified belief is a prerequisite of knowledge, there might be a case
for this conclusion, since given this stipulation, any suggestion to the effect that
Sally’s beliefs about Pcells are not justified is also a suggestion that she does not
know Pcells, but she does know Pcells.
It is a mistake, however, to stipulate that justification is a prerequisite of
knowledge. More on this in Chapter 8. It is equally a mistake to stipulate that
maximally accurate and comprehensive beliefs of necessity also have to be
justified. After all, why should it be an a priori constraint on theories of justified
belief that there be such a tight, necessary connection between how the world is
and how humans are able to justify their beliefs?
On the other hand, there may be looser connections. Just as there may be
ways of arguing that maximally accurate and comprehensive beliefs inevitably
satisfy minimal standards of reliability, so too it may be that such beliefs inevitably
satisfy minimal standards of reasonability. Indeed, the nature of belief may impose
some structural constraints on its own. Donald Davidson, amongst others, has
argued that it is not possible for something to qualify as a set of beliefs without
being minimally well ordered, specifically, without meeting minimum standards of
coherency.26 And even if the nature of belief per se doesn’t produce significant
constraints, the world may do so, or more cautiously, may do so to the degree that
26 See Donald Davidson, “On the Very Idea of a Conceptual Scheme,” Proceedings and
Addresses of the American Philosophical Association, 47 (1973-74), 5-20; and Davidson, “A
Coherence Theory of Truth and Knowledge,” in E. LePore (ed.), Truth and Interpretation:
Perspectives on the Philosophy of Donald Davidson (Oxford: Basil Blackwell, 1986), 307-19.
one’s beliefs are accurate and comprehensive. For, it may not be possible for
beliefs to be maximally accurate and comprehensive without their also being
largely coherent.27
None of these nuances affect the key point, however, which is that the most
basic, most obvious, and best explanation of why Sally has knowledge is that there
are no significant gaps in her information. There may be correlations between her
beliefs being maximally accurate and comprehensive and their having various
other interesting properties, but these other properties are not needed to explain
why she has knowledge.
7. The beetle in the box
S comes into a room and sees a small, sealed box on the table. She looks at the
outside of the box from all angles but cannot see into it. There is nothing distinctive
about its weight. Nor does it make a special sound when shaken. Relative to what she is
able to observe, there might be nothing at all inside, or there might be a coin wrapped in
cloth so as to make no noise, or perhaps a marble or a key. S, however, believes that
there is a beetle in the box, and she is correct. There is in fact a beetle inside. But she
does not know this to be the case.28
As with other stories meant to persuade that a subject has a true belief but lacks
knowledge, it is important to ask what details have been omitted and whether filling in
these details in one way rather than another would make a difference in our reactions to
the story. How did the beetle get inside the box? Did someone place it there? If so, how
and when?
27 The point here can also be approached from the opposite direction: a world in which Sally’s
beliefs do not meet minimum standards of coherence despite being maximally accurate and
comprehensive may not be a world in which knowledge is possible at all. It may be too chaotic
for Sally or anyone else to have knowledge of it. If so, the scope of the above claim about
maximally accurate and comprehensive beliefs would have to be restricted. The claim would
become: if one has maximally accurate and comprehensive beliefs about P, then one knows P
provided the world is such that P is knowable at all. In Chapter 7, I discuss in more detail a case
in which one has accurate and comprehensive beliefs about P and yet does not know P, because P is
not knowable.
28 John Hawthorne first pointed out to me the relevance of this kind of case.
Let H be a full history of the box, the beetle, and how the beetle came to be inside.
If we stipulate that S, like Sally of the previous chapter, has maximally accurate and
comprehensive beliefs about this history H, then even though she cannot now see inside
the box, it may nonetheless be plausible to say that she knows that there is a beetle inside.
Suppose there is no rich history H, however. The world is such that the beetle has
always been in the box. This is just the way the world is. There is this box, and it has a
beetle inside it.
Is this really possible? In our world, boxes and beetles come into existence via
specific histories. It is thus hard to conceive, that is, fully and completely conceive, how
it could be that this box and beetle do not likewise have such histories. Whatever
material the box is made from, must it not have come from somewhere? If the material is
wood, the wood came from a specific tree that had a parent tree which itself had a parent
tree, and so on, and at some point the wood from the offspring tree was made into the
box. Similarly with the beetle. If it is like other beetles, it came from parent beetles,
which themselves had their own histories, and moreover there is a history about how
beetles in general came into existence.
Let’s agree, however, to stretch the limits of intelligibility by stipulating that there
really is no (or at least very little) further history to be told about this beetle being in this
box. They didn’t come into existence the way that other things do. Since the beginning
of the universe, there has been this box with a beetle inside of it. The history of this box
and beetle is unconnected with the histories of how other things in the universe, including
other beetles and other boxes, came into existence.
Suppose, finally, that S believes all this to be the case. What then? Although she
has a true belief that there is a beetle in this box, it still seems as if she lacks knowledge.
And yet, what is the significant truth here that she lacks?
Even in the case of this unusual box, there are ways to test whether there is a beetle
inside. An fMRI scan of the box would display a beetle-shaped image. A device that can
be used to detect and analyze DNA through the box would register beetle DNA. If one
pried open the box, one would see a beetle. And so on.
But suppose we now push the story a step further and stipulate that no such test
ever has been conducted and that none ever will. Imagine, if you wish, that the box is
sacred, that tests on it are prohibited, and that this prohibition is rigidly enforced.
The religious mandate is that one should believe that the box has a beetle in it without
having to resort to tests.
Even in a world where no tests are performed, there is nonetheless an extensive set
of counterfactual truths about what would be the case were such tests to be conducted.
But suppose S also believes all these truths. After all, if she believes that there is a beetle
in the box and has enough background information and knowledge of the laws of nature,
she can infer these counterfactual truths. Accordingly, she believes that if an fMRI scan
were taken of the box, it would display a beetle-shaped image; she likewise believes that if a
DNA-analyzing device were used on the box, it would register beetle DNA; and she
believes that if she were to pry the box open, she would see a beetle.
In summary then, S believes that there is a beetle in the box; her belief is true; and
she is aware of all the various counterfactual truths about what would happen were the
box to be tested. Still, she does not know that there is a beetle in the box. The intuition
here is that for all she knows, there could be a coin or a marble or nothing at all inside.
For, if she had believed that there were a coin inside rather than a beetle, she would have
inferred that were the appropriate coin detection tests to be employed on the box, they
would confirm that there is a coin inside. These beliefs would have been false, because in
fact there isn’t a coin in the box, whereas all of the beliefs she in fact infers from her
belief about the beetle in the box are true.
In thinking about this case, it’s necessary to distinguish the theory that knowledge
is a matter of having sufficient information from the standard test that the theory
recommends, the test being that whenever S has a true belief P but does not know P, there
is a significant truth about the situation that makes P true that S lacks. For, the theory is
well positioned to explain why S here lacks knowledge, namely, she does not have
enough information about P to know P. What is peculiar about the case is that her lack of
information is not a matter of her being unaware of important and available information.
Rather, it is a matter of there not being enough information about P for her or anyone else
to have knowledge. Ordinarily, if S has a true belief P but lacks knowledge, there are
significant truths she lacks but might have had, but in this case there are no such truths. It
is the world that is lacking rather than S. The world is informationally impoverished with
respect to the proposition that there is a beetle in the box.
The beetle in the box story thus supports the basic insight of the theory. S doesn’t
know that there is a beetle in the box because she does not have sufficient information. It
also illustrates, however, that the standard way for testing for sufficient
information (there is some significant truth that S lacks) has restricted applicability. In
particular, the test has to be restricted to cases in which the world is not informationally
deprived with respect to P. Put another way, it has to be restricted to worlds in which
there is sufficient information about P for it to be capable of being known.
8. The theory of knowledge and the theory of justified belief
When S knows P, she and her true belief P typically have various other
merits. There ordinarily will not have been a malfunction in those cognitive
faculties that were involved in producing the belief P; the specific processes and
methods involved will typically have been ones with reliable track records; and it
will not have been a lucky happenstance that she arrived at a true belief.
Such merits are frequent accompaniments but not prerequisites of
knowledge. If S has a true belief P and there is no gap in her information about P
that is significant, she knows P. Nothing more is required to explain why she
knows. Not even justification.
In his 1963 article, “Is Justified True Belief Knowledge?,” Edmund Gettier
argued that justification when added to true belief is not sufficient for knowledge,
but he simply assumed that it is necessary.29 He was not alone. This has been a
common presupposition in contemporary epistemology. It is also a presupposition
that has had unfortunate consequences for both the theory of knowledge and the
theory of justified belief, consequences that became conspicuous in the aftermath
of Gettier’s article.
The immediate effect of the article was to inspire a search for a special sort
of justification that when added to true belief produces knowledge. The
justification, it was said, had to be indefeasible or non-defective or in some other
way a cut above ordinary justification. Others were prompted to search in a
different direction, however. Justification, traditionally understood, is associated
with being able to generate reasons in defense of one’s beliefs, but in many
29 Gettier, op. cit.
everyday instances of knowledge, they noted, the knower is not in a position to
provide such a defense.
Phil knows that the bus is a block away because he sees it, but he would be
hard pressed to justify his belief. He knows little about optics and even less about
the human visual system. Like most of us, he relies upon his eyes for information,
but he would not be able to provide a non-circular defense of the overall reliability
of his perceptual beliefs. Indeed, if pressed for such a defense, he would simply
become confused. Beth knows that she was at the bank last week because she
remembers being there, but she is not able to cite convincing evidence in support
of her belief or in support of the general accuracy of her memory. Margaret knows
that the sun is much larger than the earth, and Larry knows that Lincoln was
assassinated by John Wilkes Booth, but neither is in a position to provide a defense
of the truth of what they believe, since neither recalls the original source of their
beliefs, thus limiting their ability to mount a defense based on the authority of that
source. Gary knows he is about to run out of gas because he sees the arrow of his
fuel gauge pointing to E, but if asked to justify his belief, he might well be
stumped. He knows little about how fuel gauges work, and because he has never
run out of gas himself, he has no personal experiences that would provide the basis
for an inductive defense of his belief. John glimpses a face in a parade and
recognizes it to be that of a childhood friend whom he has not seen for decades, but
he has no idea of what visual cues caused him to recognize his old friend.
Examples such as these suggested to many epistemologists that knowledge
does not require justification, at least not in any traditional sense, and that to
assume otherwise is to overly intellectualize how people acquire much of their
everyday knowledge. These epistemologists accordingly shifted their focus away
from justification and began to propose accounts in which S’s true belief P has to
be caused by the fact that makes it true, or has to be the product of a highly reliable
process, or has to track the truth in close counterfactual situations, or has to be the
product of cognitive faculties operating in the way that they were designed to
function. Because these accounts shifted attention away from the knower being
able to mount “internally” a defense of what she knows and towards causal and
causal-like properties, they came to be known as “externalist” accounts of
knowledge.
These accounts of knowledge, in turn, ushered in a new class of externalist
accounts of justification. Initially, externalism was part of a reaction against
justification-driven theories of knowledge, but an assumption drawn from the old
epistemology tempted many externalists to re-conceive justification as well. The
assumption is that by definition justification is that which has to be added to true
belief to generate a serious candidate for knowledge, with some fourth condition
added to handle Gettier-style counterexamples. If, as many wanted to claim,
knowledge is best understood as reliably produced true belief with perhaps an
additional condition to address Gettier counterexamples, then relying on the above
assumption, it seemed a small step to the conclusion that having justified beliefs at
its core must also be a matter of one’s beliefs being produced and sustained by
reliable cognitive processes.
Such proposals sparked a literature on the relative advantages and
disadvantages of externalism and internalism. Much of this literature assumes that
externalists and internalists are defending rival theories, but a more charitable
reading is that they are primarily concerned with different issues.
Externalists are first and foremost interested in the theory of knowledge, that
is, in understanding the relationship that has to obtain between one’s beliefs and
the subject matter of one’s beliefs in order for the beliefs, when true, to count as
knowledge. However, in carrying out this project, they often see themselves as
also offering an account of justification, because justification, they presuppose, is
that which has to be added to true belief in order to produce a serious candidate for
knowledge.
Internalists, on the other hand, are first and foremost interested in the theory
of justified belief, that is, in understanding what is involved in having beliefs that
one is able to defend, but they frequently also see themselves as providing the
materials for an adequate account of knowledge, because they too presuppose that
justification is by definition that which has to be added to true belief to get a
serious candidate for knowledge.
This is a presupposition to be resisted, however. As the theory of knowledge
and the theory of justified belief are independently developed, interesting
connections may possibly emerge, but the primary focus of the theory of
knowledge is very different from that of the theory of justified belief. So, it should
not be simply assumed from the start that knowledge and justified belief are
necessarily linked, as opposed to being frequently associated with one another.
Suspending this assumption, moreover, is liberating. It frees the theory of
knowledge from an uncomfortable dilemma: Either embrace an overly intellectual
conception of knowledge, which overlooks the fact that people cannot always
provide adequate intellectual defenses of what they know, or engage in awkward
attempts to force back into the account some duly externalized and non-traditional
notion of justified belief, because the definition of knowledge is thought to require
it.
Simultaneously, it frees the theory of justified belief from servitude to the
theory of knowledge. If it is stipulated that the properties that make a belief
justified must also be properties that turn true belief into a good candidate for
knowledge, an account of justified belief can be regarded as adequate only if it
contributes to a successful account of knowledge. The theory of justified belief is
thereby separated from our everyday assessments of each other’s opinions, which
are more concerned with whether individuals have been appropriately careful and
responsible in regulating their opinions than with whether they have satisfied the
prerequisites of knowledge.30
Once it is no longer taken for granted that there is a conceptual link between
justification and knowledge, epistemology is re-oriented. The working strategy
that has dominated epistemology since Gettier’s article has been to employ the
assumption that knowledge and justification are conceptually connected to draw
strong and sometimes antecedently implausible conclusions about both knowledge
and justification.
The strategy can be thought of as an epistemology game. Call it “the Gettier
game.” It starts with a story in which a subject has a true belief but intuitively
seems not to have knowledge, and the play is governed by the rule that justification
is one of the conditions that has to be added to true belief in order for it to be a
serious candidate for knowledge. The goal of the game is to identify, within the
constraints imposed by this rule, the defect that explains why the subject of the
story lacks knowledge.
A solution to the Gettier game can be of three sorts. First, one can claim that
although the subject’s belief is true, it is not plausible to regard it as justified.
Second, one can claim that although the subject’s belief is justified, it lacks an
30 In Chapter 22, I sketch a theory of justified belief with such a focus.
additional condition (non-defectiveness or indefeasibility or whatever) that has
to be present in order for a true justified belief to be an instance of knowledge.
Third, one can claim that although at first glance it might seem plausible to regard
the subject’s belief as justified, the case illustrates why it is necessary to amend the
traditional notion of justification (for example, by claiming that a belief is justified
only if it is reliably generated); one is then in a position to explain that the subject
lacks knowledge because her belief is not justified in the amended sense.
Once the stipulation that justification is a necessary condition of knowledge
is abandoned, the Gettier game can no longer be played. In its place, however,
there is a much simpler and more illuminating epistemology game to play. The
game starts identically, with a story in which a subject has a true belief P but
intuitively seems not to have knowledge, but it is governed by a different rule:
Look for significant information about P that the subject lacks.
Except for a few extreme situations, for example, the beetle in the box
scenario, this game always has a solution, a solution that explains why the subject
lacks knowledge. That is, in the enormous literature generated since Gettier’s
article, with its vast number of stories describing a subject who has a true belief P
but intuitively lacks knowledge, in each and every one of these stories it is possible
to identify some significant aspect of the situation about which the subject lacks
true beliefs.31
31 There are others who have repudiated the Gettier game. See, for example, Timothy
Williamson, Knowledge and Its Limits (Oxford: Oxford University Press, 2000). Williamson’s
“knowledge first” approach rejects the assumption that justified belief is an ingredient of
knowledge. It instead makes knowledge into an ingredient of justified belief, the core
assumptions being that one is justified in believing only that which is supported by one’s
evidence and that one’s evidence is limited to what one knows. So, on this view, the theory of
knowledge and theory of justified belief are still conceptually married, only in the reverse
direction. But once again, the forced marriage produces strained results. For details, see Richard
Foley, “Review of Timothy Williamson, Knowledge and Its Limits,” Mind 111 (2002) 726-30.