Artificial Intelligence and Personal Identity
Author(s): David Cole
Source: Synthese, Vol. 88, No. 3 (Sep., 1991), pp. 399-417
Published by: Springer
Stable URL: http://www.jstor.org/stable/20116948
Accessed: 19/08/2012 03:32
DAVID COLE

ARTIFICIAL INTELLIGENCE AND PERSONAL IDENTITY

ABSTRACT. Considerations of personal identity bear on John Searle's Chinese Room argument, and on the opposed position that a computer itself could really understand natural language. In this paper I develop the notion of a virtual person, modelled on the concept of virtual machines familiar in computer science. I show how Searle's argument, and J. Maloney's attempt to defend it, fail. I conclude that Searle is correct in holding that no digital machine could understand language, but wrong in holding that artificial minds are impossible: minds and persons are not the same as the machines, biological or electronic, that realize them.
Many workers in cognitive science believe that computers can potentially have genuine mental abilities. John Searle has been a prominent critic of this optimism about the abilities of computers. Searle argues that computers can at best simulate, but not possess, intelligence. If Searle is correct, even though a computer might eventually pass the Turing Test, no computer will ever actually understand natural language or have genuine propositional attitudes, such as beliefs. Searle's argument is interesting both because of its import and because it appears to some to be valid (e.g., Maloney 1987), and to others to be invalid (e.g., many of the "peer" commentators following Searle 1980, Sharvy 1983, Carleton 1984, Rey 1986, Anderson 1987).
The following is a defense of the potential mental abilities of digital computers against Searle's criticism. I shall clarify precisely why his argument is logically invalid. The missing premise that would render the argument valid reflects a form of personal identity theory that Searle may accept but which is widely, and I believe rightly, regarded as false. However, Searle has indeed, I believe, succeeded in proving that no computer will ever understand English or any other natural language.1 And he is correct in rejecting the "system reply".2 But I shall show how this is consistent with the computer's causing a new entity to exist (a) that is not identical with the computer, but (b) that exists solely in virtue of the machine's computational activity, and (c) that does understand English. That is, showing that the machine itself does not understand does not show that nothing does. We can introduce the concept of a Virtual Person (or Virtual Mind), an entity that may be realized by the activity of something that is not a person (or mind). Thus, I believe, Searle's argument fails in establishing any limitations on Artificial Intelligence (AI).

Synthese 88: 399-417, 1991. © 1991 Kluwer Academic Publishers. Printed in the Netherlands.
Thus, one can show, by a line of reasoning independent of Searle's, that it would always be a mistake to attribute understanding to a computer. The line of argument is inspired by considerations raised by John Locke and his successors (Grice, Quinton, Parfit, Perry and Lewis) in the development of theories of personal identity, a branch of analytical metaphysics perhaps not obviously related to AI. This line of reasoning reveals the abstractness of the entity that understands, and so the irrelevance of the fact that the hardware (including the system) itself does not understand. Searle was both right and wrong on this assessment; he wins a battle, but loses the war.
SEARLE'S ARGUMENT
Searle's argument is straightforward. It is possible to write computer programs that produce responses in natural language to questions about some subject domain. Some believe that through such clever programming actual understanding is produced. Others believe that genuine understanding is not yet achieved, but that it may be in the future with larger databases, improved programming techniques, and faster machines. But, Searle argues, consider that no matter how clever and complex the program, a human could do exactly what the computer does: follow instructions for generating strings of symbols in response to incoming strings.
Suppose, for example, a person (Searle, in the original statement of the argument) who does not know Chinese sits in a room with instructions written in English (a "program") that tell one in detail how to manipulate Chinese symbols, producing strings in response to the strings one is given. We are to suppose that the instructions are such that they permit successful passage of this variation on a Turing Test: even with trials of indefinite duration, those outside the room cannot tell the difference between the room as described and a room that contains a human native speaker of Chinese. Since the instructions tell one what to do entirely on the basis of formal or syntactic features of the strings, without ever mentioning (or revealing) meaning, one can generate sentences without any understanding of what they mean, indeed without even knowing that they are Chinese sentences. That is, doing exactly what a computer does would not give one the ability to understand Chinese. Therefore, the computer does not understand Chinese either. Thus, mere programming cannot produce understanding of a natural language.

I wish now to consider three claims made in the course of this argument: (1) the claim that the person following the English instructions would not understand Chinese; (2) the inferred claim that a computer following a program would not understand Chinese; and (3) the inferred final claim in the preceding summary that programming cannot produce understanding of a natural language.
There is some reason to be critical of claim (1). Searle's self-report of incomprehension of Chinese in his scenario conflicts with other evidence, notably the response in Chinese to Chinese questions. One might hold that the person in the room understands Chinese, albeit with certain odd deficiencies not characteristic of polyglots: most notably, an inability to translate.3 One may also be critical of the inference to (2). There are important disanalogies between a human following understood English instructions and a computer running a program: the computer does not literally understand its program "instructions"; the computer would not be conscious of its program; and the explicitly syntactic character of the program "instructions" is a red herring in that whatever is produced by programming a programmable computer could have been hardwired in a dedicated computer (Cole 1984).4
But let us suppose that the crucial premise (1) is true: Searle would not understand Chinese merely by following the instructions in English for syntactic manipulation. Let us also concede (2) for the sake of argument. Nevertheless, (3) does not follow. Clearly it does not follow from the fact that someone does not understand Chinese that no one understands Chinese. And from Searle's linguistic disabilities in the Chinese Room scenario, it does not follow that no one in the room understands Chinese, unless Searle is alone in the room. And, finally, and this is the main point here, it does not follow logically from the initially given premise, that Searle is alone in the room and that no one else enters the room from outside, that Searle remains alone in the room.
THE KORNESE ROOM
The Chinese Room argument as it stands is logically invalid. The question then is whether it is legitimate to assume that if anyone understands Chinese, it must be Searle (who does not). It might be thought, since the question-answering performance suggests that there is someone who understands Chinese and that the only one around is Searle, that the only one who could possibly be the one who understands Chinese is just Searle. But it is not necessary that it be the case that the performance of the room suggests that there is any one person in the room who understands Oriental languages.

To show that, let us consider a variation on Searle's thought experiment. Once again we are to imagine Searle in a room, following instructions for dealing with strings of alien characters slipped under the door. The room is a bit more crowded than before: the size of the set of instruction manuals is almost doubled from before. And the pace is more hectic. Searle flips vigorously through the manuals as a steady stream of squiggles and squoggles is slipped under the door. He generates reams of marks in return, as instructed by the manuals.
As before, many of those outside believe that someone in the room understands Chinese, for they are receiving replies to questions submitted in Chinese. But unlike before, those outside also believe someone in the room speaks Korean, for they are receiving replies in Korean to questions submitted in Korean. Furthermore, they also believe that the room is more crowded than before: there appear to be at least two people in the room, one who understands Chinese but not Korean (call him Pc) and another who understands Korean but not Chinese (call him Pk).
The evidence seems quite clear that Pc is not Pk. In fact let us suppose that the response abilities and dispositions embodied in the Pc database were derived from an elderly Chinese woman, whereas those in the Pk database were from a young Korean male who was a victim of a truck-bicycle collision.5 The person, if any, who understands Chinese is not the person, if any, who understands Korean. The answers to the questions in Chinese appear to reveal a clever, witty, jocular mind, knowledgeable about things in China, but quite ignorant of both the Korean language and events in Korea. Pc reports being seventy-two years old and is both wise and full of interesting observations on what it has been like to be a woman in China for the tumultuous past half-century.
By contrast, the replies to Korean questions reveal quite a dull young man. Pk is misogynous. Pk has a vitriolic hatred of China, but is largely ignorant of events there. Pk is very avaricious, and soon discovers that he can demand money for answering the questions slipped under the door. Pk reports that he works in a television factory and gives accurate descriptions of television assembly. The only other subject about which Pk exhibits much interest or knowledge is the Olympic Games held in Korea.
Thus, suppose that the behavioral evidence is as clear as can be in such a case that there are two distinct individuals in the room. Try as they might, the interlocutors outside the room can find no hint that there might be a single person pretending to be two. Information provided in Chinese is unavailable to Pk, even when offers of substantial rewards are made to him in Korean for answers based on the information provided in Chinese.
But behavioral evidence is certainly not decisive here. Indeed, behavioral evidence is the very category of evidence called into question in rejecting the adequacy of the Turing Test as a test of mental abilities. Fortunately, a stronger case can be made by considering information not available to the interlocutors outside the room. There is in fact no individual inside the room who understands both Chinese and Korean. Searle understands neither. And the instructions for generating replies to Chinese input and those for dealing with Korean input are distinct, with no exchange of information between the databases consulted (although this is not known by Searle). If there were a single person duplicitously feigning being two, the person would realize that something was being asked about events that he/she knew about and then would pretend not to know about them. That never happens in the room. And the histories of the representations in the Pc and Pk databases are completely independent, involving individuals in China and Korea respectively. Thus if Pc and Pk are persons, they are distinct.
At this point considerations emerge familiar from arguments in other contexts concerning personal identity (cf., for example, Perry 1978, pp. 32-36). Pc cannot be identical with Pk. The grounds for saying that Pc is Searle are just the same as those for holding that Pk is identical with Searle. We cannot hold that both are identical with Searle, for this would violate the transitivity of identity. Therefore, we must hold that neither is Searle. Thus, by a line of reasoning quite independent of that used by Searle, we arrive at the conclusion that the person, if any, who understands Chinese is not Searle.

COMPUTERS AND PERSONS
Consider now a computer system. Suppose that a program has been written for this machine which embodies the very algorithm for replying to questions that was used in the English instructions imagined in the Kornese Room scenario. Again, there is no interchange of world information between the Korean and the Chinese databases. Let us suppose then that the computer system responds to questions asked in Chinese and to questions asked in Korean, with performance indistinguishable from someone in the Kornese Room. Now let us suppose someone wished to say that the computer itself understands Chinese, say, and can answer questions about China. But when the system is asked in Korean if it understands Chinese, suppose the reply comes back that it does not. Does the computer lie? Does anyone lie? The behavioral evidence is, ex hypothesi, just what it was in the Kornese Room scenario. When asked in Korean about China, the replies do not demonstrate knowledge of China, only a vitriolic prejudice against China. When asked similar questions in Chinese, the replies exhibit knowledge and love of China. Does the computer like China or not?

These considerations suggest that it would be a mistake to attribute these properties to the computer itself. One would have equally good grounds for attributing incompatible properties to the computer. For the same reason it would be equally incorrect to attribute knowledge or ignorance of China to the program. Again, one cannot attribute inconsistent properties to a single entity.
The solution is to hold that no single entity understands both Chinese and Korean; there are two subjects, two virtual persons: one who understands Chinese and one who understands Korean.6 These two virtual subjects are realized by a single computer.
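The arrangement just described can be sketched in a few lines of code. The sketch below is illustrative only, not anything from the scenario itself: the tiny lookup tables are invented placeholders for the Pc and Pk response dispositions, and the point is simply that one process serves two wholly isolated "databases", dispatching on the script of the input alone.

```python
# A minimal sketch of one machine realizing two isolated question-answerers.
# The tables below are invented stand-ins for the Pc and Pk dispositions.

def is_hangul(text):
    # Korean syllables occupy the Unicode block U+AC00..U+D7A3.
    return any('\uac00' <= ch <= '\ud7a3' for ch in text)

# Two independent lookup tables; no information passes between them.
PC_DB = {"你好": "你好！", "你懂韩语吗？": "不懂。"}            # "Chinese" dispositions
PK_DB = {"안녕하세요": "안녕하세요!", "중국어 아세요?": "아니요."}  # "Korean" dispositions

def reply(query):
    # A single physical process dispatches on the script of the input;
    # the Chinese table is consulted for everything non-Korean.
    db = PK_DB if is_hangul(query) else PC_DB
    return db.get(query, "")
```

Nothing in the dispatch loop "knows" both languages; each table answers only queries in its own script, mirroring the mutual inaccessibility of the Pc and Pk databases.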
The concept of a virtual machine is familiar in computer science. And some computer scientists, such as Paul Smolensky, have viewed consciousness as a virtual machine realized by an underlying substratum.7 Each computer has an intrinsic instruction set: the capacity to perform a certain set of operations is wired into the central processor. And in virtue of the construction of the machine, a certain syntactic string will cause one of the intrinsic operations to be performed. But it is possible to write a program which will cause one machine to behave as a different machine, one with a different intrinsic instruction set. Thus, such emulation software may make a computer built using an Intel 8088 processor behave as though it were a Z80. Then, there is said to be a virtual machine. The virtual machine can run software written for a Z80, using the Z80 instruction set. The 8088 realizes a virtual Z80. Of interest, some newer processors (for example, the Intel 80386) can realize multiple virtual processors concurrently, and this capability is wired in as an intrinsic capacity of the computer.
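The emulation relation described above can be sketched as an interpreter loop. The three-instruction set below is invented for illustration (it is not the Z80 instruction set the text mentions); the sketch assumes only a host language whose own intrinsic operations realize each "virtual" instruction.

```python
# A minimal sketch of emulation: a host interpreter whose primitive
# operations realize a virtual machine with a different instruction set.

def run_virtual_machine(program):
    """Interpret a list of (opcode, operand) pairs on a single register."""
    acc = 0
    for op, arg in program:
        # Each virtual instruction is realized by a group of host operations.
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc = acc + arg
        elif op == "MUL":
            acc = acc * arg
        else:
            raise ValueError(f"unknown opcode: {op}")
    return acc

result = run_virtual_machine([("LOAD", 2), ("ADD", 3), ("MUL", 4)])
print(result)  # 20
```

The host has no intrinsic LOAD, ADD, or MUL instructions; the virtual machine exists only while the interpreter runs, which is just the point made below about volatility.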
Now it might be thought that a virtual machine is not a real machine. But there is no reason for this reservation. One machine can very literally realize, or make real, another or several other machines. In fact, a manufacturer could decide to sell very real computers in which the nominal processor was realized by another. For example, a Reduced Instruction Set Computer (RISC) processor might be used, because of its great speed, instead of a normal full instruction set 80386. The user might be quite unaware that his 80386 was in fact a virtual machine, realized on a RISC that was not intrinsically an 80386. And there are physical LISP machines as well as virtual LISP language processing machines.8

The only difference between the physical machine and the virtual machine has to do with intrinsic instruction sets, with consequences for speed and volatility. When the emulation software is not running, the virtual machine does not exist. Note that the physical and the virtual machines differ in properties. At a given time, they run different programs. The physical machine will be running the emulation program, whereas the virtual machine may be running an application. The two machines have different instruction sets. And the speeds at which the two machines perform basic operations will differ. Thus, the physical machine and the virtual machine(s) it realizes are not identical. And the virtual machine is not identical with the emulation program that realizes it when the program is run on the physical machine. The emulation program may be long or short, may be written in a language, may contain comments, may be copyrighted, but the virtual machine has none of these properties.
There are additional considerations that count against holding that the program incorporating the Chinese or Kornese Room algorithm understands language or likes China. The program itself is entirely inert until it runs. In Aristotelian terms, a program could be but the form of some matter; without an underlying substance, it does nothing. The program exists before and after it is run, but understanding, if any, exists only while the program is running.
In the light of this result and consideration of the Kornese Room, let us reconsider Searle's original scenario. It is clear that the knowledge, personality, beliefs and desires apparently revealed in the Chinese answers to questions submitted to the room might be quite unlike Searle's own. Indeed, Searle himself could receive no knowledge through information provided in Chinese, and he can reveal none that he has. He cannot express himself or anything about himself in the Chinese answers. Similarly, the answers in Chinese reflect no access to Searle's knowledge, preferences, or personality. If there is a person who understands Chinese, it is clearly not Searle.

Thus, what follows from the fact that Searle does not understand Chinese is just that the person, if any, who does understand Chinese is not Searle. A tacit premise, needed for the inference from (2) to (3), is that there is no other mind. But Searle gives us no reason for believing that this premise is true. There may well be a mind realized by Searle's activity, a virtual person. But the same mind could have been realized by the activity of someone other than Searle (Searle could even resign his job in the Room and be replaced by another) while the Chinese conversation continues. This is additional evidence that the Chinese-understanding person is not Searle. Searle is not essential to the existence of the Chinese-understanding person.
FUNCTIONALISM AND MULTIPLE MINDS

Georges Rey (1986) advocates a version of the system reply incorporating the robot reply. Rey holds that Searle's example

burdens [AI] with a quite extreme view about the 'autonomy' of language, a view that would allow that understanding a language need involve only infra-linguistic symbol manipulations. (p. 171)
Clearly functionalism will require more integration than that. Rey goes on to consider a robot (a system with sense organs) and argues that, on a sketched-in causal theory of meaning, the system would understand language. But Searle is quite correct and clear in pointing out, in his discussion of the robot reply, that his argument is not affected in its essentials by extension to include extra-linguistic capabilities. Therefore, Searle is not (merely) attacking a strawman. Sharvy (1983, p. 128) seems too cavalier in saying that Searle's argument against the system reply is just an "intuition pump".
Maloney (1987) disagrees with Rey's rejection of the extreme view of the autonomy of language, citing severely handicapped individuals as counterexamples. The counterexamples are not conclusive support for autonomy, I believe, because they focus on overt behavior; even the handicapped individuals have the neuronal subsystems that in normal individuals serve motor and perceptual skills. In any case, we can sidestep this issue by simply considering a robot system, for Searle claims that his argument applies equally to the robot system as to the original person confined to the room and to exclusively linguistic input and output.
Maloney (1987, pp. 355-59), Cole (1984) and Rey (1986, pp. 174-75) also hold that Searle, despite his protests, understands Chinese in the room. Maloney and Cole point to Searle's failure to translate; Cole finds it only odd, but Maloney finds it decisive for denying language comprehension.
But then Maloney goes on to consider a view similar to the one advocated by Cole. Maloney agrees that the English-speaking occupant of the room (Maloney calls him "Marco") is not the person who understands Chinese. The view Maloney considers is that "there must be another agent, Polo, who does realize the program and thereby understands Chinese" (p. 359).
But, says Maloney, "there is an overwhelming difficulty with postulating Polo that emerges upon closely examining him" (p. 360). Polo shares Marco's sensory and motor systems and goes wherever Marco goes; thus how can Polo be distinct from Marco?
Marco previously learned how to manipulate the cards in much the same way in which he has learned lots of different things, including how to play poker. According to Strong AI, understanding involves mastering the proper program, just as understanding Chinese amounts to running the right program. Now, since Marco both learned how to play poker and also plays poker, i.e. understands poker, why is it Polo rather than Marco who understands Chinese, since it was Marco, not Polo, who mastered the program for Chinese? (p. 363)
Maloney does not wait for an answer:

If anyone here understands Chinese, it must be Marco, not Polo. And so, since we already have established that all that Marco did in order to understand poker was learn a program ... despite realizing the formal program for Chinese, Marco is ignorant of Chinese; Strong AI is finally and thoroughly false. (p. 363)
This conclusion is premature. Maloney's argument seems most forceful as a critique of Sharvy (1983). Sharvy says:

First, consider a man who is locked in a room receiving symbolic inputs and calculating symbolic outputs according to a purely formal algorithm, but who is doing all this completely blind to any interpretation of those symbols. That man cannot truly be said to be playing chess. This is so even if men outside the room interpret the symbols as representing moves in a chess game. But a computer running that very same program is playing chess, and is doing so by running that program. So playing chess is an example of something that computers come to do by instantiating a program, but which a man in a room does not come to do by instantiating that same program. (p. 127)
This position is odd, and surely Sharvy needs an explanation of the difference. In any case, Sharvy's position is incompatible with functionalist approaches to minds. My task here is to develop an assessment of the relation of machines to mentality which is compatible with functionalism and to show why a functionalist of the sort Maloney demands ought to view Searle's argument as unsound.

Maloney's argument does not show, as he seems to think, that there is not a difference between learning to play poker and learning to run the Chinese Room program. For one thing, Maloney appears not to appreciate the possible relations between Marco and Polo. Accordingly, he offers us a false dichotomy in the following passage:
Strong AI must accept one of two alternatives. Either Marco and Polo are different cognitive systems time sharing the same nervous system, now one using it, now the other, or they must be genuine parallel processors, simultaneously running different programs realized in different sections of the central nervous system.... (p. 361)
This is not true. When one system realizes another, it is not the same thing as either parallel processing, with both programs running independently and simultaneously, or time sharing, with now one program using the central processor and then the other, sequentially. The virtual system is realized by the operations of a single implementing system. Both parallel processing and sequential time sharing imply the complete independence of the operations of the two systems (except for time delays). For example, if programs A and B are running on parallel processors, the operations performed by A (say, a statistical analysis program) make no difference to the operations of B, which might be a game. The same is true of two programs time sharing. The two programs are logically independent, and this is an essential feature of their relation. If they interacted, it would defeat the whole purpose of the system.
The case of a virtual system is quite different. Here the activity of the virtual system A occurs solely in virtue of the activity of the realizing system B. Changes in operations performed in B directly affect the operations in A. And a certain group of B's operations is the same as one of A's operations, hence the direct effect.

This misunderstanding of the relation between the physical and the virtual system affects Maloney's (and Searle's) failure to see the difference between learning to play poker, for example, and realizing a distinct personality which might understand a foreign language unknown to the physical system itself. The failure is striking in Maloney's case, for in arguing that the physical system ('Marco') does not itself understand Chinese, he presents the considerations relevant to answering the question of how to distinguish learning to play poker from realizing Polo, a Chinese understander.

Not surprisingly, the key difference is psychological integration, that is, access and control. This is not surprising, for these are just the characteristics that arise in assessments of multiple personality. When one learns to play poker, one understands the objectives of the game and how the rules constrain the players' pursuit of those objectives. All aspects of play can be affected by other psychological aspects of the player. If one is a risk taker, it will be reflected in one's poker play.
And one can explicitly relate poker playing to other games, or to life in general. But this level of psychological integration is precisely what is missing in Marco's activity that produces strings of Chinese characters. None of Marco's psychology is reflected in Polo's performance, because Marco has no access to the semantic content of the Chinese characters.
This suggests that we consider two ways of learning to play poker. One would be the usual method, whereby one is told the rules and objectives of the game, as well as various informal strategies for success, and allowed to watch a game or run through a couple of hands. The second would be where someone who did not know how to play poker was given a formal program for manipulating strings of ones and zeros which, unbeknown to one, represented in binary code various combinations of cards and information about the betting behavior of poker players. One could then sit in a "Poker Room", having strings of digits fed to one and, in accord with the program, issue strings in response.
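The second method can be sketched as a purely formal lookup. The rule table below is an invented stand-in for the binary poker program; the point of the sketch is that applying the rule requires no access to what the bit strings represent.

```python
# A minimal sketch of the "Poker Room": a formal rule mapping bit strings
# to bit strings. The table is invented; the occupant applying it needs
# no idea what the digits encode.

RULES = {  # input string -> output string, on syntactic shape alone
    "0011": "1101",
    "1010": "0001",
}

def poker_room_step(bits):
    # Pure lookup; no interpretation of the bits is consulted.
    return RULES.get(bits, "0000")

# Outsiders might decode "0011" as a dealt hand and "1101" as a raise;
# inside the room there are only shapes of digits.
print(poker_room_step("0011"))  # 1101
```

None of the player's risk aversion, cunning, or memory for hands enters the lookup, which is exactly the lack of psychological integration at issue below.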
Persons outside the room might interpret these strings as poker play, having been told that they were playing with an eccentric computer scientist recluse. Would one thereby have learned to play poker? One would deny knowing how to play if asked if one could play. And the actual "play" would in no way reflect one's personality: aversion to risk, flamboyance, cunning, memory skills, or skill in the assessment of the psychology of others. One would derive no more pleasure from 'winning' than from 'losing'; one would not know that one had done any of these things.
The same considerations of lack of integration which count against saying that Searle in the Chinese room or Marco understands Chinese would count against saying that the physical occupant of the Poker Room plays poker. Thus, the alleged problematic difference between learning poker and learning to respond to Chinese turns out not to be a difference at all, and so certainly is not a demonstration that Strong AI is "thoroughly and finally false".
FUNCTIONALISM, PERSONS AND BODIES
I do not believe it can be proven that there is a person who understands Chinese in the scenario. But this difficulty is a completely general one, familiar as the Problem of Other Minds. My argument so far has been to show that even those who do believe that there might be one who understands Chinese in the scenario should resist any temptation to think, as Searle would have them, that it would have to be Searle. Now I wish to argue that support for the view that it is possible for there to be other than a one-to-one correspondence between living bodies and minds comes from plausible accounts of the relation of mind to body.
Then, if the actual replies to questions demonstrate understanding and a personality, it would be an inference to the best explanation that there is a person who understands the questions.
Psychiatry has long recognized cases of multiple personality. The condition is rare (under two hundred total reported cases), but is more frequently diagnosed now than in the past. The American Psychiatric Association (Diagnostic and Statistical Manual of Mental Disorders, III) characterizes this disorder as follows:
The essential feature is the existence within the individual of two or more distinct personalities, each of which is dominant at a particular time. Each personality is a fully integrated and complex unit with unique memories, behavior patterns, and social relationships that determine the nature of the individual's acts when that personality is dominant. (p. 257)
It is of philosophic interest that the diagnostic indication emphasizes the functionalist feature of integration. The disorder is a subtype of "Dissociative Disorders", a variety that includes retrograde amnesia. The DSM reports that usually the "original" person has no knowledge of any of the "subpersonalities", but that the latter may be aware of one another.
Recently the law has had occasion to take note of the phenomenon: William Mulligan was found not guilty by reason of insanity of four rapes. Mulligan displayed ten personalities; "the Lesbian Adelena is thought to be the 'personality' who committed the rapes" (Sarason and Sarason 1987, pp. 138-39). Apparently, some evidence suggests that different areas of the brain are responsible for different personalities (Braun 1984).
In any case, the considerations raised by the Chinese and Kornese Room scenarios mirror familiar metaphysical problems and positions in the philosophy of mind. There appear to be good reasons for holding that it is false to say that I am identical with my body. But this is not to say, with the dualist, that I am identical with some substance other than my body, or identical with a whole composed of two substances. And yet the dualist is correct in holding that I might exist while my (present) body did not.
These, I believe, are metaphysical consequences of functionalism. Functionalists have not generally been concerned with how functionalism bears on traditional metaphysical questions, such as the possibility of immortality, but I believe it has interesting implications for these questions (cf. Cole and Foelber 1984). I shall not defend functionalism here, but shall indicate how it bears on the nature of persons and how this is relevant to Artificial Intelligence and Searle's argument.
Functionalism rejects a type-type identity between psychological states and physical states. Some instances of being in pain, say, may be physiologically different from others. There could even be alien life forms with psychological states that were of the same type as psychological states had by humans, but that had quite a different underlying physical system (for example, based on silicon rather than carbon).
Furthermore, although this has received less attention, these considerations apply to a single individual across time. My psychological states in 1989 need not be realized by the same physical states as were my type-identical psychological states in 1979. For example, a portion of my brain may have sustained injury in the interim and its function may have been assumed by a physiologically distinct structure. Or, it may become possible to replace damaged portions of my brain with cultured neonatal tissue that grows to assume functions temporarily lost. Finally, it might even become possible to replace entire damaged neurons by functionally equivalent silicon-based electronic devices.
Contemporary discussions of holistic models of cognitive function also underscore that identity of realizations is not essential for type identity of psychological states over time. If Connectionist or Parallel Distributed Processing models of psychological function are correct, the underlying system that realizes the cognitive states of persons is radically dynamic. The system is continually changing as learning takes place, as each bit of new information slightly changes weightings and probabilities of a given global response.
Functionalism thus takes the underlying substance type to be non-essential to the psychological states. This is not to suppose that there can be psychological states without any underlying substratum - the inference from the non-essentiality of any given substratum to the non-essentiality of the existence of some substratum or other would be a modal scope fallacy. Given that the most reasonable supposition is that dualists are wrong in holding that there is any other than physical substance, and in fact only organic neural substance is capable at present of realizing mental states, the result is clear: no brain, no pain. But this is not to say that in order for me to experience pain it must be with this brain with each of its current constituent cells and molecules.
WHY THE "SYSTEM REPLY" TO SEARLE'S ARGUMENT IS WRONG
The functionalist diachronic perspective on persons suggests that persons or minds are more abstract than a simple identity of a person with a body (or a Cartesian soul or individual res cogitans) would suppose. This position is not new in this century; it was an implication of Locke's theory of personal identity, which invoked a functional connection, memory, as the glue of the mind. Locke says:
it must be allowed, that, if the same consciousness . . . can be transferred from one thinking substance to another, it will be possible that two thinking substances may make but one person. For the same consciousness being preserved, whether in the same or different substances, the personal identity is preserved. (Essay, Bk. ii, chap. 27)
Locke goes on to indicate that a single substance might be the seat of more than one person (if there is not psychological continuity over time).
This view rejects a simple identity of person with underlying substance, whether that substance is or is not material. Presumably Searle would say that I am identical with my body and could not exist without it, or at least not without the brain. But Locke's view suggests that I could. Another brain and body exactly like this one, with the same "causal powers", could, in principle, replace this one. (Indeed, for all I know this may have happened!)
Note that even in the case of bodies, a simple identification with the physical constituents fails to do justice to the identity conditions we in fact employ: Is my body identical with this particular collection of molecules that now constitutes it? No, my body can change, acquiring and losing constituents. A person is an attribute of a body. A single body might realize more than one person, and a single person might be realized by more than one body.
Thus, the "system reply", as Searle represents it, is not quite right either. The Chinese understanding person is not identical with a whole system composed of Searle, the instruction manuals, and the scraps of paper on which he makes notes. As Searle rightly notes, he could in principle commit the contents of all the instruction manuals to memory and follow the instructions completely in his head. That is, he could sit in an otherwise bare room with only a pen and the pieces of paper upon which he writes the outgoing strings of symbols. Still, he would no more understand Chinese than he did when he consulted the manuals for instruction each step of the way. So, the entity that understands Chinese is not Searle nor Searle and paper and manuals.
It will not do to hold (with Cole 1984 and Rapaport 1990) that Searle understands Chinese but does not understand that he understands Chinese; for the Kornese room scenario shows that contradictory psychological attributes can be had by the (virtual) persons manifested by the system. While a single individual might conceivably understand Chinese but not know this, a single individual cannot plausibly be held in any straightforward sense to both find Chinese music always restful and conducive to thought and also to find this music always to be cacophonous and disturbing. Nor could one believe that Malraux's novels misportrayed events in China and also believe that one had never heard of Malraux. There is no reason to suppose that persons realized in the Chinese room would have psychological properties compatible with one another nor that the realized would have the properties of the realizer.
Thus, in the original Chinese Room situation, there is no reason to attribute the psychological properties of the virtual person to Searle himself nor to the system consisting of Searle and inanimate paraphernalia. So who or what does understand Chinese in the Chinese room? An unnamed Chinese person. This person is not Searle, but this person cannot exist unless someone - Searle or any competent other - brings to life the Chinese mind by following the instructions in the room.
CONCLUSION
In the rejection by functionalists of type-type identities of psychological events or states with physical events or states, the way is opened for different persons to have quite different underlying physical realizations of their mental states. Functionalism takes certain of the causal properties of an event to be determinants of the psychological properties. One of the psychological properties thus determined is just which mind the event belongs to. Causality may or may not be the cement of the universe, but it is what holds bundles of psychological states and events together to form a single mind over time.
Functionalism does not require a one-to-one correspondence between persons and bodies. Contingently, there generally is such a correspondence, which is exactly what functionalism would lead one to expect. The causal properties of psychological states are just those of the system literally embodying them. And in the ordinary course of events, the physiological characteristics of brains permit psychological integration and continuity for the entire duration of the operating life of the brain.
But it could be otherwise, and may in fact be. There may be multiple persons embodied by a single brain in cases of "multiple personality". Whether there are or not, I believe, turns on the extent of causal connection (and, hence, access) between the personalities. My impression is that actual reported cases of multiple personality typically involve a degree of shared access to information which is not characteristic of distinct persons; typically the multiple personalities speak the same language and (this is not independent of the shared linguistic abilities) have knowledge of shared general facts.
Experience with severed corpus callosum and with various memory deficits, as in Alzheimer's disease, suggests that there may be no well-defined threshold of integration (causal interconnectivity) at which one can say that above this threshold there is a single mind and below it there are two or none. Experience specifically with the split brains demonstrates the contingency of the character and count of minds upon causal features of the underlying system.
From the fact that there is a single physical system, then, nothing follows about the number of minds which the system might realize. Depending on the causal character of the system, it might realize no minds, one mind, or more than one mind. This is the case whether the system employs neurons, as in humans, entire humans, as in the Chinese Room, or programmed computers, as in AI. As a result, Searle's Chinese Room argument shows nothing about the possibilities of artificial intelligence.
NOTES
1 This represents a rejection of the position I took several years ago in Cole (1984).
2 I defend Searle against several other criticisms, advanced in a paper by Philip Cam, in the Australasian Journal of Philosophy (September 1991). As I reconstruct it here, the reasoning is clearly valid and the indicated conclusions are correct. But Searle's argument turns on a failure to consider the possibilities of virtual entities discussed here, and so fails to refute an interesting version of Strong AI.
3 See Cole 1984.
4 A similar point is made in the Searle-Churchlands debate in Scientific American. (See Churchland and Churchland 1990 and Searle 1990.)
5 The derivation should be by whatever causal process can preserve the representational properties of the brain states of the original biological persons. Xeroxing preserves the representational properties of written information; so does conversion to electronic media. The analogous process for brains might be as detailed as a digitized neural net - a functional equivalent of the entire brain - or (more plausibly) a functional equivalent at a higher level of analysis, such as would be provided by an exhaustive personality and intellectual inventory. However, the former might well be technically simpler and more expeditious, just as xerographic copies are more easily obtained than paraphrases or translations.
6 As some readers have suggested, one could avoid the incompatible properties problem by attributing Chinese understanding and Korean understanding to different parts of the program. This tack raises problematic issues concerning the identity conditions of programs, some of which are currently being explored by the courts. In a nutshell, my response is that programs are abstract, but it is a person who understands, and the same person can be realized by distinct programs. Since the understanding person is not identical with the program, understanding can not be attributed to programs as abstract, but to persons. This reasoning parallels the standard functionalist objections to identifying a person with his/her body. These considerations are set out below in this paper in the section on the 'systems reply'. I believe that Descartes interestingly anticipated the reasoning in his argument for The Real Difference between mind and body, but that topic is beyond the scope of this paper.
7 See, for example, Smolensky: "We can view the top-level conscious processor of individual people as a virtual machine - the conscious rule interpreter - and we can view cultural knowledge as a program that runs on that machine" (1988, p. 4). While Smolensky views a single person as a collection of virtual machines, I treat distinct persons realized by a single body as virtual persons.
8 My thanks to an anonymous referee for this point. The referee goes on to remark, in support of my main point about the locus of understanding, that "nobody ever suggests that the LISP interpreter 'understands' the application the LISP program encodes".
REFERENCES
American Psychiatric Association: 1987, Diagnostic and Statistical Manual of Mental Disorders, III, American Psychiatric Association, Washington D.C.
Anderson, David: 1987, 'Is the Chinese Room the Real Thing?', Philosophy 62, 389-93.
Braun, B. G.: 1984, 'Toward a Theory of Multiple Personality and Other Dissociative Phenomena', Psychiatric Clinics of North America, Symposium on Multiple Personality, Vol. 7, pp. 171-93, Saunders, Philadelphia.
Cam, Philip: 1990, 'Searle on Strong AI', Australasian Journal of Philosophy 68, 103-08.
Carleton, Lawrence: 1984, 'Programs, Language Understanding and Searle', Synthese 59, 219-30.
Churchland, Paul M. and Patricia Smith Churchland: 1990, 'Could a Machine Think?', Scientific American, January, 32-37.
Cole, David: 1984, 'Thought and Thought Experiments', Philosophical Studies 45, 431-44.
Cole, David: 1990, 'Cognitive Inquiry and the Philosophy of Mind', in Cole, Fetzer and Rankin (eds.).
Cole, David: 1991, 'Artificial Minds: Cam on Searle', Australasian Journal of Philosophy 69(3).
Cole, David, James Fetzer, and Terry Rankin (eds.): 1990, Philosophy, Mind and Cognitive Inquiry, Kluwer Academic Publishers, Dordrecht.
Cole, David and Robert Foelber: 1984, 'Contingent Materialism', Pacific Philosophical Quarterly 65, 74-85.
Double, Richard: 1983, 'Searle, Programs and Functionalism', Nature and System 5, 107-14.
Dretske, Fred: 1985, 'Machines and the Mental', Presidential Address delivered before the Central Division of the American Philosophical Association (reprinted in Cole, Fetzer & Rankin (eds.)).
Fields, C. A.: 1984, 'Double on Searle's Chinese Room', Nature and System 6, 51-54.
Grice, H. Paul: 1941, 'Personal Identity', Mind 50, 330-50.
Lewis, David: 1976, 'Survival and Identity', in A. E. Rorty (ed.), The Identities of Persons, University of California Press, Berkeley.
Maloney, J. Christopher: 1987, 'The Right Stuff', Synthese 70, 349-72.
Perry, John (ed.): 1975, Personal Identity, University of California Press, Berkeley.
Perry, John: 1978, A Dialogue on Personal Identity and Immortality, Hackett Publishing Company, Indianapolis.
Quinton, A. M.: 1962, 'The Soul', Journal of Philosophy 59, 393-409 (reprinted in Perry 1975).
Rapaport, William J.: 1986, 'Searle's Experiments with Thought', Philosophy of Science 53, 271-79.
Rapaport, William J.: 1988, 'Syntactic Semantics', in James Fetzer (ed.), Aspects of Artificial Intelligence, Kluwer Academic Publishers, Dordrecht.
Rapaport, William J.: 1988, 'Review of John Searle's Minds, Brains and Science', Nous XXII, 585-609.
Rapaport, William J.: 1990, 'Computer Processes and Virtual Persons: Comments on Cole's "Artificial Intelligence and Personal Identity"', Technical Report 90-13, Department of Computer Science, State University of New York at Buffalo.
Rey, Georges: 1986, 'What's Really Going on in Searle's "Chinese Room"', Philosophical Studies 50, 169-85.
Sarason, Irwin G. and Barbara R. Sarason: 1987, Abnormal Psychology, Prentice-Hall, Englewood Cliffs, NJ.
Searle, John: 1980, 'Minds, Brains and Programs', The Behavioural and Brain Sciences 3, 417-57.
Searle, John: 1982, 'The Myth of the Computer', New York Review of Books, 29 April 1982, pp. 3-6.
Searle, John: 1984, Minds, Brains and Science, Harvard University Press, Cambridge MA.
Searle, John: 1990, 'Is the Brain's Mind a Computer Program?', Scientific American, January, 26-31.
Sharvy, Richard: 1983, 'It Ain't the Meat, It's the Motion', Inquiry 26, 125-31.
Smolensky, Paul: 1988, 'On the Proper Treatment of Connectionism', Behavioural and Brain Sciences 11(1), 1-74 (reprinted in Cole, Fetzer and Rankin (eds.)).
Whitmer, Jeffrey: 1983, 'Intentionality, Artificial Intelligence and the Causal Powers of the Brain', Auslegung 10, 194-210.

Dept. of Philosophy
University of Minnesota/Duluth
Duluth, MN 55812-2496
U.S.A.