IS THE BRAIN A DIGITAL COMPUTER?

John R. Searle
University of California/Berkeley

Presidential Address delivered before the Sixty-fourth Annual Pacific Division Meeting of the American Philosophical Association in Los Angeles, California, March 30, 1990.

I. Introduction: Strong AI, Weak AI and Cognitivism
There are different ways to present a Presidential Address to the APA; the one I have chosen is simply to report on work that I am doing right now, on work in progress. I am going to present some of my further explorations into the computational model of the mind.[1]
The basic idea of the computer model of the mind is that the mind is the program and the brain the hardware of a computational system. A slogan one often sees is "the mind is to the brain as the program is to the hardware."[2]
Let us begin our investigation of this claim by distinguishing three questions:

1. Is the brain a digital computer?
2. Is the mind a computer program?
3. Can the operations of the brain be simulated on a digital computer?
I will be addressing 1 and not 2 or 3. I think 2 can be decisively answered in the negative. Since programs are defined purely formally or syntactically and since minds have an intrinsic mental content, it follows immediately that the program by itself cannot constitute the mind. The formal syntax of the program does not by itself guarantee the presence of mental contents. I showed this a decade ago in the Chinese Room Argument (Searle, 1980). A computer, me for example, could run the steps of the program for some mental capacity, such as understanding Chinese, without understanding a word of Chinese. The argument rests on the simple logical truth that syntax is not the same as, nor is it by itself sufficient for, semantics. So the answer to the second question is obviously "No".
The answer to 3 seems to me equally obviously "Yes", at least on a natural interpretation. That is, naturally interpreted, the question means: Is there some description of the brain such that under that description you could do a computational simulation of the operations of the brain. But since according to Church's thesis, anything that can be given a precise enough characterization as a set of steps can be simulated on a digital computer, it follows trivially that the question has an affirmative answer. The operations of the brain can be simulated on a digital computer in the same sense in which weather systems, the behavior of the New York stock market or the pattern of airline flights over Latin America can. So our question is not, "Is the mind a program?" The answer to that is, "No". Nor is it, "Can the brain be simulated?" The answer to that is, "Yes". The question is, "Is the brain a digital computer?" And for purposes of this discussion I am taking that question as equivalent to "Are brain processes computational?"
One might think that this question would lose much of its interest if question 2 receives a negative answer. That is, one might suppose that unless the mind is a program, there is no interest to the question whether the brain is a computer. But that is not really the case. Even for those who agree that programs are not by themselves constitutive of mental phenomena, there is still an important question: Granted that there is more to the mind than the syntactical operations of the digital computer; nonetheless, it might be the case that mental states are at least computational states and mental processes are computational processes operating over the formal structure of these mental states. This, in fact, seems to me the position taken by a fairly large number of people.
I am not saying that the view is fully clear, but the idea is something like this: At some level of description brain processes are syntactical; there are so to speak, "sentences in the head". These need not be sentences in English or Chinese, but perhaps in the "Language of Thought" (Fodor, 1975). Now, like any sentences, they have a syntactical structure and a semantics or meaning, and the problem of syntax can be separated from the problem of semantics. The problem of semantics is: How do these sentences in the head get their meanings? But that question can be discussed independently of the question: How does the brain work in processing these sentences? A typical answer to that latter question is: The brain works as a digital computer performing computational operations over the syntactical structure of sentences in the head.

Just to keep the terminology straight, I call the view that all there is to having a mind is having a program, Strong AI, the view that brain processes (and mental processes) can be simulated computationally, Weak AI, and the view that the brain is a digital computer, Cognitivism.
This paper is about Cognitivism, and I had better say at the beginning what motivates it. If you read books about the brain (say Shepherd (1983) or Kuffler and Nicholls (1976)) you get a certain picture of what is going on in the brain. If you then turn to books about computation (say Boolos and Jeffrey, 1989) you get a picture of the logical structure of the theory of computation. If you then turn to books about cognitive science (say Pylyshyn, 1985) they tell you that what the brain books describe is really the same as what the computability books were describing. Philosophically speaking, this does not smell right to me and I have learned, at least at the beginning of an investigation, to follow my sense of smell.
II. The Primal Story
I want to begin the discussion by trying to state as strongly as I can why Cognitivism has seemed intuitively appealing. There is a story about the relation of human intelligence to computation that goes back at least to Turing's classic paper (1950), and I believe it is the foundation of the Cognitivist view. I will call it the Primal Story:
We begin with two results in mathematical logic, the Church-Turing thesis (sometimes called Church's thesis) and Turing's theorem. For our purposes, the Church-Turing thesis states that for any algorithm there is some Turing machine that can implement that algorithm. Turing's theorem says that there is a Universal Turing Machine which can simulate any Turing Machine. Now if we put these two together we have the result that a Universal Turing Machine can implement any algorithm whatever.
But now, what made this result so exciting? What made it send shivers up and down the spines of a whole generation of young workers in artificial intelligence is the following thought: Suppose the brain is a Universal Turing Machine.

Well, are there any good reasons for supposing the brain might be a Universal Turing Machine? Let us continue with the Primal Story.
It is clear that at least some human mental abilities are algorithmic. For example, I can consciously do long division by going through the steps of an algorithm for solving long division problems. It is furthermore a consequence of the Church-Turing thesis and Turing's theorem that anything a human can do algorithmically can be done on a Universal Turing Machine. I can implement, for example, the very same algorithm that I use for long division on a digital computer. In such a case, as described by Turing (1950), both I, the human computer, and the mechanical computer are implementing the same algorithm; I am doing it consciously, the mechanical computer nonconsciously. Now it seems reasonable to suppose there might also be a whole lot of mental processes going on in my brain nonconsciously which are also computational. And if so, we could find out how the brain works by simulating these very processes on a digital computer. Just as we got a computer simulation of the processes for doing long division, so we could get a computer simulation of the processes for understanding language, visual perception, categorization, etc.

"But what about semantics? After all, programs are purely syntactical." Here another set of logico-mathematical results comes into play in the Primal Story. The development of proof theory showed that within certain well known limits the semantic relations between propositions can be entirely mirrored by the syntactic relations between the sentences that express those propositions. Now suppose that mental contents in the head are expressed syntactically in the head, then all we would need to account for mental processes would be computational processes between the syntactical elements in the head. If we get the proof theory right the semantics will take care of itself; and that is what computers do: they implement the proof theory.
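Two small sketches of my own may help make the Primal Story concrete; neither is part of Searle's text, and the particular numbers, rule names and atom names in them are invented for illustration.

First, the long-division example: the schoolbook procedure really is an algorithm in the required sense, a finite list of steps that a digital computer can follow.

    # The schoolbook long-division algorithm: work through the dividend digit
    # by digit, carrying a remainder along, exactly as one does by hand.
    def long_division(dividend, divisor):
        quotient_digits, remainder = [], 0
        for digit in str(dividend):
            remainder = remainder * 10 + int(digit)   # "bring down" the next digit
            quotient_digits.append(str(remainder // divisor))
            remainder = remainder % divisor
        return int("".join(quotient_digits)), remainder

    print(long_division(1989, 17))   # (117, 0)

Second, the appeal to proof theory: a rule like modus ponens can be applied by matching the shapes of strings alone, never consulting what they mean, and yet, because the rule is sound, every string it derives is true whenever the premise strings are.

    # Purely syntactic inference: modus ponens applied by string matching alone.
    def derive(premises):
        derived = set(premises)
        changed = True
        while changed:
            changed = False
            for sentence in list(derived):
                if "->" in sentence:
                    antecedent, consequent = (s.strip() for s in sentence.split("->", 1))
                    if antecedent in derived and consequent not in derived:
                        derived.add(consequent)
                        changed = True
        return derived

    premises = {"it_rains", "it_rains -> streets_wet", "streets_wet -> shoes_wet"}
    print(derive(premises))   # adds "streets_wet" and "shoes_wet" to the premises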
We thus have a well defined research program. We try to discover the programs being implemented in the brain by programming computers to implement the same programs. We do this in turn by getting the mechanical computer to match the performance of the human computer (i.e. to pass the Turing Test) and then getting the psychologists to look for evidence that the internal processes are the same in the two types of computer.

Now in what follows I would like the reader to keep this Primal Story in mind -- notice especially Turing's contrast between the conscious implementation of the program by the human computer and the nonconscious implementation of programs, whether by the brain or by the mechanical computer; notice furthermore the idea that we might just discover programs running in nature, the very same programs that we put into our mechanical computers.
If one looks at the books and articles supporting Cognitivism one finds certain common assumptions, often unstated, but nonetheless pervasive. First, it is often assumed that the only alternative to the view that the brain is a digital computer is some form of dualism. The idea is that unless you believe in the existence of immortal Cartesian souls, you must believe that the brain is a computer. Indeed, it often seems to be assumed that the question whether the brain is a physical mechanism determining our mental states and the question whether the brain is a digital computer are the same question. Rhetorically speaking, the idea is to bully the reader into thinking that unless he accepts the idea that the brain is some kind of computer, he is committed to some weird antiscientific views. Recently the field has opened up a bit to allow that the brain might not be an old fashioned von Neumann style digital computer, but rather a more sophisticated kind of parallel processing computational equipment. Still, to deny that the brain is computational is to risk losing your membership in the scientific community.
Second, it is also assumed that the question whether brain processes are computational is just a plain empirical question. It is to be settled by factual investigation in the same way that such questions as whether the heart is a pump or whether green leaves do photosynthesis were settled as matters of fact. There is no room for logic chopping or conceptual analysis, since we are talking about hard scientific matters of fact. Indeed I think many people who work in this field would doubt that the title of this paper poses an appropriate philosophic question at all. "Is the brain really a digital computer?" is no more a philosophical question than "Is the neurotransmitter at neuro-muscular junctions really acetylcholine?"

Even people who are unsympathetic to Cognitivism, such as Penrose and Dreyfus, seem to treat it as a straightforward factual issue. They do not seem to be worried about the question what sort of claim it might be that they are doubting. But I am puzzled by the question: What sort of fact about the brain could constitute its being a computer?

Third, another stylistic feature of this literature is the haste and sometimes even carelessness with which the foundational questions are glossed over. What exactly are the anatomical and physiological features of brains that are being discussed? What exactly is a digital computer? And how are the answers to these two questions supposed to connect?

The usual procedure in these books and articles is to make a few remarks about 0's and 1's, give a popular summary of the Church-Turing thesis, and then get on with the more exciting things such as computer achievements and failures. To my surprise in reading this literature I have found that there seems to be a peculiar philosophical hiatus. On the one hand, we have a very elegant set of mathematical results ranging from Turing's theorem to Church's thesis to recursive function theory. On the other hand, we have an impressive set of electronic devices which we use every day. Since we have such advanced mathematics and such good electronics, we assume that somehow somebody must have done the basic philosophical work of connecting the mathematics to the electronics. But as far as I can tell that is not the case. On the contrary, we are in a peculiar situation where there is little theoretical agreement among the practitioners on such absolutely fundamental questions as, What exactly is a digital computer? What exactly is a symbol? What exactly is a computational process? Under what conditions exactly are two physical systems implementing the same program?
III. The Definition of Computation
Since there is no universal agreement on the fundamental questions, I believe it is best to go back to the sources, back to the original definitions given by Alan Turing.
According to Turing, a Turing machine can carry out certain elementary operations: It can rewrite a 0 on its tape as a 1, it can rewrite a 1 on its tape as a 0, it can shift the tape 1 square to the left, or it can shift the tape 1 square to the right. It is controlled by a program of instructions and each instruction specifies a condition and an action to be carried out if the condition is satisfied.
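The definition just given can be made concrete with a minimal sketch in Python. This is my own illustration, not Turing's or Searle's formulation, and the particular instruction table, which increments a binary numeral written on the tape, is an invented example.

    # A minimal Turing machine: a tape, a read/write head, and a table of
    # instructions. Each instruction is keyed by a condition (current state,
    # symbol under the head) and gives the action to carry out when that
    # condition is satisfied: the symbol to write, the direction to shift,
    # and the next state.
    def run(tape, program, state, max_steps=1000):
        cells = dict(enumerate(tape))      # sparse tape; unwritten squares are blank
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head)       # None stands for a blank square
            write, move, state = program[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return [cells[i] for i in sorted(cells)]

    # An invented instruction table: scan right to the first blank, then
    # increment the binary numeral on the tape (flip trailing 1s to 0,
    # and the first 0 -- or a new leading square -- to 1).
    program = {
        ("scan", 0): (0, "R", "scan"),
        ("scan", 1): (1, "R", "scan"),
        ("scan", None): (None, "L", "carry"),
        ("carry", 1): (0, "L", "carry"),
        ("carry", 0): (1, "L", "halt"),
        ("carry", None): (1, "L", "halt"),
    }

    print(run([1, 0, 1, 1], program, "scan"))   # [1, 1, 0, 0, None]: 1011 -> 1100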
That is the standard definition of computation, but, taken literally, it is at least a bit misleading. If you open up your home computer you are most unlikely to find any 0's and 1's or even a tape. But this does not really matter for the definition. To find out if an object is really a digital computer, it turns out that we do not have to actually look for 0's and 1's, etc.; rather we just have to look for something that we could treat as or count as or could be used to function as 0's and 1's. Furthermore, to make the matter more puzzling, it turns out that this machine could be made out of just about anything.
As Johnson-Laird says, "It could be made out of cogs and levers like an old fashioned mechanical calculator; it could be made out of a hydraulic system through which water flows; it could be made out of transistors etched into a silicon chip through which an electrical current flows. It could even be carried out by the brain. Each of these machines uses a different medium to represent binary symbols -- the positions of cogs, the presence or absence of water, the level of the voltage and perhaps nerve impulses" (Johnson-Laird, 1988, p. 39).

Similar remarks are made by most of the people who write on this topic. For example, Ned Block (Block, 1990) shows how we can have electrical gates where the 1's and 0's are assigned to voltage levels of 4 volts and 7 volts respectively. So we might think that we should go and look for voltage levels. But Block tells us that 1 is only "conventionally" assigned to a certain voltage level. The situation grows more puzzling when he informs us further that we did not need to use electricity at all but we could have used an elaborate system of cats and mice and cheese and make our gates in such a way that the cat will strain at the leash and pull open a gate which we can also treat as if it were an 0 or 1. The point, as Block is anxious to insist, is "the irrelevance of hardware realization to computational description. These gates work in different ways but they are nonetheless computationally equivalent" (p. 260). In the same vein, Pylyshyn says that a computational sequence could be realized by "a group of pigeons trained to peck as a Turing machine!" (Pylyshyn, 1985, p. 57)
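A small sketch of my own may make the point about conventional assignment vivid: one and the same AND-gate behavior can be "realized" in voltage levels or in Block's cats, mice and cheese, once an observer stipulates which physical value is to count as a 1 and which as a 0. The particular encodings below are invented illustrations.

    # The same abstract AND gate under two different physical "realizations".
    # Nothing in the physics says which value is the 1; that is stipulated by
    # whoever assigns the interpretation.
    def and_gate(a, b, code):
        # "code" records which physical value is read as a 1 and which as a 0.
        out = "1" if (a == code["1"] and b == code["1"]) else "0"
        return code[out]

    volts = {"1": 4.0, "0": 7.0}   # Block's example: 4 volts counts as 1, 7 volts as 0
    cats = {"1": "cat pulls the gate open", "0": "gate stays shut"}

    print(and_gate(4.0, 4.0, volts))                                     # 4.0, i.e. a "1"
    print(and_gate("cat pulls the gate open", "gate stays shut", cats))  # gate stays shut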
But now if we are trying to take seriously the idea that the brain is a digital computer, we get the uncomfortable result that we could make a system that does just what the brain does out of pretty much anything. Computationally speaking, on this view, you can make a "brain" that functions just like yours and mine out of cats and mice and cheese or levers or water pipes or pigeons or anything else provided the two systems are, in Block's sense, "computationally equivalent". You would just need an awful lot of cats, or pigeons or waterpipes, or whatever it might be. The proponents of Cognitivism report this result with sheer and unconcealed delight. But I think they ought to be worried about it, and I am going to try to show that it is just the tip of a whole iceberg of problems.
IV. First Difficulty: Syntax is not Intrinsic to Physics

Why are the defenders of computationalism not worried by the implications of multiple realizability? The answer is that they think it is typical of functional accounts that the same function admits of multiple realizations. In this respect, computers are just like carburetors and thermostats. Just as carburetors can be made of brass or steel, so computers can be made of an indefinite range of hardware materials.

But there is a difference: The classes of carburetors and thermostats are defined in terms of the production of certain physical effects. That is why, for example, nobody says you can make carburetors out of pigeons. But the class of computers is defined syntactically in terms of the assignment of 0's and 1's. The multiple realizability is a consequence not of the fact that the same physical effect can be achieved in different physical substances, but that the relevant properties are purely syntactical. The physics is irrelevant except insofar as it admits of the assignments of 0's and 1's and of state transitions between them.
But this has two consequences which might be disastrous:

1. The same principle that implies multiple realizability would seem to imply universal realizability. If computation is defined in terms of the assignment of syntax then everything would be a digital computer, because any object whatever could have syntactical ascriptions made to it. You could describe anything in terms of 0's and 1's.

2. Worse yet, syntax is not intrinsic to physics. The ascription of syntactical properties is always relative to an agent or observer who treats certain physical phenomena as syntactical.
Now why exactly would these consequences be disastrous?

Well, we wanted to know how the brain works, specifically how it produces mental phenomena. And it would not answer that question to be told that the brain is a digital computer in the sense in which stomach, liver, heart, solar system, and the state of Kansas are all digital computers. The model we had was that we might discover some fact about the operation of the brain which would show that it is a computer. We wanted to know if there was not some sense in which brains were intrinsically digital computers in a way that green leaves intrinsically perform photosynthesis or hearts intrinsically pump blood. It is not a matter of us arbitrarily or "conventionally" assigning the word "pump" to hearts or "photosynthesis" to leaves. There is an actual fact of the matter. And what we were asking is, "Is there in that way a fact of the matter about brains that would make them digital computers?" It does not answer that question to be told, yes, brains are digital computers because everything is a digital computer.

On the standard textbook definition of computation:

1. For any object there is some description of that object such that under that description the object is a digital computer.

2. For any program there is some sufficiently complex object such that there is some description of the object under which it is implementing the program. Thus for example the wall behind my back is right now implementing the Wordstar program, because there is some pattern of molecule movements which is isomorphic with the formal structure of Wordstar. But if the wall is implementing Wordstar then if it is a big enough wall it is implementing any program, including any program implemented in the brain.
I think the main reason that the proponents do not see that multiple or universal realizability is a problem is that they do not see it as a consequence of a much deeper point, namely that the "syntax" is not the name of a physical feature, like mass or gravity. On the contrary they talk of "syntactical engines" and even "semantic engines" as if such talk were like that of gasoline engines or diesel engines, as if it could be just a plain matter of fact that the brain or anything else is a syntactical engine.
I think it is probably possible to block the result of universal realizability by tightening up our definition of computation. Certainly we ought to respect the fact that programmers and engineers regard it as a quirk of Turing's original definitions and not as a real feature of computation. Unpublished works by Brian Smith, Vinod Goel, and John Batali all suggest that a more realistic definition of computation will emphasize such features as the causal relations among program states, programmability and controllability of the mechanism, and situatedness in the real world. But these further restrictions on the definition of computation are no help in the present discussion because the really deep problem is that syntax is essentially an observer relative notion. The multiple realizability of computationally equivalent processes in different physical media was not just a sign that the processes were abstract, but that they were not intrinsic to the system at all. They depended on an interpretation from outside. We were looking for some facts of the matter which would make brain processes computational; but given the way we have defined computation, there never could be such facts of the matter. We can't, on the one hand, say that anything is a digital computer if we can assign a syntax to it and then suppose there is a factual question intrinsic to its physical operation whether or not a natural system such as the brain is a digital computer.
And if the word "syntax" seems puzzling, the same point can be stated without it. That is, someone might claim that the notions of "syntax" and "symbols" are just a manner of speaking and that what we are really interested in is the existence of systems with discrete physical phenomena and state transitions between them. On this view we don't need 0's and 1's; they are just a convenient shorthand. But, I believe, this move is really no help. A physical state of a system is a computational state only relative to the assignment to that state of some computational role, function, or interpretation. The same problem arises without 0's and 1's because notions such as computation, algorithm and program do not name intrinsic physical features of systems. Computational states are not discovered within the physics, they are assigned to the physics.
This is a different argument from the Chinese Room Argument and I should have seen it ten years ago but I did not. The Chinese Room Argument showed that semantics is not intrinsic to syntax. I am now making the separate and different point that syntax is not intrinsic to physics. For the purposes of the original argument I was simply assuming that the syntactical characterization of the computer was unproblematic. But that is a mistake.
There is no way you could discover that something is intrinsically a digital computer because the characterization of it as a digital computer is always relative to an observer who assigns a syntactical interpretation to the purely physical features of the system. As applied to the Language of Thought hypothesis, this has the consequence that the thesis is incoherent. There is no way you could discover that there are, intrinsically, unknown sentences in your head because something is a sentence only relative to some agent or user who uses it as a sentence. As applied to the computational model generally, the characterization of a process as computational is a characterization of a physical system from outside; and the identification of the process as computational does not identify an intrinsic feature of the physics, it is essentially an observer relative characterization.

This point has to be understood precisely. I am not saying there are a priori limits on the patterns we could discover in nature. We could no doubt discover a pattern of events in my brain that was isomorphic to the implementation of the vi program on this computer. But to say that something is functioning as a computational process is to say something more than that a pattern of physical events is occurring. It requires the assignment of a computational interpretation by some agent. Analogously, we might discover in nature objects which had the same sort of shape as chairs and which could therefore be used as chairs; but we could not discover objects in nature which were functioning as chairs, except relative to some agents who regarded them or used them as chairs.
V. Second Difficulty: The Homunculus Fallacy is Endemic to Cognitivism

So far, we seem to have arrived at a problem. Syntax is not part of physics. This has the consequence that if computation is defined syntactically then nothing is intrinsically a digital computer solely in virtue of its physical properties. Is there any way out of this problem? Yes, there is, and it is a way standardly taken in cognitive science, but it is out of the frying pan and into the fire. Most of the works I have seen in the computational theory of the mind commit some variation on the homunculus fallacy. The idea always is to treat the brain as if there were some agent inside it using it to compute with. A typical case is David Marr (1982) who describes the task of vision as proceeding from a two-dimensional visual array on the retina to a three-dimensional description of the external world as output of the visual system. The difficulty is: Who is reading the description? Indeed, it looks throughout Marr's book, and in other standard works on the subject, as if we have to invoke a homunculus inside the system in order to treat its operations as genuinely computational.

Many writers feel that the homunculus fallacy is not really a problem, because, with Dennett (1978), they feel that the homunculus can be "discharged". The idea is this: Since the computational operations of the computer can be analyzed into progressively simpler units, until eventually we reach simple flip-flop, "yes-no", "1-0" patterns, it seems that the higher-level homunculi can be discharged with progressively stupider homunculi, until finally we reach the bottom level of a simple flip-flop that involves no real homunculus at all. The idea, in short, is that recursive decomposition will eliminate the homunculi.

It took me a long time to figure out what these people were driving at, so in case someone else is similarly puzzled I will explain an example in detail: Suppose that we have a computer that multiplies six times eight to get forty-eight. Now we ask "How does it do it?" Well, the answer might be that it adds six to itself seven times.[3] But if you ask "How does it add six to itself seven times?", the answer might be that, first, it converts all of the numerals into binary notation, and second, it applies a simple algorithm for operating on binary notation until finally we reach the bottom level at which the only instructions are of the form, "Print a zero, erase a one." So, for example, at the top level our intelligent homunculus says "I know how to multiply six times eight to get forty-eight". But at the next lower level he is replaced by a stupider homunculus who says "I do not actually know how to do multiplication, but I can do addition." Below him are some stupider ones who say "We do not actually know how to do addition or multiplication, but we know how to convert decimal to binary." Below these are stupider ones who say "We do not know anything about any of this stuff, but we know how to operate on binary symbols." At the bottom level are a whole bunch of homunculi who just say "Zero one, zero one". All of the higher levels reduce to this bottom level. Only the bottom level really exists; the top levels are all just as-if.
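The layered decomposition just described can be put in the form of a short sketch; it is my own illustration, not part of the original text, and the function names are invented. Each level "knows" only the operation proper to its own level and hands everything else down to the level below.

    # Each level "knows" only its own trick and delegates the rest downward,
    # on the pattern of the progressively stupider homunculi in the text.
    def multiply(a, b):
        # Top level: "I know how to multiply" -- by repeated addition.
        product = a
        for _ in range(b - 1):             # add a to itself b - 1 times
            product = add(product, a)
        return product

    def add(x, y):
        # Next level down: "I only know how to add" -- and only on binary numerals.
        return from_binary(add_binary(to_binary(x), to_binary(y)))

    def to_binary(n):
        # "I only know how to convert decimal to binary."
        return bin(n)[2:]

    def from_binary(bits):
        return int(bits, 2)

    def add_binary(p, q):
        # Bottom level: nothing but shuffling the symbols "0" and "1".
        width = max(len(p), len(q))
        p, q = p.zfill(width), q.zfill(width)
        out, carry = [], 0
        for a, b in zip(reversed(p), reversed(q)):
            total = carry + int(a) + int(b)
            out.append(str(total % 2))
            carry = total // 2
        if carry:
            out.append("1")
        return "".join(reversed(out))

    print(multiply(6, 8))   # 48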
Various authors (e.g. Haugeland (1981), Block (1990)) describe this feature when they say that the system is a syntactical engine driving a semantic engine. But we still must face the question we had before: What facts intrinsic to the system make it syntactical? What facts about the bottom level or any other level make these operations into zeros and ones? Without a homunculus that stands outside the recursive decomposition, we do not even have a syntax to operate with. The attempt to eliminate the homunculus fallacy through recursive decomposition fails, because the only way to get the syntax intrinsic to the physics is to put a homunculus in the physics.
There is a fascinating feature to all of this. Cognitivists cheerfully concede that the higher levels of computation, e.g. "multiply 6 times 8", are observer relative; there is nothing really there that corresponds directly to multiplication; it is all in the eye of the homunculus/beholder. But they want to stop this concession at the lower levels. The electronic circuit, they admit, does not really multiply 6X8 as such, but it really does manipulate 0's and 1's and these manipulations, so to speak, add up to multiplication. But to concede that the higher levels of computation are not intrinsic to the physics is already to concede that the lower levels are not intrinsic either. So the homunculus fallacy is still with us.
For real computers of the kind you buy in the store, there is no homunculus problem, each user is the homunculus in question. But if we are to suppose that the brain is a digital computer, we are still faced with the question "And who is the user?" Typical homunculus questions in cognitive science are such as the following: "How does the visual system compute shape from shading; how does it compute object distance from size of retinal image?" A parallel question would be, "How do nails compute the distance they are to travel in the board from the impact of the hammer and the density of the wood?" And the answer is the same in both sorts of case: If we are talking about how the system works intrinsically neither nails nor visual systems compute anything. We as outside homunculi might describe them computationally, and it is often useful to do so. But you do not understand hammering by supposing that nails are somehow intrinsically implementing hammering algorithms and you do not understand vision by supposing the system is implementing, e.g., the shape from shading algorithm.
VI. Third Difficulty: Syntax Has No Causal Powers
Certain sorts of explanations in the natural sciences specify mechanisms which function causally in the production of the phenomena to be explained. This is especially common in the biological sciences. Think of the germ theory of disease, the account of photosynthesis, the DNA theory of inherited traits, and even the Darwinian theory of natural selection. In each case a causal mechanism is specified, and in each case the specification gives an explanation of the output of the mechanism. Now if you go back and look at the Primal Story it seems clear that this is the sort of explanation promised by Cognitivism. The mechanisms by which brain processes produce cognition are supposed to be computational, and by specifying the programs we will have specified the causes of cognition. One beauty of this research program, often remarked, is that we do not need to know the details of brain functioning in order to explain cognition. Brain processes provide only the hardware implementation of the cognitive programs, but the program level is where the real cognitive explanations are given. On the standard account, as stated by Newell for example, there are three levels of explanation, hardware, program, and intentionality (Newell calls this last level the knowledge level), and the special contribution of cognitive science is made at the program level.
But if what I have said so far is correct, then there is something fishy about this whole project. I used to believe that as a causal account the cognitivist's theory was at least false; but I now am having difficulty formulating a version of it that is coherent even to the point where it could be an empirical thesis at all. The thesis is that there is a whole lot of symbols being manipulated in the brain, 0's and 1's flashing through the brain at lightning speed and invisible not only to the naked eye but even to the most powerful electron microscope, and it is these which cause cognition. But the difficulty is that the 0's and 1's as such have no causal powers at all because they do not even exist except in the eyes of the beholder. The implemented program has no causal powers other than those of the implementing medium because the program has no real existence, no ontology, beyond that of the implementing medium. Physically speaking there is no such thing as a separate "program level".

You can see this if you go back to the Primal Story and remind yourself of the difference between the mechanical computer and Turing's human computer. In Turing's human computer there really is a program level intrinsic to the system and it is functioning causally at that level to convert input to output. This is because the human is consciously following the rules for doing a certain computation, and this causally explains his performance. But when we program the mechanical computer to perform the same computation, the assignment of a computational interpretation is now relative to us, the outside homunculi. And there is no longer a level of intentional causation intrinsic to the system. The human computer is consciously following rules, and this fact explains his behavior, but the mechanical computer is not literally following any rules at all. It is designed to behave exactly as if it were following rules, and so for practical, commercial purposes it does not matter. Now Cognitivism tells us that the brain functions like the commercial computer and this causes cognition. But without a homunculus, both commercial computer and brain have only patterns and the patterns have no causal powers in addition to those of the implementing media. So it seems there is no way Cognitivism could give a causal account of cognition.
However there is a puzzle for my view. Anyone who works with computers even casually knows that we often do in fact give causal explanations that appeal to the program. For example, we can say that when I hit this key I got such and such results because the machine is implementing the vi program and not the emacs program; and this looks like an ordinary causal explanation. So the puzzle is, how do we reconcile the fact that syntax, as such, has no causal powers with the fact that we do give causal explanations that appeal to programs? And, more pressingly, would these sorts of explanations provide an appropriate model for Cognitivism, will they rescue Cognitivism? Could we for example rescue the analogy with thermostats by pointing out that just as the notion "thermostat" figures in causal explanations independently of any reference to the physics of its implementation, so the notion "program" might be explanatory while equally independent of the physics?
To explore this puzzle let us try to make the case for Cognitivism by extending the Primal Story to show how the Cognitivist investigative procedures work in actual research practice. The idea, typically, is to program a commercial computer so that it simulates some cognitive capacity, such as vision or language. Then, if we get a good simulation, one that gives us at least Turing equivalence, we hypothesize that the brain computer is running the same program as the commercial computer, and to test the hypothesis we look for indirect psychological evidence, such as reaction times. So it seems that we can causally explain the behavior of the brain computer by citing the program in exactly the same sense in which we can explain the behavior of the commercial computer. Now what is wrong with that? Doesn't it sound like a perfectly legitimate scientific research program? We know that the commercial computer's conversion of input to output is explained by a program, and in the brain we discover the same program, hence we have a causal explanation.
Two things ought to worry us immediately about this project. First, we would never accept this mode of explanation for any function of the brain where we actually understood how it worked at the neurobiological level. Second, we would not accept it for other sorts of system that we can simulate computationally. To illustrate the first point, consider for example, the famous account of "What the Frog's Eye Tells the Frog's Brain" (Lettvin, et al. 1959 in McCulloch, 1965). The account is given entirely in terms of the anatomy and physiology of the frog's nervous system. A typical passage, chosen at random, goes like this:

1. Sustained Contrast Detectors

An unmyelinated axon of this group does not respond when the general illumination is turned on or off. If the sharp edge of an object either lighter or darker than the background moves into its field and stops, it discharges promptly and continues discharging, no matter what the shape of the edge or whether the object is smaller or larger than the receptive field. (p. 239)
I have never heard anyone say that all this is just the hardware implementation, and that they should have figured out which program the frog was implementing. I do not doubt that you could do a computer simulation of the frog's "bug detectors". Perhaps someone has done it. But we all know that once you understand how the frog's visual system actually works, the "computational level" is just irrelevant.

To illustrate the second point, consider simulations of other sorts of systems. I am, for example, typing these words on a machine that simulates the behavior of an old fashioned mechanical typewriter.[4] As simulations go, the word processing program simulates a typewriter better than any AI program I know of simulates the brain. But no sane person thinks: "At long last we understand how typewriters work, they are implementations of word processing programs." It is simply not the case in general that computational simulations provide causal explanations of the phenomena simulated.

So what is going on? We do not in general suppose that computational simulations of brain processes give us any explanations in place of or in addition to neurobiological accounts of how the brain actually works. And we do not in general take "X is a computational simulation of Y" to name a symmetrical relation. That is, we do not suppose that because the computer simulates a typewriter that therefore the typewriter simulates a computer. We do not suppose that because a weather program simulates a hurricane, that the causal explanation of the behavior of the hurricane is provided by the program. So why should we make an exception to these principles where unknown brain processes are concerned? Are there any good grounds for making the exception? And what kind of a causal explanation is an explanation that cites a formal program?

Here, I believe, is the solution to our puzzle. Once you remove the homunculus from the system, you are left only with a pattern of events to which someone from the outside could attach a computational interpretation. Now the only sense in which the specification of the pattern by itself provides a causal explanation is that if you know that a certain pattern exists in a system you know that some cause or other is responsible for the pattern. So you can, for example, predict later stages from earlier stages. Furthermore, if you already know that the system has been programmed by an outside homunculus, you can give explanations that make reference to the intentionality of the homunculus. You can say, e.g., this machine behaves the way it does because it is running vi. That is like explaining that this book begins with a bit about happy families and does not contain any long passages about a bunch of brothers, because it is Tolstoy's Anna Karenina and not Dostoevsky's The Brothers Karamazov. But you cannot explain a physical system such as a typewriter or a brain by identifying a pattern which it shares with its computational simulation, because the existence of the pattern does not explain how the system actually works as a physical system. In the case of cognition the pattern is at much too high a level of abstraction to explain such concrete mental (and therefore physical) events as the occurrence of a visual perception or the understanding of a sentence.

Now, I think it is obvious that we cannot explain how typewriters and hurricanes work by pointing to formal patterns they share with their computational simulations. Why is it not obvious in the case of the brain?
Here we come to the second part of our solution to the puzzle. In making the case for Cognitivism we were tacitly supposing that the brain might be implementing algorithms for cognition, in the same sense that Turing's human computer and his mechanical computer implement algorithms. But it is precisely that assumption which we have seen to be mistaken. To see this, ask yourself what happens when a system implements an algorithm. In the human computer the system consciously goes through the steps of the algorithm, so the process is both causal and logical; logical, because the algorithm provides a set of rules for deriving the output symbols from the input symbols; causal, because the agent is making a conscious effort to go through the steps. Similarly in the case of the mechanical computer, the whole system includes an outside homunculus, and with the homunculus the system is both causal and logical; logical, because the homunculus provides an interpretation to the processes of the machine; and causal, because the hardware of the machine causes it to go through the processes. But these conditions cannot be met by the brute, blind nonconscious neurophysiological operations of the brain. In the brain computer there is no conscious intentional implementation of the algorithm as there is in the human computer, but there can't be any nonconscious implementation as there is in the mechanical computer either, because that requires an outside homunculus to attach a computational interpretation to the physical events. The most we could find in the brain is a pattern of events which is formally similar to the implemented program in the mechanical computer, but that pattern, as such, has no causal powers to call its own and hence explains nothing.
In sum, the fact that the attribution of syntax identifies no further causal powers is fatal to the claim that programs provide causal explanations of cognition. To explore the consequences of this, let us remind ourselves of what Cognitivist explanations actually look like. Explanations such as Chomsky's account of the syntax of natural languages or Marr's account of vision proceed by stating a set of rules according to which a symbolic input is converted into a symbolic output. In Chomsky's case, for example, a single input symbol, S, is converted into any one of a potentially infinite number of sentences by the repeated application of a set of syntactical rules. In Marr's case, representations of a two dimensional visual array are converted into three dimensional "descriptions" of the world in accordance with certain algorithms. Marr's tripartite distinctions between the computational task, the algorithmic solution of the task and the hardware implementation of the algorithm has (like Newell's distinctions) become famous as a statement of the general pattern of the explanation.
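For readers unfamiliar with this style of explanation, here is a toy sketch of the kind of rule system meant; the grammar is invented by me for illustration and is in no way Chomsky's. A single start symbol S is expanded by repeated application of rewrite rules into any of an open-ended set of sentences.

    import random

    # A toy rewrite grammar: repeated application of the rules expands the
    # single start symbol S into English-like sentences.
    rules = {
        "S":  [["NP", "VP"]],
        "NP": [["the", "N"], ["the", "N", "that", "VP"]],
        "VP": [["V", "NP"], ["V"]],
        "N":  [["frog"], ["computer"], ["brain"]],
        "V":  [["sees"], ["simulates"]],
    }

    def expand(symbol):
        if symbol not in rules:                  # a terminal word: nothing left to rewrite
            return [symbol]
        expansion = random.choice(rules[symbol])
        return [word for part in expansion for word in expand(part)]

    print(" ".join(expand("S")))   # e.g. "the brain simulates the computer"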
If you take these explanations naively, as I do, it is best to think of them as saying that it is just as if a man alone in a room were going through a set of steps of following rules to generate English sentences or 3D descriptions, as the case might be. But now, let us ask what facts in the real world are supposed to correspond to these explanations as applied to the brain. In Chomsky's case, for example, we are not supposed to think that the agent consciously goes through a set of repeated applications of rules; nor are we supposed to think that he is unconsciously thinking his way through the set of rules. Rather the rules are "computational" and the brain is carrying out the computations. But what does that mean? Well, we are supposed to think that it is just like a commercial computer. The sort of thing that corresponds to the ascription of the same set of rules to a commercial computer is supposed to correspond to the ascription of those rules to the brain. But we have seen that in the commercial computer the ascription is always observer relative, the ascription is made relative to a homunculus who assigns computational interpretations to the hardware states. Without the homunculus there is no computation, just an electronic circuit. So how do we get computation into the brain without a homunculus? As far as I know, neither Chomsky nor Marr ever addressed the question or even thought there was such a question. But without a homunculus there is no explanatory power to the postulation of the program states. There is just a physical mechanism, the brain, with its various real physical and physical/mental causal levels of description.
VII. Fourth Difficulty: The Brain Does Not Do Information Processing
In this section I turn finally to what I think is, in some ways, the central issue in all of this, the issue of information processing. Many people in the "cognitive science" scientific paradigm will feel that much of my discussion is simply irrelevant and they will argue against it as follows:

There is a difference between the brain and all of these other systems you have been describing, and this difference explains why a computational simulation in the case of the other systems is a mere simulation, whereas in the case of the brain a computational simulation is actually duplicating and not merely modeling the functional properties of the brain. The reason is that the brain, unlike these other systems, is an information processing system. And this fact about the brain is, in your words, "intrinsic". It is just a fact about biology that the brain functions to process information, and since we can also process the same information computationally, computational models of brain processes have a different role altogether from computational models of, for example, the weather. So there is a well defined research question: "Are the computational procedures by which the brain processes information the same as the procedures by which computers process the same information?"
What I just imagined an opponent saying embodies one of the worst mistakes in cognitive science. The mistake is to suppose that in the sense in which computers are used to process information, brains also process information. To see that that is a mistake, contrast what goes on in the computer with what goes on in the brain. In the case of the computer, an outside agent encodes some information in a form that can be processed by the circuitry of the computer. That is, he or she provides a syntactical realization of the information that the computer can implement in, for example, different voltage levels. The computer then goes through a series of electrical stages that the outside agent can interpret both syntactically and semantically even though, of course, the hardware has no intrinsic syntax or semantics: It is all in the eye of the beholder. And the physics does not matter provided only that you can get it to implement the algorithm. Finally, an output is produced in the form of physical phenomena which an observer can interpret as symbols with a syntax and a semantics.
But now contrast that with the brain. In the case of the brain, none of the relevant neurobiological processes are observer relative (though of course, like anything, they can be described from an observer relative point of view) and the specificity of the neurophysiology matters desperately. To make this difference clear, let us go through an example. Suppose I see a car coming toward me. A standard computational model of vision will take in information about the visual array on my retina and eventually print out the sentence, "There is a car coming toward me". But that is not what happens in the actual biology. In the biology a concrete and specific series of electro-chemical reactions are set up by the assault of the photons on the photoreceptor cells of my retina, and this entire process eventually results in a concrete visual experience. The biological reality is not that of a bunch of words or symbols being produced by the visual system; rather it is a matter of a concrete specific conscious visual event; this very visual experience. Now, that concrete visual event is as specific and as concrete as a hurricane or the digestion of a meal. We can, with the computer, do an information processing model of that event or of its production, as we can do an information processing model of the weather, digestion or any other phenomenon, but the phenomena themselves are not thereby information processing systems.

In short, the sense of information processing that is used in cognitive science is at much too high a level of abstraction to capture the concrete biological reality of intrinsic intentionality. The "information" in the brain is always specific to some modality or other. It is specific to thought, or vision, or hearing, or touch, for example. The level of information processing which is described in the cognitive science computational models of cognition, on the other hand, is simply a matter of getting a set of symbols as output in response to a set of symbols as input.
We are blinded to this difference by the fact that the same sentence, "I see a car coming toward me", can be used to record both the visual intentionality and the output of the computational model of vision. But this should not obscure from us the fact that the visual experience is a concrete event and is produced in the brain by specific electro-chemical biological processes. To confuse these events and processes with formal symbol manipulation is to confuse the reality with the model. The upshot of this part of the discussion is that in the sense of "information" used in cognitive science, it is simply false to say that the brain is an information processing device.
VIII. Summary of the Argument

This brief argument has a simple logical structure and I will lay it out:

1. On the standard textbook definition, computation is defined syntactically in terms of symbol manipulation.

2. But syntax and symbols are not defined in terms of physics. Though symbol tokens are always physical tokens, "symbol" and "same symbol" are not defined in terms of physical features. Syntax, in short, is not intrinsic to physics.

3. This has the consequence that computation is not discovered in the physics, it is assigned to it. Certain physical phenomena are assigned or used or programmed or interpreted syntactically. Syntax and symbols are observer relative.

4. It follows that you could not discover that the brain or anything else was intrinsically a digital computer, although you could assign a computational interpretation to it as you could to anything else. The point is not that the claim "The brain is a digital computer" is false. Rather it does not get up to the level of falsehood. It does not have a clear sense. You will have misunderstood my account if you think that I am arguing that it is simply false that the brain is a digital computer. The question "Is the brain a digital computer?" is as ill defined as the questions "Is it an abacus?", "Is it a book?", "Is it a set of symbols?", "Is it a set of mathematical formulae?"

5. Some physical systems facilitate the computational use much better than others. That is why we build, program, and use them. In such cases we are the homunculus in the system interpreting the physics in both syntactic and semantic terms.

6. But the causal explanations we then give do not cite causal properties different from the physics of the implementation and the intentionality of the homunculus.

7. The standard, though tacit, way out of this is to commit the homunculus fallacy. The homunculus fallacy is endemic to computational models of cognition and cannot be removed by the standard recursive decomposition arguments. They are addressed to a different question.

8. We cannot avoid the foregoing results by supposing that the brain is doing "information processing". The brain, as far as its intrinsic operations are concerned, does no information processing. It is a specific biological organ and its specific neurobiological processes cause specific forms of intentionality. In the brain, intrinsically, there are neurobiological processes and sometimes they cause consciousness. But that is the end of the story.[5]
Footnotes
[1] For earlier explorations see Searle (1980) and Searle (1984).

[2] This view is announced and defended in a large number of books and articles many of which appear to have more or less the same title, e.g. Computers and Thought (Feigenbaum and Feldman, eds. 1963), Computers and Thought (Sharples et al, eds. 1988), The Computer and the Mind (Johnson-Laird, 1988), Computation and Cognition (Pylyshyn, 1985), "The Computer Model of the Mind" (Block, 1990, forthcoming), and of course, "Computing Machinery and Intelligence" (Turing, 1950).

[3] People sometimes say that it would have to add six to itself eight times. But that is bad arithmetic. Six added to itself eight times is fifty-four, because six added to itself zero times is still six. It is amazing how often this mistake is made.

[4] The example was suggested by John Batali.

[5] I am indebted to a remarkably large number of people for discussions of the issues in this paper. I cannot thank them all, but special thanks are due to John Batali, Vinod Goel, Ivan Havel, Kirk Ludwig, Dagmar Searle and Klaus Strelau.
References
Block, Ned (1990), forthcoming. "The Computer Model of the Mind".

Boolos, George S. and Jeffrey, Richard C. (1989). Computability and Logic. Cambridge University Press, Cambridge.

Dennett, Daniel C. (1978). Brainstorms: Philosophical Essays on Mind and Psychology. MIT Press, Cambridge, Mass.

Feigenbaum, E.A. and Feldman, J., eds. (1963). Computers and Thought. McGraw-Hill Company, New York and San Francisco.

Fodor, J. (1975). The Language of Thought. Thomas Y. Crowell, New York.

Haugeland, John, ed. (1981). Mind Design. MIT Press, Cambridge, Mass.

Johnson-Laird, P.N. (1988). The Computer and the Mind. Harvard University Press, Cambridge, Mass.

Kuffler, Stephen W. and Nicholls, John G. (1976). From Neuron to Brain. Sinauer Associates, Sunderland, Mass.

Lettvin, J.Y., Maturana, H.R., McCulloch, W.S., and Pitts, W.H. (1959). "What the Frog's Eye Tells the Frog's Brain". Proceedings of the Institute of Radio Engineers, 47, 1940-1951; reprinted in McCulloch, 1965, pp. 230-255.

Marr, David (1982). Vision. W.H. Freeman and Company, San Francisco.

McCulloch, Warren S. (1965). Embodiments of Mind. MIT Press, Cambridge, Mass.

Pylyshyn, Z. (1985). Computation and Cognition. MIT Press, Cambridge, Mass.

Searle, John R. (1980). "Minds, Brains and Programs", The Behavioral and Brain Sciences, 3, pp. 417-424.

Searle, John R. (1984). Minds, Brains and Science. Harvard University Press, Cambridge, Mass.

Sharples, M., Hogg, D., Hutchinson, C., Torrance, S., and Young, D. (1988). Computers and Thought. MIT Press, Cambridge, Mass. and London.

Shepherd, Gordon M. (1983). Neurobiology. Oxford University Press, New York and Oxford.

Turing, Alan (1950). "Computing Machinery and Intelligence." Mind, 59, 433-460.