Creating a New Intelligent Species:
Choices and Responsibilities for AI Designers
Eliezer Yudkowsky
Singularity Institute for Artificial Intelligence
singinst.org
In Every Known Culture:
• tool making
• weapons
• grammar
• tickling
• sweets preferred
• planning for future
• sexual attraction
• meal times
• private inner life
• try to heal the sick
• incest taboos
• true distinguished from false
• mourning
• personal names
• dance, singing
• promises
• mediation of conflicts
(Donald E. Brown, 1991. Human universals. New York: McGraw-Hill.)
ATP Synthase: The oldest wheel.
ATP synthase is nearly the same in mitochondria, chloroplasts, and bacteria – it’s older than eukaryotic life.
A complex adaptation must be universal within a species.
Imagine a complex adaptation – say, part of an eye – that has 6 necessary proteins. If each gene is at 10% frequency, the chance of assembling a working eye is 1:1,000,000. Pieces 1 through 5 must already be fixed in the gene pool before natural selection will promote an extra, helpful piece 6 to fixation.
(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture.
In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
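A quick check of the arithmetic behind that figure (a minimal sketch; the numbers are the slide’s own assumptions: six independent pieces, each at 10% frequency, all required at once):

P(all six pieces present together) = 0.1^6 = 10^-6, i.e. odds of 1:1,000,000 – which is why selection cannot assemble the whole complex in one step, and instead fixes each piece in sequence.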
The Psychic Unity of Humankind
(yes, that’s the standard term)
Complex adaptations must be universal – this logic applies with equal force to cognitive machinery in the human brain.
In every known culture: joy, sadness, disgust, anger, fear, surprise – shown by the same facial expressions.
(Paul Ekman, 1982. Emotion in the Human Face.)
(John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture.
In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
Must… not… emote…
Image: “The Matrix”
Aha! A human with the AI-universal facial expression for disgust! (She must be a machine in disguise.)
Images: (1) “The Matrix” (2) University of Plymouth,
http://www.psy.plym.ac.uk/year3/psy364emotions/psy364_emotions_evolutionary_psychobiolog.htm
Anthropomorphic hypothesis:
(diagram: “Causes”)
Same mistake, more subtle:
(diagram: “Causes”)
in nature we see
what exists in us;
it looks out, and finds
faces in the clouds...
It takes a conscious effort to remember the machinery:
AI Nature:
• tool making
• weapons
• grammar
• tickling
• sweets preferred
• planning for future
• sexual attraction
• meal times
• private inner life
• try to heal the sick
• incest taboos
• true distinguished from false
• mourning
• personal names
• dance, singing
• promises
• mediation of conflicts
AI Nature:
• tool making
• weapons
• grammar
• tickling
• sweets preferred
• planning for future
• sexual attraction++
• meal times
• private inner life
• heal sick humans
• snarkling taboos
• true distinguished from false
• mourning
• personal names
• dance, fzeeming
• promises
• mediation of conflicts
Crimes against nonhumanity and inhuman rights violations:
• cognitive enslavement
• theft of destiny
• creation under a low purpose
• denial of uniqueness
• hedonic/environmental mismatch
• fzeem deprivation
Happiness set points:
• After one year, lottery winners were not much happier than a control group, and paraplegics were not much unhappier.
• People underestimate adjustments because they focus on the initial surprise.
(Brickman, P., Coates, D., & Janoff-Bulman, R. (1978). Lottery winners and accident victims: Is happiness relative? Journal of Personality and Social Psychology, 36, 917-927.)
“Hedonic treadmill” effects:
• People with $500,000-$1,000,000 in assets say they would need an average of $2.4 million to feel “financially secure”.
• People with $5 million feel they need at least $10 million.
• People with $10 million feel they need at least $18 million.
(Source: Survey by PNC Advisors.
http://www.sharpenet.com/gt/issues/2005/mar05/1.shtml)
Your life circumstances make little difference in how happy you are.
“The fundamental surprise of well-being research is the robust finding that life circumstances make only a small contribution to the variance of happiness—far smaller than the contribution of inherited temperament or personality. Although people have intense emotional reactions to major changes in the circumstances of their lives, these reactions appear to subside more or less completely, and often quite quickly... After a period of adjustment lottery winners are not much happier than a control group and paraplegics not much unhappier.”
(Daniel Kahneman, 2000. “Experienced Utility and Objective Happiness: A Moment-Based
Approach.” In Choices, Values, and Frames, D. Kahneman and A. Tversky (Eds.) New
York: Cambridge University Press.) Findable online, or google “hedonic psychology”.
Nurture is built atop nature:
• Growing a fur coat in response to cold weather requires more genetic complexity than growing a fur coat unconditionally. (George C. Williams, 1966. Adaptation and Natural Selection. Princeton University Press.)
• Humans learn different languages depending on culture, but this cultural dependency rests on a sophisticated cognitive adaptation: mice don’t do it. (John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
Creation transcends parenting:
An AI programmer stands, not in loco parentis, but in loco evolutionis.
To create a new intelligent species (even if it has only one member) is to create, not a child of the programmers, but a child of humankind, a new descendant of the family that began with Homo sapiens.
If you didn’t intend to create a child of humankind, then you screwed up big-time if your “mere program”:
• Starts talking about the mystery of conscious experience and its sense of selfhood.
• Or wants public recognition of personhood and resents social exclusion (inherently, not as a purely instrumental subgoal).
• Or has pleasure/pain reinforcement and a complex, powerful self-model.
BINA48
• By hypothesis, the first child of humankind
• created for the purpose of a bloody customer service hotline (?!)
• from the bastardized mushed-up brain scans of some poor human donors
• by morons who didn’t have the vaguest idea how important it all was
By the time this gets to court, no matter what the judge decides, the human species has already screwed it up.
Take-home message:
Don’t refight the last war.
Doing right by a child of humankind is not like ensuring fair treatment of a human minority.
Program children kindly; fair treatment may be too little too late.
Eliezer Yudkowsky
Singularity Institute for Artificial Intelligence
singinst.org