Artificial Intelligence: Can Machines Think?

Can Machines Think?
Peter Bock
Professor of Machine Intelligence and Cognition
Director of Project ALISA
Department of Computer Science
The George Washington University
Background Issues
Assumption: ... the question of whether Machines Can Think ... is about as relevant
as the question of whether Submarines Can Swim. [Dijkstra 1984]
Axiom: The whole is greater than the sum of its parts. [??????????]
Definition: A part of an entity consists exclusively of matter and/or energy. [Bock 2005]
Axiom: The whole is exactly equal to the sum of its parts; if it seems otherwise,
at least one of its parts has been overlooked. [Bock 2005]
Definition: A set may be arbitrarily large and complex. [Cantor 1874]
Fundamental Propositions
Definition: Intelligence is the ability of an entity to synthesize responses that are
significantly correlated with its stimuli. [Bock 1993]
Postulate: Intelligence capacity is a measure of the amount of information
that can be stored in the memory of an entity. [Bock 1993]
Definition: The standard unit of information is the bit; the amount of information, in bits,
is the base-2 logarithm of the number of unique states an entity can be in. [Shannon & Weaver 1949]
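As a quick illustration of that definition (a minimal sketch, not from the slides; the state counts are made up):

```python
import math

def information_bits(num_states: int) -> float:
    """Information content in bits: base-2 logarithm of the number of unique states."""
    return math.log2(num_states)

print(information_bits(2))        # a toggle switch has 2 states  -> 1.0 bit
print(information_bits(2 ** 20))  # an entity with 2**20 states   -> 20.0 bits
```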
Examples of Intelligence Capacity
Entity               Intelligence Capacity (bits)
toggle switch        10^0  = 1
worm                 10^4  = 10,000
sea slug             10^7  = 10,000,000
tiny lizard          10^8  = 100,000,000 = 10 MB
desktop computer     10^10 = 10,000,000,000 = 1 GB
DNA molecule         10^10 = 10,000,000,000 = 1 GB
frog                 10^11 = 100,000,000,000 = 10 GB
mainframe computer   10^12 = 1,000,000,000,000 = 100 GB
dog                  10^14 = 100,000,000,000,000 = 10,000 GB = 10 TB
human being          10^15 = 1,000,000,000,000,000 = 100 TB
human species        10^25 = 10,000,000,000,000,000,000,000,000 = 1 YB
universe             10^84 (number of baryons)
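The byte figures in the table are order-of-magnitude conversions of the bit counts. A small sketch of that conversion (the rounding convention here is my assumption, chosen to match the table: divide by 8 bits per byte, then round to the nearest power of ten):

```python
import math

def approx_bytes(bits: float) -> float:
    """Bits to bytes, rounded to the nearest power of ten (order-of-magnitude only)."""
    return 10 ** round(math.log10(bits / 8))

print(approx_bytes(1e8))   # tiny lizard:  1e7  bytes ~ 10 MB
print(approx_bytes(1e15))  # human being:  1e14 bytes ~ 100 TB
```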
Growth of Computer Memory Capacity
(RAM capacity in bytes)
Generation   Period        Technology    Mainframe RAM   PC RAM   % human
1            1952 - 1958   vacuum tube   0.1 KB          --       --
2            1958 - 1964   transistor    1 KB            --       --
3            1964 - 1970   SSI           10 KB           --       --
4            1970 - 1976   MSI           100 KB          --       --
5            1976 - 1982   LSI           1 MB            100 KB   0.000001
6            1982 - 1988   VLSI          10 MB           1 MB     0.00001
7            1988 - 1994   CISC          100 MB          10 MB    0.0001
8            1994 - 2000   RISC          1 GB            100 MB   0.001
9 (NOW)      2000 - 2006   MP RISC       10 GB           1 GB     0.01
At generation 9 (NOW), mainframe RAM (10 GB) is roughly the intelligence capacity of a frog.
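The table shows RAM growing roughly tenfold per six-year generation. A minimal extrapolation under that assumption (with human capacity taken as ~100 TB from the intelligence-capacity table) suggests why the chart below runs to generation 14, around 2036:

```python
# Extrapolate PC RAM at ~10x per 6-year generation (assumption drawn from the table)
# until it reaches human capacity (~1e14 bytes = 100 TB).
pc_ram_bytes = 1e9          # generation 9 (2000-2006): ~1 GB
generation, year = 9, 2006
human_bytes = 1e14          # ~100 TB

while pc_ram_bytes < human_bytes:
    generation += 1
    year += 6
    pc_ram_bytes *= 10

print(generation, year)     # -> 14, 2036 (order-of-magnitude estimate)
```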
[Figure: Growth of Computer Memory Capacity. Log-scale memory capacity (1 Kilobyte to 1 Petabyte) versus time period (1952-2006, generations 1-9), plotting Mainframe RAM and PC RAM against the human brain level near 1 Petabyte.]
[Figure: Growth of Computer Memory Capacity. The same chart with the speaker's own PC disk capacities and PC RAM capacities overlaid on the Mainframe RAM and PC RAM trends.]
[Figure: Growth of Computer Memory Capacity. The trend extrapolated through generation 14 (1952-2036), with an annotation marking a technology change; RAM capacity reaches the human brain level near 1 Petabyte around 2030-2036.]
Knowledge Acquisition
Definition: Knowledge is the instantiation of intelligence.
Definition: Cognition (Thinking) is the mental process of acquiring, representing,
processing, and applying knowledge.
Programming
10% capacity of the brain   ≈  10^14 bits
1 line of code (rule)       ≈  1000 bits   ≈   100 billion rules
software production rate    ≈  10 lines of code per person-hour
software production time    ≈  10^10 person-hours
                            ≈  10,000,000 person-years !!!
IMPOSSIBLE !!!
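A quick check of that arithmetic (the 2,000 working hours per person-year is my assumption, not the slide's):

```python
brain_bits = 1e14                    # 10% of human capacity, per the slide
rules = brain_bits / 1000            # ~1e11, i.e. 100 billion rules
person_hours = rules / 10            # at 10 lines of code per person-hour -> 1e10
person_years = person_hours / 2000   # ~5e6: the same order of magnitude as the slide's 10,000,000
print(f"{rules:.0e} rules, {person_hours:.0e} person-hours, {person_years:.0e} person-years")
```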
Fact: This approach for achieving robust AI was abandoned in the mid-1980s.
NONETHELESS...
Fact: CYC: rule-based system funded by DARPA and directed by Douglas Lenat
• under construction for more than 20 years at MCC in Texas
• objective is to include 1 billion “common sense” rules
• no significant successes and many, many failures
Knowledge Acquisition
Direct Transfer
10% capacity of the brain   ≈  10^14 bits
data transfer rate          ≈  10^8 bits per second
data transfer time          ≈  10^6 seconds
                            ≈  12 days
GREAT !!!
HOW ???
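The same kind of check for direct transfer, using only the slide's figures:

```python
transfer_seconds = 1e14 / 1e8     # bits / (bits per second) = 1e6 seconds
print(transfer_seconds / 86_400)  # ~11.6 days, i.e. the slide's "about 12 days"
```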
Knowledge Acquisition
Learning
Definition: Learning is the dynamic acquisition and application of knowledge
based on unsupervised and supervised training.
10% capacity of the brain       ≈  10^14 bits
average rate of sensory input   ≈  500,000 bits per second
knowledge acquisition time      ≈  200,000,000 seconds
                                ≈  3500 days (16 hours per day)
                                ≈  10 years
THAT’S BETTER !!!
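And the corresponding check for learning, with the slide's 16 waking hours per day:

```python
acquisition_seconds = 1e14 / 500_000        # ~2e8 seconds of sensory input
days = acquisition_seconds / (16 * 3600)    # ~3,472 days at 16 hours per day
print(f"{days:.0f} days, about {days / 365:.0f} years")  # ~3,500 days, roughly 10 years
```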
Collective Learning Systems (CLS)
[Bock 1976]
Definition: Project ALISA is an adaptive non-parametric parallel-processing
statistical knowledge acquisition and classification system based on
CLS theory. [Bock et al. 1992]
Practical applications are illustrated on my website.
Training Style
[Figure: Source photograph and derived art in the style of Edvard Munch (10 training images); mimicry = 25%, brush size = thick, influence = high. Photograph courtesy of Ben Rubinger.]
[Figure: Source photograph and derived art in the style of Monet (39 training images); mimicry = 28%, brush size = large, influence = high. Photograph courtesy of Ben Rubinger.]
[Figure: Source photograph and derived art in the style of Sam Brown (171 training images); mimicry = 28%, brush size = medium, influence = medium. Photograph courtesy of Ben Rubinger.]
[Figure: Source photograph and derived art trained on brick walls (6 images); mimicry = 24%, brush size = medium, influence = high. Photograph courtesy of Ben Rubinger.]
le début (the beginning)