Storing Memories as Attractors

Part 3: Autonomous Agents
11/8/04

Demonstration of Hopfield Net

[Figure from Solé & Goodwin]
Example of Pattern Restoration

[Figures from Arbib 1995: pattern restoration sequence]
Example of Pattern Completion

[Figures from Arbib 1995: pattern completion sequence]
Example of Association

[Figures from Arbib 1995: pattern association sequence]
Applications of Hopfield Memory

• Pattern restoration
• Pattern completion
• Pattern generalization
• Pattern association
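All four applications use the same attractor dynamics: imprint the patterns into the weights, then let a cue relax to the nearest stored pattern. The following is a minimal runnable sketch in Python/NumPy; the network size, number of patterns, and noise level are illustrative choices, not taken from the lecture's demo.

import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3                                    # n units, p stored patterns
patterns = rng.choice([-1, 1], size=(p, n))      # bipolar patterns to imprint

# Hebbian imprinting: sum of outer products (self-coupling included).
W = (patterns.T @ patterns) / n

def recall(s, max_steps=10):
    # Synchronous sign updates until the state stops changing.
    for _ in range(max_steps):
        s_new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

x = patterns[0]

# Pattern restoration: flip 10% of the bits, then recall.
noisy = x.copy()
noisy[rng.choice(n, size=n // 10, replace=False)] *= -1
print("restored:", np.array_equal(recall(noisy), x))

# Pattern completion: half the pattern is unknown (set arbitrarily to +1).
partial = x.copy()
partial[n // 2:] = 1
print("completed:", np.array_equal(recall(partial), x))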
Hopfield Net for Optimization and for Associative Memory

• For optimization:
  – we know the weights (couplings)
  – we want to know the minima (solutions)
• For associative memory:
  – we know the minima (retrieval states)
  – we want to know the weights
Hebb's Rule

"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
—Donald Hebb (The Organization of Behavior, 1949, p. 62)

Example of Hebbian Learning: Pattern Imprinted

[Figure: the imprinted pattern]

Example of Hebbian Learning: Partial Pattern Reconstruction

[Figure: reconstruction from a partial pattern]
Mathematical Model of Hebbian Learning for One Pattern

Let
$$W_{ij} = \begin{cases} x_i x_j, & \text{if } i \neq j \\ 0, & \text{if } i = j \end{cases}$$

Since $x_i x_i = x_i^2 = 1$, this is $W = \mathbf{x}\mathbf{x}^{\mathsf{T}} - I$.

For simplicity, we will include self-coupling:
$$W = \mathbf{x}\mathbf{x}^{\mathsf{T}}$$
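As a concrete illustration, here is the one-pattern rule in Python/NumPy; the 8-unit bipolar pattern is an arbitrary example, not one from the slides.

import numpy as np

x = np.array([1, -1, -1, 1, 1, -1, 1, 1])        # bipolar pattern, x_i = +/-1
W_no_self = np.outer(x, x) - np.eye(len(x))      # W = x x^T - I (diagonal zeroed)
W = np.outer(x, x)                               # simplified form with self-coupling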
A Single Imprinted Pattern is a Stable State

• Suppose $W = \mathbf{x}\mathbf{x}^{\mathsf{T}}$
• Then $\mathbf{h} = W\mathbf{x} = \mathbf{x}\mathbf{x}^{\mathsf{T}}\mathbf{x} = n\mathbf{x}$, since
$$\mathbf{x}^{\mathsf{T}}\mathbf{x} = \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} (\pm 1)^2 = n$$
• Hence, if the initial state is $\mathbf{s} = \mathbf{x}$, then the new state is $\mathbf{s}' = \operatorname{sgn}(n\mathbf{x}) = \mathbf{x}$
• May be other stable states (e.g., $-\mathbf{x}$)
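A quick numerical check of this claim (self-contained Python/NumPy; the pattern is the same arbitrary example used above):

import numpy as np

x = np.array([1, -1, -1, 1, 1, -1, 1, 1])
n = len(x)
W = np.outer(x, x)                               # W = x x^T
h = W @ x
assert np.array_equal(h, n * x)                  # h = W x = n x
assert np.array_equal(np.sign(h), x)             # sgn(n x) = x: x is stable
assert np.array_equal(np.sign(W @ -x), -x)       # the complement -x is stable too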
Questions

• How big is the basin of attraction of the imprinted pattern?
• How many patterns can be imprinted?
• Are there unneeded spurious stable states?
• These issues will be addressed in the context of multiple imprinted patterns
Imprinting Multiple Patterns

• Let $\mathbf{x}^1, \mathbf{x}^2, \ldots, \mathbf{x}^p$ be patterns to be imprinted
• Define the sum-of-outer-products matrix:
$$W_{ij} = \frac{1}{n} \sum_{k=1}^{p} x_i^k x_j^k$$
$$W = \frac{1}{n} \sum_{k=1}^{p} \mathbf{x}^k (\mathbf{x}^k)^{\mathsf{T}}$$

Definition of Covariance

Consider samples $(x^1, y^1), (x^2, y^2), \ldots, (x^N, y^N)$.

Let $\bar{x} = \langle x^k \rangle$ and $\bar{y} = \langle y^k \rangle$.

Covariance of the $x$ and $y$ values:
$$C_{xy} = \langle (x^k - \bar{x})(y^k - \bar{y}) \rangle$$
$$= \langle x^k y^k - \bar{x} y^k - x^k \bar{y} + \bar{x}\bar{y} \rangle$$
$$= \langle x^k y^k \rangle - \bar{x}\langle y^k \rangle - \langle x^k \rangle \bar{y} + \bar{x}\bar{y}$$
$$= \langle x^k y^k \rangle - \bar{x}\bar{y} - \bar{x}\bar{y} + \bar{x}\bar{y}$$
$$C_{xy} = \langle x^k y^k \rangle - \bar{x}\bar{y}$$
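Both pieces are easy to check numerically. Below is a self-contained sketch in Python/NumPy; the pattern count, network size, and random samples are illustrative.

import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 5
X = rng.choice([-1, 1], size=(p, n))             # rows are the patterns x^k

# Sum-of-outer-products weights as a single matrix product:
# (X^T X)_ij = sum_k x_i^k x_j^k.
W = (X.T @ X) / n

# Covariance identity on scalar samples: <(x - xbar)(y - ybar)> = <x y> - xbar*ybar.
xs, ys = rng.normal(size=200), rng.normal(size=200)
lhs = np.mean((xs - xs.mean()) * (ys - ys.mean()))
rhs = np.mean(xs * ys) - xs.mean() * ys.mean()
assert np.isclose(lhs, rhs)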
Weights & the Covariance Matrix

Sample pattern vectors: $\mathbf{x}^1, \mathbf{x}^2, \ldots, \mathbf{x}^p$.

Covariance of the $i$th and $j$th components:
$$C_{ij} = \langle x_i^k x_j^k \rangle - \bar{x}_i \bar{x}_j$$

If $\forall i: \bar{x}_i = 0$ ($\pm 1$ equally likely in all positions):
$$C_{ij} = \langle x_i^k x_j^k \rangle = \frac{1}{p} \sum_{k=1}^{p} x_i^k x_j^k$$
$$W = \frac{p}{n} C$$

Characteristics of Hopfield Memory

• Distributed ("holographic")
  – every pattern is stored in every location (weight)
• Robust
  – correct retrieval in spite of noise or error in patterns
  – correct operation in spite of considerable weight damage or noise
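Continuing the numeric sketch (restated here so it runs on its own), the relation $W = (p/n)C$ holds exactly once the means are taken as zero, as the slide assumes:

import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 5
X = rng.choice([-1, 1], size=(p, n))             # rows are the patterns x^k
W = (X.T @ X) / n                                # sum-of-outer-products weights
C = (X.T @ X) / p                                # C_ij = <x_i^k x_j^k>, means taken as 0
assert np.allclose((p / n) * C, W)               # W = (p/n) C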
Stability of Imprinted Memories

• Suppose the state is one of the imprinted patterns $\mathbf{x}^m$
• Then:
$$\mathbf{h} = W\mathbf{x}^m = \frac{1}{n}\left[\sum_k \mathbf{x}^k (\mathbf{x}^k)^{\mathsf{T}}\right] \mathbf{x}^m$$
$$= \frac{1}{n}\,\mathbf{x}^m (\mathbf{x}^m)^{\mathsf{T}} \mathbf{x}^m + \frac{1}{n} \sum_{k \neq m} \mathbf{x}^k (\mathbf{x}^k)^{\mathsf{T}} \mathbf{x}^m$$
$$= \mathbf{x}^m + \frac{1}{n} \sum_{k \neq m} (\mathbf{x}^k \cdot \mathbf{x}^m)\, \mathbf{x}^k$$

Interpretation of Inner Products

• $\mathbf{x}^k \cdot \mathbf{x}^m = n$ if they are identical
  – highly correlated
• $\mathbf{x}^k \cdot \mathbf{x}^m = -n$ if they are complementary
  – highly correlated (reversed)
• $\mathbf{x}^k \cdot \mathbf{x}^m = 0$ if they are orthogonal
  – largely uncorrelated
• $\mathbf{x}^k \cdot \mathbf{x}^m$ measures the crosstalk between patterns $k$ and $m$
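The decomposition of $\mathbf{h}$ into signal plus crosstalk can be verified directly (self-contained Python/NumPy sketch with illustrative sizes; with $p \ll n$ the crosstalk term rarely flips any bits):

import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 5
X = rng.choice([-1, 1], size=(p, n))             # rows are the patterns x^k
W = (X.T @ X) / n

m = 0
xm = X[m]
h = W @ xm
crosstalk = sum((X[k] @ xm) * X[k] for k in range(p) if k != m) / n
assert np.allclose(h, xm + crosstalk)            # h = x^m + crosstalk, exactly
print("bits flipped by crosstalk:", int(np.sum(np.sign(h) != xm)))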
Cosines and Inner Products

$$\mathbf{u} \cdot \mathbf{v} = \|\mathbf{u}\|\,\|\mathbf{v}\| \cos\theta_{uv}$$

If $\mathbf{u}$ is bipolar, then $\|\mathbf{u}\|^2 = \mathbf{u} \cdot \mathbf{u} = n$.

Hence, $\mathbf{u} \cdot \mathbf{v} = \sqrt{n}\sqrt{n}\cos\theta_{uv} = n\cos\theta_{uv}$.
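So for bipolar patterns the normalized inner product is itself a cosine, a convenient way to read crosstalk as an angle. A short check (Python/NumPy, illustrative vectors):

import numpy as np

rng = np.random.default_rng(2)
n = 50
u, v = rng.choice([-1, 1], size=(2, n))          # two bipolar vectors
cos_theta = (u @ v) / n                          # u . v = n cos(theta)
print("angle between patterns (degrees):", np.degrees(np.arccos(cos_theta)))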