Part 3A: Hopfield Network
9/30/09
III. Recurrent Neural Networks
A. The Hopfield Network
Typical Artificial Neuron
[Figure: a generic artificial neuron. Inputs arrive over connection weights; their linear combination, minus a threshold, gives the net input (local field); an activation function turns the net input into the output.]
Equations
Net input:  h_i = (∑_{j=1}^n w_ij s_j) − θ      (vector form: h = Ws − θ)
New neural state:  s′_i = σ(h_i)                (vector form: s′ = σ(h))

Hopfield Network
•  Symmetric weights: w_ij = w_ji
•  No self-action: w_ii = 0
•  Zero threshold: θ = 0
•  Bipolar states: s_i ∈ {–1, +1}
•  Discontinuous bipolar activation function:
   σ(h) = sgn(h) = –1 if h < 0,  +1 if h > 0
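The update rule above (s′_i = sgn(h_i), with θ = 0) is easy to express in code. Below is a minimal Python/NumPy sketch; it is illustrative only, and the function names are not from the lecture. It computes the local field of one neuron and applies σ = sgn, leaving the state unchanged when h = 0 (one of the conventions discussed on the next slide).

    import numpy as np

    def local_field(W, s, i):
        """Net input h_i = sum_j w_ij s_j (zero threshold)."""
        return W[i] @ s

    def update_neuron(W, s, i):
        """One asynchronous step: s_i' = sgn(h_i); no change when h_i = 0."""
        h = local_field(W, s, i)
        if h > 0:
            s[i] = 1
        elif h < 0:
            s[i] = -1
        return s

Calling update_neuron(W, s, k) for a randomly chosen neuron k gives one asynchronous update of the network.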
What to do about h = 0?
•  There are several options:
  –  σ(0) = +1
  –  σ(0) = –1
  –  σ(0) = –1 or +1 with equal probability
  –  h_i = 0 ⇒ no state change (s′_i = s_i)
•  Not much difference, but be consistent
•  The last option is slightly preferable, since it is symmetric

Positive Coupling
•  Positive sense (sign)
•  Large strength
Weak Coupling
•  Either sense (sign)
•  Little strength

Negative Coupling
•  Negative sense (sign)
•  Large strength
State = –1 & Local Field < 0
[Figure: a neuron in state –1 with local field h < 0]

State = –1 & Local Field > 0
[Figure: a neuron in state –1 with local field h > 0]
State Reverses
[Figure: the neuron flips to +1, in agreement with h > 0]

State = +1 & Local Field > 0
[Figure: a neuron in state +1 with local field h > 0]

State = +1 & Local Field < 0
[Figure: a neuron in state +1 with local field h < 0]

State Reverses
[Figure: the neuron flips to –1, in agreement with h < 0]
Hopfield Net as Soft Constraint Satisfaction System
•  States of neurons as yes/no decisions
•  Weights represent soft constraints between decisions
  –  hard constraints must be respected
  –  soft constraints have degrees of importance
•  Decisions change to better respect constraints
•  Is there an optimal set of decisions that best respects all constraints?

NetLogo Demonstration of Hopfield State Updating
Run Hopfield-update.nlogo
Convergence
•  Does such a system converge to a stable state?
•  Under what conditions does it converge?
•  There is a sense in which each step relaxes the “tension” in the system
•  But could a relaxation of one neuron lead to greater tension in other places?

Demonstration of Hopfield Net Dynamics I
Run Hopfield-dynamics.nlogo
Quantifying “Tension”
•  If w_ij > 0, then s_i and s_j want to have the same sign (s_i s_j = +1)
•  If w_ij < 0, then s_i and s_j want to have opposite signs (s_i s_j = –1)
•  If w_ij = 0, their signs are independent
•  Strength of interaction varies with |w_ij|
•  Define disharmony (“tension”) D_ij between neurons i and j:
     D_ij = − s_i w_ij s_j
     D_ij < 0 ⇒ they are happy
     D_ij > 0 ⇒ they are unhappy

Total Energy of System
The “energy” of the system is the total “tension” (disharmony) in it, summed over distinct pairs {i, j}:
   E{s} = ∑_ij D_ij = −∑_ij s_i w_ij s_j
        = −(1/2) ∑_i ∑_{j≠i} s_i w_ij s_j
        = −(1/2) ∑_i ∑_j s_i w_ij s_j
        = −(1/2) s^T W s
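As a quick check of these definitions, the following sketch (illustrative, assuming NumPy) builds a random symmetric weight matrix, computes the pairwise disharmonies D_ij, and confirms that their sum over distinct pairs equals −(1/2) s^T W s.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 8
    A = rng.normal(size=(n, n))
    W = (A + A.T) / 2            # symmetric weights
    np.fill_diagonal(W, 0.0)     # no self-action
    s = rng.choice([-1, 1], size=n)

    D = -np.outer(s, s) * W      # disharmony D_ij = -s_i w_ij s_j

    E_pairs = sum(D[i, j] for i in range(n) for j in range(i + 1, n))
    E_quad = -0.5 * s @ W @ s
    assert np.isclose(E_pairs, E_quad)   # total tension = -(1/2) s^T W s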
Review of Some Vector Notation
   x = (x_1, …, x_n)^T                              (column vectors)
   x^T y = ∑_{i=1}^n x_i y_i = x · y                (inner product)
   x y^T = the m × n matrix with entries x_i y_j    (outer product)
   x^T M y = ∑_{i=1}^m ∑_{j=1}^n x_i M_ij y_j       (quadratic form)

Another View of Energy
The energy measures the number of neurons whose states are in disharmony with their local fields (i.e. of opposite sign):
   E{s} = −(1/2) ∑_i ∑_j s_i w_ij s_j
        = −(1/2) ∑_i s_i ∑_j w_ij s_j
        = −(1/2) ∑_i s_i h_i
        = −(1/2) s^T h
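The last identity takes only a couple of lines to verify (same illustrative setup as before): with h = Ws, the quantity −(1/2) s · h equals −(1/2) s^T W s.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 8
    A = rng.normal(size=(n, n))
    W = (A + A.T) / 2
    np.fill_diagonal(W, 0.0)
    s = rng.choice([-1, 1], size=n)

    h = W @ s                                          # local fields
    assert np.isclose(-0.5 * s @ h, -0.5 * s @ W @ s)  # -(1/2) s.h = -(1/2) s^T W s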
Do State Changes Decrease Energy?
•  Suppose that neuron k changes state
•  Change of energy (sums over pairs, as before; pairs not involving k cancel):
   ΔE = E{s′} − E{s}
      = −∑_ij s′_i w_ij s′_j + ∑_ij s_i w_ij s_j
      = −∑_{j≠k} s′_k w_kj s_j + ∑_{j≠k} s_k w_kj s_j
      = −(s′_k − s_k) ∑_{j≠k} w_kj s_j
      = −Δs_k h_k
      < 0

Energy Does Not Increase
•  In each step in which a neuron is considered for update:
   E{s(t + 1)} − E{s(t)} ≤ 0
•  Energy cannot increase
•  Energy decreases if any neuron changes
•  Must it stop?
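The result ΔE = −Δs_k h_k can be verified numerically. The illustrative sketch below (NumPy assumed, names hypothetical) flips one neuron according to the update rule and compares the measured energy change with −Δs_k h_k.

    import numpy as np

    def energy(W, s):
        return -0.5 * s @ W @ s

    rng = np.random.default_rng(2)
    n = 10
    A = rng.normal(size=(n, n))
    W = (A + A.T) / 2
    np.fill_diagonal(W, 0.0)
    s = rng.choice([-1.0, 1.0], size=n)

    k = rng.integers(n)                   # neuron considered for update
    h_k = W[k] @ s                        # its local field
    s_new = s.copy()
    s_new[k] = np.sign(h_k) if h_k != 0 else s[k]

    dE = energy(W, s_new) - energy(W, s)
    assert np.isclose(dE, -(s_new[k] - s[k]) * h_k)   # dE = -Δs_k h_k
    assert dE <= 0                                    # energy never increases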
Steps are of a Certain Minimum Size
   h_min = min { |h_k|  such that  h_k = ∑_{j≠k} w_kj s_j  ∧  s ∈ {±1}^n }
   |ΔE| = 2 |h_k| ≥ 2 h_min

Proof of Convergence in Finite Time
•  There is a minimum possible energy:
  –  The number of possible states s ∈ {–1, +1}^n is finite
  –  Hence E_min = min { E(s) | s ∈ {±1}^n } exists
•  Must show it is reached in a finite number of steps
Lyapunov Functions
•  A way of showing the convergence of discrete- or continuous-time dynamical systems
•  For a discrete-time system:
  –  need a Lyapunov function E (“energy” of the state)
  –  E is bounded below (E{s} > E_min)
  –  ΔE < (ΔE)_max ≤ 0 (energy decreases by at least a certain minimum amount each step)
  –  then the system will converge in finite time
•  Problem: finding a suitable Lyapunov function

Conclusion
•  If we do asynchronous updating, the Hopfield net must reach a stable, minimum energy state in a finite number of updates
•  This does not imply that it is a global minimum
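The conclusion can be illustrated directly: keep updating neurons asynchronously until none wants to change, and count the state changes along the way. A minimal sketch (illustrative; NumPy assumed):

    import numpy as np

    def run_to_fixed_point(W, s, rng, max_sweeps=1000):
        """Asynchronous updates until a full sweep produces no change.
        Returns the stable state and the number of state changes made."""
        n = len(s)
        changes = 0
        for _ in range(max_sweeps):
            changed = False
            for i in rng.permutation(n):      # visit neurons in random order
                h = W[i] @ s
                new = np.sign(h) if h != 0 else s[i]
                if new != s[i]:
                    s[i] = new
                    changes += 1
                    changed = True
            if not changed:                   # fixed point reached
                return s, changes
        raise RuntimeError("no convergence (unexpected for symmetric W)")

    rng = np.random.default_rng(3)
    n = 20
    A = rng.normal(size=(n, n))
    W = (A + A.T) / 2
    np.fill_diagonal(W, 0.0)
    s = rng.choice([-1.0, 1.0], size=n)

    s_stable, k = run_to_fixed_point(W, s, rng)
    print("stable after", k, "state changes")
    assert np.all(np.sign(W @ s_stable) == s_stable)  # every neuron agrees with its field

The state reached is a local minimum of the energy, not necessarily the global one.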
Example Limit Cycle with Synchronous Updating
[Figure: two neurons coupled by positive weights (w > 0), illustrating a limit cycle under synchronous updating]

The Hopfield Energy Function is Even
•  A function f is odd if f(–x) = –f(x), for all x
•  A function f is even if f(–x) = f(x), for all x
•  Observe:
   E{−s} = −(1/2) (−s)^T W (−s) = −(1/2) s^T W s = E{s}
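The limit cycle is easy to reproduce with two neurons and a positive coupling (illustrative sketch, not the lecture's NetLogo model): starting from disagreeing states, a synchronous update flips both neurons at once, so the network oscillates forever.

    import numpy as np

    W = np.array([[0.0, 1.0],
                  [1.0, 0.0]])     # two neurons, positive coupling w > 0
    s = np.array([1.0, -1.0])      # start in disagreement

    def sync_update(W, s):
        """Synchronous update: every neuron switches to sgn(h) at once."""
        h = W @ s
        return np.where(h >= 0, 1.0, -1.0)   # tie convention sgn(0) = +1

    for t in range(6):
        print(t, s)
        s = sync_update(W, s)
    # The state alternates between [ 1 -1] and [-1  1]: a 2-cycle, not a fixed point.

With asynchronous updating the same network settles into (+1, +1) or (−1, −1); the convergence argument above relies on updating one neuron at a time.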
Conceptual Picture of Descent on Energy Surface
[Figure from Solé & Goodwin]

Energy Surface
[Figure from Haykin, Neur. Netw.]

Energy Surface + Flow Lines
[Figure from Haykin, Neur. Netw.]

Flow Lines and Basins of Attraction
[Figure from Haykin, Neur. Netw.]
Bipolar State Space
[Figure: the bipolar state space]

Basins in Bipolar State Space
[Figure: basins of attraction, showing energy-decreasing paths]

Demonstration of Hopfield Net Dynamics II
Run initialized Hopfield.nlogo

Storing Memories as Attractors
[Figure from Solé & Goodwin]

Example of Pattern Restoration
[Sequence of figures from Arbib 1995: a corrupted stored pattern is restored over successive updates]
Example of Pattern Completion
[Sequence of figures from Arbib 1995: a partial pattern is completed over successive updates]

Example of Association
[Sequence of figures from Arbib 1995: presentation of one stored pattern retrieves an associated stored pattern]

Applications of Hopfield Memory
•  Pattern restoration
•  Pattern completion
•  Pattern generalization
•  Pattern association
Hopfield Net for Optimization and for Associative Memory
•  For optimization:
  –  we know the weights (couplings)
  –  we want to know the minima (solutions)
•  For associative memory:
  –  we know the minima (retrieval states)
  –  we want to know the weights

Hebb’s Rule
“When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”
—Donald Hebb (The Organization of Behavior, 1949, p. 62)

Example of Hebbian Learning: Pattern Imprinted
[Figure]

Example of Hebbian Learning: Partial Pattern Reconstruction
[Figure]
Mathematical Model of Hebbian Learning for One Pattern
   Let W_ij = x_i x_j  if i ≠ j,   and  W_ij = 0  if i = j
   Since x_i x_i = x_i² = 1,  W = x x^T − I
   For simplicity, we will include self-coupling:  W = x x^T

A Single Imprinted Pattern is a Stable State
•  Suppose W = x x^T
•  Then h = Wx = x x^T x = n x,  since  x^T x = ∑_{i=1}^n x_i² = ∑_{i=1}^n (±1)² = n
•  Hence, if the initial state is s = x, then the new state is s′ = sgn(n x) = x
•  May be other stable states (e.g., –x)
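A sketch of the one-pattern imprint (illustrative, NumPy assumed): form W = x x^T with self-coupling included, as above, and check that both x and −x are fixed points of the update.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 16
    x = rng.choice([-1.0, 1.0], size=n)      # pattern to imprint

    W = np.outer(x, x)                       # Hebbian weights with self-coupling: W = x x^T
    h = W @ x
    assert np.allclose(h, n * x)             # h = x (x^T x) = n x
    assert np.all(np.sign(h) == x)           # so x is a stable state
    assert np.all(np.sign(W @ (-x)) == -x)   # and so is the reversed pattern -x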
Questions
•  How big is the basin of attraction of the imprinted pattern?
•  How many patterns can be imprinted?
•  Are there unneeded spurious stable states?
•  These issues will be addressed in the context of multiple imprinted patterns

Imprinting Multiple Patterns
•  Let x^1, x^2, …, x^p be patterns to be imprinted
•  Define the sum-of-outer-products matrix:
   W_ij = (1/n) ∑_{k=1}^p x_i^k x_j^k
   W = (1/n) ∑_{k=1}^p x^k (x^k)^T
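The sum-of-outer-products rule is just as short in code. The sketch below (illustrative names) imprints p random patterns into an n-neuron network and checks whether each imprinted pattern is a fixed point of a single update pass; for small p/n they normally all are.

    import numpy as np

    def imprint(patterns, n):
        """Sum-of-outer-products rule: W = (1/n) sum_k x^k (x^k)^T."""
        return sum(np.outer(x, x) for x in patterns) / n

    rng = np.random.default_rng(5)
    n, p = 100, 5
    patterns = [rng.choice([-1.0, 1.0], size=n) for _ in range(p)]
    W = imprint(patterns, n)

    for x in patterns:
        recalled = np.where(W @ x >= 0, 1.0, -1.0)   # one synchronous pass
        print("imprint stable:", np.array_equal(recalled, x))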
Definition of Covariance
Consider samples (x^1, y^1), (x^2, y^2), …, (x^N, y^N)
Let ⟨x⟩ and ⟨y⟩ denote the sample means of the x^k and y^k
Covariance of x and y values:
   C_xy = ⟨(x^k − ⟨x⟩)(y^k − ⟨y⟩)⟩
        = ⟨ x^k y^k − ⟨x⟩ y^k − x^k ⟨y⟩ + ⟨x⟩⟨y⟩ ⟩
        = ⟨x^k y^k⟩ − ⟨x⟩⟨y^k⟩ − ⟨x^k⟩⟨y⟩ + ⟨x⟩⟨y⟩
        = ⟨x^k y^k⟩ − ⟨x⟩⟨y⟩ − ⟨x⟩⟨y⟩ + ⟨x⟩⟨y⟩
   C_xy = ⟨x^k y^k⟩ − ⟨x⟩⟨y⟩

Weights & the Covariance Matrix
Sample pattern vectors: x^1, x^2, …, x^p
Covariance of the ith and jth components:
   C_ij = ⟨x_i^k x_j^k⟩ − ⟨x_i⟩⟨x_j⟩
If ∀i : ⟨x_i⟩ = 0 (±1 equally likely in all positions):
   C_ij = ⟨x_i^k x_j^k⟩ = (1/p) ∑_{k=1}^p x_i^k x_j^k
   ∴ W = (p/n) C
Characteristics of Hopfield Memory
•  Distributed (“holographic”)
  –  every pattern is stored in every location (weight)
•  Robust
  –  correct retrieval in spite of noise or error in patterns
  –  correct operation in spite of considerable weight damage or noise

Demonstration of Hopfield Net
Run Malasri Hopfield Demo
Stability of Imprinted Memories
•  Suppose the state is one of the imprinted patterns x^m
•  Then:
   h = W x^m = (1/n) ∑_k x^k (x^k)^T x^m
     = (1/n) x^m (x^m)^T x^m + (1/n) ∑_{k≠m} x^k (x^k)^T x^m
     = x^m + (1/n) ∑_{k≠m} (x^k · x^m) x^k

Interpretation of Inner Products
•  x^k · x^m = n if they are identical
  –  highly correlated
•  x^k · x^m = –n if they are complementary
  –  highly correlated (reversed)
•  x^k · x^m = 0 if they are orthogonal
  –  largely uncorrelated
•  x^k · x^m measures the crosstalk between patterns k and m
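The decomposition of the local field into the sought pattern plus crosstalk can be checked numerically (illustrative sketch, NumPy assumed):

    import numpy as np

    rng = np.random.default_rng(6)
    n, p = 100, 10
    X = rng.choice([-1.0, 1.0], size=(p, n))    # patterns x^1 ... x^p as rows
    W = X.T @ X / n                             # W = (1/n) sum_k x^k (x^k)^T

    m = 0
    h = W @ X[m]
    crosstalk = sum((X[k] @ X[m]) * X[k] for k in range(p) if k != m) / n
    assert np.allclose(h, X[m] + crosstalk)     # h = x^m + (1/n) sum_{k≠m} (x^k · x^m) x^k
    print("largest crosstalk component:", np.abs(crosstalk).max())

If every crosstalk component stays below 1 in magnitude, sgn(h) = x^m and the imprint is stable; that is the condition analyzed on the following slides.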
Cosines and Inner Products
[Figure: vectors u and v separated by angle θ_uv]
   u · v = ‖u‖ ‖v‖ cos θ_uv
   If u is bipolar, then ‖u‖² = u · u = n
   Hence, u · v = √n √n cos θ_uv = n cos θ_uv
   Hence h = x^m + ∑_{k≠m} x^k cos θ_km

Conditions for Stability
   Stability of the entire pattern:
      x^m = sgn( x^m + ∑_{k≠m} x^k cos θ_km )
   Stability of a single bit:
      x_i^m = sgn( x_i^m + ∑_{k≠m} x_i^k cos θ_km )
Sufficient Conditions for Instability (Case 1)
   Suppose x_i^m = −1. Then unstable if:
      (−1) + ∑_{k≠m} x_i^k cos θ_km > 0
      ∑_{k≠m} x_i^k cos θ_km > 1

Sufficient Conditions for Instability (Case 2)
   Suppose x_i^m = +1. Then unstable if:
      (+1) + ∑_{k≠m} x_i^k cos θ_km < 0
      ∑_{k≠m} x_i^k cos θ_km < −1

Sufficient Conditions for Stability
      | ∑_{k≠m} x_i^k cos θ_km | ≤ 1
   The crosstalk with the sought pattern must be sufficiently small

Capacity of Hopfield Memory
•  Depends on the patterns imprinted
•  If orthogonal, p_max = n
  –  but every state is stable ⇒ trivial basins
•  So p_max < n
•  Let load parameter α = p / n
Single Bit Stability Analysis
•  For simplicity, suppose the x^k are random
•  Then the x^k · x^m are sums of n random ±1s
  –  binomial distribution ≈ Gaussian
  –  in range –n, …, +n
  –  with mean µ = 0 and variance σ² = n
•  Probability that such a sum exceeds t:
     (1/2) [ 1 − erf( t / √(2n) ) ]
[See “Review of Gaussian (Normal) Distributions” on course website]

Approximation of Probability
   Let crosstalk C_i^m = (1/n) ∑_{k≠m} x_i^k (x^k · x^m)
   We want Pr{ C_i^m > 1 } = Pr{ n C_i^m > n }
   Note: n C_i^m = ∑_{k≠m} ∑_{j=1}^n x_i^k x_j^k x_j^m
   A sum of n(p − 1) ≈ np random ±1s
   Variance σ² = np
Probability of Bit Instability
   Pr{ n C_i^m > n } = (1/2) [ 1 − erf( n / √(2np) ) ]
                     = (1/2) [ 1 − erf( √(n / 2p) ) ]
                     = (1/2) [ 1 − erf( √(1 / 2α) ) ]
(fig. from Hertz & al., Intr. Theory Neur. Comp.)

Tabulated Probability of Single-Bit Instability
   P_error    α
   0.1%       0.105
   0.36%      0.138
   1%         0.185
   5%         0.37
   10%        0.61
(table from Hertz & al., Intr. Theory Neur. Comp.)
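The formula (and hence the table) can be evaluated directly; the Python standard-library erf is enough for this illustrative sketch.

    from math import erf, sqrt

    def p_error(alpha):
        """Gaussian approximation: P_error = (1/2) [1 - erf(sqrt(1/(2 alpha)))]."""
        return 0.5 * (1.0 - erf(sqrt(1.0 / (2.0 * alpha))))

    for alpha in (0.105, 0.138, 0.185, 0.37, 0.61):
        print(f"alpha = {alpha:5.3f}  ->  P_error = {p_error(alpha):.4f}")
    # Prints roughly 0.001, 0.0036, 0.01, 0.05, 0.10, matching the tabulated pairs.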
Spurious Attractors
•  Mixture states:
  –  sums or differences of odd numbers of retrieval states
  –  number increases combinatorially with p
  –  shallower, smaller basins
  –  basins of mixtures swamp basins of retrieval states ⇒ overload
  –  useful as combinatorial generalizations?
  –  self-coupling generates spurious attractors
•  Spin-glass states:
  –  not correlated with any finite number of imprinted patterns
  –  occur beyond overload because weights effectively random

Basins of Mixture States
[Figure: retrieval states x^k1, x^k2, x^k3 and a mixture state x^mix]
   x_i^mix = sgn( x_i^k1 + x_i^k2 + x_i^k3 )

Fraction of Unstable Imprints (n = 100)
[Figure from Bar-Yam]
Number of Stable Imprints (n = 100)
[Figure from Bar-Yam]
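An experiment along the lines of these figures is straightforward to run (illustrative sketch; exact counts vary with the random patterns): imprint p random patterns in an n = 100 network and count how many remain fixed points.

    import numpy as np

    def stable_imprints(n, p, rng):
        """Imprint p random bipolar patterns; count those that are fixed points."""
        X = rng.choice([-1.0, 1.0], size=(p, n))
        W = X.T @ X / n                           # sum-of-outer-products rule
        stable = 0
        for x in X:
            recalled = np.where(W @ x >= 0, 1.0, -1.0)
            stable += int(np.array_equal(recalled, x))
        return stable

    rng = np.random.default_rng(7)
    n = 100
    for p in (5, 10, 15, 20, 30, 40):
        print(f"p = {p:2d}: {stable_imprints(n, p, rng)} of {p} imprints stable")
    # The count degrades as p approaches and passes the capacity limit near 0.138 n.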
Number of Imprints with Basins of Indicated Size (n = 100)
[Figure from Bar-Yam]

Summary of Capacity Results
•  Absolute limit: p_max < α_c n = 0.138 n
•  If a small number of errors in each pattern is permitted: p_max ∝ n
•  If all or most patterns must be recalled perfectly: p_max ∝ n / log n
•  Recall: all this analysis is based on random patterns
•  Unrealistic, but sometimes can be arranged

Next: Part III B