Part 3A: Hopfield Network
2/7/12
III. Recurrent Neural Networks

A. The Hopfield Network
Typical Artificial Neuron

[figure: inputs scaled by connection weights and compared against a threshold to produce the output]

[figure: the linear combination of weighted inputs gives the net input (local field), which is passed through an activation function]
Equations

Net input:        hi = (∑_{j=1}^n wij sj) − θ,   or in vector form   h = Ws − θ

New neural state: s′i = σ(hi),                   s′ = σ(h)
Hopfield Network
•  Symmetric weights: wij = wji
•  No self-action: wii = 0
•  Zero threshold: θ = 0
•  Bipolar states: si ∈ {–1, +1}
•  Discontinuous bipolar activation function:

   σ(h) = sgn(h) = { –1,  h < 0
                   { +1,  h > 0
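The update rule above can be sketched in a few lines of NumPy. This is a minimal illustration under the stated conventions (symmetric weights, zero diagonal, θ = 0); the function name and the tiny 3-neuron weight matrix are our own choices, not from the slides:

```python
import numpy as np

def hopfield_update(W, s, i):
    """Update neuron i of bipolar state s: h_i = sum_j w_ij s_j (theta = 0),
    s'_i = sgn(h_i); a zero local field leaves the state unchanged."""
    h = W[i] @ s                       # local field of neuron i
    if h > 0:
        s[i] = 1
    elif h < 0:
        s[i] = -1                      # h == 0: no change
    return s

# Symmetric weights with zero diagonal, as the slide requires.
W = np.array([[ 0.0, 1.0, -1.0],
              [ 1.0, 0.0,  1.0],
              [-1.0, 1.0,  0.0]])
s = np.array([1, -1, 1])
s = hopfield_update(W, s, 1)           # h_1 = 1*1 + 1*1 = 2 > 0
print(s.tolist())                      # -> [1, 1, 1]
```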
What to do about h = 0?
•  There are several options:
   –  σ(0) = +1
   –  σ(0) = –1
   –  σ(0) = –1 or +1 with equal probability
   –  hi = 0 ⇒ no state change (s′i = si)
•  Not much difference, but be consistent
•  Last option is slightly preferable, since it is symmetric
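The options above can be captured in one small helper. A sketch only; the `tie` and `current` parameter names are ours:

```python
import random

def sgn(h, tie="keep", current=None):
    """Bipolar sign with explicit handling of h == 0."""
    if h > 0:
        return 1
    if h < 0:
        return -1
    # h == 0: apply the chosen convention
    if tie == "plus":
        return 1
    if tie == "minus":
        return -1
    if tie == "random":
        return random.choice([-1, 1])
    return current                     # "keep": no state change

print(sgn(2.5))                        # -> 1
print(sgn(-0.1))                       # -> -1
print(sgn(0, tie="keep", current=-1))  # -> -1 (state unchanged)
```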
Positive Coupling
•  Positive sense (sign)
•  Large strength
Negative Coupling
•  Negative sense (sign)
•  Large strength
Weak Coupling
•  Either sense (sign)
•  Little strength
State = –1 & Local Field < 0
h < 0
State = –1 & Local Field > 0
h > 0
State Reverses
h > 0
State = +1 & Local Field > 0
h > 0
State = +1 & Local Field < 0
h < 0
State Reverses
h < 0
NetLogo Demonstration of Hopfield State Updating
Run Hopfield-update.nlogo
Hopfield Net as Soft Constraint Satisfaction System
•  States of neurons as yes/no decisions
•  Weights represent soft constraints between decisions
   –  hard constraints must be respected
   –  soft constraints have degrees of importance
•  Decisions change to better respect constraints
•  Is there an optimal set of decisions that best respects all constraints?
Demonstration of Hopfield Net Dynamics I
Run Hopfield-dynamics.nlogo
Convergence
•  Does such a system converge to a stable state?
•  Under what conditions does it converge?
•  There is a sense in which each step relaxes the “tension” in the system
•  But could a relaxation of one neuron lead to greater tension in other places?
Quantifying “Tension”
•  If wij > 0, then si and sj want to have the same sign (si sj = +1)
•  If wij < 0, then si and sj want to have opposite signs (si sj = –1)
•  If wij = 0, their signs are independent
•  Strength of interaction varies with |wij|
•  Define disharmony (“tension”) Dij between neurons i and j:
   Dij = –si wij sj
   Dij < 0 ⇒ they are happy
   Dij > 0 ⇒ they are unhappy
Total Energy of System
The “energy” of the system is the total “tension” (disharmony) in it:

   E{s} = ∑_{ij} Dij = −∑_{ij} si wij sj
        = −½ ∑_i ∑_{j≠i} si wij sj
        = −½ ∑_i ∑_j si wij sj        (since wii = 0)
        = −½ sᵀWs
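The matrix form on the last line is easy to check numerically against the pairwise sum. A sketch; the random symmetric weights and state are our own test data:

```python
import numpy as np

def energy(W, s):
    """E{s} = -1/2 s^T W s, a direct transcription of the matrix form."""
    return -0.5 * s @ W @ s

rng = np.random.default_rng(0)
n = 6
A = rng.normal(size=(n, n))
W = (A + A.T) / 2                  # symmetric weights...
np.fill_diagonal(W, 0.0)           # ...with no self-action
s = rng.choice([-1, 1], size=n)

# The matrix form agrees with -1/2 sum_i sum_j s_i w_ij s_j:
E_pairs = -0.5 * sum(s[i] * W[i, j] * s[j] for i in range(n) for j in range(n))
print(np.isclose(energy(W, s), E_pairs))   # -> True
```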
Review of Some Vector Notation

   x = (x1, …, xn)ᵀ                                        (column vectors)

   xᵀy = ∑_{i=1}^n xi yi = x ⋅ y                           (inner product)

   xyᵀ = the m×n matrix with entries (xyᵀ)ij = xi yj       (outer product)

   xᵀMy = ∑_{i=1}^m ∑_{j=1}^n xi Mij yj                    (quadratic form)
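These four operations map directly onto NumPy; the illustrative values below are arbitrary:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
M = np.arange(9.0).reshape(3, 3)

inner = x @ y                     # x^T y = sum_i x_i y_i
outer = np.outer(x, y)            # (x y^T)_{ij} = x_i y_j
quad  = x @ M @ y                 # x^T M y = sum_ij x_i M_ij y_j

print(inner)                      # -> 32.0
print(outer[0, 2])                # x_1 * y_3 -> 6.0
print(np.isclose(quad, sum(x[i] * M[i, j] * y[j]
                           for i in range(3) for j in range(3))))  # -> True
```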
Another View of Energy
The energy measures the degree to which the neurons’ states are in disharmony with their local fields (i.e., of opposite sign):

   E{s} = −½ ∑_i ∑_j si wij sj
        = −½ ∑_i si ∑_j wij sj
        = −½ ∑_i si hi
        = −½ sᵀh
Do State Changes Decrease Energy?
•  Suppose that neuron k changes state
•  Change of energy:

   ΔE = E{s′} − E{s}
      = −∑_{ij} s′i wij s′j + ∑_{ij} si wij sj
      = −∑_{j≠k} s′k wkj sj + ∑_{j≠k} sk wkj sj
      = −(s′k − sk) ∑_{j≠k} wkj sj
      = −Δsk hk
      < 0

   (negative because a state change means Δsk has the same sign as hk)
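Since ΔE = −Δsk hk ≤ 0, tracking the energy during asynchronous updating should give a non-increasing sequence. A quick numerical check; the random symmetric weights and the random update order are our own choices:

```python
import numpy as np

def energy(W, s):
    return -0.5 * s @ W @ s

def run_async(W, s, rng, steps=200):
    """Asynchronous updating: pick one neuron at a time, set it to the
    sign of its local field (no change on a zero field), record energy."""
    energies = [energy(W, s)]
    for _ in range(steps):
        k = rng.integers(len(s))
        h = W[k] @ s
        if h != 0:
            s[k] = 1 if h > 0 else -1
        energies.append(energy(W, s))
    return s, energies

rng = np.random.default_rng(1)
n = 10
A = rng.normal(size=(n, n))
W = (A + A.T) / 2                  # symmetric weights
np.fill_diagonal(W, 0.0)           # no self-action
s = rng.choice([-1, 1], size=n)

s, energies = run_async(W, s, rng)
print(bool(np.all(np.diff(energies) <= 1e-12)))   # energy never increases -> True
```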
Energy Does Not Increase
•  In each step in which a neuron is considered for update:
   E{s(t + 1)} – E{s(t)} ≤ 0
•  Energy cannot increase
•  Energy decreases if any neuron changes
•  Must it stop?
Proof of Convergence in Finite Time
•  There is a minimum possible energy:
   –  The number of possible states s ∈ {–1, +1}ⁿ is finite
   –  Hence Emin = min {E(s) | s ∈ {±1}ⁿ} exists
•  A stable state must be reached in a finite number of steps, because each state change strictly decreases the energy and there are only finitely many states
Conclusion
•  If we do asynchronous updating, the Hopfield net must reach a stable, minimum-energy state in a finite number of updates
•  This does not imply that it is a global minimum
Lyapunov Functions
•  A way of showing the convergence of discrete- or continuous-time dynamical systems
•  For a discrete-time system:
   –  need a Lyapunov function E (“energy” of the state)
   –  E is bounded below (E{s} > Emin)
   –  ΔE ≤ (ΔE)max < 0 (energy decreases by at least a certain minimum amount each step)
   –  then the system will converge in finite time
•  Problem: finding a suitable Lyapunov function
Example Limit Cycle with Synchronous Updating

[figure: two neurons with positive mutual coupling (w > 0); under synchronous updating the states (+1, –1) and (–1, +1) alternate forever]
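The two-neuron example can be reproduced in a few lines: with positive mutual coupling and synchronous updating (every neuron reads the *old* state), the state (+1, –1) flips to (–1, +1) and back forever. A sketch; the tie-break convention sgn(0) = +1 is our choice and never triggers here:

```python
import numpy as np

# Two neurons with positive mutual coupling.
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])

def sync_step(W, s):
    """Synchronous update: all neurons computed from the old state at once."""
    h = W @ s
    return np.where(h >= 0, 1, -1)

s = np.array([1, -1])
seen = [s.tolist()]
for _ in range(4):
    s = sync_step(W, s)
    seen.append(s.tolist())
print(seen)   # -> [[1, -1], [-1, 1], [1, -1], [-1, 1], [1, -1]]
```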
The Hopfield Energy Function is Even
•  A function f is odd if f(–x) = –f(x), for all x
•  A function f is even if f(–x) = f(x), for all x
•  Observe:

   E{−s} = −½ (−s)ᵀW(−s) = −½ sᵀWs = E{s}
Conceptual Picture of Descent on Energy Surface
(fig. from Solé & Goodwin)
Energy Surface
(fig. from Haykin, Neural Networks)
Energy Surface + Flow Lines
(fig. from Haykin, Neural Networks)
Flow Lines: Basins of Attraction
(fig. from Haykin, Neural Networks)
Bipolar State Space
Basins in Bipolar State Space
(energy-decreasing paths)
Demonstration of Hopfield Net Dynamics II
Run initialized Hopfield.nlogo
Storing Memories as Attractors
(fig. from Solé & Goodwin)
Example of Pattern Restoration
(sequence of figs. from Arbib 1995)
Example of Pattern Completion
(sequence of figs. from Arbib 1995)
Example of Association
(sequence of figs. from Arbib 1995)
Applications of Hopfield Memory
•  Pattern restoration
•  Pattern completion
•  Pattern generalization
•  Pattern association
Hopfield Net for Optimization and for Associative Memory
•  For optimization:
   –  we know the weights (couplings)
   –  we want to know the minima (solutions)
•  For associative memory:
   –  we know the minima (retrieval states)
   –  we want to know the weights
Hebb’s Rule
“When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”
—Donald Hebb (The Organization of Behavior, 1949, p. 62)

“Neurons that fire together, wire together”
Example of Hebbian Learning: Pattern Imprinted

Example of Hebbian Learning: Partial Pattern Reconstruction
Mathematical Model of Hebbian Learning for One Pattern

   Let Wij = { xi xj,  if i ≠ j
             { 0,      if i = j

   Since xi xi = xi² = 1,   W = xxᵀ − I

For simplicity, we will include self-coupling:   W = xxᵀ
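The one-pattern rule in NumPy (a sketch; the function name and the 4-bit example pattern are ours):

```python
import numpy as np

def hebb_one(x):
    """W = x x^T - I for a single bipolar pattern: zeroing the diagonal
    removes self-coupling, since x_i x_i = 1."""
    W = np.outer(x, x).astype(float)
    np.fill_diagonal(W, 0.0)
    return W

x = np.array([1, -1, -1, 1])
W = hebb_one(x)
print(W[0, 1], W[0, 3])                # -> -1.0 1.0  (signs follow x_i x_j)
print(bool(np.allclose(np.diag(W), 0.0)))   # -> True
```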
A Single Imprinted Pattern is a Stable State
•  Suppose W = xxᵀ
•  Then h = Wx = xxᵀx = nx, since
   xᵀx = ∑_{i=1}^n xi² = ∑_{i=1}^n (±1)² = n
•  Hence, if the initial state is s = x, then the new state is s′ = sgn(nx) = x
•  For this reason, scale W by 1/n
•  May be other stable states (e.g., –x)
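A numerical check of the fixed-point claim, using the 1/n-scaled weights with self-coupling kept (so h = x exactly); the random pattern and size are our own test data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
x = rng.choice([-1, 1], size=n)

W = np.outer(x, x) / n             # W = (1/n) x x^T, self-coupling included
h = W @ x                          # = (1/n) x (x^T x) = x, since x^T x = n
print(bool(np.allclose(h, x)))     # -> True

s_new = np.where(h >= 0, 1, -1)    # sgn (no zero fields occur here)
print(bool(np.array_equal(s_new, x)))                            # -> True
print(bool(np.array_equal(np.where(W @ (-x) >= 0, 1, -1), -x)))  # -> True (-x is also stable)
```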
Questions
•  How big is the basin of attraction of the imprinted pattern?
•  How many patterns can be imprinted?
•  Are there unneeded spurious stable states?
•  These issues will be addressed in the context of multiple imprinted patterns
Imprinting Multiple Patterns
•  Let x^1, x^2, …, x^p be patterns to be imprinted
•  Define the sum-of-outer-products matrix:

   Wij = (1/n) ∑_{k=1}^p xi^k xj^k

   W = (1/n) ∑_{k=1}^p x^k (x^k)ᵀ
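A sketch of the imprinting matrix, checking that at very low load each imprinted pattern is a fixed point of a full update. The function name, sizes, seed, and the choice to zero the diagonal are ours:

```python
import numpy as np

def imprint(patterns):
    """W = (1/n) sum_k x^k (x^k)^T, diagonal zeroed (no self-coupling)."""
    n = patterns.shape[1]
    W = sum(np.outer(x, x) for x in patterns) / n
    np.fill_diagonal(W, 0.0)
    return W

rng = np.random.default_rng(3)
n, p = 100, 3                       # load alpha = p/n = 0.03, far below capacity
X = rng.choice([-1.0, 1.0], size=(p, n))
W = imprint(X)

# At this load the crosstalk is tiny, so each pattern should be a fixed point:
stable = [bool(np.array_equal(np.where(W @ x >= 0, 1, -1), x)) for x in X]
print(stable)                       # -> [True, True, True]
```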
Definition of Covariance

Consider samples (x1, y1), (x2, y2), …, (xN, yN), and write ⟨·⟩ for the sample mean.
Let x̄ = ⟨xk⟩ and ȳ = ⟨yk⟩.

Covariance of x and y values:

   Cxy = ⟨(xk − x̄)(yk − ȳ)⟩
       = ⟨xk yk − x̄ yk − xk ȳ + x̄ ȳ⟩
       = ⟨xk yk⟩ − x̄⟨yk⟩ − ⟨xk⟩ȳ + x̄ ȳ
       = ⟨xk yk⟩ − x̄ ȳ − x̄ ȳ + x̄ ȳ

   Cxy = ⟨xk yk⟩ − x̄ ȳ
Weights & the Covariance Matrix

Sample pattern vectors: x^1, x^2, …, x^p

Covariance of ith and jth components:

   Cij = ⟨xi^k xj^k⟩ − x̄i x̄j

If ∀i: x̄i = 0 (±1 equally likely in all positions):

   Cij = ⟨xi^k xj^k⟩ = (1/p) ∑_{k=1}^p xi^k xj^k

   ∴ nW = pC
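The identity nW = pC can be verified directly. With self-coupling kept in W (diagonal not zeroed) the relation is exact; the sizes and seed below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 20, 6
X = rng.choice([-1.0, 1.0], size=(p, n))    # rows are the patterns x^k

# Hebbian weights, self-coupling kept so the identity holds exactly:
W = sum(np.outer(x, x) for x in X) / n

# Second-moment matrix C_ij = <x_i^k x_j^k> over the p samples,
# i.e. the covariance when every component has zero mean:
C = (X.T @ X) / p

print(bool(np.allclose(n * W, p * C)))      # -> True
```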
Characteristics of Hopfield Memory
•  Distributed (“holographic”)
   –  every pattern is stored in every location (weight)
•  Robust
   –  correct retrieval in spite of noise or error in patterns
   –  correct operation in spite of considerable weight damage or noise
Demonstration of Hopfield Net
Run Malasri Hopfield Demo
Stability of Imprinted Memories
•  Suppose the state is one of the imprinted patterns x^m
•  Then:

   h = Wx^m = (1/n) ∑_k x^k (x^k)ᵀ x^m
     = (1/n) x^m (x^m)ᵀ x^m + (1/n) ∑_{k≠m} x^k (x^k)ᵀ x^m
     = x^m + (1/n) ∑_{k≠m} (x^k ⋅ x^m) x^k
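The last line of the derivation, h = x^m plus a crosstalk term, can be checked numerically (self-coupling kept so the identity is exact; the sizes and seed are our own choices):

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, m = 64, 4, 0
X = rng.choice([-1.0, 1.0], size=(p, n))
W = sum(np.outer(x, x) for x in X) / n      # (1/n) sum_k x^k (x^k)^T

xm = X[m]
h = W @ xm

# Decomposition from the slide: h = x^m + (1/n) sum_{k!=m} (x^k . x^m) x^k
crosstalk = sum((X[k] @ xm) * X[k] for k in range(p) if k != m) / n
print(bool(np.allclose(h, xm + crosstalk)))   # -> True
```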
Interpretation of Inner Products
•  x^k ⋅ x^m = n if they are identical
   –  highly correlated
•  x^k ⋅ x^m = –n if they are complementary
   –  highly correlated (reversed)
•  x^k ⋅ x^m = 0 if they are orthogonal
   –  largely uncorrelated
•  x^k ⋅ x^m measures the crosstalk between patterns k and m
Cosines and Inner Products

   u ⋅ v = ‖u‖ ‖v‖ cos θuv

If u is bipolar, then ‖u‖² = u ⋅ u = n.

Hence, u ⋅ v = √n √n cos θuv = n cos θuv

Hence   h = x^m + ∑_{k≠m} x^k cos θkm
Conditions for Stability

Stability of entire pattern:

   x^m = sgn( x^m + ∑_{k≠m} x^k cos θkm )

Stability of a single bit:

   xi^m = sgn( xi^m + ∑_{k≠m} xi^k cos θkm )
Sufficient Conditions for Instability (Case 1)

Suppose xi^m = −1. Then unstable if:

   (−1) + ∑_{k≠m} xi^k cos θkm > 0

   i.e., ∑_{k≠m} xi^k cos θkm > 1
Sufficient Conditions for Instability (Case 2)

Suppose xi^m = +1. Then unstable if:

   (+1) + ∑_{k≠m} xi^k cos θkm < 0

   i.e., ∑_{k≠m} xi^k cos θkm < −1
Sufficient Conditions for Stability

   | ∑_{k≠m} xi^k cos θkm | ≤ 1

The crosstalk with the sought pattern must be sufficiently small
Capacity of Hopfield Memory
•  Depends on the patterns imprinted
•  If orthogonal, pmax = n
   –  but every state is stable ⇒ trivial basins
•  So pmax < n
•  Let load parameter α = p / n
Single Bit Stability Analysis
•  For simplicity, suppose the x^k are random
•  Then x^k ⋅ x^m are sums of n random ±1s:
   –  binomial distribution ≈ Gaussian
   –  in range –n, …, +n
   –  with mean µ = 0
   –  and variance σ² = n
•  Probability that the sum exceeds t:   ½ [1 − erf( t / √(2n) )]

[See “Review of Gaussian (Normal) Distributions” on course website]
Approximation of Probability

Let crosstalk   Ci^m = (1/n) ∑_{k≠m} xi^k (x^k ⋅ x^m)

We want   Pr{Ci^m > 1} = Pr{nCi^m > n}

Note:   nCi^m = ∑_{k=1, k≠m}^p ∑_{j=1}^n xi^k xj^k xj^m

This is a sum of n(p − 1) ≈ np random ±1s, so its variance is σ² = np.
Probability of Bit Instability

   Pr{nCi^m > n} = ½ [1 − erf( n / √(2np) )]
                 = ½ [1 − erf( √(n / 2p) )]
                 = ½ [1 − erf( 1 / √(2α) )]

(fig. from Hertz & al., Introduction to the Theory of Neural Computation)
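The final expression is a one-liner; evaluated at the α values of the following table it reproduces the tabulated error rates. A sketch; the function and variable names are ours:

```python
import math

def p_error(alpha):
    """Per-bit instability probability 1/2 [1 - erf(1/sqrt(2*alpha))]
    for load parameter alpha = p/n (Gaussian crosstalk approximation)."""
    return 0.5 * (1.0 - math.erf(1.0 / math.sqrt(2.0 * alpha)))

for alpha in (0.105, 0.138, 0.185, 0.37, 0.61):
    print(f"alpha = {alpha:5.3f}  ->  P_error = {p_error(alpha):.4f}")
```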
Tabulated Probability of Single-Bit Instability

   Perror    α
   0.1%      0.105
   0.36%     0.138
   1%        0.185
   5%        0.37
   10%       0.61

(table from Hertz & al., Introduction to the Theory of Neural Computation)
Spurious Attractors
•  Mixture states:
   –  sums or differences of odd numbers of retrieval states
   –  number increases combinatorially with p
   –  shallower, smaller basins
   –  basins of mixtures swamp basins of retrieval states ⇒ overload
   –  useful as combinatorial generalizations?
   –  self-coupling generates spurious attractors
•  Spin-glass states:
   –  not correlated with any finite number of imprinted patterns
   –  occur beyond overload because the weights are effectively random
Basins of Mixture States

   xi^mix = sgn( xi^k1 + xi^k2 + xi^k3 )

[figure: a mixture state x^mix lying between the retrieval states x^k1, x^k2, x^k3]
Fraction of Unstable Imprints (n = 100)
(fig. from Bar-Yam)

Number of Stable Imprints (n = 100)
(fig. from Bar-Yam)

Number of Imprints with Basins of Indicated Size (n = 100)
(fig. from Bar-Yam)
Summary of Capacity Results
•  Absolute limit: pmax < αc n = 0.138 n
•  If a small number of errors in each pattern is permitted: pmax ∝ n
•  If all or most patterns must be recalled perfectly: pmax ∝ n / log n
•  Recall: all this analysis is based on random patterns
•  Unrealistic, but sometimes can be arranged