Storing Memories as Attractors

(fig. from Solé & Goodwin)

Demonstration of Hopfield Net

Get hopfield from CBN site


Example of Pattern Restoration

(sequence of figures from Arbib 1995)

Example of Pattern Completion

(sequence of figures from Arbib 1995)

Example of Association

(sequence of figures from Arbib 1995)

Applications of Hopfield Memory

• Pattern restoration

• Pattern completion

• Pattern generalization

• Pattern association


Hopfield Net for Optimization and for Associative Memory

• For optimization:

– we know the weights (couplings)

– we want to know the minima (solutions)

• For associative memory:

– we know the minima (retrieval states)

– we want to know the weights


Hebb’s Rule

“When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”

—Donald Hebb (The Organization of Behavior, 1949, p. 62)


Example of Hebbian Learning: Pattern Imprinted

(figure)

Example of Hebbian Learning: Partial Pattern Reconstruction

(figure)

Mathematical Model of Hebbian Learning for One Pattern

Let

$$W_{ij} = \begin{cases} x_i x_j, & \text{if } i \neq j \\ 0, & \text{if } i = j \end{cases}$$

Since $x_i x_i = x_i^2 = 1$,

$$W = x x^{\mathsf T} - I$$

For simplicity, we will include self-coupling:

$$W = x x^{\mathsf T}$$
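A minimal NumPy sketch of both forms of the weight matrix, assuming a bipolar ±1 pattern; the example vector and its size are illustrative, not from the slides:

```python
import numpy as np

# Example bipolar pattern (n = 5); any ±1 vector works here.
x = np.array([1, -1, 1, 1, -1])

W_zero_diag = np.outer(x, x) - np.eye(len(x))  # W = x x^T - I (no self-coupling)
W_self      = np.outer(x, x)                   # W = x x^T (self-coupling kept)
```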

A Single Imprinted Pattern is a Stable State

• Suppose $W = x x^{\mathsf T}$

• Then $h = W x = x x^{\mathsf T} x = n x$, since $x^{\mathsf T} x = \sum_{i=1}^{n} x_i^2 = n$

• Hence, if the initial state is $s = x$, then the new state is $s' = \operatorname{sgn}(n x) = x$

• There may be other stable states (e.g., $-x$)
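The fixed-point claim can be checked numerically; a small sketch under the same assumptions (bipolar pattern, self-coupling kept):

```python
import numpy as np

x = np.array([1, -1, 1, 1, -1])
W = np.outer(x, x)                    # W = x x^T, self-coupling included
h = W @ x                             # local field h = W x = n x
assert np.array_equal(np.sign(h), x)          # sgn(n x) = x: stable
assert np.array_equal(np.sign(W @ -x), -x)    # the complement -x is stable too
```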

Questions

• How big is the basin of attraction of the imprinted pattern?

• How many patterns can be imprinted?

• Are there spurious (unwanted) stable states?

• These issues will be addressed in the context of multiple imprinted patterns


Imprinting Multiple Patterns

• Let $x^1, x^2, \ldots, x^p$ be the patterns to be imprinted

• Define the sum-of-outer-products matrix:

$$W_{ij} = \frac{1}{n} \sum_{k=1}^{p} x_i^k x_j^k$$

$$W = \frac{1}{n} \sum_{k=1}^{p} x^k (x^k)^{\mathsf T}$$
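A sketch of the sum-of-outer-products rule in NumPy; the pattern count and dimension are arbitrary choices for illustration:

```python
import numpy as np

def imprint(patterns):
    """Sum-of-outer-products weights: W = (1/n) * sum_k x^k (x^k)^T."""
    n = patterns.shape[1]
    return (patterns.T @ patterns) / n   # equivalent to summing the outer products

rng = np.random.default_rng(0)
X = rng.choice([-1, 1], size=(3, 20))    # p = 3 bipolar patterns, n = 20
W = imprint(X)
```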

Definition of Covariance

Consider samples $(x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N)$

Let $\bar{x} = \langle x_k \rangle$ and $\bar{y} = \langle y_k \rangle$

Covariance of the x and y values:

$$C_{xy} = \langle (x_k - \bar{x})(y_k - \bar{y}) \rangle$$

$$= \langle x_k y_k - \bar{x} y_k - x_k \bar{y} + \bar{x}\bar{y} \rangle$$

$$= \langle x_k y_k \rangle - \bar{x}\langle y_k \rangle - \langle x_k \rangle \bar{y} + \bar{x}\bar{y}$$

$$= \langle x_k y_k \rangle - \bar{x}\bar{y} - \bar{x}\bar{y} + \bar{x}\bar{y}$$

$$C_{xy} = \langle x_k y_k \rangle - \bar{x} \cdot \bar{y}$$
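A quick numerical check of the final identity on arbitrary random data (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
x, y = rng.normal(size=100), rng.normal(size=100)

C_direct   = np.mean((x - x.mean()) * (y - y.mean()))   # <(x_k - x̄)(y_k - ȳ)>
C_shortcut = np.mean(x * y) - x.mean() * y.mean()       # <x_k y_k> - x̄ ȳ
assert np.isclose(C_direct, C_shortcut)
```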

Weights & the Covariance Matrix

Sample pattern vectors: $x^1, x^2, \ldots, x^p$

Covariance of the $i$th and $j$th components:

$$C_{ij} = \langle x_i^k x_j^k \rangle - \bar{x}_i \cdot \bar{x}_j$$

If $\forall i: \bar{x}_i = 0$ ($\pm 1$ equally likely in all positions):

$$C_{ij} = \langle x_i^k x_j^k \rangle = \frac{1}{p} \sum_{k=1}^{p} x_i^k x_j^k$$

$$\therefore W = \frac{p}{n} C$$
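A sketch verifying $W = (p/n)\,C$ under the slide's zero-mean assumption, with $C$ computed as the second-moment matrix $\langle x_i^k x_j^k \rangle$ (which equals the covariance when every component mean vanishes):

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 4, 50
X = rng.choice([-1, 1], size=(p, n))   # rows are the patterns x^k

C = (X.T @ X) / p   # C_ij = <x_i^k x_j^k>; the covariance when x̄_i = 0
W = (X.T @ X) / n   # sum-of-outer-products weights
assert np.allclose(W, (p / n) * C)
```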

Characteristics of Hopfield Memory

• Distributed (“holographic”)

– every pattern is stored in every location (weight)

• Robust

– correct retrieval in spite of noise or error in patterns

– correct operation in spite of considerable weight damage or noise
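A small end-to-end sketch of this robustness (a stand-in for, not a copy of, the hopfield demo mentioned earlier): imprint a few random patterns, corrupt one, and let asynchronous updates settle. With few patterns and mild corruption, it typically restores every bit:

```python
import numpy as np

rng = np.random.default_rng(3)
p, n = 3, 100
X = rng.choice([-1, 1], size=(p, n))
W = (X.T @ X) / n                      # Hebbian weights, self-coupling kept

s = X[0].copy()
flipped = rng.choice(n, size=10, replace=False)
s[flipped] *= -1                       # corrupt 10 of the 100 bits

for _ in range(10):                    # a few asynchronous update sweeps
    for i in rng.permutation(n):
        s[i] = 1 if W[i] @ s >= 0 else -1

print("bits recovered:", int(np.sum(s == X[0])), "of", n)
```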

Stability of Imprinted Memories

• Suppose the state is one of the imprinted patterns, $x^m$

• Then:

$$h = W x^m = \frac{1}{n}\Big[\sum_k x^k (x^k)^{\mathsf T}\Big] x^m = \frac{1}{n}\sum_k x^k (x^k)^{\mathsf T} x^m$$

$$= \frac{1}{n}\, x^m (x^m)^{\mathsf T} x^m + \frac{1}{n}\sum_{k \neq m} x^k (x^k)^{\mathsf T} x^m$$

$$= x^m + \frac{1}{n}\sum_{k \neq m} (x^k \cdot x^m)\, x^k$$
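The signal-plus-crosstalk decomposition can be verified directly; a sketch with arbitrary random patterns:

```python
import numpy as np

rng = np.random.default_rng(4)
p, n, m = 3, 100, 0
X = rng.choice([-1, 1], size=(p, n))
W = (X.T @ X) / n

h = W @ X[m]                                  # local field at pattern x^m
crosstalk = sum((X[k] @ X[m]) * X[k] for k in range(p) if k != m) / n
assert np.allclose(h, X[m] + crosstalk)       # h = x^m + crosstalk
```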

Interpretation of Inner Products

• $x^k \cdot x^m = n$ if they are identical

– highly correlated

• $x^k \cdot x^m = -n$ if they are complementary

– highly correlated (reversed)

• $x^k \cdot x^m = 0$ if they are orthogonal

– largely uncorrelated

• $x^k \cdot x^m$ measures the crosstalk between patterns $k$ and $m$

Cosines and Inner Products

(figure: vectors u and v separated by angle θ_uv)

$$u \cdot v = \|u\|\,\|v\| \cos \theta_{uv}$$

If $u$ is bipolar, then $\|u\|^2 = u \cdot u = n$

Hence, $u \cdot v = \sqrt{n}\,\sqrt{n}\,\cos\theta_{uv} = n \cos\theta_{uv}$
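A short numerical confirmation of $u \cdot v = n \cos\theta_{uv}$ for bipolar vectors:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
u, v = rng.choice([-1, 1], size=(2, n))

cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
assert np.isclose(u @ v, n * cos_theta)   # u . v = n cos(theta_uv)
```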

Conditions for Stability

Substituting $x^k \cdot x^m = n \cos\theta_{km}$ into the previous result, the factor $1/n$ cancels.

Stability of the entire pattern:

$$x^m = \operatorname{sgn}\Big(x^m + \sum_{k \neq m} x^k \cos\theta_{km}\Big)$$

Stability of a single bit:

$$x_i^m = \operatorname{sgn}\Big(x_i^m + \sum_{k \neq m} x_i^k \cos\theta_{km}\Big)$$

Sufficient Conditions for Instability (Case 1)

Suppose $x_i^m = -1$. Then the bit is unstable if:

$$(-1) + \sum_{k \neq m} x_i^k \cos\theta_{km} > 0$$

$$\sum_{k \neq m} x_i^k \cos\theta_{km} > 1$$

Sufficient Conditions for Instability (Case 2)

Suppose $x_i^m = +1$. Then the bit is unstable if:

$$(+1) + \sum_{k \neq m} x_i^k \cos\theta_{km} < 0$$

$$\sum_{k \neq m} x_i^k \cos\theta_{km} < -1$$

Sufficient Conditions for Stability

$$\left| \sum_{k \neq m} x_i^k \cos\theta_{km} \right| \le 1$$

The crosstalk with the sought pattern must be small in magnitude (at most 1 per bit) for retrieval to be stable.
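A sketch that counts how many bits violate this bound for a few random patterns (with $p \ll n$, usually none):

```python
import numpy as np

rng = np.random.default_rng(6)
p, n, m = 3, 100, 0
X = rng.choice([-1, 1], size=(p, n))

cos = (X @ X[m]) / n                      # cos(theta_km) for every pattern k
crosstalk = sum(X[k] * cos[k] for k in range(p) if k != m)
print("bits violating the bound:", int(np.sum(np.abs(crosstalk) > 1)))
```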
