Spiking Neural Networks: The New Generation of Artificial Neural Networks

Clustering Using Spiking Neural Networks
Biological Neuron:
The Elementary Processing Unit of the Brain
Biological Neuron:
A Generic Structure
[Figure: generic structure of a biological neuron, showing the soma, dendrites, axon, axon terminals, and synapses]
Biological Neuron:
Nerve Impulse Transmission
[Figure: nerve impulse transmission, showing the incoming action potential (spike), the postsynaptic potential it evokes, the resulting change of the membrane potential, the outgoing action potential (spike), and the spike-after potential]
Biological Neuron:
Soma Firing Behavior
Synchrony is the main factor of soma firing.

[Figure: membrane potential u(t) rising from the resting level u_rest(t) as input spikes arrive at times t_1^in, t_2^in, t_3^in, t_4^in; panels a), b), c) compare input spike patterns, and an output spike at t^out is produced only when the inputs are sufficiently synchronous]
Biological Neuron:
Information Coding
Firing rate alone does not carry all the relevant information.
Neurons communicate via exact spike timing.
Neuroscience Models of Neuron:
The Hodgkin-Huxley Model
Alan Lloyd Hodgkin and Andrew Huxley received the Nobel Prize in Physiology or Medicine in 1963.
The Hodgkin-Huxley model is too complicated a neuron model to be used in artificial neural networks.
Neuroscience Models of Neuron:
Leaky Integrate-And-Fire Model
The Leaky Integrate-and-Fire model disregards the refractory behavior of the neuron.
Neuroscience Models of Neuron:
Spike-Response Model
The Spike-Response Model captures the major elements of biological neuron behavior.
Biological Neuron – Computational Intelligence Approach:
The First Generation
The first artificial neuron was proposed by W. McCulloch & W. Pitts in 1943:

y_j = \begin{cases} 1, & \text{if } \sum_{i=1}^{n} w_{ji} x_i \ge \theta_j \\ 0, & \text{if } \sum_{i=1}^{n} w_{ji} x_i < \theta_j \end{cases}
Biological Neuron – Computational Intelligence Approach:
The Second Generation
The multilayer perceptron is a universal approximator:

y_j = f\left( \sum_{i=1}^{n} w_{ji} x_i \right)
Biological Neuron – Computational Intelligence Approach:
Artificial Neurons – Too Artificial?

y = \begin{cases} 1, & \text{spike occurrence} \\ 0, & \text{spike absence} \end{cases}

From the neurophysiological point of view, y indicates the existence of an output spike.

y = \frac{\text{number of spikes}}{\text{time frame}}

From the neurophysiological point of view, y is the firing rate.

Spike timing is not considered at all!
Biological Neuron – Computational Intelligence Approach:
The Third Generation
The spiking neuron model was introduced by J. Hopfield in 1995.

Spiking neural networks are
- biologically more plausible,
- computationally more powerful,
- considerably faster
than networks of the second generation.
Spiking Neural Network:
Overall Architecture
A spiking neural network is a heterogeneous two-layered feedforward network with lateral connections in the second hidden layer.

RN is a receptive neuron, MS is a multiple synapse, SN is a spiking neuron.

[Figure: overall architecture: each input x_i(k), i = 1, ..., n, feeds its own pool of receptive neurons RN_{1,i}, ..., RN_{h,i}; every receptive neuron is connected through a multiple synapse MS_{jli} to every spiking neuron SN_j, j = 1, ..., m, whose output is the firing time t_j^{[1]}(x(k))]
Spiking Neural Network:
Population Coding
Input spike: the i-th input x_i(k) is encoded by its pool of receptive neurons into a set of firing times

t_{li}^{[0]}(x(k)) = t_{\max}^{[0]} \left( 1 - \psi\left( \left| x_i(k) - c_{li}^{[0]} \right|, \sigma_i \right) \right),

where \psi(\cdot, \sigma_i) is the receptive neuron's activation (receptive field) function with center c_{li}^{[0]} and width \sigma_i. The closer x_i(k) lies to the center c_{li}^{[0]}, the earlier the receptive neuron RN_{l,i} fires.

[Figure: pool of receptive neurons RN_{1,i}, RN_{2,i}, RN_{3,i}, ...; for example, the input x_i(k) yields t_{1,i}^{[0]}(x(k)) = 0 and t_{2,i}^{[0]}(x(k)) = 4, while another input x_i(s) yields t_{1,i}^{[0]}(x(s)) = 1 and t_{2,i}^{[0]}(x(s)) = 2]
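A minimal sketch of this population coding scheme in Python, assuming Gaussian receptive fields; the function name, the Gaussian form of psi, and the parameter values are illustrative assumptions, not taken from the slides:

import numpy as np

# Population coding sketch: convert a scalar input into firing times of a pool
# of receptive neurons with Gaussian receptive fields (illustrative choice of psi).
def encode_input(x_i, centers, sigma, t_max=10.0):
    activation = np.exp(-((x_i - centers) ** 2) / (2 * sigma ** 2))  # psi(|x - c|, sigma)
    return t_max * (1.0 - activation)          # strong activation -> early spike

centers = np.linspace(0.0, 1.0, 5)             # receptive neuron centers c_li
print(encode_input(0.3, centers, sigma=0.15))  # earliest spike for the closest center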

Spiking Neural Network:
Multiple Synapse
Spike-response function:

\varepsilon(t) = \frac{t}{\tau} \exp\left( 1 - \frac{t}{\tau} \right) H(t),

where H(t) is the Heaviside step function and \tau is the membrane time constant.

Delayed postsynaptic potential (the p-th subsynapse of the multiple synapse MS_{jli} has its own delay d^p and weight w_{jli}^p):

u_{jli}^{p}(t) = w_{jli}^{p}\, \varepsilon\left( t - \left( t_{li}^{[0]}(x(k)) + d^{p} \right) \right)

Total postsynaptic potential:

u_{jli}(t) = \sum_{p=1}^{q} u_{jli}^{p}(t)

Membrane potential:

u_j(t) = \sum_{i=1}^{n} \sum_{l=1}^{h} u_{jli}(t)

[Figure: the multiple synapse MS_{jli}: the incoming spike t_{li}^{[0]}(x_i(k)) from receptive neuron RN_{li} passes through q parallel subsynapses with delays d^1, d^2, ..., d^q and weights w_{jli}^1, ..., w_{jli}^q; their responses are summed into u_{jli}(t) and fed to the spiking neuron SN_j]
Spiking Neural Network:
Hebbian Learning – WTA and WTM
\Delta t_{jli}^{p} = t_{li}^{[0]}(x_i(k)) + d^{p} - t_j^{[1]}(x(k))

Winner-Takes-All (only the winning neuron \tilde{j}, the first one to fire, is updated):

w_{jli}^{p}(K+1) = \begin{cases} w_{jli}^{p}(K) + \eta_w(K)\, L(\Delta t_{jli}^{p}), & j = \tilde{j} \\ w_{jli}^{p}(K), & j \neq \tilde{j} \end{cases}

Winner-Takes-More* (the winner's neighbors are updated as well, scaled by a neighborhood function \Gamma of the distance in firing times):

w_{jli}^{p}(K+1) = w_{jli}^{p}(K) + \eta_w(K)\, \Gamma(\Delta t_{j\tilde{j}})\, L(\Delta t_{jli}^{p}),
\qquad \Delta t_{j\tilde{j}} = t_{\tilde{j}}^{[1]}(x(k)) - t_j^{[1]}(x(k))

Gaussian learning window:

L(\Delta t) = (1+\beta) \exp\left( -\frac{(\Delta t - \alpha)^2}{2(\kappa - 1)} \right) - \beta,
\qquad \kappa = 1 - \frac{\nu^2}{2 \ln \frac{\beta}{1+\beta}}

*Proposed for the first time at the 11th International Conference on Science and Technology "System Analysis and Information Technologies" (Kyiv, Ukraine, 2009) by Ye. Bodyanskiy and A. Dolotov
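A sketch of the Winner-Takes-All update, assuming the Gaussian learning window above; the learning rate eta and the window parameters alpha, beta, kappa are illustrative choices:

import numpy as np

# Gaussian learning window L(dt): strengthens subsynapses whose delayed input
# arrived close to (slightly before) the output spike, weakens the others.
def learning_window(dt, alpha=-2.3, beta=0.2, kappa=5.0):
    return (1 + beta) * np.exp(-((dt - alpha) ** 2) / (2 * (kappa - 1))) - beta

# Winner-Takes-All: only the winning neuron's multiple synapses are updated.
def wta_update(weights, input_times, delays, t_out, eta=0.05):
    # dt^p_jli for every (receptive neuron, delay) pair
    dt = np.asarray(input_times)[:, None] + np.asarray(delays)[None, :] - t_out
    return weights + eta * learning_window(dt)

w = np.zeros((2, 3))                                   # 2 receptive neurons x 3 subsynapses
w = wta_update(w, input_times=[0.0, 4.0], delays=[0.0, 1.0, 2.0], t_out=3.0)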
Spiking Neural Network:
Image Processing*
[Figure: original image; clustering by a SOM after 50 epochs and by the SNN after 4 epochs]
*In Bionics of Intelligence: 2007, 2 (67), pp. 21-26, by Ye. Bodyanskiy and A. Dolotov
Spiking Neuron:
The Laplace Transform Basis
From the control theory point of view, an action potential (spike) is a signal in pulse-position form:

\mathcal{L}\left\{ \delta\left( t - t(x(k)) \right) \right\} = e^{-t(x(k))\, s}

Thus, the transformation of an action potential into a postsynaptic potential performed by a synapse is nothing other than a pulse-position to continuous-time transformation, and the soma transformation is just the reverse one, from continuous time to pulse position.
Spiking Neuron Synapse:
A 2nd order critically damped response unit *
G(s) = \frac{1}{(\tau_1 s + 1)(\tau_2 s + 1)},
\qquad \tilde\varepsilon(t) = \frac{1}{\tau_1 - \tau_2} \left( e^{-t/\tau_1} - e^{-t/\tau_2} \right)

With \tau_1 = \tau_2 = \tau (critical damping):

\tilde\varepsilon(t) = \frac{t}{\tau^2} e^{-t/\tau},
\qquad \varepsilon(t) = e\tau\, \tilde\varepsilon(t) = \frac{t}{\tau} e^{1 - t/\tau}

so the synapse is described by the transfer function

G_{Synapse}(s) = \frac{e\tau}{(\tau s + 1)^2}

*Proposed for the first time at the 6th International Conference "Information Research and Applications" (Varna, Bulgaria, 2009) by Ye. Bodyanskiy, A. Dolotov, and I. Pliss
Spiking Neuron:
Technically Plausible Description*
Incoming spike:

\mathcal{L}\left\{ \delta\left( t - t_{li}^{[0]}(x(k)) \right) \right\} = e^{-t_{li}^{[0]}(x(k))\, s}

Time delay:

G_{TimeDelay}(s) = e^{-d^{p} s}

Spike-response function:

G_{SRF}(s) = \frac{e\tau}{(\tau s + 1)^2}

Membrane potential:

u_j(s) = \sum_{i=1}^{n} \sum_{l=1}^{h} \sum_{p=1}^{q} w_{jli}^{p}\, e^{-\left( t_{li}^{[0]}(x(k)) + d^{p} \right) s}\, \frac{e\tau}{(\tau s + 1)^2}

Relay (threshold element with firing threshold \theta_s):

\Phi\left[ u_j(t); \theta_s \right] = \frac{\operatorname{sign}\left( u_j(t) - \theta_s \right) + 1}{2}

Outgoing spike:

\mathcal{L}\left\{ \delta\left( t - t_j^{[1]}(x(k)) \right) \right\} = s\, \mathcal{L}\left\{ \Phi\left[ u_j(t); \theta_s \right] \right\}
*Proposed for the first time at the 6th International Conference "Information Research and Applications" (Varna, Bulgaria, 2009) by Ye. Bodyanskiy, A. Dolotov, and I. Pliss
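A minimal sketch of the relay stage in the time domain: the output spike time is the first instant at which the membrane potential reaches the firing threshold (the names below are illustrative):

import numpy as np

# Soma (relay) stage: find t_j^[1], the first time u_j(t) crosses the threshold theta_s.
def first_firing_time(t, u_j, theta_s):
    above = np.nonzero(np.asarray(u_j) >= theta_s)[0]
    return t[above[0]] if above.size else None      # None if the neuron never fires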

Spiking Neuron:
Analog-Digital Architecture*
The analog-digital spiking neuron corresponds entirely to the spike-response model.

[Figure: analog-digital architecture of the spiking neuron SN_j: each receptive neuron RN_{li} emits the spike \delta(t - t_{li}^{[0]}(x_i(k))) with Laplace image e^{-t_{li}^{[0]}(x_i(k)) s}; every multiple synapse MS_{jli} consists of q parallel branches with time-delay blocks e^{-d^{1}s}, ..., e^{-d^{q}s}, second-order blocks e\tau_{PSP}/(\tau_{PSP} s + 1)^2, and weights w_{jli}^{1}, ..., w_{jli}^{q}; the branch outputs are summed into u_{jli}(t), the synapse outputs are summed into the membrane potential u_j(t), a spike-after-potential feedback branch (weight w_{SAP} and a second-order block in \tau_{SAP}) is subtracted, and a threshold element produces the outgoing spike e^{-t_j^{[1]}(x(k)) s}]

*Proposed for the first time in Image Processing / Ed. Yung-Sheng Chen: In-Teh, Vukovar, Croatia, pp. 357-380, by Ye. Bodyanskiy and A. Dolotov
Fuzzy Receptive Neurons*

A pool of receptive neurons is a linguistic variable, and a receptive neuron within a pool is a linguistic term.

[Figure: membership functions \mu_{l,i}(x_i) of the fuzzy receptive neurons (linguistic terms X_{1,i}, X_{2,i}, X_{3,i}, ..., X_{h-1,i}, X_{h,i}) over the input x_i; an input x_i(k) activates the overlapping terms with grades such as \mu_{2,i}(x_i(k)) and \mu_{3,i}(x_i(k))]

*Proposed for the first time in Information Technologies and Computer Engineering: 2009, 2(15), pp. 51-55, by Ye. Bodyanskiy and A. Dolotov
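A sketch of a pool of fuzzy receptive neurons, assuming triangular membership functions and the same time-coding rule as for ordinary receptive neurons (higher membership grade, earlier spike); the term shape, widths, and t_max are illustrative assumptions:

import numpy as np

# Triangular membership function of one linguistic term X_l,i.
def triangular(x, left, center, right):
    return max(0.0, min((x - left) / (center - left), (right - x) / (right - center)))

# Firing times of the pool: t = t_max * (1 - membership grade).
def fuzzy_pool_firing_times(x_i, centers, half_width=0.25, t_max=10.0):
    grades = np.array([triangular(x_i, c - half_width, c, c + half_width) for c in centers])
    return t_max * (1.0 - grades)

print(fuzzy_pool_firing_times(0.4, centers=np.linspace(0.0, 1.0, 5)))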
Fuzzy Spiking Neural Network:
Fuzzy Probabilistic Clustering*
Membership grades of the fuzzy clustering layer are computed directly from the firing times t_j^{[1]}(x(k)) of the spiking neurons SN_1, ..., SN_m:

\mu_j(x(k)) = \frac{\left( t_j^{[1]}(x(k)) \right)^{\frac{2}{1-\zeta}}}{\sum_{l=1}^{m} \left( t_l^{[1]}(x(k)) \right)^{\frac{2}{1-\zeta}}},

where \zeta > 1 is the fuzzifier.

There is no need to calculate cluster centers!

[Figure: fuzzy spiking neural network: pools of fuzzy receptive neurons FRN_{1,i}, ..., FRN_{h,i} feed the spiking neurons SN_1, ..., SN_m through multiple synapses MS_{1,1,1}, ..., MS_{mhn}; a fuzzy clustering layer converts the firing times into membership grades \mu_1(x(k)), ..., \mu_j(x(k)), ..., \mu_m(x(k))]

*Proposed for the first time in Sci. Proc. of Riga Technical University: 2008, 36, pp. 27-33, by Ye. Bodyanskiy and A. Dolotov
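A sketch of the fuzzy probabilistic clustering layer; zeta is the fuzzifier, and the small eps guarding against a zero firing time is an implementation detail, not part of the slides:

import numpy as np

# Probabilistic (FCM-like) membership grades computed from firing times.
def probabilistic_memberships(firing_times, zeta=2.0, eps=1e-9):
    d = np.asarray(firing_times, dtype=float) + eps
    weights = d ** (2.0 / (1.0 - zeta))
    return weights / weights.sum()

print(probabilistic_memberships([4.0, 1.0, 7.0]))   # the earliest-firing neuron gets the largest grade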
Fuzzy Spiking Neural Network:
Fuzzy Possibilistic Clustering*
The possibilistic membership grades and the cluster spread parameters \lambda_j are computed directly from the firing times:

\mu_j(x(k)) = \left( 1 + \left( \frac{\left( t_j^{[1]}(x(k)) \right)^2}{\lambda_j} \right)^{\frac{1}{\zeta - 1}} \right)^{-1}

\lambda_j = \frac{\sum_{k=1}^{N} \mu_j^{\zeta}(x(k)) \left( t_j^{[1]}(x(k)) \right)^2}{\sum_{k=1}^{N} \mu_j^{\zeta}(x(k))}

[Figure: the same fuzzy spiking neural network architecture: pools of fuzzy receptive neurons FRN_{1,i}, ..., FRN_{h,i}, multiple synapses MS_{1,1,1}, ..., MS_{mhn}, spiking neurons SN_1, ..., SN_m, and a fuzzy clustering layer producing \mu_1(x(k)), ..., \mu_j(x(k)), ..., \mu_m(x(k))]

*Proposed for the first time at the 15th Zittau East-West Fuzzy Colloquium (Zittau, Germany, 2008) by Ye. Bodyanskiy, A. Dolotov, I. Pliss, and Ye. Viktorov
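A sketch of the fuzzy possibilistic clustering layer, iterating the two formulas above over a whole data set; zeta, the number of iterations, and the eps safeguard are illustrative choices:

import numpy as np

# Possibilistic membership grades from firing times t_j^[1](x(k)),
# given as an array of shape (N samples, m spiking neurons).
def possibilistic_memberships(firing_times, zeta=2.0, n_iter=10, eps=1e-9):
    d2 = np.asarray(firing_times, dtype=float) ** 2 + eps
    mu = np.full_like(d2, 1.0 / d2.shape[1])                            # uniform initial grades
    for _ in range(n_iter):
        lam = (mu ** zeta * d2).sum(axis=0) / (mu ** zeta).sum(axis=0)  # lambda_j
        mu = 1.0 / (1.0 + (d2 / lam) ** (1.0 / (zeta - 1.0)))
    return mu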
Fuzzy Spiking Neural Network:
Image Processing*
[Figure: original image and training set; clustering by the FSNN after the 4th epoch and by a SOM after the 40th epoch]
*In Proceedings of the 4th International School-Seminar "Theory of Decision Making" (Uzhhorod, Ukraine, 2008) by Ye. Bodyanskiy, A. Dolotov, and I. Pliss
Fuzzy Spiking Neural Network:
Image Processing*
[Figure: original image and training set; clustering by the FSNN after the 3rd epoch and by FCM after the 29th epoch]
*In Proceedings of the 11th International Biennial Baltic Electronics Conference "BEC 2008" (Tallinn/Laulasmaa, Estonia, 2008) by Ye. Bodyanskiy and A. Dolotov
Fuzzy Spiking Neural Network:
Image Processing*
[Figure: original image and training set; clustering by the FSNN after the 1st and 3rd epochs compared with FCM after the 3rd and 30th epochs]
*In Image Processing / Ed. Yung-Sheng Chen: In-Teh, Vukovar, Croatia, pp. 357-380, by Ye. Bodyanskiy and A. Dolotov