A Bifurcation Theoretical Approach to Solving the Neural Coding Problem

June 28
Albert E. Parker
Complex Biological Systems
Department of Mathematical Sciences
Center for Computational Biology
Montana State University
Collaborators: Tomas Gedeon, Alex Dimitrov, John Miller, and Zane Aldworth
Outline
The Neural Coding Problem
A Clustering Problem
The Role of Bifurcation Theory
A new algorithm to solve the Neural Coding Problem
The Neural Coding Problem
GOAL: To understand the neural code.
EASIER GOAL: We seek an answer to the question,
How does neural activity represent information about environmental stimuli?
“The little fly sitting in the fly’s brain trying to fly the fly”
Looking for the dictionary to the neural code …
encoding
stimulus
X
response
Y
decoding
… but the dictionary is not deterministic!
Given a stimulus, an experimenter observes many different neural responses:
[Figure: a single stimulus X elicits several distinct neural responses Y_i | X, i = 1, 2, 3, 4]
Neural coding is stochastic!!
Similarly, neural decoding is stochastic:
[Figure: a single neural response Y is consistent with several distinct stimuli X_i | Y, i = 1, 2, …, 9]
Probability Framework
[Diagram: environmental stimuli X and neural responses Y, linked by the encoder P(Y|X) and the decoder P(X|Y)]
The Neural Coding Problem:
How to determine the encoder P(Y|X) or the decoder P(X|Y)?
Common Approaches: parametric estimation, linear methods
Difficulty: There is never enough data.
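For reference, the encoder and decoder are tied together by Bayes' rule (a standard identity, not spelled out on the slides), which is why estimating either one runs into the same data limitation:

```latex
% Bayes' rule relating the decoder to the encoder and the stimulus prior:
P(X \mid Y) = \frac{P(Y \mid X)\, P(X)}{P(Y)},
\qquad
P(Y) = \sum_{x} P(Y \mid x)\, P(x).
```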
As we search for an answer to the neural coding problem, we proceed in the spirit of John Tukey:
It is better to be approximately right than exactly wrong.
One Approach: Cluster the responses
[Diagram: stimuli X (L objects {x_i}) and responses Y (K objects {y_i}), related by the joint distribution p(X,Y); the responses are clustered into Y_N (N objects {y_Ni}) by the quantizer q(Y_N|Y)]
• To address the insufficient data problem, one clusters the outputs Y into clusters Y_N so that the information that one can learn about X by observing Y_N, I(X;Y_N), is as close as possible to the mutual information I(X;Y).
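As an illustration (not part of the original slides), here is a minimal Python sketch of the quantities involved, assuming p(X,Y) is stored as an L × K array pXY and the soft clustering q(Y_N|Y) as a K × N array q whose rows sum to one:

```python
import numpy as np

def mutual_information(pXY):
    """I(X;Y) in bits for a joint distribution given as a 2-D array."""
    pX = pXY.sum(axis=1, keepdims=True)
    pY = pXY.sum(axis=0, keepdims=True)
    nz = pXY > 0
    return np.sum(pXY[nz] * np.log2(pXY[nz] / (pX @ pY)[nz]))

def quantized_joint(pXY, q):
    """p(X, Y_N) induced by a soft clustering q(Y_N|Y) given as a K x N array."""
    # p(x, y_N) = sum_y p(x, y) q(y_N | y)
    return pXY @ q

# I(X;Y_N) for a clustering q is then
#     mutual_information(quantized_joint(pXY, q)),
# to be compared with mutual_information(pXY) = I(X;Y).
```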
Two optimization problems which use this approach
• Information Bottleneck Method (Tishby, Pereira, Bialek 1999)
  min_q I(Y;Y_N)  constrained by  I(X;Y_N) ≥ I_0
  max_q −I(Y;Y_N) + β I(X;Y_N)
• Information Distortion Method (Dimitrov and Miller 2001)
  max_q H(Y_N|Y)  constrained by  I(X;Y_N) ≥ I_0
  max_q H(Y_N|Y) + β I(X;Y_N)
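Continuing the sketch above (again an illustration, not the authors' code), the relaxed Lagrangian forms of the two problems can be written directly in terms of those helpers; β ≥ 0 is the annealing parameter:

```python
def entropy(p):
    """Shannon entropy in bits of a 1-D array of probabilities."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def conditional_entropy_YN_given_Y(q, pY):
    """H(Y_N|Y) = sum_y p(y) H(q(.|y))."""
    return np.sum([pY[k] * entropy(q[k]) for k in range(len(pY))])

def information_bottleneck_cost(q, pXY, beta):
    """-I(Y;Y_N) + beta * I(X;Y_N), to be maximized over q."""
    pY = pXY.sum(axis=0)
    pY_YN = pY[:, None] * q            # joint p(Y, Y_N)
    return -mutual_information(pY_YN) + beta * mutual_information(pXY @ q)

def information_distortion_cost(q, pXY, beta):
    """H(Y_N|Y) + beta * I(X;Y_N), to be maximized over q."""
    pY = pXY.sum(axis=0)
    return conditional_entropy_YN_given_Y(q, pY) + beta * mutual_information(pXY @ q)
```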
An annealing algorithm to solve
  max_q (G(q) + β D(q))

Let q_0 be the maximizer of max_q G(q), and let β_0 = 0. For k ≥ 0, let (q_k, β_k) be a solution to max_q (G(q) + β_k D(q)). Iterate the following steps until β_K = β_max for some K.
1. Perform β-step: let β_{k+1} = β_k + d_k where d_k > 0.
2. The initial guess for q_{k+1} at β_{k+1} is q_{k+1}^(0) = q_k + η for some small perturbation η.
3. Optimization: solve max_q (G(q) + β_{k+1} D(q)) to get the maximizer q_{k+1}, using initial guess q_{k+1}^(0).
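A minimal Python sketch of this annealing loop follows. The inner maximization is done here with a generic SLSQP solve over the rows of q on the probability simplex; that inner solver, the fixed step d_beta, and the perturbation scale are illustrative assumptions, not the implementation used by the authors:

```python
import numpy as np
from scipy.optimize import minimize

def anneal(pXY, n_classes, cost, beta_max, d_beta=0.1, perturb=1e-4, seed=0):
    """Track a maximizer of G(q) + beta*D(q) as beta increases from 0 to beta_max.

    `cost(q, pXY, beta)` is the objective to maximize, e.g.
    information_distortion_cost from the sketch above.
    """
    rng = np.random.default_rng(seed)
    K = pXY.shape[1]
    # For the information distortion cost, G(q) = H(Y_N|Y) is maximized by
    # the uniform clustering, so start there with beta_0 = 0.
    q = np.full((K, n_classes), 1.0 / n_classes)
    path = [(0.0, q)]
    # Each row of q must stay on the probability simplex.
    constraints = [{"type": "eq",
                    "fun": lambda x, k=k: x.reshape(K, n_classes)[k].sum() - 1.0}
                   for k in range(K)]
    bounds = [(0.0, 1.0)] * (K * n_classes)
    beta = 0.0
    while beta < beta_max:
        beta += d_beta                              # step 1: beta-step
        # Step 2: perturb the previous maximizer to get the initial guess.
        q0 = np.clip(q + perturb * rng.standard_normal(q.shape), 1e-12, None)
        q0 /= q0.sum(axis=1, keepdims=True)
        # Step 3: maximize G(q) + beta*D(q) (minimize its negative).
        res = minimize(lambda x: -cost(x.reshape(K, n_classes), pXY, beta),
                       q0.ravel(), method="SLSQP",
                       bounds=bounds, constraints=constraints)
        q = res.x.reshape(K, n_classes)
        path.append((beta, q))
    return path
```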
Application of the annealing method to the Information Distortion problem
  max_q (H(Y_N|Y) + β I(X;Y_N))
when p(X,Y) is defined by four Gaussian blobs
[Figure: the joint distribution p(X,Y) over 52 stimuli and 52 responses, and the quantizer q(Y_N|Y) clustering the 52 responses into 4 classes Y_N]
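The slides only state that p(X,Y) consists of four Gaussian blobs over 52 stimuli and 52 responses; the centers and width below are guesses, used purely to build a qualitatively similar test problem for the sketches above:

```python
import numpy as np

def four_blob_joint(n=52, centers=((13, 13), (13, 39), (39, 13), (39, 39)), sigma=4.0):
    """Illustrative joint p(X,Y) on an n x n grid made of four Gaussian blobs.
    Blob centers and width are assumptions, not the original problem's parameters."""
    x = np.arange(n)
    X, Y = np.meshgrid(x, x, indexing="ij")
    p = np.zeros((n, n))
    for cx, cy in centers:
        p += np.exp(-((X - cx) ** 2 + (Y - cy) ** 2) / (2 * sigma ** 2))
    return p / p.sum()
```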
Evolution of the optimal clustering:
Observed Bifurcations for the Four Blob problem:
We just saw the optimal clusterings q* at some β* = β_max. What do the clusterings look like for β < β_max?
Application to cricket sensory data
[Figure: E(X|Y_N), the stimulus means conditioned on each of the classes; typical spike patterns for each class; and the optimal quantizer]
We have used bifurcation theory in the presence of symmetries to completely describe how the optimal clusterings of the responses must evolve…
Symmetries??
[Figure: a clustering q(Y_N|Y) of the K objects {y_i} of Y into the N classes {y_Ni} of Y_N; permuting the class labels (e.g., swapping class 1 and class 3) yields an equivalent clustering]
Observed Bifurcation Structure
[Figure: bifurcation diagram of the optimal clusterings q* as β increases, annotated with the group structure of each branch: symmetry breaking from S_4 through S_3 and S_2 down to the trivial group 1]
Continuation techniques provide numerical confirmation of the theory.
Additional structure!!
Conclusions …
• We have a complete theoretical picture of how the clusterings of the responses evolve for any problem of the form
  max_q (G(q) + β D(q))
  o When clustering to N classes, there are N−1 bifurcations.
  o In general, there are only pitchfork and saddle-node bifurcations.
  o We can determine whether pitchfork bifurcations are subcritical or supercritical (1st or 2nd order phase transitions).
  o We know the explicit bifurcating directions.
• SO WHAT?? This yields a new and improved algorithm for solving the neural coding problem …
A numerical algorithm to solve
  max_q (G(q) + β D(q))

Let q_0 be the maximizer of max_q G(q), β_0 = 1 and Δ_s > 0. For k ≥ 0, let (q_k, β_k) be a solution to max_q (G(q) + β_k D(q)). Iterate the following steps until β_K = β_max for some K.
1. Perform β-step: solve
     ∇_{q,λ} [∇_{q,λ} L(q_k, λ_k, β_k)] (Δq_k, Δλ_k)^T = −∂_β ∇_{q,λ} L(q_k, λ_k, β_k)
   for (Δq_k, Δλ_k), and select β_{k+1} = β_k + d_k, where
     d_k = (Δ_s · sgn(cos θ)) / (||Δq_k||^2 + ||Δλ_k||^2 + 1)^{1/2}.
2. The initial guess for (q_{k+1}, λ_{k+1}) at β_{k+1} is
     (q_{k+1}^(0), λ_{k+1}^(0)) = (q_k, λ_k) + d_k (Δq_k, Δλ_k).
3. Optimization: solve max_q (G(q) + β_{k+1} D(q)) using pseudoarclength continuation to get the maximizer q_{k+1} and the vector of Lagrange multipliers λ_{k+1}, using initial guess (q_{k+1}^(0), λ_{k+1}^(0)).
4. Check for bifurcation: compare the sign of the determinant of an identical block of each of ∇²_q [G(q_k) + β_k D(q_k)] and ∇²_q [G(q_{k+1}) + β_{k+1} D(q_{k+1})]. If a bifurcation is detected, then set q_{k+1}^(0) = q_k + d_k u, where u is the bifurcating direction, and repeat step 3.
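Two small Python helpers sketch the pieces of this algorithm that differ from the basic annealing loop: the adaptive step size of step 1 and the determinant-sign test of step 4. How the Hessian block and the angle θ are constructed is not spelled out on this slide, so they are simply inputs here:

```python
import numpy as np

def adaptive_step(dq, dlam, delta_s, cos_theta):
    """Step 1: d_k = (Delta_s * sgn(cos theta)) / sqrt(||dq||^2 + ||dlam||^2 + 1)."""
    return (delta_s * np.sign(cos_theta)
            / np.sqrt(np.dot(dq, dq) + np.dot(dlam, dlam) + 1.0))

def bifurcation_detected(hess_block_prev, hess_block_curr):
    """Step 4: flag a bifurcation when the determinant of an identical block of
    the Hessian of G(q) + beta*D(q) changes sign between consecutive beta values."""
    return np.sign(np.linalg.det(hess_block_prev)) != np.sign(np.linalg.det(hess_block_curr))
```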