Kinetic Theory for the Dynamics
of Fluctuation-Driven Neural Systems
David W. McLaughlin
Courant Institute & Center for Neural Science
New York University
http://www.cims.nyu.edu/faculty/dmac/
Toledo – June ‘06
Happy Birthday, Peter & Louis
In collaboration with:
  David Cai
Louis Tao
Michael Shelley
Aaditya Rangan
Visual Pathway: Retina --> LGN --> V1 --> Beyond
Integrate and Fire Representation
$$\tau\,\partial_t v = -(v - V_R) - g\,(v - V_E)$$
$$\sigma\,\partial_t g = -g + f\sum_{l}\delta(t - t_l) + \frac{S_a}{N}\sum_{l,k}\delta\bigl(t - t_l^k\bigr)$$
plus spike firing and reset:
$$v(t_k) = 1; \qquad v(t = t_k^{+}) = 0$$
Nonlinearity from spike-threshold:
Whenever V(x,t) = 1, the neuron "fires", the spike time is recorded,
and V(x,t) is reset to 0.
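A minimal numerical sketch of this single-neuron model (forward Euler with Poisson input; not from the talk, and all parameter values below are illustrative assumptions chosen to put the neuron in a fluctuation-driven, subthreshold-on-average regime):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the talk):
tau, sigma = 0.020, 0.005             # membrane and AMPA synaptic time scales (s)
V_R, V_E, V_T = 0.0, 14.0 / 3.0, 1.0  # reset, excitatory reversal, threshold (normalized units)
f, nu0 = 0.005, 40.0                  # synaptic strength and external Poisson rate (Hz)
dt, T = 1e-4, 5.0                     # Euler time step and total simulated time (s)

v, g = 0.0, 0.0
spike_times = []
for step in range(int(T / dt)):
    g += rng.poisson(nu0 * dt) * f / sigma           # each input spike kicks g by f/sigma
    v += (dt / tau) * (-(v - V_R) - g * (v - V_E))   # tau dv/dt = -(v - V_R) - g (v - V_E)
    g += (dt / sigma) * (-g)                         # sigma dg/dt = -g between input spikes
    if v >= V_T:                                     # threshold: record spike time, reset v
        spike_times.append(step * dt)
        v = V_R

print(f"firing rate ~ {len(spike_times) / T:.1f} Hz")
```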
The “primary visual cortex (V1)” is a “layered structure”,
with O(10,000) neurons per square mm, per layer.
Map of Orientation Preference
O(10⁴) neurons per mm², with both regular & random patterns of neurons' preferences.
Lateral Connections and Orientation -- Tree Shrew
Bosking, Zhang, Schofield & Fitzpatrick
J. Neuroscience, 1997
Line-Motion Illusion (LMI)
Coarse-Grained Asymptotic Representations Needed for "Scale-up"
• Larger lateral area
• Multiple layers
First, tile the cortical layer with coarse-grained (CG) patches
Coarse-Grained Reductions for V1
Average firing rate models [Wilson & Cowan ('72); …; Shelley & McLaughlin ('02)]
Average firing rate of an excitatory (inhibitory) neuron, within the coarse-grained patch located at position x in the cortical layer:
$$m^{\lambda}(x,t), \qquad \lambda = E, I$$
Cortical networks have very "noisy" dynamics:
• Strong temporal fluctuations
• On the synaptic timescale
• Fluctuation-driven spiking
Experimental Observation:
Fluctuations in Orientation Tuning (cat data from Ferster's lab)
Ref: Anderson, Lampl, Gillespie & Ferster, Science 290, 1968–1972 (2000)
Fluctuation-driven spiking (very noisy dynamics, on the synaptic time scale).
[Figure: Solid: average over 72 cycles. Dashed: 10 individual temporal trajectories.]
• To accurately and efficiently describe these networks requires that fluctuations be retained in a coarse-grained representation.
• "PDF" representations, $\rho^{\lambda}(v,g;\,x,t)$, $\lambda = E, I$, will retain fluctuations.
• But they will not be very efficient numerically.
• Needed: a reduction of the PDF representation which retains (1) means and (2) variances.
• Kinetic Theory provides this representation.
Ref: Cai, Tao, Shelley & McLaughlin, PNAS, pp 7757-7762 (2004)
Kinetic Theory begins from PDF representations
$$\rho^{\lambda}(v,g;\,x,t), \qquad \lambda = E, I$$
• Knight & Sirovich;
• Nykamp & Tranchina, Neural Comp. (2001);
• Haskell, Nykamp & Tranchina, Network (2001);
• For convenience of presentation, I'll sketch the derivation for a single CG patch, with 200 excitatory Integrate & Fire neurons.
• First, replace the 200 neurons in this CG cell by an equivalent PDF representation.
• Then derive the kinetic theory from the PDF representation.
• The results extend to interacting CG cells which include inhibition, as well as to different cell types such as "simple" & "complex" cells.
• N excitatory neurons (within one CG cell)
• Random coupling throughout the CG cell
• AMPA synapses (with a short time scale σ)
$$\tau\,\partial_t v_i = -(v_i - V_R) - g_i\,(v_i - V_E)$$
$$\sigma\,\partial_t g_i = -g_i + f\sum_{l}\delta(t - t_l) + \frac{S_a}{N}\sum_{l,k}\delta\bigl(t - t_l^k\bigr)$$
plus spike firing and reset:
$$v_i\bigl(t_i^k\bigr) = 1; \qquad v_i\bigl(t = t_i^{k+}\bigr) = 0$$
Define the population PDF for this CG cell:
$$\rho(g,v,t) \doteq N^{-1}\sum_{i=1}^{N}\,\mathbb{E}\bigl\{\delta[v - v_i(t)]\,\delta[g - g_i(t)]\bigr\},$$
where the expectation E is taken over the Poisson spike train $\{t_l\}$.
Evolution of the PDF $\rho(g,v,t)$, assuming (i) $N \gg 1$, and (ii) the total input to each neuron is a (modulated) Poisson spike train:
$$\partial_t \rho = \tau^{-1}\,\partial_v\bigl\{[(v - V_R) + g\,(v - V_E)]\,\rho\bigr\} + \partial_g\bigl\{(g/\sigma)\,\rho\bigr\}$$
$$\qquad +\; \nu_0(t)\,\bigl[\rho(v,\,g - f/\sigma,\,t) - \rho(v,g,t)\bigr]$$
$$\qquad +\; N\,m(t)\,\bigl[\rho\bigl(v,\,g - S_a/(N\sigma),\,t\bigr) - \rho(v,g,t)\bigr],$$
where $\nu_0(t)$ = modulated rate of the incoming Poisson spike train, and $m(t)$ = average firing rate of the neurons in the CG cell:
$$m(t) = \int J^{(v)}(v,g;\,\rho)\Big|_{v=1}\,dg,$$
and where $J^{(v)}(v,g;\,\rho) = -\tau^{-1}\bigl[(v - V_R) + g\,(v - V_E)\bigr]\rho$.
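Before passing to the large-N, small-f limit, note that ρ(g,v,t) can be estimated directly by Monte Carlo: simulate the N coupled I&F neurons over repeated independent Poisson-input trials and histogram (v_i, g_i). A hedged sketch (not the authors' code; the parameter values and the all-to-all kick of S_a/(Nσ) per cortical spike are my illustrative assumptions, consistent with the equations above):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumptions):
N, trials = 200, 20                   # neurons per CG cell, independent Poisson-input trials
tau, sigma = 0.020, 0.005
V_R, V_E, V_T = 0.0, 14.0 / 3.0, 1.0
f, nu0, S_a = 0.005, 40.0, 0.5        # external strength/rate, total cortical coupling
dt, T = 1e-4, 0.5

samples_v, samples_g = [], []
for _ in range(trials):
    v, g = np.zeros(N), np.zeros(N)
    for step in range(int(T / dt)):
        g += rng.poisson(nu0 * dt, size=N) * f / sigma   # external Poisson kicks
        v += (dt / tau) * (-(v - V_R) - g * (v - V_E))
        g += (dt / sigma) * (-g)
        fired = v >= V_T
        v[fired] = V_R                                   # spike and reset
        # each cortical spike kicks every conductance by S_a/(N*sigma) (self-kick kept for simplicity)
        g += fired.sum() * S_a / (N * sigma)
    samples_v.append(v.copy())
    samples_g.append(g.copy())

# A 2-D histogram over (v, g), pooled across neurons and trials, estimates rho(g, v, T).
rho_hat, v_edges, g_edges = np.histogram2d(
    np.concatenate(samples_v), np.concatenate(samples_g),
    bins=(40, 40), density=True)
```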
In the limits $N \gg 1$ and $f \ll 1$ with $\nu_0 f = O(1)$, the jump terms reduce to drift and diffusion:
$$\partial_t \rho = \tau^{-1}\,\partial_v\bigl\{[(v - V_R) + g\,(v - V_E)]\,\rho\bigr\} + \partial_g\bigl\{\bigl[(g - G(t))/\sigma\bigr]\rho\bigr\} + \frac{\sigma_g^2}{\sigma}\,\partial_{gg}\rho + \ldots$$
where
$$\sigma_g^2 = \frac{\nu_0(t)\,f^2}{2\sigma} + \frac{m(t)\,S_a^2}{2N\sigma}, \qquad G(t) = \nu_0(t)\,f + m(t)\,S_a.$$
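The step from the jump terms to this drift-diffusion form is the usual second-order Taylor expansion of the finite shifts in g; a sketch in the same symbols (my reconstruction of the implicit algebra, not written on the slides):

```latex
\nu_0(t)\bigl[\rho(v,\,g - f/\sigma,\,t) - \rho(v,g,t)\bigr]
  \;\approx\; -\frac{\nu_0 f}{\sigma}\,\partial_g \rho \;+\; \frac{\nu_0 f^2}{2\sigma^2}\,\partial_{gg}\rho,
\qquad
N m(t)\bigl[\rho\bigl(v,\,g - S_a/(N\sigma),\,t\bigr) - \rho(v,g,t)\bigr]
  \;\approx\; -\frac{m S_a}{\sigma}\,\partial_g \rho \;+\; \frac{m S_a^2}{2N\sigma^2}\,\partial_{gg}\rho .
```

Adding these to $\partial_g\{(g/\sigma)\rho\}$ collects the drift toward $G(t) = \nu_0 f + m S_a$ and the diffusion coefficient $\sigma_g^2/\sigma$ quoted above.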
Kinetic Theory Begins from Moments
From $\rho(g,v,t)$, define
$$\rho^{(g)}(g,t) = \int \rho(g,v,t)\,dv, \qquad \rho^{(v)}(v,t) = \int \rho(g,v,t)\,dg,$$
$$\mu_1^{(v)}(v,t) = \int g\,\rho(g,t \mid v)\,dg,$$
where $\rho(g,v,t) = \rho(g,t \mid v)\,\rho^{(v)}(v,t)$.
Recall the diffusion approximation:
$$\partial_t \rho = \tau^{-1}\,\partial_v\bigl\{[(v - V_R) + g\,(v - V_E)]\,\rho\bigr\} + \partial_g\bigl\{\bigl[(g - G(t))/\sigma\bigr]\rho\bigr\} + \frac{\sigma_g^2}{\sigma}\,\partial_{gg}\rho + \ldots$$
First, integrating the $\rho(g,v,t)$ equation over $v$ yields
$$\sigma\,\partial_t \rho^{(g)} = \partial_g\bigl\{[g - G(t)]\,\rho^{(g)}\bigr\} + \sigma_g^2\,\partial_{gg}\rho^{(g)}.$$
Fluctuations in g are Gaussian.
Integrating (g,v,t) eq over g yields:
t (v) = -1v [(v – VR) (v) + 1(v) (v-VE) (v)]
Integrating [g (g,v,t)] eq over g yields an
equation for
1(v)(v,t) =  g (g,tv) dg,
where (g,v,t) = (g,tv) (v)(v,t)
t 1(v) = - -1[1(v) – G(t)]
+ -1{[(v – VR) + 1(v)(v-VE)] v 1(v)}
+ 2(v)/ ((v)) v [(v-VE) (v)]
+ -1(v-VE) v2(v)
where 2(v) = 2(v) – (1(v))2 .
Closure: one obtains
$$\text{(i)}\;\; \partial_v \sigma_2^{(v)} = 0; \qquad \text{(ii)}\;\; \sigma_2^{(v)} = \sigma_g^2.$$
t (v) = -1v [(v – VR) (v) + 1(v)(v-VE) (v)]
t 1(v) = - -1[1(v) – G(t)]
+ -1{[(v – VR) + 1(v)(v-VE)] v 1(v)}
+ g2 / ((v)) v [(v-VE) (v)]
Together with a diffusion eq for (g)(g,t):
 t (g) = g {[g – G(t)]) (g)} + g2 gg (g)
Fluctuation-Driven Dynamics
[Figure: PDFs of v and firing rate (Hz); kinetic theory ("Theory") vs. I&F simulation (solid) vs. Fokker-Planck, and the mean-driven limit (hard thresholding, N → ∞). Parameters: N = 75, σ = 5 ms, S = 0.05, f = 0.01.]
Bistability and Hysteresis
Network of simple, excitatory-only cells, with relatively strong cortical coupling.
[Figure: hysteresis curves. Fluctuation-driven case: N = 16. Mean-driven case: N → ∞.]
Computational Efficiency
• For statistical accuracy in these CG patch settings, kinetic theory is 10³–10⁵ times more efficient than I&F.
Realistic Extensions
Extensions to coarse-grained local patches, to excitatory and inhibitory neurons, and to neurons of different types (simple & complex). The PDF then takes the form
$$\rho^{\lambda,\theta}(v,g;\,x,t),$$
where x is the coarse-grained (patch) label, λ = E, I, and θ labels the cell type.
Three Dynamic Regimes of Cortical Amplification:
1) Weak cortical amplification: no bistability/hysteresis
2) Near-critical cortical amplification
3) Strong cortical amplification: bistability/hysteresis
[Figure (excitatory cells shown): firing rate vs. input conductance for four networks with varying pN: 25 (blue), 50 (magenta), 100 (black), 200 (red). Hysteresis occurs for pN = 100 and 200. Fixed synaptic coupling S_exc/pN.]
Summary
• Kinetic theory is a numerically efficient (10³–10⁵ times more efficient than I&F) and remarkably accurate method for "scale-up".
  Ref: PNAS, pp. 7757–7762 (2004)
• Kinetic theory introduces no new free parameters into the model, and has a large dynamic range, from the rapidly firing "mean-driven" regime to the fluctuation-driven regime.
• Sub-networks of point neurons can be embedded within kinetic theory to capture spike-timing statistics, with a range from test neurons to fully interacting sub-networks.
  Ref: Tao, Cai & McLaughlin, PNAS (2004)
Too good to be true?
What’s missing?
• First, the zeroth moment is more accurate
than the first moment, as in many moment
closures
Too good to be true?
What’s missing?
• Second, again as in many moment closures, existence can fail (Tranchina et al., 2006).
• That is, at low but realistic firing rates, the equations are too rigid to have steady-state solutions which satisfy the boundary conditions.
• Diffusion (in v) fixes this existence problem by introducing boundary layers.
Too good to be true?
What’s missing?
• But a far more serious problem
• Kinetic Theory does not capture detailed
“spike-timing” information
Why does the kinetic theory (a Boltzmann-type approach, in general) not work?
Note:
Ensemble Average (Network Mechanism) ≠ Network Mechanism (Ensemble Average)
Too good to be true?
What’s missing?
• But a far more serious problem
• Kinetic Theory does not capture detailed
“spike-timing” statistics
• And most likely the cortex works, on very short time scales, through neurons correlated by detailed spike timing.
• Take, for example, the line-motion illusion
Line-Motion Illusion (LMI)
[Figure: Stimulus, Model Voltage, and Model NMDA panels (space vs. time, 0–128), individual trials, at 0% 'coarse' and 40% 'coarse' graining.]
Direct 'naïve' coarse-graining may not suffice:
• The priming mechanism relies on recruitment
• Recruitment relies on locally correlated cortical firing events
• A naïve ensemble average destroys locally correlated events
Conclusion
• Kinetic theory is numerically efficient (10³–10⁵ times more efficient than I&F) and remarkably accurate.
• Kinetic theory accurately captures firing rates in fluctuation-dominated systems.
• Kinetic theory does not capture detailed spike-timed correlations, which may be how the cortex works, as it has no time to average.
• So we've returned to integrate & fire networks, and have developed fast "multipole" algorithms for integrate & fire systems (Cai and Rangan, 2005).