Computing with transient network dynamics

Stefan Klampfl, Wolfgang Maass
Institute for Theoretical Computer Science
University of Technology Graz, Austria

December 11, 2008
EPSRC Symposium Workshop on Computational Neuroscience
Computation in cortical circuits

Obviously, computation in the brain is different from computation in computers.
Computers vs. brains

• Computers carry out offline computations
  Different input components are presented all at once, and there is no strict bound on the computation time.
Computers vs. brains

• Typical computations in the brain are online
  In online computations, new input components arrive all the time. A real-time algorithm has to produce an output for each new input component within a fixed time interval D. An anytime algorithm can be prompted at any time to provide its current best guess of a proper output (which should integrate as many of the previously arrived input pieces as possible).
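To make the anytime notion concrete, here is a minimal toy sketch (not from the slides; the class name and parameters are illustrative assumptions): an estimator that integrates each arriving input piece online and can be prompted at any moment for its current best guess.

```python
import numpy as np

class AnytimeMean:
    """Toy anytime algorithm (hypothetical example): integrates input
    pieces as they arrive and can report a best guess at any moment."""

    def __init__(self, dim):
        self.total = np.zeros(dim)  # accumulated evidence
        self.n = 0                  # number of input pieces seen so far

    def update(self, x):
        # Called once per arriving input component (online setting).
        self.total += x
        self.n += 1

    def best_guess(self):
        # May be prompted at any time; integrates all pieces seen so far.
        return self.total / max(self.n, 1)

estimator = AnytimeMean(dim=3)
for t in range(100):
    estimator.update(np.random.randn(3) + 1.0)  # new input piece arrives
    if t % 25 == 0:
        print(t, estimator.best_guess())        # prompted mid-stream
```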
Computers vs. brains

• Obvious differences:
  – Computations in brains result from learning
  – Brains carry out online computations
  – Neural circuits consist of heterogeneous components
• Standard computational models are not adequate for understanding brain computations
An alternative model

• Computations can be viewed as trajectories to an attractor in a dynamical system

Problems with this model:
• Not suitable for online computing
• Conditions implausible for biological circuits
• Attractors cannot be reached in short time
• Data rarely show convergence to an attractor
Computing with trajectories

• Is information encoded in the transient dynamics?

Principal neurons from the locust antennal lobe in response to a specific odor [Rabinovich et al., 2008]
Our approach

• How can temporally stable information be extracted from these transient trajectories?
• We consider the perspective of the "neural user", i.e., projection neurons that read out information from the circuit
  – e.g., a linear discriminator
Our approach

• A linear readout neuron computes the weighted sum of low-pass filtered presynaptic spike trains

[Figure: spikes from presynaptic neurons in the circuit cause postsynaptic potentials in the readout neuron, which are summed with weights w_1, w_2, ..., w_d]

This high-dimensional analog input is called the liquid state.
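The liquid state can be sketched in a few lines: each presynaptic spike adds an exponentially decaying postsynaptic potential, and the readout is a weighted sum of these filtered traces. This is a minimal sketch with assumed toy parameters (firing rates, time constant, random weights), not the circuit model used in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T, dt, tau = 50, 1000, 1e-3, 30e-3    # neurons, steps, step size, PSP time constant

spikes = rng.random((T, d)) < 5 * dt     # Poisson-like spike trains at ~5 Hz
w = rng.standard_normal(d) / np.sqrt(d)  # readout weights (random here; learned in practice)

decay = np.exp(-dt / tau)
x = np.zeros(d)                          # liquid state: low-pass filtered spike trains
for t in range(T):
    x = decay * x + spikes[t]            # each spike adds a decaying PSP
    y = w @ x                            # linear readout at time t
```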
Computing with trajectories

• High dimensionality of the state space facilitates the extraction of temporally stable information

Linear separability of 2 randomly drawn trajectories of length 100 in d dimensions: a projection neuron with d presynaptic inputs can separate almost any pair of trajectories that are each defined by connecting fewer than d randomly drawn points.
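This separability claim is easy to probe numerically. The sketch below (an assumed setup, not from the slides) draws two piecewise-linear trajectories through fewer than d random points in d dimensions, samples 100 points along each, and runs a perceptron, which finds a separating hyperplane whenever one exists.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_seg, n_samples = 20, 10, 100   # n_seg + 1 < d anchor points per trajectory

def random_trajectory():
    # Piecewise-linear curve connecting n_seg + 1 random points in R^d.
    anchors = rng.standard_normal((n_seg + 1, d))
    ts = np.linspace(0, n_seg, n_samples)
    i = np.minimum(ts.astype(int), n_seg - 1)
    frac = (ts - i)[:, None]
    return (1 - frac) * anchors[i] + frac * anchors[i + 1]

X = np.vstack([random_trajectory(), random_trajectory()])
y = np.hstack([np.ones(n_samples), -np.ones(n_samples)])
Xb = np.hstack([X, np.ones((2 * n_samples, 1))])  # append bias term

w = np.zeros(d + 1)
for _ in range(1000):                # perceptron: converges iff separable
    errs = 0
    for xi, yi in zip(Xb, y):
        if yi * (w @ xi) <= 0:
            w += yi * xi
            errs += 1
    if errs == 0:
        break
print("trajectories linearly separated:", errs == 0)
```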
Learning of readout neurons

• How can readout neurons be trained to discriminate between trajectories?
  – Supervised learning (e.g., linear classifier)
  – Reinforcement learning (reward-modulated STDP [Izhikevich, 2007; Legenstein et al., 2008])
• I will focus on unsupervised learning
  – Slow Feature Analysis (SFA) [Wiskott and Sejnowski, 2002]
Slow Feature Analysis

• Based on slowness as a principle for learning invariances
• Given a multi-dimensional time series x(t), it finds those functions g_i which generate the most slowly varying signals y_i(t) = g_i(x(t))
• It minimizes $\Delta(y_i) = \langle \dot{y}_i^2 \rangle_t$ s.t. $\langle y_i \rangle_t = 0$ (zero mean), $\langle y_i^2 \rangle_t = 1$ (unit variance), and $\langle y_i y_j \rangle_t = 0$ for $j < i$ (decorrelation)
• g_i are instantaneous functions of the input x(t)
Slow Feature Analysis

• Consider the special case where
  – the input x is whitened: $\langle x \rangle_t = 0$ and $\langle x x^T \rangle_t = I$, and
  – the functions are restricted to the set of linear functions $g(x) = w^T x$
• Minimize $\Delta(y) = w^T \langle \dot{x} \dot{x}^T \rangle_t \, w$ under the constraints $\|w_i\| = 1$ and $w_i^T w_j = 0$ for $j < i$
• SFA finds the normed eigenvector of $\langle \dot{x} \dot{x}^T \rangle_t$ corresponding to the smallest eigenvalue
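In this linear case the whole algorithm reduces to whitening followed by an eigendecomposition of the covariance of the temporal derivative. A minimal sketch (the toy input below is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
T, d = 5000, 10
slow = np.sin(2 * np.pi * 0.001 * np.arange(T))  # hidden slowly varying signal
X = 0.1 * rng.standard_normal((T, d))
X[:, 0] += slow
X = X @ rng.standard_normal((d, d))              # random mixing

# Whitening: zero mean, identity covariance
X -= X.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
Xw = X @ Vt.T / (S / np.sqrt(T))

dX = np.diff(Xw, axis=0)                         # temporal derivative x'(t)
C = dX.T @ dX / (T - 1)                          # <x' x'^T>_t
evals, evecs = np.linalg.eigh(C)                 # eigenvalues in ascending order
w = evecs[:, 0]                                  # normed eigenvector, smallest eigenvalue
y = Xw @ w                                       # slowest feature; recovers the slow signal
```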
SFA for classification

• Linear SFA can be related to Fisher's Linear Discriminant (FLD)
• Convert the input to the classification problem into a time series for SFA
• At each time step, select a random point from a class chosen by a Markov model
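The conversion can be sketched as follows (assumed two-class toy data; the switching probability is an illustrative choice): a Markov chain picks the class at each time step, and a random point of that class becomes the next element of the time series fed to SFA.

```python
import numpy as np

rng = np.random.default_rng(3)
T, p_switch = 5000, 0.01                       # low transition probability between classes
class_means = np.array([[0.0, 0.0], [3.0, 1.0]])

c, xs, labels = 0, [], []
for t in range(T):
    if rng.random() < p_switch:                # Markov transition between classes
        c = 1 - c
    xs.append(class_means[c] + rng.standard_normal(2))  # random point from class c
    labels.append(c)
x = np.array(xs)   # time series for SFA; the class label varies slowly over time
```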
SFA for classification

• If the transition probabilities between classes are low, the eigenvalues (of the SFA and FLD problems) are the same
• If the class is slowly varying, SFA finds the same subspace as FLD
Training readouts with SFA

• Our model of a cortical microcircuit:

Computer model (560 neurons) from [Haeusler and Maass, 2007], based on data from the labs of Alex Thomson and Henry Markram; simulated with single-compartment HH neurons and conductance-based synapses with short-term plasticity, with noise reflecting in-vivo conditions according to Destexhe et al. We have applied long-term plasticity so far only to projection neurons in this model.
Training readouts with SFA

• Isolated spoken digits task [Hopfield and Brody]
• Inputs preprocessed with a cochlea model [Lyon, 1982]
Training readouts with SFA

• Linear SFA applied to a long random sequence of circuit trajectories in response to different versions of the words "one" and "two" (of a single speaker)

SFA output in response to 5 test trajectories of each class: slow features encode "pattern location" (where-information) and pattern identity (what-information)
Liquid Computing model

• The computational performance of a cortical microcircuit should be judged on the basis of how well it supports the task of readout neurons
• A microcircuit could support the learning capability of linear projection neurons by providing:
  – analog fading memory (to accumulate information over time in the state)
  – nonlinear projection into high-dimensional space (kernel property)
Liquid Computing model

• The Liquid State Machine (LSM) generalizes finite state machines to continuous input values u(s), continuous output values y(t), and continuous time t [Maass, Natschläger, Markram, 2002]
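A highly simplified, discrete-time, rate-based sketch of this scheme (the actual model uses spiking HH neurons; all parameters here are illustrative assumptions): the liquid maps the input stream u into a state x(t), and only the memoryless readout f is trained.

```python
import numpy as np

rng = np.random.default_rng(4)
N, T = 100, 500
W = 0.9 * rng.standard_normal((N, N)) / np.sqrt(N)  # recurrent "liquid" weights
w_in = rng.standard_normal(N)                       # input weights

u = np.sin(0.1 * np.arange(T)) + 0.1 * rng.standard_normal(T)
X, x = np.zeros((T, N)), np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])                # liquid state x(t) = (L u)(t)
    X[t] = x

# Train only the linear readout, e.g. to output the delayed input u(t - 10)
target = np.roll(u, 10)
w_out = np.linalg.lstsq(X[10:], target[10:], rcond=None)[0]
y = X @ w_out                                       # readout y(t) = f(x(t))
```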
Liquid Computing model

• What is the computational power of this model?
  – If the dynamical system L is sufficiently large and consists of sufficiently diverse components, the readout function f can be trained to approximate any Volterra series [Maass, Markram, 2004]
  – If one allows feedback from the readout into the dynamical system, then this model becomes universal for analog (and digital) computation on input streams already for rather simple dynamical systems L (in particular, it can simulate any Turing machine) [Maass, Joshi, Sontag, 2007]
Liquid Computing model

• Experimentally testable predictions of the liquid computing model for cortical microcircuits:
  – Temporal integration of information (fading memory)
  – General-purpose nonlinear preprocessing (kernel property)
Data from visual cortex

Recordings by D. Nikolic from primary visual cortex of anaesthetized cat [Nikolic, Haeusler, Singer, Maass, 2007]
Data from visual cortex

[Figure: classification performance (% correct) and mean firing rate [Hz] over 0-700 ms for Cats 1-3, for letter conditions A|C, A|D, B, E]

• Information about previously shown letters is maintained during presentation of subsequent letters (temporal integration)
Data from visual cortex

• Information from two subsequent letters is nonlinearly combined in the circuit (kernel property)
Data from auditory cortex

• Recordings with 4 electrodes from area A1 in awake ferrets (unpublished data from the lab of Shihab Shamma)

[Unpublished experimental data was removed from this publicly available version]
Data from auditory cortex

• Information is maintained during presentation of the next tone (temporal integration)

[Unpublished experimental data was removed from this publicly available version]
Data from auditory cortex

• Almost all information contained in the spike trains can be extracted by linear classifiers (kernel property)

[Unpublished experimental data was removed from this publicly available version]
Summary

• I have presented a model for online computing with trajectories in cortical microcircuits
• It views cortical computations from the perspective of generic preprocessing for learning
• Slow Feature Analysis (SFA) serves as an unsupervised mechanism for training readouts
• Predictions of this model (temporal integration of information, kernel property) have been tested against experimental data
Thank you for your attention
References

Rabinovich, Huerta, and Laurent. Transient dynamics for neural processing. Science, 321:48-50, 2008.

D. Nikolic, S. Haeusler, W. Singer, and W. Maass. Temporal dynamics of information content carried by neurons in the primary visual cortex. In Proc. of NIPS 2006, Advances in Neural Information Processing Systems, volume 19, pages 1041-1048. MIT Press, 2007.

E. M. Izhikevich. Solving the distal reward problem through linkage of STDP and dopamine signaling. Cerebral Cortex, 17:2443-2452, 2007.

R. Legenstein, D. Pecevski, and W. Maass. A learning theory for reward-modulated spike-timing-dependent plasticity with application to biofeedback. PLoS Computational Biology, 4(10):1-27, 2008.

L. Wiskott and T. J. Sejnowski. Slow feature analysis: unsupervised learning of invariances. Neural Computation, 14:715-770, 2002.

S. Häusler and W. Maass. A statistical analysis of information processing properties of lamina-specific cortical microcircuit models. Cerebral Cortex, 17(1):149-162, 2007.

W. Maass and H. Markram. On the computational power of recurrent circuits of spiking neurons. Journal of Computer and System Sciences, 69(4):593-616, 2004.

W. Maass, P. Joshi, and E. D. Sontag. Computational aspects of feedback in neural circuits. PLoS Computational Biology, 3(1):e165, 1-20, 2007.