
Noise in the nervous system:
Stochastic Resonance
Jaeseung Jeong, Ph.D
Department of Bio and Brain Engineering,
KAIST
Several sources of Noise in the Brain
• Thermal noise
• Cellular noise: Stochastic opening and closing of ion channels
• Membrane voltage fluctuations in the axons and dendrites
• Synaptic noise: Spontaneous release of vesicles in the synapses
• Sensory and motor noise:
random voltage fluctuations in sensory and motor fibers
• Environmental (stimulus) noise
Cortical variability
Cortical variability: cellular noise
• a. The shift of the overall spike pattern across rows reflects the
average propagation speed of the action potentials (APs). The raster
plot of the somatic measurement reflects spike-time variability from AP
initiation. Owing to channel noise, the spike-time variability
rapidly increases the further the AP propagates, eventually
reaching the order of milliseconds.
• b. Trial-to-trial variability of synaptic transmission measured in
vitro by paired patch-clamp recordings in rat somatosensory
cortex slices. Six consecutive postsynaptic responses (black
traces) to an identical presynaptic-stimulation pattern (top
trace) are shown, along with the ensemble mean response
(grey trace) from over 50 trials.
What is Stochastic Resonance?
• Stochastic resonance is a phenomenon in which a nonlinear
system is subjected to a periodic signal so weak that it is
normally undetectable, but the signal becomes detectable owing to
resonance between the weak deterministic signal and stochastic noise.
• The earliest definition of stochastic resonance referred to the
maximum of the output signal strength as a function of noise
intensity (Bulsara and Gammaitoni, 1996).
Ice-age cycle of Earth
Noise-aided hopping events
• Surmounting the barrier requires a certain amount of force. Suppose
the ball is subjected to a force which varies in time sinusoidally, but
is too weak to push the ball over the barrier.
• If we add a random “noise” component to the forcing, then the ball
will occasionally be able to hop over the barrier.
• The presence of the sinusoid can then be seen as a peak in the
power spectrum of the time series of noise-aided hopping events.
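• A minimal simulation sketch of such noise-aided hopping (Python; the
standard quartic double-well V(x) = -x^2/2 + x^4/4 and all parameter
values are illustrative assumptions, not taken from the slides):
```python
import numpy as np

# Minimal sketch: overdamped particle in the quartic double-well
# V(x) = -x^2/2 + x^4/4, driven by a weak sinusoid plus Gaussian white noise.
# Amplitude A, frequency f, noise intensity D and step dt are illustrative choices.
rng = np.random.default_rng(0)
dt, n_steps = 0.01, 200_000
A, f, D = 0.1, 0.01, 0.1

t = np.arange(n_steps) * dt
x = np.empty(n_steps)
x[0] = -1.0                                   # start in the left well
for i in range(1, n_steps):
    force = x[i-1] - x[i-1]**3 + A * np.sin(2 * np.pi * f * t[i-1])
    x[i] = x[i-1] + force * dt + np.sqrt(2 * D * dt) * rng.standard_normal()

hops = np.sign(x)                             # two-state record of the hopping events
spectrum = np.abs(np.fft.rfft(hops - hops.mean()))**2
k = int(round(f * n_steps * dt))              # index of the forcing-frequency bin
background = np.median(spectrum[k + 5:k + 50])
# A ratio well above 1 means the weak sinusoid shows up as a peak in the
# power spectrum of the hopping events, even though it cannot by itself
# push the particle over the barrier.
print("power at forcing frequency / local background ~",
      round(float(spectrum[k] / background), 1))
```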
Signal and noise
• We can visualize the sinusoidal forcing as a tilting of the
container. In the time series below, the gray background
represents the time-varying depth of the wells with respect to
the barrier. The red trace represents the position of the ball.
Physical picture of Stochastic Resonance
• If the particle is excited by a small sinusoidal force, it will
oscillate within one of the two wells. But if the particle is also
excited by a random force (i.e. noise plus sine), it will hop from
one well to the other, more or less at the frequency of the sine:
the periodic forcing tends to be amplified.
• Intuitively, if the particle is excited by the sine plus a very
small amount of noise, it will hop only a few times. Conversely,
if the noise is too powerful, the system becomes completely
randomized. Between these two extremes, there exists an optimal
input noise power for which the cooperative effect between the
sine and the noise is strongest.
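• A sketch of why such an optimum exists, using a two-state
approximation in which SNR(D) scales like (A/D)^2 times the Kramers
rate; the prefactors and the barrier height dV = 1/4 of the quartic
double-well are assumed for illustration:
```python
import numpy as np

# Two-state sketch of the optimal noise level: in this approximation the
# output SNR scales like (A / D)^2 * r_K(D), with the Kramers rate
# r_K(D) = exp(-dV / D) / (sqrt(2) * pi) for barrier height dV = 1/4 of the
# quartic double-well. Prefactors are illustrative; only the location of
# the maximum matters here.
A, dV = 0.1, 0.25
D = np.linspace(0.02, 1.0, 2000)
r_K = np.exp(-dV / D) / (np.sqrt(2) * np.pi)
snr = (A / D) ** 2 * r_K

print("optimal noise intensity D* ~", round(float(D[snr.argmax()]), 3),
      "(analytically dV / 2 =", dV / 2, ")")
```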
Bistability in Stochastic Resonance
Power spectra of hopping events
• In these power spectra of hopping events, the gray bars mark
integer multiples of the sinusoidal forcing frequency.
Kramers rate for Stochastic Resonance
• Physically, the sine possesses a characteristic time: its period.
The dynamical system also has a characteristic time: the mean
residence time in the absence of the sine, i.e. the mean
(in the statistical sense) time spent by the particle inside one well.
• This time is the inverse of the transition rate, known as the
Kramers rate, and is a function of the noise level (i.e. the inverse
of the average switching rate induced by the noise alone: the
stochastic time scale).
• For the optimal noise level, the Kramers rate synchronizes with
the frequency of the sine, justifying the term resonance.
• Since this resonance is tuned by the noise level, it was called
stochastic resonance (SR).
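• A sketch of this time-scale matching, assuming the quartic
double-well of the earlier sketch and an illustrative forcing
frequency; the condition r_K ~ 2f (two hops per forcing period) is used:
```python
import numpy as np

# Sketch of the time-scale matching condition behind the word "resonance":
# the Kramers rate r_K(D) = exp(-dV / D) / (sqrt(2) * pi) (quartic
# double-well, barrier dV = 1/4) is compared with twice the forcing
# frequency, i.e. two hops per period. The frequency f is an assumed value.
dV, f = 0.25, 0.01

def kramers_rate(D):
    """Noise-induced switching rate in the absence of the sinusoid."""
    return np.exp(-dV / D) / (np.sqrt(2) * np.pi)

D = np.linspace(0.01, 1.0, 10_000)
D_match = D[np.argmin(np.abs(kramers_rate(D) - 2 * f))]
print("noise level where r_K ~ 2 f:  D ~", round(float(D_match), 3))
```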
Peak SNRs correspond to maximum
spatiotemporal synchronization.
• SNR of the middle oscillator in an array of 65 oscillators as a
function of noise, for two different coupling strengths, 0.1 and 10.
Stochastic Resonance in arrays of coupled oscillators
• Time evolution (top) of an array of 65
oscillators subject to different noise
powers and coupling strengths. The
temporal scale of the patterns decreases
with increasing noise, while the spatial
scale of the patterns increases with
increasing coupling strength.
• For this range of noise and coupling,
spatiotemporal synchronization (and peak
SNR) correspond to a coupling of about
10 and a noise of about 35 dB, as
indicated by the striped pattern in the third
column of the second row from the top.
Stochastic Resonance in the nervous system
• Since its first discovery in cat visual neurons,
stochastic-resonance-type effects have been demonstrated in a
range of sensory systems.
• These include crayfish mechanoreceptors, shark
multimodal sensory cells, cricket cercal sensory neurons
and human muscle spindles.
• The behavioural impact of stochastic resonance has been
directly demonstrated and manipulated in passive
electrosensing paddlefish and in human balance control.
Noise produces nonlinearity
• In spike-generating neurons, sub-threshold signals have no effect
on the output of the system. Noise can transform such threshold
nonlinearities by making sub-threshold inputs more likely to cross
the threshold, and this becomes more likely the closer the inputs
are to the threshold.
• Thus, when outputs are averaged over time, this noise produces
an effectively smoothed nonlinearity (see the sketch after this list).
• This facilitates spike initiation and can improve neural-network
behaviour, as was shown in studies of contrast invariance of
orientation tuning in the primary visual cortex.
• Neuronal networks in the presence of noise will be more robust
and explore more states, facilitating learning/adaptation to the
changing demands of a dynamic environment.
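• A minimal sketch of this threshold-smoothing effect (the threshold,
input range and noise levels are hypothetical):
```python
import numpy as np

# Sketch: how additive noise smooths a hard spiking threshold.
# Hypothetical numbers: threshold at 1.0, inputs ranging from 0 to 1.5.
rng = np.random.default_rng(1)
threshold, n_trials = 1.0, 10_000
inputs = np.linspace(0.0, 1.5, 16)

for sigma in (0.0, 0.2):                      # no noise vs. moderate noise
    noise = sigma * rng.standard_normal((n_trials, inputs.size))
    p_spike = ((inputs + noise) > threshold).mean(axis=0)
    print(f"sigma={sigma}: firing probability vs. input =",
          np.round(p_spike, 2))
# With sigma=0 the trial-averaged response is a hard step; with sigma=0.2 it
# is a smooth sigmoid, so inputs just below threshold are still signalled.
```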
SR-based techniques
• SR-based techniques have been used to create a novel class of
medical devices (such as vibrating insoles) for enhancing
sensory and motor function in the elderly, patients with
diabetic neuropathy, and patients with stroke.
Balancing act using vibrating insoles
• Using a phenomenon called stochastic resonance, the human body
can make use of random vibrations to help maintain its balance.
• In experiments on people in their 20s and people in their 70s,
actuators embedded in gel insoles generated noisy vibrations with
such a small amount of force that a person standing on the insoles
could not feel them. A reflective marker was fixed to the research
subject's shoulder, and a video camera recorded its position.
• People always sway a small amount even when they are trying to
stand still, and the amount of sway increases with age. But under the
influence of a small amount of vibration, which enhances
mechanosensation in the feet, both old and young subjects swayed much less.
• Remarkably, noise made people in their 70s sway about as much as
people in their 20s swayed without noise.
‘Noise reduction’ mechanisms in the Brain
• Thresholding systems in the neurons
• Low Reliability (Bursts) between neurons
• Rate coding hypothesis
• Averaging (Neuronal Population coding)
• Using prior knowledge about the noise characteristics
Accuracy in the information processing of the brain vs. noise
How can neural networks maintain stable
activity in the presence of noise?
Part a shows convergence of signals onto a single neuron. If the
incoming signals have independent noise, then noise levels in the
postsynaptic neuron will scale in proportion to the square root of
the number of signals (N), whereas the signal scales in proportion
to N.
cf. If the noise in the signals is perfectly correlated, then the noise
in the neuron will also scale in proportion to N, and averaging gives
no benefit.
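A quick numerical sketch of this scaling argument (the signal level,
noise level and values of N are arbitrary illustrative choices):
```python
import numpy as np

# Sketch of the SNR scaling when N noisy inputs converge onto one neuron.
# Independent noise averages down (SNR grows roughly as sqrt(N));
# perfectly correlated noise does not.
rng = np.random.default_rng(2)
signal, sigma, n_trials = 1.0, 1.0, 20_000

for N in (1, 16, 64):
    indep = signal + sigma * rng.standard_normal((n_trials, N))
    shared = signal + sigma * rng.standard_normal((n_trials, 1)) * np.ones(N)
    snr_indep = signal / indep.mean(axis=1).std()
    snr_shared = signal / shared.mean(axis=1).std()
    print(f"N={N:3d}  SNR independent ~ {snr_indep:4.1f}   "
          f"SNR correlated ~ {snr_shared:4.1f}")
```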
Homeostatic plasticity mechanisms
• Experimental evidence suggests that average neuronal
activity levels are maintained by homeostatic plasticity
mechanisms that dynamically set synaptic strengths, ion-channel
expression or the release of neuromodulators.
• This in turn suggests that networks of neurons can
dynamically adjust to attenuate noise effects. Moreover,
these networks might be wired so that large variations in
the response properties of individual neurons have little
effect on network behaviour.
Principles of how the CNS manages noise
• The principle of averaging can be applied whenever
redundant information is present across the sensory
inputs to the CNS or is generated by the CNS.
• Averaging can counter noise if several units (such as
receptor molecules, neurons or muscles) carry the
same signal and each unit is affected by independent
sources of noise.
• Averaging is seen at the very first stage of sensory
processing.
Divergence
• Counterintuitively, divergence (one neuron synapsing onto
many) can also support averaging.
• When signals are sent over long distances through noisy
axons, rather than using a single axon it can be beneficial
to send the same signal redundantly over multiple axons
and then combine these signals at the destination.
• Crucially, for such a mechanism to reduce noise the initial
divergence of one signal into many must be highly
reliable.
• Such divergence is seen in auditory inner hair cells, which
provide a divergent input to 10–30 ganglion cells through
a specialized 'ribbon synapse'.
• Averaging is used in many neural systems in which
information is encoded as patterns of activity across a
population of neurons that all subserve a similar
function: these are termed neural population codes.
Prior knowledge about noise
• Prior knowledge can also be used to counter noise. If the
structure of the signal and/or noise is known it can be used to
distinguish the signal from the noise. This principle is especially
helpful in dealing with sensory signals that, in the natural world,
are highly structured and redundant.
• Signal-detection theory shows that the optimal signal detector,
subject to additive noise, is obtained by matching all parameters
of the detector to those of the signal to be detected: in
neuroscience this is termed the matched-filter principle.
• Thus, the structures of receptive fields embody prior knowledge
about the expected inputs and thereby allow neurons to
attenuate the impact of noise.
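• A minimal sketch of the matched-filter principle (the template
waveform and the noise level are hypothetical):
```python
import numpy as np

# Sketch of the matched-filter principle: for a known signal in additive
# white Gaussian noise, the optimal detector correlates the input with a
# template of the expected signal. The signal shape here is illustrative.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)
template = np.sin(2 * np.pi * 5 * t) * np.exp(-4 * t)   # hypothetical expected input

def detect(x, template):
    """Return the matched-filter statistic: correlation with the template."""
    return float(np.dot(x, template))

noise_only = rng.standard_normal((1000, t.size))
signal_plus_noise = noise_only + template               # same noise + the known signal

stat_noise = np.array([detect(x, template) for x in noise_only])
stat_signal = np.array([detect(x, template) for x in signal_plus_noise])
print("mean statistic, noise only:     ", round(stat_noise.mean(), 2))
print("mean statistic, signal + noise: ", round(stat_signal.mean(), 2))
# The separation between the two distributions is what a threshold on the
# matched-filter output exploits; a receptive field acts like the template.
```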
Bayesian inference:
combining averaging and prior knowledge
• The principles of averaging and prior knowledge can be placed
into a larger mathematical framework of optimal statistical
estimation and decision theory, known as Bayesian inference.
• Bayesian inference assigns probabilities to propositions about
the world (beliefs). These beliefs are calculated by combining
prior knowledge (for example, that an animal is a predator) and
noisy observations (for example, the animal's heading) to infer
the probability of propositions (for example, that the animal will attack).
• Psychophysical experiments have confirmed that humans use such
Bayesian inference to cope with noise (and, more generally, with
uncertainty) in both perception and action.
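• A minimal sketch of such an inference with a Gaussian prior and a
Gaussian-noise observation (all numbers hypothetical); the posterior
mean is the precision-weighted average of prior and observation:
```python
# Sketch of Bayesian combination of a Gaussian prior with a noisy observation.
# Numbers are hypothetical; the posterior mean is the precision-weighted average.
prior_mean, prior_var = 0.0, 4.0        # prior belief about, say, heading (deg)
obs, obs_var = 10.0, 1.0                # noisy sensory observation and its variance

post_precision = 1.0 / prior_var + 1.0 / obs_var
post_var = 1.0 / post_precision
post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
print(f"posterior mean = {post_mean:.1f}, posterior variance = {post_var:.2f}")
# The estimate is pulled toward the observation in proportion to its
# reliability, and the posterior variance is smaller than either source alone.
```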