Dynamical system

Advertisement
Lecture series: Data analysis
Thomas Kreuz, ISC, CNR
thomas.kreuz@cnr.it
http://www.fi.isc.cnr.it/users/thomas.kreuz/
Lectures: Each Tuesday at 16:00
(First lecture: May 21, last lecture: June 25)
(Very preliminary) Schedule
• Lecture 1: Example (Epilepsy & spike train synchrony), Data acquisition, Dynamical systems
• Lecture 2: Linear measures, Introduction to non-linear dynamics
• Lecture 3: Non-linear measures
• Lecture 4: Measures of continuous synchronization (EEG)
• Lecture 5: Application to non-linear model systems and to epileptic seizure prediction, Surrogates
• Lecture 6: Measures of (multi-neuron) spike train synchrony
Last lecture
• Example: Epileptic seizure prediction
• Data acquisition
• Introduction to dynamical systems
Epileptic seizure prediction
Epilepsy results from abnormal, hypersynchronous
neuronal activity in the brain
Accessible brain time series:
EEG (standard) and neuronal spike trains (recent)
Does a pre-ictal state exist (ictus = seizure)?
Do characterizing measures allow a reliable detection of
this state?
Specific example for prediction of extreme events
Data acquisition
[Block diagram] From the system / object to the computer:
sensor, amplifier, AD converter (sampling), filter, computer
Dynamical system
• Described by time-dependent states $\boldsymbol{x} \in \mathbb{R}^{n}$
• Evolution of the state:
  - continuous (flow):  $\frac{d\boldsymbol{x}}{dt} = \boldsymbol{f}(\boldsymbol{x}, t, \lambda)$
  - discrete (map):  $\boldsymbol{x}_{t+\Delta t} = \boldsymbol{F}(\boldsymbol{x}_{t}, \Delta t, \lambda)$
$\lambda$ – control parameter
$\boldsymbol{f}$, $\boldsymbol{F}$ can both be linear or non-linear
Today’s lecture
Non-linear model systems
Linear measures
Introduction to non-linear dynamics
Non-linear measures
- Introduction to phase space reconstruction
- Lyapunov exponent
[Acknowledgement: K. Lehnertz, University of Bonn, Germany]
Non-linear
model systems
Non-linear model systems
Discrete maps:
• Logistic map
• Hénon map
Continuous flows:
• Rössler system
• Lorenz system
Logistic map
$x_{n+1} = r\, x_n (1 - x_n)$, shown here for $r = 4$
r – control parameter
• Model of population dynamics
• Classical example of how complex, chaotic behaviour can arise from very simple non-linear dynamical equations
[R. M. May. Simple mathematical models with very complicated dynamics. Nature, 261:459, 1976]
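A minimal Python sketch (assuming NumPy is available) of iterating the logistic map for the r = 4 case shown above; variable names and the initial condition are illustrative.

```python
import numpy as np

def logistic_map(r, x0, n_steps):
    """Iterate x_{n+1} = r * x_n * (1 - x_n) and return the trajectory."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for n in range(n_steps):
        x[n + 1] = r * x[n] * (1.0 - x[n])
    return x

# r = 4: fully developed chaos; a slightly different x0 quickly decorrelates
traj = logistic_map(r=4.0, x0=0.4, n_steps=1000)
print(traj[:10])
```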
Hénon map
• Introduced by Michel Hénon as a simplified model of the
Poincaré section of the Lorenz model
• One of the most studied examples of dynamical systems
that exhibit chaotic behavior
[M. Hénon. A two-dimensional mapping with a strange attractor. Commun. Math. Phys., 50:69, 1976]
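For illustration, a short sketch of iterating the Hénon map with the classic parameter values a = 1.4, b = 0.3 (these specific values are an assumption; the slide does not state them).

```python
import numpy as np

def henon_map(a=1.4, b=0.3, n_steps=10000, x0=0.0, y0=0.0):
    """Iterate x_{n+1} = 1 - a*x_n^2 + y_n, y_{n+1} = b*x_n."""
    xy = np.empty((n_steps + 1, 2))
    xy[0] = (x0, y0)
    for n in range(n_steps):
        x, y = xy[n]
        xy[n + 1] = (1.0 - a * x * x + y, b * x)
    return xy

points = henon_map()          # plotting points[:, 0] vs points[:, 1]
print(points[-5:])            # shows points on the strange attractor
```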
Rössler system
$\frac{dx}{dt} = -\omega (y + z)$
$\frac{dy}{dt} = \omega (x + a y)$
$\frac{dz}{dt} = b + z (x - c)$
$a = 0.15$, $b = 0.2$, $c = 10$
• Designed in 1976 for purely theoretical reasons
• Later found to be useful in modeling equilibrium in chemical reactions
[O. E. Rössler. An equation for continuous chaos. Phys. Lett. A, 57:397, 1976]
Lorenz system
$\frac{dx}{dt} = \sigma (y - x)$
$\frac{dy}{dt} = -y - xz + Rx$
$\frac{dz}{dt} = xy - bz$
$R = 28$, $\sigma = 10$, $b = 8/3$
• Developed in 1963 as a simplified mathematical model for atmospheric convection
• The same equations arise in simplified models for lasers, dynamos, electric circuits, and chemical reactions
[E. N. Lorenz. Deterministic non-periodic flow. J. Atmos. Sci., 20:130, 1963]
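A hedged sketch of numerically integrating the Lorenz system with the parameter values from the slide, using SciPy's solve_ivp (the solver choice and integration span are illustrative assumptions).

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, R=28.0, b=8.0 / 3.0):
    """Right-hand side of the Lorenz equations as given above."""
    x, y, z = state
    return [sigma * (y - x), -y - x * z + R * x, x * y - b * z]

sol = solve_ivp(lorenz, t_span=(0.0, 50.0), y0=[1.0, 1.0, 1.0],
                t_eval=np.linspace(0.0, 50.0, 5000))
x, y, z = sol.y                     # trajectory on the Lorenz attractor
print(x[-3:], y[-3:], z[-3:])
```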
Linear measures
Linearity
The dynamics of a system (and thus of any time series measured from the system) is linear if, with $H$ describing the dynamics, $x$, $y$ two state vectors, and $a$, $b$, $c$ scalars:
• Superposition:  $H(x + y) = H(x) + H(y)$
• Homogeneity:  $H(cx) = cH(x)$
→ Linearity:  $H(ax + by) = aH(x) + bH(y)$
Overview
• Static measures
  - Moments of the amplitude distribution (1st – 4th)
• Dynamic measures
  - Autocorrelation
  - Fourier spectrum
  - Wavelet spectrum
Static measures
• Based on analysis of distributions (e.g. amplitudes)
• Do not contain any information about dynamics
• Example: Moments of a distribution
- First moment: Mean
- Second moment: Variance
- Third moment: Skewness
- Fourth moment: Kurtosis
First moment: Mean
Average of distribution
Second moment: Variance
Width of distribution (variability, dispersion)
(Standard deviation: square root of the variance)
Third moment: Skewness
Degree of asymmetry of distribution
(relative to normal distribution)
Skewness
< 0 - asymmetric, more negative tails
= 0 - symmetric
> 0 - asymmetric, more positive tails
Fourth moment: Kurtosis
Degree of flatness / steepness of distribution
(relative to normal distribution)
Kurtosis
< 0 - platykurtic (flat)
= 0 - mesokurtic (normal)
> 0 - leptokurtic (peaked)
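A small Python sketch (assuming NumPy/SciPy) computing the four moments described above; note that scipy.stats.kurtosis returns the excess kurtosis, so 0 corresponds to the mesokurtic (normal) case.

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
x = rng.normal(size=10000)           # placeholder data; use your time series here

print("mean     :", np.mean(x))      # 1st moment
print("variance :", np.var(x))       # 2nd moment (std = sqrt of variance)
print("skewness :", skew(x))         # 3rd standardized moment
print("kurtosis :", kurtosis(x))     # 4th: excess kurtosis (0 for a Gaussian)
```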
Dynamic measures
[Schematic] Physical phenomenon → measured time series, analyzed in two domains:
• Time domain: signal x(t), real-valued amplitude → autocorrelation [cross correlation, covariance]
• Frequency domain: $F_x(\omega)$ with $-\infty < \omega < \infty$, a complex number carrying amplitude and phase → Fourier spectrum
Autocorrelation
One signal $x_n$ with $n = 1, \ldots, N$
(normalized to zero mean and unit variance)
Time domain: dependence on time lag $\tau$
$C_{XX}(\tau) = \frac{1}{N-\tau} \sum_{n=1}^{N-\tau} x_n\, x_{n+\tau}$
Properties: $C_{XX}(0) = 1$ and $|C_{XX}(\tau)| \le C_{XX}(0)$ for $\tau \ne 0$
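A minimal NumPy sketch of the normalized autocorrelation function defined above (the in-function normalization is an assumption consistent with the zero-mean / unit-variance convention of the slide).

```python
import numpy as np

def autocorrelation(x, max_lag):
    """C_XX(tau) for tau = 0..max_lag of a signal normalized in-function."""
    x = np.asarray(x, dtype=float)
    x = (x - x.mean()) / x.std()          # zero mean, unit variance
    n = len(x)
    return np.array([np.dot(x[: n - tau], x[tau:]) / (n - tau)
                     for tau in range(max_lag + 1)])

c = autocorrelation(np.sin(np.linspace(0, 20 * np.pi, 2000)), max_lag=200)
print(c[0], c[:5])                        # c[0] == 1 up to rounding
```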
Autocorrelation: Examples
[Figure: autocorrelation as a function of time lag τ for a periodic signal, a stochastic signal, and a signal with memory]
Discrete Fourier transform
Fourier series (sines and cosines):
Fourier coefficients:
Fourier series (complex exponentials):
Fourier coefficients:
Condition:
Power spectrum
$P(\omega) = \lim_{T \to \infty} \frac{1}{T}\, F(\omega)\, F^{*}(\omega) = \lim_{T \to \infty} \frac{1}{T}\, |F(\omega)|^{2}$

Wiener-Khinchin theorem:
$C_{XX}(\tau) = \int_{-\infty}^{\infty} P(\omega)\, e^{i\tau\omega}\, d\omega \qquad P(\omega) = \int_{-\infty}^{\infty} C_{XX}(\tau)\, e^{-i\tau\omega}\, d\tau$

Parseval’s theorem (overall power):
$\int_{-\infty}^{\infty} |x(t)|^{2}\, dt = \int_{-\infty}^{\infty} |F_X(\omega)|^{2}\, d\omega$
Tapering: Window functions
Fourier transform assumes periodicity → edge effects
Solution: tapering with window functions (zeros at the edges)
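A short sketch estimating a power spectrum with a Hann taper via the FFT (NumPy assumed; the periodogram normalization is an illustrative choice, not the slide's definition).

```python
import numpy as np

def power_spectrum(x, fs=1.0):
    """Tapered periodogram: Hann window, then |FFT|^2 as a power estimate."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    w = np.hanning(len(x))                 # taper: zeros at the edges
    X = np.fft.rfft(x * w)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    P = (np.abs(X) ** 2) / (fs * np.sum(w ** 2))
    return freqs, P

t = np.arange(0, 10, 1 / 100.0)            # 100 Hz sampling, 10 s
freqs, P = power_spectrum(np.sin(2 * np.pi * 10 * t), fs=100.0)
print(freqs[np.argmax(P)])                 # ~10 Hz peak
```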
EEG frequency bands
Description of brain rhythms
• Delta: 0.5 – 4 Hz
• Theta: 4 – 8 Hz
• Alpha: 8 – 12 Hz
• Beta: 12 – 30 Hz
• Gamma: > 30 Hz
[Buzsáki. Rhythms of the brain. Oxford University Press, 2006]
Example: White noise
Example: Rössler system
Example: Lorenz system
Example: Hénon map
Example: Inter-ictal EEG
Example: Ictal EEG
Time-frequency representation
Wavelet analysis
Basis functions with finite support
Example: complex Morlet wavelet
a – scaling; b – shift / translation (mother wavelet: a = 1, b = 0)
Implementation via filter banks (cascaded lowpass and highpass filters):
g – lowpass (approximation); h – highpass (detail)
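A hedged sketch of a continuous wavelet transform with a complex Morlet wavelet using the PyWavelets package (the package choice, the wavelet name string, the sampling rate, and the scale range are assumptions, not part of the lecture).

```python
import numpy as np
import pywt

fs = 256.0                                   # assumed sampling rate in Hz
t = np.arange(0, 4, 1 / fs)
# test signal: 10 Hz for the first 2 s, 25 Hz afterwards
x = np.sin(2 * np.pi * 10 * t) * (t < 2) + np.sin(2 * np.pi * 25 * t) * (t >= 2)

scales = np.arange(2, 128)                   # smaller scale = higher frequency
coeffs, freqs = pywt.cwt(x, scales, 'cmor1.5-1.0', sampling_period=1 / fs)
power = np.abs(coeffs) ** 2                  # time-frequency power map
print(power.shape, freqs[:3])                # (n_scales, n_samples)
```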
Wavelet analysis: Example
Advantages:
- Localized in both frequency and time
- Mother wavelet can be selected according to the feature of interest
Further applications:
- Filtering
- Denoising
- Compression
[Latka et al. Wavelet mapping of sleep spindles in epilepsy, JPP, 2005]
Introduction to
non-linear dynamics
Linear systems
• Weak causality:
  Identical causes have the same effect
  (strong idealization, not realistic in experimental situations)
• Strong causality:
  Similar causes have similar effects
  (includes weak causality; applicable to experimental situations: small deviations in initial conditions, external disturbances)
Non-linear systems
Violation of strong causality
Similar causes can have different effects
Sensitive dependence on initial conditions
(Deterministic chaos)
Linearity / Non-linearity
Linear systems
- have simple solutions
- changes of parameters and initial conditions lead to proportional effects
Non-linear systems
- can have complicated solutions
- changes of parameters and initial conditions lead to non-proportional effects
Non-linear systems are the rule; linear systems are the special case!
Phase space example: Pendulum
[Figure: time series of position x(t) and velocity v(t) versus time t, and the corresponding state space plot of v(t) against x(t)]
Phase space example: Pendulum
Ideal world: [closed orbit in phase space]
Real world: [trajectory spirals towards a fixed point due to friction]
Phase space
Phase space: space in which all possible states of a system
are represented, with each possible system
state corresponding to one unique point in a
d dimensional cartesian space
(d - number of system variables)
Pendulum: d = 2 (position, velocity)
Trajectory: time-ordered set of states of a dynamical
system, movement in phase space
(continuous for flows, discrete for maps)
Vector fields in phase space
Dynamical system described by time-dependent states
$\frac{d\boldsymbol{x}}{dt} = \boldsymbol{f}(\boldsymbol{x})$, with $\boldsymbol{f}: \mathbb{R}^{d} \to \mathbb{R}^{d}$
$\mathbb{R}^{d}$ – d-dimensional phase space
$\boldsymbol{f}$ – vector field (assignment of a vector to each point in a subset of Euclidean space)
Examples:
- Speed and direction of a moving fluid
- Strength and direction of a magnetic force
Here: Flow in phase space
Initial condition $\boldsymbol{x}(t_0)$ → trajectory $\boldsymbol{x}(t)$
Divergence
Rate of change of an infinitesimal volume around a given point of a vector field, $\mathrm{div}\, \boldsymbol{f}(\boldsymbol{x}) = \sum_i \partial f_i / \partial x_i$:
- Source: outgoing flow ($x$ with $\mathrm{div}\, f(x) > 0$, expansion)
- Sink: incoming flow ($x$ with $\mathrm{div}\, f(x) < 0$, contraction)
System classification via divergence
Liouville’s theorem: temporal evolution of an infinitesimal phase-space volume
$\mathrm{div}\, f = 0$ – conservative (Hamiltonian) systems
$\mathrm{div}\, f < 0$ – dissipative systems
$\mathrm{div}\, f > 0$ – unstable systems
Dynamical systems in the real world
• In the real world, internal and external friction leads to dissipation
• Impossibility of a perpetuum mobile
  (without continuous driving / energy input, the motion stops)
• When disturbed, a system, after some initial transients, settles onto its typical behavior (stationary dynamics)
• Attractor: part of the phase space of the dynamical system corresponding to this typical behavior
Attractor
Subset X of phase space which satisfies three conditions:
• X is forward invariant under f:
If x is an element of X, then so is f(t,x) for all t > 0.
• There exists a neighborhood of X, called the basin of
attraction B(X), which consists of all points b that "enter
X in the limit t → ∞".
• There is no proper subset of X having the first two
properties.
Attractor classification
Fixed point: point that is mapped to itself
Limit cycle: periodic orbit of the system that is isolated (i.e., has its own basin of attraction)
Limit torus: quasi-periodic motion defined by n incommensurate frequencies (n-torus, e.g. a 2-torus)
Strange attractor: attractor with a fractal structure
Introduction to
phase space reconstruction
Phase space reconstruction
• Dynamical equations known (e.g. Lorenz, Rössler):
  the system variables span the d-dimensional phase space
• Real world: information incomplete
  Typical situation:
  - Measurement of just one or a few system variables (observables)
  - Dimension (number of system variables, degrees of freedom) unknown
  - Noise
  - Limited recording time
  - Limited precision
→ Is a reconstruction of the phase space possible?
Takens’ embedding theorem
Trajectory $\boldsymbol{x}(t)$ of a dynamical system in $d$-dimensional phase space $\mathbb{R}^{d}$.
One observable measured via some measurement function $M$:
$o(t) = M(\boldsymbol{x}(t))$, $\quad M: \mathbb{R}^{d} \to \mathbb{R}$
It is possible to reconstruct a topologically equivalent attractor via time delay embedding:
$\boldsymbol{o}(t) = [\,o(t),\, o(t-\tau),\, o(t-2\tau),\, \ldots,\, o(t-(m-1)\tau)\,]$
$\tau$ – time lag (delay); $m$ – embedding dimension
[F. Takens. Detecting strange attractors in turbulence. Springer, Berlin, 1980]
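A minimal NumPy sketch of time delay embedding as defined above (the function name and the ordering of the delayed coordinates are illustrative choices).

```python
import numpy as np

def delay_embedding(o, m, tau):
    """Build embedding vectors [o(t), o(t-tau), ..., o(t-(m-1)*tau)]."""
    o = np.asarray(o, dtype=float)
    n_vectors = len(o) - (m - 1) * tau
    return np.column_stack([o[(m - 1 - j) * tau : (m - 1 - j) * tau + n_vectors]
                            for j in range(m)])

# Example: embed a scalar signal with m = 3, tau = 10
signal = np.sin(np.linspace(0, 40 * np.pi, 4000))
emb = delay_embedding(signal, m=3, tau=10)
print(emb.shape)      # (4000 - 2*10, 3)
```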
Takens’ embedding theorem
Main idea: inaccessible degrees of freedom are coupled into the observable $o$ via the system dynamics.
Mathematical assumptions:
- The observation function and its derivative must be differentiable
- The derivative must be of full rank (no symmetries in the components)
- Whitney’s theorem: embedding dimension $m \ge 2d + 1$
Some generalizations:
Embedding theorem by Sauer, Yorke, Casdagli
[Whitney. Differentiable manifolds. Ann Math,1936; Sauer et al. Embeddology. J Stat Phys, 1991.]
Topological equivalence
[Figure: original attractor and reconstructed attractor]
Example: White noise
Example: Rössler system
Example: Lorenz system
Example: Hénon map
Example: Inter-ictal EEG
Example: Ictal EEG
Real time series
Phase space reconstruction / Embedding:
First step for many non-linear measures
Choice of parameters:
Window length T
- Not too long (Stationarity, control parameters constant)
- Not too short (sufficient density of phase space required)
Embedding parameters
- Time delay τ
- Embedding dimension 𝑚
Influence of time delay
Selection of time delay 𝜏 (given optimal embedding dimension 𝑚)
• τ too small:
  - Correlations in time dominate
  - No visible structure
  - Attractor not unfolded
• τ too large:
  - Overlay of attractor regions that are rather separated in the original attractor
  - Attractor overfolded
• τ optimal:
  - Attractor unfolded
Influence of time delay
[Figure: reconstructed attractors for τ too small, τ too large, and τ optimal]
Criterion: Selection of time delay
Aim: independence of successive values
• First zero crossing of the autocorrelation function
  (captures only linear correlations)
• First minimum of the mutual information function
  (also takes non-linear correlations into account)
[Mutual information: how much does knowledge of x(t) tell you about x(t + τ)?]
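A hedged sketch of both delay criteria: the first zero crossing of the autocorrelation and the first minimum of a histogram-based mutual information estimate (the number of bins and the simple estimator are assumptions; more refined MI estimators exist).

```python
import numpy as np

def first_zero_crossing_acf(x, max_lag):
    """Return the first lag at which the autocorrelation crosses zero."""
    x = np.asarray(x, dtype=float)
    x = (x - x.mean()) / x.std()
    n = len(x)
    for tau in range(1, max_lag + 1):
        if np.dot(x[: n - tau], x[tau:]) / (n - tau) <= 0.0:
            return tau
    return max_lag

def mutual_information(x, tau, bins=32):
    """Histogram estimate of I(x(t); x(t+tau)) in nats."""
    p_xy, _, _ = np.histogram2d(x[:-tau], x[tau:], bins=bins)
    p_xy /= p_xy.sum()
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
    nz = p_xy > 0
    return np.sum(p_xy[nz] * np.log(p_xy[nz] / np.outer(p_x, p_y)[nz]))

def first_minimum_mi(x, max_lag, bins=32):
    mi = [mutual_information(x, tau, bins) for tau in range(1, max_lag + 1)]
    for i in range(1, len(mi) - 1):
        if mi[i] < mi[i - 1] and mi[i] < mi[i + 1]:
            return i + 1                      # lags start at 1
    return max_lag
```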
Criterion: Selection of embedding dimension
Aim: unfolding of the attractor (no projections)
• Attractor dimension d known: Whitney’s theorem: m ≥ 2d + 1
• Attractor dimension d unknown (typical for real time series):
  → Method of false nearest neighbors
  m < m_opt: trajectory crossings, phase space neighbors appear close
  m ≥ m_opt: increase of distance between these phase space neighbors
Procedure:
- For a given m, count neighbors with distance ε < ε_min
- Check whether the count decreases for larger m (if yes, some were false nearest neighbors)
- Repeat until the number of nearest neighbors stays constant
[Kennel & Abarbanel, Phys Rev A 1992]
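A simplified sketch of the false nearest neighbor idea (it follows the general procedure above, not the exact thresholds of the cited paper; the distance-ratio criterion and the parameter values are illustrative assumptions).

```python
import numpy as np

def embed(o, m, tau):
    """Delay embedding: rows are [o(t), o(t-tau), ..., o(t-(m-1)*tau)]."""
    n = len(o) - (m - 1) * tau
    return np.column_stack([o[(m - 1 - j) * tau:(m - 1 - j) * tau + n]
                            for j in range(m)])

def false_nearest_fraction(o, m, tau, ratio_threshold=10.0):
    """Fraction of m-dimensional nearest neighbors that separate strongly
    when re-examined in dimension m+1 ('false' neighbors)."""
    o = np.asarray(o, dtype=float)
    emb_m = embed(o, m, tau)[tau:]        # align to the time points of emb_m1
    emb_m1 = embed(o, m + 1, tau)
    n = len(emb_m1)
    false_count = 0
    for i in range(n):
        d = np.linalg.norm(emb_m - emb_m[i], axis=1)
        d[i] = np.inf                     # exclude the point itself
        j = int(np.argmin(d))             # nearest neighbor in dimension m
        if d[j] > 0 and np.linalg.norm(emb_m1[i] - emb_m1[j]) / d[j] > ratio_threshold:
            false_count += 1
    return false_count / n

# Increase m until the returned fraction drops close to zero.
```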
Non-linear measures
Non-linear deterministic systems
• No analytic solution of non-linear differential equations
• Superposition of solutions not necessarily a solution
• Behavior of the system qualitatively rich,
  e.g. the dynamics changes with the control parameter (bifurcations)
• Sensitive dependence on initial conditions
→ Deterministic chaos
Bifurcation diagram: Logistic map
Bifurcation: Dynamic change in dependence of control parameter
[Figure: bifurcation diagram showing fixed point, period doubling, and chaos regions]
Deterministic chaos
• Chaos (every-day use):
- State of disorder and irregularity
• Deterministic chaos
- irregular (non-periodic) evolution of state variables
- unpredictable (or only short-time predictability)
- described by deterministic state equations
(in contrast to stochastic systems)
- shows instabilities and recurrences
Deterministic chaos
• Regular: deterministic; long-time predictions possible; strong causality
• Chaotic: deterministic; rather unpredictable; no strong causality; non-linearity
• Random: stochastic; unpredictable; uncontrolled (external) influences
Characterization of non-linear systems
Linear measures:
• Static measures (e.g. moments of the amplitude distribution):
  - Some hints on non-linearity
  - No information about the dynamics
• Dynamic measures (autocorrelation and Fourier spectrum, related via the Wiener-Khinchin theorem):
  - Autocorrelation: fast decay, no memory
  - Fourier spectrum: typically broadband
  → How to distinguish from noise?
Characterization of dynamics in phase space
• Stability (sensitivity to initial conditions) → Lyapunov exponent
• Density (information / entropy)
• Predictability
• Determinism / stochasticity
• Linearity / non-linearity
• Self-similarity (dimension)
Stability
Analysis of the long-term behavior (t → ∞) of a dynamical system
• Unlimited growth (unrealistic)
• Limited dynamics
  - Fixed point / some kind of equilibrium
  - Periodic or quasi-periodic motion
  - Chaotic motion (expansion and folding)
How stable is the dynamics?
- when the control parameter changes
- when disturbed (push to neighboring points in phase space)
Stability of equilibrium points
Dynamical system described by time-dependent states
$\frac{d\boldsymbol{x}}{dt} = \boldsymbol{f}(\boldsymbol{x}(t))$, $\quad x(0) = x_0$, $\quad \boldsymbol{f}: \mathbb{R}^{d} \to \mathbb{R}^{d}$
Suppose $\boldsymbol{f}$ has an equilibrium $x_e$.
• The equilibrium of the above system is Lyapunov stable if, for every $\varepsilon > 0$, there exists a $\delta = \delta(\varepsilon) > 0$ such that if $\|x_0 - x_e\| < \delta$, then $\|x(t) - x_e\| < \varepsilon$ for every $t \ge 0$.
• It is asymptotically stable if it is Lyapunov stable and if there exists $\delta > 0$ such that if $\|x_0 - x_e\| < \delta$, then $\lim_{t \to \infty} \|x(t) - x_e\| = 0$.
Stability of equilibrium points
• Lyapunov stability: tube of diameter ε
  Solutions starting "close enough" to the equilibrium (within a distance δ from it) remain "close enough" forever (within a distance ε from it). This must hold for any ε that one may want to choose.
• Asymptotic stability:
  Solutions that start close enough not only remain close enough but also eventually converge to the equilibrium.
• Exponential stability:
  Solutions not only converge, but in fact converge faster than or at least as fast as a particular known rate.
Divergence and convergence
Chaotic trajectories are Lyapunov-unstable:
• Divergence:
  Neighboring trajectories expand such that their distance increases exponentially (expansion).
• Convergence:
  Expansion of trajectories up to the limits of the attractor is followed by a decrease of their distance (folding).
→ Sensitive dependence on initial conditions
Quantification: Lyapunov exponent
Lyapunov-exponent
Calculated via perturbation theory, with $\delta x$ an infinitesimal perturbation of the initial conditions:
$\frac{d\boldsymbol{x}}{dt} = \boldsymbol{f}(\boldsymbol{x})$
$\frac{d(\boldsymbol{x} + \delta x)}{dt} = \boldsymbol{f}(\boldsymbol{x} + \delta x) = \boldsymbol{f}(\boldsymbol{x}) + \delta x\, Df(x) + o(2)$   (Taylor series)
Local linearization: $\frac{d\,\delta x}{dt} = \delta x\, Df(x)$
Solution: $\delta x(t) = \delta x(0)\, e^{\lambda t}$
$Df(x)$ – Jacobian matrix; $\lambda$ – Lyapunov exponent
Lyapunov-exponent
In m-dimensional phase space:
Lyapunov spectrum: $\lambda_i$, $i = 1, \ldots, m$ (expansion rates for the different dimensions)
Relation to divergence: $\mathrm{div}\, f = \sum_i \lambda_i$
Dissipative system: $\sum_i \lambda_i < 0$
Largest Lyapunov exponent (LLE) $\lambda_1$ (often just $\lambda$):
• Stable fixed point: $\lambda_1 < 0$
• Regular dynamics: $\lambda_1 = 0$
• Chaotic dynamics: $\lambda_1 > 0$
• Stochastic dynamics: $\lambda_1 \to \infty$
Example: Logistic map
[Figure: bifurcation diagram (fixed point, period doubling, chaos) and the largest Lyapunov exponent λ of the logistic map as a function of the control parameter r]
Lyapunov-exponent
- Dynamic characterization of attractors (stability properties)
- Classification of attractors via the signs of the Lyapunov spectrum
- Average loss of information regarding the initial conditions
- Average prediction time:
  (ρ – localization precision of the initial condition,
  j+ – index of the last positive Lyapunov exponent)
Largest Lyapunov-exponent: Estimation
- Reference trajectory: $x(0)$
- Neighboring trajectory: $y_1(0)$
- Initial distance: $d(0) = \|x(0) - y_1(0)\|$
- Distance after T time steps: $d(T) = \|x(T) - y_1(T)\|$
- Expansion factor: $\Lambda(1) = \frac{d(T)}{d(0)}$
- New neighboring trajectory $y_2(T)$ to $x(T)$, $y_3(2T)$ to $x(2T)$, etc.
- Calculate $l$ times: $\Lambda(i)$, $i = 1, \ldots, l$
Largest Lyapunov exponent (LLE):
$\lambda(T) = \frac{1}{lT} \sum_{i=1}^{l} \ln \Lambda(i) \approx \lambda$
[Wolf et al. Determining Lyapunov exponents from a time series, Physica D 1985]
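A hedged numerical illustration of the procedure above for the logistic map with r = 4: follow a reference and a nearby trajectory, measure the expansion factor over short stretches, renormalize the separation, and average the log expansion rates. The initial separation and stretch length are illustrative assumptions; for r = 4 the estimate should come out close to ln 2 ≈ 0.69.

```python
import numpy as np

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def lle_two_trajectories(x0=0.4, d0=1e-8, T=1, l=20000, r=4.0):
    """Estimate the LLE from the expansion of a nearby trajectory over T steps,
    renormalizing after each stretch: lambda = (1/(l*T)) * sum(ln Lambda_i)."""
    x, y = x0, x0 + d0
    log_sum = 0.0
    for _ in range(l):
        for _ in range(T):                 # evolve both trajectories T steps
            x, y = logistic(x, r), logistic(y, r)
        d = abs(y - x)
        if d == 0.0:
            d = d0 * 1e-6                  # guard against exact coincidence
        log_sum += np.log(d / d0)
        y = x + (d0 if y >= x else -d0)    # renormalize the separation
    return log_sum / (l * T)

print(lle_two_trajectories())              # ~0.693 = ln 2 for r = 4
```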
Today’s lecture
Non-linear model systems
Linear measures
Introduction to non-linear dynamics
Non-linear measures
- Introduction to phase space reconstruction
- Lyapunov exponent
Next lecture
Non-linear measures
- Dimension
- Entropies
- Determinism
- Tests for Non-linearity, Time series surrogates
Download