Synchronicity

Advertisement
Lecture series: Data analysis
Thomas Kreuz, ISC, CNR
thomas.kreuz@cnr.it
http://www.fi.isc.cnr.it/users/thomas.kreuz/
Lectures: Each Tuesday at 16:00
(First lecture: May 21, last lecture: June 25)
Schedule
• Lecture 1: Example (Epilepsy & spike train synchrony), Data acquisition, Dynamical systems
• Lecture 2: Linear measures, Introduction to non-linear dynamics
• Lecture 3: Non-linear measures
• Lecture 4: Measures of continuous synchronization
• Lecture 5: Measures of discrete synchronization (spike trains)
• Lecture 6: Measure comparison & Application to epileptic seizure prediction
First lecture
• Example: Epileptic seizure prediction
• Data acquisition
• Introduction to dynamical systems
Second lecture
Non-linear model systems
Linear measures
Introduction to non-linear dynamics
Non-linear measures
- Introduction to phase space reconstruction
- Lyapunov exponent
Third lecture
Non-linear measures
- Dimension
[ Excursion: Fractals ]
- Entropies
- Relationships among non-linear measures
Characterization of a dynamic in phase space
• Stability (sensitivity to initial conditions)
• Density (Information / Entropy)
• Predictability
• Determinism / Stochasticity
• Linearity / Non-linearity
• Self-similarity (Dimension)
Dimension (classical)
Number of degrees of freedom necessary to characterize a geometric object
Euclidean geometry: Integer dimensions

Object          Dimension
Point           0
Line            1
Square (Area)   2
Cube (Volume)   3
N-cube          n
Time series analysis:
Number of equations necessary to model a physical system
Hausdorff dimension: Box counting
D = lim_{ε→0} log N(ε) / log(1/ε), where N(ε) is the number of boxes of side length ε needed to cover the object
Richardson: Counter-intuitive notion that a coastline's measured length
changes with the length of the measuring stick used.
Fractal dimension of a coastline: How does the number of measuring sticks
required to measure the coastline change with the scale of the stick?
Example: Koch curve
Some properties:
- Infinite length
- Continuous everywhere
- Differentiable nowhere
- Fractal dimension D = log 4 / log 3 ≈ 1.26
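To make box counting concrete, here is a minimal Python sketch (an illustration added here, not from the lecture): it generates the Koch curve and estimates its fractal dimension as the slope of log N(ε) versus log(1/ε). The iteration depth, box sizes, and function names are my own illustrative choices.

```python
import numpy as np

def koch_curve(n_iter=8):
    """Vertices of the Koch curve after n_iter refinements of a unit segment."""
    pts = np.array([[0.0, 0.0], [1.0, 0.0]])
    rot = np.array([[0.5, -np.sqrt(3) / 2],
                    [np.sqrt(3) / 2, 0.5]])          # rotation by +60 degrees
    for _ in range(n_iter):
        new = []
        for p, q in zip(pts[:-1], pts[1:]):
            d = (q - p) / 3.0
            new.extend([p, p + d, p + d + rot @ d, p + 2 * d])
        new.append(pts[-1])
        pts = np.array(new)
    return pts

def box_counting_dimension(points, scales):
    """Slope of log N(eps) vs. log(1/eps) over the given box sizes."""
    counts = [len(np.unique(np.floor(points / eps), axis=0)) for eps in scales]
    slope, _ = np.polyfit(np.log(1.0 / scales), np.log(counts), 1)
    return slope

print(box_counting_dimension(koch_curve(), np.logspace(-3, -1, 10)))
# -> close to log 4 / log 3 ~ 1.26
```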
Strange attractors are fractals
[Figures: strange attractors of the logistic map, the Hénon map, and the Rössler system (a = 0.15, b = 0.2, c = 10); fractal dimension of the Rössler attractor ≈ 2.01]
Self-similarity of the logistic attractor
Generalized dimensions
D_k′ ≤ D_k for k′ > k (monotonic decrease with k)
D_0 – Hausdorff (capacity / box-counting) dimension
D_1 – Information dimension
D_2 – Correlation dimension
Static measure of system complexity (degrees of freedom):
Regular dynamics       D integer
Chaotic dynamics       D fractal
Stochastic dynamics    D → ∞
Generalized entropies
K_q′ ≤ K_q for q′ > q (monotonic decrease with q)
K_0 – Topological entropy
K_1 – Metric entropy
K_2 – Correlation entropy
Dynamic measure of system disorder:
Regular dynamics       K = 0
Chaotic dynamics       K > 0
Stochastic dynamics    K → ∞
Lyapunov exponent
Rate of separation of infinitesimally close trajectories
(Sensitivity to initial conditions)
Largest Lyapunov exponent (LLE) λ_1 (often just λ):
Stable fixed point     λ_1 < 0
Regular dynamics       λ_1 = 0
Chaotic dynamics       λ_1 > 0
Stochastic dynamics    λ_1 → ∞
Summary
Regular dynamics       D integer;  λ_1, K = 0
Chaotic dynamics       D fractal;  λ_1, K > 0
Stochastic dynamics    D, λ_1, K → ∞
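These regimes can be checked numerically. A minimal sketch (not from the lecture) estimating the largest Lyapunov exponent of the logistic map x_{n+1} = r x_n (1 − x_n) as the orbit average of log|f′(x_n)|; the parameter values and iteration counts are illustrative:

```python
import numpy as np

def logistic_lle(r, x0=0.4, n_transient=1_000, n_iter=100_000):
    """Average of log|f'(x_n)| along the orbit of x -> r x (1 - x)."""
    x = x0
    for _ in range(n_transient):              # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        acc += np.log(abs(r * (1 - 2 * x)))   # log of the local stretching rate
    return acc / n_iter

print(logistic_lle(4.0))   # chaotic: ~ ln 2 = 0.693 > 0
print(logistic_lle(3.2))   # periodic (regular dynamics): < 0
```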
Today’s lecture
Motivation
Measures of synchronization for continuous data
• Linear measures: Cross correlation, coherence
• Mutual information
• Phase synchronization (Hilbert transform)
• Non-linear interdependences
Measure comparison on model systems
Measures of directionality
• Granger causality
• Transfer entropy
Motivation
Motivation: Bivariate time series analysis
Three different scenarios:
• Repeated measurement from one system (different times)
  → Stationarity, Reliability
• Simultaneous measurement from one system (same time)
  → Coupling, Correlation, Synchronization, Directionality
• Simultaneous measurement from two systems (same time)
  → Coupling, Correlation, Synchronization, Directionality
Synchronization
Etymology:
‘synchronous’ and ‘synchronicity’
σύν (syn = common) and χρόνος (chronos = time)
“Happening at the same time”
But: ‘synchronization’ implies (active) adjustment of the rhythms of different oscillating systems due to some kind of interaction or coupling
Huygens, 1673:
[Huygens: Horologium Oscillatorium. 1673]
Synchronization
• Complete / identical synchronization
  lim_{t→∞} |x(t) − y(t)| = 0
  (only possible for identical systems)
• Phase synchronization
  |m φ_x(t) − n φ_y(t)| ≤ const
  (m:n phase locking, amplitudes uncorrelated)
• Generalized synchronization
  y(t) = ψ(x(t))
  Unidirectionally coupled systems: largest Lyapunov exponent of the responder negative
  [necessary (but not sufficient) condition]
[Pecora & Carroll. Synchronization in chaotic systems. Phys Rev Lett 1990]
Synchronization
[Figures: in-phase synchronization, anti-phase synchronization, synchronization with phase shift, no synchronization]
[Pikovsky & Rosenblum: Synchronization. Scholarpedia (2007)]
Measures of synchronization
Synchronization:
• Cross correlation / Coherence
• Mutual Information
• Index of phase synchronization
  - based on Hilbert transform
  - based on Wavelet transform
• Non-linear interdependence
• Event synchronization
Directionality:
• Non-linear interdependence
• Delay asymmetry
• Transfer entropy
• Granger causality
Linear correlation
Static linear correlation: Pearson’s r
Two sets of data points (x_i, y_i), i = 1, …, N:
r = Σ_i (x_i − x̄)(y_i − ȳ) / sqrt[ Σ_i (x_i − x̄)² · Σ_i (y_i − ȳ)² ]
r = −1  - completely anti-correlated
r = 0   - uncorrelated (linearly!)
r = 1   - completely correlated
Examples: Pearson’s r
[Figure: correlation of x and y for various distributions of (x, y) pairs; Denis Boigelot 2011. For a distribution with zero variance in one variable, r is undefined.]
Cross correlation
Two signals 𝑥𝑛 and 𝑦𝑛 with 𝑛 = 1, … , 𝑁
(Normalized to zero mean and unit variance)
Time domain, dependence on time lag τ:

C_XY(τ) = (1 / (N − τ)) Σ_{n=1}^{N−τ} x_{n+τ} y_n    for τ ≥ 0
C_XY(τ) = C_YX(−τ)                                    for τ < 0

Maximum cross correlation:

C_max = max_τ |C_XY(τ)|
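A minimal sketch of this definition (my own illustration; the lag range and test signals are arbitrary choices):

```python
import numpy as np

def max_cross_correlation(x, y, max_lag=50):
    """C_max = max over lags of |C_XY(tau)| for normalized signals."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    N = len(x)
    vals = []
    for tau in range(-max_lag, max_lag + 1):
        if tau >= 0:
            c = np.dot(x[tau:], y[:N - tau]) / (N - tau)
        else:
            c = np.dot(y[-tau:], x[:N + tau]) / (N + tau)
        vals.append(abs(c))
    return max(vals)

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
y = np.roll(x, 5) + 0.5 * rng.standard_normal(1000)   # y lags x by 5 samples
print(max_cross_correlation(x, y))                    # high (~0.9), found at |tau| = 5
```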
Coherence
Linear correlation in the frequency domain
Cross spectrum:
C_XY(ω) = F_X(ω) · F_Y*(ω)
F – Fourier transform, ω – discrete frequencies, * – complex conjugation
Complex number → Phase
Coherence = normalized power in the cross spectrum:
Γ_XY(ω) = |C_XY(ω)|² / (C_XX(ω) C_YY(ω))
Welch’s method: average over estimated periodograms of subintervals of equal length
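scipy implements exactly this Welch-averaged estimate; a minimal sketch (test signals and segment length nperseg are free choices of mine):

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.7) + rng.standard_normal(t.size)  # shared 10 Hz rhythm

f, Cxy = coherence(x, y, fs=fs, nperseg=512)   # Welch averaging over segments
print(f[np.argmax(Cxy)])                       # peak coherence near 10 Hz
```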
Mutual information
Shannon entropy
H = − Σ_i p_i log₂ p_i
Shannon entropy ~ ‘uncertainty’
Binary probabilities: H(p) = −p log₂ p − (1 − p) log₂(1 − p)
Maximum uncertainty H(p) = 1 at p = 1/2; H(p) = 0 for p = 0 or p = 1
Mutual Information
Marginal Shannon entropy:
H(X) = − Σ_{i=1}^{M} p_x(i) log p_x(i)
Joint Shannon entropy:
H(X, Y) = − Σ_{i,j} p_xy(i, j) log p_xy(i, j)
Mutual Information:
I(X, Y) = Σ_{i,j} p_xy(i, j) log [ p_xy(i, j) / (p_x(i) p_y(j)) ]
The Kullback-Leibler entropy compares two probability distributions;
Mutual Information = KL entropy with respect to independence
Estimation based on k-nearest-neighbor distances:
I(X, Y) = ψ(k) − ⟨ψ(n_x + 1) + ψ(n_y + 1)⟩ + ψ(N)    (ψ – digamma function)
[Kraskov, Stögbauer, Grassberger: Estimating Mutual Information. Phys Rev E 2004]
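The Kraskov estimator cited above is nearest-neighbor based; as a simpler (coarser) alternative, here is a hedged sketch of a binned estimator of I(X, Y), with the bin count as a free parameter:

```python
import numpy as np

def mutual_information_binned(x, y, bins=16):
    """I(X,Y) in nats from a 2D histogram estimate of p_xy."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)    # marginals
    nz = pxy > 0                                 # avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

rng = np.random.default_rng(2)
x = rng.standard_normal(10_000)
print(mutual_information_binned(x, x + 0.5 * rng.standard_normal(10_000)))  # > 0
print(mutual_information_binned(x, rng.standard_normal(10_000)))            # ~ 0
```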
Mutual Information
Properties:
Non-negativity:  I(X, Y) ≥ 0
Symmetry:        I(X, Y) = I(Y, X)
Minimum:         I(X, Y) = 0 for independent time series
Maximum:         I(X, X) = H(X) for identical systems
Venn diagram (set theory):
I(X, Y) = H(X) + H(Y) − H(X, Y)
Cross correlation & Mutual Information
[Figures: C_max and I for example data; value range 0.0–1.0]
Phase synchronization
Phase synchronization
• Definition of a phase
  - Rice phase
  - Hilbert phase
  - Wavelet phase
• Index of phase synchronization
  - Index based on circular variance
  - [Index based on Shannon entropy]
  - [Index based on conditional entropy]
[Tass et al. PRL 1998]
Rice phase
Linear interpolation between ‘marker events’:
- threshold crossings (mostly zero, sometimes after demeaning)
- discrete events (beginning of a new cycle)
Problem: Can be very sensitive to noise
Hilbert phase
Analytic signal:
z(t) = s(t) + i s̃(t) = A_s(t) e^{i φ_s^H(t)}
‘Artificial’ imaginary part (Hilbert transform):
s̃(t) = (ĥ s)(t) = (1/π) P.V. ∫_{−∞}^{∞} s(t′) / (t − t′) dt′
P.V. – Cauchy principal value
Frequency domain: phase shift of the original signal by π/2:
s̃(t) = FT⁻¹[ −i sign(ω) FT[s(t)] ]
Instantaneous Hilbert phase:
φ(t) = arctan( s̃(t) / s(t) )
[Rosenblum et al., Phys. Rev. Lett. 1996]
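A minimal sketch of the Hilbert phase (scipy.signal.hilbert returns the analytic signal; the test signal is an arbitrary choice of mine):

```python
import numpy as np
from scipy.signal import hilbert

fs = 100.0
t = np.arange(0, 10, 1 / fs)
s = np.sin(2 * np.pi * 1.0 * t)            # 1 Hz test signal

z = hilbert(s)                             # analytic signal s(t) + i s~(t)
phase = np.angle(z)                        # instantaneous Hilbert phase
inst_freq = np.diff(np.unwrap(phase)) * fs / (2 * np.pi)
print(inst_freq.mean())                    # ~ 1.0 Hz (edge effects aside)
```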
Wavelet phase
Basis functions with finite support
Example: complex Morlet wavelet
W(t) = ∫ Ψ(t̃ − t) s(t̃) dt̃
Wavelet phase:
φ(t) = arctan( Im W(t) / Re W(t) )
Wavelet = Hilbert + filter
[Quian Quiroga, Kraskov, Kreuz, Grassberger. Phys. Rev. E 2002]
Index of phase synchronization:
Circular variance (CV)
γ_cv = | (1/N) Σ_{j=1}^{N} e^{i [φ₁(t_j) − φ₂(t_j)]} | = 1 − CV
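A minimal sketch combining the Hilbert phase with this index (the two test signals, including the slow phase jitter, are my own illustrative choices):

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(3)
t = np.arange(0, 10, 0.01)
jitter = np.cumsum(0.003 * rng.standard_normal(t.size))    # slow phase diffusion
x = np.sin(2 * np.pi * 2 * t)
y = np.sin(2 * np.pi * 2 * t + 1.0 + jitter)               # ~constant phase shift

dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
gamma = np.abs(np.mean(np.exp(1j * dphi)))                  # = 1 - CV
print(gamma)   # close to 1: phase-synchronized despite the phase shift
```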
Non-linear interdependence
Takens’ embedding theorem
Trajectory x(t) of a dynamical system in d-dimensional phase space ℝ^d.
One observable measured via some measurement function M:
o(t) = M(x(t));  M: ℝ^d → ℝ
It is possible to reconstruct a topologically equivalent attractor via time-delay embedding:
o(t) = [o(t), o(t − τ), o(t − 2τ), …, o(t − (m − 1)τ)]
τ – time lag (delay); m – embedding dimension
[F. Takens. Detecting strange attractors in turbulence. Springer, Berlin, 1980]
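A minimal sketch of time-delay embedding (function name and test observable are my own choices):

```python
import numpy as np

def delay_embed(o, m, tau):
    """Delay vectors [o(t), o(t - tau), ..., o(t - (m-1) tau)] as rows."""
    start = (m - 1) * tau
    return np.column_stack([o[start - k * tau : len(o) - k * tau]
                            for k in range(m)])

o = np.sin(0.1 * np.arange(1000))      # scalar observable
X = delay_embed(o, m=3, tau=15)
print(X.shape)                         # (970, 3): reconstructed trajectory
```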
Non-linear interdependences
(x_i, y_i – delay vectors; N – number of points; k – number of nearest neighbors)
Mean squared distance of x_i to its k nearest neighbors (r_{i,j} – their time indices):
R_i^{(k)}(X) = (1/k) Σ_{j=1}^{k} (x_i − x_{r_{i,j}})²
Y-conditioned mean squared distance (s_{i,j} – time indices of the k nearest neighbors of y_i):
R_i^{(k)}(X|Y) = (1/k) Σ_{j=1}^{k} (x_i − x_{s_{i,j}})²
Mean squared distance to all other points:
R_i^{(N)}(X) = (1/(N − 1)) Σ_{j≠i} (x_i − x_j)²

Nonlinear interdependence S:
S(X|Y) = (1/N) Σ_{i=1}^{N} R_i^{(k)}(X) / R_i^{(k)}(X|Y)
Nonlinear interdependence H:
H(X|Y) = (1/N) Σ_{i=1}^{N} log [ R_i^{(N)}(X) / R_i^{(k)}(X|Y) ]
(and equivalently for S(Y|X) and H(Y|X))

Synchronization:   S_s = [S(X|Y) + S(Y|X)] / 2,   H_s = [H(X|Y) + H(Y|X)] / 2
Directionality:    S_a = [S(X|Y) − S(Y|X)] / 2,   H_a = [H(X|Y) − H(Y|X)] / 2
[Arnhold, Lehnertz, Grassberger, Elger. Physica D 1999]
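A minimal sketch of S(X|Y) for delay-embedded signals, using a KD-tree for the neighbor searches; note that a Theiler window to exclude temporally close neighbors (as is usual in practice) is omitted here for brevity:

```python
import numpy as np
from scipy.spatial import cKDTree

def interdependence_S(X, Y, k=5):
    """S(X|Y) for delay-embedded trajectories X, Y of shape (N, m)."""
    idx_x = cKDTree(X).query(X, k=k + 1)[1][:, 1:]   # k NNs of x_i (self excluded)
    idx_y = cKDTree(Y).query(Y, k=k + 1)[1][:, 1:]   # k NNs of y_i

    def mean_sq_dist(i, nbrs):
        return np.mean(np.sum((X[i] - X[nbrs]) ** 2, axis=1))

    ratios = [mean_sq_dist(i, idx_x[i]) / mean_sq_dist(i, idx_y[i])
              for i in range(len(X))]                # R_i(X) / R_i(X|Y)
    return float(np.mean(ratios))
```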
Event synchronization
Event synchronization
Event times: t_i^x, t_j^y with i = 1, …, m_x; j = 1, …, m_y
Coincidence window τ:
J_ij = 1     if 0 < t_i^x − t_j^y ≤ τ
J_ij = 1/2   if t_i^x = t_j^y   (avoids double-counting)
J_ij = 0     else
Synchronicity (event synchronization):
Q = Σ_{ij} (J_ij + J_ji) / √(m_x m_y)
Delay asymmetry:
q = Σ_{ij} (J_ij − J_ji) / √(m_x m_y)
[Quian Quiroga, Kreuz, Grassberger. Phys Rev E 2002]
Event synchronization
[Figure: Q and q for example event sequences]
[Quian Quiroga, Kreuz, Grassberger. Phys Rev E 2002]
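A minimal sketch of Q and q for two event sequences (function name and test sequences are my own; the sign convention for q follows directly from the formula above):

```python
import numpy as np

def event_sync(tx, ty, tau):
    """Event synchronization Q and delay asymmetry q (window tau)."""
    def c(ta, tb):
        # count events in ta that follow an event in tb within (0, tau]
        count = 0.0
        for a in ta:
            d = a - tb
            count += np.sum((d > 0) & (d <= tau)) + 0.5 * np.sum(d == 0)
        return count
    norm = np.sqrt(len(tx) * len(ty))
    return ((c(tx, ty) + c(ty, tx)) / norm,    # Q: 0 (none) ... 1 (full sync)
            (c(tx, ty) - c(ty, tx)) / norm)    # q: sign indicates which leads

tx = np.array([0.10, 0.52, 1.03, 1.50, 2.02])
ty = tx + 0.01                                 # y always slightly after x
print(event_sync(tx, ty, tau=0.05))            # Q = 1, q = -1 (x precedes y)
```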
Measure comparison on model systems
Measure comparison on model systems
Measures of synchronization:
• Maximum cross correlation C_max
• Mutual Information I
• Index of phase synchronization
  - based on Hilbert transform γ_H*
  - based on Wavelet transform γ_W*
• Non-linear interdependence S
• Event synchronization Q*
Model systems:
• Coupled Hénon maps
• Coupled Rössler systems
• Coupled Lorenz systems
Criteria:
- Degree of monotonicity
- Robustness to noise
Main assumption:
Increase in coupling → Increase in synchronization
[Kreuz, Mormann, Andrzejak, Kraskov, Lehnertz, Grassberger. Phys D 2007]
Model systems & Coupling schemes
Hénon map
• Introduced by Michel Hénon as a simplified model of the
Poincaré section of the Lorenz model
• One of the most studied examples of dynamical systems
that exhibit chaotic behavior
[M. Hénon. A two-dimensional mapping with a strange attractor. Commun. Math. Phys., 50:69, 1976]
Hénon map (standard form):
x_{n+1} = 1 − a x_n² + y_n
y_{n+1} = b x_n          (a = 1.4, b = 0.3)
Coupled Hénon maps
Driver → Responder (unidirectional coupling; identical systems)
Coupling strength: C = 0, …, 0.8 in steps of 0.01 (81 values)
[Figures: coupled Hénon maps / systems]
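A minimal sketch iterating the (uncoupled) Hénon map in its standard form; initial conditions are arbitrary choices of mine:

```python
import numpy as np

def henon(n, a=1.4, b=0.3, x0=0.1, y0=0.1):
    """Iterate x_{n+1} = 1 - a x_n^2 + y_n, y_{n+1} = b x_n."""
    xs, ys = np.empty(n), np.empty(n)
    x, y = x0, y0
    for i in range(n):
        x, y = 1 - a * x * x + y, b * x      # simultaneous update
        xs[i], ys[i] = x, y
    return xs, ys

xs, ys = henon(10_000)   # scatter-plotting (xs, ys) shows the strange attractor
```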
Rössler system
dx/dt = −ω y − z
dy/dt = ω x + a y
dz/dt = b + z (x − c)
(a = 0.15, b = 0.2, c = 10)
• designed in 1976 for purely theoretical reasons
• later found to be useful in modeling equilibrium in chemical reactions
[O. E. Rössler. An equation for continuous chaos. Phys. Lett. A, 57:397, 1976]
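A minimal sketch integrating this system with scipy (initial conditions, time span, and tolerance are illustrative choices; the transient is discarded via t_eval):

```python
import numpy as np
from scipy.integrate import solve_ivp

def rossler(t, u, a=0.15, b=0.2, c=10.0, w=1.0):
    x, y, z = u
    return [-w * y - z, w * x + a * y, b + z * (x - c)]

sol = solve_ivp(rossler, (0, 500), [1.0, 1.0, 0.0],
                t_eval=np.arange(100, 500, 0.05), rtol=1e-8)  # t < 100: transient
x, y, z = sol.y
```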
[Figure: Rössler attractor]
Coupled Rössler systems
Driver → Responder
Parameter mismatch: ω_x = 0.95, ω_y = 1.05
Coupling strength: C = 0, …, 2 in steps of 0.025 (81 values)
[Figures: coupled Rössler systems]
Lorenz system
dx/dt = σ (y − x)
dy/dt = −y − x z + R x
dz/dt = x y − b z
(R = 28, σ = 10, b = 8/3)
• developed in 1963 as a simplified mathematical model for atmospheric convection
• also arises in simplified models for lasers, dynamos, electric circuits, and chemical reactions
[E. N. Lorenz. Deterministic non-periodic flow. J. Atmos. Sci., 20:130, 1963]
[Figure: Lorenz attractor]
Coupled Lorenz systems
Driver → Responder
Small parameter mismatch in the second component
Coupling strength: C = 0, …, 2 in steps of 0.025 (81 values)
[Figures: coupled Lorenz systems]
Noise-free case
Criterion I: Degree of monotonicity
Sequence: s_i with i = 1, …, r (number of coupling strengths)
(Strictly) monotonic dependence: s_i ≤ s_j (s_i < s_j) for i < j
M(s) = 1    - strictly monotonic increase
M(s) = 0    - flat line (or equal amounts of decrease and increase)
M(s) = −1   - strictly monotonic decrease
• Independent of the absolute values of s
• No dependence on the form of the increase / decrease (e.g. polynomial, exponential, etc.)
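One plausible implementation matching these properties (my own sketch, not necessarily the exact definition used in the cited paper): the sign of s_j − s_i averaged over all pairs i < j.

```python
import numpy as np

def monotonicity(s):
    """Average sign of s_j - s_i over all pairs i < j."""
    s = np.asarray(s)
    signs = [np.sign(s[j] - s[i])
             for i in range(len(s)) for j in range(i + 1, len(s))]
    return float(np.mean(signs))   # +1 strictly increasing, -1 decreasing, ~0 flat

print(monotonicity(np.arange(100)))           # 1.0
print(monotonicity(np.sin(np.arange(100))))   # near 0
```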
Degree of monotonicity: Examples
[Figure: example sequences of 100 values (5050 pairs); left: monotonicity M; right: number of positive vs. negative pairs]
Comparison: No Noise
Summary: No-noise comparison
• Results for the Rössler system are more consistent than for the other systems
• Mutual Information is slightly better than cross correlation (non-linearity matters)
• Wavelet phase synchronization is not appropriate for broadband systems (the inherent filtering loses information)
Robustness against noise
Criterion II: Robustness against noise
Noise-to-signal ratio:
NSR = σ_noise / σ_signal = 10^(−2 + 0.1 n) with n = 0, …, 30
Covers the range from 0.01 to 10 (three decades) equidistantly on a logarithmic scale
Critical noise level NSR_C: first crossing of the threshold M_n* = 1/2
Two noise types: White noise, Iso-spectral noise
Example: White noise
[Figures: Hénon, Rössler, and Lorenz systems with additive white noise]
Critical noise level NSR_C: first crossing of the threshold M_n* = 1/2 (the higher, the better)
Comparison: White noise
Summary: White noise
• For the systems, the order is opposite to the noise-free case (Lorenz more robust than Hénon, and Hénon more robust than Rössler)
→ the more monotonic a system was without noise, the less noise is necessary to destroy this monotonicity
• The highest robustness is obtained for cross correlation, followed by mutual information.
Iso-spectral noise: Example
Iso-spectral noise: the Fourier spectrum is complex
Physical phenomenon → Time series

Time domain          Frequency domain
x(t)                 F_x(ω), −∞ < ω < ∞ (complex number → phase)
Amplitude            Frequency amplitude
Autocorrelation      Fourier spectrum
Generation of iso-spectral noise
Phase-randomized surrogates:
• Take the Fourier transform of the original signal
• Randomize the phases
• Take the inverse Fourier transform
→ Iso-spectral surrogate (by construction identical power spectrum, just different phases)
• Add to the original signal with a given NSR
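A minimal sketch of this recipe (my own; the handling of the Nyquist bin is treated loosely here, which is acceptable for a sketch but not for rigorous surrogate testing):

```python
import numpy as np

def add_iso_spectral_noise(x, nsr, seed=None):
    """Add a phase-randomized surrogate of x, scaled to a given NSR."""
    rng = np.random.default_rng(seed)
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0                                # keep the DC component real
    surrogate = np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))
    surrogate *= nsr * x.std() / surrogate.std()   # set sigma_noise / sigma_signal
    return x + surrogate
```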
Lorenz system: Iso-spectral noise
Comparison: Iso-spectral noise
Values beyond the highest NSR_C: threshold M_n* = 1/2 not crossed.
Summary: Iso-spectral noise
• Again, results for the Rössler system are more consistent than for the other systems
• Sometimes M never crosses the critical threshold (the monotonicity of the noise-free case is not destroyed by iso-spectral noise)
• Sometimes synchronization increases with more noise: (spurious) synchronization between the contaminating noise signals; only for narrow-band systems
Correlation among measures
[Figures: pairwise correlations among the measures for the three model systems]
Summary: Correlation
• All correlation values are rather high (minimum: ~0.65)
• Highest correlations between cross correlation and Hilbert phase synchronization
• Event synchronization and Hilbert phase synchronization appear least correlated
• The overall correlation between the two phase synchronization methods is low (but only due to their different frequency sensitivity in the Hénon system)
Overall summary: Comparison of measures
• Capability to distinguish different coupling strengths:
  an obvious and objective criterion exists only in some special cases (e.g., the wavelet phase is not very suitable for a system with a broadband spectrum)
• Robustness against noise varies (an important criterion for noisy data)
→ Pragmatic solution: choose the measure which most reliably yields valuable information (e.g., information useful for diagnostic purposes) in test applications
Measures of directionality
Measures of directionality
Asymmetry:
M(X → Y) ≠ M(Y → X)
M(X → Y) > M(Y → X) → X drives Y
M(X → Y) < M(Y → X) → Y drives X
• Non-linear interdependences
• Delay asymmetry (Event synchronization)
• [ Asymmetric phase synchronization ]
• Granger causality
• Transfer entropy
Granger causality
Aim: Prediction of a signal
Can a prediction which relies only on the information from its
own past (univariate model) be improved by incorporating
past information from the other signal (bivariate model)?
If the variance of the prediction error for the current value
of 𝑥𝑛 is significantly reduced by including past observations
of 𝑦𝑛 , then 𝑦𝑛 can be said to (Granger-)cause 𝑥𝑛 , and vice
versa.
[Granger: Investigating causal relations by econometric models and cross-spectral methods.
Econometrica 37, 424-438 (1969)]
Granger causality
Univariate model:
x_n = Σ_{k=1}^{K} a_k^x x_{n−k} + u_n^x
y_n = Σ_{k=1}^{K} a_k^y y_{n−k} + u_n^y
Bivariate model:
x_n = Σ_{k=1}^{K} a_k^{xy} x_{n−k} + Σ_{k=1}^{K} b_k^{xy} y_{n−k} + u_n^{xy}
y_n = Σ_{k=1}^{K} a_k^{yx} y_{n−k} + Σ_{k=1}^{K} b_k^{yx} x_{n−k} + u_n^{yx}
K – model order; a_k^x, a_k^y, a_k^{xy}, a_k^{yx}, b_k^{xy}, b_k^{yx} – model parameters;
u_n^x, u_n^y, u_n^{xy}, u_n^{yx} – prediction errors; fit via linear regression
[Granger: Investigating causal relations by econometric models and cross-spectral methods.
Econometrica 37, 424-438 (1969)]
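A minimal sketch of this variance-comparison scheme via least squares (function name, model order, and the driven test pair are my own choices; a proper application would add an F-test for significance):

```python
import numpy as np

def granger(x, y, K=5):
    """Log ratio of residual variances: > 0 means y Granger-causes x."""
    N = len(x)
    Xl = np.column_stack([x[K - k:N - k] for k in range(1, K + 1)])  # x lags
    Yl = np.column_stack([y[K - k:N - k] for k in range(1, K + 1)])  # y lags
    target = x[K:]
    r_uni = target - Xl @ np.linalg.lstsq(Xl, target, rcond=None)[0]
    XY = np.hstack([Xl, Yl])
    r_biv = target - XY @ np.linalg.lstsq(XY, target, rcond=None)[0]
    return float(np.log(r_uni.var() / r_biv.var()))

rng = np.random.default_rng(4)
y = rng.standard_normal(2000)
x = np.zeros(2000)
for n in range(1, 2000):                       # x is driven by the past of y
    x[n] = 0.5 * x[n - 1] + 0.8 * y[n - 1] + 0.1 * rng.standard_normal()
print(granger(x, y))    # clearly > 0: y -> x
print(granger(y, x))    # ~ 0: no influence x -> y
```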
Transfer entropy: Conditional entropy
Mutual Information:
I(X, Y) = Σ_{i,j} p_xy(i, j) log [ p_xy(i, j) / (p_x(i) p_y(j)) ]
        = H(X) + H(Y) − H(X, Y)    (Venn diagram, set theory)
Conditional entropy:
H(X | Y) = H(X, Y) − H(Y)
H(Y | X) = H(X, Y) − H(X)
Transfer entropy
T(Y → X) = Σ_{i,j} p_xy(i_{n+1}, i_n^{(k)}, j_n^{(l)}) log [ p_xy(i_{n+1} | i_n^{(k)}, j_n^{(l)}) / p_x(i_{n+1} | i_n^{(k)}) ]
(i_n^{(k)}, j_n^{(l)} – length-k and length-l histories of X and Y)
Transfer entropy T(X → Y): influence from the state of X on the transition in Y
Transfer entropy T(Y → X): influence from the state of Y on the transition in X
[Venn diagram: H(X_n), H(X_{n+1}|X_n), H(Y_n), H(Y_{n+1}|Y_n) connected by T(X → Y) and T(Y → X)]
[T. Schreiber. Measuring information transfer. Phys. Rev. Lett., 85:461, 2000]
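A minimal sketch of a binned transfer entropy estimate with histories k = l = 1 (bin count, test signals, and function name are my own illustrative choices; binned estimators are biased for short, high-dimensional data):

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Binned T(Y -> X) with histories k = l = 1, in nats."""
    dx = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    dy = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    p = np.zeros((bins, bins, bins))
    np.add.at(p, (dx[1:], dx[:-1], dy[:-1]), 1.0)   # joint (x_{n+1}, x_n, y_n)
    p /= p.sum()
    p_xx = p.sum(axis=2)          # p(x_{n+1}, x_n)
    p_xy = p.sum(axis=0)          # p(x_n, y_n)
    p_x = p.sum(axis=(0, 2))      # p(x_n)
    te = 0.0
    for i1, i0, j0 in np.argwhere(p > 0):
        te += p[i1, i0, j0] * np.log(p[i1, i0, j0] * p_x[i0] /
                                     (p_xx[i1, i0] * p_xy[i0, j0]))
    return te

rng = np.random.default_rng(5)
y = rng.standard_normal(20_000)
x = np.zeros(20_000)
for n in range(1, 20_000):                      # y drives x
    x[n] = 0.6 * x[n - 1] + 0.8 * y[n - 1] + 0.3 * rng.standard_normal()
print(transfer_entropy(x, y))   # T(Y -> X) clearly positive
print(transfer_entropy(y, x))   # T(X -> Y) near zero
```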
Today’s lecture
Motivation
Measures of synchronization for continuous data
• Linear measures: Cross correlation, coherence
• Mutual information
• Phase synchronization (Hilbert transform)
• Non-linear interdependences
Measure comparison on model systems
Measures of directionality
• Granger causality
• Transfer entropy
Next lecture
Measures of synchronization for discrete data
(e.g. spike trains)
• Victor-Purpura distance
• Van Rossum distance
• Schreiber correlation measure
• ISI-distance
• SPIKE-distance
Measure comparison
Download