Emergence in Quantitative Systems –
towards a measurable definition
R C Ball, R S MacKay, M Diakonova
Centre for Complexity Science,
University of Warwick
Complexity Science
Key themes
•Systems of many inter-linked variables/components
•Emergent behaviour: not just an obvious scale up of individuals
•Robustness (under perturbation, damage)
•Fluctuations and Noise
Common issues
•Understanding system response
•Forecasting behaviour
•Optimising design (cost, performance, robustness ..)
Other angles
•Large datasets, hidden info (secondary analysis)
•The cost-benefit of [high degrees of] choice
Boundaries
•Complicated is not necessarily complex!
•Problems need inspiration from outside their field, with prospect to return it.
Emergent Behaviour
System + Dynamics
Many internal d.o.f. and/or observe over long times
Properties: averages, correlation functions
Multiple realisations (conceptually)
[Schematic: several realisations plotted against time; statistical properties can be forecast within one realisation but not anticipated across realisations]
Emergent properties
behaviour which
• can be forecast (from preceding history) but
• cannot be anticipated (from independent realisations).
Strong emergence: different realisations (can) differ forever.
MacKay: non-unique Gibbs phase (distribution over configurations for a dynamical system).
Physics example: spontaneous symmetry breaking
• The system makes/inherits one of many equivalent choices of how to order (or statically disorder).
• Fine after you have achieved the insight that there is ordering (maybe a heat capacity anomaly?) and what ordering to look for (no general technique).
Emergence in Quantitative Systems –
towards a measurable definition
Input ideas:
• Shannon: Information → Entropy; transmission → Mutual Information
• Crutchfield: Complexity ↔ Information
• MacKay: Emergence = system evolves to a non-unique state
Emergence measure: Persistent Mutual Information across time.
Entropy & Mutual Information
Shannon 1948
Entropy as (logarithm of) variety of outcomes
$$S_A \;=\; -\sum_{i \in A} p_i \log(p_i) \;=\; \big\langle \log(1/p) \big\rangle \;\le\; \log N_A$$
Mutual information as missing entropy:
$$I_{AB} \;=\; S_A + S_B - S_{AB} \;=\; \sum_{i \in A,\, j \in B} p_{ij}\,\log\frac{p_{ij}}{p_i\,p_j}$$
(for uniform distributions this reduces to $\log\dfrac{N_A N_B}{N_{AB}}$)
[Sketches: sets A and B with varying overlap, illustrating $I > 0$ versus $I = 0$]
- reduction of joint possibilities compared to the independent case;
- measure of information transmission when A = input, B = output.
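A minimal Python sketch of these definitions (not from the slides; the example joint distribution is an assumed illustration): compute $S_A$, $S_B$, $S_{AB}$ from a joint probability table and combine them into $I_{AB}$.

```python
# Minimal sketch: Shannon entropy and mutual information for a discrete
# joint distribution, illustrating I_AB = S_A + S_B - S_AB (natural logs).
import numpy as np

def entropy(p):
    """S = -sum p log p over the non-zero cells of a probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(p_joint):
    """I_AB = S_A + S_B - S_AB for a joint probability table p_joint[i, j]."""
    p_a = p_joint.sum(axis=1)   # marginal distribution of A
    p_b = p_joint.sum(axis=0)   # marginal distribution of B
    return entropy(p_a) + entropy(p_b) - entropy(p_joint)

# Perfectly correlated binary pair: I_AB = log 2 (one shared binary choice)
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(mutual_information(p), np.log(2))
```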
MI-based Measures of Complexity
[Schematic: A = past, B = future along the time axis (analogously in space); each measure below is related to $I_{AB}$]
• Entropy density (rate): Shannon 1948
• Excess Entropy: Crutchfield & Packard 1982
• Statistical Complexity: Shalizi et al., PRL 2004
• Persistent Mutual Information: candidate measure of Emergence
Measurement of Persistent MI
$$\mathrm{PMI} \;=\; \lim_{\tau_0 \to \infty}\ \lim_{\tau \to \infty}\ I\Big(\,[\,t-\tau,\ t\,],\ [\,t+\tau_0,\ t+\tau_0+\tau\,]\,\Big)$$
• Measurement of I itself requires some rounding or discretising of continuous-valued data
• The above seems the safer order of limits, and is computationally practical
•The outer limit may need more careful definition
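A hedged sketch of this measurement recipe on an assumed toy process (not from the slides): each realisation makes a random ±1 choice at t = 0 and keeps it forever, observed through Gaussian noise. The choice can be forecast but not anticipated, so the estimated mutual information between a discretised early sample and a much later one should approach log 2. The noise level, bin count and ensemble size are arbitrary choices.

```python
# Sketch: PMI for a toy 'frozen choice' process; the persistent +/-1 choice
# carries log 2 nats of mutual information between widely separated times.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
n_real, sigma = 20000, 0.3                     # ensemble size, observation noise
choice = rng.choice([-1.0, 1.0], size=n_real)  # made at t = 0, then frozen

past   = choice + sigma * rng.standard_normal(n_real)  # observed at an early time
future = choice + sigma * rng.standard_normal(n_real)  # observed much later

def discretise(x, n_bins=2):
    """Round continuous observations onto n_bins equal-width bins."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    return np.digitize(x, edges[1:-1])

def mutual_information(a, b):
    """Plug-in MI estimate (nats) between two discrete symbol sequences."""
    n = len(a)
    p_ab, p_a, p_b = Counter(zip(a, b)), Counter(a), Counter(b)
    return sum(c / n * np.log((c / n) / ((p_a[i] / n) * (p_b[j] / n)))
               for (i, j), c in p_ab.items())

print(mutual_information(discretise(past), discretise(future)), np.log(2))
```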
Examples with PMI
• Oscillation (persistent phase)
• Spontaneous ordering of spins (magnets)
• Ergodicity breaking (glasses) – pattern of atoms/spins is random but
becomes frozen in over time
Cases without PMI
• Reproducible steady state with (decaying) autocorrelation functions
• Chaotic dynamics
Discrete vs continuous emergent
order parameters
Discrete order parameters are well resolved beyond threshold values of $\tau/\tau_0$.
Resolution of continuous order parameters might improve without limit, e.g. $\delta P \propto (\tau/\tau_0)^{-1/2}$ (time averaging), giving
$$\mathrm{PMI} \;\simeq\; \tfrac{1}{2}\,\log(\tau/\tau_0) + \text{const}.$$
This suggests some need to anticipate “information dimensionalities”
Logistic map
xn1  r xn (1  xn )
x
0
log 8
log 3
log 4
PMI = 0
log 2
log 4
log 2
r
M Diakonova
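The period-4 window, for instance, can be checked directly with a short script (a sketch with assumed parameters: r = 3.5, transient length, separation and bin count): across realisations started from random x0, the orbit freezes onto one of the four phases of the cycle, so the mutual information between widely separated samples approaches log 4.

```python
# Sketch: PMI of the logistic map in its period-4 window (r = 3.5 assumed).
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)
r = 3.5                                   # inside the stable period-4 window
n_real, transient, tau0 = 20000, 2000, 1000

x = rng.uniform(0.01, 0.99, n_real)       # independent realisations
for _ in range(transient):                # relax onto the period-4 attractor
    x = r * x * (1.0 - x)
past = x.copy()                           # observation at time t
for _ in range(tau0):                     # evolve a long, fixed interval further
    x = r * x * (1.0 - x)
future = x                                # observation at time t + tau0

def mutual_information(a, b, n_bins=32):
    """Plug-in MI (nats) after binning the observations onto [0, 1]."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)[1:-1]
    ia, ib = np.digitize(a, edges), np.digitize(b, edges)
    n = len(ia)
    p_ab, p_a, p_b = Counter(zip(ia, ib)), Counter(ia), Counter(ib)
    return sum(c / n * np.log((c / n) / ((p_a[i] / n) * (p_b[j] / n)))
               for (i, j), c in p_ab.items())

print(mutual_information(past, future), np.log(4))   # expect roughly log 4
```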
Measured PMI
[Plot: measured PMI (plateaux at $\log 2$, $\log 4$) compared with the Excess Entropy and Entropy Rate of Feldman, McTague and Crutchfield, arXiv:0806.4789v1]
PMI of Banded Chaos
[Plot: PMI vs $r$ for $3.58 \le r \le 3.68$, panels A to D. M Diakonova 2009; MI by k-neighbours method (Grassberger)]
Period 3 region
[Plot: PMI plateaux at $\log 3$, $\log 6$, $\log 12$; $r$ ranges from 3.835 to 3.856 in 250 increments. M Diakonova 2009; MI by k-neighbours method (Grassberger)]
Period tripling followed by doubling (zoom in on the structure visible after the period-3 doubling)
[Plot: PMI plateaux at $\log 3$, $\log 9$, $\log 18$, $\log 36$; $r$ ranges from 3.8535 to 3.8544 in 250 increments]
Resolution dependent PMI at the Feigenbaum point
[Plot: PMI vs log(longest period resolvable)]
Practical Measurement of Entropy & hence MI
Require - dV ( x) p ( x) log( p ( x))  log( p ( x))
Estimate p( x) 
ki / N
 Vi
Box counting: fix  Vi and measure ki ..... badly biased for log( p)
k  neighbours: fix ki and measure  Vi ..... small correctable bias
k2
Grassberger MI: fix k  k12 in joint distribution to define box,
then box count marginals k1 , k2 .


k12 / N
MI ( X 1 ; X 2 )  log 
  k / N  k / N  
2
 1

bias controllable as k1 , k2  k12
k1 
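A rough Python sketch of this counting scheme, in its naive form without the digamma bias corrections of the full Kraskov-Stögbauer-Grassberger estimator (the correlated-Gaussian test case, sample size and k are assumed choices): fix the distance to the k-th neighbour in the joint space under the max-norm, then count marginal neighbours within that same distance.

```python
# Sketch of the 'fix k in the joint, box-count the marginals' MI estimate.
import numpy as np

def mi_knn(x1, x2, k=4):
    """Naive k-neighbour MI estimate (nats) from 1-D samples x1, x2."""
    n = len(x1)
    d1 = np.abs(x1[:, None] - x1[None, :])       # pairwise distances, marginal 1
    d2 = np.abs(x2[:, None] - x2[None, :])       # pairwise distances, marginal 2
    d12 = np.maximum(d1, d2)                     # max-norm distance in the joint space
    np.fill_diagonal(d12, np.inf)                # exclude self as a neighbour
    eps = np.sort(d12, axis=1)[:, k - 1]         # distance to k-th joint neighbour
    k1 = np.sum(d1 <= eps[:, None], axis=1) - 1  # marginal count within eps (minus self)
    k2 = np.sum(d2 <= eps[:, None], axis=1) - 1
    return np.mean(np.log((k / n) / ((k1 / n) * (k2 / n))))

# Correlated Gaussian pair: exact MI = -0.5 * log(1 - rho^2)
rng = np.random.default_rng(2)
rho, n = 0.8, 1000
z = rng.standard_normal((n, 2))
x1 = z[:, 0]
x2 = rho * z[:, 0] + np.sqrt(1.0 - rho**2) * z[:, 1]
print(mi_knn(x1, x2), -0.5 * np.log(1.0 - rho**2))
```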

Measured MI in case of Fractal scaling
Small distance $\epsilon$ about the measured point:
support (measurement) space: $\delta V_i \sim \epsilon^{d_i}$, $i = 1, 2, 12$; $d_{12} = d_1 + d_2$;
local probability scaling: $k_i/N \equiv \pi_i = p\,\delta V_i \sim \epsilon^{D_i}$; $D_i \le d_i$; $D_{12} \le D_1 + D_2$.
$$MI(X_1; X_2) \;\simeq\; \left\langle \log \frac{k_{12}/N}{(k_1/N)\,(k_2/N)} \right\rangle$$
$k_{12} = k$ fixes $\epsilon \sim (k/N)^{1/D_{12}}$
$\Rightarrow\ k_1/N \sim \epsilon^{D_1} = (k/N)^{D_1/D_{12}}$, $\quad k_2/N \sim (k/N)^{D_2/D_{12}}$,
hence
$$MI(X_1; X_2) \;\simeq\; \Delta\,\log(N/k) + \text{const}, \qquad \Delta = \frac{D_1 + D_2 - D_{12}}{D_{12}},$$
where $\Delta$ is the relative information co-dimension.
Fractal PMI at the Feigenbaum point
$$\Delta \;=\; \frac{D_1 + D_2 - D_{12}}{D_{12}} \;=\; 1, \qquad \text{since } D_1 = D_2 = D_{12} = D(\text{Cantor set}).$$
[Plot: PMI vs log(longest period resolvable)]
Example 2: the Standard Map
pn 1  pn  K sin( xn ) mod 2
xn 1  xn  pn 1
mod 2
Symplectic property:
uniform random distribution in x, p plane exactly preserved.
Marginals: X 1  ( x, p)0 and X 2  ( x, p) ; D1  2 (input )  D2  2.
All the interest lies in joint distribution, as marginals trivial.
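A minimal sketch of generating the joint samples described above (the time separation, ensemble size and K value are assumed): iterate the map from a uniform random ensemble and keep (x, p) at time 0 and at a later time t. The marginals stay uniform; any structure sits in the joint distribution, which can then be fed to an MI estimator such as the k-neighbours scheme sketched earlier.

```python
# Sketch: ensemble of standard-map orbits giving X1 = (x, p)_0 and X2 = (x, p)_t.
import numpy as np

def standard_map(x, p, K, n_steps):
    """Iterate p' = p + K sin(x) (mod 2*pi), x' = x + p' (mod 2*pi)."""
    x, p = x.copy(), p.copy()
    for _ in range(n_steps):
        p = (p + K * np.sin(x)) % (2.0 * np.pi)
        x = (x + p) % (2.0 * np.pi)
    return x, p

rng = np.random.default_rng(3)
K, t, n_real = 0.9716, 100, 10000                # K near Kc; t is the time separation
x0 = rng.uniform(0.0, 2.0 * np.pi, n_real)       # uniform initial ensemble is exactly
p0 = rng.uniform(0.0, 2.0 * np.pi, n_real)       # preserved by the symplectic map
xt, pt = standard_map(x0, p0, K, t)              # X2 = (x, p) at time t; X1 = (x0, p0)
```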
[Phase portraits of the standard map at K = 0.6, K = 0.9716 ≈ Kc, K = 1.2 and K = 2.0; image: Wikipedia "standard map" (Linas)]
Fractal MI in Standard Map
$K \lesssim 1$: approximately Hamiltonian dynamics with one conservation law $\Rightarrow D_{12} = D_1 + D_2 - 1$ and hence $\Delta = 1/3$.
$K_c \simeq 0.97$: breakdown of the last spanning orbit; cf. the maximum of the apparent $\Delta$ at around $K_m \simeq 0.8$ (???)
[Plot: apparent $\Delta$ vs $K$]
Cross-over scaling

Quasiperiodic regime:
• Long times → full mixing, but in $D = 3$ due to the one conservation law; distance to neighbour set by $\epsilon \sim N^{-1/D}$.
• Short times → local shear: $\delta p' = \delta p$, $\delta x' = \delta x + \tau\,\delta p$. Require $\max(\delta x, \delta p, \delta x', \delta p') \le \epsilon$ with $\delta x\,\delta p = 1/N$, leading to $\epsilon \approx \delta x \approx \tau\,\delta p \approx (\tau/N)^{1/2}$.
• These two limits on $\epsilon$ meet for $N \approx \tau^3 \approx \epsilon^{-3}$.
Chaotic analysis is similar but with $D = 4$ and the crossover at $\lambda\tau \propto \ln N$.
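Filling in the matching step for the quasiperiodic case (a spelled-out step, not on the slide):
$$
N^{-1/3} \;\sim\; \left(\frac{\tau}{N}\right)^{1/2}
\;\;\Longrightarrow\;\; N^{1/6} \sim \tau^{1/2}
\;\;\Longrightarrow\;\; N \sim \tau^{3},
\qquad \epsilon \sim N^{-1/3} \sim \tau^{-1}.
$$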
Interpretation
Separate dynamics in separate regions of phase space:
$$I \;=\; f_q\,\big\langle \ln\rho \big\rangle_q \;+\; f_c\,\big\langle \ln\rho \big\rangle_c \;=\; f_q\,\Delta_q\,\ln(N/k) \;+\; f_c\,\Delta_c\,\ln(N/k) \;+\; \text{const},$$
where $f_q$, $f_c$ are the fractions of quasiperiodic and chaotic orbits and $\rho$ is the pointwise ratio averaged in the MI estimate.
Hence simple averaging for $\Delta$, with a separate crossover for each class of orbit.
Simplest scenario for $\Delta$ with increasing $\tau$ / decreasing resolution:
$$\Delta:\quad 1 \;\longrightarrow\; 1\cdot f_q + 0\cdot f_c \;\longrightarrow\; \tfrac{1}{3}\,f_q$$
(chaotic mixing drives the first crossover, shear mixing of the quasiperiodic orbits the second).
A definition of Emergence
• System self-organises into a non-trivial behaviour;
– different possible instances of that behaviour;
– choice unpredictable but
– persists over time (or other extensive coordinate).
• Quantified by PMI = entropy of choice
Shortcomings
• Assumes system/experiment conceptually repeatable
• Measuring MI requires deep sampling
• Appropriate mathematical limits need careful construction
Generalisations
• PMI as function of resolution → fractal information
• PMI as function of timescale probed
• Other extensive coordinates could play the role of time
Ref: R C Ball, M Diakonova, R S MacKay, Adv. in Complex Systems 13, 327-338 (2010)
Ack’t: EPSRC funding of Warwick Complexity Science Doctoral Training Centre
[Figure: standard map results at K = 0.5, K = 0.971635 and K = 5]
Logistic map
xn1  r xn (1  xn )
PMI = 0
x
log 2
log 4
log 2
log 8
r
log 4
log 3
Graph III – general PMI v r
(500 values of r between 3 and 4. r = 3 not included,
because convergence is very very slow)
First direct
measurements
PMI / ln2
r
r
Issue of time windows and limits
[Plot: PMI$/\log 2$ vs length of past and future windows, for several lengths of the "present" gap, at $r = 3.58$ (PMI$/\log 2 = 2$); short-time correlations affect one end, undersampling of long strings the other]