Estimating the Hurst
Exponent with Wavelets
and Other Methods
The George Washington University
Astrophysics Group
Glen MacLachlan
with Alex Bridi, Junaid Ghauri, Shihao Guo, Rob Coyne,
Ashwin Shenoy, Tilan Ukwatta, David Morris,
Ali Eskandarian, Kalvir Dhuga, Leonard Maximon,
and William Parke
GRB Temporal Analysis OSU Workshop, June 29, 2010
Table of contents
Introduction
Statistical Self-Similarity, Hurst Exponents, & fBms
Some Intuition
Various Techniques For Estimating H in Long GRBs
Box Counting
Zero-Counting
Rescaled Range Analysis
Measure of Variance–The Width Function
A Wavelet Method
GRB Analysis
Light Curves
Results
Extra Slides
Introduction
• Pioneering work in self-similarity was first published in
1951 by Hurst.
• Hurst asked what should be the minimum size of a
reservoir so that it neither overflows nor runs dry.
• Unexpected observation ... levels were not independent of one another but instead exhibited memory of past events.
• Time-series from many physical systems (including GRBs) display some form of self-similarity.
Physics Motivation
• Probe variability of GRB lightcurves
• Learn something about the central engine that powers the
eruption of a GRB
Hurst Exponents and Self-Similarity in
Astronomy & Astrophysics
• Wavelet and R/S analysis of the X-ray flickering of cataclysmic variables
G. Anzolin, F. Tamburini, D. de Martino, A. Bianchini
A&A, forthcoming; DOI: 10.1051/0004-6361/201014297
• Analysis of the white-light flickering of the intermediate polar V709 Cassiopeiae with wavelets and Hurst analysis
F. Tamburini, D. de Martino, A. Bianchini
A&A 502, 1 (2009) 1-5; DOI: 10.1051/0004-6361/200911656
Statistical Self-Similarity and H
Rough ← 0 < H < 1 → Smooth
A self-similar series may be sub-divided into three categories:
1. A series with 1/2 < H < 1 is persistent or long-range dependent.
2. A series with 0 < H < 1/2 is anti-persistent.
3. A series with H = 1/2 is uncorrelated: neither persistent nor anti-persistent.
The Hurst exponent is a valuable piece of information because
it allows for a model-independent characterization of the data.
Fractional Brownian Motion
• Fractional Brownian motions (fBm’s) are a useful model for
studying self-similarity and long-range dependence.
• Characterized by a single parameter, H , the Hurst
exponent.
• Non-stationary
• Classic Brownian Motion, H = 1/2.
• Self-similar over a range of scales after a rescaling of axes:
B_H(t) = a^{-H} B_H(at).
B_H(t) and a^{-H} B_H(at) appear to be sampled from the same distribution.
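As a concrete illustration, here is a minimal sketch (assuming Python with NumPy; the function name fbm_cholesky and all parameter choices are ours, not from the talk) that draws an exact fBm sample from the standard fBm covariance, Cov(B_H(t), B_H(s)) = (|t|^{2H} + |s|^{2H} - |t - s|^{2H})/2, and checks the increment scaling E[(B_H(t+k) - B_H(t))^2] ∝ k^{2H} implied by self-similarity:

import numpy as np

def fbm_cholesky(n, H, T=1.0, seed=0):
    # Draw one exact fBm path on n grid points in (0, T] via a Cholesky
    # factor of the fBm covariance matrix.
    t = np.linspace(T / n, T, n)                              # strictly positive times
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (np.abs(s)**(2 * H) + np.abs(u)**(2 * H) - np.abs(s - u)**(2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))           # tiny jitter for stability
    return t, L @ np.random.default_rng(seed).standard_normal(n)

# Increment-variance check: E[(B_H(t + k dt) - B_H(t))^2] should scale as k^(2H).
t, x = fbm_cholesky(2048, H=0.3)
lags = np.array([1, 2, 4, 8, 16])
v = [np.mean((x[k:] - x[:-k])**2) for k in lags]
print("fitted 2H:", np.polyfit(np.log(lags), np.log(v), 1)[0])   # expect roughly 0.6

The Cholesky route is exact but O(n^3); FFT-based (circulant-embedding) generators are the usual choice for long synthetic series.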
Wide Sense Stationarity
• Mean and autocovariance independent of time:
E{X(t)} = µ
E{X(t_1) X(t_2)} = γ(t_1 − t_2) = γ(τ)
• (Non-)stationarity is a property of the process
• fBm's are non-stationary; their increments are stationary
• GRB light curves are non-stationary
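A small sketch of the stationarity point (Python/NumPy, reusing the hypothetical fbm_cholesky generator from the earlier sketch): across an ensemble of fBm paths, the variance of the process itself grows with time, while the variance of a fixed-lag increment does not depend on where it is taken.

import numpy as np

# Reuses the illustrative fbm_cholesky generator defined in the earlier sketch.
paths = np.array([fbm_cholesky(512, H=0.7, seed=s)[1] for s in range(200)])

var_path = paths.var(axis=0)                    # Var[B_H(t)] across realizations: grows as t^(2H)
var_incr = np.diff(paths, axis=1).var(axis=0)   # variance of unit-lag increments: flat in t
print("process variance, early vs late times:  ", var_path[50], var_path[450])
print("increment variance, early vs late times:", var_incr[50], var_incr[450])

The first pair should differ by roughly (450/50)^{2H} ≈ 20; the second pair should agree to within sampling noise.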
Intuitive Statistical Self-Similarity
H = 1/2
H = 1 Correlation one to one
H = 0 Correlation lost
fBm’s With Various H’s
Box-Counting
• Box counting presents a way of determining the Hurst exponent via Fractal Dimension.
• Series will have some characteristic hyper-volume V
• Covered by some number of boxes, N, at linear scale ε:
V = N ε^D
log N = −D log ε + log V
• Hurst exponent related to Fractal Dimension:
H = 2 − D
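A rough box-counting sketch along these lines (Python/NumPy; box_count_H, the box sizes, and the column-wise counting are illustrative choices, not the talk's implementation): cover the graph with boxes of side ε, fit log N against log ε to get D, and return H = 2 − D.

import numpy as np

def box_count_H(x, sizes=(8, 16, 32, 64, 128)):
    # Normalize the graph into the unit square, then for each box side eps = 1/m
    # count how many eps-boxes are needed, column by column, to cover the curve.
    y = (x - x.min()) / np.ptp(x)
    eps, counts = [], []
    for m in sizes:
        e = 1.0 / m
        cols = np.array_split(y, m)                         # one column per box width
        counts.append(sum(int(np.ptp(c) / e) + 1 for c in cols))
        eps.append(e)
    D = -np.polyfit(np.log(eps), np.log(counts), 1)[0]      # log N = -D log eps + log V
    return 2.0 - D                                          # H = 2 - D

x = np.cumsum(np.random.default_rng(1).standard_normal(4096))  # ordinary Brownian motion
print(box_count_H(x))                                           # should land near H = 1/2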
Zero-Counting
• A 1-d version of Box-Counting
• Arrange non-overlapping windows of size l across the horizontal axis
• If the series crosses zero within a window, z(l) = 1; otherwise z(l) = 0
• Plot Σ z(l) versus l on a log-log plot.
The slope of the plot is the fractal dimension D = 1 − H.
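A sketch of this recipe (Python/NumPy; zero_count_H and the window sizes are illustrative choices): for each window size l, flag the windows in which the mean-subtracted series changes sign, sum the flags, and read off D = 1 − H from the log-log slope.

import numpy as np

def zero_count_H(x, sizes=(4, 8, 16, 32, 64)):
    y = x - x.mean()                                 # count crossings of the mean level
    L, Z = [], []
    for l in sizes:
        nwin = len(y) // l
        z = sum(1 for w in range(nwin)
                if y[w * l:(w + 1) * l].min() < 0.0 < y[w * l:(w + 1) * l].max())
        L.append(l)
        Z.append(z)
    # The number of occupied windows scales as l^(-D) with D = 1 - H.
    D = -np.polyfit(np.log(L), np.log(Z), 1)[0]
    return 1.0 - D

x = np.cumsum(np.random.default_rng(2).standard_normal(4096))  # Brownian motion, H = 1/2
print(zero_count_H(x))                                          # noisy, but near 0.5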
Rescaled Range Analysis
• Time-series Y_n, 0 ≤ n ≤ N − 1
• R_n = max(Y_n) − min(Y_n)
• S_n = standard deviation of the increments of Y_n
log(R_n/S_n) ∝ H log(n),
where R_n/S_n is the rescaled range.
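A sketch following the slide's convention (Python/NumPy; rs_H and the window lengths are illustrative): Y is the cumulative, fBm-like series, R is its range within a window, S is the standard deviation of its increments there, and log(R/S) is regressed on log(n).

import numpy as np

def rs_H(x, sizes=(16, 32, 64, 128, 256, 512)):
    ns, rs = [], []
    for n in sizes:
        vals = []
        for w in range(len(x) // n):
            seg = x[w * n:(w + 1) * n]
            R = np.ptp(seg)                          # R_n = max(Y_n) - min(Y_n)
            S = np.std(np.diff(seg))                 # S_n = std of the increments
            if S > 0:
                vals.append(R / S)
        ns.append(n)
        rs.append(np.mean(vals))
    return np.polyfit(np.log(ns), np.log(rs), 1)[0]  # slope of log(R/S) vs log(n) ~ H

x = np.cumsum(np.random.default_rng(3).standard_normal(4096))  # Brownian motion, H = 1/2
print(rs_H(x))                                                  # should come out near 0.5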
Width Measurement
• Variance of some fBm trace, X_t, will be proportional to |t|^{2H}
• Compute H from the slope of a log-log plot of w(l) versus l, where
w(l) = ⟨⟨(x − ⟨x⟩_l)^2⟩_l⟩^{1/2}
and l is the window width.
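A sketch of the width function (Python/NumPy; width_H and the window widths are illustrative): the inner average is the variance about the mean within each window of width l, the outer average runs over the windows, and the log-log slope of w(l) against l estimates H.

import numpy as np

def width_H(x, sizes=(8, 16, 32, 64, 128, 256)):
    L, w = [], []
    for l in sizes:
        nwin = len(x) // l
        var_l = [np.var(x[k * l:(k + 1) * l]) for k in range(nwin)]  # inner <.>_l
        w.append(np.sqrt(np.mean(var_l)))                            # outer <.>, then square root
        L.append(l)
    return np.polyfit(np.log(L), np.log(w), 1)[0]                    # w(l) ~ l^H

x = np.cumsum(np.random.default_rng(4).standard_normal(4096))  # Brownian motion, H = 1/2
print(width_H(x))                                               # should come out near 0.5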
Discrete Wavelet Transform
• Wavelet ψ_{j,k} ... Little Wave
• Rescaled, translated versions of itself:
ψ_{j,k}(t) = 2^{−j/2} ψ(2^{−j} t − k)
• Encodes series information in details d_{j,k} = ⟨X, ψ_{j,k}⟩
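For concreteness, a minimal sketch of a discrete wavelet decomposition in Python using PyWavelets (our library choice; the talk does not name an implementation). pywt.wavedec returns the approximation plus one set of detail coefficients d_{j,k} per scale.

import numpy as np
import pywt  # PyWavelets

x = np.cumsum(np.random.default_rng(5).standard_normal(4096))   # stand-in series
coeffs = pywt.wavedec(x, 'haar', level=8)                        # [approx, d_8, d_7, ..., d_1]
details = coeffs[1:][::-1]                                       # reorder: d_1 (finest) ... d_8 (coarsest)
for j, d in enumerate(details, start=1):
    print(f"scale j = {j}: {len(d)} detail coefficients")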
Wavelet Variances
• Compute wavelet variance (of the detail coefficients at each scale j):
var(d_{j,k}) = (1/n_j) Σ_{k=0}^{n_j−1} |d_{j,k}|^2
• And plot log2 of the variances versus scale, j:
log2(var(d_{j,k})) = (2H + 1) j + constant
• Slope is α = 2H + 1. Note that the slope of the stationary derivative process, α_0, is related to α as α = α_0 + 2 and to H as α_0 = 2H − 1.
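Putting the two previous slides together, a sketch of the wavelet estimator (Python with NumPy and PyWavelets; wavelet_H, the Haar choice, and the plain unweighted fit are our simplifications, whereas a full analysis would weight each scale, e.g. using the confidence intervals discussed later in the talk):

import numpy as np
import pywt  # PyWavelets

def wavelet_H(x, wavelet='haar', levels=8):
    coeffs = pywt.wavedec(x, wavelet, level=levels)
    details = coeffs[1:][::-1]                            # d_1 (finest) ... d_levels (coarsest)
    j = np.arange(1, levels + 1)
    logvar = [np.log2(np.mean(d**2)) for d in details]    # log2 var(d_j) per scale
    alpha = np.polyfit(j, logvar, 1)[0]                   # slope alpha = 2H + 1
    return (alpha - 1.0) / 2.0                            # H = (alpha - 1) / 2

x = np.cumsum(np.random.default_rng(6).standard_normal(4096))   # Brownian motion, H = 1/2
print(wavelet_H(x))                                              # should come out near 0.5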
Logscale Diagrams
Wavelet De-noise Detour
Calibration Test
Calibrated analyses with synthetic fBm's:
1. Box-Counting
2. Width Function
3. Zero-Counting
4. R/S
5. Wavelet Decompositions (Multiple Bases)
fBm Results From Different Methods
fBm Results From Different Methods
GRB Data
• Analyzed 396 Long GRBs
• Data discussed by Tilan Ukwatta
• Binned to have approximately equal length
• Time increments from 1 ms to 128 ms
GRB Data
fBm traces with H = 0.1, 0.22, 0.5, and 0.9
Logscale Diagram
log2(var(d_{j,k})) = (2H + 1) j + constant
Another Histogram of Summed Logscale Diagrams
Dispersion is attributed to the alignment of the light curve with the wavelet.
It is common to assume 95% confidence intervals around the variances.
Fitted Logscale Diagram
Wavelet Results
Summary
• Discussed several methods for extracting H
• Calibrated with fBms ... wavelets do very well
• Extracted H from GRB time-series
• Found GRBs to be anti-persistent
• Published analysis of CV data suggests persistent
behavior (Anzolin et al.)
Future Work
• What do theoretical models of GRBs say?
• Repeat analysis with GBM data
• Some GRBs exhibit multi-scaling which requires further
analysis
Haar Basis
Daubechies 4
Daubechies 6
Coiflet 6
Coiflet 12
Coiflet 18
The Logscale Surface Diagram
Haar Logscale Surface
D4 Logscale Surface
C18 Logscale Surface
Reverse-Tail Concatenation
Addresses the circularity assumption when |X_0 − X_{N−1}| is significantly different from zero.
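A one-line sketch of what reverse-tail concatenation can look like in Python/NumPy (the exact recipe used for the talk's analysis may differ): the series is extended with its own reversed copy, so the first and last samples of the extended signal coincide and the circular boundary assumed by the DWT introduces no artificial jump.

import numpy as np

x = np.array([0.0, 1.0, 3.0, 2.0, 5.0])   # toy series whose endpoints differ
x_ext = np.concatenate([x, x[::-1]])      # reverse-tail concatenation: ends of x_ext now match
print(x_ext)                              # [0. 1. 3. 2. 5. 5. 2. 3. 1. 0.]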
Intuitive Statistical Self-Similarity
Relating D to H
• N rectangles with area: ε × ε^H = ε^{H+1}
• Replace with squares with area: ε^2
• N* squares required to cover the same area
• Ratio N/N* = ε^2/ε^{H+1}, or N = N* ε^{1−H}
• Substitute for N in N ε = V → N* ε^{2−H} = V
The quantity in curly brackets is the fractal dimension D = 2 − H.
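Written out as a single chain of substitutions (restating the bullets above):
N ε^{H+1} = N* ε^2  ⇒  N = N* ε^{1−H};  combined with N ε = V this gives  N* ε^{2−H} = V,  i.e. V = N* ε^D with D = 2 − H.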
A Box-Counting Demo
N = 8, ε = 1/8
A Box-Counting Demo
N = 8, ε = 1/4
A Box-Counting Demo
N = 4, ε = 1/2
fBm Results From Different Methods
Hurst R/S
Mandelbrot R/S
Box Count
Zero Crossings
Wavelet H – Cut 2
Background
GRB041223 w/ Uncertainties
Some XMM CVs