Stochastic processes
Lecture 6
Power spectral density (PSD)
1
Random process
2
1st order Distribution & density
function
First-order distribution
First-order density function
3
2nd order Distribution & density function
2nd order distribution
2nd order density function
4
EXPECTATIONS
• Expected value
• The autocorrelation
5
Some random processes
• Single pulse
• Multiple pulses
• Periodic Random Processes
• The Gaussian Process
• The Poisson Process
• Bernoulli and Binomial Processes
• The Random Walk / Wiener Processes
• The Markov Process
6
Single pulse
Single pulse with random amplitude and arrival time:
X(t) = A·S(t − Θ)
A and Θ are statistically independent
[Figure: nerve spike pulse realizations, amplitude vs t (ms)]
Deterministic pulse:
S(t): Deterministic function.
Random variables:
A: gain, a random variable
Θ: arrival time.
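A minimal simulation sketch in Python/NumPy (not from the slides): it assumes a Gaussian-shaped pulse S(t), a Gaussian-distributed gain A and a uniformly distributed arrival time Θ, since the actual pulse shape and distributions are not given here.

import numpy as np

rng = np.random.default_rng(0)

def S(t, width=0.1):
    # assumed deterministic pulse shape (Gaussian bump)
    return np.exp(-0.5 * (t / width) ** 2)

t = np.linspace(0, 3, 1000)        # time axis in ms
A = rng.normal(1.0, 0.3)           # random gain (assumed Gaussian)
theta = rng.uniform(0.5, 2.5)      # random arrival time (assumed uniform)
x = A * S(t - theta)               # one realization of X(t) = A*S(t - Theta)

Summing several such terms with independent A_k and Θ_k gives the multiple-pulse model on the next slide.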
7
Multiple pulses
Sum of pulses with random amplitudes and arrival times:
x(t) = Σ_{k=1}^{n} A_k·S(t − Θ_k)
Deterministic pulse:
S(t): Deterministic function.
Random variables:
Ak: gain, a random variable
Θk: arrival time.
n: number of pulses
Ak and Θk are statistically independent
[Figure: multiple nerve spikes, amplitude vs t (ms)]
8
Periodic Random Processes
• A process which is periodic with period T:
x(t) = x(t + nT)
where n is an integer
Example: x(t) = sin(2πt/50 + Θ) + sin(2πt/100 + Θ)
[Figure: one realization of the signal X(t) with period T = 100, t from 0 to 1000]
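A short sketch of how one realization of the example above could be generated in Python/NumPy; the distribution of the random phase Θ is not stated on the slide, so a uniform phase over [0, 2π) is assumed.

import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 1000)
theta = rng.uniform(0, 2 * np.pi)   # random phase (distribution assumed)
x = np.sin(2 * np.pi * t / 50 + theta) + np.sin(2 * np.pi * t / 100 + theta)
# x repeats with period T = 100 samples, the common period of the two sines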
9
The Gaussian Process
• X(t1), X(t2), X(t3), …, X(tn) are jointly Gaussian for all t and n values
• Example: randn() in Matlab
[Figure: a Gaussian process realization (10,000 samples) and its histogram]
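A minimal Python/NumPy equivalent of the Matlab randn() example (the plotting details are assumed, not taken from the slide):

import numpy as np
import matplotlib.pyplot as plt

x = np.random.default_rng(2).standard_normal(10_000)   # i.i.d. Gaussian samples

fig, (ax1, ax2) = plt.subplots(1, 2)
ax1.plot(x)
ax1.set_title("Gaussian process")
ax2.hist(x, bins=50)
ax2.set_title("Histogram of Gaussian process")
plt.show()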
10
The Poisson Process
• Typically used for modeling of cumulative
number of events over time.
– Example: counting the number of phone calls from a phone
πœ† 𝑑
𝑃 𝑋 𝑑 =π‘˜ =
π‘˜!
π‘˜
𝑒 −πœ†(𝑑)
11
Alternative definition
Poisson points
• The number of events in an interval
N(t1,t2)
P(N(t1, t2) = k) = P(X(t2) − X(t1) = k) = [λ(t2 − t1)]^k / k! · e^(−λ(t2 − t1))
P(N(0, t) = k) = P(X(t) = k) = (λt)^k / k! · e^(−λt)
12
Bernoulli Processes
• A process of zeros and ones
X=[0 0 1 1 0 1 0 0 1 1 1 0]
The samples are independent and identically distributed Bernoulli variables (a simulation sketch follows below).
– The probability of a 1 is p
– The probability of a 0 is q = 1 − p
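A minimal simulation sketch in Python/NumPy (p = 0.5 is an example value):

import numpy as np

p = 0.5
rng = np.random.default_rng(4)
X = (rng.random(12) < p).astype(int)   # i.i.d. Bernoulli samples, e.g. [0 0 1 1 0 ...]
Y = np.cumsum(X)                       # running sum: a binomial (counting) process, cf. the next slide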
13
Binomial process
• Summed Bernoulli processes:
Y[n] = Σ_{k=1}^{n} X[k]
where X[n] is a Bernoulli process
14
Random walk
• Every T seconds, take a step (of size Δ) to the left or right after tossing a fair coin
[Figure: several random walk realizations, x[n] vs n for n = 0 to 50]
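A minimal Python/NumPy sketch of the random walk (a step size Δ = 1 is assumed):

import numpy as np

rng = np.random.default_rng(5)
delta, n_steps = 1.0, 50
steps = rng.choice([-delta, delta], size=n_steps)   # fair coin: +/- delta each step
x = np.concatenate(([0.0], np.cumsum(steps)))       # random walk x[n], starting at 0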
15
The Markov Process
• 1st order Markov process
– The current sample depends only on the previous sample
Density function: f(x_n | x_{n−1}, …, x_1) = f(x_n | x_{n−1})
Expected value: E[X_n | X_{n−1}, …, X_1] = E[X_n | X_{n−1}]
16
The frequency of earthquakes
• Statement: the number of large earthquakes has increased dramatically in the last 10 years!
17
The frequency of earthquakes
• Is the frequency of large earthquakes unusually high?
• Which random processes can we use for modeling the earthquake frequency?
18
The frequency of earthquakes
• Data
• http://earthquake.usgs.gov/earthquakes/eqarchives/year/graphs.php
19
Agenda (Lec 16)
• Power spectral density
– Definition and background
– Wiener-Khinchin
– Cross spectral densities
– Practical implementations
– Examples
20
Fourier transform recap 1
Transform between time and frequency domain
Fourier transform: X(f) = ∫_{−∞}^{∞} x(t)·e^(−j2πft) dt
Inverse Fourier transform: x(t) = ∫_{−∞}^{∞} X(f)·e^(j2πft) df
21
Fourier transform recap 2
• Assumption: The signal can be reconstructed
from sines and cosines functions.
e^(−j2πft) = cos(2πft) − j·sin(2πft)
[Figure: real and imaginary parts of e^(jωn), amplitude vs sample index n]
• Requirement: absolutely integrable
Σ_{n=−∞}^{∞} |x[n]| < ∞
22
Fourier transform of a stochastic
process
• A stationary stochastic process is typically not absolutely integrable
• Therefore the signal is truncated before the Fourier transform
23
What is power?
• In the power spectral density, power is related to electrical power:
P = V² / R
24
Power of a signal
• The power of a signal is calculated by squaring the signal: x(t)²
• The average power over one period T is:
P = (1/T) ∫_{0}^{T} x(t)² dt
25
Parseval's theorem
• The integral of the squared absolute Fourier transform is equal to the power of the signal (a numerical check follows below):
∫_{−∞}^{∞} x(t)² dt = ∫_{−∞}^{∞} |X(f)|² df
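A quick numerical check of Parseval's theorem for a discrete signal, using the NumPy FFT convention (the test signal is arbitrary):

import numpy as np

x = np.random.default_rng(6).standard_normal(1024)
X = np.fft.fft(x)

time_power = np.sum(x ** 2)
freq_power = np.sum(np.abs(X) ** 2) / len(x)   # 1/N factor from the DFT convention
print(np.allclose(time_power, freq_power))     # True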
26
Power of a stochastic process
• Thereby the expected power can be calculated from the Fourier spectrum
27
Power spectrum density
• Since the integral of the squared absolute Fourier transform contains the full power of the signal, it is a density function.
• So the power spectral density of a random process is:
Sxx(f) = lim_{T→∞} (1/2T)·E[ |X_T(f)|² ]
• Due to the absolute value the PSD is always real
28
PSD Example
[Figure: the Fourier transform of a signal and the corresponding PSD Sxx(f), f from −15 to 15]
29
Power spectrum density
• The PSD is a density function.
– In the case of the random process the PSD is the
density function of the random process and not
necessarily the frequency spectrum of a single
realization.
• Example
– A random process is defined as X(t) = sin(ω_r·t)
– where ω_r is a uniformly distributed random variable with a range from 0 to π
– What is the PSD of the process, and
– what is the power spectrum of a single realization?
30
PSD of random process versus
spectrum of deterministic signals
• In the case of a random process, the PSD is usually the expected value E[Sxx(f)]
• In the case of deterministic signals the PSD is exact (though there is still estimation error in practice)
31
Properties of the PSD
1. Sxx(f) is real and nonnegative
2. The average power in X(t) is given by:
E[X²(t)] = Rxx(0) = ∫_{−∞}^{∞} Sxx(f) df
3. If X(t) is real, Rxx(τ) and Sxx(f) are also even:
Sxx(−f) = Sxx(f)
4. If X(t) has periodic components, Sxx(f) has impulses
5. Independent of phase
32
Wiener-Khinchin 1
• If X(t) is stationary in the wide sense, the PSD is the Fourier transform of the autocorrelation:
Sxx(f) = ∫_{−∞}^{∞} Rxx(τ)·e^(−j2πfτ) dτ
Proof: page 175
33
Wiener-Khinchin
Two methods for estimation of the PSD (a numerical comparison follows below):
[Diagram: route 1: X(t) → Fourier transform → X(f) → |X(f)|² → Sxx(f); route 2: X(t) → autocorrelation → Rxx(τ) → Fourier transform → Sxx(f)]
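A sketch comparing the two routes numerically for a discrete signal; it assumes the biased autocorrelation estimate and a zero-padded FFT so that both routes use the same frequency grid.

import numpy as np

rng = np.random.default_rng(7)
N = 256
x = rng.standard_normal(N)

# Route 1: periodogram, (1/N) * |FFT of x|^2, zero-padded to 2N-1 points
X = np.fft.fft(x, 2 * N - 1)
S1 = np.abs(X) ** 2 / N

# Route 2: FFT of the biased autocorrelation estimate
r = np.correlate(x, x, mode="full") / N   # lags -(N-1) .. (N-1)
r = np.roll(r, -(N - 1))                  # put lag 0 first for the FFT
S2 = np.fft.fft(r).real

print(np.allclose(S1, S2))                # True: both routes give the same PSD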
34
The inverse Fourier Transform of the
PSD
• Since the PSD is the Fourier transform of the autocorrelation,
• the inverse Fourier transform of the PSD is the autocorrelation:
Rxx(τ) = ∫_{−∞}^{∞} Sxx(f)·e^(j2πfτ) df
35
Cross spectral densities
• If X(t) and Y(t) are two jointly wide-sense stationary processes, the cross spectral density is:
Sxy(f) = ∫_{−∞}^{∞} Rxy(τ)·e^(−j2πfτ) dτ
• or
Syx(f) = ∫_{−∞}^{∞} Ryx(τ)·e^(−j2πfτ) dτ
36
Properties of Cross spectral densities
1. Since Rxy(τ) = Ryx(−τ), it follows that Sxy(f) = Syx(−f) = Syx*(f)
2. Syx(f) is not necessarily real
3. If X(t) and Y(t) are orthogonal
Sxy(f)=0
4. If X(t) and Y(t) are independent
Sxy(f)=E[X(t)] E[Y(t)] δ(f)
37
Cross spectral densities example
• 1 Hz sine curves in white noise:
X(t) = sin(2πt) + 3·w(t)
Y(t) = sin(2πt + π/2) + 3·w(t)
where w(t) is Gaussian noise
[Figure: signals X(t) and Y(t) over 20 s, and the Welch cross power spectral density estimate, power/frequency (dB/Hz) vs frequency (Hz)]
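A sketch of the same experiment with scipy.signal.csd; the sampling rate and segment length are assumed values, not from the slide.

import numpy as np
from scipy.signal import csd

fs = 50                                  # sampling rate in Hz (assumed)
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(8)
w = rng.standard_normal(t.size)          # Gaussian noise, shared by X and Y as in the slide

X = np.sin(2 * np.pi * t) + 3 * w
Y = np.sin(2 * np.pi * t + np.pi / 2) + 3 * w

f, Sxy = csd(X, Y, fs=fs, nperseg=256)   # Welch cross power spectral density estimate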
38
Implementation issues
• The challenges include
– Finite signals
– Discrete time
39
The periodogram
The estimate of the PSD
• The PSD can be estimated from the autocorrelation
Sxx(ω) = Σ_{m=−(N−1)}^{N−1} Rxx[m]·e^(−jωm)
• Or directly from the signal (a sketch follows below)
Sxx(ω) = (1/N)·| Σ_{n=0}^{N−1} x[n]·e^(−jωn) |²
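A minimal periodogram sketch in Python/NumPy, estimating the PSD directly from the signal (the test signal is arbitrary):

import numpy as np

rng = np.random.default_rng(9)
N = 512
x = rng.standard_normal(N)

X = np.fft.rfft(x)
Sxx = np.abs(X) ** 2 / N        # periodogram: (1/N) |sum_n x[n] e^(-j w n)|^2
freqs = np.fft.rfftfreq(N)      # normalized frequency (cycles per sample)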
40
The discrete version of the
autocorrelation
Rxx(τ) = E[X(t)·X(t+τ)] ≈ Rxx[m]
τ = m, where m is an integer (sample index)
Rxx[m] = Σ_{n=0}^{N−|m|−1} x[n]·x[n+m]
N: number of samples
Normalized version (see the sketch below):
Rxx[m] = (1/N) Σ_{n=0}^{N−|m|−1} x[n]·x[n+m]
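A sketch of the normalized (biased) estimate using np.correlate:

import numpy as np

def autocorr_biased(x):
    # normalized (biased) autocorrelation estimate, lags -(N-1) .. N-1
    N = len(x)
    lags = np.arange(-(N - 1), N)
    return lags, np.correlate(x, x, mode="full") / N

x = np.random.default_rng(10).standard_normal(12)   # N = 12 as on the next slide
lags, Rxx = autocorr_biased(x)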
41
Bias in the estimates of the
autocorrelation
N = 12
Rxx[m] = Σ_{n=0}^{N−|m|−1} x[n]·x[n+m]
[Figure: overlap of x[n] and x[n+m] for m = −10, 0 and 4, illustrating the edge effect in the autocorrelation estimate]
42
Bias in the estimates of the
autocorrelation
• The edge effect corresponds to multiplying the true autocorrelation with a Bartlett window
E[Rxx[m]] = w_b[m]·Rxx_unbiased[m]
w_b[m] = (N − |m|) / N   for |m| < N
w_b[m] = 0               otherwise
[Figure: Bartlett window w[m], m from −15 to 15]
43
Alternative estimation of
autocorrelation
• The unbiased estimate
Rxx[m] = (1 / (N − |m|)) Σ_{n=0}^{N−|m|−1} x[n]·x[n+m]
[Figure: biased and unbiased autocorrelation estimates Rxx[m], m from −15 to 15]
• Disadvantage: high variance when |m|→N
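A sketch of the unbiased estimate, dividing each lag by N − |m| instead of N:

import numpy as np

def autocorr_unbiased(x):
    # unbiased autocorrelation estimate: divide each lag by N - |m|
    N = len(x)
    lags = np.arange(-(N - 1), N)
    return lags, np.correlate(x, x, mode="full") / (N - np.abs(lags))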
44
Influence on the power spectrum
• Biased version: a Bartlett window is applied
Sxx(ω) = Σ_{m=−∞}^{∞} w_b[m]·Rxx_unbiased[m]·e^(−jωm)
• Unbiased version: a rectangular window is applied
Sxx(ω) = Σ_{m=−∞}^{∞} w_r[m]·Rxx_unbiased[m]·e^(−jωm)
w_r[m] = 1   for |m| < N
w_r[m] = 0   otherwise
45
Example
[Figure: biased and unbiased autocorrelation estimates Rxx[m] (m from −15 to 15) and the corresponding estimated PSDs Sxx(ω)]
46
Variance in the PSD
• The variance of the periodogram is approximately the square of the PSD:
Var[Sxx(ω)] = Sxx(ω)²
[Figure: three realizations of a random process and their periodograms compared with the true PSD Sxx(f), f from 0 to 200 Hz]
47
Averaging
• Divide the signal into K segments of length M
x_i = x[(i−1)·M + 1 : i·M],   1 ≤ i ≤ K
• Calculate the periodogram of each segment
S_xx^i(ω) = (1/M)·| Σ_{n=0}^{M−1} x_i[n]·e^(−jωn) |²
• Calculate the average periodogram (see the sketch below)
Sxx(ω) = (1/K) Σ_{i=1}^{K} S_xx^i(ω)
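A sketch of the averaged (Bartlett-style) periodogram; the segment length M is an example value.

import numpy as np

def averaged_periodogram(x, M):
    # split x into K non-overlapping segments of length M and average their periodograms
    K = len(x) // M
    segments = x[:K * M].reshape(K, M)
    P = np.abs(np.fft.rfft(segments, axis=1)) ** 2 / M
    return P.mean(axis=0)

x = np.random.default_rng(11).standard_normal(4096)
Sxx = averaged_periodogram(x, M=256)     # lower variance, coarser frequency resolution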
48
Illustrations of Averaging
[Figure: a signal X(t) split into segments, the periodogram of each segment, and the averaged periodogram vs f (Hz)]
49
Effect of Averaging
• The variance is decreased
π‘‰π‘Žπ‘Ÿ 𝑆π‘₯π‘₯ πœ”
=
1
𝑆π‘₯π‘₯ (πœ”) 2
𝐾
• But the spectral resolution is also decreased
50
Additional options
The Welch method
• Introduce overlap between segments
x_i = x[(i−1)·Q + 1 : (i−1)·Q + M],   1 ≤ i ≤ K
– where Q is the offset between segment start points
• Multiply the segments with windows (a sketch using scipy follows below)
S_xx^i(ω) = (1/M)·| Σ_{n=0}^{M−1} w[n]·x_i[n]·e^(−jωn) |²
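In practice this is what scipy.signal.welch implements; a sketch with an assumed sampling rate, segment length and 50% overlap:

import numpy as np
from scipy.signal import welch

fs = 400                                           # sampling rate in Hz (assumed)
x = np.random.default_rng(12).standard_normal(8 * fs)

# 256-sample Hann-windowed segments with 50% overlap
f, Sxx = welch(x, fs=fs, window="hann", nperseg=256, noverlap=128)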
51
Example
• Heart rate variability
• http://circ.ahajournals.org/cgi/content/full/93/5/1043#F3
• The high-frequency component is related to the parasympathetic nervous system ("rest and digest")
• The low-frequency component is related to the sympathetic nervous system (fight or flight)
52
Agenda (Lec 16)
• Power spectral density
– Definition and background
– Wiener-Khinchin
– Cross spectral densities
– Practical implementations
– Examples
53