Lecture 10

Stochastic Processes
Definition and Classification

$X(\zeta, t)$: stochastic process:
$$X : \Omega \times T \to \mathbb{R}, \qquad (\zeta, t) \mapsto X(\zeta, t)$$
where $\Omega$ is a sample space and $T$ is time. $\{X(\zeta, t)\}$ is a family of r.v. defined on $\{\Omega, \mathcal{A}, P\}$ and indexed by $t \in T$.
• For a fixed $\zeta_0$, $X(\zeta_0, t)$ is a function of time, or a sample function.
• For a fixed $t_0$, $X(\zeta, t_0)$ is a random variable, or a function of $\zeta \in \Omega$.
Discrete-time random process: $n \in T$, $X(\zeta, n)$.
Continuous-time random process: $t \in T$, $X(\zeta, t)$.
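A minimal simulation sketch of these two views (not from the lecture; the random-amplitude process and the sizes are arbitrary choices) stores the ensemble as a matrix, so a row is a sample function and a column is a random variable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble matrix view of X(zeta, t): rows are sample functions (fixed zeta),
# columns are random variables (fixed t). Process and sizes are demo choices.
n_paths, n_times = 1000, 50
t = np.arange(n_times)

A = rng.uniform(0, 1, size=(n_paths, 1))   # one draw of zeta per row
X = A * np.cos(0.2 * t)                    # shape (n_paths, n_times)

sample_function = X[0, :]    # fixed zeta_0: a function of time
random_variable = X[:, 10]   # fixed t_0 = 10: a r.v. over the sample space
```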
Probabilistic characterization of $X(\zeta, t)$

$\{X(\zeta, t)\}$ is an infinite family of r.v., so we need to know:

1. First-order p.d. or distribution
$$F(x, t) = P[X(\zeta, t) \le x] \quad \text{for all } t, \qquad f(x, t) = \frac{\partial F(x, t)}{\partial x}$$

2. Second-order p.d. or distribution
$$F(x_1, x_2, t_1, t_2) = P[X(\zeta, t_1) \le x_1, X(\zeta, t_2) \le x_2] \quad \text{for all } t_1, t_2 \in T$$
$$f(x_1, x_2, t_1, t_2) = \frac{\partial^2 F(x_1, x_2, t_1, t_2)}{\partial x_1 \partial x_2}$$

3. $n$-th order p.d. or distribution
$$F(x_1, \dots, x_n, t_1, \dots, t_n) = P[X(\zeta, t_1) \le x_1, \dots, X(\zeta, t_n) \le x_n] \quad \text{for all } t_1, \dots, t_n \in T$$
$$f(x_1, \dots, x_n, t_1, \dots, t_n) = \frac{\partial^n F(x_1, \dots, x_n, t_1, \dots, t_n)}{\partial x_1 \cdots \partial x_n}$$
Remark
The $n$-th order densities/distributions must also satisfy:
• Symmetry:
$$F(x_1, \dots, x_n, t_1, \dots, t_n) = F(x_{i_1}, \dots, x_{i_n}, t_{i_1}, \dots, t_{i_n})$$
for any permutation $\{i_1, \dots, i_n\}$ of $\{1, 2, \dots, n\}$.
• Consistency in marginals:
$$F(x_1, \dots, x_{n-1}, t_1, \dots, t_{n-1}) = F(x_1, \dots, x_{n-1}, \infty, t_1, \dots, t_n)$$
Kolmogorov’s Theorem
If $F(x_1, \dots, x_n, t_1, \dots, t_n)$ for all sets of $\{t_i\} \in T$ satisfies the symmetry and consistency conditions, then there corresponds a stochastic process with these distributions.
Second-order Properties
The first-order $F(x, t)$, $f(x, t)$ and the second-order $F(x_1, x_2, t_1, t_2)$, $f(x_1, x_2, t_1, t_2)$ distributions/densities are not sufficient to characterize $X(\zeta, t)$, but they provide important information:
• Mean function $\eta_X(t)$:
$$\eta_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x f(x, t)\, dx$$
a function of $t$ that depends on the first-order p.d.
• Auto-covariance function $C_X(t_1, t_2)$:
$$C_X(t_1, t_2) = E[(X(t_1) - \eta(t_1))(X(t_2) - \eta(t_2))] = \underbrace{E[X(t_1) X(t_2)]}_{R_X(t_1, t_2)} - \eta(t_1)\eta(t_2)$$
$R_X(t_1, t_2)$ is the autocorrelation function, a function of $t_1$, $t_2$ and the second-order probability density:
$$R_X(t_1, t_2) = E[X(t_1) X(t_2)] = \iint x_1 x_2 f(x_1, x_2, t_1, t_2)\, dx_1\, dx_2$$
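As a rough numerical companion (the random-phase cosine below is an arbitrary test process, not one from the lecture), the mean and autocorrelation/autocovariance functions can be estimated by ensemble averages over sample paths:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble estimates of eta_X(t), R_X(t1, t2) and C_X(t1, t2) from sample
# paths; the random-phase process here is an arbitrary demo choice.
n_paths, n_times = 5000, 30
t = np.arange(n_times)
Theta = rng.uniform(-np.pi, np.pi, size=(n_paths, 1))
X = np.cos(0.3 * t + Theta)                 # shape (n_paths, n_times)

eta_hat = X.mean(axis=0)                    # estimate of eta_X(t)
R_hat = (X.T @ X) / n_paths                 # estimate of R_X(t1, t2)
C_hat = R_hat - np.outer(eta_hat, eta_hat)  # C_X = R_X - eta(t1) eta(t2)
```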
Remarks
• Note that
$$C_X(t, t) = E[(X(t) - \eta(t))^2] = \sigma_X^2(t)$$
or the variance of $X(t)$.
• $C_X(t_1, t_2)$ relates r.v.’s at times $t_1$ and $t_2$, i.e., in time, while $C_X(t, t)$ relates the r.v. $X(t)$ with itself, i.e., in space.

Correlation coefficient
$$r_X(t_1, t_2) = \frac{C_X(t_1, t_2)}{\sigma_X(t_1)\,\sigma_X(t_2)}$$

Cross-covariance
For $X(t)$, $Y(t)$ real processes,
$$C_{XY}(t_1, t_2) = E[(X(t_1) - \eta_X(t_1))(Y(t_2) - \eta_Y(t_2))] = \underbrace{E[X(t_1) Y(t_2)]}_{R_{XY}(t_1, t_2)} - \eta_X(t_1)\eta_Y(t_2)$$
$R_{XY}(t_1, t_2)$ is the cross-correlation function.

$X(t)$, $Y(t)$ are uncorrelated iff $C_{XY}(t_1, t_2) = 0$ for all $t_1, t_2 \in T$.
Remark
A zero-mean process $\nu(t)$ is called strictly white noise if $\nu(t_i)$ and $\nu(t_j)$ are independent for $t_i \ne t_j$. It is called white noise if it is uncorrelated for different values of $t$, i.e.,
$$C_\nu(t_i, t_j) = \sigma_\nu^2\,\delta(t_i - t_j) = \begin{cases} \sigma_\nu^2 & t_i = t_j \\ 0 & t_i \ne t_j \end{cases}$$
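A quick check of this definition (assuming i.i.d. Gaussian samples, which are in fact strictly white): the sample covariance matrix should be close to $\sigma_\nu^2$ on the diagonal and zero elsewhere:

```python
import numpy as np

rng = np.random.default_rng(2)

# i.i.d. zero-mean Gaussian samples are (strictly) white, so the sample
# covariance should approximate C_nu(ti, tj) = sigma^2 * delta(ti - tj).
sigma2, n_paths, n_times = 2.0, 20000, 8
nu = rng.normal(0.0, np.sqrt(sigma2), size=(n_paths, n_times))

C_hat = np.cov(nu, rowvar=False)   # columns are the r.v.'s nu(t_0)..nu(t_7)
print(np.round(C_hat, 2))          # ~2.0 on the diagonal, ~0 off-diagonal
```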
Gaussian Processes
$X(t)$ is Gaussian if $\{X(t_1), \dots, X(t_n)\}$ are jointly Gaussian for any value of $n$ and times $\{t_1, \dots, t_n\}$.

Remark
The statistics of a Gaussian process are completely determined by the mean $\eta(t)$ and the covariance $C_X(t_1, t_2)$ functions. Thus
$$f(x, t) \ \text{requires} \ \eta(t),\ C(t, t) = \sigma_X^2(t)$$
$$f(x_1, x_2, t_1, t_2) \ \text{requires} \ \eta(t_1), \eta(t_2), C(t_1, t_2), C(t_1, t_1), C(t_2, t_2)$$
and in general the $n$th-order characteristic function is
$$\Phi_X(\boldsymbol{\omega}, \mathbf{t}) = \exp\Big[\, j \sum_i \omega_i \eta_X(t_i) - 0.5 \sum_{i,k} \omega_i \omega_k C_X(t_i, t_k) \Big]$$
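Since the mean and covariance determine everything, the process restricted to finitely many times can be sampled directly as a multivariate normal. A sketch, with an assumed zero mean and an assumed squared-exponential covariance chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# At times t_1..t_n a Gaussian process is a jointly Gaussian vector, so
# eta(t) and C_X(t1, t2) suffice to sample it. Mean/covariance are assumed.
t = np.linspace(0.0, 1.0, 20)
eta = np.zeros_like(t)                              # assumed mean function
C = np.exp(-10.0 * (t[:, None] - t[None, :])**2)    # assumed C_X(t1, t2)

paths = rng.multivariate_normal(eta, C, size=5)     # five sample functions
```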
Stationarity

Strict Stationarity
$X(t)$ is strictly stationary iff
$$F(x_1, \dots, x_n, t_1, \dots, t_n) = F(x_1, \dots, x_n, t_1 + \tau, \dots, t_n + \tau)$$
for all $n$ and $\tau$ values, i.e., its statistics do not change with time.

If $X(t)$ is strictly stationary then
• its $k$th moments are constant, i.e., $E[X^k(t)] = \text{constant}$ for all $t$;
• $R_X(t, t + \tau) = R_X(0, \tau)$, i.e., it only depends on the lag $\tau$.

Proof
• Using strict stationarity (the first-order density is shift-invariant),
$$E[X^k(t)] = \int_{-\infty}^{\infty} x^k f(x, t)\, dx = \int_{-\infty}^{\infty} x^k f(x, t + \tau)\, dx = E[X^k(t + \tau)]$$
for any $\tau$, which can only happen when it is a constant.
• By strict stationarity,
$$R_X(t, t + \tau) = \iint x_1 x_2 f(x_1, x_2, t, t + \tau)\, dx_1\, dx_2 = \iint x_1 x_2 f(x_1, x_2, 0, \tau)\, dx_1\, dx_2 = R_X(0, \tau)$$

Wide-sense Stationarity (wss)
$X(t)$ is wss if
1. $E[X(t)] = \text{constant}$ and $\mathrm{Var}[X(t)] = \text{constant}$ for all $t$;
2. $R_X(t_1, t_2) = R_X(t_2 - t_1)$.
Examples of random processes

Discrete-time Binomial process
Consider Bernoulli trials where
$$X(n) = \begin{cases} 1 & \text{event occurs at time } n \\ 0 & \text{otherwise} \end{cases}$$
with $P[X(n) = 1] = p$, $P[X(n) = 0] = 1 - p = q$. The discrete-time Binomial process counts the number of times the event occurred (successes) in a total of $n$ trials, or
$$Y(n) = \sum_{i=1}^{n} X(i), \qquad Y(0) = 0,\ n \ge 0$$
Since
$$Y(n) = \sum_{i=1}^{n-1} X(i) + X(n) = Y(n - 1) + X(n)$$
the process is also represented by the difference equation
$$Y(n) = Y(n - 1) + X(n), \qquad Y(0) = 0,\ n \ge 0$$
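A sketch of this recursion ($p$ and the horizon are arbitrary demo values):

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate one sample path of the Binomial counting process with the
# difference equation Y(n) = Y(n-1) + X(n), Y(0) = 0.
p, n = 0.3, 100
X = (rng.uniform(size=n) < p).astype(int)   # Bernoulli(p) trials X(1..n)

Y = np.zeros(n + 1, dtype=int)
for i in range(1, n + 1):
    Y[i] = Y[i - 1] + X[i - 1]              # difference-equation update
# equivalently: Y[1:] = np.cumsum(X)
```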
Characterization of $Y(n)$

First-order p.d.
$$f(y, n) = \sum_{k=0}^{n} P[Y(n) = k]\, \delta(y - k) = \sum_{k=0}^{n} \binom{n}{k} p^k q^{n-k}\, \delta(y - k), \qquad 0 \le k \le n$$
Second- and higher-order p.d. are difficult to find given the dependence of the $Y(n)$’s.
Statistical averages
$$E[Y(n)] = \sum_{i=1}^{n} \underbrace{E[X(i)]}_{p} = np$$
$$\begin{aligned}
\mathrm{Var}[Y(n)] &= E[(Y(n) - np)^2] = E\Big[\Big(\sum_{i=1}^{n} (X(i) - p)\Big)^2\Big] = \sum_{i,j} E[(X(i) - p)(X(j) - p)] \\
&= \sum_{i=1}^{n} \underbrace{E[(X(i) - p)^2]}_{\mathrm{Var}(X(i)) = (1^2 \times p + 0^2 \times q) - p^2 = pq} + \sum_{i \ne j} \underbrace{E[X(i) - p]\, E[X(j) - p]}_{= 0 \text{ if } i \ne j} \quad \text{(independence of the } X(i)\text{)} \\
&= npq
\end{aligned}$$
Notice that $E[Y(n)] = np$ depends on time $n$, and $\mathrm{Var}[Y(n)] = npq$ is also a function of time $n$, so the discrete binomial process is non-stationary.
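A Monte Carlo check of these moments (the values of $p$ and $n$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)

# Check that E[Y(n)] = n*p and Var[Y(n)] = n*p*q grow with n, which is why
# the Binomial process is non-stationary.
p, n_paths = 0.3, 100000
q = 1 - p
for n in (10, 50, 200):
    Y_n = rng.binomial(n, p, size=n_paths)   # Y(n) is Binomial(n, p)
    print(n, Y_n.mean(), n * p, Y_n.var(), n * p * q)
```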
Random Walk Process
Let the discrete binomial process be defined by Bernoulli trials
$$X(n) = \begin{cases} s & \text{event occurs at time } n \\ -s & \text{otherwise} \end{cases}$$
so that
$$Y(n) = \sum_{i=1}^{n} X(i), \qquad Y(0) = 0,\ n \ge 0$$
Possible values at times bigger than zero (for $s = 1$):
$$\begin{aligned}
n = 1 &: \quad Y(1) = 1, -1 \\
n = 2 &: \quad Y(2) = 2, 0, -2 \\
n = 3 &: \quad Y(3) = 3, 1, -1, -3 \\
&\ \ \vdots
\end{aligned}$$
In general, at step $n = n_0$, $Y(n_0)$ can take the values $\{2k - n_0,\ 0 \le k \le n_0\}$; for instance, for $n = 2$, $Y(2)$ can take values $2k - 2$, $0 \le k \le 2$, or $-2, 0, 2$.
Characterization of $Y(n)$

First-order probability mass distribution: $P[Y(n_0) = 2k - n_0]$, $0 \le k \le n_0$.

Convert the random walk process into a binomial process by (for $s = 1$)
$$Z(n) = \frac{X(n) + 1}{2} = \begin{cases} 1 & \text{event occurs at time } n \\ 0 & \text{otherwise} \end{cases}$$
$$Y(n) = \sum_{i=1}^{n} X(i) = \sum_{i=1}^{n} (2Z(i) - 1) = 2\sum_{i=1}^{n} Z(i) - n = 2\tilde{Y}(n) - n$$
where $\tilde{Y}(n)$ is the binomial process in the previous example.

Letting $m = 2k - n_0$, $0 \le k \le n_0$, then we have for $0 \le n_0 \le n$
$$P[Y(n_0) = m] = P[2\tilde{Y}(n_0) - n_0 = 2k - n_0] = P[\tilde{Y}(n_0) = k] = \binom{n_0}{k} p^k q^{n_0 - k}$$

Mean and variance
$$E[Y(n)] = E[2\tilde{Y}(n) - n] = 2E[\tilde{Y}(n)] - n = 2np - n$$
$$\mathrm{Var}[Y(n)] = 4\,\mathrm{Var}[\tilde{Y}(n)] = 4npq$$
Both of which depend on $n$, so the process is nonstationary.
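A sketch checking the p.m.f. of the $\pm 1$ walk against the binomial formula ($p$, $n_0$ and the path count are arbitrary demo values):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(6)

# Since Y(n0) = 2*Ytilde(n0) - n0 with Ytilde Binomial(n0, p), the empirical
# frequencies of the +/-1 walk should match C(n0, k) p^k q^(n0 - k).
p, n0, n_paths = 0.5, 10, 200000
steps = np.where(rng.uniform(size=(n_paths, n0)) < p, 1, -1)
Y = steps.sum(axis=1)                        # Y(n0) for each path

for k in range(n0 + 1):
    m = 2 * k - n0
    est = np.mean(Y == m)
    exact = comb(n0, k) * p**k * (1 - p)**(n0 - k)
    print(m, round(est, 4), round(exact, 4))
```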
Sinusoidal processes
$$X(t) = A\cos(\Omega t + \Theta)$$
with $A$ constant, and frequency $\Omega$ and phase $\Theta$ random and independent. Assume $\Theta \sim U[-\pi, \pi]$. The sinusoidal process $X(t)$ is w.s.s.:
$$\begin{aligned}
E[X(t)] &= A\,E[\cos(\Omega t + \Theta)] = A\,E[\cos(\Omega t)\cos(\Theta) - \sin(\Omega t)\sin(\Theta)] \\
&= A\big[E[\cos(\Omega t)]\underbrace{E[\cos(\Theta)]}_{0} - E[\sin(\Omega t)]\underbrace{E[\sin(\Theta)]}_{0}\big] \quad \text{(independence)} \\
&= 0
\end{aligned}$$
Here
$$E[\sin(\Theta)] = \int_{-\pi}^{\pi} \frac{1}{2\pi}\sin(\theta)\,d\theta = 0$$
the area under one period $T_0 = 2\pi$ of the sinusoid, and likewise $E[\cos(\Theta)] = 0$.

The autocorrelation is
$$\begin{aligned}
R_X(t + \tau, t) &= E[X(t + \tau)X(t)] = A^2 E[\cos(\Omega(t + \tau) + \Theta)\cos(\Omega t + \Theta)] \\
&= \frac{A^2}{2} E[\cos(\Omega\tau) + \cos(2\Omega t + \Omega\tau + 2\Theta)] \\
&= \frac{A^2}{2} E[\cos(\Omega\tau)] + \frac{A^2}{2}\underbrace{\big(E[\cos(2\Omega t + \Omega\tau)]\,E[\cos(2\Theta)] - E[\sin(2\Omega t + \Omega\tau)]\,E[\sin(2\Theta)]\big)}_{0} \\
&= \frac{A^2}{2} E[\cos(\Omega\tau)]
\end{aligned}$$
where the zero term follows from the independence and from $E[\sin(2\Theta)] = E[\cos(2\Theta)] = 0$, by the same reasoning as above. The mean is constant and the autocorrelation depends only on the lag $\tau$, so $X(t)$ is w.s.s.
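A simulation sketch of this w.s.s. behavior; the uniform law assumed for $\Omega$ is an arbitrary choice, since the lecture only requires $\Omega$ to be random and independent of $\Theta$:

```python
import numpy as np

rng = np.random.default_rng(7)

# With Theta ~ U[-pi, pi] independent of Omega, the ensemble mean of X(t)
# should be near 0 and R_X(t + tau, t) should depend on tau only.
n_paths, A, tau = 200000, 1.0, 0.7
Omega = rng.uniform(0.5, 1.5, size=n_paths)   # assumed law for Omega
Theta = rng.uniform(-np.pi, np.pi, size=n_paths)

def X(t):
    return A * np.cos(Omega * t + Theta)

for t in (0.0, 1.0, 2.5):   # shift t with tau fixed; estimates barely change
    print(t, X(t).mean(), (X(t + tau) * X(t)).mean())
```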
$X(t) = A\cos(\omega_0 t)$, $A \sim U[0, 1]$, $\omega_0$ constant. $X(t)$ is non-stationary.

$t = 0$: $X(0) = A\cos(0) = A \sim U[0, 1]$
$t = \pi/(4\omega_0)$: $X(t) = A\cos(\pi/4) = A\sqrt{2}/2 \sim U[0, \sqrt{2}/2]$
$t = \pi/\omega_0$: $X(t) = A\cos(\pi) = -A \sim U[-1, 0]$

For each time the first-order p.d. is different, so the process is not strictly stationary. The process can also be shown not to be wide-sense stationary:
$$\begin{aligned}
E[X(t)] &= \cos(\omega_0 t)\,E[A] = 0.5\cos(\omega_0 t) \\
R_X(t + \tau, t) &= E[A^2]\cos(\omega_0(t + \tau))\cos(\omega_0 t) = \frac{E[A^2]}{2}\big[\cos(\omega_0\tau) + \cos(\omega_0(2t + \tau))\big] \\
R_X(t, t) &= \frac{E[A^2]}{2}\big(1 + \cos(2\omega_0 t)\big), \qquad \mathrm{Var}[X(t)] = R_X(t, t) - (0.5\cos(\omega_0 t))^2
\end{aligned}$$
which gives that the mean and the variance are not constant, and the autocorrelation is not a function of the lag alone; therefore the process is not wide-sense stationary.
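A numeric check that the ensemble mean tracks $0.5\cos(\omega_0 t)$ at the three times in the table above ($\omega_0$ and the path count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(8)

# For X(t) = A cos(w0 t) with A ~ U[0, 1], the ensemble mean follows
# 0.5 cos(w0 t), so it varies with t and the process cannot be stationary.
w0, n_paths = 2.0, 200000
A = rng.uniform(0.0, 1.0, size=n_paths)

for t in (0.0, np.pi / (4 * w0), np.pi / w0):   # the three times above
    Xt = A * np.cos(w0 * t)
    print(t, Xt.mean(), 0.5 * np.cos(w0 * t))
```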
Gaussian processes
Let $X(t) = A + Bt$, where $A$ and $B$ are jointly Gaussian. Determine if $X(t)$ is Gaussian.

Consider
$$\begin{bmatrix} X(t_1) \\ \vdots \\ X(t_n) \end{bmatrix} = \begin{bmatrix} 1 & t_1 \\ \vdots & \vdots \\ 1 & t_n \end{bmatrix} \begin{bmatrix} A \\ B \end{bmatrix}$$
For any $n$ and times $\{t_k\}$, $1 \le k \le n$, the above is a linear transformation of $A$ and $B$, which are jointly Gaussian; then $\{X(t_k)\}$ are jointly Gaussian, and so the process $X(t)$ is Gaussian.
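A small numeric companion: the same linear map gives the mean and covariance of the Gaussian vector in closed form; the numbers assumed below for the mean and covariance of $(A, B)$ are for illustration only:

```python
import numpy as np

# [X(t1), ..., X(tn)]^T = M [A, B]^T with rows [1, tk], so the vector is
# Gaussian with mean M mu and covariance M Sigma M^T. mu, Sigma are assumed.
mu = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
t = np.array([0.0, 0.5, 1.0, 2.0])
M = np.column_stack([np.ones_like(t), t])   # the n x 2 matrix above

mean_X = M @ mu                             # E[X(tk)] = mu_A + mu_B * tk
cov_X = M @ Sigma @ M.T                     # C_X(tj, tk)
print(mean_X)
print(cov_X)
```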
Gaussian processes
Consider a Gaussian process $W(n)$, $-\infty < n < \infty$, such that
$$E[W(n)] = 0 \ \text{for all } n, \qquad R(k, \ell) = \sigma^2\,\delta(k - \ell) = \begin{cases} \sigma^2 & k = \ell \\ 0 & k \ne \ell \end{cases}$$
Such a process is a discrete white noise; determine its $n$th-order p.d. The covariance is
$$C(k, \ell) = R(k, \ell) - E[W(k)]E[W(\ell)] = R(k, \ell)$$
which is zero when $k \ne \ell$, so $W(k)$ and $W(\ell)$ are uncorrelated, and by being Gaussian they are also independent. So
$$f(w_{n_1}, \dots, w_{n_m}) = \prod_i f(w_{n_i})$$