Solution of homework assignment 1
Problem 1.
1. Expectation:
\[
EX_t = \frac{A}{2\pi}\int_0^{2\pi} \cos(\omega_0 t + \varphi)\,d\varphi = 0, \quad \forall t,
\]
because the integral of cos x over a period equals 0. Covariance:
\[
\begin{aligned}
EX_t X_{t+h} &= \frac{1}{2\pi}\int_0^{2\pi} A^2 \cos(\omega_0 t + \varphi)\cos(\omega_0(t+h) + \varphi)\,d\varphi \\
&= \frac{A^2}{2\pi}\int_0^{2\pi} \Bigl[\tfrac{1}{2}\cos\bigl(\omega_0 t + \varphi + \omega_0(t+h) + \varphi\bigr) + \tfrac{1}{2}\cos\bigl(\omega_0 t + \varphi - \omega_0(t+h) - \varphi\bigr)\Bigr]\,d\varphi \\
&= \frac{A^2}{4\pi}\int_0^{2\pi} \bigl[\cos(2\omega_0 t + 2\varphi + \omega_0 h) + \cos(\omega_0 h)\bigr]\,d\varphi \\
&= \frac{A^2}{2}\cos(\omega_0 h).
\end{aligned}
\]
Here the formula 2 cos α cos β = cos(α + β) + cos(α − β) is used. The process is stationary because EXt = 0 is constant and EXt Xt+h depends only on h.
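These calculations can be sanity-checked by Monte Carlo over the uniform phase; the sketch below uses illustrative values A = 2 and ω0 = 0.7 (my choice, not from the assignment).

```python
import numpy as np

rng = np.random.default_rng(0)
A, omega0 = 2.0, 0.7                  # illustrative values, not from the assignment
n = 200_000                           # number of Monte Carlo draws of the phase
phi = rng.uniform(0.0, 2 * np.pi, n)

def X(t):
    """One draw of X_t = A cos(omega0*t + phi) per sampled phase."""
    return A * np.cos(omega0 * t + phi)

t, h = 3.0, 5.0
mean_est = X(t).mean()                # should be close to 0
cov_est = np.mean(X(t) * X(t + h))    # should be close to A^2/2 * cos(omega0*h)
print(mean_est, cov_est, A**2 / 2 * np.cos(omega0 * h))
```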
2. Expectation: EXt = Eξ = 0, ∀t. Covariance: EXt Xt+h = Eξ² = 1, ∀h, t. Thus the process is
stationary.
Problem 2.
1. We have E(Y − c)2 = E(Y − c ± µ)2 = E(Y − µ)2 + (c − µ)2 , which is minimized with respect to c by
the choice c = c∗ = µ.
2. Use the same argument, adding and subtracting E(Y |X):
\[
E\bigl[(Y - f(X) \pm E(Y|X))^2 \,\big|\, X\bigr] = E\bigl[(Y - E(Y|X))^2 \,\big|\, X\bigr] + \bigl(f(X) - E(Y|X)\bigr)^2,
\]
which is minimized by f (X) = E[Y |X].
3. First write E[Y − f (X)]2 = E{E[(Y − f (X))2 |X]} and then proceed as in the previous items.
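A minimal numerical illustration of items 1–2, under an assumed toy model Y = X² + ε (my choice, not from the assignment): the conditional-mean predictor f(X) = E(Y|X) beats the best constant predictor c* = EY.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
X = rng.normal(0.0, 1.0, n)
Y = X**2 + rng.normal(0.0, 1.0, n)        # illustrative model with E(Y|X) = X^2

mse_cond = np.mean((Y - X**2) ** 2)       # MSE of f(X) = E(Y|X); ~ noise variance 1
mse_const = np.mean((Y - Y.mean()) ** 2)  # MSE of the best constant c* = EY (item 1)
print(mse_cond, mse_const)                # the conditional mean wins
```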
Problem 3.
1-2. First two items exactly as in Problem 2, where one needs to substitute Y = Xn+1 and vector
(X1 , . . . , Xn ) instead of X.
3. If X1 , X2 . . . are i.i.d. then according to item 2 the best predictor is E[Xn+1 |X1 , . . . , Xn ] = EXn+1 = µ,
and the minimum mean squared error of this predictor is E(Xn+1 − µ)2 = var(Xn+1 ).
Problem 4.
a. Xt = a + bZt−1 + cZt−3 . The mean function
EXt = a + bEZt−1 + cEZt−3 = a, ∀t.
The autocovariance function
\[
\begin{aligned}
\mathrm{cov}(X_t, X_{t+h}) &= E[(X_t - a)(X_{t+h} - a)] \\
&= E[(bZ_{t-1} + cZ_{t-3})(bZ_{t-1+h} + cZ_{t-3+h})] \\
&= b^2 E(Z_{t-1}Z_{t-1+h}) + bc\,E(Z_{t-1}Z_{t-3+h}) + cb\,E(Z_{t-3}Z_{t-1+h}) + c^2 E(Z_{t-3}Z_{t-3+h}) \\
&= \begin{cases}
\sigma^2(b^2 + c^2), & h = 0, \\
\sigma^2 bc, & |h| = 2, \\
0, & \text{otherwise}.
\end{cases}
\end{aligned}
\]
Hence the process is stationary.
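A quick simulation (illustrative parameters a, b, c, σ of my choosing) reproduces the piecewise autocovariance above.

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, c, sigma = 1.0, 0.5, -0.8, 1.0     # illustrative parameter values
n = 500_000
Z = rng.normal(0.0, sigma, n + 3)        # Z_{-3}, ..., Z_{n-1} in one array
X = a + b * Z[2:n + 2] + c * Z[0:n]      # X_t = a + b*Z_{t-1} + c*Z_{t-3}

def gamma(h):
    """Empirical autocovariance at lag h >= 0 (the mean a is known)."""
    return np.mean((X[: n - h] - a) * (X[h:] - a))

print([round(gamma(h), 3) for h in range(4)])
# gamma(0) ~ sigma^2*(b^2+c^2), gamma(2) ~ sigma^2*b*c, gamma(1), gamma(3) ~ 0
```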
b. Xt = Z1 cos(ωt) + Z2 sin(ωt). The mean function
E(Xt ) = E(Z1 ) cos(ωt) + E(Z2 ) sin(ωt) = 0, ∀t.
The autocovariance function
\[
\begin{aligned}
\mathrm{cov}(X_t, X_{t+h}) &= E(X_t X_{t+h}) \\
&= E\{[Z_1\cos(\omega t) + Z_2\sin(\omega t)][Z_1\cos(\omega(t+h)) + Z_2\sin(\omega(t+h))]\} \\
&= E(Z_1^2)\cos(\omega t)\cos(\omega(t+h)) + E(Z_2^2)\sin(\omega t)\sin(\omega(t+h)) \\
&= \sigma^2[\cos(\omega t)\cos(\omega(t+h)) + \sin(\omega t)\sin(\omega(t+h))] \\
&= \sigma^2\cos(\omega h).
\end{aligned}
\]
Here we have used the fact that Z1 and Z2 are independent, i.e. E(Z1 Z2 ) = 0, and the well–known
formula cos(α − β) = cos(α) cos(β) + sin(α) sin(β). Thus the process is stationary.
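The covariance σ² cos(ωh) can be verified by simulating many draws of (Z1, Z2); the values of σ and ω below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
sigma, omega = 1.5, 0.9               # illustrative values
n = 200_000                           # Monte Carlo replications of (Z1, Z2)
Z1 = rng.normal(0.0, sigma, n)
Z2 = rng.normal(0.0, sigma, n)

def X(t):
    """One draw of X_t = Z1 cos(omega*t) + Z2 sin(omega*t) per replication."""
    return Z1 * np.cos(omega * t) + Z2 * np.sin(omega * t)

t, h = 4.0, 2.0
cov_est = np.mean(X(t) * X(t + h))
print(cov_est, sigma**2 * np.cos(omega * h))  # should agree, independently of t
```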
c. Xt = Zt cos(ωt) + Zt−1 sin(ωt). The mean function
E(Xt ) = E(Zt ) cos(ωt) + E(Zt−1 ) sin(ωt) = 0, ∀t.
Let’s compute cov(Xt , Xt+1 ). We have
\[
\begin{aligned}
\mathrm{cov}(X_t, X_{t+1}) &= E(X_t X_{t+1}) \\
&= E\{[Z_t\cos(\omega t) + Z_{t-1}\sin(\omega t)][Z_{t+1}\cos(\omega(t+1)) + Z_t\sin(\omega(t+1))]\} \\
&= E(Z_t^2)\cos(\omega t)\sin(\omega(t+1)) = \sigma^2\cos(\omega t)\sin(\omega(t+1)).
\end{aligned}
\]
Because cov(Xt , Xt+1 ) depends on t, the process is non–stationary.
d. Xt = a + bZ0 . The process is stationary, EXt = a, cov(Xt , Xt+h ) = b2 σ 2 , ∀h.
e. Xt = Z0 cos(ωt). The process is non-stationary because
\[
\mathrm{var}(X_t) = E(Z_0^2)\cos^2(\omega t) = \sigma^2\cos^2(\omega t)
\]
depends on t.
f. Xt = Zt Zt−2 . The mean function E(Xt ) = E(Zt Zt−2 ) = 0, ∀t.
The autocovariance function
\[
\mathrm{cov}(X_t, X_{t+h}) = E(X_t X_{t+h}) = E(Z_t Z_{t-2} Z_{t+h} Z_{t+h-2}) =
\begin{cases}
\sigma^4, & h = 0, \\
0, & \text{otherwise}.
\end{cases}
\]
Here we have used the fact that {Zt } are independent. In fact, {Xt } is a white noise process!
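A short simulation confirms that {Xt} behaves as white noise with variance σ⁴ (σ = 1 below, illustrative).

```python
import numpy as np

rng = np.random.default_rng(4)
sigma, n = 1.0, 500_000
Z = rng.normal(0.0, sigma, n + 2)
X = Z[2:] * Z[:-2]                    # X_t = Z_t * Z_{t-2}

print(X.mean(), X.var())              # ~ 0 and ~ sigma^4
lag = [np.mean(X[: n - h] * X[h:]) for h in (1, 2, 3)]
print(lag)                            # all ~ 0: {X_t} behaves as white noise
```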
Problem 5.
a. To show that Xt is not an IID noise it is sufficient to note that the distributions of Xt for odd and
even t are not the same. For even t, Xt = Zt ∼ N (0, 1), while for odd t, Xt is distributed as the
“normalized” χ²(1) random variable. Indeed, Zt−1² ∼ χ²(1); hence EZt−1² = 1 and var(Zt−1²) = 2, and
\[
E\Bigl[\tfrac{1}{\sqrt{2}}(Z_{t-1}^2 - 1)\Bigr]^2 = \tfrac{1}{2}\,\mathrm{var}(Z_{t-1}^2) = 1.
\]
Thus, for all t, EXt = 0 and var(Xt) = 1, however with different distributions.
To show that {Xt} is a white noise sequence we note that Xt and Xt+h are independent for |h| ≥ 2, so
EXt Xt+h = 0 for all |h| ≥ 2. Let us compute EXt Xt+1. If t is even then
\[
EX_t X_{t+1} = E\,Z_t \cdot \tfrac{1}{\sqrt{2}}(Z_t^2 - 1) = \tfrac{1}{\sqrt{2}}(EZ_t^3 - EZ_t) = 0.
\]
If t is odd then
\[
EX_t X_{t+1} = E\,\tfrac{1}{\sqrt{2}}(Z_{t-1}^2 - 1)Z_{t+1} = 0.
\]
This shows that Xt is white noise.
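Numerically, the sequence indeed looks like unit-variance white noise, yet it is not IID: for even t the cross-moment E[Xt² Xt+1] equals √2 ≠ 0 (a check of my own, by the same moment computations), which independence would forbid.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 400_000                                   # even length
Z = rng.normal(0.0, 1.0, n)
X = Z.copy()                                  # X_t = Z_t for even t
odd = np.arange(1, n, 2)
X[odd] = (Z[odd - 1] ** 2 - 1) / np.sqrt(2)   # X_t = (Z_{t-1}^2 - 1)/sqrt(2), t odd

print(X.mean(), X.var())                      # ~ 0 and ~ 1 for the pooled sample
lag1 = np.mean(X[:-1] * X[1:])                # ~ 0: the sequence is white noise
even = np.arange(0, n - 1, 2)
dep = np.mean(X[even] ** 2 * X[even + 1])     # ~ sqrt(2) != 0: not independent
print(lag1, dep)
```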
b. If n is even then n + 1 is odd, Xn+1 = (Zn² − 1)/√2, Xn = Zn, and X1, . . . , Xn−1 are determined via Zt
with 1 ≤ t ≤ n − 1. Therefore, by independence of {Zt}, if n is even,
\[
E[X_{n+1} \mid X_1, \dots, X_n] = E\Bigl[\tfrac{1}{\sqrt{2}}(Z_n^2 - 1) \,\Big|\, Z_n\Bigr] = \tfrac{1}{\sqrt{2}}(Z_n^2 - 1) = \tfrac{1}{\sqrt{2}}(X_n^2 - 1).
\]
Now assume that n is odd; then n + 1 is even, Xn+1 = Zn+1 and Xn+1 is independent of X1 , . . . , Xn .
Thus, for odd n
E[Xn+1 |X1 , . . . , Xn ] = EZn+1 = 0.
Therefore the best predictor of Xn+1 on the basis of X1, . . . , Xn is 0 if n is odd, and (Xn² − 1)/√2 if n
is even.
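It is worth noting that for even n the predictor is exact: Xn+1 = (Zn² − 1)/√2 is a deterministic function of Xn = Zn, so the prediction error is identically zero. A simulation sketch (sample sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
N = 100_000

# Even n: X_n = Z_n and X_{n+1} = (Z_n^2 - 1)/sqrt(2)
Zn = rng.normal(0.0, 1.0, N)
X_n, X_np1 = Zn, (Zn**2 - 1) / np.sqrt(2)
pred = (X_n**2 - 1) / np.sqrt(2)              # the best predictor from part b
err_even = np.max(np.abs(X_np1 - pred))       # identically zero

# Odd n: X_{n+1} = Z_{n+1} is independent of the past; best predictor is 0
Znp1 = rng.normal(0.0, 1.0, N)
mse_odd = np.mean(Znp1**2)                    # ~ var(X_{n+1}) = 1
print(err_even, mse_odd)
```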
Problem 6.
a. We have
\[
\mathrm{MSE}(a) = E(X_{n+k} - aX_n)^2 = E(X_{n+k}^2 - 2aX_{n+k}X_n + a^2 X_n^2) = \gamma_X(0) - 2a\gamma_X(k) + a^2\gamma_X(0).
\]
Minimizing the last expression w.r.t. a, we obtain a∗ = γX (k)/γX (0) = ρX (k).
b. Substituting a = a∗ in the above expression for MSE(a) we have MSE(a∗ ) = γX (0)(1 − ρ2X (k)).
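The result can be checked on a concrete stationary process; below an MA(1) with θ = 0.6 (my choice, not from the assignment) is used, scanning a grid of coefficients a.

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n, k = 0.6, 400_000, 1          # illustrative MA(1): X_t = Z_t + theta*Z_{t-1}
Z = rng.normal(0.0, 1.0, n + 1)
X = Z[1:] + theta * Z[:-1]

gamma0 = np.mean(X * X)
gammak = np.mean(X[: n - k] * X[k:])
a_star = gammak / gamma0               # = rho_X(k), the claimed optimal coefficient

grid = np.linspace(-1.0, 1.0, 201)
mse = np.array([np.mean((X[k:] - a * X[: n - k]) ** 2) for a in grid])
a_best = grid[np.argmin(mse)]          # empirical minimizer of MSE(a)
print(a_star, a_best, mse.min(), gamma0 * (1 - a_star**2))
```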
Problem 7.
Let $m_t = \sum_{k=0}^{p} c_k t^k$; then
\[
\begin{aligned}
\Delta m_t &= \sum_{k=0}^{p} c_k t^k - \sum_{k=0}^{p} c_k (t-1)^k = \sum_{k=0}^{p} c_k \bigl[t^k - (t-1)^k\bigr] \\
&= \sum_{k=0}^{p-1} c_k \bigl[t^k - (t-1)^k\bigr] + c_p \bigl[t^p - (t-1)^p\bigr].
\end{aligned}
\]
Using the binomial formula we obtain
\[
t^p - (t-1)^p = t^p - \sum_{j=0}^{p} \binom{p}{j} (-1)^{p-j} t^j = -\sum_{j=0}^{p-1} \binom{p}{j} (-1)^{p-j} t^j,
\]
which is a polynomial of degree p − 1 in t, with leading term p t^{p−1} (the j = p − 1 summand). The remaining terms ck [t^k − (t − 1)^k], k ≤ p − 1, have degree at most p − 2, so ∆mt is a polynomial of degree exactly p − 1.
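The degree reduction is easy to observe numerically: the third difference of a cubic is the constant 3! · c3, and the fourth difference vanishes identically (coefficients below are illustrative).

```python
import numpy as np

t = np.arange(12, dtype=float)
m = 2 + 3 * t - t**2 + 0.5 * t**3      # a degree-3 polynomial trend (illustrative)

d3 = np.diff(m, n=3)                   # third difference: the constant 3! * 0.5 = 3
d4 = np.diff(m, n=4)                   # fourth difference: identically zero
print(d3, d4)
```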
Problem 8.
We have
\[
\begin{aligned}
V_t &= \Delta\Delta_{12} X_t = \Delta\Delta_{12}(a + bt) + \Delta\Delta_{12} s_t + \Delta\Delta_{12} Y_t \\
&= \Delta(s_t - s_{t-12}) + \Delta(Y_t - Y_{t-12}) = Y_t - Y_{t-1} - Y_{t-12} + Y_{t-13},
\end{aligned}
\]
since \Delta_{12}(a + bt) = 12b is constant, so \Delta\Delta_{12}(a + bt) = 0, and s_t - s_{t-12} = 0 by the 12-periodicity of the seasonal component.
Hence by stationarity of Yt
E∆∆12 Xt = E(Yt − Yt−1 − Yt−12 + Yt−13 ) = 2µ − 2µ = 0,
and
\[
\begin{aligned}
\gamma_V(h) &= E(V_t V_{t+h}) \\
&= E\{(Y_t - Y_{t-1} - Y_{t-12} + Y_{t-13})(Y_{t+h} - Y_{t+h-1} - Y_{t+h-12} + Y_{t+h-13})\} \\
&= \gamma_Y(h) - \gamma_Y(h-1) - \gamma_Y(h-12) + \gamma_Y(h-13) \\
&\quad - \gamma_Y(h+1) + \gamma_Y(h) + \gamma_Y(h-11) - \gamma_Y(h-12) \\
&\quad - \gamma_Y(h+12) + \gamma_Y(h+11) + \gamma_Y(h) - \gamma_Y(h-1) \\
&\quad + \gamma_Y(h+13) - \gamma_Y(h+12) - \gamma_Y(h+1) + \gamma_Y(h) \\
&= 4\gamma_Y(h) - 2\gamma_Y(h-1) - 2\gamma_Y(h+1) - 2\gamma_Y(h-12) - 2\gamma_Y(h+12) \\
&\quad + \gamma_Y(h-11) + \gamma_Y(h+11) + \gamma_Y(h-13) + \gamma_Y(h+13).
\end{aligned}
\]
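The sign bookkeeping here is easy to get wrong, so a numerical check is useful. For white-noise Y, γY(h) = σ²1{h=0} (a test case of my choosing), and the formula predicts γV(0) = 4σ², γV(1) = −2σ², γV(11) = σ², γV(12) = −2σ², γV(13) = σ², and 0 at the remaining small lags.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 400_000
Y = rng.normal(0.0, 1.0, n + 13)            # white noise: gamma_Y(h) = 1{h=0}
V = Y[13:] - Y[12:-1] - Y[1:-12] + Y[:-13]  # V_t = Y_t - Y_{t-1} - Y_{t-12} + Y_{t-13}

def gamma_V(h):
    """Empirical autocovariance of V at lag h >= 0 (EV = 0 here)."""
    return np.mean(V[: n - h] * V[h:])

for h in (0, 1, 2, 11, 12, 13):
    print(h, round(gamma_V(h), 3))
```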
Problem 9.
We have
\[
\begin{aligned}
W_t &= \frac{1}{2q+1}\sum_{j=-q}^{q} \{a + b(t+j) + Y_{t+j}\} \\
&= a + \frac{b}{2q+1}\sum_{j=-q}^{q}(t+j) + \frac{1}{2q+1}\sum_{j=-q}^{q} Y_{t+j} \\
&= a + bt + \frac{1}{2q+1}\sum_{j=-q}^{q} Y_{t+j},
\end{aligned}
\]
because \sum_{j=-q}^{q} j = 0.
Therefore EWt = a + bt; hence Wt is not stationary.
\[
\begin{aligned}
\mathrm{cov}(W_t, W_{t+h}) &= \frac{1}{(2q+1)^2}\sum_{j=-q}^{q}\sum_{k=-q}^{q} E\{Y_{t+j} Y_{t+h+k}\} \\
&= \frac{1}{(2q+1)^2}\sum_{j=-q}^{q}\sum_{k=-q}^{q} \gamma_Y(k - j + h).
\end{aligned}
\]
Note that even though Wt is non-stationary, cov(Wt , Wt+h ) does not depend on t.
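This can be checked by simulation with white-noise Y, for which the double sum collapses to σ²(2q + 1 − |h|)/(2q + 1)² when |h| ≤ 2q (parameters below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(9)
a, b, q, sigma = 1.0, 0.3, 2, 1.0      # illustrative values
t, h = 10, 1
N = 200_000                            # independent replications of the Y stretch

# Y_{t-q}, ..., Y_{t+q+h} per replication; white noise: gamma_Y(m) = sigma^2*1{m=0}
Y = rng.normal(0.0, sigma, (N, 2 * q + 1 + h))
W_t = a + b * t + Y[:, : 2 * q + 1].mean(axis=1)
W_th = a + b * (t + h) + Y[:, h : h + 2 * q + 1].mean(axis=1)

cov_est = np.mean((W_t - (a + b * t)) * (W_th - (a + b * (t + h))))
theory = sigma**2 * (2 * q + 1 - abs(h)) / (2 * q + 1) ** 2
print(cov_est, theory)                 # should agree, independently of t
```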