Time Series Analysis, Spring 2015
Assignment 2: Solutions
Kaiji Motegi
Waseda University

Reading: Chapter 5 of Enders (2010), Applied Econometric Time Series.
Problem-1: In the midterm exam, we learned that a martingale difference sequence $\{y_t\}$ is always a white noise. The converse is not true, however. This problem constructs an example of a white noise that is not a martingale difference sequence.

Consider a bilinear process:
$$y_t = b \epsilon_{t-1} y_{t-2} + \epsilon_t, \qquad \epsilon_t \overset{i.i.d.}{\sim} (0, \sigma^2).$$
(a) Show that $y_t = b^2 \epsilon_{t-1}\epsilon_{t-3} y_{t-4} + b \epsilon_{t-1}\epsilon_{t-2} + \epsilon_t$.

(b) Show that $y_t = b^3 \epsilon_{t-1}\epsilon_{t-3}\epsilon_{t-5} y_{t-6} + b^2 \epsilon_{t-1}\epsilon_{t-3}\epsilon_{t-4} + b \epsilon_{t-1}\epsilon_{t-2} + \epsilon_t$.
(c) Assume $|b| < 1$. Keep iterating to show that
$$y_t = \sum_{i=0}^{\infty} b^i \epsilon_{t-2i} \prod_{j=1}^{i} \epsilon_{t-(2j-1)}, \qquad (1)$$
where $\prod_{j=1}^{n} z_j \equiv z_1 \times z_2 \times \cdots \times z_n$ and $\prod_{j=1}^{0} \epsilon_{t-(2j-1)}$ is understood as 1.

(d) Show that $E_{t-1}[y_t] \equiv E[\,y_t \mid \epsilon_{t-1}, \epsilon_{t-2}, \ldots\,] = \sum_{i=1}^{\infty} b^i \epsilon_{t-2i} \prod_{j=1}^{i} \epsilon_{t-(2j-1)}$. (Remark: This result indicates that $\{y_t\}$ is not a martingale difference sequence since $E_{t-1}[y_t] \neq 0$.)
(e) Using Eq. (1), show that $E[y_t] = 0$.

(f) Assume that $b^2\sigma^2 < 1$. Show that $E[y_t^2] = \sigma^2/(1 - b^2\sigma^2)$. (Hint: Since $\{\epsilon_t\}$ is i.i.d., $E[f(\epsilon_t)g(\epsilon_s)] = E[f(\epsilon_t)]E[g(\epsilon_s)]$ for any functions $f(\cdot)$ and $g(\cdot)$ whenever $t \neq s$. For example, $E[\epsilon_t^2 \epsilon_s] = E[\epsilon_t^2]E[\epsilon_s] = \sigma^2 \times 0 = 0$ whenever $t \neq s$.)

(g) Show that $E[y_t y_{t-j}] = 0$ for all $j \geq 1$. (Hint: Use the i.i.d. property of $\{\epsilon_t\}$ again.) (Remark: Parts (e), (f), and (g) indicate that $\{y_t\}$ is a white noise.)
Solution-1: (a) We have that $y_t = b\epsilon_{t-1}(b\epsilon_{t-3} y_{t-4} + \epsilon_{t-2}) + \epsilon_t = b^2 \epsilon_{t-1}\epsilon_{t-3} y_{t-4} + b\epsilon_{t-1}\epsilon_{t-2} + \epsilon_t$.

(b) Iterating back once more, we have that $y_t = b^2 \epsilon_{t-1}\epsilon_{t-3}(b\epsilon_{t-5} y_{t-6} + \epsilon_{t-4}) + b\epsilon_{t-1}\epsilon_{t-2} + \epsilon_t = b^3 \epsilon_{t-1}\epsilon_{t-3}\epsilon_{t-5} y_{t-6} + b^2 \epsilon_{t-1}\epsilon_{t-3}\epsilon_{t-4} + b\epsilon_{t-1}\epsilon_{t-2} + \epsilon_t$.
(c) Keep iterating to realize that $y_t = b^0 \epsilon_{t-2\times 0} + b^1 \epsilon_{t-2\times 1}(\epsilon_{t-1}) + b^2 \epsilon_{t-2\times 2}(\epsilon_{t-1}\epsilon_{t-3}) + b^3 \epsilon_{t-2\times 3}(\epsilon_{t-1}\epsilon_{t-3}\epsilon_{t-5}) + \ldots$. Using the summation and product operators, this can be rewritten compactly as (1).
(d) The first term of the right-hand side of Eq. (1), i.e. $b^0 \epsilon_{t-2\times 0}$, contains $\epsilon_t$. All other terms of the right-hand side contain past values of $\epsilon$ only (e.g. $\epsilon_{t-1}\epsilon_{t-2}$ in the second term). Since $E_{t-1}[\epsilon_t] = 0$ and $\epsilon_{t-1}, \epsilon_{t-2}, \ldots$ are all known given the information set up to time $t-1$, we have that $E_{t-1}[y_t] = \sum_{i=1}^{\infty} b^i \epsilon_{t-2i} \prod_{j=1}^{i} \epsilon_{t-(2j-1)}$.
(e) The unconditional expectation of each term of the right-hand side of Eq. (1) is zero due to the i.i.d. property of $\{\epsilon_t\}$. Hence, $E[y_t] = 0$.
(f) Consider $E[y_t^2] = E[(\epsilon_t + b\epsilon_{t-1}\epsilon_{t-2} + b^2\epsilon_{t-1}\epsilon_{t-3}\epsilon_{t-4} + \ldots)^2]$. The expectations of the cross terms are all zero due to the i.i.d. property of $\{\epsilon_t\}$. Focusing on the squared terms, we have that $E[y_t^2] = \sigma^2 + b^2(\sigma^2)^2 + b^4(\sigma^2)^3 + \cdots = \sigma^2[1 + b^2\sigma^2 + b^4(\sigma^2)^2 + \ldots] = \sigma^2/(1 - b^2\sigma^2)$.
(g) Consider $j = 1$ first. We have that $E[y_t y_{t-1}] = E[(\epsilon_t + b\epsilon_{t-1}\epsilon_{t-2} + b^2\epsilon_{t-1}\epsilon_{t-3}\epsilon_{t-4} + \ldots)(\epsilon_{t-1} + b\epsilon_{t-2}\epsilon_{t-3} + b^2\epsilon_{t-2}\epsilon_{t-4}\epsilon_{t-5} + \ldots)]$. The expectation of each term equals zero due to the i.i.d. property of $\{\epsilon_t\}$ (e.g. $E[b\epsilon_{t-1}\epsilon_{t-2} \times \epsilon_{t-1}] = E[b\epsilon_{t-1}^2 \epsilon_{t-2}] = b \times \sigma^2 \times E[\epsilon_{t-2}] = 0$). The same argument holds for any $j \geq 2$ as well.
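The moments derived in Solution-1 can be verified by a short Monte Carlo sketch (not part of the original assignment). The distribution of $\epsilon_t$ is unspecified in the problem, so Gaussian draws are assumed here, with the illustrative choices $b = 0.5$ and $\sigma = 1$ (so that $b^2\sigma^2 = 0.25 < 1$ and the theoretical variance is $4/3$):

```python
import numpy as np

# Monte Carlo check of Problem-1: the bilinear process
# y_t = b * eps_{t-1} * y_{t-2} + eps_t is white noise but not an MDS.
rng = np.random.default_rng(0)
b, sigma = 0.5, 1.0          # assumed parameters with b^2 * sigma^2 < 1
T = 200_000                  # long sample for accurate moment estimates

eps = rng.normal(0.0, sigma, size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = b * eps[t - 1] * y[t - 2] + eps[t]   # bilinear recursion

print("sample mean     :", y.mean())                 # close to 0           (part e)
print("sample variance :", y.var())                  # close to 4/3         (part f)
print("lag-1 autocov   :", np.mean(y[1:] * y[:-1]))  # close to 0           (part g)
# Not an MDS: y_t is correlated with the past-measurable eps_{t-1} * eps_{t-2};
# indeed E[y_t * eps_{t-1} * eps_{t-2}] = b * sigma^4 = 0.5 here (part d).
print("E[y eps eps]    :", np.mean(y[2:] * eps[1:-1] * eps[:-2]))
```

The last line is the simulation counterpart of part (d): the conditional mean of $y_t$ loads on past shocks, even though all autocovariances are zero.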
Problem-2: Consider a bivariate time series $\{x_t\}$ with $x_t = [x_{1,t}, x_{2,t}]'$. This problem has two similar goals. First, we construct an example where both $\{x_{1,t}\}$ and $\{x_{2,t}\}$ are covariance stationary but $\{x_t\}$ is not jointly covariance stationary. Second, keeping the same example, we show that a linear combination of individually covariance stationary series may not be covariance stationary.
(a) Suppose that $\{b_t\}$ is a sequence of independently and identically distributed mean-centered Bernoulli random variables:
$$b_t = \begin{cases} -1 & \text{with probability } 0.5, \\ \phantom{-}1 & \text{with probability } 0.5. \end{cases}$$
Show that $E[b_t] = 0$, $E[b_t^2] = 1$, and $E[b_t b_{t-j}] = 0$ for any $j \geq 1$.
(b) Assume that $x_{1,2t} = b_{3t}$ and $x_{1,2t+1} = b_{3t+1}$ for each $t$. Also assume that $x_{2,2t} = b_{3t}$ and $x_{2,2t+1} = b_{3t+2}$ for each $t$. These patterns can be summarized in the following table:

        t | 1     2     3     4     5     6     7      ...
  x_{1,t} | b_1   b_3   b_4   b_6   b_7   b_9   b_10   ...
  x_{2,t} | b_2   b_3   b_5   b_6   b_8   b_9   b_11   ...

Show that both $\{x_{1,t}\}$ and $\{x_{2,t}\}$ are covariance stationary.
(c) Show that $\{x_t\}$ is not jointly covariance stationary. (Hint: Joint covariance stationarity requires that the variance $E[(x_t - E[x_t])(x_t - E[x_t])']$ does not depend on $t$.) (Remark 1: As shown here, joint covariance stationarity is stronger than individual covariance stationarity.)
(d) Consider a linear combination $y_t = \beta_1 x_{1,t} + \beta_2 x_{2,t}$ with $\beta_1 \neq 0$ and $\beta_2 \neq 0$. Show that $\{y_t\}$ is not covariance stationary. (Remark 2: We learned in class that any linear combination of a jointly covariance stationary $\{x_t\}$ is covariance stationary. Problem-2 has shown that individual covariance stationarity does not guarantee the covariance stationarity of a linear combination.)
Solution-2: (a) $E[b_t] = (-1) \times 0.5 + 1 \times 0.5 = 0$. $E[b_t^2] = (-1)^2 \times 0.5 + 1^2 \times 0.5 = 1$. $E[b_t b_{t-j}] = E[b_t]E[b_{t-j}] = 0$ for any $j \geq 1$ in view of the independence.

(b) $E[x_{1,t}] = 0$, $E[x_{1,t}^2] = 1$, and $E[x_{1,t} x_{1,t-j}] = 0$ for any $j \geq 1$. $\{x_{1,t}\}$ is therefore covariance stationary. Similarly, $E[x_{2,t}] = 0$, $E[x_{2,t}^2] = 1$, and $E[x_{2,t} x_{2,t-j}] = 0$ for any $j \geq 1$. $\{x_{2,t}\}$ is therefore covariance stationary.
(c) $E[(x_t - E[x_t])(x_t - E[x_t])']$ is equal to $I_2$ if $t$ is odd, while it is equal to a $2 \times 2$ matrix whose elements are all 1 if $t$ is even. Thus, $E[(x_t - E[x_t])(x_t - E[x_t])']$ depends on time $t$.

(d) We have that $E[y_t] = 0$ and $E[(y_t - E[y_t])^2] = E[y_t^2] = \beta_1^2 E[x_{1,t}^2] + 2\beta_1\beta_2 E[x_{1,t} x_{2,t}] + \beta_2^2 E[x_{2,t}^2]$. Hence, $E[y_t^2] = \beta_1^2 + \beta_2^2$ if $t$ is odd, while $E[y_t^2] = \beta_1^2 + 2\beta_1\beta_2 + \beta_2^2$ if $t$ is even. Since $\beta_1 \neq 0$ and $\beta_2 \neq 0$, $E[y_t^2]$ takes a different value for odd $t$ and even $t$. Thus, $\{y_t\}$ is not covariance stationary.
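The alternating variance in Solution-2 (d) can be seen in a simulation sketch (not part of the original assignment). The arbitrary nonzero choice $\beta_1 = \beta_2 = 1$ is assumed, so the odd-$t$ and even-$t$ variances should be near 2 and 4:

```python
import numpy as np

# Simulation check of Problem-2: build x_{1,t} and x_{2,t} from i.i.d. +/-1
# Bernoulli draws b_1, b_2, ... following the table, and verify that
# Var(y_t) alternates between odd t and even t.
rng = np.random.default_rng(1)
n_pairs = 100_000
T = 2 * n_pairs
bern = rng.choice([-1.0, 1.0], size=3 * n_pairs + 3)   # bern[i-1] stores b_i

x1 = np.empty(T)
x2 = np.empty(T)
for t in range(1, T + 1):
    k = t // 2
    if t % 2 == 0:            # even t: x_{1,2k} = x_{2,2k} = b_{3k}
        x1[t - 1] = bern[3 * k - 1]
        x2[t - 1] = bern[3 * k - 1]
    else:                     # odd t: x_{1,2k+1} = b_{3k+1}, x_{2,2k+1} = b_{3k+2}
        x1[t - 1] = bern[3 * k]
        x2[t - 1] = bern[3 * k + 1]

beta1, beta2 = 1.0, 1.0       # any nonzero pair works
y = beta1 * x1 + beta2 * x2
var_odd = np.var(y[0::2])     # expect beta1^2 + beta2^2 = 2
var_even = np.var(y[1::2])    # expect (beta1 + beta2)^2 = 4
print("Var(y_t) for odd t :", var_odd)
print("Var(y_t) for even t:", var_even)
```

At even $t$ the two series load on the same draw $b_{3k}$, which is exactly the cross term $2\beta_1\beta_2$ that breaks stationarity.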
Problem-3: Consider a bivariate VAR(1) process:
$$\underbrace{\begin{bmatrix} x_{1t} \\ x_{2t} \end{bmatrix}}_{\equiv X_t} = \underbrace{\begin{bmatrix} 0.7 & 0.6 \\ 0 & 0.8 \end{bmatrix}}_{\equiv A} \underbrace{\begin{bmatrix} x_{1,t-1} \\ x_{2,t-1} \end{bmatrix}}_{= X_{t-1}} + \underbrace{\begin{bmatrix} \epsilon_{1t} \\ \epsilon_{2t} \end{bmatrix}}_{\equiv \epsilon_t}, \qquad (2)$$
where $\{\epsilon_t\}$ is a joint white noise with
$$E[\epsilon_t \epsilon_t'] = \underbrace{\begin{bmatrix} 1 & 0.5 \\ 0.5 & 1.25 \end{bmatrix}}_{\equiv \Omega} = \underbrace{\begin{bmatrix} 1 & 0 \\ 0.5 & 1 \end{bmatrix}}_{\equiv L} \underbrace{\begin{bmatrix} 1 & 0.5 \\ 0 & 1 \end{bmatrix}}_{= L'}.$$
(a) Show that the VAR(1) process appearing in Eq. (2) is covariance stationary.

(b) Calculate the VMA coefficients $\Theta_0$, $\Theta_1$, $\Theta_2$, $\Theta_3$, $\Theta_4$, and $\Theta_{30}$. (Recall $X_t = \sum_{j=0}^{\infty} \Theta_j \eta_{t-j}$, where $\{\eta_t\}$ is a joint white noise with $E[\eta_t \eta_t'] = I_2$.)

(c) Implement the forecast error variance decomposition of $\{x_{1t}\}$ for horizons $h = 1, 2, 3, 4, 30$.

(d) Implement the forecast error variance decomposition of $\{x_{2t}\}$ for horizons $h = 1, 2, 3, 4, 30$.
Solution-3: (a) Since $A$ is upper triangular, its eigenvalues are the diagonal entries 0.7 and 0.8, both less than one in absolute value. Hence Eq. (2) is a covariance stationary VAR process.
(b) We have that $\Theta_j = A^j L$ for any $j \geq 0$. Hence,
$$\Theta_0 = \begin{bmatrix} 1 & 0 \\ 0.5 & 1 \end{bmatrix}, \quad \Theta_1 = \begin{bmatrix} 1 & 0.6 \\ 0.4 & 0.8 \end{bmatrix}, \quad \Theta_2 = \begin{bmatrix} 0.94 & 0.9 \\ 0.32 & 0.64 \end{bmatrix},$$
$$\Theta_3 = \begin{bmatrix} 0.85 & 1.014 \\ 0.256 & 0.512 \end{bmatrix}, \quad \Theta_4 = \begin{bmatrix} 0.749 & 1.017 \\ 0.205 & 0.410 \end{bmatrix}, \quad \Theta_{30} = \begin{bmatrix} 0.004 & 0.007 \\ 0.001 & 0.001 \end{bmatrix}.$$
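The eigenvalue check in (a) and the coefficients in (b) can be reproduced with a few lines of NumPy (a numerical sketch, not part of the original solutions), using $A$ and $L$ from Eq. (2):

```python
import numpy as np

# Numerical check of Solution-3 (a)-(b): stationarity of the VAR(1) and the
# VMA coefficients Theta_j = A^j L.
A = np.array([[0.7, 0.6],
              [0.0, 0.8]])
L = np.array([[1.0, 0.0],
              [0.5, 1.0]])

# (a) Both eigenvalues of A lie strictly inside the unit circle.
print("eigenvalues of A:", np.sort(np.linalg.eigvals(A)))   # 0.7 and 0.8

# Sanity check: L is the lower-triangular Cholesky factor of Omega.
print("L L' =\n", L @ L.T)                                  # [[1, 0.5], [0.5, 1.25]]

# (b) VMA coefficients Theta_j = A^j L at the requested lags.
for j in (0, 1, 2, 3, 4, 30):
    print(f"Theta_{j} =\n{np.round(np.linalg.matrix_power(A, j) @ L, 3)}")
```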
(c) See Table 1.

Table 1: Forecast Error Variance Decomposition of $\{x_{1t}\}$

              h = 1   h = 2   h = 3   h = 4   ...   h = 30
Share of η1     1     0.847   0.711   0.621   ...   0.445
Share of η2     0     0.153   0.289   0.379   ...   0.555
(d) See Table 2.

Table 2: Forecast Error Variance Decomposition of $\{x_{2t}\}$

              h = 1   h = 2   h = 3   h = 4   ...   h = 30
Share of η1    0.2     0.2     0.2     0.2    ...    0.2
Share of η2    0.8     0.8     0.8     0.8    ...    0.8
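Both tables follow from cumulating squared VMA weights shock by shock. The sketch below (the helper name `fevd` is mine, not from the course) reproduces the decompositions, using the fact that the $h$-step forecast error is $\sum_{j=0}^{h-1} \Theta_j \eta_{t+h-j}$:

```python
import numpy as np

# Numerical check of Solution-3 (c)-(d): forecast error variance decomposition.
# The share of shock eta_k in the h-step forecast error variance of series
# `var` is the cumulative squared (var, k) entry of Theta_j = A^j L,
# normalized by the row total.
A = np.array([[0.7, 0.6], [0.0, 0.8]])
L = np.array([[1.0, 0.0], [0.5, 1.0]])

def fevd(h, var):
    """Shares of eta_1 and eta_2 in the h-step FEV of series `var` (0 or 1)."""
    contrib = np.zeros(2)
    for j in range(h):
        theta = np.linalg.matrix_power(A, j) @ L
        contrib += theta[var, :] ** 2      # squared MA weights, shock by shock
    return contrib / contrib.sum()

for h in (1, 2, 3, 4, 30):
    print(f"h = {h:>2}  x1: {np.round(fevd(h, 0), 3)}  x2: {np.round(fevd(h, 1), 3)}")
```

Because the second row of $\Theta_j$ is proportional to $(0.5, 1) \times 0.8^j$, the shares for $\{x_{2t}\}$ are constant at $(0.2, 0.8)$ for every horizon, as Table 2 shows.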