Homework assignment 1

Problem 1.
1. Let $X_t = A \cos(\omega_0 t + \varphi)$, where $\varphi$ is a random variable distributed uniformly on $[0, 2\pi]$. Draw two different sample paths of the process $\{X_t\}$. Compute the autocovariance function. Is the process stationary?
2. Let $\xi \sim N(0, 1)$, and define $X_t = \xi$ for all $t$. Draw two different sample paths and compute the autocovariance function. Is the process stationary?

Problem 2. Let $X$ and $Y$ be two random variables with $E(Y) = \mu$ and $E(Y^2) < \infty$.
1. Show that the constant $c$ that minimizes $E(Y - c)^2$ is $c = \mu$.
2. Deduce that the random variable $f(X)$ that minimizes $E[(Y - f(X))^2 \mid X]$ is $f(X) = E[Y \mid X]$.
3. Deduce that the random variable $f(X)$ that minimizes $E[(Y - f(X))^2]$ is $f(X) = E[Y \mid X]$.

Problem 3 (generalization of Problem 2). Suppose $X_1, X_2, \ldots$ is a sequence of random variables with $E(X_t^2) < \infty$ and $E(X_t) = \mu$.
1. Show that the random variable $f(X_1, \ldots, X_n)$ that minimizes $E[(X_{n+1} - f(X_1, \ldots, X_n))^2 \mid X_1, \ldots, X_n]$ is $f(X_1, \ldots, X_n) = E[X_{n+1} \mid X_1, \ldots, X_n]$.
2. Deduce that the random variable that minimizes $E[(X_{n+1} - f(X_1, \ldots, X_n))^2]$ is also $f(X_1, \ldots, X_n) = E[X_{n+1} \mid X_1, \ldots, X_n]$.
3. If $X_1, \ldots, X_n$ are independent and identically distributed with $E(X_t^2) < \infty$ and $E(X_t) = \mu$, where $\mu$ is known, what is the minimum mean squared error predictor of $X_{n+1}$ in terms of $X_1, \ldots, X_n$?

Problem 4. Let $\{Z_t\}$ be a sequence of independent normal random variables, each with mean 0 and variance $\sigma^2$, and let $a$, $b$, $c$, and $\omega$ be constants. Which, if any, of the following processes are stationary? For each process, specify the mean and autocovariance function.
a. $X_t = a + bZ_{t-1} + cZ_{t-3}$
b. $X_t = Z_1 \cos(\omega t) + Z_2 \sin(\omega t)$
c. $X_t = Z_t \cos(\omega t) + Z_{t-1} \sin(\omega t)$
d. $X_t = a + bZ_0$
e. $X_t = Z_0 \cos(\omega t)$
f. $X_t = Z_t Z_{t-2}$

Problem 5. Let $\{Z_t\}$ be a sequence of independent, identically distributed $N(0, 1)$ random variables. Define
$$X_t = \begin{cases} Z_t, & t \text{ even}, \\ (Z_{t-1}^2 - 1)/\sqrt{2}, & t \text{ odd}. \end{cases}$$
a. Show that $\{X_t\}$ is white noise $WN(0, 1)$ but not IID noise.
b. Find $E[X_{n+1} \mid X_1, \ldots, X_n]$ for $n$ even and for $n$ odd, and compare the results.

Problem 6. Suppose that we want to predict a stationary series $\{X_t\}$ with zero mean and autocovariance and autocorrelation functions $\gamma_X(h)$ and $\rho_X(h)$ at some future time, say $n + k$.
a. If we predict using only $X_n$ and use a linear predictor $\hat{X}_{n+k} = aX_n$ with some scalar $a$, show that the mean squared error $\mathrm{MSE}(a) = E(X_{n+k} - aX_n)^2$ is minimized when $a = a^* = \rho_X(k)$.
b. Show that the minimal mean squared error is $\mathrm{MSE}(a^*) = \gamma_X(0)[1 - \rho_X^2(k)]$.
c. If $X_{n+k} = aX_n$, show that $\rho_X(k) = 1$ if $a > 0$ and $\rho_X(k) = -1$ if $a < 0$.

Problem 7. If $m_t = \sum_{k=0}^{p} c_k t^k$, $t = 0, \pm 1, \ldots$, show that $\Delta m_t$ is a polynomial of degree $p - 1$ in $t$, and hence that $\Delta^{p+1} m_t = 0$.

Problem 8. If $X_t = a + bt + s_t + Y_t$, where $s_t$ is a seasonal component with period 12 and $Y_t$ is a stationary process, show that $\Delta \Delta_{12} X_t = (1 - B)(1 - B^{12}) X_t$ is stationary, and express its autocovariance function in terms of that of $\{Y_t\}$.

Problem 9. Let $X_t = a + bt + Y_t$, where $\{Y_t, t = 0, \pm 1, \pm 2, \ldots\}$ is an independent and identically distributed sequence of random variables with mean 0 and variance $\sigma^2$, and $a$ and $b$ are constants. Define
$$W_t = \frac{1}{2q + 1} \sum_{j=-q}^{q} X_{t+j}.$$
Compute the mean and autocovariance function of $\{W_t\}$. Is $\{W_t\}$ stationary?
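The behaviour of the filter in Problem 9 can be previewed numerically. Below is a minimal R sketch; the values $a = 1$, $b = 0.05$, $\sigma = 1$, and $q = 2$ are illustrative assumptions, not part of the problem statement:

    # Two-sided moving average of X_t = a + b*t + Y_t with IID noise Y_t
    # (a, b, sigma, q are illustrative choices, not given in the problem)
    set.seed(1)
    n <- 200; a <- 1; b <- 0.05; sigma <- 1; q <- 2
    t <- 1:n
    Y <- rnorm(n, mean = 0, sd = sigma)
    X <- a + b * t + Y
    # W_t = (2q+1)^{-1} * sum_{j=-q}^{q} X_{t+j}: equal weights, centered window
    W <- stats::filter(X, rep(1, 2 * q + 1) / (2 * q + 1), sides = 2)
    plot(t, X, type = "l", col = "grey", ylab = "")
    lines(t, W, lwd = 2)  # W_t tracks the trend a + b*t, so E(W_t) depends on t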
Problem 10. Let $\{Z_t\}$ be Gaussian white noise with variance $\sigma^2 = 1$.
a. Using R, simulate and plot $n = 200$ observations from the following model:
$$Y_t = S_t + Z_t + 0.5 Z_{t-1}, \qquad t = 1, \ldots, 200,$$
where
$$S_t = \begin{cases} 0, & t = 1, \ldots, 100, \\ 10 \exp\{-(t - 100)/200\} \cos(2\pi t / 4), & t = 101, 102, \ldots, 200. \end{cases}$$
b. Compare the visual appearance of the series with that of the earthquake data displayed as an example in the first lecture slides. Is the sequence $\{Y_t\}$ stationary?
c. What is the autocovariance function of $\{Y_t\}$? Is it informative? Explain.
d. Compute and plot the sample autocovariance function. Your solution should contain all relevant graphs and the corresponding R commands that produced the simulated series and graphs (a starting sketch is given below).
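A possible starting point for parts a and d; the seed and object names are arbitrary choices, and acf() with type = "covariance" plots the sample autocovariance function:

    # Simulate Y_t = S_t + Z_t + 0.5*Z_{t-1} with Gaussian WN(0, 1)
    set.seed(2)
    n <- 200
    Z <- rnorm(n + 1)  # Z_0, Z_1, ..., Z_n, so that Z_{t-1} exists at t = 1
    S <- c(rep(0, 100),
           10 * exp(-((101:200) - 100) / 200) * cos(2 * pi * (101:200) / 4))
    Y <- S + Z[2:(n + 1)] + 0.5 * Z[1:n]
    plot.ts(Y, ylab = "Y")                      # part a: the simulated series
    acf(Y, lag.max = 40, type = "covariance")   # part d: sample autocovariance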