ST414 – Spectral Analysis of Time Series Data
Lecture 4, 13 February 2014

Last Time
• The periodogram
• The smoothed periodogram
• Asymptotic considerations

Today's Objectives
• Estimating the AR parameters
• The AR spectrum

AR(1)
X(t) = \phi X(t-1) + Z(t), where Z(t) is white noise (0, \sigma^2). Its spectral density is
f(\nu) = \frac{\sigma^2}{\phi^2 - 2\phi \cos(2\pi\nu) + 1}

Some Preliminaries
X(t) is an ARMA(p,q) process if X(t) is stationary and if, for every t,
X(t) - \sum_{j=1}^{p} \phi_j X(t-j) = Z(t) + \sum_{j=1}^{q} \theta_j Z(t-j),
where Z(t) is white noise (0, \sigma^2) and the polynomials (1 - \sum_{j=1}^{p} \phi_j z^j) and (1 + \sum_{j=1}^{q} \theta_j z^j) have no common factors.

Some Preliminaries
An ARMA(p,q) process X(t) is said to be causal if there exists a sequence of constants \{\psi_j\} with \sum_{j=0}^{\infty} |\psi_j| < \infty such that
X(t) = \sum_{j=0}^{\infty} \psi_j Z(t-j) for all t.

The Sample Mean
\bar{X} = T^{-1} \sum_{t=1}^{T} X(t)
\mathrm{Var}(\bar{X}) = T^{-1} \sum_{h=-T}^{T} \left(1 - \frac{|h|}{T}\right) \gamma(h)

The Sample Autocovariance
A "natural" estimator:
\hat{\gamma}(h) = T^{-1} \sum_{t=1}^{T-|h|} (X(t+h) - \bar{X})(X(t) - \bar{X}), \quad -T < h < T.
Dividing by T (instead of T - h) ensures that the sample covariance matrix \hat{\Gamma} = [\hat{\gamma}(i-j)]_{i,j} is nonnegative definite.

The Yule-Walker Equations
For a causal AR(p) model:
\hat{\Gamma}_p \hat{\phi}_p = \hat{\gamma}_p
\hat{\gamma}(0) - \hat{\phi}_p^{T} \hat{\gamma}_p = \hat{\sigma}^2,
where \hat{\Gamma}_p = [\hat{\gamma}(i-j)]_{i,j=1}^{p}, \hat{\phi}_p = (\hat{\phi}_1, \ldots, \hat{\phi}_p)^{T} and \hat{\gamma}_p = (\hat{\gamma}(1), \ldots, \hat{\gamma}(p))^{T}.

Conditional MLE
Consider a causal AR(1), X(t) = \phi X(t-1) + Z(t), with Z(t) iid N(0, \sigma^2), t = 1, \ldots, T.
Likelihood function:
L(\phi, \sigma^2) = f(X(1), \ldots, X(T) \mid \phi, \sigma^2) = f(X(1))\, f(X(2) \mid X(1)) \cdots f(X(T) \mid X(T-1))

Conditional MLE
Note that X(t) \mid X(t-1) \sim N(\phi X(t-1), \sigma^2) for t = 2, \ldots, T. Then the conditional (on X(1)) likelihood is
L(\phi, \sigma^2 \mid X(1)) = \prod_{t=2}^{T} f(X(t) \mid X(t-1)) = (2\pi\sigma^2)^{-(T-1)/2} \exp\left(-\frac{S(\phi)}{2\sigma^2}\right),
where
S(\phi) = \sum_{t=2}^{T} (X(t) - \phi X(t-1))^2

Conditional MLE
The conditional MLE approach reduces to a regression problem! This approach can be generalised to the AR(p) model.
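A minimal numerical sketch (not from the slides) of how the conditional MLE for the AR(1) reduces to least-squares regression of X(t) on X(t-1): maximising the conditional likelihood is the same as minimising S(\phi), and the conditional MLE of \sigma^2 is S(\hat{\phi})/(T-1). Variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a causal AR(1): X(t) = phi*X(t-1) + Z(t), Z(t) ~ N(0, sigma^2)
phi_true, sigma2_true, T = 0.6, 1.0, 5000
x = np.zeros(T)
z = rng.normal(0.0, np.sqrt(sigma2_true), T)
for t in range(1, T):
    x[t] = phi_true * x[t - 1] + z[t]

# Conditional MLE == least squares of X(t) on X(t-1), t = 2..T:
# phi_hat minimises S(phi) = sum_{t=2}^{T} (X(t) - phi*X(t-1))^2
y, xlag = x[1:], x[:-1]
phi_hat = (xlag @ y) / (xlag @ xlag)

# Conditional MLE of sigma^2 is S(phi_hat) / (T - 1)
S = np.sum((y - phi_hat * xlag) ** 2)
sigma2_hat = S / (T - 1)

print(phi_hat, sigma2_hat)
```

With T = 5000 both estimates land close to the true values; for AR(p) the same idea becomes a multiple regression of X(t) on its p lags.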
The AR Spectrum
The spectral density for an AR(p) is
f(\nu) = \frac{\sigma^2}{|\Phi(\exp(-i 2\pi\nu))|^2},
where
\Phi(z) = 1 - \sum_{j=1}^{p} \phi_j z^j

The AR Spectrum
Let g(\nu) be the spectrum of a weakly stationary process. Then for any \epsilon > 0, there exists a time series with representation
X(t) = \sum_{j=1}^{p} \phi_j X(t-j) + Z(t),
where Z(t) is white noise (0, \sigma^2), such that
|f_X(\nu) - g(\nu)| < \epsilon for all \nu \in [-0.5, 0.5].

The AR Spectrum
Problem: how large must the order p be for the approximation to be reasonable?

Model Selection
• PACF
• AIC: \mathrm{AIC} = \log \hat{\sigma}^2 + (T + 2K)/T, where K is the number of parameters
• BIC: \mathrm{BIC} = \log \hat{\sigma}^2 + (K \log T)/T

Example 1
[Figures omitted]

Example 2
[Figures omitted]

The AR Spectrum
\mathrm{Var}(\hat{f}(\nu)) \approx \frac{2p}{T} f^2(\nu)
As the order p increases:
• the bias decreases, i.e., more complex spectra can be modeled
• the variance increases linearly in p

The Whittle Likelihood
Parametric spectrum: f(\nu) = f(\nu; \theta).
Can we optimise the parameters in the frequency domain?

The Whittle Likelihood
Recall that for Gaussian white noise,
(\mathrm{Re}\, d(\nu_j), \mathrm{Im}\, d(\nu_j))^{T} \sim N(0, 0.5\, \mathrm{diag}(f(\nu_j), f(\nu_j))).
In general, d(\nu_j) \sim N_C(0, f(\nu_j)), approximately independently at distinct frequencies.

The Whittle Likelihood
The Whittle likelihood:
\log L(\theta) \approx -\frac{1}{2} \sum_{0 < \nu_j < 0.5} \left[ \log f(\nu_j; \theta) + \frac{|d(\nu_j)|^2}{f(\nu_j; \theta)} \right]

Comparisons
AR spectrum
• Good frequency resolution, even for low-order models
• Potential for model misspecification
Periodogram
• Frequency resolution is a function of the length of the time series
• Some form of smoothing is necessary for a stable estimate
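A minimal sketch (not from the slides) of Whittle estimation for an AR(1): compute the periodogram at the Fourier frequencies, plug the parametric AR(1) spectrum f(\nu; \phi) into the Whittle log-likelihood, and maximise over \phi. For simplicity \sigma^2 is held at its true value here and the maximisation is a grid search; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate AR(1) data: X(t) = phi*X(t-1) + Z(t), Z(t) ~ N(0, 1)
phi_true, sigma2, T = 0.5, 1.0, 4096
x = np.zeros(T)
z = rng.normal(0.0, 1.0, T)
for t in range(1, T):
    x[t] = phi_true * x[t - 1] + z[t]

# DFT scaled so that |d(nu_j)|^2 is the periodogram, with
# E|d(nu_j)|^2 ~ f(nu_j); keep Fourier frequencies 0 < nu_j < 0.5
d = np.fft.fft(x) / np.sqrt(T)
j = np.arange(1, T // 2)
nu = j / T
I = np.abs(d[j]) ** 2

def ar1_spectrum(nu, phi, sigma2):
    # f(nu) = sigma^2 / |1 - phi*exp(-i*2*pi*nu)|^2
    return sigma2 / (1.0 - 2.0 * phi * np.cos(2 * np.pi * nu) + phi ** 2)

def whittle_loglik(phi):
    # Whittle approximation: -(1/2) * sum [ log f + I/f ]
    f = ar1_spectrum(nu, phi, sigma2)
    return -0.5 * np.sum(np.log(f) + I / f)

# Maximise over a grid of candidate phi values
grid = np.linspace(-0.95, 0.95, 381)
phi_hat = grid[np.argmax([whittle_loglik(p) for p in grid])]
print(phi_hat)
```

For an AR model this frequency-domain fit is asymptotically equivalent to the time-domain MLE, but the same recipe works for any parametric spectrum f(\nu; \theta), which is the point of the Whittle approach.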