Chapter 9
Regression with Time Series Data:
Stationary Variables
Walter R. Paczkowski
Rutgers University
Principles of Econometrics, 4th Edition
Chapter Contents








9.1 Introduction
9.2 Finite Distributed Lags
9.3 Serial Correlation
9.4 Other Tests for Serially Correlated Errors
9.5 Estimation with Serially Correlated Errors
9.6 Autoregressive Distributed Lag Models
9.7 Forecasting
9.8 Multiplier Analysis
9.1
Introduction
When modeling relationships between variables,
the nature of the data that have been collected has
an important bearing on the appropriate choice of
an econometric model
– Two features of time-series data to consider:
1. Time-series observations on a given
economic unit, observed over a number of
time periods, are likely to be correlated
2. Time-series data have a natural ordering
according to time
There is also the possible existence of dynamic
relationships between variables
– A dynamic relationship is one in which the
change in a variable now has an impact on that
same variable, or other variables, in one or
more future time periods
– These effects do not occur instantaneously but
are spread, or distributed, over future time
periods
FIGURE 9.1 The distributed lag effect
9.1.1 Dynamic Nature of Relationships
Ways to model the dynamic relationship:
1. Specify that a dependent variable y is a
function of current and past values of an
explanatory variable x
yt  f ( xt , xt 1 , xt 2 ,...)
Eq. 9.1
• Because of the existence of these lagged
effects, Eq. 9.1 is called a distributed lag
model
Ways to model the dynamic relationship (continued):
2. Capture the dynamic characteristics of a time series by specifying a model with a lagged dependent variable as one of the explanatory variables
Eq. 9.2:  y_t = f(y_{t-1}, x_t)
• Or have:
Eq. 9.3:  y_t = f(y_{t-1}, x_t, x_{t-1}, x_{t-2})
– Such models are called autoregressive
distributed lag (ARDL) models, with
‘‘autoregressive’’ meaning a regression of yt
on its own lag or lags
Ways to model the dynamic relationship (Continued):
3. Model the continuing impact of change over
several periods via the error term
yt  f ( xt )  et
Eq. 9.4
et  f (et 1 )
• In this case et is correlated with et - 1
• We say the errors are serially correlated or
autocorrelated
9.1.2 Least Squares Assumptions
The primary assumption is Assumption MR4:
cov  yi , y j   cov  ei , e j   0 for i  j
• For time series, this is written as:
cov  yt , ys   cov  et , es   0 for t  s
– The dynamic models in Eqs. 9.2, 9.3 and 9.4
imply correlation between yt and yt - 1 or et and
et - 1 or both, so they clearly violate assumption
MR4
9.1.2a Stationarity
A stationary variable is one that is not explosive, not trending, and not wandering aimlessly without returning to its mean
FIGURE 9.2 (a) Time series of a stationary variable
FIGURE 9.2 (b) Time series of a nonstationary variable that is "slow-turning" or "wandering"
FIGURE 9.2 (c) Time series of a nonstationary variable that "trends"
9.1.3 Alternative Paths Through the Chapter
FIGURE 9.3 (a) Alternative paths through the chapter starting with finite distributed lags
FIGURE 9.3 (b) Alternative paths through the chapter starting with serial correlation
9.2
Finite Distributed Lags
Consider a linear model in which, after q time
periods, changes in x no longer have an impact on
y
Eq. 9.5:  y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + … + β_q x_{t-q} + e_t
– Note the notation change: β_s is used to denote the coefficient of x_{t-s} and α is introduced to denote the intercept
Model 9.5 has two uses:
– Forecasting
Eq. 9.6:  y_{T+1} = α + β_0 x_{T+1} + β_1 x_T + β_2 x_{T-1} + … + β_q x_{T-q+1} + e_{T+1}
– Policy analysis
• What is the effect of a change in x on y?
Eq. 9.7:  ∂E(y_t)/∂x_{t-s} = ∂E(y_{t+s})/∂x_t = β_s
Assume x_t is increased by one unit and then maintained at its new level in subsequent periods
– The immediate impact will be β_0
– The total effect in period t+1 will be β_0 + β_1, in period t+2 it will be β_0 + β_1 + β_2, and so on
• These quantities are called interim multipliers
– The total multiplier is the final effect on y of the sustained increase after q or more periods have elapsed: Σ_{s=0}^{q} β_s
The effect of a one-unit change in xt is distributed
over the current and next q periods, from which
we get the term ‘‘distributed lag model’’
– It is called a finite distributed lag model of
order q
• It is assumed that after a finite number of
periods q, changes in x no longer have an
impact on y
– The coefficient βs is called a distributed-lag
weight or an s-period delay multiplier
– The coefficient β0 (s = 0) is called the impact
multiplier
9.2.1 Assumptions
ASSUMPTIONS OF THE DISTRIBUTED LAG MODEL
TSMR1. y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + … + β_q x_{t-q} + e_t,  t = q+1, …, T
TSMR2. y and x are stationary random variables, and e_t is independent of current, past and future values of x.
TSMR3. E(e_t) = 0
TSMR4. var(e_t) = σ²
TSMR5. cov(e_t, e_s) = 0 for t ≠ s
TSMR6. e_t ~ N(0, σ²)
9.2.2 An Example: Okun's Law
Consider Okun’s Law
– In this model the change in the unemployment
rate from one period to the next depends on the
rate of growth of output in the economy:
Ut Ut 1   Gt  GN 
Eq. 9.8
– We can rewrite this as:
DUt    β0Gt  et
Eq. 9.9
where DU = ΔU = Ut - Ut-1, β0 = -γ, and
α = γGN
We can expand this to include lags:
Eq. 9.10:  DU_t = α + β_0 G_t + β_1 G_{t-1} + β_2 G_{t-2} + … + β_q G_{t-q} + e_t
We can calculate the growth in output, G, as:
Eq. 9.11:  G_t = 100 × (GDP_t − GDP_{t-1}) / GDP_{t-1}
FIGURE 9.4 (a) Time series for the change in the U.S. unemployment rate: 1985Q3 to 2009Q3
FIGURE 9.4 (b) Time series for U.S. GDP growth: 1985Q2 to 2009Q3
Table 9.1 Spreadsheet of Observations for Distributed Lag Model
Table 9.2 Estimates for Okun's Law Finite Distributed Lag Model
9.3
Serial Correlation
When is assumption TSMR5, cov(et, es) = 0 for
t ≠ s likely to be violated, and how do we assess
its validity?
– When a variable exhibits correlation over time,
we say it is autocorrelated or serially
correlated
• These terms are used interchangeably
9.3.1 Serial Correlation in Output Growth
FIGURE 9.5 Scatter diagram for G_t and G_{t-1}
9.3.1a Computing Autocorrelation
Recall that the population correlation between two
variables x and y is given by:
ρ_xy = cov(x, y) / √[var(x) var(y)]
For the Okun's Law problem, we have:
Eq. 9.12:  ρ_1 = cov(G_t, G_{t-1}) / √[var(G_t) var(G_{t-1})] = cov(G_t, G_{t-1}) / var(G_t)
The notation ρ1 is used to denote the population
correlation between observations that are one period
apart in time
– This is known also as the population
autocorrelation of order one.
– The second equality in Eq. 9.12 holds because
var(Gt) = var(Gt-1) , a property of time series that
are stationary
The first-order sample autocorrelation for G is
obtained from Eq. 9.12 using the estimates:
cov(G_t, G_{t-1}) is estimated by (1/(T−1)) Σ_{t=2}^{T} (G_t − Ḡ)(G_{t-1} − Ḡ)
var(G_t) is estimated by (1/(T−1)) Σ_{t=1}^{T} (G_t − Ḡ)²
Making the substitutions, we get:
Eq. 9.13:  r_1 = Σ_{t=2}^{T} (G_t − Ḡ)(G_{t-1} − Ḡ) / Σ_{t=1}^{T} (G_t − Ḡ)²
More generally, the k-th order sample autocorrelation for a series y that gives the correlation between observations that are k periods apart is:
Eq. 9.14:  r_k = Σ_{t=k+1}^{T} (y_t − ȳ)(y_{t-k} − ȳ) / Σ_{t=1}^{T} (y_t − ȳ)²
Because (T - k) observations are used to compute
the numerator and T observations are used to
compute the denominator, an alternative that leads
to larger estimates in finite samples is:
Eq. 9.15:  r_k = [(1/(T−k)) Σ_{t=k+1}^{T} (y_t − ȳ)(y_{t-k} − ȳ)] / [(1/T) Σ_{t=1}^{T} (y_t − ȳ)²]
Applying this to our problem, we get for the first
four autocorrelations:
Eq. 9.16:  r_1 = 0.494,  r_2 = 0.414,  r_3 = 0.154,  r_4 = 0.200
How do we test whether an autocorrelation is
significantly different from zero?
– The null hypothesis is H0: ρk = 0
– A suitable test statistic is:
Eq. 9.17:  Z = (r_k − 0)/√(1/T) = √T · r_k ~ N(0, 1) approximately
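A small sketch of the Z test of Eq. 9.17, reusing `sample_autocorr` from the earlier sketch; the two-sided p-value is an illustrative choice:

```python
import numpy as np
from scipy import stats

def autocorr_z_test(y, k: int):
    """Test H0: rho_k = 0 using Z = sqrt(T) * r_k, approximately N(0,1) under H0 (Eq. 9.17)."""
    T = len(y)
    rk = sample_autocorr(y, k)             # from the earlier sketch
    z = np.sqrt(T) * rk
    p = 2 * (1 - stats.norm.cdf(abs(z)))   # two-sided p-value
    return rk, z, p
```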
For our problem, we have:
Z1  98  0.494  4.89, Z 2  98  0.414  4.10
Z 3  98  0.154  1.52, Z 4  98  0.200  1.98
– We reject the hypotheses H0: ρ1 = 0 and
H0: ρ2 = 0
– We have insufficient evidence to reject
H0: ρ3 = 0
– ρ4 is on the borderline of being significant.
– We conclude that G, the quarterly growth rate
in U.S. GDP, exhibits significant serial
correlation at lags one and two
9.3.1b The Correlogram
The correlogram, also called the sample
autocorrelation function, is the sequence of
autocorrelations r1, r2, r3, …
– It shows the correlation between observations
that are one period apart, two periods apart,
three periods apart, and so on
FIGURE 9.6 Correlogram for G
9.3.2 Serially Correlated Errors
The correlogram can also be used to check
whether the multiple regression assumption
cov(et, es) = 0 for t ≠ s is violated
9.3.2a A Phillips Curve
Consider a model for a Phillips Curve:
INFt  INFt E  γ Ut Ut 1 
Eq. 9.18
– If we initially assume that inflationary
expectations are constant over time (β1 = INFEt)
set β2= -γ, and add an error term:
Eq. 9.19
Principles of Econometrics, 4th Edition
INFt  β1  β2 DUt  et
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 44
FIGURE 9.7 (a) Time series for Australian price inflation
FIGURE 9.7 (b) Time series for the quarterly change in the Australian unemployment rate
To determine if the errors are serially correlated,
we compute the least squares residuals:
Eq. 9.20:  ê_t = INF_t − b_1 − b_2 DU_t
FIGURE 9.8 Correlogram for residuals from least-squares estimated Phillips curve
The k-th order autocorrelation for the residuals can
be written as:
Eq. 9.21:  r_k = Σ_{t=k+1}^{T} ê_t ê_{t-k} / Σ_{t=1}^{T} ê_t²
– The least squares equation is:
Eq. 9.22:  INF = 0.7776 − 0.5279 DU
           (se)   (0.0658)  (0.2294)
The values at the first five lags are:
r1  0.549 r2  0.456 r3  0.433 r4  0.420 r5  0.339
9.4
Other Tests for Serially Correlated
Errors
9.4.1 A Lagrange Multiplier Test
An advantage of this test is that it readily
generalizes to a joint test of correlations at more
than one lag
If et and et-1 are correlated, then one way to model
the relationship between them is to write:
et  ρet 1  vt
Eq. 9.23
– We can substitute this into a simple regression
equation:
Eq. 9.24
Principles of Econometrics, 4th Edition
yt  β1  β2 xt  ρet 1  vt
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 53
We have one complication: ê_0 is unknown
– Two ways to handle this are:
1. Delete the first observation and use a total of T − 1 observations
2. Set ê_0 = 0 and use all T observations
For the Phillips Curve:
(i)  t = 6.219,  F = 38.67,  p-value = 0.000
(ii) t = 6.202,  F = 38.47,  p-value = 0.000
– The results are almost identical
– The null hypothesis H0: ρ = 0 is rejected at all
conventional significance levels
– We conclude that the errors are serially
correlated
To derive the relevant auxiliary regression for the
autocorrelation LM test, we write the test equation
as:
yt  β1  β2 xt  ρeˆt 1  vt
Eq. 9.25
– But since we know that yt  b1  b2 xt  eˆt , we
get:
b1  b2 xt  eˆt  β1  β2 xt  ρeˆt 1  vt
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 56
9.4
Other Tests for
Serially Correlated
Errors
9.4.1
A Lagrange
Multiplier Test
Rearranging, we get:
eˆt  β1  b1   β 2  b2  xt  ρeˆt 1  vt
Eq. 9.26
 γ1  γ 2 xt  ρeˆt 1  v
– If H0: ρ = 0 is true, then LM = T x R2 has an
approximate χ2(1) distribution
• T and R2 are the sample size and goodnessof-fit statistic, respectively, from least
squares estimation of Eq. 9.26
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 57
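A sketch of the T × R² (Breusch–Godfrey) form of the LM test applied to the Phillips curve fit from the earlier sketch; the call shown is the standard statsmodels diagnostic, and the lag choices mirror the examples in this section:

```python
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

lm, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(phillips_ols, nlags=1)
print(f"LM = {lm:.2f}, p-value = {lm_pval:.3f}")   # compare LM with the chi-square(1) critical value 3.84

# Joint test of correlation at the first four lags (Section 9.4.1a):
lm4, lm4_pval, _, _ = acorr_breusch_godfrey(phillips_ols, nlags=4)
```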
Considering the two alternative ways to handle ê_0:
(iii) LM = (T − 1) × R² = 89 × 0.3102 = 27.61
(iv)  LM = T × R² = 90 × 0.3066 = 27.59
– These values are much larger than 3.84, which is the 5% critical value from a χ²(1)-distribution
• We reject the null hypothesis of no autocorrelation
– Alternatively, we can reject H0 by examining the p-value for LM = 27.61, which is 0.000
9.4.1a Testing Correlation at Longer Lags
For a four-period lag, we obtain:
(iii) LM = (T − 4) × R² = 86 × 0.3882 = 33.4
(iv)  LM = T × R² = 90 × 0.4075 = 36.7
– Because the 5% critical value from a χ²(4)-distribution is 9.49, these LM values lead us to conclude that the errors are serially correlated
9.4.2 The Durbin-Watson Test
This is used less frequently today because its
critical values are not available in all software
packages, and one has to examine upper and lower
critical bounds instead
– Also, unlike the LM and correlogram tests, its
distribution no longer holds when the equation
contains a lagged dependent variable
9.5
Estimation with Serially Correlated
Errors
Three estimation procedures are considered:
1. Least squares estimation
2. An estimation procedure that is relevant when
the errors are assumed to follow what is
known as a first-order autoregressive model
et  ρet 1  vt
3. A general estimation strategy for estimating
models with serially correlated errors
We will encounter models with a lagged
dependent variable, such as:
yt  δ  θ1 yt 1  δ0 xt  δ1 xt 1  vt
ASSUMPTION FOR MODELS WITH A LAGGED DEPENDENT VARIABLE
TSMR2A. In the multiple regression model y_t = β_1 + β_2 x_{t2} + … + β_K x_{tK} + v_t, where some of the x_{tk} may be lagged values of y, v_t is uncorrelated with all x_{tk} and their past values.
9.5.1 Least Squares Estimation
Suppose we proceed with least squares estimation
without recognizing the existence of serially
correlated errors. What are the consequences?
1. The least squares estimator is still a linear
unbiased estimator, but it is no longer best
2. The formulas for the standard errors usually
computed for the least squares estimator are
no longer correct
• Confidence intervals and hypothesis tests
that use these standard errors may be
misleading
It is possible to compute correct standard errors
for the least squares estimator:
– HAC (heteroskedasticity and autocorrelation
consistent) standard errors, or Newey-West
standard errors
• These are analogous to the heteroskedasticity
consistent standard errors
Consider the model yt = β1 + β2xt + et
– The variance of b2 is:
var  b2    wt2 var  et    wt ws cov  et , es 
t
t s
  wt ws cov  et , es  

ts
  wt2 var  et  1 
2


w
t

t var  et 


t
Eq. 9.27
where
wt   xt  x 
Principles of Econometrics, 4th Edition
 x  x
t
Chapter 9: Regression with Time Series Data:
Stationary Variables
2
t
Page 67
When the errors are not correlated, cov(et, es) = 0,
and the term in square brackets is equal to one.
– The resulting expression
var  b2   t wt2 var  et 
is the one used to find heteroskedasticityconsistent (HC) standard errors
– When the errors are correlated, the term in
square brackets is estimated to obtain HAC
standard errors
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 68
If we call the quantity in square brackets g and its estimate ĝ, then the relationship between the two estimated variances is:
Eq. 9.28:  var_HAC(b_2) = var_HC(b_2) × ĝ
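A sketch of least squares with HAC (Newey–West) standard errors for the Phillips curve, reusing `data` from the earlier sketch; the truncation choice `maxlags=4` is an assumption, not the textbook's setting:

```python
import statsmodels.api as sm

phillips_hac = sm.OLS(data["INF"], sm.add_constant(data["DU"])).fit(
    cov_type="HAC", cov_kwds={"maxlags": 4}
)
print(phillips_hac.bse)       # HAC (Newey-West) standard errors
print(phillips_hac.pvalues)   # p-values based on the HAC covariance matrix
```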
Let’s reconsider the Phillips Curve model:
INF  0.7776  0.5279 DU
Eq. 9.29
Principles of Econometrics, 4th Edition
 0.0658  0.2294
 0.1030  0.3127 
Chapter 9: Regression with Time Series Data:
Stationary Variables
 incorrect se 
 HAC se 
Page 70
The t and p-values for testing H0: β2 = 0 are:
t  0.5279 0.2294  2.301
p  0.0238
t  0.5279 0.3127  1.688
p  0.0950
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
 from LS standard errors 
 from HAC standard errors 
Page 71
9.5.2 Estimating an AR(1) Error Model
Return to the Lagrange multiplier test for serially
correlated errors where we used the equation:
et  ρet 1  vt
Eq. 9.30
– Assume the vt are uncorrelated random errors
with zero mean and constant variances:
Eq. 9.31
E  vt   0
Principles of Econometrics, 4th Edition
var  vt    v2
cov  vt , vs   0 for t  s
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 72
Eq. 9.30 describes a first-order autoregressive
model or a first-order autoregressive process for
et
– The term AR(1) model is used as an
abbreviation for first-order autoregressive
model
– It is called an autoregressive model because it can be viewed as a regression model
– It is called first-order because the right-hand-side variable is e_t lagged one period
9.5.2a Properties of an AR(1) Error
We assume that:
1  ρ  1
Eq. 9.32
The mean and variance of et are:
Eq. 9.33
E  et   0
var  et    
2
e

2
v
1  ρ2
The covariance term is:
ρ
cov  et , et k  
, k 0
1 ρ
k
Eq. 9.34
Principles of Econometrics, 4th Edition
2
v
2
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 74
The correlation implied by the covariance is:
ρ k  corr  et , et  k 
cov  et , et  k 

Eq. 9.35

var  et  var  et  k 
cov  et , et  k 
var  et 
ρ
k


2
v
2
v
1  ρ 
1  ρ 
2
2
 ρk
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 75
Setting k = 1:
ρ1  corr  et , et 1   ρ
Eq. 9.36
– ρ represents the correlation between two errors that
are one period apart
• It is the first-order autocorrelation for e,
sometimes simply called the autocorrelation
coefficient
• It is the population autocorrelation at lag one for
a time series that can be described by an AR(1)
model
• r1 is an estimate for ρ when we assume a series
is AR(1)
Each et depends on all past values of the errors vt:
et  vt  ρvt 1  ρ vt 2  ρ vt 3 
2
Eq. 9.37
3
– For the Phillips Curve, we find for the first five
lags:
r1  0.549 r2  0.456 r3  0.433 r4  0.420 r5  0.339
– For an AR(1) model, we have:
ρˆ1  ρˆ  r1  0.549
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 77
For longer lags, we have:
ρˆ 2  ρˆ   0.549   0.301
2
2
ρˆ 3  ρˆ 3   0.549   0.165
3
ρˆ 4  ρˆ 4   0.549   0.091
4
ρˆ 5  ρˆ   0.549   0.050
5
Principles of Econometrics, 4th Edition
5
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 78
9.5.2b Nonlinear Least Squares Estimation
Our model with an AR(1) error is:
yt  β1  β2 xt  et with et  ρet 1  vt
Eq. 9.38
with -1 < ρ < 1
– For the vt, we have:
Eq. 9.39
E  vt   0 var  vt    v2 cov  vt , vt 1   0 for t  s
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 79
With the appropriate substitutions, we get:
yt  β1  β2 xt  ρet 1  vt
Eq. 9.40
– For the previous period, the error is:
et 1  yt 1  β1  β2 xt 1
Eq. 9.41
– Multiplying by ρ:
Eq. 9.42
Principles of Econometrics, 4th Edition
ρet 1  et yt 1  ρβ1  ρβ2 xt 1
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 80
Substituting, we get:
Eq. 9.43:  y_t = β_1(1 − ρ) + β_2 x_t + ρy_{t-1} − ρβ_2 x_{t-1} + v_t
The coefficient of xt-1 equals -ρβ2
– Although Eq. 9.43 is a linear function of the
variables xt , yt-1 and xt-1, it is not a linear
function of the parameters (β1, β2, ρ)
– The usual linear least squares formulas cannot be obtained by using calculus to find the values of (β_1, β_2, ρ) that minimize S_v = Σ v_t²
• The resulting estimates are nonlinear least squares estimates
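A minimal sketch of nonlinear least squares for Eq. 9.43, minimizing S_v numerically; the starting values and the use of `scipy.optimize.least_squares` are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

def ar1_nls_residuals(params, y, x):
    b1, b2, rho = params
    # v_t = y_t - b1*(1 - rho) - b2*x_t - rho*y_{t-1} + rho*b2*x_{t-1}, t = 2, ..., T (Eq. 9.43)
    return y[1:] - b1 * (1 - rho) - b2 * x[1:] - rho * y[:-1] + rho * b2 * x[:-1]

y, x = data["INF"].to_numpy(), data["DU"].to_numpy()          # series assumed earlier
fit = least_squares(ar1_nls_residuals, x0=np.array([0.7, -0.5, 0.5]), args=(y, x))
b1_hat, b2_hat, rho_hat = fit.x
```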
Our Phillips Curve model assuming AR(1) errors is:
Eq. 9.44:  INF_t = β_1(1 − ρ) + β_2 DU_t + ρINF_{t-1} − ρβ_2 DU_{t-1} + v_t
– Applying nonlinear least squares and presenting the estimates in terms of the original untransformed model, we have:
Eq. 9.45:  INF = 0.7609 − 0.6944 DU        e_t = 0.557 e_{t-1} + v_t
           (se)   (0.1245)  (0.2479)             (0.090)
9.5.2c Generalized Least Squares Estimation
Nonlinear least squares estimation of Eq. 9.43 is
equivalent to using an iterative generalized least
squares estimator called the Cochrane-Orcutt
procedure
9.5.3 Estimating a More General Model
We have the model:
yt  β1 1  ρ  β2 xt  ρyt 1  ρβ2 xt 1  vt
Eq. 9.46
– Suppose now that we consider the model:
yt  δ  θ1 yt 1  δ0 xt  δ1xt 1  vt
Eq. 9.47
• This new notation will be convenient when
we discuss a general class of autoregressive
distributed lag (ARDL) models
–Eq. 9.47 is a member of this class
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 85
9.5
Estimation with
Serially Correlated
Errors
9.5.3
Estimating a More
General Model
Note that Eq. 9.46 is the same as Eq. 9.47 since:
Eq. 9.48:  δ = β_1(1 − ρ),  δ_0 = β_2,  δ_1 = −ρβ_2,  θ_1 = ρ
– Eq. 9.46 is a restricted version of Eq. 9.47 with the restriction δ_1 = −θ_1δ_0 imposed
Applying the least squares estimator to Eq. 9.47
using the data for the Phillips curve example
yields:
Eq. 9.49:  INF_t = 0.3336 + 0.5593 INF_{t-1} − 0.6882 DU_t + 0.3200 DU_{t-1}
           (se)     (0.0899)  (0.0908)          (0.2575)       (0.2499)
The equivalent AR(1) estimates are:
δˆ  βˆ 1 1  ρˆ   0.7609  1  0.5574   0.3368
θˆ 1  ρˆ  0.5574
δˆ  βˆ  0.6944
0
2
ˆ ˆ 2  0.5574   0.6944   0.3871
δˆ 1  ρβ
– These are similar to our other estimates
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 88
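A sketch of the unrestricted model of Eq. 9.47 estimated by least squares, reusing the `data` frame assumed earlier:

```python
import pandas as pd
import statsmodels.api as sm

ardl_df = pd.DataFrame({
    "INF": data["INF"],
    "INF_lag1": data["INF"].shift(1),
    "DU": data["DU"],
    "DU_lag1": data["DU"].shift(1),
}).dropna()
ardl11 = sm.OLS(ardl_df["INF"],
                sm.add_constant(ardl_df[["INF_lag1", "DU", "DU_lag1"]])).fit()
print(ardl11.params)   # delta, theta_1, delta_0, delta_1 of Eq. 9.47
```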
The original economic model for the Phillips
Curve was:
INFt  INFt E  γ Ut Ut 1 
Eq. 9.50
– Re-estimation of the model after omitting DUt-1
yields:
Eq. 9.51
INF t  0.3548  0.5282 INFt 1  0.4909 DU t
 se   0.0876   0.0851
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
 0.1921
Page 89
In this model inflationary expectations are given
by:
INFt E  0.3548  0.5282INFt 1
– A 1% rise in the unemployment rate leads to an
approximate 0.5% fall in the inflation rate
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 90
9.5.4 Summary of Section 9.5 and Looking Ahead
We have described three ways of overcoming the
effect of serially correlated errors:
1. Estimate the model using least squares with
HAC standard errors
2. Use nonlinear least squares to estimate the
model with a lagged x, a lagged y, and the
restriction implied by an AR(1) error
specification
3. Use least squares to estimate the model with a
lagged x and a lagged y, but without the
restriction implied by an AR(1) error
specification
9.6
Autoregressive Distributed Lag
Models
An autoregressive distributed lag (ARDL) model
is one that contains both lagged xt’s and lagged yt’s
Eq. 9.52:  y_t = δ + δ_0 x_t + δ_1 x_{t-1} + … + δ_q x_{t-q} + θ_1 y_{t-1} + … + θ_p y_{t-p} + v_t
– Two examples:
ARDL(1,1):  INF_t = 0.3336 + 0.5593 INF_{t-1} − 0.6882 DU_t + 0.3200 DU_{t-1}
ARDL(1,0):  INF_t = 0.3548 + 0.5282 INF_{t-1} − 0.4909 DU_t
An ARDL model can be transformed into one with
only lagged x’s which go back into the infinite
past:
yt    0 xt  β1 xt 1  β 2 xt  2  β3 xt 3 
Eq. 9.53
 et

    β s xt  s  et
s 0
– This model is called an infinite distributed
lag model
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 94
Four possible criteria for choosing p and q:
1. Has serial correlation in the errors been
eliminated?
2. Are the signs and magnitudes of the estimates
consistent with our expectations from
economic theory?
3. Are the estimates significantly different from
zero, particularly those at the longest lags?
4. What values for p and q minimize information
criteria such as the AIC and SC?
The Akaike information criterion (AIC) is:
 SSE  2 K
AIC  ln 

 T  T
Eq. 9.54
where K = p + q + 2
The Schwarz criterion (SC), also known as the
Bayes information criterion (BIC), is:
 SSE  K ln T 
SC  ln 

T
 T 
Eq. 9.55
– Because Kln(T)/T > 2K/T for T ≥ 8, the SC
penalizes additional lags more heavily than
does the AIC
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 96
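A sketch computing the AIC and SC exactly as defined in Eqs. 9.54–9.55 from a fitted least squares result (these definitions differ by a constant from the likelihood-based values some packages report):

```python
import numpy as np

def aic_sc(fit_result):
    """AIC and SC as in Eqs. 9.54-9.55, computed from an OLS fit."""
    T = int(fit_result.nobs)
    K = len(fit_result.params)       # number of estimated coefficients
    sse = float(fit_result.ssr)      # sum of squared residuals
    return np.log(sse / T) + 2 * K / T, np.log(sse / T) + K * np.log(T) / T

# e.g. aic, sc = aic_sc(ardl11)
```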
9.6.1 The Phillips Curve
Consider the previously estimated ARDL(1,0)
model:
Eq. 9.56:  INF_t = 0.3548 + 0.5282 INF_{t-1} − 0.4909 DU_t,  obs = 90
           (se)     (0.0876)  (0.0851)          (0.1921)
FIGURE 9.9 Correlogram for residuals from Phillips curve ARDL(1,0) model
Table 9.3 p-values for LM Test for Autocorrelation
For an ARDL(4,0) version of the model:
INF t  0.1001  0.2354 INFt 1  0.1213INFt 2  0.1677 INFt 3
Eq. 9.57
 se   0.0983  0.1016 
 0.1038
 0.1050
 0.2819INFt -4  0.7902DU t
 0.1014
Principles of Econometrics, 4th Edition
 0.1885
Chapter 9: Regression with Time Series Data:
Stationary Variables
obs  87
Page 100
Inflationary expectations are given by:
INFt E  0.1001  0.2354INFt 1  0.1213INFt 2  0.1677INFt 3  0.2819INFt -4
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 101
Table 9.4 AIC and SC Values for Phillips Curve ARDL Models
9.6.2 Okun's Law
Recall the model for Okun’s Law:
Eq. 9.58:  DU_t = 0.5836 − 0.2020 G_t − 0.1653 G_{t-1} − 0.0700 G_{t-2},  obs = 96
           (se)    (0.0472)  (0.0324)    (0.0335)        (0.0331)
FIGURE 9.10 Correlogram for residuals from Okun's law ARDL(0,2) model
Table 9.5 AIC and SC Values for Okun's Law ARDL Models
Now consider this version:
Eq. 9.59:  DU_t = 0.3780 + 0.3501 DU_{t-1} − 0.1841 G_t − 0.0992 G_{t-1},  obs = 96
           (se)    (0.0578)  (0.0846)         (0.0307)    (0.0368)
9.6.3 Autoregressive Models
An autoregressive model of order p, denoted
AR(p), is given by:
Eq. 9.60:  y_t = δ + θ_1 y_{t-1} + θ_2 y_{t-2} + … + θ_p y_{t-p} + v_t
Consider a model for growth in real GDP:
Eq. 9.61:  G_t = 0.4657 + 0.3770 G_{t-1} + 0.2462 G_{t-2},  obs = 96
           (se)   (0.1433)  (0.1000)        (0.1029)
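A sketch of the AR(2) model of Eq. 9.61 estimated by least squares on lagged values, assuming a pandas Series `G` of quarterly GDP growth (as constructed in the earlier Okun's law sketch):

```python
import pandas as pd
import statsmodels.api as sm

ar_df = pd.DataFrame({"G": G, "G_lag1": G.shift(1), "G_lag2": G.shift(2)}).dropna()
ar2 = sm.OLS(ar_df["G"], sm.add_constant(ar_df[["G_lag1", "G_lag2"]])).fit()
delta_hat, theta1_hat, theta2_hat = ar2.params
```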
FIGURE 9.11 Correlogram for residuals from AR(2) model for GDP growth
Table 9.6 AIC and SC Values for AR Model of Growth in U.S. GDP
9.7
Forecasting
We consider forecasting using three different
models:
1. AR model
2. ARDL model
3. Exponential smoothing model
9.7.1 Forecasting with an AR Model
Consider an AR(2) model for real GDP growth:
Gt  δ  θ1Gt 1  θ2Gt 2  vt
Eq. 9.62
The model to forecast GT+1 is:
GT 1  δ  θ1GT  θ2GT 1  vT 1
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 113
The growth values for the two most recent
quarters are:
G_T = G_{2009Q3} = 0.8
G_{T-1} = G_{2009Q2} = −0.2
The forecast for G_{2009Q4} is:
Eq. 9.63:  Ĝ_{T+1} = δ̂ + θ̂_1 G_T + θ̂_2 G_{T-1}
                   = 0.46573 + 0.37700 × 0.8 + 0.24624 × (−0.2)
                   = 0.7181
For two quarters ahead, the forecast for G2010Q1 is:
GˆT  2  δˆ  θˆ1GT 1  θˆ 2GT
 0.46573  0.37700  0.71808  0.24624  0.8
Eq. 9.64
 0.9334
For three periods out, it is:
GˆT 3  δˆ  θˆ1GT  2  θˆ 2GT 1
Eq. 9.65
 0.46573  0.37700  0.93343  0.24624  0.71808
 0.9945
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 115
Summarizing our forecasts:
– Real GDP growth rates for 2009Q4, 2010Q1,
and 2010Q2 are approximately 0.72%, 0.93%,
and 0.99%, respectively
A 95% interval forecast for j periods into the
future is given by:
GˆT  j  t 0.975,df  σˆ j
where σˆ j is the standard error of the forecast
error and df is the number of degrees of freedom
in the estimation of the AR model
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 117
The first forecast error, occurring at time T+1, is:
u_1 = G_{T+1} − Ĝ_{T+1} = (δ − δ̂) + (θ_1 − θ̂_1)G_T + (θ_2 − θ̂_2)G_{T-1} + v_{T+1}
Ignoring the error from estimating the coefficients, we get:
Eq. 9.66:  u_1 = v_{T+1}
The forecast error for two periods ahead is:
Eq. 9.67:  u_2 = θ_1(G_{T+1} − Ĝ_{T+1}) + v_{T+2} = θ_1 u_1 + v_{T+2} = θ_1 v_{T+1} + v_{T+2}
The forecast error for three periods ahead is:
Eq. 9.68:  u_3 = θ_1 u_2 + θ_2 u_1 + v_{T+3} = (θ_1² + θ_2)v_{T+1} + θ_1 v_{T+2} + v_{T+3}
Because the v_t's are uncorrelated with constant variance σ_v², we can show that:
σ_1² = var(u_1) = σ_v²
σ_2² = var(u_2) = σ_v²(1 + θ_1²)
σ_3² = var(u_3) = σ_v²[(θ_1² + θ_2)² + θ_1² + 1]
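A sketch of the forecast standard errors and 95% interval forecasts implied by the formulas above, continuing from the earlier AR(2) and forecasting sketches (σ_v is estimated by the residual standard error of the AR(2) fit):

```python
import numpy as np
from scipy import stats

sigma_v = np.sqrt(ar2.mse_resid)                       # estimate of sigma_v from the AR(2) fit
t1, t2 = theta1_hat, theta2_hat
sigma_j = sigma_v * np.sqrt(np.array([1.0,
                                      1.0 + t1**2,
                                      (t1**2 + t2)**2 + t1**2 + 1.0]))
t_crit = stats.t.ppf(0.975, df=ar2.df_resid)
point = np.array(ar2_forecasts(0.8, -0.2))
lower, upper = point - t_crit * sigma_j, point + t_crit * sigma_j
```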
Table 9.7 Forecasts and Forecast Intervals for GDP Growth
9.7.2 Forecasting with an ARDL Model
Consider forecasting future unemployment using
the Okun’s Law ARDL(1,1):
DUt  δ  θ1DUt 1  δ0Gt  δ1Gt 1  vt
Eq. 9.69
The value of DU in the first post-sample quarter
is:
Eq. 9.70
DUT 1  δ  θ1DUT  δ0GT 1  δ1GT  vT 1
– But we need a value for GT+1
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 122
9.7
Forecasting
9.7.2
Forecasting with
an ARDL Model
Now consider the change in unemployment
– Rewrite Eq. 9.70 as:
UT 1 UT  δ  θ1 UT UT 1   δ0GT 1  δ1GT  vT 1
– Rearranging:
Eq. 9.71
U T 1  δ   θ1  1U T  θ1U T 1  δ0GT 1  δ1GT  vT 1
Principles of Econometrics, 4th Edition
 δ  θ1*U T  θ*2U T 1  δ0GT 1  δ1GT  vT 1
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 123
9.7
Forecasting
9.7.2
Forecasting with
an ARDL Model
For the purpose of computing point and interval
forecasts, the ARDL(1,1) model for a change in
unemployment can be written as an ARDL(2,1)
model for the level of unemployment
– This result holds not only for ARDL models
where a dependent variable is measured in
terms of a change or difference, but also for
pure AR models involving such variables
9.7.3 Exponential Smoothing
Another popular model used for predicting the
future value of a variable on the basis of its history
is the exponential smoothing method
– Like forecasting with an AR model, forecasting
using exponential smoothing does not utilize
information from any other variable
One possible forecasting method is to use the
average of past information, such as:
yT  yT 1  yT  2
yˆT 1 
3
– This forecasting rule is an example of a simple
(equally-weighted) moving average model with
k=3
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 126
9.7
Forecasting
9.7.3
Exponential
Smoothing
Now consider a form in which the weights decline
exponentially as the observations get older:
Eq. 9.72:  ŷ_{T+1} = αy_T + α(1 − α)¹ y_{T-1} + α(1 − α)² y_{T-2} + …
– We assume that 0 < α < 1
– Also, it can be shown that:
Σ_{s=0}^{∞} α(1 − α)^s = 1
For forecasting, recognize that:
Eq. 9.73:  (1 − α)ŷ_T = α(1 − α)y_{T-1} + α(1 − α)² y_{T-2} + α(1 − α)³ y_{T-3} + …
– We can simplify to:
Eq. 9.74:  ŷ_{T+1} = αy_T + (1 − α)ŷ_T
The value of α can reflect one’s judgment about
the relative weight of current information
– It can be estimated from historical information
by obtaining within-sample forecasts:
yˆt  αyt 1  1 α yˆt 1 t  2,3, , T
Eq. 9.75
• Choosing α that minimizes the sum of
squares of the one-step forecast errors:
Eq. 9.76
Principles of Econometrics, 4th Edition
vt  yt  yˆt  yt   αyt 1 + 1  α  yˆt 1 
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 129
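A sketch of the smoothing recursion of Eqs. 9.74–9.75; initializing the recursion at ŷ_1 = y_1 is an assumption, and software packages differ in how they set this starting value:

```python
import numpy as np

def exp_smooth_forecast(y, alpha: float) -> float:
    """One-step-ahead exponential smoothing forecast via Eqs. 9.74-9.75."""
    y = np.asarray(y, dtype=float)
    yhat = y[0]                                          # within-sample smoothed value, yhat_1 = y_1
    for t in range(1, len(y)):
        yhat = alpha * y[t - 1] + (1 - alpha) * yhat     # Eq. 9.75: yhat_t
    return alpha * y[-1] + (1 - alpha) * yhat            # Eq. 9.74: forecast of y_{T+1}

# e.g. exp_smooth_forecast(G.to_numpy(), alpha=0.38)
```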
FIGURE 9.12 (a) Exponentially smoothed forecasts for GDP growth with α = 0.38
FIGURE 9.12 (b) Exponentially smoothed forecasts for GDP growth with α = 0.8
The forecasts for 2009Q4 from each value of α
are:
α  0.38 : GˆT 1  αGT  1  α  GˆT  0.38  0.8  1  0.38   0.403921
= 0.0536
α  0.8 : GˆT 1  αGT  1  α  GˆT  0.8  0.8  1  0.8    0.393578
= 0.5613
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 132
9.8
Multiplier Analysis
Multiplier analysis refers to the effect, and the
timing of the effect, of a change in one variable on
the outcome of another variable
Let’s find multipliers for an ARDL model of the
form:
Eq. 9.77:  y_t = δ + θ_1 y_{t-1} + … + θ_p y_{t-p} + δ_0 x_t + δ_1 x_{t-1} + … + δ_q x_{t-q} + v_t
– We can transform this into an infinite distributed lag model:
Eq. 9.78:  y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + β_3 x_{t-3} + … + e_t
The multipliers are defined as:
β_s = ∂y_t / ∂x_{t-s}  (s-period delay multiplier)
Σ_{j=0}^{s} β_j  (s-period interim multiplier)
Σ_{j=0}^{∞} β_j  (total multiplier)
The lag operator is defined as:
Lyt  yt 1
– Lagging twice, we have:
L  Lyt   Lyt 1  yt 2
– Or:
L2 yt  yt 2
– More generally, we have:
Ls yt  yt s
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 137
9.8
Multiplier Analysis
Now rewrite our model as:
yt    1 Lyt  2 L2 yt 
  p Lp yt  0 xt  1Lxt   2 L2 xt
Eq. 9.79

Principles of Econometrics, 4th Edition
 q Lq xt  vt
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 138
9.8
Multiplier Analysis
Rearranging terms:
Eq. 9.80:  (1 − θ_1 L − θ_2 L² − … − θ_p L^p) y_t = δ + (δ_0 + δ_1 L + δ_2 L² + … + δ_q L^q) x_t + v_t
Let’s apply this to our Okun’s Law model
– The model:
DUt  δ  θ1DUt 1  δ0Gt  δ1Gt 1  vt
Eq. 9.81
can be rewritten as:
Eq. 9.82
Principles of Econometrics, 4th Edition
1 θ1L DUt  δ  δ0  δ1L Gt  vt
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 140
9.8
Multiplier Analysis
Define the inverse of (1 – θ1L) as (1 – θ1L)-1 such
that:
1  θ1L  1  θ1L   1
1
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 141
9.8
Multiplier Analysis
Multiply both sides of Eq. 9.82 by (1 – θ1L)-1:
DU t  1  θ1 L  δ  1  θ1 L 
1
Eq. 9.83
1
 δ0  δ1L  Gt  1  θ1L 
1
– Equating this with the infinite distributed lag
representation:
DU t  α  β0Gt  β1Gt 1  β 2Gt 2  β3Gt 3 
Eq. 9.84
Principles of Econometrics, 4th Edition

 α  β0  β1L  β 2 L2  β3 L3 
Chapter 9: Regression with Time Series Data:
Stationary Variables
G  e
t
 et
t
Page 142
vt
9.8
Multiplier Analysis
For Eqs. 9.83 and 9.84 to be identical, it must be
true that:
Eq. 9.85:  α = (1 − θ_1 L)⁻¹δ
Eq. 9.86:  β_0 + β_1 L + β_2 L² + β_3 L³ + … = (1 − θ_1 L)⁻¹(δ_0 + δ_1 L)
Eq. 9.87:  e_t = (1 − θ_1 L)⁻¹v_t
Multiply both sides of Eq. 9.85 by (1 – θ1L) to
obtain (1 – θ1L)α = δ.
– Note that the lag of a constant does not change it, so Lα = α
– Now we have:
(1 − θ_1)α = δ  and  α = δ/(1 − θ_1)
Multiply both sides of Eq. 9.86 by (1 − θ_1 L):
Eq. 9.88:  δ_0 + δ_1 L = (1 − θ_1 L)(β_0 + β_1 L + β_2 L² + β_3 L³ + …)
                       = β_0 + β_1 L + β_2 L² + β_3 L³ + … − β_0θ_1 L − β_1θ_1 L² − β_2θ_1 L³ − …
                       = β_0 + (β_1 − β_0θ_1)L + (β_2 − β_1θ_1)L² + (β_3 − β_2θ_1)L³ + …
Rewrite Eq. 9.86 as:
Eq. 9.89:  δ_0 + δ_1 L + 0L² + 0L³ + … = β_0 + (β_1 − β_0θ_1)L + (β_2 − β_1θ_1)L² + (β_3 − β_2θ_1)L³ + …
– Equating coefficients of like powers in L yields:
δ_0 = β_0
δ_1 = β_1 − β_0θ_1
0 = β_2 − β_1θ_1
0 = β_3 − β_2θ_1
and so on
We can now find the β’s using the recursive
equations:
Eq. 9.90:  β_0 = δ_0
           β_1 = δ_1 + β_0θ_1
           β_j = β_{j-1}θ_1 for j ≥ 2
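A sketch of the recursion of Eq. 9.90 for an ARDL(1,1), together with the implied interim and total multipliers (the closed-form total assumes |θ_1| < 1):

```python
import numpy as np

def ardl11_multipliers(delta0, delta1, theta1, horizon=4):
    """Delay multipliers via Eq. 9.90, plus interim and total multipliers."""
    betas = [delta0, delta1 + delta0 * theta1]        # beta_0, beta_1
    for _ in range(2, horizon + 1):
        betas.append(betas[-1] * theta1)              # beta_j = beta_{j-1} * theta_1, j >= 2
    interim = np.cumsum(betas)
    total = betas[0] + betas[1] / (1 - theta1)        # beta_0 + beta_1/(1 - theta_1)
    return np.array(betas), interim, total

# With the Okun's law estimates: ardl11_multipliers(-0.1841, -0.0992, 0.3501)
# gives delay multipliers of roughly -0.184, -0.164, -0.057, -0.020, -0.007
```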
You can start from the equivalent of Eq. 9.88
which, in its general form, is:
δ0  δ1L  δ2 L2 
Eq. 9.91

 δq Lq  1  θ1L  θ2 L2 
 θ p Lp


 β0  β1L  β2 L2  β3 L3 

– Given the values p and q for your ARDL
model, you need to multiply out the above
expression, and then equate coefficients of like
powers in the lag operator
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 148
9.8
Multiplier Analysis
For the Okun’s Law model:
DU t  0.3780  0.3501DUt 1  0.1841Gt  0.0992Gt 1
– The impact and delay multipliers for the first
four quarters are:
βˆ 0 = δˆ 0  0.1841
βˆ  δˆ  βˆ θˆ  0.099155  0.184084  0.350116  0.1636
1
1
0 1
βˆ 2  βˆ 1θˆ 1  0.163606  0.350166  0.0573
βˆ  βˆ θˆ  0.057281 0.350166  0.0201
3
2 1
βˆ 4  βˆ 3θˆ 1  0.020055  0.350166  0.0070
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 149
FIGURE 9.13 Delay multipliers from Okun's law ARDL(1,1) model
We can estimate the total multiplier given by:
Σ_{j=0}^{∞} β_j
and the normal growth rate that is needed to maintain a constant rate of unemployment:
G_N = −α / Σ_{j=0}^{∞} β_j
We can show that:
Σ_{j=0}^{∞} β̂_j = δ̂_0 + (δ̂_1 + δ̂_0θ̂_1)/(1 − θ̂_1) = −0.184084 + (−0.163606)/(1 − 0.350116) = −0.4358
– An estimate for α is given by:
α̂ = δ̂/(1 − θ̂_1) = 0.37801/0.649884 = 0.5817
– Therefore, the normal growth rate is:
Ĝ_N = −α̂ / Σ β̂_j = 0.5817/0.4358 ≈ 1.3% per quarter
Key Words
AIC criterion; AR(1) error; AR(p) model; ARDL(p,q) model; autocorrelation; autoregressive distributed lags; autoregressive error; autoregressive model; BIC criterion; correlogram; delay multiplier; distributed lag weight; dynamic models; exponential smoothing; finite distributed lag; forecast error; forecast intervals; forecasting; HAC standard errors; impact multiplier; infinite distributed lag; interim multiplier; lag length; lag operator; lagged dependent variable; LM test; multiplier analysis; nonlinear least squares; out-of-sample forecasts; sample autocorrelations; serial correlation; standard error of forecast error; SC criterion; total multiplier; T × R² form of LM test; within-sample forecasts
Appendices
9A The Durbin-Watson Test
For the Durbin-Watson test, the hypotheses are:
H0 :   0
H1 :   0
The test statistic is:
T
Eq. 9A.1
d
  eˆt  eˆt 1 
2
t 2
T
2
ˆ
e
t
t 1
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 156
We can expand the test statistic as:
Eq. 9A.2:  d = [Σ_{t=2}^{T} ê_t² + Σ_{t=2}^{T} ê_{t-1}² − 2Σ_{t=2}^{T} ê_t ê_{t-1}] / Σ_{t=1}^{T} ê_t²
             = Σ_{t=2}^{T} ê_t² / Σ_{t=1}^{T} ê_t² + Σ_{t=2}^{T} ê_{t-1}² / Σ_{t=1}^{T} ê_t² − 2Σ_{t=2}^{T} ê_t ê_{t-1} / Σ_{t=1}^{T} ê_t²
             ≈ 1 + 1 − 2r_1
We can now write:
d  2 1  r1 
Eq. 9A.3
– If the estimated value of ρ is r1 = 0, then the
Durbin-Watson statistic d ≈ 2
• This is taken as an indication that the model
errors are not autocorrelated
– If the estimate of ρ happened to be r1 = 1 then
d≈0
• A low value for the Durbin-Watson statistic
implies that the model errors are correlated,
and ρ > 0
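A small sketch of Eq. 9A.1 computed directly from residuals (statsmodels also provides a `durbin_watson` function in `statsmodels.stats.stattools`):

```python
import numpy as np

def durbin_watson_stat(resid) -> float:
    """Durbin-Watson d of Eq. 9A.1 from a vector of residuals."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# d near 2 suggests no first-order autocorrelation; d near 0 suggests rho > 0
```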
FIGURE 9A.1 Testing for positive autocorrelation
9A.1 The Durbin-Watson Bounds Test
FIGURE 9A.2 Upper and lower critical value bounds for the Durbin-Watson test
Decision rules, known collectively as the Durbin-Watson bounds test:
– If d < d_Lc: reject H0: ρ = 0 and accept H1: ρ > 0
– If d > d_Uc: do not reject H0: ρ = 0
– If d_Lc < d < d_Uc: the test is inconclusive
9B Properties of the AR(1) Error
Note that:
et  ρet 1  vt
 ρ  ρet  2  vt 1   vt
Eq. 9B.1
 ρ2 et  2  ρvt 1  vt
Further substitution shows that:
Eq. 9B.2
Principles of Econometrics, 4th Edition
et  ρ2  ρet 3  vt 2   ρvt 1  vt
 ρ3et 3  ρ2vt 2  ρvt 1  vt
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 162
Repeating the substitution k times and rearranging:
Eq. 9B.3:  e_t = ρ^k e_{t-k} + v_t + ρv_{t-1} + ρ²v_{t-2} + … + ρ^{k-1}v_{t-k+1}
If we let k → ∞, then we have:
Eq. 9B.4:  e_t = v_t + ρv_{t-1} + ρ²v_{t-2} + ρ³v_{t-3} + …
We can now find the properties of et:
E  et   E  vt   ρE  vt 1   ρ 2 E  vt  2   ρ3 E  vt 3  
 0  ρ  0  ρ 2  0  ρ3  0 
0
var  et   var  vt   ρ 2 var  vt 1   ρ 4 var  vt 2   ρ6 var  vt 3  
 v2  ρ 2 v2  ρ 4 v2  ρ6 v2 
 v2 1  ρ 2  ρ 4  ρ6 

v2

1  ρ2
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 164
The covariance for one period apart is:
cov  et , et 1   E  et et t 
2
3

 E  vt  ρvt 1  ρ vt  2  ρ vt 3 

 v  ρv  ρ v  ρ v  
 ρE  v   ρ E  v   ρ E  v  
 ρ 1  ρ  ρ  
t 1
t 2
2
t 1
3
2
v
2
2
2
t 2
3
t 3
5
t 4
2
t 3
4
ρv2

1  ρ2
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 165
Similarly, the covariance for k periods apart is:
ρ k v2
cov  et , et  k  
1  ρ2
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
k 0
Page 166
9C Generalized Least Squares Estimation
We are considering the simple regression model
with AR(1) errors:
yt  1  2 xt  et
et  et 1  vt
To specify the transformed model we begin with:
yt  1  2 xt  yt 1  1  2 xt 1  vt
Eq. 9C.1
– Rearranging terms:
Eq. 9C.2
Principles of Econometrics, 4th Edition
yt yt 1  1 1  2  xt xt 1   vt
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 167
Defining the following transformed variables:
yt  yt  yt 1
xt2  xt  xt 1
xt1  1  
Substituting the transformed variables, we get:
Eq. 9C.3
Principles of Econometrics, 4th Edition
yt  xt11  xt22  vt
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 168
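A minimal sketch of the transformation of Eqs. 9C.2–9C.3 for a given ρ, dropping the first observation (the Cochrane–Orcutt variant rather than the full GLS treatment of observation one described below):

```python
import numpy as np
import statsmodels.api as sm

def ar1_gls(y, x, rho: float):
    """OLS on the rho-transformed data of Eq. 9C.3 (first observation dropped)."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    y_star = y[1:] - rho * y[:-1]
    x1_star = np.full(len(y_star), 1 - rho)          # transformed intercept column
    x2_star = x[1:] - rho * x[:-1]
    return sm.OLS(y_star, np.column_stack([x1_star, x2_star])).fit()
```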
There are two problems:
1. Because lagged values of yt and xt had to be
formed, only (T - 1) new observations were
created by the transformation
2. The value of the autoregressive parameter ρ is
not known
For the second problem, we can write Eq. 9C.1 as:
yt 1 2 xt  ( yt 1 1 2 xt 1 )  vt
Eq. 9C.4
For the first problem, note that:
y1  1  x12  e1
and that
1  2 y1  1  2 1  1  2 x12  1  2 e1
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data:
Stationary Variables
Page 170
Or:
Eq. 9C.5:  y_1* = x_{11}*β_1 + x_{12}*β_2 + e_1*
where
Eq. 9C.6:  y_1* = √(1 − ρ²) y_1,  x_{11}* = √(1 − ρ²),  x_{12}* = √(1 − ρ²) x_1,  e_1* = √(1 − ρ²) e_1
To confirm that the variance of e_1* is the same as that of the errors (v_2, v_3, …, v_T), note that:
var(e_1*) = (1 − ρ²)var(e_1) = (1 − ρ²) × σ_v²/(1 − ρ²) = σ_v²