AR (autocorrelation) occurs when something that happens today has an impact on
what happens in the future.
It is often found in time series: typically in financial data (returns, sales), macro data, and wage data.
Econometric models:
• AR(1) errors occur when y_i = X_i β + ε_i and
  ε_i = ρ ε_{i-1} + u_i
  where ρ is the autocorrelation coefficient, |ρ| < 1 and u_i ∼ N(0, σ_u²).
• The consequences for OLS: β̂ is unbiased and consistent but no longer efficient,
  and the usual statistical inference is rendered invalid.
• Note: In general we can have AR(p) errors, which implies p lagged terms in the
  error structure, i.e.,
  ε_i = ρ1 ε_{i-1} + ρ2 ε_{i-2} + · · · + ρp ε_{i-p} + u_i
• var(ε_i) is
  var(ε_i) = σ_u² + ρ² σ_u² + ρ⁴ σ_u² + · · ·
           = σ_u² + ρ² var(ε_{i-1})
  But, assuming homoscedasticity, var(ε_i) = var(ε_{i-1}), so that
  var(ε_i) = σ_u² + ρ² var(ε_i)
  var(ε_i) = σ_u² / (1 − ρ²) ≡ σ²
• Note: This is why we need |ρ| < 1 for stability in the process.
  If |ρ| > 1, then the denominator is negative, and var(ε_i) cannot be negative.
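The variance result above is easy to check numerically. Below is a pure-Python sketch with illustrative values ρ = 0.7 and σ_u = 1 (not taken from the notes): the sample variance of a long simulated AR(1) error series should approach σ_u²/(1 − ρ²).

```python
# Numerical check of var(e_i) = sigma_u^2 / (1 - rho^2) for AR(1) errors,
# using illustrative values rho = 0.7, sigma_u = 1.
import random

random.seed(0)
rho, sigma_u, n = 0.7, 1.0, 200_000
e, ssq = 0.0, 0.0
for _ in range(n):
    e = rho * e + random.gauss(0, sigma_u)   # e_i = rho*e_{i-1} + u_i
    ssq += e * e

sample_var = ssq / n
theory = sigma_u ** 2 / (1 - rho ** 2)       # = 1.9607...
print(round(sample_var, 2), round(theory, 2))  # both close to 1.96
```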
• Note:
  1. The OLS estimate s² is biased but consistent.
  2. s² is usually biased downward, because we usually find ρ > 0 in economic
     data.
One way to solve the AR problem is to get the error term of the estimated equation
to satisfy the full ideal conditions, by substitution.
Consider the model we estimate: y_t = β0 + β1 X_t + ε_t, where ε_t = ρ ε_{t-1} + u_t
and u_t ∼ (0, σ_u²).
Rewrite the original model as
  y_t = β0 + β1 X_t + ρ ε_{t-1} + u_t
but ε_{t-1} = y_{t-1} − β0 − β1 X_{t-1},
thus
  y_t = β0 + β1 X_t + ρ (y_{t-1} − β0 − β1 X_{t-1}) + u_t
  y_t − ρ y_{t-1} = β0 (1 − ρ) + β1 (X_t − ρ X_{t-1}) + u_t
  ⇒ y_t* = β0* + β1 X_t* + u_t
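The quasi-differencing substitution above can be sketched as follows. This assumes ρ is known; in practice it must be estimated (e.g. Cochrane–Orcutt iterates between the two steps). The parameter values are illustrative.

```python
# Sketch of the quasi-differencing fix for AR(1) errors (rho assumed known).
import random

random.seed(0)
rho, beta0, beta1, sigma_u = 0.8, 2.0, 1.5, 1.0

# Simulate y_t = beta0 + beta1*X_t + e_t with e_t = rho*e_{t-1} + u_t
n = 5000
X = [random.gauss(0, 1) for _ in range(n)]
e, y = 0.0, []
for t in range(n):
    e = rho * e + random.gauss(0, sigma_u)
    y.append(beta0 + beta1 * X[t] + e)

# Quasi-difference: y*_t = y_t - rho*y_{t-1}, X*_t = X_t - rho*X_{t-1}
y_star = [y[t] - rho * y[t - 1] for t in range(1, n)]
x_star = [X[t] - rho * X[t - 1] for t in range(1, n)]

# OLS on the transformed data: slope estimates beta1,
# intercept estimates beta0*(1 - rho)
mx = sum(x_star) / len(x_star)
my = sum(y_star) / len(y_star)
b1 = sum((xs - mx) * (ys - my) for xs, ys in zip(x_star, y_star)) / \
     sum((xs - mx) ** 2 for xs in x_star)
b0_star = my - b1 * mx
print(round(b1, 2), round(b0_star / (1 - rho), 2))  # roughly beta1 and beta0
```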
The ARCH Regression Model
The disturbances in a linear regression model follow an ARCH process:
  y_t = x_t′ β + ε_t
  E_{t-1}(ε_t²) ≡ σ_t² = ω + α(L) ε_t²
  ε_t | Ψ_{t-1} ∼ N(0, σ_t²)
where x_t includes lagged dependent and exogenous variables.
Time Series
ARCH MODELS (Murray 11.6)
ARIMA Formulation
1. Extract the deterministic components to obtain a STATIONARY time series
   • De-trending
   • Differencing
2. Find a proper MODEL to describe the stochastic behavior
   • Model selection and identification
   • Parameter estimation
3. Test the non-modeled RESIDUALS to make sure they don't carry any information (model adequacy)
   • Ideally, residuals should be white noise
   • Several statistical tests
4. ARCH
5. Forecast
Example 1 (Amazon Series).
The Amazon series, Brocklebank & Dickey (2003), represents daily
stock prices from May 16, 1997 to May 25, 1999. The following scenario is indicative of analyzing
security prices within the ARMA modelling framework.
• Plot the x_t series: it has a nonconstant variance.
  A natural-log transformation gives the ln(x_t) series.
Figure 1: Amazon (x_t) and ln(Amazon) closing prices.
• Plot the ACF and PACF. As the lag length h increases, the estimated autocorrelations for x_t and ln(x_t) decay slowly.
Figure 2: Amazon series (x_t) autocorrelations (ACF and PACF).
Hence, first differences are taken to correct for nonstationarity in the mean: yt = ln(xt ) − ln(xt−1 ).
• After we address nonconstant variance and nonstationarity, the y_t series resembles a white noise
process (see Figure 3). This is rather unfortunate. Any suggestions?
Figure 3: First differenced log Amazon series (y_t) and autocorrelations (ACF and PACF of lamz.diff).
Hence, our final model for y_t is an ARIMA(0,1,0): ln(x_t) − ln(x_{t-1}) = ε_t. This is well known as the
random walk model or stock market model. Random walk theory merely states that future price
movements cannot be predicted from past price movements alone. For example, the change in the stock
price from time t to t+1 is unpredictable with past information. What can we, as econometricians, do?
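The random-walk result can be sketched numerically: if ln(x_t) follows a random walk, the first-differenced log series y_t is white noise, so past returns carry no (linear) information about future returns. The volatility value below is illustrative.

```python
# Sketch: differencing a simulated random walk in log prices yields
# an (approximately) uncorrelated return series.
import random

random.seed(1)
n = 2000
log_price = [4.0]
for _ in range(n - 1):
    # ln(x_t) = ln(x_{t-1}) + e_t  (illustrative daily volatility 3%)
    log_price.append(log_price[-1] + random.gauss(0, 0.03))

y = [log_price[t] - log_price[t - 1] for t in range(1, n)]  # y_t = diff(ln x_t)

# Lag-1 sample autocorrelation of y_t should be near zero
m = sum(y) / len(y)
num = sum((y[t] - m) * (y[t - 1] - m) for t in range(1, len(y)))
r1 = num / sum((v - m) ** 2 for v in y)
print(round(r1, 3))  # near 0: returns unpredictable from their own past
```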
Example 2 (IBM Series).
The IBM series represents daily stock returns from February 2, 1984 to
December 31, 1991, Zivot and Wang (2003).
• Firstly, the distribution of y_t has heavier tails than a normal distribution.
• Kurtosis, the normalized fourth central moment of a distribution, is defined as
  κ = µ4/µ2² and measures the degree of peakedness of a distribution.
• The standard normal distribution has a kurtosis of
  κ_{N(0,1)} = µ4/µ2² = 3/1² = 3. In the literature, leptokurtic is often used to describe distributions that are peaked and have fat tails.
Sample moments:
  mean = 0.0001348, std = 0.01443, skewness = −2.004, kurtosis = 38.27
• Secondly, the changes in y_t tend to be clustered. (This may be easier to visualize in a graph of the
squared y_t series, and even easier to see in Example 3.) Hence, dependence in the variability, or
volatility, of the observed values is present.
Figure 4: IBM series: y_t and y_t² (daily stock returns of IBM and IBM squared, 1984–1992).
Figure 5: IBM series: y_t and y_t² correlations (ACF and PACF of ibm.s and ibm.s²).
• And finally, the y_t² series is correlated and nonnegative.
What does this imply?
Example 3 (Copper Series).
In this example, the concept of volatility clustering is visually represented more clearly. The copper series represents the cash settlement of copper prices in U.S. dollars
($) in the spot market on the London Metal Exchange from January 3, 1989 to October 31, 2002.
Figure: Copper series: y_t and y_t² (log returns of copper and copper squared).
Any economic implications of volatility clustering?
2. ARCH(1) Model
  y_t = σ_t ε_t                    (2.1)
  σ_t² = α0 + α1 y_{t-1}²          (2.2)
where ε_t ∼ iid(0, 1). Some notes on the ARCH(1) model.
1. Model constraints. As with ARMA models, one must impose constraints on the parameters, α0
and α1 , in order to obtain tractable properties, such as, σt2 > 0. Think back to the stationarity
requirements of ARMA processes. How important was this property with respect to estimation,
forecasting, etc.? More specific constraints on the model parameters will be derived below.
2. Models with ARCH errors. We can think of y_t as a white noise process whose variance
is a function of past variances. However, if a process is not initially white noise, i.e., some correlation
structure among the residuals exists, the researcher may need to first fit a regression or ARMA
model, output the residuals, and then model them as an ARCH process.
3. Examine the behaviour of y_t conditionally. Assume the following distributional assumption
on the error series: ε_t ∼ N(0, 1). Rewrite the ARCH(1) model as y_t = √(α0 + α1 y_{t-1}²) ε_t. Conditional on y_{t-1}, y_t has a normal distribution: y_t | y_{t-1} ∼ N(0, α0 + α1 y_{t-1}²). Some standard results
follow:
  • E(y_t | y_{t-1}) = 0
  • V(y_t | y_{t-1}) = E(y_t² | y_{t-1}) − [E(y_t | y_{t-1})]² = E(y_t² | y_{t-1}) = α0 + α1 y_{t-1}² = σ_t²
Hence, the conditional variance of y_t, V(y_t | y_{t-1}), is a function of y_{t-1}².
This is where the AR (autoregressive) and C (conditional) parts of ARCH originate.
4. Non-normal AR(1) model for y_t² with ν_t errors. Likewise, we could express the model as:
  y_t² = y_t² + (σ_t² − σ_t²)
       = (σ_t ε_t)² + α0 + α1 y_{t-1}² − σ_t²
       = α0 + α1 y_{t-1}² + σ_t² (ε_t² − 1)
       = α0 + α1 y_{t-1}² + ν_t
where ν_t = σ_t² (ε_t² − 1). Since ε_t ∼ iid N(0, 1), ε_t² ∼ iid χ²₁. As a result, (ε_t² − 1) is a shifted (to have
mean zero) χ²₁ random variable.
5. Examine the behaviour of y_t unconditionally.
Using the law of iterated expectations,
E_y(y) = E_x[E_{y|x}(y|x)], and the variance computing formula, V(y_t) = E(y_t²) − [E(y_t)]², the
following ARCH(1) properties are examined:
  • E(y_t) = E_{y_{t-1}}[E(y_t | y_{t-1})] = E_{y_{t-1}}[0] = 0
  • V(y_t) = E(y_t²) − [E(y_t)]² = E(y_t²)
           = E_{y_{t-1}}[E(y_t² | y_{t-1})]
           = E_{y_{t-1}}[V(y_t | y_{t-1}) + [E(y_t | y_{t-1})]²]
           = E_{y_{t-1}}[V(y_t | y_{t-1})]
           = E_{y_{t-1}}(α0 + α1 y_{t-1}²)
           = α0 + α1 E(y_{t-1}²) = α0 + α1 E(y_t²)   (by stationarity)
           = α0 + α1 (V(y_t) + [E(y_t)]²)
           = α0 + α1 V(y_t)
Thus, V(y_t) = α0/(1 − α1). Because the variance of y_t must be positive, α0 > 0, and the
support of α1 is restricted to the set [0, 1). Typically, this constraint is stated as 0 ≤ α1 < 1.
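The unconditional-variance formula V(y_t) = α0/(1 − α1) can be verified by simulation. A minimal sketch, with illustrative parameter values:

```python
# Check V(y_t) = alpha0 / (1 - alpha1) for a simulated ARCH(1) process.
import math
import random

random.seed(2)
a0, a1 = 0.5, 0.4            # alpha0 > 0, 0 <= alpha1 < 1 (illustrative)
n = 200_000
y_prev, ssq = 0.0, 0.0
for _ in range(n):
    sigma2 = a0 + a1 * y_prev ** 2         # conditional variance
    y_prev = math.sqrt(sigma2) * random.gauss(0, 1)
    ssq += y_prev * y_prev

sample_var = ssq / n
theory = a0 / (1 - a1)                     # = 0.8333...
print(round(sample_var, 2), round(theory, 2))  # both close to 0.83
```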
6. Higher order moments of y_t. In some applications, assumptions on higher moments of y_t
are necessary. This is critical in extreme value theory (EVT) settings, such as stress testing. In
particular, we require the fourth moment to be finite: E(y_t⁴) < ∞, which holds provided that
3α1² < 1; the variance of y_t², V(y_t²) = E(y_t⁴) − [E(y_t²)]², is then also finite. Combining this
result with the previous constraint: 0 ≤ α1² < 1/3, or alternatively 0 ≤ α1 < √(1/3).
  E(y_t⁴) = 3 α0² (1 − α1²) / [(1 − α1)² (1 − 3α1²)]
The kurtosis of y_t is:
  κ = µ4/µ2² = 3 (1 − α1²) / (1 − 3α1²) > 3   (for α1 > 0)
7. Alternative representation. Let ε_t be a sequence with mean zero and conditional variance
σ_t². In other words, E(ε_t) = 0 and V(ε_t | ℑ_{t-1}) = E(ε_t² | ℑ_{t-1}) − [E(ε_t | ℑ_{t-1})]² = E(ε_t² | ℑ_{t-1}) = σ_t²,
where ℑ_{t-1} represents the set of information up to time t − 1. Then the following equations
alternatively represent an ARCH(1) process:
  y_t = ε_t                        (2.3)
  σ_t² = α0 + α1 ε_{t-1}²          (2.4)
If Equation 2.4 is rewritten such that
  E(ε_t² | ℑ_{t-1}) = α0 + α1 ε_{t-1}²
  ε_t² = α0 + α1 ε_{t-1}² + [ε_t² − E(ε_t² | ℑ_{t-1})]
  ε_t² = α0 + α1 ε_{t-1}² + ω_t
where ω_t = ε_t² − E(ε_t² | ℑ_{t-1}) is a mean-zero sequence, then the last equation above
represents an AR(1) process for ε_t². This is where the AR (autoregressive) and C (conditional)
parts of ARCH originate in the alternative representation.
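The AR(1)-in-ε_t² representation suggests a simple check: regressing ε_t² on ε_{t-1}² for a simulated ARCH(1) series should roughly recover α1 as the slope and α0 as the intercept. A sketch with illustrative values:

```python
# Sketch: OLS of eps_t^2 on eps_{t-1}^2 recovers the ARCH(1) parameters.
import math
import random

random.seed(3)
a0, a1 = 1.0, 0.3               # illustrative values, 3*a1^2 < 1
n = 100_000
eps_prev, e2 = 0.0, []
for _ in range(n):
    sigma2 = a0 + a1 * eps_prev ** 2
    eps_prev = math.sqrt(sigma2) * random.gauss(0, 1)
    e2.append(eps_prev ** 2)

x = e2[:-1]                      # eps_{t-1}^2
z = e2[1:]                       # eps_t^2
mx, mz = sum(x) / len(x), sum(z) / len(z)
slope = sum((a - mx) * (b - mz) for a, b in zip(x, z)) / \
        sum((a - mx) ** 2 for a in x)
intercept = mz - slope * mx
print(round(intercept, 1), round(slope, 1))  # roughly a0 = 1.0 and a1 = 0.3
```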
Figure: White noise, ARCH(2) simulation [1] (conditional standard deviation provided), and ARCH(2) simulation [2] (first hundred observations dropped).
Illustrating Example
Case: Dow Jones Data
Figure: Dow Jones Index, time in days from 1/1/1975 to 07/30/2005.
Fitting a model on Residuals:
Figure: information criterion values for candidate AR(p) and ARMA(p,q) models fitted to the residuals (ARMA(1,1), ARMA(1,2), ARMA(2,1), ARMA(2,2) compared).
Figure: Dow Jones residuals R_t and white noise Z_t, with an N(0, σ²) reference band, together with their sample autocorrelations up to lag 20.
The Dow Jones residuals seem uncorrelated, but the McLeod-Li test says they are not IID.
Figure: Dow Jones residuals R_t, the McLeod-Li test of Z_t (the test fails), and a normal QQ plot of the residuals against standard normal quantiles.
The Dow Jones residuals seem:
1. uncorrelated,
2. not IID, and
3. non-Gaussian (heavy-tailed).
Need a Better Model
(Non-Gaussian)
Volatility Clustering
Figure: Dow Jones returns, time in days from 1/1/1975 to 07/30/2005; volatility clustering is visible.
• Volatility in financial assets comes in clusters: high-volatility and low-volatility
  regimes are persistent (first observed by Benoit Mandelbrot in 1963).
• Volatility follows a dynamic behavior, and there is mean reversion in volatility.
ARCH Model
• Volatility is one of the most important parameters in statistical analysis,
  and in particular in option pricing.
• Engle's 2003 Nobel citation was "for methods of analyzing economic
  time series with time-varying volatility", specifically the concept known as
  ARCH: autoregressive conditional heteroscedasticity. ARCH models
  can accurately capture the long-term properties of many time series, and
  have become an indispensable tool for researchers and analysts
  studying the financial markets and problems of risk evaluation.
Autoregressive Process
Let {Z_t} be WN(0, σ²), and consider the process
  X_t = φ X_{t-1} + Z_t
Consider φ = 0:
  X_t = Z_t,  Z_t ≈ IIDN(0, σ²)
so the series has constant volatility.
Figure: simulated constant-volatility series.
We introduce a dynamic volatility process:
  X_t = σ_t Z_t,  Z_t ≈ IIDN(0, 1)
  σ_t² = ω0 + ω1 X_{t-1}²
Volatility dynamics: the ARCH(1) process (feedback from past returns into current volatility).
ARCH(1) process example:
  X_t = σ_t Z_t,  Z_t ≈ IIDN(0, 1)
  σ_t² = 1 + 0.9 X_{t-1}²
Figure: the simulated ARCH(1) series shows varying volatility, and its volatility path shows mean reversion. Trading opportunities!
GARCH
Generalized Autoregressive Conditional Heteroscedasticity
GARCH(1,1) model fitted to the Dow Jones index:
  X_t = σ_t Z_t
  σ_t² = 1 + 0.0826 X_{t-1}² + 0.8895 σ_{t-1}²
  e_t = X_t / σ_t
Figure: Dow Jones Index, the residuals e_t, and the sample autocorrelations of e_t and e_t² up to lag 20.
GARCH fits much better than the other models!
Figure: simulation of the return series from the fitted GARCH(1,1) model, comparing the GARCH forecast with the ARMA forecast.
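The GARCH(1,1) recursion above is easy to sketch in code. The α and β below are the fitted values quoted in the notes, but ω is rescaled to 0.05 (the notes' ω = 1 is on the index's price scale); in this simulation e_t = Z_t by construction, while with real data dividing by the fitted σ_t plays the same role of removing the volatility dynamics.

```python
# GARCH(1,1) recursion: squared returns are autocorrelated,
# squared standardized residuals e_t = X_t / sigma_t are not.
import math
import random

random.seed(4)
omega, alpha, beta = 0.05, 0.0826, 0.8895   # omega rescaled (illustrative)
n = 50_000
sigma2 = omega / (1 - alpha - beta)          # start at unconditional variance
x = 0.0
x2, e2 = [], []
for _ in range(n):
    sigma2 = omega + alpha * x ** 2 + beta * sigma2
    z = random.gauss(0, 1)
    x = math.sqrt(sigma2) * z                # X_t = sigma_t * Z_t
    x2.append(x * x)
    e2.append(z * z)                         # e_t^2 = (X_t / sigma_t)^2

def acf1(s):
    """Lag-1 sample autocorrelation."""
    m = sum(s) / len(s)
    num = sum((s[t] - m) * (s[t - 1] - m) for t in range(1, len(s)))
    return num / sum((v - m) ** 2 for v in s)

print(round(acf1(x2), 2), round(acf1(e2), 2))  # clustering vs. none
```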
IGARCH
Integrated GARCH Models
  X_t = σ_t Z_t,  Z_t ≈ IIDN(0, 1)
  σ_t² = ω0 + ω1 X_{t-1}² + η1 σ_{t-1}²
If ω1 + η1 = 1, the variance process behaves like a random walk. Writing λ ∈ [0, 1]:
  σ_t² = ω0 + (1 − λ) X_{t-1}² + λ σ_{t-1}²
This is the I-GARCH model (mostly used in FX).
Figure: simulated I-GARCH series X_t and volatility σ_t with ω0 = 1 and λ = 0.94.
If ω0 = 0, the I-GARCH model becomes the Exponentially Weighted Moving Average (EWMA) model:
  σ_t² = (1 − λ) X_{t-1}² + λ σ_{t-1}²,  λ ∈ [0, 1]
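The EWMA recursion above (ω0 = 0) is equivalent to an exponentially weighted average of past squared returns. A minimal sketch with λ = 0.94, the value shown above, on a stand-in simulated return series:

```python
# EWMA variance: recursive form vs. direct exponentially weighted sum.
import random

random.seed(5)
lam = 0.94
returns = [random.gauss(0, 0.01) for _ in range(500)]  # stand-in returns

# Recursive form: sigma_t^2 = (1 - lam) * X_{t-1}^2 + lam * sigma_{t-1}^2
sigma2 = returns[0] ** 2                               # initialize with first obs
for x in returns[1:]:
    sigma2 = (1 - lam) * x ** 2 + lam * sigma2

# Direct form: exponentially decaying weights on past squared returns
n = len(returns)
direct = lam ** (n - 1) * returns[0] ** 2
for k in range(1, n):
    direct += (1 - lam) * lam ** (n - 1 - k) * returns[k] ** 2

print(abs(sigma2 - direct) < 1e-12)  # the two forms agree
```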
SV Model
A Stochastic Volatility (SV) model for volatility.
Observations (this is what we see):
  X_t = µ_X + σ_t Z_t,  Z_t ≈ IIDN(0, 1)
Unobservable or latent processes:
  σ_t² = exp(Y_t)
  Y_t = µ_Y + φ Y_{t-1} + ε_t,  ε_t ≈ IIDN(0, ν²)
Note: in a GARCH process, volatility is observable (a function of past data); in an SV model it is latent.
Example:
  X_t = σ_t Z_t,  Z_t ≈ IIDN(0, 1)
  σ_t² = exp(Y_t)
  Y_t = 0.92 Y_{t-1} + ε_t,  ε_t ≈ IIDN(0, 0.3²)
Figure: simulated paths of X_t, σ_t², and Y_t.
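The SV example above can be sketched as follows (φ = 0.92, ε_t ~ N(0, 0.3²), with µ_X = µ_Y = 0 as in the example). Because the latent log-variance Y_t has its own shock, σ_t is not a function of past X's; mixing over σ_t still makes the observed X_t heavier-tailed than Gaussian.

```python
# Simulate the SV model and check that X_t is leptokurtic.
import math
import random

random.seed(6)
phi, nu, n = 0.92, 0.3, 50_000
y_state, X = 0.0, []
for _ in range(n):
    y_state = phi * y_state + random.gauss(0, nu)         # Y_t (latent)
    X.append(math.exp(y_state / 2) * random.gauss(0, 1))  # X_t = sigma_t * Z_t

m = sum(X) / n
m2 = sum((v - m) ** 2 for v in X) / n
m4 = sum((v - m) ** 4 for v in X) / n
kurt = m4 / m2 ** 2
print(kurt > 3)   # leptokurtic, even though the driving noise is N(0, 1)
```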
Summary of ARCH/GARCH and SV Models in Financial Econometrics
A simple ARCH(1) model:
  r_t = ε_t,  ε_t ∼ N(0, σ_t²)
  σ_t² = ω + α ε_{t-1}²
• ARCH models are really ARs in disguise.
• Add ε_t² − σ_t² to both sides of σ_t² = ω + α ε_{t-1}²:
  σ_t² + ε_t² − σ_t² = ω + α ε_{t-1}² + ε_t² − σ_t²
  ε_t² = ω + α ε_{t-1}² + ε_t² − σ_t²
  ε_t² = ω + α ε_{t-1}² + ν_t
• This is an AR(1) in ε_t².
• ν_t = ε_t² − σ_t² is a mean-0 white noise (WN) process.
• It captures the surprise variance: ε_t² − σ_t² = σ_t² (e_t² − 1).
• Autocovariance: same as in AR(1)!
  E[(ε_t² − σ̄²)(ε_{t-1}² − σ̄²)] = α V[ε_t²]
The GARCH model
  r_t = µ_t + ε_t
  µ_t = φ0 + φ1 r_{t-1} + · · · + φS r_{t-S}
  ε_t ∼ N(0, σ_t²)
  σ_t² = ω + Σ_{p=1}^{P} α_p ε_{t-p}² + Σ_{q=1}^{Q} β_q σ_{t-q}²
• Add lagged variance to the evolution.
• A simple GARCH(1,1):
  r_t = ε_t,  ε_t ∼ N(0, σ_t²)
  σ_t² = ω + α ε_{t-1}² + β σ_{t-1}²
• Unconditional variance:
  E[σ_t²] = ω / (1 − α − β)
• Kurtosis:
  κ = 3 (1 + α + β)(1 − α − β) / (1 − 2αβ − 3α² − β²) > 3
• Stationarity:
  1 − α − β > 0
  ω > 0, α ≥ 0, β ≥ 0
• ARMA in disguise:
  σ_t² + ε_t² − σ_t² = ω + α ε_{t-1}² + β σ_{t-1}² + ε_t² − σ_t²
  ε_t² = ω + α ε_{t-1}² + β σ_{t-1}² + ε_t² − σ_t²
  ε_t² = ω + α ε_{t-1}² + β ε_{t-1}² − β ν_{t-1} + ν_t
  ε_t² = ω + (α + β) ε_{t-1}² − β ν_{t-1} + ν_t
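The unconditional-variance formula E[σ_t²] = ω/(1 − α − β) can be checked by simulation. The parameter values below are illustrative (not the Dow Jones estimates) and satisfy α + β < 1:

```python
# Check E[sigma_t^2] = omega / (1 - alpha - beta) for a simulated GARCH(1,1).
import math
import random

random.seed(8)
omega, alpha, beta = 0.2, 0.1, 0.8     # illustrative, alpha + beta < 1
n = 200_000
sigma2 = omega / (1 - alpha - beta)    # start at the unconditional variance
eps, ssq = 0.0, 0.0
for _ in range(n):
    sigma2 = omega + alpha * eps ** 2 + beta * sigma2
    eps = math.sqrt(sigma2) * random.gauss(0, 1)
    ssq += eps * eps

sample_var = ssq / n
theory = round(omega / (1 - alpha - beta), 2)
print(round(sample_var, 2), theory)    # both near 2.0
```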
/* Example: Estimation of an ARCH(2) Process */
title 'IBM Stock Returns (daily)';
title2 '29jun1959 - 30jun1960';

data ibm;
   infile cards eof=last;
   input x @@;
   r = dif( log( x ) );
   time = _n_ - 1;
   output;
   return;
last:
   do i = 1 to 46;
      r = .;
      time + 1;
      output;
   end;
   return;
cards;
445 448 450 447 451 453 454 454 459 440 446 443 443 440
439 435 435 436 435 435 435 433 429 428 425 427 425 422
409 407 423 422 417 421 424 414 419 429 426 425 424 425
425 424 425 421 414 410 411 406 406 413 411 410 405 409
410 405 401 401 401 414 419 425 423 411 414 420 412 415
412 412 411 412 409 407 408 415 413 413 410 405 410 412
413 411 411 409 406 407 410 408 408 409 410 409 405 406
405 407 409 407 409 425 425 428 436 442 442 433 435 433
435 429 439 437 439 438 435 433 437 437 444 441 440 441
439 439 438 437 441 442 441 437 427 423 424 428 428 431
425 423 420 426 418 416 419 418 416 419 425 421 422 422
417 420 417 418 419 419 417 419 422 423 422 421 421 419
418 421 420 413 413 408 409 415 415 420 420 424 426 423
423 425 431 436 436 440 436 443 445 439 443 445 450 461
471 467 462 456 464 463 465 464 456 460 458 453 453 449
447 453 450 459 457 453 455 453 450 456 461 463 463 461
465 473 473 475 499 485 491 496 504 504 509 511 524 525
541 531 529 530 531 527 525 519 514 509 505 513 525 519
519 522 522
;

proc gplot data=ibm;
   plot r*time / vref=0;
   symbol1 i=join v=none;
run;

proc autoreg data=ibm maxit=50;
   model r = / noint garch=(q=2);
   output out=a cev=v;
run;

data b;
   set a;
   length type $ 8.;
   if r ^= . then do;
      type = 'ESTIMATE'; output;
   end;
   else do;
      type = 'FORECAST'; output;
   end;
run;

proc gplot data=b;
   plot v*time=type / href=254 vaxis=.00010 to .00035 by .00005;
   symbol1 i=join v=none;
   symbol2 i=join v=plus;
run; quit;
SAS: Simple GARCH Model with Normally Distributed Residuals
A simple GARCH(p,q) series can be simulated as follows.
%let df = 7.5;
%let sig1 = 1;
%let sig2 = 0.1 ;
%let var2 = 2.5;
%let nobs = 1000 ;
%let nobs2 = 2000 ;
%let arch0 = 0.1 ;
%let arch1 = 0.2 ;
%let garch1 = 0.75 ;
%let intercept = 0.5 ;
data normal;
lu = &var2;
lh = &var2;
do i= -500 to &nobs ;
/* GARCH(1,1) with normally distributed residuals */
h = &arch0 + &arch1*lu**2 + &garch1*lh;
u = sqrt(h) * rannor(12345) ;
y = &intercept + u;
lu = u;
lh = h;
if i > 0 then output;
end;
run;
To estimate a simple GARCH model, you can use the AUTOREG procedure.
Use the GARCH= option to specify the GARCH model, and the (P= , Q= ) suboptions to specify the orders.
proc autoreg data = normal ;
/* Estimate GARCH(1,1) with normally distributed residuals with AUTOREG*/
model y = / garch = ( q=1,p=1 ) ;
run ;
quit ;
OR
/* Estimate GARCH(1,1) with normally distributed residuals with MODEL*/
proc model data = normal ;
parms arch0 .1 arch1 .2 garch1 .75 ;
/* mean model */
y = intercept ;
/* variance model */
h.y = arch0 + arch1*xlag(resid.y**2,mse.y) +
garch1*xlag(h.y,mse.y) ;
/* fit the model */
fit y / method = marquardt fiml ;
run ;
quit ;
Figure: Test procedure for misspecifications (flow chart).
• Residual tests: autocorrelation (Breusch-Godfrey LM test), heteroskedasticity (White test, ARCH test), BDS test, RESET test, McLeod-Li test, Jarque-Bera, nonlinearity.
• Coefficient tests: CUSUM, CUSUMQ, recursive residuals, parameter stability.
→ Adequate final model.
Residual Analysis
• Autocorrelation Test for Residuals
• Portmanteau Test for Residuals
• Ljung-Box Test
• McLeod-Li Test
• Turning Point Test for Residuals
McLeod-Li Test
• The McLeod & Li test looks at the autocorrelation function of the squares
of the prewhitened data and tests whether corr(e_t², e_{t-k}²) is nonzero for
some k; it can be considered an LM statistic against ARCH effects.
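The McLeod-Li idea can be sketched as a Ljung-Box statistic computed on the squared (prewhitened) residuals; under the null of no ARCH effects it is approximately chi-squared with K degrees of freedom. The helper below is an illustrative implementation, not taken from a library:

```python
# McLeod-Li sketch: Ljung-Box statistic on squared residuals.
import math
import random

random.seed(7)

def mcleod_li(resid, K=10):
    e2 = [r * r for r in resid]
    n = len(e2)
    m = sum(e2) / n
    den = sum((v - m) ** 2 for v in e2)
    q = 0.0
    for k in range(1, K + 1):
        rk = sum((e2[t] - m) * (e2[t - k] - m) for t in range(k, n)) / den
        q += rk * rk / (n - k)
    return n * (n + 2) * q          # ~ chi2(K) under "no ARCH effects"

# iid noise: statistic should be modest (chi2(10) has mean 10)
iid = [random.gauss(0, 1) for _ in range(3000)]

# ARCH(1) series: statistic should be large
a0, a1, y_prev, arch = 1.0, 0.3, 0.0, []
for _ in range(3000):
    y_prev = math.sqrt(a0 + a1 * y_prev ** 2) * random.gauss(0, 1)
    arch.append(y_prev)

print(mcleod_li(iid) < mcleod_li(arch))  # True: ARCH triggers the test
```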