Massachusetts Institute of Technology
Department of Economics
Working Paper Series

Second Order Expansion of T-Statistic in Autoregressive Models

Anna Mikusheva

Working Paper 07-26
October 12, 2007

Room E52-251
50 Memorial Drive
Cambridge, MA 02142

This paper can be downloaded without charge from the
Social Science Research Network Paper Collection at
http://ssrn.com/abstract=1021364
Second Order Expansion of t-statistic in Autoregressive Models

by Anna Mikusheva^
MIT, Department of Economics

Abstract

The purpose of this paper is to obtain a second order expansion of the t-statistic in the AR(1) model in the local to unity asymptotic approach. I show that Hansen's (1999) method for confidence set construction achieves a second order improvement in the local to unity asymptotic approach compared with Stock's (1991) and Andrews' (1993) methods.

Key Words: autoregressive process, confidence set, local to unity asymptotics, uniform convergence
1 Introduction

The paper deals with inferences about the persistence parameter p in AR(1) models. The classical theory is based on classical asymptotics in the setup when the true value of p (the AR coefficient) is considered to be fixed and the sample size n converges to infinity. The classical Wald type confidence interval typically has low coverage in finite samples, especially when the true value of p is close to unity, as happens for most macroeconomic time series. The classical asymptotic laws (the CLT and the Law of Large Numbers) do not hold uniformly over the interval p in (0, 1); rather, the convergence becomes slower as p approaches 1, and both laws fail for p = 1. An alternative asymptotic approach, local to unity asymptotics, considers sequences of models with p_n = 1 + c/n as n goes to infinity. According to Mikusheva (2007) and Andrews and Guggenberger (2007a,b), local to unity asymptotics leads to uniform inferences on p, whereas classical asymptotics does not.

There are at least three methods that can be used to construct asymptotically correct confidence sets for p: the method based on the local to unity asymptotic approach (Stock (1991)), the parametric grid bootstrap (Andrews (1993)) and the non-parametric grid bootstrap (Hansen (1999)).

_______________
^The paper constitutes the second part of my PhD dissertation at Harvard University. I am grateful to my thesis committee Jim Stock, Marcelo Moreira and Gary Chamberlain for helpful comments. E-mail for correspondence: amikushe@mit.edu
The validity of these three methods was proved in Mikusheva (2007).
This paper compares the three methods on the grounds of the accuracy of the asymptotic approximation they provide. All three methods are first order correct, that is, the coverage of the confidence sets uniformly converges to the confidence level as the sample size increases. The question I address is the speed of this convergence. It is well known that Hansen's grid bootstrap achieves a second order refinement in the classical asymptotic approach, whereas the two other methods (Andrews' and Stock's) do not. I address the same question in the local to unity asymptotic approach. My answer is that the non-parametric grid bootstrap (Hansen's method) achieves the second order refinement, that is, the speed of coverage probability convergence is o(n^{-1/2}), whereas the other two methods in general guarantee only O(n^{-1/2}) speed of convergence in the local to unity asymptotic approach. To compare the three methods I find an asymptotic expansion of the t-statistic around its limit in local to unity asymptotics.
A second-order distributional expansion is an approximation of the unknown distribution function of the statistic of interest (the t-statistic in our case) by some other function up to the order of o(n^{-1/2}). One example of a second order distributional expansion is given by the first two terms of the well-known Edgeworth expansion. There are several differences between the expansion obtained in this paper and the Edgeworth expansion. First of all, the Edgeworth expansion is an expansion around a normal distribution. In our case we expand the t-statistic around its local to unity asymptotic limit, which is a non-normal distribution. Secondly, it is known that the first two terms of the Edgeworth expansion do not constitute a distribution function by themselves. In particular, their sum can be non-monotonic and need not increase from 0 to 1. One special feature of my expansion is that it approximates the distribution function of the t-statistic by a cumulative distribution function (cdf) that can be easily simulated. And finally, as opposed to the Edgeworth expansion, which comes from expanding the characteristic function, my expansion comes from stochastic embedding and the strong approximation principle.
The same idea was used in a very inspirational work by Park (2003a). He obtained a second order expansion of the Dickey-Fuller t-statistic for testing a unit root. The expansion I obtain is also a "probabilistic" one. That is, I construct a random variable on the same probability space as the t-statistic in such a way that the difference between the constructed variable and the t-statistic is of order o(n^{-1/2}) in probability. I show that under additional moment assumptions it leads to a second order "distributional" expansion. The distributional expansion allows me to show that Hansen's grid bootstrap achieves the second order improvement in the local to unity setting compared to Andrews' method and the local to unity asymptotic distribution. The intuition for the non-parametric grid bootstrap improvement is the classical one: Hansen's grid bootstrap uses the information about the distribution of the error terms.
The paper contributes to the literature on bootstrapping autoregressive processes and closes the discussion on making inferences on persistence in the AR(1) model. Here are some of the known results on the bootstrap of AR models. Bose (1988) showed in classical asymptotics that the usual bootstrap provides a second order improvement compared to the OLS asymptotic distribution. However, Basawa et al. (1991) showed that the usual bootstrap fails (has asymptotically wrong size) if the true process has a unit root. Their result can be easily generalized to local to unity sequences. Park (2003b) showed that the usual bootstrap achieves higher accuracy than the asymptotic normal approximation of the t-statistic for weakly integrated sequences (for sequences with AR coefficient converging to the unit root with a speed slower than 1/n). The intuition behind Park's result is that the ordinary bootstrap uses the information about closeness of the AR coefficient to the unit root. His expansion is non-standard, and the reason for the bootstrap improvement is also not the usual one (usually the bootstrap achieves higher efficiency due to using information about the distribution of the error term). I get many ideas from a paper by Park (2003a), where he proves the second order improvement of bootstrapped tests for the unit root. He found an asymptotic expansion of the t-statistic for the unit root in terms of functionals of Brownian motion. My expansions for local to unity sequences bear a similar idea to his.

The rest of the paper is organized in the following way. Section 2 introduces notations. Section 3 obtains a probabilistic embedding of error terms and a probabilistic expansion of the t-statistic. Section 4 shows that the probabilistic expansion from the previous section leads to a distributional expansion. Section 5 establishes a similar expansion for a bootstrapped statistic and obtains the main result of the essay. All proofs are left to the Appendix.
2 Notations and preliminary results

Let us have a process

y_j = p y_{j-1} + \varepsilon_j, \quad j = 1, \dots, n.   (1)

We assume that y_0 = 0. The error terms \varepsilon_j are iid with mean zero, unit variance and finite absolute moment of order r. The procedure of testing and constructing confidence sets is based on the t-statistic. Let

t(y, p, n) = \frac{\sum_{j=1}^{n}(y_j - p y_{j-1}) y_{j-1}}{s \sqrt{\sum_{j=1}^{n} y_{j-1}^2}}

be the t-statistic for testing the true value of the parameter p using the sample {y_j}_{j=1}^n (here s^2 is the usual OLS estimate of the error variance). The classical asymptotic approach states that for every fixed |p| < 1, as n increases to infinity, we have

t(y, p, n) \Rightarrow N(0, 1).
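As a quick numerical illustration of this classical normality (my own sketch, not part of the paper; the helper names are mine), one can simulate the t-statistic for a fixed |p| < 1 and check that its simulated mean and standard deviation are close to 0 and 1:

```python
import numpy as np

def simulate_ar1(rho, n, rng):
    """Simulate y_j = rho * y_{j-1} + eps_j with y_0 = 0 and iid N(0,1) errors."""
    eps = rng.standard_normal(n)
    y = np.empty(n)
    prev = 0.0
    for j in range(n):
        prev = rho * prev + eps[j]
        y[j] = prev
    return y

def t_stat(y, rho0):
    """t-statistic for testing that the AR(1) coefficient equals rho0."""
    y_lag = np.concatenate(([0.0], y[:-1]))      # y_0 = 0 by assumption
    denom = np.sum(y_lag ** 2)
    rho_hat = np.sum(y * y_lag) / denom          # OLS estimate
    s2 = np.mean((y - rho_hat * y_lag) ** 2)     # OLS residual variance
    return np.sum((y - rho0 * y_lag) * y_lag) / np.sqrt(s2 * denom)

rng = np.random.default_rng(0)
stats = np.array([t_stat(simulate_ar1(0.5, 500, rng), 0.5) for _ in range(2000)])
print(stats.mean(), stats.std())  # roughly 0 and 1 for a fixed |rho| < 1
```

The small negative bias in the simulated mean reflects the well known finite-sample bias of the OLS estimate of the AR coefficient.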
According to the local to unity asymptotic approach, if p_n = 1 + c/n, c \le 0, then

t(y, p_n, n) \Rightarrow \frac{\int_0^1 J_c(x)\,dw(x)}{\sqrt{\int_0^1 J_c^2(x)\,dx}},

where J_c(x) = \int_0^x e^{c(x-s)}\,dw(s) is an Ornstein-Uhlenbeck process and w(\cdot) is a standard Brownian motion.
As was shown in Mikusheva (2007), the classical asymptotic approximation is not uniform. In particular, if z_\alpha is the \alpha-quantile of the standard normal distribution, then

\liminf_{n\to\infty} \inf_{|p|<1} P_p\{ z_{\alpha/2} \le t(y,p,n) \le z_{1-\alpha/2} \} < 1 - \alpha.

As a result, the usual OLS confidence set would have poor coverage in finite samples if we allow p to be arbitrarily close to the unit root.
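The distortion is easy to see numerically: near the unit root the finite sample distribution of the t-statistic is far from N(0,1), with its median well below zero, so intervals based on normal quantiles miss systematically on one side. A minimal Monte Carlo sketch of mine (function names are my own):

```python
import numpy as np

def t_stat_at_true(rho, n, rng):
    """t-statistic evaluated at the true AR(1) coefficient (y_0 = 0)."""
    eps = rng.standard_normal(n)
    y = np.empty(n)
    prev = 0.0
    for j in range(n):
        prev = rho * prev + eps[j]
        y[j] = prev
    y_lag = np.concatenate(([0.0], y[:-1]))
    denom = np.sum(y_lag ** 2)
    rho_hat = np.sum(y * y_lag) / denom
    s2 = np.mean((y - rho_hat * y_lag) ** 2)
    return np.sum((y - rho * y_lag) * y_lag) / np.sqrt(s2 * denom)

rng = np.random.default_rng(0)
meds = {}
for rho in (0.5, 1.0):   # far from, and at, the unit root
    ts = [t_stat_at_true(rho, 200, rng) for _ in range(3000)]
    meds[rho] = float(np.median(ts))
print(meds)              # the median drifts well below 0 as rho -> 1
```

At rho = 1 the simulated distribution is the (left-skewed) Dickey-Fuller distribution rather than the standard normal.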
The local to unity asymptotic approach, on the contrary, is uniform (Mikusheva (2007), Theorem 2). Namely,

\lim_{n\to\infty} \sup_{p\in[0,1]} \sup_x \left| P_p\{ t(y,p,n) \le x \} - F_c(x) \right| = 0,

where F_c(x) = P\{ \int_0^1 J_c(t)dw(t) / \sqrt{\int_0^1 J_c^2(t)dt} \le x \} with c = n \log(p).

The use of local to unity asymptotics in order to construct a confidence set was suggested by Stock (1991). The method is to test a set of hypotheses H_0: p = p_0 and to collect the acceptance set. The test compares the t-statistic t(y, p_0, n) with quantiles of F_{c_0}(x), c_0 = n \log(p_0) (in practice the testing could be performed over a fine grid of values of p_0).

Two alternatives to the procedure above are Andrews' parametric grid bootstrap and Hansen's non-parametric grid bootstrap. Both can be implemented as "grid" procedures: one needs critical values which are quantiles of the finite sample distribution F^*_{p_0}(x) of the t-statistic in a model with the null imposed. In particular, in Andrews' grid bootstrap the critical values are taken as quantiles of the finite sample distribution of the t-statistic in an AR(1) model with normal errors. More accurately, let y^*_t = p_0 y^*_{t-1} + e_t, where the e_t are normal errors; then F^*_{p_0}(x) = P_{p_0}\{ t(y^*, p_0, n) \le x \}. In Hansen's grid bootstrap the errors e_t are instead sampled from the residuals of the initial OLS regression, and F^*_{p_0}(x) = P_{p_0}\{ t(y^*, p_0, n) \le x \} is again the finite sample distribution of the t-statistic in the bootstrapped model with the null imposed. The difference between the two grid bootstraps is the choice of the initial distribution of the error terms.

Mikusheva (2007) proved that all three methods are uniformly asymptotically correct. My goal is to explore the second order properties of the methods in the local to unity asymptotic approach. I will show that Hansen's bootstrap provides the second order improvement in the local to unity asymptotic approach. That is, I consider a sequence of models p_n = \exp\{c/n\} as n increases to infinity (this sequence of models is called a "nearly integrated" process). The goal is to obtain the second order expansion of t(y, p_n, n) along this sequence of models. The next section is devoted to the probabilistic expansion.
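The grid procedures described above can be sketched in code. The following is a minimal illustration of Hansen's non-parametric grid bootstrap (my own sketch with hypothetical helper names, a coarse grid and few bootstrap replications; a real implementation would refine both):

```python
import numpy as np

def t_stat(y, rho0):
    """t-statistic for H0: the AR(1) coefficient equals rho0 (y_0 = 0)."""
    y_lag = np.concatenate(([0.0], y[:-1]))
    denom = np.sum(y_lag ** 2)
    rho_hat = np.sum(y * y_lag) / denom
    s2 = np.mean((y - rho_hat * y_lag) ** 2)
    return np.sum((y - rho0 * y_lag) * y_lag) / np.sqrt(s2 * denom)

def ar1_path(rho, errors):
    """Build y_j = rho * y_{j-1} + e_j from a given error sequence, y_0 = 0."""
    y = np.empty(len(errors))
    prev = 0.0
    for j, e in enumerate(errors):
        prev = rho * prev + e
        y[j] = prev
    return y

def hansen_grid_bootstrap(y, grid, alpha=0.05, B=199, seed=0):
    """Accept rho0 iff t(y, rho0, n) lies between bootstrap quantiles of
    t(y*, rho0, n), where y* is built from resampled centered residuals."""
    rng = np.random.default_rng(seed)
    n = len(y)
    y_lag = np.concatenate(([0.0], y[:-1]))
    rho_hat = np.sum(y * y_lag) / np.sum(y_lag ** 2)
    resid = y - rho_hat * y_lag
    resid = resid - resid.mean()          # centered residuals
    accepted = []
    for rho0 in grid:
        t_obs = t_stat(y, rho0)
        t_boot = np.empty(B)
        for b in range(B):
            e_star = rng.choice(resid, size=n, replace=True)
            t_boot[b] = t_stat(ar1_path(rho0, e_star), rho0)
        lo, hi = np.quantile(t_boot, [alpha / 2, 1 - alpha / 2])
        if lo <= t_obs <= hi:
            accepted.append(rho0)
    return accepted

rng = np.random.default_rng(42)
y = ar1_path(0.9, rng.standard_normal(100))
grid = np.round(np.arange(0.70, 1.051, 0.05), 2)
cs = hansen_grid_bootstrap(y, grid)
print(cs)   # grid points not rejected at the 5% level
```

Andrews' parametric version is obtained by drawing `e_star` from a fitted normal distribution instead of the empirical residuals, and Stock's method by replacing the bootstrap quantiles with quantiles of the local to unity limit distribution.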
3 Stochastic embedding

Assumptions A. Assume that the error terms \varepsilon_j are i.i.d. with mean zero, variance \sigma^2 = 1 and E|\varepsilon_j|^r < \infty for some r > 2.

According to the Skorokhod embedding scheme, there exist a Brownian motion w and a sequence of iid variables \tau_j on an extended probability space such that the sequence of error terms has the same distribution as a sequence of stopped Brownian motions: \varepsilon_j/\sqrt{n} = w(T_{n,j}) - w(T_{n,j-1}), where we define T_{n,j} = \frac{1}{n}\sum_{i=1}^{j}\tau_i. It is also known that E\tau_j = \sigma^2 and E\tau_j^{r/2} \le K_r E|\varepsilon_j|^r, where K_r is an absolute constant.

Let us consider a sequence of random vectors B_n(t) = (w_n(t), V_n(t), U_n(t)) built from the normalized partial sums of \varepsilon_j, \tau_j - \sigma^2 and \varepsilon_j^2 - \sigma^2 respectively. Park (2003a) proved that B_n \Rightarrow B = (w, V, U), where B is a Brownian motion with covariance matrix \Sigma given by

\Sigma = \begin{pmatrix} 1 & \mu_3/3\sigma^3 & \mu_3/\sigma^3 \\ \mu_3/3\sigma^3 & \kappa/\sigma^4 & (\mu_4 - 3\sigma^4 + 3\kappa)/12\sigma^4 \\ \mu_3/\sigma^3 & (\mu_4 - 3\sigma^4 + 3\kappa)/12\sigma^4 & (\mu_4 - \sigma^4)/\sigma^4 \end{pmatrix}   (3)

Here E\varepsilon_j^2 = \sigma^2 = 1, \mu_3 = E\varepsilon_j^3, \mu_4 = E\varepsilon_j^4 and \kappa = E(\tau_j - \sigma^2)^2. Park (2003a) also proved that B_n and B can be defined on the same probability space in such a way that B_n \to^{a.s.} B. Let N(t) = w(1 + t) - w(1), and let M(t) be a Brownian motion independent of w.
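The embedding itself is easy to check by simulation. For a symmetric two-point error \varepsilon = \pm 1, the Skorokhod stopping time is the exit time of the Brownian motion from [-1, 1], for which the classical identity E\tau = E\varepsilon^2 = 1 holds and (by expanding the Laplace transform of the exit time) E\tau^2 = 5/3. A small Monte Carlo check of mine (the Brownian motion is discretized, so the simulated moments are only approximate):

```python
import numpy as np

def exit_time(a, b, dt, rng, chunk=4000, t_max=100.0):
    """First exit time of a discretized Brownian motion from (a, b)."""
    w0, t0 = 0.0, 0.0
    sd = np.sqrt(dt)
    while t0 < t_max:
        path = w0 + np.cumsum(sd * rng.standard_normal(chunk))
        hit = np.nonzero((path <= a) | (path >= b))[0]
        if hit.size:
            return t0 + (hit[0] + 1) * dt
        w0 = path[-1]
        t0 += chunk * dt
    return t_max   # truncation; essentially never reached for |a|, b of order 1

rng = np.random.default_rng(0)
taus = np.array([exit_time(-1.0, 1.0, 1e-3, rng) for _ in range(2000)])
print(taus.mean(), (taus ** 2).mean())  # roughly E(tau) = 1 and E(tau^2) = 5/3
```

Both simulated moments carry a small upward discretization bias, since the discrete monitoring detects the exit slightly late.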
Theorem 1 Let p_n = 1 + c/n, c < 0. Assume that \varepsilon_j satisfy the set of Assumptions A with r > 8. Then one has the following probabilistic expansions:

(a) \frac{y_k}{\sqrt{n}} - J_c(T_{n,k}) = -\frac{c}{\sqrt{n}} \int_0^{k/n} e^{c(k/n - s)} J_c(s)\, dV(s) + o_p(n^{-1/2});

(b) \frac{1}{n}\sum_{k=1}^{n} y_{k-1}\varepsilon_k = \int_0^1 J_c(x)\,dw(x) + \frac{1}{\sqrt{n}} J_c(1)M(V) + \frac{1}{\sqrt{n}}\Big( -c\int_0^1\int_0^t e^{c(t-s)}J_c(s)\,dV(s)\,dw(t) + J_c(1)N(V) + \frac12 M^2(V) + \frac12 U(1) \Big) + o_p(n^{-1/2});

(c) \frac{1}{n^2}\sum_{k=1}^n y_{k-1}^2 = \int_0^1 J_c^2(x)\,dx - \frac{2c}{\sqrt{n}}\int_0^1 J_c(x)\int_0^x e^{c(x-s)}J_c(s)\,dV(s)\,dx - \frac{1}{\sqrt{n}}\int_0^1 J_c^2(x)\,dV(x) + \frac{1}{\sqrt{n}}J_c^2(1)V + o_p(n^{-1/2});

(d) \frac{1}{n^{3/2}}\sum_{k=1}^n y_{k-1} = \int_0^1 J_c(x)\,dx - \frac{c}{\sqrt{n}}\int_0^1\int_0^x e^{c(x-s)}J_c(s)\,dV(s)\,dx - \frac{1}{\sqrt{n}}\int_0^1 J_c(x)\,dV(x) + \frac{1}{\sqrt{n}}J_c(1)V + o_p(n^{-1/2});

(e) t(y, p_n, n) = t^c + n^{-1/2} f + n^{-1/2} g + o_p(n^{-1/2}),

here V = V(1),

t^c = \frac{\int_0^1 J_c(x)\,dw(x)}{\sqrt{\int_0^1 J_c^2(x)\,dx}}, \qquad f = \frac{J_c(1)M(V)}{\sqrt{\int_0^1 J_c^2(x)\,dx}},

g = \frac{-c\int_0^1\int_0^t e^{c(t-s)}J_c(s)\,dV(s)\,dw(t) + J_c(1)N(V) + \frac12 M^2(V) + \frac12 U(1)}{\sqrt{\int_0^1 J_c^2(x)\,dx}} + \frac{t^c}{2\int_0^1 J_c^2(x)\,dx}\Big( 2c\int_0^1 J_c(x)\int_0^x e^{c(x-s)}J_c(s)\,dV(s)\,dx + \int_0^1 J_c^2(x)\,dV(x) - J_c^2(1)V \Big).

The expansions from Theorem 1 are probabilistic. Namely, we approximate a random variable t(y, p_n, n), whose distribution is unknown, by another random variable \xi_n (whose distribution is known or could be simulated) with accuracy o(n^{-1/2}) in probability: P\{ |\xi_n - t(y,p_n,n)| > \varepsilon n^{-1/2} \} \to 0. Probabilistic expansions are not of interest by themselves (since they are abstract constructions); rather, they are building blocks in getting the distributional expansions described in the next section.
The random variables on the right-hand side are functionals of several Brownian motions: B(t) = (w(t), V(t), U(t)) and M(t). The covariance matrix of B(t) depends only on some characteristics of \varepsilon_j, namely on the first four moments (\sigma^2, \mu_3, \mu_4) of the distribution function of \varepsilon_j and some characterization of its non-normality \kappa (the parameters are defined above). M(t) is independent of B(t). As a result, the distribution of the approximating variable depends only on \psi = (\sigma^2, \mu_3, \mu_4, \kappa, c). The distribution of the approximating variable can be easily simulated.
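For instance, the leading term t^c alone (the local to unity limit used by Stock's method) can be simulated by discretizing the Ornstein-Uhlenbeck process J_c; the second order terms f and g only add further functionals of the simulated paths. A minimal sketch of mine (Euler discretization, leading term only):

```python
import numpy as np

def simulate_t_limit(c, n_grid=1000, reps=4000, seed=0):
    """Draws of int_0^1 J_c dw / sqrt(int_0^1 J_c^2 dx) via Euler discretization."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_grid
    dw = np.sqrt(dt) * rng.standard_normal((reps, n_grid))
    J = np.zeros((reps, n_grid + 1))
    for i in range(n_grid):                    # Euler step for dJ = c*J*dt + dw
        J[:, i + 1] = J[:, i] * (1.0 + c * dt) + dw[:, i]
    num = np.sum(J[:, :-1] * dw, axis=1)       # Ito integral int J dw
    den = np.sqrt(np.sum(J[:, :-1] ** 2, axis=1) * dt)
    return num / den

draws = simulate_t_limit(c=-1.0)
print(np.quantile(draws, [0.05, 0.5, 0.95]))   # skewed, left-shifted quantiles
```

The simulated quantiles are visibly asymmetric around zero, which is exactly why normal critical values are a poor substitute near the unit root.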
Remark 1 If one has an exact unit root (c = 0), then the expansion is exactly equal to the expansion obtained by Park (2003a).

Remark 2 If \varepsilon_j are normally distributed, then V(t) is independent of w(\cdot). It implies that t(y, p_n, n) = t^c + n^{-1/2} u + o_p(n^{-1/2}), where u is a summand independent of t^c. So, according to this probabilistic expansion, Stock's and Andrews' methods are the same up to an independent summand of order O_p(n^{-1/2}). I show in the next section that they are the same distributionally up to the order of o(n^{-1/2}).
4 Distributional expansion

For making inferences we need asymptotic theory to approximate the distribution of the t-statistic t(y, n, p_n). In the previous section we established a probabilistic approximation. In particular, we found a sequence of random variables \xi_n with known distribution depending on a vector of parameters \psi (the distribution can be simulated if \psi is known) such that t(y, n, p_n) = \xi_n + o_p(n^{-1/2}) for p_n = 1 + c/n. That is,

\lim_{n\to\infty} P_{p_n}\Big\{ |t(y, n, p_n) - \xi_n| > \frac{\varepsilon}{\sqrt{n}} \Big\} = 0 \quad \text{for all } \varepsilon > 0.

The goal of this section is to get a distributional expansion. By a distributional expansion of the second order I mean a sequence of real-valued functions G_n(\cdot) such that

P\{ t(y, n, p_n) \le x \} = G_n(x) + o(n^{-1/2}).   (4)

In general, G_n(\cdot) is not required to be a cdf of any random variable.

An example of a distributional expansion is the second order Edgeworth expansion. Initially, the Edgeworth expansion was stated as an approximation to the distribution of normalized sums of random variables. Nowadays, Edgeworth type expansions have been obtained for many statistics having a normal limiting distribution. Traditionally, Edgeworth type expansions are obtained from expansions of characteristic functions. It is also known that usually the function G_n in an Edgeworth expansion is not monotonic; in particular, in many applications G_n is not a cdf of any random variable. In our setup the Edgeworth expansion does not exist since the limiting distribution is not normal. In this section I show that under some moment conditions our probabilistic expansion corresponds to a distributional expansion. Namely,

\sup_x \big| P_{p_n}\{ t(y, n, p_n) \le x \} - P\{ \xi_n \le x \} \big| = o(n^{-1/2}),

here \xi_n = t^c + n^{-1/2} f + n^{-1/2} g is from part (e) of Theorem 1. That is, G_n(x) = P\{ \xi_n \le x \} is a cdf. It depends on a parameter vector, in our case \psi.

Definition 1 (Park (2003a)) A random variable X has a distributional order o(T^{-a}) if P\{ |X| > T^{-a} \} \le T^{-a}.
Theorem 2 Let all assumptions of Theorem 1 hold. Then all o_p(T^{-1/2}) terms in statements (a)-(e) of Theorem 1 are of distributional order o(T^{-1/2}).

Corollary 1 If error terms are i.i.d. with mean zero and 8 finite moments, the following distributional expansion holds:

\sup_x \big| P\{ t(y, p_n, n) \le x \} - P\{ t^c + n^{-1/2} f + n^{-1/2} g \le x \} \big| = o(n^{-1/2}).
One can notice that G_n(x) = P\{ \xi_n \le x \} is a cdf, and that there is no "unique" distributional expansion even if we require that G_n be a cdf. This surprising fact is explained in the remark below.

Remark 3 Let G_n(x) = P\{ \xi_n \le x \} satisfy the distributional approximation (4), and let \xi_n and F be measurable with respect to a \sigma-algebra A. Let \eta \sim N(0,1) be independent of the \sigma-algebra A. Then \tilde G_n(x) = P\{ \xi_n + n^{-1/2} F \eta \le x \} also satisfies (4). That is, the additional term (which is of probabilistic order O_p(n^{-1/2})) has distributional impact of order o(n^{-1/2}). This point was made by Park (2003a). The idea is that the characteristic function of \xi_n + n^{-1/2} F \eta conditional on A is equal to e^{it\xi_n} up to terms of order O(n^{-1}).

It might seem strange that the probabilistic expansion of \frac{1}{n}\sum y_{j-1}\varepsilon_j has the term n^{-1/2} J_c(1) M(V) of order O_p(n^{-1/2}); this term has distributional impact of order O(n^{-1}). The statement is totally parallel to the remark above. Indeed, M(V) is distributionally \sqrt{V} M(1), where M(1) \sim N(0,1) and is independent of B(\cdot) = (w, V, U).
Remark 4 Combining Remarks 2 and 3 one can get the following. If error terms are normally distributed, then we have a distributional equivalence

P\{ t(z, n, p) \le x \} = P\{ t^c \le x \} + o(n^{-1/2}).

That is, the difference between quantiles constructed in Stock's and Andrews' methods is of the order o(n^{-1/2}). The two methods achieve the same accuracy up to the second order.
5 Bootstrapped expansion

5.1 Embedding

In Section 4 we got that the distribution of the t-statistic t(y, n, p_n) could be approximated by a sequence of functions G_n(x) = P\{ t^c + n^{-1/2} f + n^{-1/2} g \le x \}, where f and g are functionals of the Brownian motions B(\cdot) (covariance structure described in (3); it depends on \sigma^2, \mu_3, \mu_4, \kappa, c) and M (independent of B). The grid bootstrapped t-statistic has totally the same form, since the grid bootstrap uses the "true value" (not an estimator) of p (or c). The only difference between the initial and the grid bootstrapped statistics is a different distribution of the error term:

P\{ t(y^*, n, p) \le x \} = G^*_n(x) + o(n^{-1/2}),

where G^*_n(x) = P\{ t^c + n^{-1/2} f^* + n^{-1/2} g^* \le x \} with f^* and g^* the same functionals of B^*, M; the covariance structure of B^* depends on (\hat\sigma^2, \hat\mu_3, \hat\mu_4, \hat\kappa), the initial distribution of the bootstrapped errors. The next subsection states that the parameter vector (\hat\sigma^2, \hat\mu_3, \hat\mu_4, \hat\kappa) converges almost surely to (\sigma^2, \mu_3, \mu_4, \kappa) with a speed of O_p(n^{-1/2}), which would be enough to say that the second order terms in the expansions of the initial and grid bootstrapped statistics coincide up to the order of o(n^{-1/2}).

Theorem 3 Let us have an AR(1) process (1) with y_0 = 0 and p_n = 1 + c/n, c < 0, and error terms satisfying the set of Assumptions A with r > 8. Let us consider for every n a process y^*_j = p_n y^*_{j-1} + \varepsilon^*_j, y^*_0 = 0, where \varepsilon^*_j are i.i.d., sampled from the centered and normalized residuals of the initial regression. Then

\sup_x \big| P\{ t(y, n, p_n) \le x \} - P^*\{ t(y^*, n, p_n) \le x \mid y \} \big| = o(n^{-1/2}) \quad a.s.
Theorem 3 states that Hansen's grid bootstrap provides the second order improvement compared with Andrews' and Stock's methods in the local to unity asymptotic approach. The intuition for that is the usual one. The second order term depends on the parameters of the distribution of the error terms. Those parameters are well approximated by their sampled analogues. The non-parametric grid bootstrap uses sampled residuals whose parameters are very close to the population values. As a result, the refinement is achieved. The only parameter that could not be well estimated (on which the limiting expansion depends) is the local to unity parameter c. The grid bootstrap procedure uses the "true" value of c.

Theorem 3 is a statement obtained in the local to unity asymptotic approach. The statement that Hansen's grid bootstrap achieves a second order refinement in the classical asymptotics is an easy one. It could be obtained from the Edgeworth expansion along the lines suggested in Bose (1988). As a result, we should advise applied researchers to choose Hansen's grid bootstrap over Andrews' and Stock's methods.
5.2 Convergence of parameters

This subsection is a part of the proof of Theorem 3 from the previous subsection. Here we show that the parameter vector \psi = (\sigma^2, \mu_3, \mu_4, \kappa) could be well approximated by a sample analog (moments of residuals) \hat\psi = (\hat\sigma^2, \hat\mu_3, \hat\mu_4, \hat\kappa).

Lemma 1 Let the error terms \varepsilon_j satisfy the set of Assumptions A. Then there is a Skorokhod embedding for which \hat\psi \to \psi almost surely with speed O(n^{-1/2}).

The convergence of the third and fourth sample moments of residuals to their population analogues with a speed of O(n^{-1/2}) is the usual statement. For that we need to require enough moments of the error term; 8 moments should be enough. One parameter, \kappa, as was discussed above, is not intrinsic (it depends on the way the Skorokhod embedding is realized). The fact that \hat\kappa \to_p \kappa with speed O(n^{-1/2}) is not evident, mainly because the dependence of the moments of \tau on the distribution of \varepsilon is not explicit (most of the known constructions are not explicit). By messy calculation I got that in the initial Skorokhod construction published in Skorokhod's book (1965), E\tau^2 = \frac{5}{3} E\varepsilon^4. That would imply the speed of convergence we need.
6 Appendix. Proofs of results

We use the following results from Park (2003a).

Lemma 2 (Park (2003a), Lemma 3.5(a)) If r > 8, then

\frac{1}{\sqrt{n}} \sum_{j=1}^n \varepsilon_j = w(1) + n^{-1/2} M(V) + n^{-1/2} N(V) + o_p(n^{-1/2}),

where V = V(1).
About the convergence of stochastic integrals:

Lemma 3 (Kurtz and Protter) For each n, let (X_n, Y_n) be an F^n-adapted process with sample paths in the Skorokhod space D and let Y_n be an F^n-semimartingale. Suppose that Y_n = M_n + A_n + Z_n, where M_n is an F^n-martingale, A_n is an F^n-adapted finite variation process and Z_n is constant except for finitely many discontinuities. Let N_n(t) denote the number of discontinuities of the process Z_n on the interval [0, t], and suppose N_n is stochastically bounded for each t > 0. Suppose that for each a there exist stopping times \{\tau^a_n\} such that P\{\tau^a_n \le a\} \le 1/a and \sup_n E\big[ [M_n]_{t \wedge \tau^a_n} + T_{t \wedge \tau^a_n}(A_n) \big] < \infty. If (X_n, Y_n, Z_n) \Rightarrow (X, Y, Z) in the Skorokhod topology, then Y is a semimartingale with respect to a filtration to which X and Y are adapted, and (X_n, Y_n, \int X_n dY_n) \Rightarrow (X, Y, \int X dY) in the Skorokhod topology. If (X_n, Y_n, Z_n) \to (X, Y, Z) in probability, then convergence in probability holds in the conclusion.
Proof of Theorem 1.

(a) Using the embedding \varepsilon_j/\sqrt{n} = w(T_{n,j}) - w(T_{n,j-1}), write

\frac{y_k}{\sqrt{n}} = \sum_{j=1}^k e^{c(k-j)/n} \big( w(T_{n,j}) - w(T_{n,j-1}) \big) = \sum_{j=1}^k e^{c(T_{n,k} - T_{n,j})} \big( w(T_{n,j}) - w(T_{n,j-1}) \big) + R_{1,n} = J_c(T_{n,k}) - \frac{c}{\sqrt{n}} \int_0^{k/n} e^{c(k/n - s)} J_c(s)\, dV(s) + R_{1,n} + R_{2,n} + R_{3,n},

where we have the following lines of reasoning. The difference of exponents satisfies

e^{c(k-j)/n} - e^{c(T_{n,k} - T_{n,j})} = -c\, e^{c(T_{n,k} - T_{n,j})} \Big( T_{n,k} - T_{n,j} - \frac{k-j}{n} \Big) + O\Big( \big( T_{n,k} - T_{n,j} - \tfrac{k-j}{n} \big)^2 \Big),

and T_{n,k} - T_{n,j} - (k-j)/n = n^{-1/2} \big( V_n(k/n) - V_n(j/n) \big), so the leading part of the resulting sum,

-\frac{c}{\sqrt{n}} \sum_{j=1}^k e^{c(k-j)/n} \big( V_n(k/n) - V_n(j/n) \big) \big( w(T_{n,j}) - w(T_{n,j-1}) \big),

converges to the integral in the display; R_{1,n} collects the replacement error in the exponents, R_{2,n} the quadratic terms and R_{3,n} the discretization error. All three remainders are o_p(n^{-1/2}); see the summary at the end of the proof.
(b) In the next theorem we will need to estimate the distributional impact of the o_p terms, so we keep track of them. Using statement (a),

\frac{1}{n} \sum_{k=1}^n y_{k-1} \varepsilon_k = \sum_{k=1}^n J_c(T_{n,k-1}) \big( w(T_{n,k}) - w(T_{n,k-1}) \big) - \frac{c}{\sqrt{n}} \int_0^1 \int_0^t e^{c(t-s)} J_c(s)\, dV(s)\, dw(t) + R_{4,n} + o_p(n^{-1/2}).

For the first sum,

\sum_{k=1}^n J_c(T_{n,k-1}) \big( w(T_{n,k}) - w(T_{n,k-1}) \big) = \int_0^{T_{n,n}} J_c(s)\, dw(s) - \sum_{k=1}^n \int_{T_{n,k-1}}^{T_{n,k}} \big( J_c(s) - J_c(T_{n,k-1}) \big)\, dw(s).

By the definition of the O-U process, J_c(s) = w(s) + c \int_0^s J_c(t)\, dt, so

\sum_{k=1}^n \int_{T_{n,k-1}}^{T_{n,k}} \big( J_c(s) - J_c(T_{n,k-1}) \big)\, dw(s) = \sum_{k=1}^n \int_{T_{n,k-1}}^{T_{n,k}} \big( w(s) - w(T_{n,k-1}) \big)\, dw(s) + R_{5,n},

and the Ito identity \int_{t_0}^{t_1} (w - w(t_0))\, dw = \frac12 \big[ (w(t_1) - w(t_0))^2 - (t_1 - t_0) \big] shows that this sum equals \frac12 \big[ \frac{1}{n}\sum_k \varepsilon_k^2 - T_{n,n} \big], which contributes the \frac{1}{2\sqrt{n}} U(1) term (up to pieces that combine with the \int_1^{T_{n,n}} corrections below). Next, \int_0^{T_{n,n}} J_c\, dw = \int_0^1 J_c\, dw + \int_1^{T_{n,n}} J_c\, dw, and since d(J_c^2(x)) = 2 J_c\, dw + (2c J_c^2(x) + 1)\, dx,

\int_1^{T_{n,n}} J_c(s)\, dw(s) = \frac12 \big( J_c^2(T_{n,n}) - J_c^2(1) \big) - \int_1^{T_{n,n}} \Big( c J_c^2(x) + \frac12 \Big) dx = J_c(1) \big( w(T_{n,n}) - w(1) \big) + \frac12 \big( w(T_{n,n}) - w(1) \big)^2 - c J_c^2(1) (T_{n,n} - 1) - \frac12 (T_{n,n} - 1) + R_{6,n} + R_{7,n},

with R_{7,n} = \int_1^{T_{n,n}} \big( J_c^2(x) - J_c^2(1) \big) dx and, similarly, R_{8,n} = \int_1^{T_{n,n}} \big( J_c(x) - J_c(1) \big) dx used below. By statement (a) of Lemma 2,

w(T_{n,n}) - w(1) = n^{-1/2} M(V) + n^{-1/2} N(V) + o_p(n^{-1/2}).

Collecting the pieces, we obtain as a result

\frac{1}{n} \sum_{k=1}^n y_{k-1} \varepsilon_k = \int_0^1 J_c(x)\, dw(x) + \frac{1}{\sqrt{n}} J_c(1) M(V) + \frac{1}{\sqrt{n}} \Big( -c \int_0^1 \int_0^t e^{c(t-s)} J_c(s)\, dV(s)\, dw(t) + J_c(1) N(V) + \frac12 M^2(V) + \frac12 U(1) \Big) + o_p(n^{-1/2}).

The last equality is due to statement (a) of Lemma 2.
(c) Using the statement of part (a),

\frac{1}{n^2} \sum_{k=1}^n y_{k-1}^2 = \frac{1}{n} \sum_{k=1}^n J_c^2(T_{n,k-1}) - \frac{2c}{\sqrt{n}} \cdot \frac{1}{n} \sum_{k=1}^n J_c(T_{n,k-1}) \int_0^{(k-1)/n} e^{c((k-1)/n - s)} J_c(s)\, dV(s) + o_p(n^{-1/2}) = \frac{1}{n} \sum_{k=1}^n J_c^2(T_{n,k-1}) - \frac{2c}{\sqrt{n}} \int_0^1 J_c(x) \int_0^x e^{c(x-s)} J_c(s)\, dV(s)\, dx + R_{9,n} + o_p(n^{-1/2}),

where, with B_1(x) = -c J_c(x) \int_0^x e^{c(x-s)} J_c(s)\, dV(s),

R_{9,n} = \frac{1}{\sqrt{n}} \sum_{k=1}^n \int_{(k-1)/n}^{k/n} \big( B_1(k/n) - B_1(x) \big) dx = o_p(n^{-1/2}).

Now,

\frac{1}{n} \sum_{k=1}^n J_c^2(T_{n,k-1}) - \int_0^1 J_c^2(x)\, dx = \sum_{k=1}^n J_c^2(T_{n,k-1}) \Big( \frac{1}{n} - (T_{n,k} - T_{n,k-1}) \Big) - \sum_{k=1}^n \int_{T_{n,k-1}}^{T_{n,k}} \big( J_c^2(t) - J_c^2(T_{n,k-1}) \big) dt + \int_1^{T_{n,n}} J_c^2(t)\, dt.

Let us consider each term separately:

\sum_{k=1}^n J_c^2(T_{n,k-1}) \Big( \frac{1}{n} - (T_{n,k} - T_{n,k-1}) \Big) = -\frac{1}{\sqrt{n}} \sum_{k=1}^n J_c^2(T_{n,k-1}) \big( V_n(k/n) - V_n((k-1)/n) \big) = -\frac{1}{\sqrt{n}} \int_0^1 J_c^2(x)\, dV(x) + R_{10,n};

the second term is R_{11,n} = o_p(n^{-1/2}); and

\int_1^{T_{n,n}} J_c^2(t)\, dt = J_c^2(1) (T_{n,n} - 1) + R_{7,n} = \frac{1}{\sqrt{n}} J_c^2(1) V + R_{7,n} + o_p(n^{-1/2}).

Summing up:

\frac{1}{n^2} \sum_{k=1}^n y_{k-1}^2 = \int_0^1 J_c^2(x)\, dx - \frac{2c}{\sqrt{n}} \int_0^1 J_c(x) \int_0^x e^{c(x-s)} J_c(s)\, dV(s)\, dx - \frac{1}{\sqrt{n}} \int_0^1 J_c^2(x)\, dV(x) + \frac{1}{\sqrt{n}} J_c^2(1) V + o_p(n^{-1/2}).
(d) Using the statement of part (a),

\frac{1}{n^{3/2}} \sum_{k=1}^n y_{k-1} = \frac{1}{n} \sum_{k=1}^n J_c(T_{n,k-1}) - \frac{c}{\sqrt{n}} \int_0^1 \int_0^x e^{c(x-s)} J_c(s)\, dV(s)\, dx + o_p(n^{-1/2}).

Now,

\frac{1}{n} \sum_{k=1}^n J_c(T_{n,k-1}) - \int_0^1 J_c(x)\, dx = \sum_{k=1}^n J_c(T_{n,k-1}) \Big( \frac{1}{n} - (T_{n,k} - T_{n,k-1}) \Big) - \sum_{k=1}^n \int_{T_{n,k-1}}^{T_{n,k}} \big( J_c(t) - J_c(T_{n,k-1}) \big) dt + \int_1^{T_{n,n}} J_c(t)\, dt.

Let us consider each term separately:

\sum_{k=1}^n J_c(T_{n,k-1}) \Big( \frac{1}{n} - (T_{n,k} - T_{n,k-1}) \Big) = -\frac{1}{\sqrt{n}} \int_0^1 J_c(x)\, dV(x) + R_{12,n},

where

R_{12,n} = -\frac{1}{\sqrt{n}} \Big( \sum_{k=1}^n J_c(T_{n,k-1}) \big( V_n(k/n) - V_n((k-1)/n) \big) - \int_0^1 J_c(x)\, dV(x) \Big) = o_p(n^{-1/2});

the second term is R_{13,n}; and

\int_1^{T_{n,n}} J_c(t)\, dt = J_c(1) (T_{n,n} - 1) + R_{8,n} = \frac{1}{\sqrt{n}} J_c(1) V + R_{8,n} + o_p(n^{-1/2}).

Summing up:

\frac{1}{n^{3/2}} \sum_{k=1}^n y_{k-1} = \int_0^1 J_c(x)\, dx - \frac{c}{\sqrt{n}} \int_0^1 \int_0^x e^{c(x-s)} J_c(s)\, dV(s)\, dx - \frac{1}{\sqrt{n}} \int_0^1 J_c(x)\, dV(x) + \frac{1}{\sqrt{n}} J_c(1) V + o_p(n^{-1/2}).
Summary. Part (e) follows from (a)-(d) and a Taylor expansion. It remains to check that all the remainder terms are o_p(n^{-1/2}).

Terms R_{2,n}, R_{5,n} and R_{9,n} have a structure \xi_n = \sum_{k=1}^n \xi_{k,n}, where the \xi_{k,n} are i.i.d. across k and distributionally equal to \int C(t)\, dt over an interval of length n^{-1}, for dC(t) = C_1(t)\, dt + C_2(t)\, dw(t) with C_1 \in L_1, C_2 \in L_2. We have E\xi_{1,n} = 0 and E\xi_{1,n}^2 \le const \cdot n^{-3}. As a result, these terms are O_p(n^{-1}) = o_p(n^{-1/2}). We can also notice that Chebyshev's inequality implies that they are distributionally o(n^{-1/2}).

Terms R_{1,n} and R_{3,n} also have a structure \sum_{k=1}^n \xi_{k,n}, where the \xi_{k,n} are i.i.d. across k, with \xi_{k,n} = \int C(t)\, dw(t) and dC(t) = C_1\, dt. It is easy to see that E\xi_{1,n} = 0 and E\xi_{1,n}^2 \le const \cdot n^{-3}. It implies that these terms are probabilistically and distributionally o(n^{-1/2}).

Terms R_{4,n}, R_{6,n}, R_{10,n}, R_{11,n}, R_{12,n} and R_{13,n} are o_p(n^{-1/2}) by Lemma 3 on the convergence of stochastic integrals together with the convergence of the corresponding non-stochastic integrals.

Terms R_{7,n} and R_{8,n} have the form \int_1^{T_{n,n}} (D(s) - D(1))\, ds, where dD(s) = d_1\, dw + d_2\, dt. It is easy to see that they are O_p(n^{-1}) = o_p(n^{-1/2}).

For the distributional statements we also use:

Lemma 4 (Park (2003a)) If r > 4, then for any C > 0,

P\Big\{ \sup_{0 \le t \le 1} |B_n(t) - B(t)| > C \Big\} \le n^{1 - r/4} C^{-r/2} (1 + \sigma^{-r}) K (1 + E|\varepsilon_j|^r).
Proof of Theorem 2. We need to check that all remainder terms R_{i,n} used in the proof of Theorem 1 are distributionally of order o(n^{-1/2}). In the proof of Theorem 1 we already showed that the terms R_{5,n}, R_{6,n}, R_{11,n} and R_{13,n} are distributionally o(n^{-1/2}). Terms R_{2,n}, R_{3,n}, R_{10,n} and R_{12,n} have a form of stochastic integrals \frac{1}{\sqrt{n}} \int_0^1 \Phi(t)\, d(w(t) - w_n(t)) or \frac{1}{\sqrt{n}} \int_0^1 \Phi(t)\, d(V(t) - V_n(t)). Their distributional order is determined by quadratic variations, which have the forms \sup_{0 \le t \le 1} |V_n(t) - V(t)|^2 and \sup_{0 \le t \le 1} |w_n(t) - w(t)|^2. The order of the last expressions is controlled by Lemma 4: we might choose B_n and B defined on the same probability space in such a way that the suprema are small enough, and the terms are distributionally o(n^{-1/2}). Terms R_{7,n} and R_{8,n} have the form

\Big| \int_1^{T_{n,n}} (D(s) - D(1))\, ds \Big| \le \sup_{0 \le t} |D(t)| \cdot |T_{n,n} - 1|,

which is distributionally o(n^{-1/2}).
Proof of Lemma 1. First, I show that for Skorokhod's construction presented in Skorokhod's book (1965) we have E\tau^2 = \frac{5}{3} E\varepsilon^4. Let \tau_{a,b} be the smallest root of the equation (w(t) - a)(w(t) - b) = 0. Then

E e^{-\lambda \tau_{a,b}} = \frac{ \sinh(b\sqrt{2\lambda}) - \sinh(a\sqrt{2\lambda}) }{ \sinh((b - a)\sqrt{2\lambda}) }   (5)

and

(-1)^k \frac{d^k}{d\lambda^k} E e^{-\lambda \tau_{a,b}} \Big|_{\lambda = 0} = E \tau_{a,b}^k.   (6)

In the construction from Skorokhod's book (1965) the stopping time is defined as \tau = \tau_{\xi, G(\xi)}, where \xi is independent of w with P\{\xi \le x\} = F(x), and G is a function matching negative and positive values of the error distribution. We can notice that G(G(x)) = x and G(x)\, dF(G(x)) = x\, dF(x). The moments of \tau could be calculated using equation (6) for the characteristic function (5). By tedious but straightforward calculations one can obtain the formula E\tau^2 = \frac{5}{3} E\varepsilon^4. Since E\varepsilon^8 < \infty, by using Chebyshev's inequality one can get the statement of the lemma.
Proof of Theorem 3. Theorem 2 states the distributional expansions for the t-statistic and its grid bootstrapped analog:

\sup_x \big| P\{ t(y, n, p) \le x \} - G_n(x) \big| = o(n^{-1/2}),   (7)

here G_n(x) = P\{ t^c + n^{-1/2} f + n^{-1/2} g \le x \}, where f and g are functionals of the Brownian motions B and M; the covariance structure of B is described in (3) and depends on \sigma^2, \mu_3, \mu_4, \kappa, c; the Brownian motion M is independent of B. It could be seen from the proof of Theorem 2 that the term o(n^{-1/2}) in equation (7) is bounded by a constant depending on the eighth moment \mu_8 of the error term times n^{-1/2 - \delta} for some \delta > 0. For almost any realization of an infinite sequence of error terms (\varepsilon_1, \dots, \varepsilon_n, \dots) and its finite subsequence of length n we would have

\sup_x \big| P\{ t(y^*, n, p) \le x \} - G^*_n(x) \big| \le Const(\hat\mu_8)\, n^{-1/2 - \delta},

here G^*_n(x) = P\{ t^c + n^{-1/2} f^* + n^{-1/2} g^* \le x \}, where f^* and g^* are the same functionals of B^*, M^*; the covariance structure of B^* depends on \hat\sigma^2, \hat\mu_3, \hat\mu_4, \hat\kappa. Since r > 8, Const(\hat\mu_8)\, n^{-1/2 - \delta} = o(n^{-1/2}) a.s. As a result,

\sup_x \big| P\{ t(y^*, n, p) \le x \} - G^*_n(x) \big| = o(n^{-1/2}) \quad a.s.   (8)

Combining equations (7) and (8) with Lemma 1, one obtains the statement of Theorem 3.
References

Andrews D.W.K. (1993): "Exactly Median-Unbiased Estimation of First Order Autoregressive/Unit Root Models," Econometrica, 61(1), 139-165.

Andrews D.W.K. and P. Guggenberger (2007a): "Hybrid and Size-corrected Subsample Methods," Cowles Foundation Discussion Paper 1606.

Andrews D.W.K. and P. Guggenberger (2007b): "The Limit of Finite Sample Size and a Problem with Subsampling," Cowles Foundation Discussion Paper 1605.

Basawa I.V., A.K. Mallik, W.P. McCormick, J.H. Reeves and R.L. Taylor (1991): "Bootstrapping Unstable First-Order Autoregressive Processes," Ann. Statist., 19(2), 1098-1101.

Hansen B.E. (1999): "The Grid Bootstrap and the Autoregressive Model," The Review of Economics and Statistics, 81(4), 594-607.

Mikusheva A. (2007): "Uniform Inference in Autoregressive Models," Econometrica, 75(5), 1411-1452.

Park J.Y. (2003a): "Bootstrap Unit Root Tests," Econometrica, 71(6), 1845-1895.

Park J.Y. (2003b): "Weak Unit Roots," unpublished manuscript.

Phillips P.C.B. (1987): "Toward a Unified Asymptotic Theory for Autoregression," Biometrika, 74(3), 535-547.

Skorokhod A.V. (1965): "Studies in the Theory of Random Processes," Addison-Wesley Publishing Company, Reading, Massachusetts.

Stock J. (1991): "Confidence Intervals for the Largest Autoregressive Root in US Macroeconomic Time Series," Journal of Monetary Economics, 28, 435-459.