Statistics & Probability Letters 30 (1996) 165-170
Complete convergence of moving average processes under
dependence assumptions 1
Li-Xin Zhang
Department of Mathematics, Hangzhou University, Hangzhou 310028, China
Received July 1995
Abstract
Let $\{Y_i; -\infty < i < \infty\}$ be a doubly infinite sequence of identically distributed and $\phi$-mixing random variables, and let $\{a_i; -\infty < i < \infty\}$ be an absolutely summable sequence of real numbers. In this paper, we prove the complete convergence of $\{\sum_{k=1}^{n} \sum_{i=-\infty}^{\infty} a_{i+k} Y_i / n^{1/t};\ n \ge 1\}$ under some suitable conditions.
AMS classification: 60G50; 60F15
Keywords: Complete convergence; Moving average; $\phi$-mixing
We assume that $\{Y_i; -\infty < i < \infty\}$ is a doubly infinite sequence of identically distributed random variables. Let $\{a_i; -\infty < i < \infty\}$ be an absolutely summable sequence of real numbers and
$$X_k = \sum_{i=-\infty}^{\infty} a_{i+k} Y_i, \qquad k \ge 1.$$
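For intuition, the process can be simulated by truncating the doubly infinite sum. The following minimal sketch (not from the paper) uses geometric weights and Gaussian noise as arbitrary stand-ins for an absolutely summable $\{a_i\}$ and an identically distributed $\{Y_i\}$; the truncation level $M$ is likewise an arbitrary choice.

```python
import numpy as np

# Sketch: approximate X_k = sum_i a_{i+k} Y_i by truncating the
# weights to |j| <= M (with j = i + k), so X_k = sum_{|j|<=M} a_j Y_{j-k}.
# Geometric weights and N(0,1) noise are illustrative choices only.
M = 100
a = 0.5 ** np.abs(np.arange(-M, M + 1))   # a_{-M}, ..., a_M

def moving_average(n, seed=0):
    """Return one realisation of (X_1, ..., X_n)."""
    rng = np.random.default_rng(seed)
    # Y_i is needed for i = j - k with |j| <= M and 1 <= k <= n,
    # i.e. for i = -M - n, ..., M - 1: a block of n + 2M variables.
    y = rng.standard_normal(n + 2 * M)    # y[p] stores Y_{p - M - n}
    # X_k = sum_u a[u] * y[u + n - k] is a sliding dot product, i.e. a
    # correlation of y with a, read off in reverse order.
    return np.correlate(y, a, mode="valid")[::-1]

x = moving_average(1000)
print(x[:5])
```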
Under independence assumptions, i.e., when $\{Y_i; -\infty < i < \infty\}$ is a sequence of independent random variables, many limiting results have been obtained for the moving average process $\{X_k; k \ge 1\}$. For example, Ibragimov (1962) established the central limit theorem for $\{X_k; k \ge 1\}$, Burton and Dehling (1990) obtained a large deviation principle for $\{X_k; k \ge 1\}$ assuming $E\exp(tY_1) < \infty$ for all $t$, and Li et al. (1992) obtained the following result on complete convergence.
Theorem A. Suppose $\{Y_i; -\infty < i < \infty\}$ is a sequence of independent and identically distributed random variables. Let $\{X_k; k \ge 1\}$ be defined as above and $1 \le t < 2$. Then $EY_1 = 0$ and $E|Y_1|^{2t} < \infty$ imply
$$\sum_{n=1}^{\infty} P\left\{\left|\sum_{k=1}^{n} X_k\right| \ge \epsilon n^{1/t}\right\} < \infty \qquad \text{for all } \epsilon > 0.$$
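As a rough numerical illustration of Theorem A (a sketch, not a proof), one can estimate $p_n = P\{|\sum_{k=1}^{n} X_k| \ge \epsilon n^{1/t}\}$ by Monte Carlo and watch it decay; this continues the simulation sketch above, and $t$, $\epsilon$, and the replication count are arbitrary choices.

```python
# Monte Carlo estimate of p_n = P{ |X_1 + ... + X_n| >= eps * n**(1/t) },
# reusing moving_average() from the sketch above.  Under Theorem A's
# hypotheses, sum_n p_n < oo, so p_n should decay visibly in n.
t, eps, reps = 1.5, 2.0, 1000

for n in (50, 200, 800, 3200):
    hits = sum(
        abs(moving_average(n, seed=r).sum()) >= eps * n ** (1.0 / t)
        for r in range(reps)
    )
    print(n, hits / reps)
```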
1 Supported by National Natural Science Foundation of China.
Under dependence assumptions, few results for $\{X_k; k \ge 1\}$ are known. In this note, we shall extend Theorem A to the dependent case. We suppose $\{Y_i; -\infty < i < \infty\}$ is a sequence of identically distributed and $\phi$-mixing random variables, i.e.,
$$\phi(m) = \sup_{k} \phi(\mathcal{F}_{-\infty}^{k}, \mathcal{F}_{k+m}^{\infty}) \to 0, \qquad m \to \infty,$$
where $\mathcal{F}_{n}^{m} = \sigma(Y_k;\ n \le k \le m)$ and
$$\phi(\mathcal{A}, \mathcal{B}) = \sup_{A \in \mathcal{A},\, B \in \mathcal{B},\, P(A) > 0} |P(B \mid A) - P(B)|.$$
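For instance (an illustration not from the paper), an $m_0$-dependent sequence, one for which $\mathcal{F}_{-\infty}^{k}$ and $\mathcal{F}_{k+m}^{\infty}$ are independent whenever $m > m_0$, has $\phi(m) = 0$ for $m > m_0$, since independence gives $P(B \mid A) = P(B)$ for all admissible $A$ and $B$. Since always $\phi(m) \le 1$,
$$\sum_{m=1}^{\infty} \phi^{1/2}(m) = \sum_{m=1}^{m_0} \phi^{1/2}(m) \le m_0 < \infty,$$
so such sequences satisfy the mixing-rate condition imposed in Theorem 1 below.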
Theorem 1. Suppose $\{Y_i; -\infty < i < \infty\}$ is a sequence of identically distributed and $\phi$-mixing random variables with $\sum_{m=1}^{\infty} \phi^{1/2}(m) < \infty$, and let $\{X_k; k \ge 1\}$ be defined as above. Let $h(x) > 0$ $(x > 0)$ be a slowly varying function and $1 \le t < 2$, $r \ge 1$. Then $EY_1 = 0$ and $E|Y_1|^{rt} h(|Y_1|^t) < \infty$ imply
$$\sum_{n=1}^{\infty} n^{r-2} h(n)\, P\left\{\left|\sum_{k=1}^{n} X_k\right| \ge \epsilon n^{1/t}\right\} < \infty \qquad \text{for all } \epsilon > 0.$$
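In particular (a consistency check rather than a new claim), taking $h \equiv 1$ and $r = 2$, the moment condition becomes $E|Y_1|^{2t} < \infty$ and the conclusion reads
$$\sum_{n=1}^{\infty} P\left\{\left|\sum_{k=1}^{n} X_k\right| \ge \epsilon n^{1/t}\right\} < \infty \qquad \text{for all } \epsilon > 0,$$
so Theorem 1 recovers the conclusion of Theorem A for $\phi$-mixing sequences with $\sum_{m=1}^{\infty} \phi^{1/2}(m) < \infty$.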
Throughout the sequel, $C$ will represent a positive constant whose value may change from one appearance to the next, and $a_n \ll b_n$ will mean $a_n = O(b_n)$.
Observe that
$$\sum_{k=1}^{n} X_k = \sum_{k=1}^{n} \sum_{i=-\infty}^{\infty} a_{i+k} Y_i = \sum_{i=-\infty}^{\infty} \left( \sum_{j=1}^{n} a_{j+i} \right) Y_i.$$
Set $a_{ni} = \sum_{j=1}^{n} a_{j+i}$. Then
$$\sum_{k=1}^{n} X_k = \sum_{i=-\infty}^{\infty} a_{ni} Y_i.$$
The following lemma comes from Burton and Dehling (1990).

Lemma 1. Let $\sum_{i=-\infty}^{\infty} a_i$ be an absolutely convergent series of real numbers with $a = \sum_{i=-\infty}^{\infty} a_i$ and $k \ge 1$. Then
$$\lim_{n \to \infty} \frac{1}{n} \sum_{i=-\infty}^{\infty} \left| \sum_{j=i+1}^{i+n} a_j \right|^{k} = |a|^{k}.$$
The following lemma will be useful. A proof appears in Shao (1988) (see also Shao, 1993).
Lemma 2. Let $\{Y_n; n \ge 1\}$ be a $\phi$-mixing sequence and $S_n = \sum_{k=1}^{n} Y_k$, $n \ge 1$. Suppose that there exists a sequence $\{C_n\}$ of positive numbers such that
$$\max_{1 \le i \le n} E S_i^2 \le C_n.$$
Then for any $q \ge 2$, there exists $C = C(q, \phi(\cdot))$ such that
$$E \max_{1 \le i \le n} |S_i|^q \le C \left( C_n^{q/2} + E \max_{1 \le i \le n} |Y_i|^q \right).$$
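In the proof below, Lemma 2 is combined with Markov's inequality (a bridging remark added here, using only the lemma as stated): applying the lemma to the centered summands $Y_i - EY_i$, whose partial sums are $S_i - ES_i$, gives, for any $q \ge 2$ and $x > 0$,
$$P\{|S_n - ES_n| \ge x\} \le x^{-q} E|S_n - ES_n|^q \le C x^{-q}\left( C_n^{q/2} + E \max_{1 \le i \le n} |Y_i - EY_i|^q \right),$$
which will be used with $x = \epsilon n^{1/t}/2$ and with truncated weighted variables in place of the $Y_i$.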
We now present the proof of Theorem 1.
Proof of Theorem 1. Recall that
$$\sum_{k=1}^{n} X_k = \sum_{i=-\infty}^{\infty} a_{ni} Y_i.$$
From Lemma 1, we can assume, without loss of generality, that
$$\sum_{i=-\infty}^{\infty} |a_{ni}| \le n, \quad n \ge 1, \qquad \text{and} \qquad \sup_{i} |a_{ni}| \le 1.$$
Let
$$S_n = \sum_{i=-\infty}^{\infty} a_{ni} Y_i I\{|a_{ni} Y_i| \le n^{1/t}\}.$$
Then, since $EY_1 = 0$,
$$n^{-1/t} |ES_n| = n^{-1/t} \left| \sum_{i=-\infty}^{\infty} a_{ni} E Y_i I\{|a_{ni} Y_i| > n^{1/t}\} \right| \le n^{-1/t} \sum_{i=-\infty}^{\infty} |a_{ni}|\, E|Y_1| I\{|a_{ni} Y_1| > n^{1/t}\}$$
$$\le n^{-1/t}\, n\, E|Y_1| I\{|Y_1| > n^{1/t}\} \le E|Y_1|^t I\{|Y_1| > n^{1/t}\} \to 0, \qquad n \to \infty,$$
where the second inequality uses $\sup_i |a_{ni}| \le 1$ and $\sum_i |a_{ni}| \le n$, and the third holds because $n^{1-1/t} |Y_1| \le |Y_1|^t$ on $\{|Y_1| > n^{1/t}\}$. So, for $n$ large enough, we have $n^{-1/t} |ES_n| < \epsilon/2$. Then
$$\sum_{n=1}^{\infty} n^{r-2} h(n) P\left\{\left|\sum_{k=1}^{n} X_k\right| \ge \epsilon n^{1/t}\right\} \ll \sum_{n=1}^{\infty} n^{r-2} h(n) P\left\{\sup_{i} |a_{ni} Y_i| > n^{1/t}\right\}$$
$$\qquad + \sum_{n=1}^{\infty} n^{r-2} h(n) P\{|S_n - ES_n| \ge n^{1/t} \epsilon / 2\} =: I_1 + I_2.$$
Set
$$I_{nj} := \{ i;\ (j+1)^{-1/t} < |a_{ni}| \le j^{-1/t} \}, \qquad j = 1, 2, \ldots$$
Since $\sup_i |a_{ni}| \le 1$, every $i$ with $a_{ni} \ne 0$ belongs to some $I_{nj}$. Note that (cf. Li et al., 1992)
$$\sum_{j=1}^{k} \# I_{nj} \le n (k+1)^{1/t},$$
which follows from $n \ge \sum_i |a_{ni}| \ge \sum_{j=1}^{k} \# I_{nj}\, (j+1)^{-1/t} \ge (k+1)^{-1/t} \sum_{j=1}^{k} \# I_{nj}$. For $I_1$, we have
$$I_1 \le \sum_{n=1}^{\infty} n^{r-2} h(n) \sum_{i=-\infty}^{\infty} P\{|a_{ni} Y_1| > n^{1/t}\} \le \sum_{n=1}^{\infty} n^{r-2} h(n) \sum_{j=1}^{\infty} \sum_{i \in I_{nj}} P\{|Y_1| \ge j^{1/t} n^{1/t}\}$$
$$\le \sum_{n=1}^{\infty} n^{r-2} h(n) \sum_{j=1}^{\infty} (\# I_{nj}) \sum_{k \ge jn} P\{k \le |Y_1|^t < k+1\}$$
$$\le \sum_{n=1}^{\infty} n^{r-2} h(n) \sum_{k=n}^{\infty} \sum_{j=1}^{[k/n]} (\# I_{nj})\, P\{k \le |Y_1|^t < k+1\}$$
$$\le \sum_{n=1}^{\infty} n^{r-2} h(n) \sum_{k=n}^{\infty} n \left( \frac{k}{n} + 1 \right)^{1/t} P\{k \le |Y_1|^t < k+1\}$$
$$\ll \sum_{n=1}^{\infty} n^{r-1} h(n)\, n^{-1/t} \sum_{k=n}^{\infty} k^{1/t} P\{k \le |Y_1|^t < k+1\}$$
$$\ll \sum_{k=1}^{\infty} \sum_{n=1}^{k} n^{r-1} h(n)\, n^{-1/t}\, k^{1/t} P\{k \le |Y_1|^t < k+1\}$$
$$\ll \sum_{k=1}^{\infty} k^{r-1/t} h(k)\, k^{1/t} P\{k \le |Y_1|^t < k+1\} = \sum_{k=1}^{\infty} k^{r} h(k) P\{k \le |Y_1|^t < k+1\} \ll E|Y_1|^{rt} h(|Y_1|^t) < \infty.$$
For $I_2$, note that since $\sum_{m=1}^{\infty} \phi^{1/2}(m) < \infty$, we have
$$\sup_{-\infty \le l \le m \le \infty} E\left( \sum_{i=l}^{m} \big( a_{ni} Y_i I\{|a_{ni} Y_i| \le n^{1/t}\} - E a_{ni} Y_i I\{|a_{ni} Y_i| \le n^{1/t}\} \big) \right)^2$$
$$\le C \sum_{i=-\infty}^{\infty} E (a_{ni} Y_i)^2 I\{|a_{ni} Y_i| \le n^{1/t}\} = C \sum_{i=-\infty}^{\infty} E (a_{ni} Y_1)^2 I\{|a_{ni} Y_1| \le n^{1/t}\}.$$
By Lemma 2, we have for $q \ge 2$,
$$P\{|S_n - ES_n| \ge \tfrac{\epsilon}{2} n^{1/t}\} \le C n^{-q/t} E|S_n - ES_n|^q$$
$$\le C n^{-q/t} \left[ \left( \sum_{i=-\infty}^{\infty} a_{ni}^2 E Y_1^2 I\{|a_{ni} Y_1| \le n^{1/t}\} \right)^{q/2} + E \max_{i} |a_{ni} Y_i|^q I\{|a_{ni} Y_i| \le n^{1/t}\} \right]$$
$$\le C n^{-q/t} \left[ \left( \sum_{i=-\infty}^{\infty} a_{ni}^2 E Y_1^2 I\{|a_{ni} Y_1| \le n^{1/t}\} \right)^{q/2} + \sum_{i=-\infty}^{\infty} E |a_{ni} Y_1|^q I\{|a_{ni} Y_1| \le n^{1/t}\} \right].$$
Then
$$I_2 \ll \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t} \left( \sum_{i=-\infty}^{\infty} a_{ni}^2 E Y_1^2 I\{|a_{ni} Y_1| \le n^{1/t}\} \right)^{q/2}$$
$$\qquad + \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t} \sum_{i=-\infty}^{\infty} E |a_{ni} Y_1|^q I\{|a_{ni} Y_1| \le n^{1/t}\} =: I_3 + I_4.$$
If $r \ge 2$, we choose $q$ large enough that $q(1/t - 1/2) > r - 1$ and $q > rt$. Then, since $\sum_i a_{ni}^2 \le \sum_i |a_{ni}| \le n$,
$$I_3 \le \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t} \left( \sum_{i=-\infty}^{\infty} a_{ni}^2 E Y_1^2 \right)^{q/2} \ll \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q(1/t - 1/2)} < \infty;$$
$$I_4 \le \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t} \sum_{j=1}^{\infty} \sum_{i \in I_{nj}} E |a_{ni} Y_1|^q I\{|a_{ni} Y_1| \le n^{1/t}\}$$
$$\ll \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t} \sum_{j=1}^{\infty} (\# I_{nj})\, j^{-q/t}\, E |Y_1|^q I\{|Y_1|^t \le n(j+1)\}$$
$$\le \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t} \sum_{j=1}^{\infty} (\# I_{nj})\, j^{-q/t} \sum_{k=0}^{2n} E |Y_1|^q I\{k \le |Y_1|^t < k+1\}$$
$$\qquad + \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t} \sum_{j=1}^{\infty} (\# I_{nj})\, j^{-q/t} \sum_{k=2n+1}^{(j+1)n} E |Y_1|^q I\{k \le |Y_1|^t < k+1\}$$
$$=: I_5 + I_6.$$
Note that for $q \ge 1$ and $m \ge 1$, we have
$$n \ge \sum_{i=-\infty}^{\infty} |a_{ni}| = \sum_{j=1}^{\infty} \sum_{i \in I_{nj}} |a_{ni}| \ge \sum_{j=1}^{\infty} \# I_{nj}\, (j+1)^{-1/t}$$
$$\ge \sum_{j=m}^{\infty} \# I_{nj}\, (j+1)^{-1/t} \ge \sum_{j=m}^{\infty} \# I_{nj}\, (j+1)^{-q/t} (m+1)^{(q-1)/t}.$$
So
$$\sum_{j=m}^{\infty} \# I_{nj}\, j^{-q/t} \le C n m^{-(q-1)/t}.$$
Then, taking $m = 1$ above,
$$I_5 \ll \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t}\, n \sum_{k=0}^{2n} E |Y_1|^q I\{k \le |Y_1|^t < k+1\}$$
$$\ll \sum_{k=1}^{\infty} \sum_{n=[k/2]}^{\infty} n^{r-1} h(n)\, n^{-q/t}\, E |Y_1|^q I\{k \le |Y_1|^t < k+1\}$$
$$\ll \sum_{k=1}^{\infty} k^{r-q/t} h(k)\, E |Y_1|^q I\{k \le |Y_1|^t < k+1\} \ll E|Y_1|^{rt} h(|Y_1|^t) < \infty;$$
$$I_6 \ll \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t} \sum_{k=2n+1}^{\infty} \sum_{j \ge k/n - 1} (\# I_{nj})\, j^{-q/t}\, E |Y_1|^q I\{k \le |Y_1|^t < k+1\}$$
$$\ll \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-q/t} \sum_{k=2n+1}^{\infty} n \left( \frac{k}{n} \right)^{-(q-1)/t} E |Y_1|^q I\{k \le |Y_1|^t < k+1\}$$
$$= \sum_{n=1}^{\infty} n^{r-1} h(n)\, n^{-1/t} \sum_{k=2n+1}^{\infty} k^{-(q-1)/t} E |Y_1|^q I\{k \le |Y_1|^t < k+1\}$$
$$\ll \sum_{k=2}^{\infty} \sum_{n=1}^{[k/2]} n^{r-1} h(n)\, n^{-1/t}\, k^{-(q-1)/t} E |Y_1|^q I\{k \le |Y_1|^t < k+1\}$$
$$\ll \sum_{k=2}^{\infty} k^{r} h(k)\, k^{-1/t}\, k^{-(q-1)/t} E |Y_1|^q I\{k \le |Y_1|^t < k+1\}$$
$$= \sum_{k=2}^{\infty} k^{r-q/t} h(k)\, E |Y_1|^q I\{k \le |Y_1|^t < k+1\} \ll E|Y_1|^{rt} h(|Y_1|^t) < \infty.$$
So $I_4 < \infty$, and then $I_2 < \infty$.

If $r < 2$, we choose $q = 2$. Then
$$I_2 \ll \sum_{n=1}^{\infty} n^{r-2} h(n)\, n^{-2/t} \sum_{i=-\infty}^{\infty} E |a_{ni} Y_1|^2 I\{|a_{ni} Y_1| \le n^{1/t}\}.$$
Similarly to $I_4$, we have $I_2 < \infty$. This completes the proof. □
References
Burton, R.M. and H. Dehling (1990), Large deviations for some weakly dependent random processes, Statist. Probab. Lett. 9, 397-401.
Hsu, P.L. and H. Robbins (1947), Complete convergence and the law of large numbers, Proc. Nat. Acad. Sci. 33, 25-31.
Ibragimov, I.A. (1962), Some limit theorems for stationary processes, Theory Probab. Appl. 7, 349-382.
Li, D.L., M.B. Rao and X.C. Wang (1992), Complete convergence of moving average processes, Statist. Probab. Lett. 14, 111-114.
Shao, Q.M. (1988), A moment inequality and its application, Acta Math. Sinica 31, 736-747 (in Chinese).
Shao, Q.M. (1993), Almost sure invariance principles for mixing sequences of random variables, Stochastic Process. Appl. 48, 319-334.