IJMMS 2003:17, 1055–1066
PII. S0161171203208206
http://ijmms.hindawi.com
© Hindawi Publishing Corp.
CAUCHY APPROXIMATION FOR SUMS OF INDEPENDENT
RANDOM VARIABLES
K. NEAMMANEE
Received 6 August 2002
We use Stein's method to find a bound for the Cauchy approximation to the distribution of a sum of independent random variables.
2000 Mathematics Subject Classification: 60F05, 60G50.
1. Introduction. In [19], Stein introduced a novel technique for proving convergence in distribution to the normal law. His method is free of Fourier analysis and relies instead on the elementary differential equation
\[ f'(w) - w f(w) = h(w) - Nh \qquad (w \in \mathbb{R}), \tag{1.1} \]
where h : \mathbb{R} \to \mathbb{R} is such that
\[ \int_{-\infty}^{\infty} |h(x)|\, e^{-(1/2)x^2}\, dx < \infty \tag{1.2} \]
and Nh = E(h(Z)), where Z \sim N(0, 1).
Stein's method was extended from the normal distribution to the Poisson distribution by Chen [9]. Stein's equation for the Poisson distribution with parameter \lambda is
\[ \lambda f(w+1) - w f(w) = h(w) - P_\lambda h, \qquad w \in \mathbb{Z}^+, \tag{1.3} \]
where P_\lambda h = E(h(Z)), Z \sim \mathrm{Poi}(\lambda).
Since then, Stein’s method has found considerable applications in combinatorics, probability, and statistics. Recent literature pertaining to this method
includes Arratia et al. [1, 2], Baldi and Rinott [3], Barbour [4, 5], Barbour et al. [6],
Bolthausen and Götze [7], Chen [10, 11], Goldstein and Reinert [12], Goldstein
and Rinott [13], Götze [14], and Green [15]; the work of Holst and Janson [16]
gives an excellent account of this method. In this paper, we further develop
the Stein technique to obtain error bounds for the Cauchy approximation to the distribution of W, a sum of independent random variables. Bounds for the Cauchy approximation of certain classes of random variables already appear in the literature (e.g., Boonyasombut and Shapiro [8], Neammanee [17], and Shapiro [18]), but those results are obtained by Fourier methods.
This paper is organized as follows. The main results are stated in Section 2. Theorem 2.1 is proved in Section 3, and Corollary 2.2 in Section 4.
2. Main results. At the heart of Stein’s method lies a Stein equation. For
example,
\[ f'(w) - w f(w) = g(w), \qquad w \in \mathbb{R}, \]
\[ \lambda f(w+1) - w f(w) = g(w), \qquad w \in \mathbb{Z}^+ \tag{2.1} \]
are Stein equations for the normal and Poisson distributions, respectively.
Let \mathcal{H} = \{ h : \mathbb{R} \to \mathbb{R} \mid \int_{-\infty}^{\infty} (|h(x)|/(1+x^2))\, dx < \infty \}, and for each h \in \mathcal{H}, let
\[ \mathrm{Cau}(h) = \frac{1}{\pi} \int_{-\infty}^{\infty} \frac{h(x)}{1+x^2}\, dx. \tag{2.2} \]
The Stein equation for the Cauchy distribution F,
\[ F(x) = \frac{1}{\pi} \int_{-\infty}^{x} \frac{dt}{1+t^2}, \tag{2.3} \]
is
\[ f'(w) - \frac{2w f(w)}{1+w^2} = h(w) - \mathrm{Cau}(h). \tag{2.4} \]
It is easy to check that a solution of (2.4) is U_h : \mathbb{R} \to \mathbb{R} defined by
\[ U_h(w) = (1+w^2) \int_{-\infty}^{w} \frac{h(x) - \mathrm{Cau}(h)}{1+x^2}\, dx. \tag{2.5} \]
Fix w_0 \in \mathbb{R}, and choose h to be the indicator function I_{(-\infty,w_0]} defined by
\[ I_{(-\infty,w_0]}(w) = \begin{cases} 1 & \text{if } w \le w_0, \\ 0 & \text{if } w > w_0. \end{cases} \tag{2.6} \]
Let f_{w_0} = U_{I_{(-\infty,w_0]}}. Then, by (2.2), (2.3), and (2.5), we see that
\[ f_{w_0}(w) = \begin{cases} \pi (1+w^2)\, F(w) \big(1 - F(w_0)\big) & \text{if } w \le w_0, \\ \pi (1+w^2)\, F(w_0) \big(1 - F(w)\big) & \text{if } w \ge w_0. \end{cases} \tag{2.7} \]
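The closed form (2.7) can be sanity-checked directly. The sketch below is an added illustration (not part of the paper): using F(x) = 1/2 + arctan(x)/\pi, it verifies by central differences that (2.7) satisfies the Stein equation (2.4) with h = I_{(-\infty,w_0]} at points w \ne w_0.

```python
import math

def F(x):
    # Cauchy distribution function (2.3): F(x) = 1/2 + arctan(x)/pi
    return 0.5 + math.atan(x) / math.pi

def f_w0(w, w0):
    # closed form (2.7) of the Stein solution for h = I_(-inf, w0]
    if w <= w0:
        return math.pi * (1 + w * w) * F(w) * (1 - F(w0))
    return math.pi * (1 + w * w) * F(w0) * (1 - F(w))

w0 = 0.8
for w in (-2.0, 0.0, 0.5, 1.5, 3.0):
    eps = 1e-6  # both difference points stay on one side of the kink at w0
    dfdw = (f_w0(w + eps, w0) - f_w0(w - eps, w0)) / (2 * eps)
    lhs = dfdw - 2 * w * f_w0(w, w0) / (1 + w * w)
    rhs = (1.0 if w <= w0 else 0.0) - F(w0)
    assert abs(lhs - rhs) < 1e-6
```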
The broad idea of Stein's argument is as follows. First, for any w_0 \in \mathbb{R}, a function f_{w_0} : \mathbb{R} \to \mathbb{R} is constructed to solve (2.4) when h is the indicator function I_{(-\infty,w_0]}. Replacing w by a random variable W, it follows that the difference between P(W \le w_0) and F(w_0) can be expressed as
\[ E\bigg[ f_{w_0}'(W) - \frac{2W f_{w_0}(W)}{1+W^2} \bigg]. \tag{2.8} \]
The main results are the following.
Theorem 2.1. Let X_1, X_2, \ldots, X_n be independent random variables with EX_i = 0, EX_i^2 = \sigma_i^2, and E|X_i|^4 < \infty. Then,
\[ \big| P(W \le w_0) - F(w_0) \big| \le 3\, E\bigg| 1 - \sum_{i=1}^{n} \frac{\sigma_i^2 + X_i^2}{1+W^2} \bigg| + 4\pi \min\bigg\{ \sum_{i=1}^{n} \sigma_i^2,\; 2\sqrt{n} \bigg( \sum_{i=1}^{n} \sigma_i^2 \sum_{i=1}^{n} E X_i^4 \bigg)^{1/2} \bigg\} F(w_0)\big(1 - F(w_0)\big) + C \sum_{i=1}^{n} E|X_i|^3, \tag{2.9} \]
where W = X_1 + X_2 + \cdots + X_n.
Corollary 2.2. Let Y_1, Y_2, \ldots, Y_n be independent identically distributed random variables with zero means, EY_i^2 = 1/2, and E|Y_i|^5 < \infty. Let X_i = Y_i/\sqrt{n} and W = X_1 + X_2 + \cdots + X_n. Then,
\[ \big| P(W \le w_0) - F(w_0) \big| \le \frac{C}{\sqrt[4]{n}} + C \min\Big\{ \frac{1}{2},\, \sqrt{2\, E Y_i^4} \Big\} F(w_0)\big(1 - F(w_0)\big). \tag{2.10} \]
Throughout this paper, C stands for an absolute constant with possibly different values in different places.
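Corollary 2.2 can be illustrated numerically. The sketch below is an added illustration (not part of the paper): it takes the hypothetical choice Y_i = \pm 1/\sqrt{2}, which satisfies the hypotheses of the corollary, simulates W, and compares the empirical value of P(W \le w_0) with F(w_0). The gap stays of constant order, consistent with the non-vanishing second term of (2.10).

```python
import math, random, bisect

def F(x):
    # Cauchy distribution function (2.3)
    return 0.5 + math.atan(x) / math.pi

random.seed(1)
n, reps = 101, 4000
samples = []
for _ in range(reps):
    # hypothetical choice Y_i = +-1/sqrt(2): EY_i = 0, EY_i^2 = 1/2, E|Y_i|^5 < inf,
    # so W = sum Y_i / sqrt(n) = sum(+-1) / sqrt(2n)
    w = sum(random.choice((-1.0, 1.0)) for _ in range(n)) / math.sqrt(2.0 * n)
    samples.append(w)
samples.sort()

for w0 in (-2.0, 0.0, 2.0):
    p_hat = bisect.bisect_right(samples, w0) / reps
    print("w0 =", w0, " P(W <= w0) ~=", round(p_hat, 3), " F(w0) =", round(F(w0), 3))
```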
3. Proof of main results. Before we prove the main results, we need the
following lemmas.
Lemma 3.1. For any real numbers w_0 and w,
(1) |f_{w_0}(w)|/(1+w^2) \le \pi F(w_0)(1 - F(w_0));
(2) |f_{w_0}'(w)| \le 3;
(3) |f_{w_0}''(w)| \le 3 + 2\pi;
(4) |(f_{w_0}'(w)/(1+w^2))'| \le 6 + 2\pi;
(5) |(w f_{w_0}(w)/(1+w^2)^2)'| \le 3 + 5\pi.
Proof. (1) follows directly from (2.7).
(2) Before we start the proof, we need the following inequalities:
\[ -\frac{1}{\pi} \le w F(w) \le 0 \quad \text{for } w \le 0, \tag{3.1} \]
\[ 0 \le w \big(1 - F(w)\big) \le \frac{1}{\pi} \quad \text{for } w > 0. \tag{3.2} \]
To show (3.1), we define g on (-\infty, 0] by g(w) = wF(w). Since g''(w) = 2/(\pi(1+w^2)^2) > 0, g' is increasing. From this fact and the fact that
\[ \lim_{w \to -\infty} g'(w) = \lim_{w \to -\infty} \frac{1}{\pi} \Big( \arctan w + \frac{\pi}{2} + \frac{w}{1+w^2} \Big) = 0, \tag{3.3} \]
we have g' \ge 0. Hence, g is increasing and
\[ -\frac{1}{\pi} = \lim_{t \to -\infty} g(t) \le g(w) \le g(0) = 0 \tag{3.4} \]
for any w \le 0. So (3.1) holds. To show (3.2), we can apply the same argument to the function \tilde g on [0, \infty) defined by \tilde g(w) = w(1 - F(w)). Since f_{w_0}(w) = f_{-w_0}(-w), it suffices to prove the lemma in the case where w_0 \ge 0.
By (2.7), we have
\[ f_{w_0}'(w) = \begin{cases} \big(1 - F(w_0)\big)\big(1 + 2\pi w F(w)\big) & \text{if } w \le 0, \\ F(w_0)\big({-1} + 2\pi w (1 - F(w))\big) & \text{if } w \ge w_0, \end{cases} \]
so that
\[ \big| f_{w_0}'(w) \big| \le \begin{cases} 1 + 2\pi |w| F(w) & \text{if } w \le 0, \\ 1 + 2\pi w \big(1 - F(w)\big) & \text{if } w \ge w_0, \end{cases} \;\le 3, \tag{3.5} \]
where we have used the fact that 0 \le F(w) \le 1 in the first inequality and (3.1) and (3.2) in the second inequality. In the case where 0 \le w \le w_0, by monotonicity of F and (3.2), we see that
\[ 0 \le f_{w_0}'(w) = 1 - F(w_0) + 2\pi \big(1 - F(w_0)\big) w F(w) \le 1 + 2\pi \big(1 - F(w)\big) w \le 3. \tag{3.6} \]
Hence, (2) follows from (3.5) and (3.6).
(3) follows immediately from (2) and the fact that
\[ f_{w_0}''(w) = \frac{2w}{1+w^2}\, f_{w_0}'(w) + \frac{2(1-w^2)}{(1+w^2)^2}\, f_{w_0}(w). \tag{3.7} \]
(4) and (5) follow from (2) and (3) and the facts that
\[ \bigg( \frac{f_{w_0}'(w)}{1+w^2} \bigg)' = \frac{f_{w_0}''(w)}{1+w^2} - \frac{2w f_{w_0}'(w)}{(1+w^2)^2}, \]
\[ \bigg( \frac{w f_{w_0}(w)}{(1+w^2)^2} \bigg)' = \frac{w f_{w_0}'(w) + f_{w_0}(w)}{(1+w^2)^2} - \frac{4w^2 f_{w_0}(w)}{(1+w^2)^3}. \tag{3.8} \]
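The inequalities (3.1), (3.2) and parts (1) and (2) of Lemma 3.1 can be confirmed numerically on a grid. The sketch below is an added illustration (not part of the paper), with derivatives approximated by central differences away from the kink at w_0.

```python
import math

def F(x):
    return 0.5 + math.atan(x) / math.pi

def f_w0(w, w0):
    # closed form (2.7)
    if w <= w0:
        return math.pi * (1 + w * w) * F(w) * (1 - F(w0))
    return math.pi * (1 + w * w) * F(w0) * (1 - F(w))

# inequalities (3.1) and (3.2) on a grid
for k in range(0, 5001):
    w = -k / 10.0
    assert -1 / math.pi <= w * F(w) <= 0.0          # (3.1)
    v = k / 10.0 + 1e-9
    assert 0.0 <= v * (1 - F(v)) <= 1 / math.pi     # (3.2)

# parts (1) and (2) of Lemma 3.1
for w0 in (-1.5, 0.0, 2.0):
    cap = math.pi * F(w0) * (1 - F(w0))
    for k in range(-300, 301):
        w = k / 15.0
        assert abs(f_w0(w, w0)) / (1 + w * w) <= cap + 1e-12   # (1)
        if abs(w - w0) > 1e-3:  # keep away from the kink at w0
            eps = 1e-6
            d = (f_w0(w + eps, w0) - f_w0(w - eps, w0)) / (2 * eps)
            assert abs(d) <= 3.0                               # (2)
```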
Lemma 3.2. Let (W, \tilde W) be an exchangeable pair of random variables, that is,
\[ P(W \in B, \tilde W \in \tilde B) = P(W \in \tilde B, \tilde W \in B) \tag{3.9} \]
for any Borel sets B and \tilde B on \mathbb{R}, and suppose there exists \lambda > 0 such that
\[ E^W \tilde W = (1-\lambda) W, \qquad E|\tilde W - W|^2 < \infty, \tag{3.10} \]
where E^W \tilde W is the conditional expectation of \tilde W with respect to W. Then,
\[ E\bigg[ \frac{2W f(W)}{1+W^2} - \frac{1}{\lambda} (\tilde W - W) \bigg( \frac{f(\tilde W)}{1+\tilde W^2} - \frac{f(W)}{1+W^2} \bigg) \bigg] = 0 \tag{3.11} \]
for any function f : \mathbb{R} \to \mathbb{R} for which there exists C > 0 such that, for all w \in \mathbb{R},
\[ |f(w)| \le C \big( 1 + w^2 \big). \tag{3.12} \]
Moreover,
\[ P(W \le w_0) = F(w_0) + E\bigg[ f_{w_0}'(W) - \frac{1}{\lambda} (\tilde W - W) \bigg( \frac{f_{w_0}(\tilde W)}{1+\tilde W^2} - \frac{f_{w_0}(W)}{1+W^2} \bigg) \bigg] \tag{3.13} \]
for any w_0 \in \mathbb{R}.
Proof. Define F : \mathbb{R}^2 \to \mathbb{R} by
\[ F(w, \tilde w) = (\tilde w - w) \bigg( \frac{f(\tilde w)}{1+\tilde w^2} + \frac{f(w)}{1+w^2} \bigg). \tag{3.14} \]
Then, F is antisymmetric, that is, F(w, \tilde w) = -F(\tilde w, w). By Stein [20, pages 9–10], we have EF(W, \tilde W) = 0, which implies that
\[
\begin{aligned}
0 &= E (\tilde W - W) \bigg( \frac{f(\tilde W)}{1+\tilde W^2} + \frac{f(W)}{1+W^2} \bigg) \\
&= E (\tilde W - W) \bigg( \frac{f(\tilde W)}{1+\tilde W^2} - \frac{f(W)}{1+W^2} + \frac{2 f(W)}{1+W^2} \bigg) \\
&= 2 E\bigg[ E^W (\tilde W - W)\, \frac{f(W)}{1+W^2} \bigg] + E (\tilde W - W) \bigg( \frac{f(\tilde W)}{1+\tilde W^2} - \frac{f(W)}{1+W^2} \bigg) \\
&= -2\lambda\, E \frac{W f(W)}{1+W^2} + E (\tilde W - W) \bigg( \frac{f(\tilde W)}{1+\tilde W^2} - \frac{f(W)}{1+W^2} \bigg) \\
&= -\lambda\, E\bigg[ \frac{2W f(W)}{1+W^2} - \frac{1}{\lambda} (\tilde W - W) \bigg( \frac{f(\tilde W)}{1+\tilde W^2} - \frac{f(W)}{1+W^2} \bigg) \bigg].
\end{aligned} \tag{3.15}
\]
Then, (3.11) holds, and (3.13) follows from (3.11) and (2.4) when h = I_{(-\infty,w_0]}.
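The identity (3.11) holds exactly for any exchangeable pair with the linear regression property (3.10), so it can be checked by exact enumeration on a small example. The sketch below is an added illustration (not part of the paper), using the exchangeable pair constructed in the proof of Theorem 2.1 with X_i = \pm 1 and f = f_{w_0}.

```python
import math
from itertools import product

def F(x):
    return 0.5 + math.atan(x) / math.pi

def f_w0(w, w0):
    # closed form (2.7); it satisfies |f(w)| <= C (1 + w^2), cf. (3.12)
    if w <= w0:
        return math.pi * (1 + w * w) * F(w) * (1 - F(w0))
    return math.pi * (1 + w * w) * F(w0) * (1 - F(w))

# exchangeable pair: X_i = +-1 equally likely, W~ = W + (X~_I - X_I),
# which satisfies E^W W~ = (1 - lambda) W with lambda = 1/n (see Section 3)
n, w0 = 3, 0.4
lam = 1.0 / n

def g(v):
    return f_w0(v, w0) / (1 + v * v)

total, count = 0.0, 0
for X in product((-1, 1), repeat=n):       # 8 equally likely configurations
    W = float(sum(X))
    for I in range(n):                     # uniform random index
        for xt in (-1, 1):                 # independent copy X~_I
            Wt = W + xt - X[I]
            # integrand of (3.11)
            total += 2 * W * g(W) - (Wt - W) * (g(Wt) - g(W)) / lam
            count += 1
assert abs(total / count) < 1e-12          # (3.11): the expectation vanishes
```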
Lemma 3.3. Let (W, \tilde W) be an exchangeable pair of random variables such that
\[ E^W \tilde W = (1-\lambda) W, \qquad E|\tilde W - W|^2 < \infty \tag{3.16} \]
with \lambda > 0. Then, for any w_0 \in \mathbb{R},
\[
\begin{aligned}
P(W \le w_0) = {}& F(w_0) + E f_{w_0}'(W) \bigg( 1 - \frac{1}{\lambda} E^W \frac{(\tilde W - W)^2}{1+W^2} \bigg) + \frac{2}{\lambda} E \frac{(\tilde W - W)^2\, W f_{w_0}(W)}{(1+W^2)^2} \\
& + \frac{1}{\lambda} \int_{-\infty}^{\infty} E\bigg[ (\tilde W - W) \Big( w - \frac{W + \tilde W}{2} \Big) \big( I(w \le \tilde W) - I(w \le W) \big) \bigg] \bigg( \frac{f_{w_0}'(w)}{1+w^2} \bigg)' dw \\
& - \frac{2}{\lambda} \int_{-\infty}^{\infty} E\bigg[ (\tilde W - W) \Big( w - \frac{W + \tilde W}{2} \Big) \big( I(w \le \tilde W) - I(w \le W) \big) \bigg] \bigg( \frac{w f_{w_0}(w)}{(1+w^2)^2} \bigg)' dw.
\end{aligned} \tag{3.17}
\]
Proof. Let w_0 \in \mathbb{R}. For W < \tilde W, we see that
\[
\begin{aligned}
& \frac{f_{w_0}(\tilde W)}{1+\tilde W^2} - \frac{f_{w_0}(W)}{1+W^2} - \frac{(\tilde W - W) f_{w_0}'(W)}{1+W^2} + \frac{2(\tilde W - W) W f_{w_0}(W)}{(1+W^2)^2} \\
&\quad = \int_{W}^{\tilde W} \bigg( \frac{f_{w_0}'(w)}{1+w^2} - \frac{2w f_{w_0}(w)}{(1+w^2)^2} - \frac{f_{w_0}'(W)}{1+W^2} + \frac{2W f_{w_0}(W)}{(1+W^2)^2} \bigg) dw \\
&\quad = \int_{W}^{\tilde W} \int_{W}^{w} \bigg( \frac{f_{w_0}'(y)}{1+y^2} \bigg)' dy\, dw - 2 \int_{W}^{\tilde W} \int_{W}^{w} \bigg( \frac{y f_{w_0}(y)}{(1+y^2)^2} \bigg)' dy\, dw \\
&\quad = \int_{W}^{\tilde W} \int_{y}^{\tilde W} \bigg( \frac{f_{w_0}'(y)}{1+y^2} \bigg)' dw\, dy - 2 \int_{W}^{\tilde W} \int_{y}^{\tilde W} \bigg( \frac{y f_{w_0}(y)}{(1+y^2)^2} \bigg)' dw\, dy \\
&\quad = \int_{W}^{\tilde W} (\tilde W - y) \bigg( \frac{f_{w_0}'(y)}{1+y^2} \bigg)' dy - 2 \int_{W}^{\tilde W} (\tilde W - y) \bigg( \frac{y f_{w_0}(y)}{(1+y^2)^2} \bigg)' dy,
\end{aligned} \tag{3.18}
\]
and by the same argument we can show that
\[
\begin{aligned}
& \frac{f_{w_0}(\tilde W)}{1+\tilde W^2} - \frac{f_{w_0}(W)}{1+W^2} - \frac{(\tilde W - W) f_{w_0}'(W)}{1+W^2} + \frac{2(\tilde W - W) W f_{w_0}(W)}{(1+W^2)^2} \\
&\quad = \int_{\tilde W}^{W} (w - \tilde W) \bigg( \frac{f_{w_0}'(w)}{1+w^2} \bigg)' dw - 2 \int_{\tilde W}^{W} (w - \tilde W) \bigg( \frac{w f_{w_0}(w)}{(1+w^2)^2} \bigg)' dw
\end{aligned} \tag{3.19}
\]
for \tilde W < W.
So,
\[
\begin{aligned}
& \frac{f_{w_0}(\tilde W)}{1+\tilde W^2} - \frac{f_{w_0}(W)}{1+W^2} - \frac{(\tilde W - W) f_{w_0}'(W)}{1+W^2} + \frac{2(\tilde W - W) W f_{w_0}(W)}{(1+W^2)^2} \\
&\quad = \int_{-\infty}^{\infty} (\tilde W - w) \big( I(w \le \tilde W) - I(w \le W) \big) \bigg( \frac{f_{w_0}'(w)}{1+w^2} \bigg)' dw \\
&\qquad - 2 \int_{-\infty}^{\infty} (\tilde W - w) \big( I(w \le \tilde W) - I(w \le W) \big) \bigg( \frac{w f_{w_0}(w)}{(1+w^2)^2} \bigg)' dw.
\end{aligned} \tag{3.20}
\]
By Lemma 3.2, we have
\[
\begin{aligned}
P(W \le w_0) ={}& F(w_0) + E\bigg[ f_{w_0}'(W) - \frac{1}{\lambda} (\tilde W - W) \bigg( \frac{f_{w_0}(\tilde W)}{1+\tilde W^2} - \frac{f_{w_0}(W)}{1+W^2} \bigg) \bigg] \\
={}& F(w_0) + E f_{w_0}'(W) \bigg( 1 - \frac{1}{\lambda} E^W \frac{(\tilde W - W)^2}{1+W^2} \bigg) + \frac{2}{\lambda} E \frac{(\tilde W - W)^2 W f_{w_0}(W)}{(1+W^2)^2} \\
& - \frac{1}{\lambda} E (\tilde W - W) \bigg[ \frac{f_{w_0}(\tilde W)}{1+\tilde W^2} - \frac{f_{w_0}(W)}{1+W^2} - \frac{(\tilde W - W) f_{w_0}'(W)}{1+W^2} + \frac{2(\tilde W - W) W f_{w_0}(W)}{(1+W^2)^2} \bigg] \\
={}& F(w_0) + E f_{w_0}'(W) \bigg( 1 - \frac{1}{\lambda} E^W \frac{(\tilde W - W)^2}{1+W^2} \bigg) + \frac{2}{\lambda} E \frac{(\tilde W - W)^2 W f_{w_0}(W)}{(1+W^2)^2} \\
& - \frac{1}{\lambda} E (\tilde W - W) \int_{-\infty}^{\infty} (\tilde W - w) \big( I(w \le \tilde W) - I(w \le W) \big) \bigg( \frac{f_{w_0}'(w)}{1+w^2} \bigg)' dw \\
& + \frac{2}{\lambda} E (\tilde W - W) \int_{-\infty}^{\infty} (\tilde W - w) \big( I(w \le \tilde W) - I(w \le W) \big) \bigg( \frac{w f_{w_0}(w)}{(1+w^2)^2} \bigg)' dw,
\end{aligned} \tag{3.21}
\]
where we have used (3.20) in the last equality.
For fixed w, we define F : \mathbb{R}^2 \to \mathbb{R} by
\[ F(x, \tilde x) = \frac{(\tilde x - x)^2}{2} \big( I(w \le \tilde x) - I(w \le x) \big). \tag{3.22} \]
Then, F is antisymmetric. Since W and \tilde W are exchangeable, EF(W, \tilde W) = 0. Thus,
\[
\begin{aligned}
& E (\tilde W - W)(w - \tilde W) \big( I(w \le \tilde W) - I(w \le W) \big) \\
&\quad = E (\tilde W - W) \bigg( w - \frac{W + \tilde W}{2} + \frac{W - \tilde W}{2} \bigg) \big( I(w \le \tilde W) - I(w \le W) \big) \\
&\quad = E (\tilde W - W) \bigg( w - \frac{W + \tilde W}{2} \bigg) \big( I(w \le \tilde W) - I(w \le W) \big) - E F(W, \tilde W) \\
&\quad = E (\tilde W - W) \bigg( w - \frac{W + \tilde W}{2} \bigg) \big( I(w \le \tilde W) - I(w \le W) \big).
\end{aligned} \tag{3.23}
\]
By (3.21) and (3.23), the lemma is proved.
Proof of Theorem 2.1. Let X_1, X_2, \ldots, X_n be independent random variables and W = X_1 + X_2 + \cdots + X_n. In order to prove the theorem, we introduce additional random variables I, \tilde X_1, \tilde X_2, \ldots, \tilde X_n, and \tilde W defined in the following way. The random variables I, X_1, \ldots, X_n, \tilde X_1, \ldots, \tilde X_n are independent, I is uniformly distributed over the index set \{1, 2, \ldots, n\}, each \tilde X_i has the same distribution as the corresponding X_i, and \tilde W = W + (\tilde X_I - X_I). Then, (W, \tilde W) is an exchangeable pair. We note that
\[ E^W \tilde W = W + E^W \tilde X_I - E^W X_I = W - \frac{1}{n} \sum_{i=1}^{n} X_i = \Big( 1 - \frac{1}{n} \Big) W, \qquad E|\tilde W - W|^2 = E|\tilde X_I - X_I|^2 = \frac{1}{n} \sum_{i=1}^{n} E|\tilde X_i - X_i|^2 = \frac{2}{n} \sum_{i=1}^{n} \sigma_i^2. \tag{3.24} \]
Then, the assumptions of Lemma 3.3 are satisfied with \lambda = 1/n. Moreover, we know that
\[ E|\tilde W - W|^3 = E|\tilde X_I - X_I|^3 = \frac{1}{n} \sum_{i=1}^{n} E|\tilde X_i - X_i|^3 \le \frac{8}{n} \sum_{i=1}^{n} E|X_i|^3, \tag{3.25} \]
\[ E|\tilde W - W|^4 = E|\tilde X_I - X_I|^4 = \frac{1}{n} \sum_{i=1}^{n} E|\tilde X_i - X_i|^4 \le \frac{16}{n} \sum_{i=1}^{n} E X_i^4. \tag{3.26} \]
To prove the theorem, let w_0 \in \mathbb{R}. By Lemma 3.3, we obtain
\[
\begin{aligned}
\big| P(W \le w_0) - F(w_0) \big| \le{}& \sup_{w \in \mathbb{R}} \big| f_{w_0}'(w) \big|\, E\bigg| 1 - n E^W \frac{(\tilde W - W)^2}{1+W^2} \bigg| + 2n\, E \frac{(\tilde W - W)^2 \big| W f_{w_0}(W) \big|}{(1+W^2)^2} \\
& + n \sup_{w \in \mathbb{R}} \bigg| \bigg( \frac{f_{w_0}'(w)}{1+w^2} \bigg)' \bigg|\, E\bigg[ |\tilde W - W| \int_{W \wedge \tilde W}^{W \vee \tilde W} \Big| w - \frac{W + \tilde W}{2} \Big|\, dw \bigg] \\
& + 2n \sup_{w \in \mathbb{R}} \bigg| \bigg( \frac{w f_{w_0}(w)}{(1+w^2)^2} \bigg)' \bigg|\, E\bigg[ |\tilde W - W| \int_{W \wedge \tilde W}^{W \vee \tilde W} \Big| w - \frac{W + \tilde W}{2} \Big|\, dw \bigg] \\
\le{}& \sup_{w \in \mathbb{R}} \big| f_{w_0}'(w) \big|\, E\bigg| 1 - n E^W \frac{(\tilde W - W)^2}{1+W^2} \bigg| + 2n\, E \frac{(\tilde W - W)^2 \big| W f_{w_0}(W) \big|}{(1+W^2)^2} \\
& + \bigg( \frac{n}{2} \sup_{w \in \mathbb{R}} \bigg| \bigg( \frac{f_{w_0}'(w)}{1+w^2} \bigg)' \bigg| + n \sup_{w \in \mathbb{R}} \bigg| \bigg( \frac{w f_{w_0}(w)}{(1+w^2)^2} \bigg)' \bigg| \bigg) E|\tilde W - W|^3 \\
\le{}& 3\, E\bigg| 1 - n E^W \frac{(\tilde W - W)^2}{1+W^2} \bigg| + 2n\, E \frac{(\tilde W - W)^2 \big| W f_{w_0}(W) \big|}{(1+W^2)^2} + 6n(\pi + 1)\, E|\tilde W - W|^3 \\
\le{}& 3\, E\bigg| 1 - n E^W \frac{(\tilde W - W)^2}{1+W^2} \bigg| + 2n\, E \frac{(\tilde W - W)^2 \big| W f_{w_0}(W) \big|}{(1+W^2)^2} + C \sum_{i=1}^{n} E|X_i|^3,
\end{aligned} \tag{3.27}
\]
where the second inequality uses \int_{W \wedge \tilde W}^{W \vee \tilde W} |w - (W + \tilde W)/2|\, dw \le |\tilde W - W|^2/2, the third inequality comes from (2), (4), and (5) of Lemma 3.1, and the last inequality comes from (3.25). Since X_i and \tilde X_i are independent and have the same distribution,
\[ E^W (\tilde W - W)^2 = E^W \big( \tilde X_I - X_I \big)^2 = \frac{1}{n} \sum_{i=1}^{n} E^W \big( \tilde X_i - X_i \big)^2 = \frac{1}{n} \bigg( \sum_{i=1}^{n} \sigma_i^2 + \sum_{i=1}^{n} X_i^2 \bigg). \tag{3.28} \]
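The identity (3.28) can also be confirmed by exact enumeration. The sketch below is an added illustration (not part of the paper); it deliberately uses an asymmetric distribution (a hypothetical choice, X_i = -1 with probability 2/3 and 2 with probability 1/3) so that the conditional second moment \sigma_i^2 + X_i^2 genuinely depends on X_i.

```python
from itertools import product
from fractions import Fraction

# X_i = -1 with prob. 2/3 and 2 with prob. 1/3: EX_i = 0, sigma_i^2 = 2
vals = {-1: Fraction(2, 3), 2: Fraction(1, 3)}
sigma2 = sum(v * v * p for v, p in vals.items())

n = 3
for X in product(vals, repeat=n):
    # conditional second moment of W~ - W = X~_I - X_I given X_1, ..., X_n
    cond = Fraction(0)
    for I in range(n):
        for xt, p in vals.items():
            cond += Fraction(1, n) * p * (xt - X[I]) ** 2
    # (3.28): E^W (W~ - W)^2 = (1/n)(sum sigma_i^2 + sum X_i^2)
    assert cond == (sum(x * x for x in X) + n * sigma2) / n
```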
Hence,
\[ E\bigg| 1 - n E^W \frac{(\tilde W - W)^2}{1+W^2} \bigg| = E\bigg| 1 - \sum_{i=1}^{n} \frac{\sigma_i^2 + X_i^2}{1+W^2} \bigg|. \tag{3.29} \]
Next, we will give a bound of 2n E[(\tilde W - W)^2 |W f_{w_0}(W)|/(1+W^2)^2]. From Lemma 3.1(1),
\[
\begin{aligned}
2n\, E \frac{(\tilde W - W)^2 \big| W f_{w_0}(W) \big|}{(1+W^2)^2} &\le 2\pi F(w_0)\big(1 - F(w_0)\big) \sum_{i=1}^{n} E\big| \tilde X_i - X_i \big|^2 = 4\pi F(w_0)\big(1 - F(w_0)\big) \sum_{i=1}^{n} \sigma_i^2, \\
2n\, E \frac{(\tilde W - W)^2 \big| W f_{w_0}(W) \big|}{(1+W^2)^2} &\le 2n\pi F(w_0)\big(1 - F(w_0)\big) E\big[ |\tilde W - W|^2 |W| \big] \\
&\le 2n\pi F(w_0)\big(1 - F(w_0)\big) \sqrt{ E\big| \tilde X_I - X_I \big|^4\, E W^2 } \\
&\le 8\pi F(w_0)\big(1 - F(w_0)\big) \sqrt{n} \bigg( \sum_{i=1}^{n} \sigma_i^2 \sum_{i=1}^{n} E X_i^4 \bigg)^{1/2}.
\end{aligned} \tag{3.30}
\]
Hence,
\[ 2n\, E \frac{(\tilde W - W)^2 \big| W f_{w_0}(W) \big|}{(1+W^2)^2} \le 4\pi \min\bigg\{ \sum_{i=1}^{n} \sigma_i^2,\; 2\sqrt{n} \bigg( \sum_{i=1}^{n} \sigma_i^2 \sum_{i=1}^{n} E X_i^4 \bigg)^{1/2} \bigg\} F(w_0)\big(1 - F(w_0)\big). \tag{3.31} \]
This completes the proof.
4. Proof of Corollary 2.2. Using Taylor's formula, we see that
\[ \frac{1}{1+W^2} = 1 - W^2 + C W^3 \quad \text{for some } |C| < 1, \]
\[ \frac{1}{(1+W^2)^2} = 1 - 2W^2 + C W^3 \quad \text{for some } |C| < 1. \tag{4.1} \]
From (4.1) and the moment assumptions on Y_i, we obtain
\[ \bigg| E \frac{1}{1+W^2} - \frac{1}{2} \bigg| \le \frac{C}{\sqrt{n}}, \qquad E \frac{1}{(1+W^2)^2} \le \frac{C}{\sqrt{n}}, \]
\[ E \frac{\sum_{i=1}^{n} X_i^2}{1+W^2} = E \sum_{i=1}^{n} X_i^2 - E\bigg( \sum_{i=1}^{n} X_i^2 \bigg) W^2 + C_1 E\bigg( \sum_{i=1}^{n} X_i^2 \bigg) W^3. \tag{4.2} \]
Hence,
\[ \bigg| E \frac{\sum_{i=1}^{n} X_i^2}{1+W^2} - \frac{1}{4} \bigg| \le \frac{C}{\sqrt{n}}, \qquad E \frac{\sum_{i=1}^{n} X_i^2}{(1+W^2)^2} \le \frac{C}{\sqrt{n}}, \qquad E \frac{\big( \sum_{i=1}^{n} X_i^2 \big)^2}{(1+W^2)^2} \le \frac{C}{\sqrt{n}}, \]
which implies that
\[
\begin{aligned}
E\bigg| 1 - \frac{\frac{1}{2} + \sum_{i=1}^{n} X_i^2}{1+W^2} \bigg|^2 ={}& 1 - E \frac{1}{1+W^2} - 2 E \frac{\sum_{i=1}^{n} X_i^2}{1+W^2} + \frac{1}{4} E \frac{1}{(1+W^2)^2} \\
& + E \frac{\sum_{i=1}^{n} X_i^2}{(1+W^2)^2} + E \frac{\big( \sum_{i=1}^{n} X_i^2 \big)^2}{(1+W^2)^2} \le \frac{C}{\sqrt{n}}.
\end{aligned} \tag{4.3}
\]
Clearly,
\[ \sum_{i=1}^{n} E|X_i|^3 \le \frac{C}{\sqrt{n}}, \]
\[ 4\pi \min\bigg\{ \sum_{i=1}^{n} \sigma_i^2,\; 2\sqrt{n} \bigg( \sum_{i=1}^{n} \sigma_i^2 \sum_{i=1}^{n} E X_i^4 \bigg)^{1/2} \bigg\} F(w_0)\big(1 - F(w_0)\big) \le C \min\Big\{ \frac{1}{2},\, \sqrt{2\, E Y_i^4} \Big\} F(w_0)\big(1 - F(w_0)\big). \tag{4.4} \]
Since \sum_{i=1}^{n} \sigma_i^2 = 1/2 here, (4.3) and the inequality E|Z| \le (E|Z|^2)^{1/2} bound the first term of (2.9) by C/\sqrt[4]{n}. Hence, by Theorem 2.1, (4.3), and (4.4), the corollary is proved.
References
[1] R. Arratia, L. Goldstein, and L. Gordon, Two moments suffice for Poisson approximations: the Chen-Stein method, Ann. Probab. 17 (1989), no. 1, 9–25.
[2] R. Arratia, L. Goldstein, and L. Gordon, Poisson approximation and the Chen-Stein method, Statist. Sci. 5 (1990), no. 4, 403–434.
[3] P. Baldi and Y. Rinott, On normal approximations of distributions in terms of dependency graphs, Ann. Probab. 17 (1989), no. 4, 1646–1650.
[4] A. D. Barbour, Stein's method and Poisson process convergence, J. Appl. Probab. 25A (1988), 175–184.
[5] A. D. Barbour, Stein's method for diffusion approximations, Probab. Theory Related Fields 84 (1990), no. 3, 297–322.
[6] A. D. Barbour, L. H. Y. Chen, and W.-L. Loh, Compound Poisson approximation for nonnegative random variables via Stein's method, Ann. Probab. 20 (1992), no. 4, 1843–1866.
[7] E. Bolthausen and F. Götze, The rate of convergence for multivariate sampling statistics, Ann. Statist. 21 (1993), no. 4, 1692–1710.
[8] V. Boonyasombut and J. M. Shapiro, The accuracy of infinitely divisible approximations to sums of independent variables with application to stable laws, Ann. Math. Statist. 41 (1970), 237–250.
[9] L. H. Y. Chen, Poisson approximation for dependent trials, Ann. Probab. 3 (1975), no. 3, 534–545.
[10] L. H. Y. Chen, Stein's method: some perspectives with applications, Probability Towards 2000 (New York, 1995) (L. Accardi and C. C. Heyde, eds.), Lecture Notes in Statist., vol. 128, Springer, New York, 1998, pp. 97–122.
[11] L. H. Y. Chen, Non-uniform bounds in probability approximations using Stein's method, Probability and Statistical Models with Applications: A Volume in Honor of Theophilos Cacoullos (Ch. A. Charalambides, M. V. Koutras, and N. Balakrishnan, eds.), Chapman & Hall/CRC Press, Florida, 2000, pp. 3–14.
[12] L. Goldstein and G. Reinert, Stein's method and the zero bias transformation with application to simple random sampling, Ann. Appl. Probab. 7 (1997), no. 4, 935–952.
[13] L. Goldstein and Y. Rinott, Multivariate normal approximations by Stein's method and size bias couplings, J. Appl. Probab. 33 (1996), no. 1, 1–17.
[14] F. Götze, On the rate of convergence in the multivariate CLT, Ann. Probab. 19 (1991), no. 2, 724–739.
[15] T. A. Green, Asymptotic enumeration of generalized Latin rectangles, J. Combin. Theory Ser. A 51 (1989), no. 2, 149–160.
[16] L. Holst and S. Janson, Poisson approximation using the Stein-Chen method and coupling: number of exceedances of Gaussian random variables, Ann. Probab. 18 (1990), no. 2, 713–723.
[17] K. Neammanee, On the rate of convergence of distribution functions of sums of reciprocals of logarithm of random variables to the Cauchy distribution, to appear.
[18] J. M. Shapiro, On the rate of convergence of distribution functions of sums of reciprocals of random variables to the Cauchy distribution, Houston J. Math. 4 (1978), no. 3, 439–445.
[19] C. Stein, A bound for the error in the normal approximation to the distribution of a sum of dependent random variables, Proceedings of the Sixth Berkeley Symposium on Mathematical Statistics and Probability (Univ. California, Berkeley, Calif., 1970/1971), Vol. II: Probability Theory, Univ. California Press, California, 1972, pp. 583–602.
[20] C. Stein, Approximate Computation of Expectations, Institute of Mathematical Statistics Lecture Notes—Monograph Series, vol. 7, Institute of Mathematical Statistics, California, 1986.
K. Neammanee: Department of Mathematics, Faculty of Science, Chulalongkorn University, Bangkok 10330, Thailand
E-mail address: kritsana.n@chula.ac.th