Final – Math 5080 – Spring 2004
Name:
Instructions. READ CAREFULLY.:
(i) The work you turn in must be your own. You may not discuss the final with anyone,
either in the class or outside the class. [You may of course consult with me for
clarification of any of the problems.] Failure to follow this policy will be considered
cheating and will result in a course grade of E.
(ii) You may consult the textbook and your notes. In particular, feel free to use any of the
information in the tables of distributions in Appendix B, for example, the moment
generating functions for specific distributions. You can use a general mathematical
reference, for example a calculus text or a table of integrals. You may not use any
statistical textbook or written source material concerning the specific subject matter
of the course. Failure to follow this policy will be considered cheating and will result
in a course grade of E.
(iii) Your final must be clearly written and legible. I will not grade problems which
are sloppily presented; such problems will receive a grade of 0. If you
are unable to write legibly and clearly, use a word processor. You have at least 7
days to complete the final; budget time for writing up your solutions.
(iv) Think about your exposition. Someone (me) has to read what you have written. Your
answer is only correct if I can understand what you have done. Style matters.
(v) Finals are due at 6 PM on Wednesday May 5, 2004. Late finals will not be
accepted, except for reasons of death or serious illness. It is highly recommended
that you turn in your exam to me personally. I will be in my office (LCB 209) at 6
PM on May 5, 2004 to accept exams, and will check my box in the math department
at that time. I cannot guarantee that exams left in my box will be received.
Sign here to indicate you have read and understand these instructions.
Problem 1. For each $n$, let $X_{n,1}, X_{n,2}, \ldots, X_{n,n}$ be independent Bernoulli random variables.
That is,
$$X_{n,i} = \begin{cases} 1 & \text{with probability } p_{n,i}, \\ 0 & \text{with probability } 1 - p_{n,i}. \end{cases}$$
Note that the $p_{n,i}$ are not assumed to be identical. Suppose that
$$\lim_{n\to\infty} \sum_{i=1}^{n} p_{n,i} = \mu, \qquad \text{and} \qquad \lim_{n\to\infty} \sum_{i=1}^{n} p_{n,i}^{2} = 0. \tag{1}$$
Find a random variable $Y$ so that $S_n = \sum_{i=1}^{n} X_{n,i}$ converges in distribution to $Y$. Prove
your answer.
Solution. Let $M_{n,i}(t)$ be the mgf for $X_{n,i}$:
$$M_{n,i}(t) = E\left[e^{tX_{n,i}}\right] = e^t p_{n,i} + (1 - p_{n,i}) = 1 + p_{n,i}\left(e^t - 1\right).$$
If $M_n(t)$ is the mgf for $S_n$, we have
$$M_n(t) = \prod_{i=1}^{n} M_{n,i}(t) = \prod_{i=1}^{n} \left[1 + p_{n,i}\left(e^t - 1\right)\right],$$
so
$$\log M_n(t) = \sum_{i=1}^{n} \log\left[1 + p_{n,i}\left(e^t - 1\right)\right].$$
Now $\log(1 + x) = x + \varepsilon(x)$, where $|\varepsilon(x)| \le x^2$. Thus
$$\log M_n(t) = \sum_{i=1}^{n} \left[ p_{n,i}\left(e^t - 1\right) + \varepsilon\left(p_{n,i}\left(e^t - 1\right)\right) \right]
= \left(e^t - 1\right)\sum_{i=1}^{n} p_{n,i} + \sum_{i=1}^{n} \varepsilon\left(p_{n,i}\left(e^t - 1\right)\right).$$
Now,
$$\left| \sum_{i=1}^{n} \varepsilon\left(p_{n,i}\left(e^t - 1\right)\right) \right|
\le \sum_{i=1}^{n} \left| \varepsilon\left(p_{n,i}\left(e^t - 1\right)\right) \right|
\le \sum_{i=1}^{n} p_{n,i}^{2}\left(e^t - 1\right)^{2}
= \left(e^t - 1\right)^{2} \sum_{i=1}^{n} p_{n,i}^{2}.$$
We conclude from the hypotheses (1) that
$$\log M_n(t) \to \mu\left(e^t - 1\right),$$
so $M_n(t) \to e^{\mu(e^t - 1)}$, which is the mgf of a Poisson random variable with mean $\mu$.
Since the mgfs converge, we conclude that $S_n$ converges in distribution to a Poisson random variable with
mean $\mu$.
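
As an informal numerical check (not part of the required proof), the following sketch simulates $S_n$ for one illustrative triangular array satisfying (1), namely $p_{n,i} = \mu/n$; the parameter values and variable names are my own choices.

    # Monte Carlo sketch: S_n should look approximately Poisson(mu) for large n.
    # Assumes the illustrative choice p_{n,i} = mu / n, which satisfies (1).
    import numpy as np
    from math import exp, factorial

    rng = np.random.default_rng(0)
    mu, n, reps = 3.0, 500, 100_000
    p = np.full(n, mu / n)                       # p_{n,i}, i = 1..n
    S = (rng.random((reps, n)) < p).sum(axis=1)  # reps independent draws of S_n

    # Compare the empirical pmf of S_n with the Poisson(mu) pmf for a few values of k.
    for k in range(8):
        empirical = np.mean(S == k)
        poisson = exp(-mu) * mu**k / factorial(k)
        print(k, round(empirical, 4), round(poisson, 4))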
Problem 2. Let $X_1, X_2, \ldots, X_n$ be a sequence of i.i.d. random variables with density
$$f(x; \theta) = \begin{cases} 2\theta^{-2} x & \text{if } 0 \le x \le \theta, \\ 0 & \text{otherwise.} \end{cases}$$
(i) Find the MLE for $\theta$.
(ii) Compute the bias of the MLE for $\theta$.
Solution. The likelihood function is
$$L(\theta; \mathbf{x}) = \prod_{i=1}^{n} \frac{2x_i}{\theta^2}\, \mathbf{1}\{0 \le x_i \le \theta\}
= \frac{2^n \prod_{i=1}^{n} x_i}{\theta^{2n}}\, \mathbf{1}\left\{x_{(n)} \le \theta\right\}.$$
As a function of $\theta$ this is zero for $\theta < x_{(n)}$ and decreasing on $[x_{(n)}, \infty)$, so it is largest at $\theta = x_{(n)}$, and the MLE is $\hat{\theta} = X_{(n)}$.
We need to find the distribution of $X_{(n)}$: for $0 \le x \le \theta$,
$$P\left(X_{(n)} \le x\right) = F(x)^n = \frac{x^{2n}}{\theta^{2n}},$$
and so
$$f_{X_{(n)}}(x) = \frac{2n x^{2n-1}}{\theta^{2n}}$$
for $x \in [0, \theta]$. Then
$$E\left[X_{(n)}\right] = \int_{0}^{\theta} x\, \frac{2n x^{2n-1}}{\theta^{2n}}\, dx
= \left. \frac{2n x^{2n+1}}{(2n+1)\theta^{2n}} \right|_{0}^{\theta}
= \frac{2n}{2n+1}\,\theta.$$
Thus the bias is
$$E\left[X_{(n)}\right] - \theta = \frac{2n}{2n+1}\,\theta - \theta = -\frac{\theta}{2n+1},$$
so on average the MLE underestimates $\theta$ by $\theta/(2n+1)$.
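
A quick simulation sketch of this bias (illustrative only; the values of $\theta$ and $n$ are my choices). Since $F(x) = x^2/\theta^2$, a draw can be generated as $\theta\sqrt{U}$ with $U$ uniform.

    # Simulate the bias of the MLE theta_hat = X_(n) for f(x; theta) = 2x/theta^2 on [0, theta].
    # Inverse-CDF sampling: F(x) = x^2 / theta^2, so X = theta * sqrt(U) with U ~ Uniform(0, 1).
    import numpy as np

    rng = np.random.default_rng(1)
    theta, n, reps = 2.0, 10, 200_000
    X = theta * np.sqrt(rng.random((reps, n)))
    theta_hat = X.max(axis=1)

    print("simulated E[theta_hat] - theta:", theta_hat.mean() - theta)
    print("theory: -theta/(2n+1) =", -theta / (2 * n + 1))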
Problem 3. Let $X$ and $Y$ be two independent random variables with respective density
functions
$$f(x) = \begin{cases} \tfrac{1}{2} & \text{if } 0 \le x \le 2, \\ 0 & \text{otherwise,} \end{cases}
\qquad
g(x) = \begin{cases} e^{-x} & \text{if } 0 \le x < \infty, \\ 0 & \text{otherwise.} \end{cases}$$
Compute the density of $X + Y$.
Solution. Let $S = X + Y$. Using the convolution formula, for $s \ge 0$, and letting $a \wedge b = \min\{a, b\}$,
$$\begin{aligned}
f_S(s) &= \int_{-\infty}^{\infty} f_X(x) f_Y(s - x)\, dx \\
&= \int_{-\infty}^{\infty} \frac{1}{2}\, \mathbf{1}\{0 \le x \le 2\}\, e^{-(s-x)}\, \mathbf{1}\{0 \le s - x\}\, dx \\
&= \int_{0}^{2 \wedge s} \frac{1}{2} e^{-(s-x)}\, dx \\
&= \left. \frac{1}{2} e^{-(s-x)} \right|_{0}^{2 \wedge s} \\
&= \frac{1}{2} e^{-(s - 2 \wedge s)} - \frac{1}{2} e^{-s}.
\end{aligned}$$
In other words,
$$f_S(s) = \begin{cases} \frac{1}{2}\left(1 - e^{-s}\right) & \text{if } 0 \le s \le 2, \\ \frac{1}{2}\left(e^{-(s-2)} - e^{-s}\right) & \text{if } 2 \le s < \infty, \\ 0 & \text{if } s < 0. \end{cases}$$
Check that it integrates to one:
$$\begin{aligned}
\int_{0}^{\infty} f_S(s)\, ds &= \int_{0}^{2} \frac{1 - e^{-s}}{2}\, ds + \int_{2}^{\infty} \frac{e^{-(s-2)} - e^{-s}}{2}\, ds \\
&= \left. \frac{s + e^{-s}}{2} \right|_{0}^{2} + \left. \frac{-e^{-(s-2)} + e^{-s}}{2} \right|_{2}^{\infty} \\
&= \frac{1 + e^{-2}}{2} + \frac{1 - e^{-2}}{2} \\
&= 1.
\end{aligned}$$
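
As a further sanity check (not part of the exam solution), the derived density can be compared with a histogram of simulated values of $X + Y$; the sample size and binning below are arbitrary choices of mine.

    # Compare the derived density of S = X + Y with a Monte Carlo histogram.
    # X ~ Uniform(0, 2) (density 1/2 on [0, 2]), Y ~ Exponential(1).
    import numpy as np

    def f_S(s):
        s = np.asarray(s, dtype=float)
        out = np.where(s <= 2, 0.5 * (1 - np.exp(-s)),
                       0.5 * (np.exp(-(s - 2)) - np.exp(-s)))
        return np.where(s < 0, 0.0, out)

    rng = np.random.default_rng(2)
    S = rng.uniform(0, 2, 500_000) + rng.exponential(1.0, 500_000)
    hist, edges = np.histogram(S, bins=40, range=(0, 8), density=True)
    mids = 0.5 * (edges[:-1] + edges[1:])
    print(np.max(np.abs(hist - f_S(mids))))   # should be small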
Problem 4. Let $X_1$ and $X_2$ be independent random variables with the density function
$$f(x) = \begin{cases} e^{-x} & \text{if } 0 \le x < \infty, \\ 0 & \text{otherwise.} \end{cases}$$
Compute the joint density function of $Y_1 = X_1 + X_2$ and $Y_2 = 2X_1 - X_2$.
Solution. Let $g(x_1, x_2) = (x_1 + x_2,\; 2x_1 - x_2)$. We have
$$\begin{bmatrix} y_1 \\ y_2 \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 2 & -1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix},$$
and since
$$\begin{bmatrix} 1 & 1 \\ 2 & -1 \end{bmatrix}^{-1}
= -\frac{1}{3}\begin{bmatrix} -1 & -1 \\ -2 & 1 \end{bmatrix}
= \begin{bmatrix} \tfrac{1}{3} & \tfrac{1}{3} \\ \tfrac{2}{3} & -\tfrac{1}{3} \end{bmatrix},$$
we have $g^{-1}(y_1, y_2) = \left( \frac{y_1}{3} + \frac{y_2}{3},\; \frac{2y_1}{3} - \frac{y_2}{3} \right)$, and
$$Dg^{-1}(y_1, y_2) = \begin{bmatrix} \tfrac{1}{3} & \tfrac{1}{3} \\ \tfrac{2}{3} & -\tfrac{1}{3} \end{bmatrix},
\qquad
\left| J_{g^{-1}}(y_1, y_2) \right| = \left| -\frac{1}{9} - \frac{2}{9} \right| = \frac{1}{3}.$$
Thus
$$\begin{aligned}
f_{Y_1, Y_2}(y_1, y_2) &= f\left(g_1^{-1}(y_1, y_2)\right) f\left(g_2^{-1}(y_1, y_2)\right) \left| J_{g^{-1}}(y_1, y_2) \right| \\
&= e^{-\frac{y_1}{3} - \frac{y_2}{3}}\, \mathbf{1}\{y_1 + y_2 \ge 0\}\; e^{-\frac{2y_1}{3} + \frac{y_2}{3}}\, \mathbf{1}\{2y_1 - y_2 \ge 0\} \cdot \frac{1}{3} \\
&= \frac{1}{3} e^{-y_1}\, \mathbf{1}\{-y_1 \le y_2 \le 2y_1\}.
\end{aligned}$$
Note that if you don't indicate the region where the density is positive, your answer is wrong.
In particular, it is not the case that $f_{Y_1, Y_2}(y_1, y_2) = \frac{1}{3} e^{-y_1}$ everywhere, as that does not integrate to one
(it does not even integrate to a finite number!)
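
A brief numerical cross-check (illustrative; the grid limits are my choice) that the derived joint density integrates to one over the region $\{-y_1 \le y_2 \le 2y_1\}$:

    # Numerically integrate f(y1, y2) = (1/3) e^{-y1} over -y1 <= y2 <= 2*y1, y1 >= 0.
    # For fixed y1 the inner integral over y2 has length 3*y1, giving y1 * e^{-y1},
    # which integrates to 1 over [0, infinity); the grid below just confirms this.
    import numpy as np

    y1 = np.linspace(0.0, 40.0, 400_000)
    dy = y1[1] - y1[0]
    inner = (1.0 / 3.0) * np.exp(-y1) * (3 * y1)   # integral over y2 for fixed y1
    print(np.sum(inner) * dy)                      # approximately 1.0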
Problem 5. Let $X_1, X_2, \ldots, X_6$ be independent random variables. We assume that $X_1$ and
$X_2$ are Normal($\mu = 0$, $\sigma^2 = 2$), while $X_3, X_4, X_5, X_6$ are Normal($\mu = 0$, $\sigma^2 = 4$). Determine
$c$ such that
$$P\left( \frac{X_1 + X_2}{\sqrt{X_3^2 + \cdots + X_6^2}} \le c \right) = 0.9.$$
Solution. Notice that $\frac{X_1 + X_2}{2}$ is a Normal(0, 1) random variable. Also $X_i^2/4$ is $\chi^2_1$ for $i = 3, \ldots, 6$, so $\sum_{i=3}^{6} X_i^2 / 4$ is $\chi^2_4$. Thus
$$\frac{(X_1 + X_2)/2}{\sqrt{\left(\sum_{i=3}^{6} X_i^2 / 4\right)/4}} = 2\, \frac{X_1 + X_2}{\sqrt{\sum_{i=3}^{6} X_i^2}}$$
has a $t_4$ distribution. The 90th percentile of the $t_4$ distribution is 1.533. Thus, since
$$P\left( \frac{X_1 + X_2}{\sqrt{X_3^2 + \cdots + X_6^2}} \le c \right)
= P\left( 2\, \frac{X_1 + X_2}{\sqrt{X_3^2 + \cdots + X_6^2}} \le 2c \right)
= P\left( t_4 \le 2c \right),$$
setting $2c = 1.533$ makes the probability equal to 0.9. Thus $c = 0.7665$.
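
An informal numerical check of this value (not required for the exam); it assumes SciPy is available, and the simulation size is my choice.

    # Check c two ways: via the t_4 quantile and via direct simulation.
    import numpy as np
    from scipy.stats import t

    c = t.ppf(0.9, df=4) / 2           # 90th percentile of t_4, divided by 2
    print(c)                           # approximately 0.7665

    rng = np.random.default_rng(3)
    reps = 1_000_000
    num = rng.normal(0, np.sqrt(2), (reps, 2)).sum(axis=1)          # X1 + X2, each Var = 2
    den = np.sqrt((rng.normal(0, 2, (reps, 4)) ** 2).sum(axis=1))   # sqrt(X3^2 + ... + X6^2), each Var = 4
    print(np.mean(num / den <= c))     # approximately 0.9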
Problem 6. Let $X_1, X_2, \ldots, X_{120}$ be independent and identically distributed random variables with density function
$$f(x) = \begin{cases} 3x^2 & \text{if } 0 \le x \le 1, \\ 0 & \text{otherwise.} \end{cases}$$
Get an approximate value of $c$ so that
$$P\left(X_1 + \cdots + X_{120} \le c\right) = 0.99.$$
Solution. We have
$$E(X_1) = \int_{0}^{1} x \cdot 3x^2\, dx = \left. \frac{3}{4} x^4 \right|_{0}^{1} = \frac{3}{4},$$
and
$$E\left(X_1^2\right) = \int_{0}^{1} x^2 \cdot 3x^2\, dx = \left. \frac{3}{5} x^5 \right|_{0}^{1} = \frac{3}{5}.$$
Thus
$$\mathrm{Var}(X_1) = \frac{3}{5} - \frac{9}{16} = \frac{3}{80}.$$
Let $S_n = \sum_{i=1}^{120} X_i$. Then
$$E(S_n) = 120 \cdot \frac{3}{4} = 90, \qquad \mathrm{SD}(S_n) = \sqrt{120 \cdot \frac{3}{80}} = \frac{3}{\sqrt{2}}.$$
Thus, by the Central Limit Theorem,
$$P\left(X_1 + \cdots + X_{120} \le c\right) = P\left( \frac{S_n - 90}{3/\sqrt{2}} \le \frac{c - 90}{3/\sqrt{2}} \right) \approx \Phi(z),$$
where $z = \frac{c - 90}{3/\sqrt{2}}$. The 99th percentile of the standard normal distribution is 2.326. Thus
solving
$$2.326 = \frac{c - 90}{3/\sqrt{2}}$$
for c gives c = 94.9342.
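
A short simulation sketch of this normal approximation (the number of replications is my choice); each $X_i$ has cdf $x^3$ on $[0, 1]$, so it can be generated as $U^{1/3}$ with $U$ uniform.

    # Check that P(X_1 + ... + X_120 <= 94.9342) is about 0.99.
    # Each X_i has cdf F(x) = x^3 on [0, 1], so X_i = U^(1/3) with U ~ Uniform(0, 1).
    import numpy as np

    rng = np.random.default_rng(4)
    reps = 100_000
    S = (rng.random((reps, 120)) ** (1.0 / 3.0)).sum(axis=1)
    print(np.mean(S <= 94.9342))   # approximately 0.99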
Problem 7. Let $X_1, X_2, \ldots, X_n$ be i.i.d. Geometric($p$) random variables:
$$P(X_i = x; p) = \begin{cases} p(1-p)^{x-1} & \text{if } x = 1, 2, \ldots, \\ 0 & \text{otherwise.} \end{cases}$$
(i) Find the MLE of $p$.
(ii) Find the UMVU estimator of $1/p$. [Justify your answer.]
Solution. We have
$$f(\mathbf{x}; p) = \prod_{i=1}^{n} p(1-p)^{x_i - 1} = p^n (1-p)^{\sum x_i - n}
= \exp\left( n \log p + \left(\sum x_i - n\right) \log(1-p) \right)
= \exp\left( \log(1-p) \sum x_i + n \log\frac{p}{1-p} \right).$$
Thus this is a regular exponential class, with sufficient statistic $\sum X_i$. It follows that $\sum X_i$
is a complete minimal sufficient statistic.
We have
$$\ell(p) = \log(1-p) \sum x_i + n \log\frac{p}{1-p},
\qquad
\frac{\partial \ell}{\partial p} = -\frac{\sum x_i}{1-p} + \frac{n}{p(1-p)}.$$
Setting $\frac{\partial \ell}{\partial p} = 0$ and solving for $p$ gives
$$0 = -p \sum x_i + n, \qquad p = \frac{n}{\sum x_i} = \frac{1}{\bar{x}}.$$
It can be checked that this is indeed a global maximum. Thus $\hat{p} = 1/\bar{X}$. By the invariance
of the MLE, we have $\widehat{1/p} = \bar{X}$.
It is clear that $E[\bar{X}] = 1/p$, so $\bar{X}$ is an unbiased estimator of $1/p$.
Since $\bar{X}$ is a function of the complete minimal sufficient statistic, it is UMVU (by the
Lehmann–Scheffé Theorem).
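
A quick simulation sketch (the values of $p$, $n$, and the number of replications are my choices) showing that $\bar{X}$ is unbiased for $1/p$, while the MLE $1/\bar{X}$ of $p$ is slightly biased in small samples:

    # Geometric(p) on {1, 2, ...}: check E[X_bar] = 1/p and look at the MLE 1/X_bar of p.
    import numpy as np

    rng = np.random.default_rng(5)
    p, n, reps = 0.3, 20, 200_000
    X = rng.geometric(p, size=(reps, n))    # numpy's geometric is supported on {1, 2, ...}
    xbar = X.mean(axis=1)

    print("E[X_bar]   ~", xbar.mean(), "  target 1/p =", 1 / p)
    print("E[1/X_bar] ~", (1 / xbar).mean(), "  target p  =", p)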
Problem 8. Let $(X_1, Y_1), (X_2, Y_2), \ldots, (X_n, Y_n)$ be i.i.d. random vectors with the following
density:
$$f(x, y; \rho) = \begin{cases} \frac{1}{\pi \rho^2} & \text{if } (x, y) \text{ is in the disc centered at } 0 \text{ with radius } \rho, \\ 0 & \text{otherwise.} \end{cases}$$
Find the MLE of $\rho$.
Solution. The likelihood is
$$L(\rho) = \prod_{i=1}^{n} \frac{1}{\pi \rho^2}\, \mathbf{1}\left\{ \sqrt{X_i^2 + Y_i^2} \le \rho \right\}
= \frac{1}{\pi^n \rho^{2n}}\, \mathbf{1}\left\{ \max_i \sqrt{X_i^2 + Y_i^2} \le \rho \right\}.$$
Since $\rho^{-2n}$ is decreasing in $\rho$, the likelihood is largest at the smallest admissible value of $\rho$. Thus the MLE is
$$\hat{\rho} = \max_i \sqrt{X_i^2 + Y_i^2}.$$
Note that
$$\max_i \sqrt{X_i^2 + Y_i^2} \ne \sqrt{X_{(n)}^2 + Y_{(n)}^2}.$$
[Try the two points $(2, 1)$ and $(1, 2)$: in this case $x_{(2)} = y_{(2)} = 2$, so $\sqrt{x_{(2)}^2 + y_{(2)}^2} = \sqrt{8}$,
while $\max_i \sqrt{x_i^2 + y_i^2} = \sqrt{5}$.]
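
A small simulation sketch of the MLE $\hat{\rho}$ (the values of $\rho$ and $n$ are my choices), generating uniform points in the disc by rejection sampling:

    # Simulate the MLE rho_hat = max_i sqrt(X_i^2 + Y_i^2) for uniform points in a disc of radius rho.
    # Points are drawn by rejection sampling from the enclosing square.
    import numpy as np

    rng = np.random.default_rng(6)
    rho, n = 3.0, 200
    pts = []
    while len(pts) < n:
        x, y = rng.uniform(-rho, rho, 2)
        if x * x + y * y <= rho * rho:
            pts.append((x, y))
    pts = np.array(pts)

    rho_hat = np.sqrt((pts ** 2).sum(axis=1)).max()
    print(rho_hat)   # slightly below rho = 3.0, and approaches rho as n grows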