If, instead, X is a continuous random variable with probability density function fX(x; θ), then the likelihood function is given by
$$L(x_1, x_2, \ldots, x_n; \theta) = \prod_{i=1}^{n} f_X(x_i; \theta).$$
Maximum likelihood estimators are obtained by finding that value of θ that maximizes L for a given set of observations x1, x2, ..., xn. Since the value of θ that does this will usually vary with x1, x2, ..., xn, θ can be thought of as a function of x1, x2, ..., xn, namely θ̂(x1, x2, ..., xn).
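As a small worked illustration (not an example from the text; the Rayleigh density is used here only because it does not appear in the exercises below), suppose each observation comes from
$$f_X(x; \theta) = \frac{x}{\theta}\, e^{-x^2/(2\theta)}, \qquad x > 0,\ \theta > 0.$$
Then
$$\ln L(x_1, \ldots, x_n; \theta) = \sum_{i=1}^{n} \ln x_i - n \ln \theta - \frac{1}{2\theta} \sum_{i=1}^{n} x_i^2,$$
and setting $\partial \ln L / \partial \theta = -\frac{n}{\theta} + \frac{1}{2\theta^2} \sum_{i=1}^{n} x_i^2 = 0$ gives
$$\hat{\theta}(x_1, \ldots, x_n) = \frac{1}{2n} \sum_{i=1}^{n} x_i^2.$$
(Maximizing ln L rather than L itself is equivalent, since the logarithm is strictly increasing.)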
To evaluate the properties of θ̂, we can look at its performance prior to actually making the observations x1, x2, ..., xn. That is, we can substitute Xi for xi in the specification for θ̂ and look at its properties as the statistic
θ̂(X1, X2, ..., Xn).
For example, one of the properties that we might like to check is whether θ̂ is an unbiased estimator for θ (i.e., check to see whether E(θ̂) = θ).
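As a sketch of how such a check might be carried out numerically (this simulation is an illustration added here, not part of the text; the Rayleigh setup and all parameter values are assumptions chosen for the demo), one can average θ̂ over many simulated samples and compare the result with the true θ:

    # Monte Carlo check of unbiasedness for the hypothetical Rayleigh example above:
    # theta_hat = sum(x_i^2) / (2n) for f_X(x; theta) = (x/theta) * exp(-x^2 / (2*theta)).
    import numpy as np

    rng = np.random.default_rng(0)
    theta = 4.0              # true parameter value, chosen arbitrarily for the demo
    n = 25                   # sample size
    replications = 100_000   # number of simulated samples

    # NumPy's Rayleigh generator uses a scale parameter sigma, where theta = sigma**2.
    samples = rng.rayleigh(scale=np.sqrt(theta), size=(replications, n))
    theta_hat = (samples ** 2).sum(axis=1) / (2 * n)   # estimator applied to each sample

    print("true theta:          ", theta)
    print("average of theta_hat:", theta_hat.mean())   # close to theta, consistent with E(theta_hat) = theta

The average settles near the true value, which is what E(θ̂) = θ predicts; of course, a simulation of this kind suggests but does not prove unbiasedness.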
Self-Test Exercises for Chapter 8
For the following multiple-choice question, choose the best response among those
provided. The answer can be found in Appendix B.
S8.1 Suppose that X1, X2, ..., Xn are independent, identically distributed random variables, each with marginal probability density function
$$f_{X_i}(x) = \frac{1}{\sigma \sqrt{2\pi}}\, e^{-\frac{1}{2} \left( \frac{x - \mu}{\sigma} \right)^2}$$
for −∞ < x < +∞, where σ > 0. Then an unbiased estimator for µ is
(A) $(X_1)(X_2) \cdots (X_n)$
(B) $(X_1 + X_2)^2 / 2$
(C) $\frac{1}{n} \sum_{i=1}^{n} X_i$
(D) $\sigma$
(E) none of the above.
Questions for Chapter 8
8.1 Let X be a random variable with a binomial distribution, i.e.,
$$p_X(k; \theta) = \binom{n}{k}\, \theta^k (1 - \theta)^{n-k}$$
for k = 0, 1, ..., n.
Let X1 be a random sample of size 1 from X.
(a) Show that θ̂ = X1/n is an unbiased estimator for θ.
(b) Show that θ̂ = X1/n is the maximum likelihood estimator for θ.
8.2 Let Y be an estimator for θ based on the random sample X1, X2, ..., Xn.
Suppose that E(Xi) = θ and $Y = \sum_{i=1}^{n} a_i X_i$, where a1, a2, ..., an are constants. What constraint must be placed on a1, a2, ..., an in order for Y to be an unbiased estimator for θ?
8.3 The life of a light bulb is a random variable X which has probability density
function
$$f_X(x; \theta) = \begin{cases} \dfrac{1}{\theta}\, e^{-x/\theta}, & x > 0 \\[4pt] 0, & \text{otherwise.} \end{cases}$$
Let X1, X2, ..., Xn be a random sample from X.
(a) Find an estimator for θ using the method of maximum likelihood.
(b) Is the estimator from part (a) unbiased? Justify your answer.
(c) Find the maximum likelihood estimator for η = 1/θ.
8.4 The number of typographical errors, X, on a page of text has a Poisson distribution with parameter λ, i.e.,
$$p_X(k; \lambda) = \frac{\lambda^k e^{-\lambda}}{k!}$$
for k = 0, 1, 2, ....
A random sample of n pages is observed.
(a) Find an estimator for λ using the method of maximum likelihood.
(b) Is the estimator from part (a) unbiased? Justify your answer.
8.5 Given the random sample X1, X2, ..., Xn, consider the statistic $d_X^2$ formed by averaging the squared differences of all possible pairings of {Xi, Xj}. There are $\binom{n}{2}$ such pairs. The statistic can be represented as
$$d_X^2 \equiv \frac{1}{\binom{n}{2}} \sum_{i > j} (X_i - X_j)^2.$$
Prove that $d_X^2 = 2 s_X^2$.
ANSWERS TO SELF-TEST EXERCISES
Chapter 8
S8.1 C