Stat 643 Final Exam

December 16, 1996
Prof. Vardeman
1. Let f(x|μ) be the density of the (right) truncated N(μ, 1) distribution with known
truncation point 10. That is, let

   f(x|μ) = φ(x − μ)/Φ(10 − μ)   for x ≤ 10
          = 0                    for x > 10 .

Suppose initially that one has a single observation from this distribution, X.
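For concreteness, a quick numerical sketch of this density (Python, standard library only; the integration grid and window are arbitrary choices) with a check that it integrates to 1:

```python
import math

TRUNC = 10.0  # known (right) truncation point

def phi(z):
    # standard normal density
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z):
    # standard normal cdf via the complementary error function
    # (erfc stays accurate far into the lower tail)
    return 0.5 * math.erfc(-z / math.sqrt(2))

def f(x, mu):
    # density of the N(mu, 1) distribution right-truncated at 10
    return phi(x - mu) / Phi(TRUNC - mu) if x <= TRUNC else 0.0

# crude Riemann-sum check that f(., mu) integrates to 1
h = 0.001
totals = {mu: sum(f(mu - 12 + k * h, mu) * h for k in range(int(24 / h)))
          for mu in (5.0, 10.0)}
print(totals)
```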
a) Find the best size α = .10 test of H0: μ = 10 vs H1: μ = 5.
b) Find a UMP size α = .10 test of H0: μ ≤ 10 vs H1: μ > 10 and argue carefully that
your test is indeed UMP.
c) Set up a pair of equations that you would need to solve in order to produce a UMPU
size α = .10 test of H0: μ = 10 vs H1: μ ≠ 10.
d) Find the Fisher information contained in a single observation from this distribution.
Now suppose that X1, X2, ..., Xn are iid f(·|μ).
e) Identify a complete sufficient statistic for μ and say carefully why you know that it is
so.
f) As explicitly as possible, give an equation that must be solved to find the MLE of μ.
Henceforth let μ̂_n stand for the MLE of μ.
g) Give as explicitly as possible two different large sample approximate 90% confidence
sets for μ.
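As a check on by-hand work for part f), a numerical sketch (Python; the sample mean and n below are made up) that solves the likelihood equation, obtained here by differentiating the log likelihood Σ log f(xi|μ) in μ, via bisection:

```python
import math

TRUNC = 10.0

def phi(z):
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z):
    # erfc form avoids underflow deep in the tail
    return 0.5 * math.erfc(-z / math.sqrt(2))

def score(mu, xbar, n):
    # derivative in mu of the truncated-normal log likelihood:
    # sum(x_i - mu) + n * phi(10 - mu) / Phi(10 - mu)
    return n * (xbar - mu) + n * phi(TRUNC - mu) / Phi(TRUNC - mu)

def mle(xbar, n, lo=-10.0, hi=20.0):
    # the score is decreasing in mu, so bisect for its root
    for _ in range(200):
        mid = (lo + hi) / 2
        if score(mid, xbar, n) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

mu_hat = mle(xbar=8.5, n=25)
print(round(mu_hat, 4))
```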
2. Consider the two discrete distributions on {1, 2, 3} indicated below.

          x = 1   x = 2   x = 3
   P1      .1      .5      .4
   P2      .4      .5      .1
For a parameter θ = (θ1, θ2) in the parameter space Θ = {1, 2} × {1, 2}, suppose that
X = (X1, X2) has independent coordinates, Xi ~ P_θi. With action space A = {0, 1}
consider the decision problem with loss

   L((θ1, θ2), a) = I[θ1 = θ2] I[a = 1] + I[θ1 ≠ θ2] I[a = 0] .

(You might think of this in terms of a test of the hypothesis that θ1 = θ2.)
a) Consider the decision rule δ(x) ≡ I[x1 ≠ x2]. Evaluate R((1, 1), δ) and R((1, 2), δ).
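Risk computations of this kind can be checked by brute-force enumeration. A sketch (Python; not a substitute for the by-hand evaluation the problem asks for), using R(θ, δ) = E_θ L(θ, δ(X)):

```python
# the two distributions from the table above: P[theta][x]
P = {1: {1: 0.1, 2: 0.5, 3: 0.4},
     2: {1: 0.4, 2: 0.5, 3: 0.1}}

def loss(theta, a):
    # L((t1, t2), a) = I[t1 = t2] I[a = 1] + I[t1 != t2] I[a = 0]
    t1, t2 = theta
    return float(t1 == t2) if a == 1 else float(t1 != t2)

def delta(x1, x2):
    # the rule of part a)
    return 1 if x1 != x2 else 0

def risk(theta):
    # expected loss under independent X1 ~ P[theta1], X2 ~ P[theta2]
    t1, t2 = theta
    return sum(P[t1][x1] * P[t2][x2] * loss(theta, delta(x1, x2))
               for x1 in (1, 2, 3) for x2 in (1, 2, 3))

print(risk((1, 1)), risk((1, 2)))
```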
b) Suppose that one uses a prior distribution G for θ = (θ1, θ2) of the form indicated in
the table below.

            θ1 = 1   θ1 = 2
   θ2 = 2     .2       .3
   θ2 = 1     .3       .2
Find R(G, δ) for the rule of part a).
c) For the prior distribution of part b), find a Bayes action for the realization X = (3, 3).
d) Argue carefully that the Bayes rule against the prior in b) is admissible in this decision
problem.
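Similarly, the Bayes risk in part b) and the Bayes action in part c) (reading the realization there as X = (3, 3)) can be checked by direct enumeration. A numerical sketch only:

```python
P = {1: {1: 0.1, 2: 0.5, 3: 0.4},
     2: {1: 0.4, 2: 0.5, 3: 0.1}}
# prior G from the table in part b): G[(theta1, theta2)]
G = {(1, 1): 0.3, (2, 1): 0.2, (1, 2): 0.2, (2, 2): 0.3}

def loss(theta, a):
    t1, t2 = theta
    return float(t1 == t2) if a == 1 else float(t1 != t2)

def delta(x1, x2):
    return 1 if x1 != x2 else 0

def risk(theta):
    t1, t2 = theta
    return sum(P[t1][x1] * P[t2][x2] * loss(theta, delta(x1, x2))
               for x1 in (1, 2, 3) for x2 in (1, 2, 3))

# Bayes risk of the part a) rule: average risk under G
bayes_risk = sum(G[t] * risk(t) for t in G)

# posterior over theta given X = (3, 3), then posterior expected loss of each action
post = {t: G[t] * P[t[0]][3] * P[t[1]][3] for t in G}
Z = sum(post.values())
exp_loss = {a: sum(post[t] * loss(t, a) for t in G) / Z for a in (0, 1)}
best_action = min(exp_loss, key=exp_loss.get)
print(bayes_risk, best_action)
```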
3. Consider the following model. Given parameters λ1, ..., λN, variables X1, ..., XN are
independent Poisson variables, Xi ~ Poisson(λi). M is a parameter taking values in
{1, 2, ..., N} and if i ≤ M, λi = μ1, while if i > M, λi = μ2. (M is the number of
Poisson means that are the same as that of the first observation.) With parameter vector
θ = (M, μ1, μ2) belonging to Θ = {1, 2, ..., N} × (0, ∞) × (0, ∞) we wish to make
inference about M based on X1, ..., XN in a Bayesian framework.
As matters of notational convenience, let S_m = X1 + ··· + Xm and T = S_N.
a) If, for example, a prior distribution G on Θ is constructed by taking M uniform on
{1, 2, ..., N} independent of μ1 exponential with mean 1, independent of μ2 exponential
with mean 1, it is possible to explicitly find the (marginal) posterior of M given that
X1 = x1, ..., XN = xN. Don't actually bother to finish the somewhat messy calculations
needed to do this, but show that this is possible (indicate clearly why appropriate integrals
can be evaluated explicitly).
b) Suppose now that G is constructed by taking M uniform on {1, 2, ..., N} independent
of (μ1, μ2) with joint density g(·, ·) wrt Lebesgue measure on (0, ∞) × (0, ∞). Describe
in as much detail as possible a SSS based method for approximating the posterior of M,
given X1 = x1, ..., XN = xN. (Give the necessary conditionals up to multiplicative
constants, say how you're going to use them and what you'll do with any vectors you
produce by simulation.)
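Reading SSS as successive substitution (Gibbs) sampling, the kind of scheme part b) asks for might look as follows when specialized, purely for illustration, to the conjugate exp(1) priors of part a); the data vector, N = 10, and the number of sweeps are all made-up choices:

```python
import math, random

random.seed(0)
N = 10
x = [7, 6, 8, 7, 9, 2, 1, 3, 2, 1]   # made-up data with an apparent change after i = 5
S = [0] * (N + 1)                     # S[m] = x_1 + ... + x_m
for i in range(1, N + 1):
    S[i] = S[i - 1] + x[i - 1]
T = S[N]

def draw_M(mu1, mu2):
    # discrete full conditional of M, computed on the log scale for stability:
    # proportional to mu1^S_m e^(-m mu1) mu2^(T - S_m) e^(-(N - m) mu2)
    logw = [S[m] * math.log(mu1) - m * mu1
            + (T - S[m]) * math.log(mu2) - (N - m) * mu2 for m in range(1, N + 1)]
    mx = max(logw)
    w = [math.exp(v - mx) for v in logw]
    u = random.random() * sum(w)
    for m, wm in enumerate(w, start=1):
        u -= wm
        if u <= 0:
            return m
    return N

counts = [0] * (N + 1)
M = 5
for sweep in range(5000):
    # conjugate gamma full conditionals under the exp(1) priors of part a)
    mu1 = random.gammavariate(S[M] + 1, 1.0 / (M + 1))
    mu2 = random.gammavariate(T - S[M] + 1, 1.0 / (N - M + 1))
    M = draw_M(mu1, mu2)
    counts[M] += 1

post = [c / 5000 for c in counts]
print(max(range(1, N + 1), key=lambda m: post[m]))
```

Each sweep draws μ1 and μ2 from their gamma full conditionals and then M from its discrete full conditional; the relative frequencies in `post` approximate the posterior of M.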
4. Let f(u, v) be the bivariate probability density of the distribution uniform on the
square in (u, v)-space with corners at (√2, 0), (0, √2), (−√2, 0) and (0, −√2).
Suppose that X = (X1, X2) has bivariate probability density
f(x|θ) = f(x1 − θ, x2 − θ). Find explicitly the best location equivariant estimator of θ
under squared error loss. (It may well help you to visualize this problem to "sketch" the
joint density here for θ = 17.)
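Under squared error loss the best location equivariant estimator is the Pitman estimator, θ̂(x) = ∫ θ f(x1 − θ, x2 − θ) dθ / ∫ f(x1 − θ, x2 − θ) dθ. A numerical sketch of its evaluation (the test point is an arbitrary choice):

```python
import math

ROOT2 = math.sqrt(2)

def f(u, v):
    # uniform density on the square with corners (±√2, 0) and (0, ±√2),
    # i.e. the region |u| + |v| <= √2, which has side 2 and area 4
    return 0.25 if abs(u) + abs(v) <= ROOT2 else 0.0

def pitman(x1, x2, h=1e-4):
    # Riemann-sum evaluation of
    #   ∫ θ f(x1 − θ, x2 − θ) dθ / ∫ f(x1 − θ, x2 − θ) dθ
    lo = min(x1, x2) - 2       # the integrand vanishes outside this window
    hi = max(x1, x2) + 2
    num = den = 0.0
    t = lo
    while t < hi:
        w = f(x1 - t, x2 - t)
        num += t * w * h
        den += w * h
        t += h
    return num / den

print(round(pitman(0.3, 1.1), 3))
```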