Stat 643 Exam 2 Spring 2010

I have neither given nor received unauthorized assistance on this exam.
________________________________________________________
Name Signed
Date
_________________________________________________________
Name Printed
This exam consists of 10 equally weighted parts. Write answers for as many of those parts as
you can in the 2 hours available for this exam. (You will surely NOT come close to finishing. Find
things that you can do easily and completely.) Put your answers on the pages that follow, not on
separate sheets.
1. Prove or give a counter-example to the following:

For a decision problem with finite parameter space $\Theta = \{\theta_1, \theta_2, \ldots, \theta_m\}$, if a prior distribution $G$ is least favorable, it is unique. (That is, if $G' \neq G$ then $G'$ is not least favorable.)
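Aside: the definition at stake here can be made concrete numerically. A minimal sketch for a hypothetical two-state, two-action no-data problem (the loss matrix below is invented for illustration): the Bayes risk $r(G) = \min_a \sum_\theta G(\theta) L(\theta, a)$ is scanned over priors $G = (g, 1-g)$, and a least favorable prior is one maximizing it.

```python
# Hypothetical 2-state, 2-action no-data problem (illustration only).
# Rows of L are states theta_1, theta_2; columns are actions a_1, a_2.
L = [[0.0, 1.0],
     [2.0, 0.0]]

def bayes_risk(g):
    """Bayes risk r(G) = min_a sum_theta G(theta) L(theta, a) of G = (g, 1-g)."""
    return min(g * L[0][a] + (1 - g) * L[1][a] for a in (0, 1))

# A least favorable prior maximizes the Bayes risk; scan a grid of priors.
grid = [i / 1000 for i in range(1001)]
g_star = max(grid, key=bayes_risk)
```

For this particular loss matrix the maximizing prior equalizes the two actions' posterior expected losses.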
2. Consider a decision problem with finite parameter space $\Theta = \{1, 2, \ldots, 9\}$ and action space $\mathcal{A}$ consisting of two-element subsets of $\Theta$, with loss

$$L(\theta, a) = I[\theta \notin a]$$

Suppose that a prior $G$ on $\Theta$ and a likelihood $f_\theta(x)$ produce for each $x \in \mathcal{X}$ a posterior over $\Theta$ that we will symbolize as $G(\theta \mid x)$. Give a prescription (in as explicit a form as possible) for a Bayes decision rule versus the prior $G$.
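Aside: assuming the loss reconstructed above is $L(\theta, a) = I[\theta \notin a]$, the posterior expected loss of a two-element set $a$ is $1 - \sum_{\theta \in a} G(\theta \mid x)$, so a Bayes action collects two states of largest posterior probability. A sketch with a hypothetical posterior:

```python
from itertools import combinations

def bayes_action(posterior):
    """Given posterior probabilities {theta: G(theta|x)} over Theta = {1,...,9},
    return a two-element subset minimizing posterior expected loss
    1 - sum_{theta in a} G(theta|x), i.e. maximizing posterior mass on a."""
    best = max(combinations(sorted(posterior), 2),
               key=lambda a: posterior[a[0]] + posterior[a[1]])
    return set(best)

# A made-up posterior over Theta = {1, ..., 9}, for illustration only.
post = dict(zip(range(1, 10),
                [0.30, 0.05, 0.25, 0.05, 0.05, 0.10, 0.05, 0.10, 0.05]))
```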
3. Consider a decision-theoretic treatment of simple versus simple testing of $H_0: X \sim f_0$ versus $H_1: X \sim f_1$ with 0-1 loss, where

$$f_0(x) = I[0 < x < 1] \quad \text{and} \quad f_1(x) = 2x\, I[0 < x < 1]$$

are densities with respect to Lebesgue measure on $\mathbb{R}$. Find a minimax test and identify a least favorable prior for this problem.
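Aside: for a cutoff test rejecting $H_0$ when $x > c$, the two error probabilities here are $P_0(X > c) = 1 - c$ and $P_1(X \le c) = c^2$, and a natural minimax candidate equalizes them. A numerical sketch locating that cutoff by bisection:

```python
def err0(c):
    # P_0(X > c) under f_0 = Uniform(0, 1)
    return 1 - c

def err1(c):
    # P_1(X <= c) under f_1(x) = 2x on (0, 1); the CDF is c^2
    return c ** 2

# err0 - err1 = 1 - c - c^2 is decreasing on (0, 1), positive at 0 and
# negative at 1, so bisection finds the equalizing cutoff.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if err0(mid) > err1(mid):
        lo = mid
    else:
        hi = mid
c_star = (lo + hi) / 2
```

The equalizing cutoff solves $1 - c = c^2$, i.e. $c = (\sqrt{5} - 1)/2$.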
4. In the testing context of problem 3, suppose that one observes only $Y = X \cdot I[X < .8]$ (and not $X$ itself). Find most powerful tests of every size $\alpha \in (0, 1)$ based on $Y$.
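Aside: under the reconstruction $Y = X \cdot I[X < .8]$, $Y$ has an atom at 0 (from the event $X \ge .8$) plus a density on $(0, .8)$. A simulation sketch of the two atom probabilities, $.2$ under $H_0$ and $1 - .8^2 = .36$ under $H_1$:

```python
import random

random.seed(0)

def draw_f1():
    # Inverse-CDF draw from f_1(x) = 2x on (0, 1): F_1(x) = x^2.
    return random.random() ** 0.5

n = 200_000
y0 = [x if x < 0.8 else 0.0 for x in (random.random() for _ in range(n))]
y1 = [x if x < 0.8 else 0.0 for x in (draw_f1() for _ in range(n))]

# Estimated atom P(Y = 0) under each hypothesis.
p0_atom = sum(y == 0.0 for y in y0) / n
p1_atom = sum(y == 0.0 for y in y1) / n
```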
5. Consider squared error loss estimation of $\theta > 0$ based on $X_1$ and $X_2$ that are iid $\mathrm{U}(0, \theta)$. The method of moments estimator of $\theta$ is $\hat{\theta} = X_1 + X_2$. $M = \max(X_1, X_2)$ is well-known to be sufficient for $\theta$, and conditional distributions for $(X_1, X_2)$ given $M$ are uniform on the union of two line segments in $\mathbb{R}^2$:

$$S_M = \{(x_1, x_2) \mid x_1 = M \text{ and } x_2 \in (0, M)\} \cup \{(x_1, x_2) \mid x_2 = M \text{ and } x_1 \in (0, M)\}$$

(Note that this means that marginally conditionally, $X_1$ is $M$ with probability $\tfrac{1}{2}$ and with probability $\tfrac{1}{2}$ is $\mathrm{U}(0, M)$.) Rao-Blackwellize $\hat{\theta}$ to produce an explicit formula for an estimator. What properties does this estimator have?
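Aside: the stated conditional structure is easy to check by simulation — $X_1$ attains the maximum about half the time, and when it does not, $X_1/M$ behaves like a $\mathrm{U}(0, 1)$ draw. A sketch:

```python
import random

random.seed(1)
theta = 5.0  # an arbitrary true parameter value, for illustration
n = 200_000
pairs = [(random.uniform(0, theta), random.uniform(0, theta)) for _ in range(n)]

# Fraction of pairs in which X1 is the maximum (should be about 1/2).
p_x1_is_max = sum(x1 > x2 for x1, x2 in pairs) / n

# Given X1 is not the max, X1 / M should look Uniform(0, 1): mean about 1/2.
ratios = [x1 / x2 for x1, x2 in pairs if x1 < x2]
mean_ratio = sum(ratios) / len(ratios)
```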
6. A certain realized log-likelihood $L_n(\theta_1, \theta_2)$ for the two real parameters $\theta_1$ and $\theta_2$ in a regular problem is maximized at $\hat{\theta}_1 = 2.0$ and $\hat{\theta}_2 = 3.0$, where the Hessian of the log-likelihood (the matrix of second partials) is

$$\begin{pmatrix} -225 & 25 \\ 25 & -100 \end{pmatrix}$$

What are approximate 95% confidence limits for $\theta_1$? What is an approximate significance level for testing $H_0: \theta_1 = 0$ and $\theta_2 = 0$? (You don't need to report a number, but tell me what tail probability of what distribution provides this.)
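Aside: taking the sign reconstruction of the Hessian above at face value, the standard Wald computations invert the observed information $I = -H$: approximate limits are $\hat{\theta}_1 \pm 1.96\sqrt{(I^{-1})_{11}}$, and the joint test refers $\hat{\theta}^\top I\, \hat{\theta}$ to a $\chi^2_2$ tail probability. A numerical sketch:

```python
# Observed information I = -H for the reconstructed Hessian H
# (signs assumed: H is negative definite at a maximum).
H = [[-225.0, 25.0],
     [25.0, -100.0]]
I = [[-H[0][0], -H[0][1]],
     [-H[1][0], -H[1][1]]]

# Invert the 2x2 observed information.
det = I[0][0] * I[1][1] - I[0][1] * I[1][0]
Iinv = [[I[1][1] / det, -I[0][1] / det],
        [-I[1][0] / det, I[0][0] / det]]

theta_hat = [2.0, 3.0]
se1 = Iinv[0][0] ** 0.5  # approximate standard error of theta_1-hat
ci = (theta_hat[0] - 1.96 * se1, theta_hat[0] + 1.96 * se1)

# Wald statistic for H0: (theta_1, theta_2) = (0, 0); refer to chi-square, 2 df.
w = sum(theta_hat[i] * I[i][j] * theta_hat[j]
        for i in range(2) for j in range(2))
```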
7. The interchange of orders of differentiation and integration is important to many of the arguments made in likelihood-based theory. For a sigma-finite measure $\mu$ on some space $\mathcal{X}$, suppose $g(x, \theta): \mathcal{X} \times \Theta \to \mathbb{R}$ is measurable in $x$ for every fixed $\theta$ and differentiable in $\theta$ for every fixed $x$, $g(x, \theta) \geq 0$, and there exists some $\mu$-integrable $M(x) \geq 0$ such that

$$\left| \frac{d}{d\theta} g(x, \theta) \right| \leq M(x) \quad \text{for all } x \text{ and } \theta$$

Prove then that

$$\frac{d}{d\theta} \int g(x, \theta)\, d\mu(x) = \int \frac{d}{d\theta} g(x, \theta)\, d\mu(x)$$

(Hint: Begin by writing the left hand side as a limit of a difference quotient. A first order Taylor approximation becomes relevant.)
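Aside: the conclusion can be sanity-checked numerically on a concrete $g$ satisfying the hypotheses, e.g. $g(x, \theta) = e^{-x}(2 + \sin(\theta x))$ on $(0, \infty)$, where $|\partial g/\partial\theta| \le x e^{-x}$ is integrable. A sketch comparing the two sides of the interchange:

```python
import math

def g(x, t):
    # g >= 0, and |dg/dtheta| = |x e^{-x} cos(t x)| <= x e^{-x}, which is
    # integrable on (0, inf): the hypotheses of the interchange result hold.
    return math.exp(-x) * (2 + math.sin(t * x))

def dg_dt(x, t):
    return x * math.exp(-x) * math.cos(t * x)

def integral(f, t, upper=40.0, n=40_000):
    # Composite trapezoid rule on (0, upper); the e^{-x} tail beyond is negligible.
    h = upper / n
    s = 0.5 * (f(0.0, t) + f(upper, t))
    s += sum(f(i * h, t) for i in range(1, n))
    return s * h

t0, eps = 0.7, 1e-5
lhs = (integral(g, t0 + eps) - integral(g, t0 - eps)) / (2 * eps)  # d/dt of integral
rhs = integral(dg_dt, t0)                                          # integral of d/dt
```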
8. Suppose that $X_1, X_2, \ldots, X_n$ are iid on $\mathbb{R}$ with marginal pdf $f_{\theta_1, \theta_2}(x)$ for real parameters $\theta_1$ and $\theta_2$. Write

$$m_1(\theta_1, \theta_2) = \mathrm{E}_{\theta_1, \theta_2} X \quad \text{and} \quad m_2(\theta_1, \theta_2) = \mathrm{E}_{\theta_1, \theta_2} X^2$$

and suppose that the mapping $m: \mathbb{R}^2 \to \mathbb{R}^2$ defined by

$$m(\theta_1, \theta_2) = (m_1(\theta_1, \theta_2), m_2(\theta_1, \theta_2))$$

is one-to-one and differentiable, with (differentiable) inverse $h$. Identify a consistent estimator of $(\theta_1, \theta_2)$ and identify a condition sufficient to guarantee that your estimator is root-$n$ consistent. (Argue that your condition really is sufficient.)
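Aside: the construction can be instantiated for, say, the $\mathrm{N}(\theta_1, \theta_2^2)$ family (a family chosen here purely for illustration), where $m(\theta_1, \theta_2) = (\theta_1, \theta_1^2 + \theta_2^2)$ has inverse $h(m_1, m_2) = (m_1, \sqrt{m_2 - m_1^2})$; plugging sample moments into $h$ gives the method of moments estimator. A simulation sketch of its consistency:

```python
import random

def mom_estimate(xs):
    """Method of moments for N(theta1, theta2^2): apply the inverse map
    h(m1, m2) = (m1, sqrt(m2 - m1^2)) to the first two sample moments."""
    n = len(xs)
    m1 = sum(xs) / n
    m2 = sum(x * x for x in xs) / n
    return m1, (m2 - m1 * m1) ** 0.5

random.seed(2)
theta1, theta2 = 1.5, 2.0  # arbitrary true values, for illustration
xs = [random.gauss(theta1, theta2) for _ in range(100_000)]
est = mom_estimate(xs)
```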
9. Suppose that the density of $X$ with respect to some sigma-finite measure $\mu$ is of 1-parameter exponential family form

$$f_\theta(x) = K(\theta) \exp(\theta T(x))$$

for some real-valued function $T$. Under what conditions is the Cauchy-Schwarz inequality an equality? How then is it obvious that the Cramér-Rao inequality is achieved for the statistic $T(X)$? (Argue this not from a knowledge of what the C-R bound turns out to be, but from the condition that produces equality in Cauchy-Schwarz.)
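Aside: a familiar instance of this form is the Poisson family in its canonical parametrization ($\theta = \log\lambda$, $T(x) = x$); there $T(X)$ is unbiased for $\lambda = \mathrm{E}\,T(X)$ and attains the Cramér-Rao bound for $\lambda$ exactly. A numerical sketch (truncating the Poisson(3) pmf, which carries negligible mass beyond 60):

```python
import math

def poisson_pmf(x, lam):
    return math.exp(-lam) * lam ** x / math.factorial(x)

lam = 3.0
K = 60  # truncation point; remaining Poisson(3) tail mass is negligible

# Variance of T(X) = X.
mean = sum(x * poisson_pmf(x, lam) for x in range(K))
var_T = sum((x - mean) ** 2 * poisson_pmf(x, lam) for x in range(K))

# Fisher information about lambda: E[(d/d lambda log f)^2] = E[(X/lambda - 1)^2].
info = sum((x / lam - 1) ** 2 * poisson_pmf(x, lam) for x in range(K))

# Cramer-Rao bound for unbiased estimators of lambda.
cr_bound = 1 / info
```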
10. Consider a "compound decision problem" as follows. $\Theta = \mathcal{A} = \{0, 1\}^N$ (so both parameters and actions are $N$-vectors of 0's and 1's). Let

$$L(\theta, a) = \frac{1}{N} \sum_{i=1}^{N} I[a_i \neq \theta_i]$$

Suppose that for $\theta \in \Theta$, $X = (X_1, X_2, \ldots, X_N)$ has independent components, $X_i \sim \mathrm{N}(\theta_i, 1)$. (This is $N$ discriminations between $\mathrm{N}(0, 1)$ and $\mathrm{N}(1, 1)$ where loss is an overall error rate.)

Let $\pi: \{1, 2, \ldots, N\} \to \{1, 2, \ldots, N\}$ be a permutation of the integers 1 to $N$, and define for $x = (x_1, x_2, \ldots, x_N)$ the transformation $g_\pi(x) = (x_{\pi(1)}, x_{\pi(2)}, \ldots, x_{\pi(N)})$. Then

$$G^* = \{g_\pi \mid \pi \text{ is a permutation}\}$$

is a group. Show that the problem is invariant and say as explicitly as possible what it means for a non-randomized decision rule to be equivariant in this context.
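Aside: the simplest permutation-equivariant non-randomized rules act identically coordinate by coordinate — for instance the (illustrative) threshold rule $a_i = I[x_i > 1/2]$, thresholding at the midpoint of the two means. Its risk, the expected overall error rate, is the same for every $\theta$ and can be estimated by simulation. A sketch:

```python
import random

random.seed(3)
N = 1000

def rule(x):
    # Coordinatewise threshold rule: permutation-equivariant because it
    # treats every coordinate identically, so rule(g(x)) = g(rule(x)).
    return [1 if xi > 0.5 else 0 for xi in x]

def error_rate(theta, reps=200):
    """Monte Carlo estimate of the risk E L(theta, rule(X))."""
    total = 0.0
    for _ in range(reps):
        x = [ti + random.gauss(0, 1) for ti in theta]
        a = rule(x)
        total += sum(ai != ti for ai, ti in zip(a, theta)) / N
    return total / reps

theta = [i % 2 for i in range(N)]  # an arbitrary parameter vector
risk = error_rate(theta)
```

Each coordinate errs with probability $P(Z > 1/2) \approx 0.3085$ whether $\theta_i$ is 0 or 1, so the rule has constant risk near that value.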