
STATISTICAL INFERENCE
PART II
POINT ESTIMATION
SUFFICIENT STATISTICS
• X is a rv with pdf or pmf f(x;θ), θ ∈ Ω.
• Let X1, X2,…,Xn be the sample rvs (a random sample of size n).
• Y=U(X1, X2,…,Xn ) is a statistic.
• A sufficient statistic Y is a statistic which contains all the information in the sample for the estimation of θ.
SUFFICIENT STATISTICS
• Given the value of Y, the sample contains no further information for the estimation of θ.
• Y is a sufficient statistic (ss) for θ if the conditional distribution h(x1,x2,…,xn|y) does not depend on θ for every given Y=y.
• A ss for θ is not unique.
• If Y is a ss for θ, then a one-to-one transformation of Y, say Y1=fn(Y), is also a ss for θ.
SUFFICIENT STATISTICS
• The conditional distribution of the sample rvs given the value y of Y is defined as

  h(x_1, x_2, \ldots, x_n \mid y) = \frac{f(x_1, x_2, \ldots, x_n, y; \theta)}{g(y; \theta)} = \frac{L(\theta; x_1, x_2, \ldots, x_n)}{g(y; \theta)}

• If Y is a ss for θ, then

  h(x_1, x_2, \ldots, x_n \mid y) = \frac{L(\theta; x_1, x_2, \ldots, x_n)}{g(y; \theta)} = H(x_1, x_2, \ldots, x_n),

which does not depend on θ for every given y; H may include y or constants.
• Also, the conditional range of Xi given y does not depend on θ.
SUFFICIENT STATISTICS
EXAMPLE: X~Ber(p). For a r.s. of size n, show that \sum_{i=1}^{n} X_i is a ss for p.
SUFFICIENT STATISTICS
• Neyman’s Factorization Theorem: Y is a
ss for  iff
L     k 1  y ;   k 2  x1 , x 2 ,
The likelihood function
, xn 
Does not contain any other xi
Not depend on  for
every given y (also in the
conditional range of xi.)
where k1 and k2 are non-negative
functions and k2 does not depend on  or
y.
EXAMPLES
1. X~Ber(p). For a r.s. of size n, find a ss for p if one exists.
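A minimal sketch via the factorization theorem (assuming the Xi are iid Ber(p)):

  L(p) = \prod_{i=1}^{n} p^{x_i}(1-p)^{1-x_i} = p^{\sum x_i}(1-p)^{n-\sum x_i} = k_1\!\left(\textstyle\sum x_i;\, p\right) \cdot k_2(x_1, \ldots, x_n),

with k_2 \equiv 1, so Y = \sum_{i=1}^{n} X_i is a ss for p.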
EXAMPLES
2. X~Beta(θ,2). For a r.s. of size n, find a ss
for θ.
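A sketch, assuming Beta(θ,2) has pdf f(x;θ) = θ(θ+1) x^{θ-1}(1-x) for 0<x<1:

  L(\theta) = \prod_{i=1}^{n} \theta(\theta+1)\, x_i^{\theta-1}(1-x_i) = [\theta(\theta+1)]^{n} \left(\prod_{i=1}^{n} x_i\right)^{\theta-1} \cdot \prod_{i=1}^{n}(1-x_i) = k_1\!\left(\textstyle\prod x_i;\, \theta\right) \cdot k_2(x_1, \ldots, x_n),

so Y = \prod_{i=1}^{n} X_i is a ss for θ.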
SUFFICIENT STATISTICS
• A ss may not exist.
• Jointly ss Y1,Y2,…,Yk may be needed.
Example: Example 10.2.5 in Bain and Engelhardt (page 342 in 2nd edition): X(1) and X(n) are jointly ss for θ.
• If the MLE of θ exists and is unique, and if a ss for θ exists, then the MLE is a function of the ss for θ.
EXAMPLE
X~N(μ,σ²). For a r.s. of size n, find jss for μ and σ².
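A sketch via factorization (assuming the Xi are iid N(μ,σ²)):

  L(\mu, \sigma^2) = (2\pi\sigma^2)^{-n/2} \exp\!\left\{ -\frac{1}{2\sigma^2} \sum_{i=1}^{n}(x_i - \mu)^2 \right\} = (2\pi\sigma^2)^{-n/2} \exp\!\left\{ -\frac{\sum x_i^2 - 2\mu \sum x_i + n\mu^2}{2\sigma^2} \right\},

which depends on the sample only through (\sum X_i, \sum X_i^2); hence (\sum X_i, \sum X_i^2) (equivalently, the sample mean and variance) are jss for (μ, σ²).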
MINIMAL SUFFICIENT STATISTICS
• If S(x) = (s_1(x), \ldots, s_k(x)) is a ss for θ, then S^*(x) = (s_0(x), s_1(x), \ldots, s_k(x)) is also a ss for θ (here x denotes the sample vector). But the first one does a better job in data reduction. A minimal ss achieves the greatest possible reduction.
MINIMAL SUFFICIENT STATISTICS
• A ss T(X) is called a minimal ss if, for any other ss T′(X), T(x) is a function of T′(x).
• THEOREM: Let f(x;θ) be the pmf or pdf of a sample X1, X2,…,Xn. Suppose there exists a function T(x) such that, for any two sample points x1,x2,…,xn and y1,y2,…,yn, the ratio

  \frac{f(x_1, x_2, \ldots, x_n; \theta)}{f(y_1, y_2, \ldots, y_n; \theta)}

is constant as a function of θ iff T(x)=T(y). Then T(X) is a minimal sufficient statistic for θ.
EXAMPLE
• X~N(μ,σ²) where σ² is known. For a r.s. of size n, find a minimal ss for μ.
Note: A minimal ss is also not unique; any one-to-one function of it is also a minimal ss.
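A sketch of the ratio argument (σ² known):

  \frac{f(x_1, \ldots, x_n; \mu)}{f(y_1, \ldots, y_n; \mu)} = \exp\!\left\{ -\frac{\sum x_i^2 - \sum y_i^2}{2\sigma^2} + \frac{\mu\left(\sum x_i - \sum y_i\right)}{\sigma^2} \right\},

which is constant as a function of μ iff \sum x_i = \sum y_i; hence T(X) = \sum X_i (equivalently the sample mean) is a minimal ss for μ.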
RAO-BLACKWELL THEOREM
• Let X1, X2,…,Xn have joint pdf or pmf f(x1,x2,…,xn;θ) and let S=(S1,S2,…,Sk) be a vector of jss for θ. If T is an UE of τ(θ) and φ(T)=E(T|S), then
i) φ(T) is an UE of τ(θ),
ii) φ(T) is a function of the jss S for θ,
iii) Var(φ(T)) ≤ Var(T) for all θ.
• φ(T) is a uniformly better unbiased estimator of τ(θ).
RAO-BLACKWELL THEOREM
• Notes:
• φ(T)=E(T|S) is at least as good as T.
• For finding the best UE, it is enough to consider UEs that are functions of a ss, because any UE can be replaced by one that is a function of a ss and is at least as good.
Example
• Hogg & Craig, Exercise 10.10
• X1,X2~Exp(θ)
• Find the joint p.d.f. of the ss Y1=X1+X2 for θ and Y2=X2.
• Show that Y2 is UE of θ with variance θ².
• Find φ(y1)=E(Y2|Y1) and variance of φ(Y1).
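A worked sketch, assuming Exp(θ) is parameterized by its mean, f(x;θ) = (1/θ)e^{-x/θ}, x>0: with x_1 = y_1 - y_2, x_2 = y_2 (Jacobian 1),

  f(y_1, y_2; \theta) = \frac{1}{\theta^2}\, e^{-y_1/\theta}, \qquad 0 < y_2 < y_1 < \infty.

Y_2 = X_2 is an UE of θ with Var(Y_2) = θ². Given Y_1 = y_1, Y_2 is uniform on (0, y_1), so φ(y_1) = E(Y_2|Y_1=y_1) = y_1/2, and since Y_1 ~ Gamma(2, θ),

  Var(\varphi(Y_1)) = \frac{Var(Y_1)}{4} = \frac{2\theta^2}{4} = \frac{\theta^2}{2} < \theta^2,

illustrating the Rao-Blackwell improvement.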
ANCILLARY STATISTIC
• A statistic S(X) whose distribution does not depend on the parameter θ is called an ancillary statistic.
• An ancillary statistic contains no information about θ.
Example
• Example 6.1.8 in Casella & Berger, page
257:
Let Xi~Unif(θ,θ+1) for i=1,2,…,n. Then the range R=X(n)-X(1) is an ancillary statistic because its pdf does not depend on θ.
COMPLETENESS
• Let {f(x;θ), θ∈Ω} be a family of pdfs (or pmfs) and let U(x) be an arbitrary function of x not depending on θ. If

  E[U(X)] = 0 \text{ for all } \theta \in \Omega

requires that the function itself be 0 for all possible values of x, then we say that this family is a complete family of pdfs (or pmfs):

  E[U(X)] = 0 \text{ for all } \theta \in \Omega \;\Rightarrow\; U(x) = 0 \text{ for all } x.
EXAMPLES
1. Show that the family {Bin(n=2,θ); 0<θ<1} is complete.
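A worked sketch: for X ~ Bin(2,θ) and an arbitrary u,

  E[u(X)] = u(0)(1-\theta)^2 + 2u(1)\theta(1-\theta) + u(2)\theta^2 = u(0) + 2[u(1)-u(0)]\theta + [u(0) - 2u(1) + u(2)]\theta^2.

If this polynomial in θ equals 0 for all 0<θ<1, every coefficient must be 0, which forces u(0)=u(1)=u(2)=0. Hence the family is complete.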
EXAMPLES
2. X~Uniform(-θ,θ). Show that the family {f(x;θ), θ>0} is not complete.
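A sketch, under the reading that the family is Uniform(-θ,θ): take u(x)=x. Then

  E[u(X)] = \int_{-\theta}^{\theta} \frac{x}{2\theta}\, dx = 0 \quad \text{for all } \theta > 0,

yet u(x)=x is not identically 0, so E[u(X)]=0 for all θ does not force u to be 0; hence the family is not complete.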
BASU THEOREM
• If T(X) is a complete and minimal sufficient
statistic, then T(X) is independent of every
ancillary statistic.
• Example: X~N(μ,σ²).
X̄ is the mss for μ, and X̄ ~ N(μ, σ²/n). Since {N(μ, σ²/n)} is a complete family, X̄ is a complete statistic.
(n-1)S²/σ² ~ χ²(n-1), so S² is an ancillary statistic for μ.
By Basu theorem, X̄ and S² are independent.
COMPLETE AND SUFFICIENT
STATISTICS (css)
• Y is a complete and sufficient statistic (css) for θ if Y is a ss for θ and the family

  \{ g(y; \theta);\ \theta \in \Omega \} \quad \text{(the pdf of } Y\text{)}

is complete. That is:
1) Y is a ss for θ.
2) For an arbitrary function u(Y) of Y, E(u(Y))=0 for all θ implies that u(y)=0 for all possible values y of Y.
THE MINIMUM VARIANCE UNBIASED
ESTIMATOR
• Rao-Blackwell Theorem: If T is an unbiased estimator of θ, and S is a ss for θ, then φ(T)=E(T|S) is
– an UE of θ, i.e., E[φ(T)]=E[E(T|S)]=θ, and
– if S is also complete, the MVUE of θ.
LEHMANN-SCHEFFE THEOREM
• Let Y be a css for θ. If there is a function of Y which is an UE of θ, then that function is the unique uniformly minimum variance unbiased estimator (UMVUE) of θ.
• Y is a css for θ.
• T(Y)=fn(Y) and E[T(Y)]=θ.
⇒ T(Y) is the UMVUE of θ.
So, it is the best estimator of θ.
THE MINIMUM VARIANCE UNBIASED
ESTIMATOR
• Let Y be a css for θ. Since Y is complete, there can be only one function of Y which is an UE of θ.
• Let U1(Y) and U2(Y) be two such functions of Y. Since they are UEs, E(U1(Y)-U2(Y))=0 for all θ, which by completeness implies W(Y)=U1(Y)-U2(Y)=0 for all possible values of Y. Therefore, U1(Y)=U2(Y) for all Y.
Example
• Let X1,X2,…,Xn ~ Poi(μ). Find the UMVUE of μ.
• Solution steps (a worked sketch follows the list):
– Show that S = \sum_{i=1}^{n} X_i is a css for μ.
– Find a statistic (such as S*) that is an UE of μ and a function of S.
– Then, S* is the UMVUE of μ by Lehmann-Scheffe Thm.
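A worked sketch of these steps (assuming the Xi are iid Poi(μ)):

  L(\mu) = \prod_{i=1}^{n} \frac{e^{-\mu}\mu^{x_i}}{x_i!} = e^{-n\mu}\mu^{\sum x_i} \cdot \left(\prod_{i=1}^{n} x_i!\right)^{-1} = k_1\!\left(\textstyle\sum x_i;\, \mu\right) \cdot k_2(x_1, \ldots, x_n),

so S = \sum X_i is a ss for μ; S ~ Poi(nμ), and this Poisson family can be shown to be complete, so S is a css for μ. Since X̄ = S/n is an UE of μ and a function of S, S* = X̄ is the UMVUE of μ by Lehmann-Scheffe Thm.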
Note
• The estimator found by Rao-Blackwell
Thm may not be unique. But, the estimator
found by Lehmann-Scheffe Thm is unique.