Sampling Distributions of Sample Mean, Variance for Normal

Joint Distribution of $\bar{X}$ and $S^2$, Normal Distribution*
Proposition: Let $X_1, X_2, \ldots, X_n$ be mutually independent r.v.'s having, respectively, normal distributions with mean $\mu_i$ and variance $\sigma_i^2$ for $i = 1, 2, \ldots, n$. Let $k_1, k_2, \ldots, k_n$ be real constants. Then the r.v. $Y = \sum_{i=1}^{n} k_i X_i$ has a normal distribution with mean $\sum_{i=1}^{n} k_i \mu_i$ and variance $\sum_{i=1}^{n} k_i^2 \sigma_i^2$.
Proof: Since $X_1, X_2, \ldots, X_n$ are mutually independent, the m.g.f. of $Y$ is
$$M_Y(t) = E\left(e^{tY}\right) = E\left(e^{t\sum_{i=1}^{n} k_i X_i}\right) = E\left(\prod_{i=1}^{n} e^{t k_i X_i}\right) = \prod_{i=1}^{n} E\left(e^{t k_i X_i}\right) = \prod_{i=1}^{n} M_{X_i}(k_i t).$$
But we also have $M_{X_i}(t) = E\left(e^{t X_i}\right) = e^{\mu_i t + \frac{1}{2}\sigma_i^2 t^2}$. Hence,
$$M_Y(t) = \exp\left(\sum_{i=1}^{n} k_i \mu_i t + \frac{1}{2} \sum_{i=1}^{n} k_i^2 \sigma_i^2 t^2\right),$$
the m.g.f. for a normal distribution with mean $\sum_{i=1}^{n} k_i \mu_i$ and variance $\sum_{i=1}^{n} k_i^2 \sigma_i^2$. For $k_i = \frac{1}{n}$ for $i = 1, 2, \ldots, n$, with common mean $\mu_i = \mu$ and common variance $\sigma_i^2 = \sigma^2$, we find that $\bar{X} \sim \text{Normal}\left(\mu, \frac{\sigma^2}{n}\right)$.
#
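As a quick numerical sanity check of the proposition, the following sketch (assuming NumPy; the values of $n$, $k_i$, $\mu_i$, and $\sigma_i$ are illustrative choices, not from the text) compares the empirical mean and variance of $Y = \sum_i k_i X_i$ with $\sum_i k_i \mu_i$ and $\sum_i k_i^2 \sigma_i^2$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the text) for n = 3 independent normals.
k = np.array([2.0, -1.0, 0.5])      # constants k_i
mu = np.array([1.0, 3.0, -2.0])     # means mu_i
sigma = np.array([1.0, 2.0, 0.5])   # standard deviations sigma_i

# 200,000 independent replications of (X_1, X_2, X_3); each row is one draw.
X = rng.normal(mu, sigma, size=(200_000, 3))
Y = X @ k                           # Y = sum_i k_i X_i, one value per replication

print(Y.mean())   # close to sum k_i mu_i = 2(1) + (-1)(3) + 0.5(-2) = -2
print(Y.var())    # close to sum k_i^2 sigma_i^2 = 4 + 4 + 0.0625 = 8.0625
```

A histogram of `Y` would likewise look normal, consistent with the m.g.f. argument above.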
Proposition: Let $X_1, X_2, \ldots, X_n$ be a random sample from a normal distribution with mean $\mu$ and variance $\sigma^2$. Then
$$\frac{(n-1)S^2}{\sigma^2} = \frac{\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2}{\sigma^2}$$
has a $\chi^2(n-1)$ distribution. Also, $\bar{X}$ and $S^2$ are independent random variables.
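As a numerical illustration of the first claim (a sketch assuming NumPy; $n$, $\mu$, and $\sigma$ are arbitrary choices), the simulated values of $(n-1)S^2/\sigma^2$ should match the mean $n-1$ and variance $2(n-1)$ of a $\chi^2(n-1)$ distribution:

```python
import numpy as np

rng = np.random.default_rng(1)

n, mu, sigma = 5, 10.0, 2.0   # arbitrary illustrative values
reps = 200_000

# Each row is one random sample X_1, ..., X_n from Normal(mu, sigma^2).
X = rng.normal(mu, sigma, size=(reps, n))
S2 = X.var(axis=1, ddof=1)    # sample variance S^2 (divisor n - 1)
W = (n - 1) * S2 / sigma**2   # the statistic (n - 1) S^2 / sigma^2

# A chi-square(n - 1) r.v. has mean n - 1 and variance 2(n - 1).
print(W.mean())   # close to n - 1 = 4
print(W.var())    # close to 2(n - 1) = 8
```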
Proof: Let $X_1, X_2, \ldots, X_n$ be i.i.d. Normal($\mu$, $\sigma^2$) r.v.'s, and let $Y_1 = \bar{X}$, $Y_2 = X_2$, …, $Y_n = X_n$. The corresponding inverse transformation
$$x_1 = ny_1 - y_2 - \cdots - y_n, \quad x_2 = y_2, \quad \ldots, \quad x_n = y_n$$
has Jacobian $n$. Since
$$\sum_{i=1}^{n}(x_i - \mu)^2 = \sum_{i=1}^{n}(x_i - \bar{x})^2 + n(\bar{x} - \mu)^2,$$
the joint p.d.f. of $X_1, X_2, \ldots, X_n$ can be written as
$$f_{X_1, X_2, \ldots, X_n}(x_1, x_2, \ldots, x_n) = \left(\frac{1}{\sigma\sqrt{2\pi}}\right)^{n} \exp\left(-\frac{\sum_{i=1}^{n}(x_i - \bar{x})^2 + n(\bar{x} - \mu)^2}{2\sigma^2}\right).$$
Using the transformation, we find that the joint p.d.f. of $Y_1, Y_2, \ldots, Y_n$ is
$$f_{Y_1, Y_2, \ldots, Y_n}(y_1, y_2, \ldots, y_n) = n\left(\frac{1}{\sigma\sqrt{2\pi}}\right)^{n} \exp\left(-\frac{(ny_1 - y_2 - \cdots - y_n - y_1)^2 + \sum_{i=2}^{n}(y_i - y_1)^2 + n(y_1 - \mu)^2}{2\sigma^2}\right).$$
The quotient of this joint p.d.f. and the p.d.f.
$$f_{Y_1 = \bar{X}}(y_1) = \frac{\sqrt{n}}{\sigma\sqrt{2\pi}} \exp\left(-\frac{n(y_1 - \mu)^2}{2\sigma^2}\right)$$
of $Y_1 = \bar{X}$ is the conditional p.d.f. of $Y_2, Y_3, \ldots, Y_n$, given $Y_1 = y_1$,
$$f_{Y_2, \ldots, Y_n \mid Y_1 = y_1}(y_2, \ldots, y_n) = \sqrt{n}\left(\frac{1}{\sigma\sqrt{2\pi}}\right)^{n-1} \exp\left(-\frac{q}{2\sigma^2}\right),$$
where $q = (ny_1 - y_2 - \cdots - y_n - y_1)^2 + \sum_{i=2}^{n}(y_i - y_1)^2$. Since this is a joint conditional p.d.f., it must be, for all $\sigma > 0$, that
$$\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} \sqrt{n}\left(\frac{1}{\sigma\sqrt{2\pi}}\right)^{n-1} \exp\left(-\frac{q}{2\sigma^2}\right) dy_2\, dy_3 \cdots dy_n = 1.$$
Now consider
$$(n-1)S^2 = \sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2 = (nY_1 - Y_2 - \cdots - Y_n - Y_1)^2 + \sum_{i=2}^{n}(Y_i - Y_1)^2 = Q.$$
The conditional m.g.f. of $\frac{(n-1)S^2}{\sigma^2} = \frac{Q}{\sigma^2}$, given $Y_1 = y_1$, is
$$E\left(e^{tQ/\sigma^2} \mid y_1\right) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} \sqrt{n}\left(\frac{1}{\sigma\sqrt{2\pi}}\right)^{n-1} \exp\left(-\frac{(1-2t)q}{2\sigma^2}\right) dy_2\, dy_3 \cdots dy_n$$
$$= \left(\frac{1}{1-2t}\right)^{\frac{n-1}{2}} \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} \sqrt{n}\left(\frac{\sqrt{1-2t}}{\sigma\sqrt{2\pi}}\right)^{n-1} \exp\left(-\frac{(1-2t)q}{2\sigma^2}\right) dy_2\, dy_3 \cdots dy_n,$$
where $0 < 1 - 2t$, or $t < 0.5$.
However, this latter integral is exactly the same as that of the conditional p.d.f. of $Y_2, Y_3, \ldots, Y_n$, given $Y_1 = y_1$, with $\sigma^2$ replaced by $\frac{\sigma^2}{1-2t} > 0$, and thus must equal 1. Hence the conditional m.g.f. of $\frac{(n-1)S^2}{\sigma^2} = \frac{Q}{\sigma^2}$, given $Y_1 = y_1$, is
$$E\left(e^{t(n-1)S^2/\sigma^2} \mid \bar{x}\right) = \left(\frac{1}{1-2t}\right)^{\frac{n-1}{2}}, \quad t < 0.5.$$
That is, the conditional distribution of $\frac{(n-1)S^2}{\sigma^2}$, given $\bar{X} = \bar{x}$, is $\chi^2(n-1)$. Moreover, since it is clear that this conditional distribution does not depend on $\bar{x}$, the random variables $\bar{X}$ and $\frac{(n-1)S^2}{\sigma^2}$ must be independent, or equivalently, $\bar{X}$ and $S^2$ are independent.
* This proof is from Hogg, R. V. and Craig, A. T. (1978). Introduction to Mathematical Statistics, Fourth Edition. New York: Macmillan Publishing Co., Inc.
#
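The independence conclusion can also be checked empirically (a rough sketch assuming NumPy; $n$, $\mu$, $\sigma$, and the number of replications are arbitrary choices): across many repeated samples, the correlation between $\bar{X}$ and $S^2$ should be near zero.

```python
import numpy as np

rng = np.random.default_rng(2)

n, mu, sigma = 8, 0.0, 1.0    # arbitrary illustrative values
reps = 200_000

# Draw many samples of size n; compute X-bar and S^2 for each one.
X = rng.normal(mu, sigma, size=(reps, n))
xbar = X.mean(axis=1)
s2 = X.var(axis=1, ddof=1)

# Independence of X-bar and S^2 implies zero correlation across replications.
r = np.corrcoef(xbar, s2)[0, 1]
print(r)          # close to 0
```

Zero correlation alone does not prove independence, but for normal samples the proof above guarantees full independence; the simulation merely illustrates its weakest consequence.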