Stat 461-561: Exercises 5
In Casella & Berger, Exercises 7.23, 7.24, 7.25 (Week 7) and 8.10, 8.11, 8.53 and
8.54 (Week 8)
Remark: There are several conventions available for parameterising Gamma and
inverse Gamma distributions. I have adopted here the ones from C&B.
Exercise C&B 7.23.
The question is not very precise. We have $f(x \mid \sigma^2) = \mathcal{N}(x; m, \sigma^2)$ and the inverse Gamma prior $\pi(\sigma^2) \propto \left(\tfrac{1}{\sigma^2}\right)^{\alpha+1} \exp\left(-\tfrac{1}{\beta\sigma^2}\right)$.
If we assume that $m$ is known, then
$$\pi(\sigma^2 \mid x_{1:n}) \propto \frac{1}{(\sigma^2)^{\alpha+1}} \exp\left(-\frac{1}{\beta\sigma^2}\right) \prod_{i=1}^{n} \frac{1}{\sigma} \exp\left(-\frac{(x_i - m)^2}{2\sigma^2}\right) \propto \frac{1}{(\sigma^2)^{\alpha + n/2 + 1}} \exp\left(-\frac{1/\beta + (n-1)S^2/2}{\sigma^2}\right)$$
where $S^2 := \frac{1}{n-1} \sum_{i=1}^{n} (x_i - m)^2$, so $\sigma^2 \mid x_{1:n}$ is an inverse Gamma distribution of parameters $\left(\frac{n}{2} + \alpha, \left[\frac{1}{\beta} + \frac{(n-1)S^2}{2}\right]^{-1}\right)$.
If we assume that $m$ is unknown and that $\pi(m) \propto 1$, then
$$\pi(\sigma^2, m \mid x_{1:n}) \propto \frac{1}{(\sigma^2)^{\alpha+1}} \exp\left(-\frac{1}{\beta\sigma^2}\right) \prod_{i=1}^{n} \frac{1}{\sigma} \exp\left(-\frac{(x_i - m)^2}{2\sigma^2}\right)$$
and
$$\sum_{i=1}^{n} (x_i - m)^2 = n(m - \bar{x})^2 + \sum_{i=1}^{n} x_i^2 - n\bar{x}^2 = n(m - \bar{x})^2 + (n-1)S^2$$
where $\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i$ and $S^2 := \frac{1}{n-1} \sum_{i=1}^{n} (x_i - \bar{x})^2$. So
$$\pi(\sigma^2, m \mid x_{1:n}) \propto \frac{1}{(\sigma^2)^{\alpha + n/2 + 1}} \exp\left(-\frac{1/\beta + (n-1)S^2/2}{\sigma^2}\right) \exp\left(-\frac{n(m - \bar{x})^2}{2\sigma^2}\right).$$
By integrating out $m$, we obtain
$$\pi(\sigma^2 \mid x_{1:n}) \propto \frac{1}{(\sigma^2)^{\alpha + (n-1)/2 + 1}} \exp\left(-\frac{1/\beta + (n-1)S^2/2}{\sigma^2}\right).$$
Hence, $\sigma^2 \mid x_{1:n}$ is an inverse Gamma distribution of parameters $\left(\frac{n-1}{2} + \alpha, \left[\frac{1}{\beta} + \frac{(n-1)S^2}{2}\right]^{-1}\right)$.
The mean of this distribution is given by
$$\mathbb{E}\left[\sigma^2 \mid x_{1:n}\right] = \frac{1/\beta + (n-1)S^2/2}{\frac{n-1}{2} + \alpha - 1} = \frac{2/\beta + (n-1)S^2}{n - 3 + 2\alpha}.$$
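As a quick numerical sanity check of this posterior mean (my own sketch, not part of the original solutions; all parameter values below are arbitrary, and scipy's `invgamma(a, scale=s)` corresponds to C&B's IG$(\alpha, \beta)$ with $a = \alpha$ and $s = 1/\beta$):

```python
# Sanity check of the inverse Gamma posterior mean (sketch; values arbitrary).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha, beta = 3.0, 2.0                     # hypothetical IG prior parameters
n, m, sigma2 = 50, 0.0, 1.5                # hypothetical data-generating values
x = rng.normal(m, np.sqrt(sigma2), size=n)
S2 = x.var(ddof=1)                         # sample variance, 1/(n-1) convention

a_post = alpha + (n - 1) / 2               # IG shape of the posterior
s_post = 1 / beta + (n - 1) * S2 / 2       # scipy 'scale' = inverse of C&B's beta

print(stats.invgamma(a_post, scale=s_post).mean())       # scipy's mean
print((2 / beta + (n - 1) * S2) / (n - 3 + 2 * alpha))   # formula above: agree
```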
Exercise C&B 7.24.
We have
$$\pi(\lambda \mid x_{1:n}) \propto \lambda^{\alpha-1} \exp\left(-\frac{\lambda}{\beta}\right) \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!} \propto \lambda^{\alpha + \sum_{i=1}^{n} x_i - 1} \exp\left(-\left(\frac{1}{\beta} + n\right)\lambda\right).$$
Hence we have $\pi(\lambda \mid x_{1:n}) = \text{Gamma}\left(\alpha + \sum_{i=1}^{n} x_i, \left(1/\beta + n\right)^{-1}\right)$. Using C&B, page 624,
$$\mathbb{E}[\lambda \mid x_{1:n}] = \frac{\alpha + \sum_{i=1}^{n} x_i}{1/\beta + n}, \qquad \mathbb{V}[\lambda \mid x_{1:n}] = \frac{\alpha + \sum_{i=1}^{n} x_i}{(1/\beta + n)^2}.$$
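A brute-force check (my sketch, not from the handout; values arbitrary): evaluate prior $\times$ likelihood on a grid and compare the grid posterior mean with the closed form.

```python
# Grid check of the Gamma posterior for the Poisson model (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha, beta = 2.0, 1.5                 # hypothetical Gamma prior (C&B convention)
x = rng.poisson(3.0, size=40)          # hypothetical Poisson data
n, sx = x.size, x.sum()

lam = np.linspace(1e-6, 10.0, 200_001)
log_post = stats.gamma(alpha, scale=beta).logpdf(lam) + sx * np.log(lam) - n * lam
w = np.exp(log_post - log_post.max())
w /= w.sum()

print((w * lam).sum())                 # posterior mean from the grid
print((alpha + sx) / (1 / beta + n))   # closed form: agree
```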
Exercise C&B 7.25.
We have
$$m(x_i) = \int f(x_i \mid \theta_i)\,\pi(\theta_i)\,d\theta_i = \int \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(x_i - \theta_i)^2}{2\sigma^2}\right) \frac{1}{\sqrt{2\pi}\,\tau} \exp\left(-\frac{(\theta_i - \theta)^2}{2\tau^2}\right) d\theta_i$$
and
$$\frac{(x_i - \theta_i)^2}{\sigma^2} + \frac{(\theta_i - \theta)^2}{\tau^2} = \frac{(\theta_i - m_i)^2}{\varepsilon^2} + \frac{x_i^2}{\sigma^2} + \frac{\theta^2}{\tau^2} - \frac{m_i^2}{\varepsilon^2}$$
where
$$\frac{1}{\varepsilon^2} = \frac{1}{\sigma^2} + \frac{1}{\tau^2}, \qquad m_i = \varepsilon^2\left(\frac{x_i}{\sigma^2} + \frac{\theta}{\tau^2}\right).$$
It follows that by integrating out $\theta_i$ we have
$$m(x_i) = \frac{\varepsilon}{\sqrt{2\pi}\,\sigma\tau} \exp\left(-\frac{1}{2}\left(\frac{x_i^2}{\sigma^2} + \frac{\theta^2}{\tau^2} - \frac{m_i^2}{\varepsilon^2}\right)\right).$$
After rearranging the term in brackets, we obtain
$$m(x_i) = \mathcal{N}(x_i; \theta, \sigma^2 + \tau^2).$$
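The Gaussian integral above can also be checked numerically (my sketch; $\theta$, $\sigma$, $\tau$ arbitrary): quadrature of $f(x_i \mid \theta_i)\pi(\theta_i)$ should match the $\mathcal{N}(\theta, \sigma^2 + \tau^2)$ density.

```python
# Numerical check that the marginal of x_i is N(theta, sigma^2 + tau^2) (sketch).
import numpy as np
from scipy import stats, integrate

theta, sigma, tau = 1.0, 0.7, 1.3          # hypothetical values
for x in (-1.0, 0.5, 2.0):
    marg, _ = integrate.quad(
        lambda t: stats.norm.pdf(x, t, sigma) * stats.norm.pdf(t, theta, tau),
        -np.inf, np.inf)
    print(marg, stats.norm.pdf(x, theta, np.hypot(sigma, tau)))  # agree
```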
There is a much simpler way to do this calculation. We can rewrite
$$X_i = \theta_i + V_i$$
where $V_i \sim \mathcal{N}(0, \sigma^2)$ is independent of $\theta_i$. We know that the sum of two independent Gaussian rvs is Gaussian, so we just need to compute its mean and variance to get its distribution:
$$\mathbb{E}[X_i] = \mathbb{E}[\theta_i + V_i] = \mathbb{E}[\theta_i] + \mathbb{E}[V_i] = \theta$$
and
$$\mathbb{V}[X_i] = \mathbb{V}[\theta_i + V_i] = \mathbb{V}[\theta_i] + \mathbb{V}[V_i] = \tau^2 + \sigma^2.$$
Moreover, the $X_i$'s are i.i.d. as
$$m(x_1, \ldots, x_n) = \int \prod_{i=1}^{n} f(x_i \mid \theta_i)\,\pi(\theta_i)\,d\theta_{1:n} = \prod_{i=1}^{n} \int f(x_i \mid \theta_i)\,\pi(\theta_i)\,d\theta_i = \prod_{i=1}^{n} m(x_i)$$
where $m(x_i)$ and $m(x_j)$ have the same functional form.
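The representation $X_i = \theta_i + V_i$ also suggests a direct simulation check (my sketch; values arbitrary):

```python
# Simulate X_i = theta_i + V_i and check the marginal moments (sketch).
import numpy as np

rng = np.random.default_rng(2)
theta, sigma2, tau2 = 1.0, 0.5, 2.0                      # hypothetical values
th = rng.normal(theta, np.sqrt(tau2), size=1_000_000)    # theta_i ~ N(theta, tau^2)
x = th + rng.normal(0.0, np.sqrt(sigma2), size=th.size)  # V_i ~ N(0, sigma^2)

print(x.mean(), x.var())   # approximately theta and sigma^2 + tau^2
```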
Straightforward.
Exercise 1 (Week 7). Let $\lambda$ be a random variable in $(0, \infty)$ with density
$$\pi(\lambda) \propto \lambda^{\alpha - 1} \exp(-\beta\lambda)$$
where $\alpha, \beta \in (1, \infty)$.
Calculate the mean and mode of $\lambda$.
$\pi(\lambda)$ is a Gamma distribution. We have
$$\mathbb{E}[\lambda] = \frac{\alpha}{\beta}.$$
Moreover,
$$\log \pi(\lambda) = \text{cst} + (\alpha - 1)\log\lambda - \beta\lambda, \qquad \frac{d \log \pi(\lambda)}{d\lambda} = \frac{\alpha - 1}{\lambda} - \beta,$$
thus the mode is
$$\lambda = \frac{\alpha - 1}{\beta}.$$
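A one-line numerical confirmation of the mean and mode (my sketch; scipy's `gamma(a, scale=1/b)` has density $\propto \lambda^{a-1} e^{-b\lambda}$):

```python
# Check mean alpha/beta and mode (alpha - 1)/beta of the Gamma density (sketch).
import numpy as np
from scipy import stats, optimize

alpha, beta = 3.5, 2.0                        # hypothetical values, both > 1
d = stats.gamma(alpha, scale=1 / beta)
print(d.mean(), alpha / beta)                 # means agree
res = optimize.minimize_scalar(lambda l: -d.logpdf(l),
                               bounds=(1e-9, 50.0), method='bounded')
print(res.x, (alpha - 1) / beta)              # modes agree
```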
Suppose that $X_1, \ldots, X_n$ are random variables which, conditional on $\lambda$, are independent and each have the Poisson distribution with parameter $\lambda$. Find the form of the posterior density of $\lambda$ given $X_1 = x_1, \ldots, X_n = x_n$. What is the posterior mean?
We have
$$\pi(\lambda \mid x_{1:n}) \propto \lambda^{\alpha + \sum_{i=1}^{n} x_i - 1} \exp(-(\beta + n)\lambda)$$
which with the C&B convention is a Gamma distribution of parameters $\left(\alpha + \sum_{i=1}^{n} x_i, (\beta + n)^{-1}\right)$, so the posterior mean is $\left(\alpha + \sum_{i=1}^{n} x_i\right)/(\beta + n)$.
Suppose that $T_1, \ldots, T_n$ are random variables which, conditional on $\lambda$, are independent and each is exponentially distributed with parameter $\lambda$, where we adopt the convention $f(x \mid \lambda) = \lambda \exp(-\lambda x)\,\mathbf{1}_{(0,\infty)}(x)$. What is the mode of the posterior distribution of $\lambda$, given $T_1 = t_1, \ldots, T_n = t_n$?
$$\pi(\lambda \mid t_{1:n}) \propto \lambda^{\alpha - 1} \exp(-\beta\lambda) \prod_{i=1}^{n} \lambda \exp(-\lambda t_i) \propto \lambda^{\alpha + n - 1} \exp\left(-\left(\beta + \sum_{i=1}^{n} t_i\right)\lambda\right)$$
which is a Gamma distribution of parameters $\left(\alpha + n, \left(\beta + \sum_{i=1}^{n} t_i\right)^{-1}\right)$.
We have
$$\log \pi(\lambda \mid t_{1:n}) = \text{cst} + (\alpha + n - 1)\log\lambda - \left(\beta + \sum_{i=1}^{n} t_i\right)\lambda$$
thus the mode is located at
$$\lambda = \frac{\alpha + n - 1}{\beta + \sum_{i=1}^{n} t_i}.$$
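Again this can be confirmed by maximizing the log-posterior numerically (my sketch; values arbitrary):

```python
# Check the posterior mode (alpha + n - 1)/(beta + sum t_i) numerically (sketch).
import numpy as np
from scipy import optimize

rng = np.random.default_rng(3)
alpha, beta = 2.0, 1.0                   # hypothetical prior parameters
t = rng.exponential(1 / 2.5, size=30)    # hypothetical data, true rate 2.5
n, st = t.size, t.sum()

res = optimize.minimize_scalar(
    lambda lam: -((alpha + n - 1) * np.log(lam) - (beta + st) * lam),
    bounds=(1e-9, 100.0), method='bounded')
print(res.x, (alpha + n - 1) / (beta + st))   # agree
```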
Exercise 2 (Week 7). Let $X_1, \ldots, X_n \overset{\text{i.i.d.}}{\sim} \text{Poisson}(\lambda)$. Let $\lambda \sim \text{Gamma}(\alpha, \beta)$ be the prior. Show that the posterior is also a Gamma. Find the posterior mean.
Trivial: this is the same computation as in Exercise C&B 7.24 above.
Find the Jeffreys prior. Find the posterior.
We have
$$\log f(x_{1:n} \mid \lambda) = \sum_{i=1}^{n} \log f(x_i \mid \lambda) = \sum_{i=1}^{n} \left(-\lambda + x_i \log\lambda - \log x_i!\right)$$
so
$$\frac{d \log f(x_{1:n} \mid \lambda)}{d\lambda} = -n + \frac{\sum_{i=1}^{n} x_i}{\lambda}, \qquad \frac{d^2 \log f(x_{1:n} \mid \lambda)}{d\lambda^2} = -\frac{\sum_{i=1}^{n} x_i}{\lambda^2}$$
so
$$I(\lambda) = \mathbb{E}\left[\frac{\sum_{i=1}^{n} X_i}{\lambda^2}\right] = \frac{n}{\lambda}.$$
So Jeffreys' prior is
$$\pi(\lambda) \propto \frac{1}{\sqrt{\lambda}}$$
and the corresponding posterior is
$$\pi(\lambda \mid x_{1:n}) \propto \lambda^{\sum_{i=1}^{n} x_i - 1/2} \exp(-n\lambda),$$
i.e. a Gamma distribution of parameters $\left(\sum_{i=1}^{n} x_i + \frac{1}{2}, n^{-1}\right)$.
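The Fisher information $I(\lambda) = n/\lambda$ can be checked by Monte Carlo (my sketch; values arbitrary):

```python
# Monte Carlo check that E[sum X_i / lambda^2] = n / lambda for Poisson data (sketch).
import numpy as np

rng = np.random.default_rng(4)
lam, n = 2.0, 10                                  # hypothetical values
X = rng.poisson(lam, size=(200_000, n))
print((X.sum(axis=1) / lam**2).mean(), n / lam)   # approximately equal
```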
Exercise 3 (Week 7). Suppose that, conditional on $\theta$, $X_1, \ldots, X_n$ are i.i.d. $\mathcal{N}(\theta, \sigma_0^2)$ with $\sigma_0^2$ known. Suppose that $\theta \sim \mathcal{N}(\mu_0, \tau_0^2)$ where $\mu_0, \tau_0^2$ are known. Let $X_{n+1}$ be a single future observation from $\mathcal{N}(\theta, \sigma_0^2)$. Show that given $(X_1, \ldots, X_n)$, $X_{n+1}$ is normally distributed with mean
$$\frac{1}{\frac{1}{\sigma_0^2/n} + \frac{1}{\tau_0^2}}\left(\frac{\bar{X}}{\sigma_0^2/n} + \frac{\mu_0}{\tau_0^2}\right)$$
where $\bar{X} = n^{-1} \sum_{i=1}^{n} X_i$, and variance
$$\sigma_0^2 + \frac{1}{\frac{1}{\sigma_0^2/n} + \frac{1}{\tau_0^2}}.$$
We have
$$\pi(\theta \mid x_{1:n}) \propto \exp\left(-\frac{(\theta - \mu_0)^2}{2\tau_0^2}\right) \prod_{i=1}^{n} \exp\left(-\frac{(x_i - \theta)^2}{2\sigma_0^2}\right).$$
We have
$$\frac{(\theta - \mu_0)^2}{\tau_0^2} + \frac{\sum_{i=1}^{n} (x_i - \theta)^2}{\sigma_0^2} = \frac{(\theta - \widetilde{m})^2}{\widetilde{\varepsilon}^2} + \text{cst}$$
where
$$\frac{1}{\widetilde{\varepsilon}^2} = \frac{1}{\sigma_0^2/n} + \frac{1}{\tau_0^2}, \qquad \widetilde{m} = \widetilde{\varepsilon}^2\left(\frac{\sum_{i=1}^{n} x_i}{\sigma_0^2} + \frac{\mu_0}{\tau_0^2}\right).$$
So $\pi(\theta \mid x_{1:n}) = \mathcal{N}(\theta; \widetilde{m}, \widetilde{\varepsilon}^2)$. Now we want to compute
$$m(x \mid x_{1:n}) = \int \mathcal{N}(x; \theta, \sigma_0^2)\,\mathcal{N}(\theta; \widetilde{m}, \widetilde{\varepsilon}^2)\,d\theta.$$
This is similar to the first part of Ex. 7.25 (corrected above) and the result follows: $X_{n+1} \mid x_{1:n} \sim \mathcal{N}(\widetilde{m}, \sigma_0^2 + \widetilde{\varepsilon}^2)$.
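A simulation of the posterior predictive confirms these two moments (my sketch; values arbitrary):

```python
# Simulate the posterior predictive of X_{n+1} and check its moments (sketch).
import numpy as np

rng = np.random.default_rng(5)
mu0, tau02, sigma02, n = 0.0, 4.0, 1.0, 20
x = rng.normal(1.0, np.sqrt(sigma02), size=n)         # hypothetical data

eps2 = 1 / (n / sigma02 + 1 / tau02)                  # posterior variance of theta
m = eps2 * (x.sum() / sigma02 + mu0 / tau02)          # posterior mean of theta

theta = rng.normal(m, np.sqrt(eps2), size=1_000_000)  # theta ~ pi(theta | x_{1:n})
x_next = rng.normal(theta, np.sqrt(sigma02))          # X_{n+1} | theta
print(x_next.mean(), m)                               # agree
print(x_next.var(), sigma02 + eps2)                   # agree
```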
Exercise 5 (Week 7). Let $X_1, \ldots, X_n$ be i.i.d. $\mathcal{N}(\theta, 1/\lambda)$ and suppose that independent priors are placed on $\theta$ and $\lambda$, with $\theta \sim \mathcal{N}(\mu, \tau^{-1})$ and $\lambda \sim \mathcal{G}(\alpha, \beta)$. Show that the conditional posterior distributions $\pi(\theta \mid x_1, \ldots, x_n, \lambda)$ and $\pi(\lambda \mid x_1, \ldots, x_n, \theta)$ admit standard forms, namely normal and Gamma, and give their exact expressions.
We have
$$\pi(\theta \mid x_{1:n}, \lambda) \propto \pi(\theta, \lambda \mid x_{1:n}) \propto \exp\left(-\frac{\tau(\theta - \mu)^2}{2}\right) \prod_{i=1}^{n} \exp\left(-\frac{\lambda(x_i - \theta)^2}{2}\right) \propto \exp\left(-\frac{(\theta - \widetilde{m})^2}{2\widetilde{\varepsilon}^2}\right)$$
where
$$\widetilde{\varepsilon}^{-2} = \tau + n\lambda, \qquad \widetilde{m} = \widetilde{\varepsilon}^2\left(\tau\mu + \lambda \sum_{i=1}^{n} x_i\right).$$
So we have $\pi(\theta \mid x_{1:n}, \lambda) = \mathcal{N}(\theta; \widetilde{m}, \widetilde{\varepsilon}^2)$.
We have
$$\pi(\lambda \mid x_{1:n}, \theta) \propto \pi(\theta, \lambda \mid x_{1:n}) \propto \lambda^{\alpha - 1} \exp\left(-\frac{\lambda}{\beta}\right) \prod_{i=1}^{n} \lambda^{1/2} \exp\left(-\frac{\lambda(x_i - \theta)^2}{2}\right) \propto \lambda^{\alpha + \frac{n}{2} - 1} \exp\left(-\lambda\left(\frac{1}{\beta} + \frac{\sum_{i=1}^{n} (x_i - \theta)^2}{2}\right)\right)$$
so
$$\pi(\lambda \mid x_{1:n}, \theta) = \text{Gamma}\left(\alpha + \frac{n}{2}, \left[\frac{1}{\beta} + \frac{\sum_{i=1}^{n} (x_i - \theta)^2}{2}\right]^{-1}\right).$$
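These two conditionals are exactly the ingredients of a Gibbs sampler for $(\theta, \lambda)$. A minimal sketch (mine, not part of the handout; prior values and data are arbitrary, and numpy's `gamma(shape, scale)` matches the C&B scale convention):

```python
# Minimal Gibbs sampler built from the two conditional posteriors (sketch).
import numpy as np

rng = np.random.default_rng(6)
mu, tau = 0.0, 1.0                    # prior theta ~ N(mu, 1/tau)
alpha, beta = 2.0, 2.0                # prior lambda ~ Gamma(alpha, beta), C&B scale
x = rng.normal(1.0, 1.0, size=50)     # hypothetical data
n, sx = x.size, x.sum()

theta, lam = 0.0, 1.0
draws = []
for it in range(6000):
    eps2 = 1 / (tau + n * lam)                    # conditional variance of theta
    m = eps2 * (tau * mu + lam * sx)              # conditional mean of theta
    theta = rng.normal(m, np.sqrt(eps2))
    rate = 1 / beta + ((x - theta) ** 2).sum() / 2
    lam = rng.gamma(alpha + n / 2, 1 / rate)      # Gamma(alpha + n/2, 1/rate)
    if it >= 1000:                                # discard burn-in
        draws.append((theta, lam))

print(np.mean(draws, axis=0))                     # posterior means of (theta, lambda)
```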