Statistics 580 Project #2 (100 points)


Spring 2009


Turn-in all code and outputs produced.

1. Suppose that (1, 2, 1, 2, 1, 1, 2, 1, 3, 1) is an observed random sample from the (discrete) logarithmic series distribution with density

f(x; θ) = θ^x / (−x log(1 − θ)) for x = 1, 2, 3, . . ., 0 < θ < 1.

(a) Plot the log likelihood ℓ(θ) and its derivative as functions of θ on the same graph.
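A sketch of part (a); the closed-form log likelihood below follows directly from the density, but the variable names and the plotting grid are choices, not part of the assignment:

```r
# Sketch for part (a): plot l(theta) and its derivative on one graph.
# l(theta) = sum(x)*log(theta) - sum(log(x)) - n*log(-log(1 - theta))
x  <- c(1, 2, 1, 2, 1, 1, 2, 1, 3, 1)   # the observed sample
n  <- length(x)
sx <- sum(x)

loglik  <- function(th) sx * log(th) - sum(log(x)) - n * log(-log(1 - th))
dloglik <- function(th) sx / th - n / ((1 - th) * (-log(1 - th)))

th <- seq(0.05, 0.95, by = 0.005)
plot(th, loglik(th), type = "l", xlab = expression(theta), ylab = "")
lines(th, dloglik(th), lty = 2)
abline(h = 0, col = "gray")
legend("topright", c("log-likelihood", "derivative"), lty = 1:2)
```

The derivative crosses zero where the log likelihood peaks, which previews the root-finding in part (b).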

(b) Use the functions secant() and newton() used in our class to obtain the m.l.e. of θ and its standard error by finding the root of the derivative of ℓ(θ).
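The class functions secant() and newton() are not reproduced here; the following is only a generic stand-in showing the Newton step they implement, with the score and its derivative worked out analytically. Use the actual class functions in your submission.

```r
# Generic Newton sketch for part (b) (a stand-in for the class newton()).
x  <- c(1, 2, 1, 2, 1, 1, 2, 1, 3, 1)
n  <- length(x); sx <- sum(x)

score <- function(th) sx / th - n / ((1 - th) * (-log(1 - th)))
# derivative of the score (second derivative of the log likelihood)
dscore <- function(th) {
  L <- -log(1 - th)
  -sx / th^2 - n * (L - 1) / ((1 - th)^2 * L^2)
}

th <- 0.5                       # starting value
repeat {
  step <- score(th) / dscore(th)
  th   <- th - step
  if (abs(step) < 1e-10) break
}
th                              # m.l.e. of theta
1 / sqrt(-dscore(th))           # standard error from observed information
```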

(c) Use any of the functions in Splus/R to maximize ℓ(θ) w.r.t. θ directly:
i. without explicitly calculating the derivatives analytically, and,
ii. using both the gradient and the Hessian derived analytically (either by you or by the program).
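A sketch of part (c)(i) using optim(); the numerical Hessian it returns gives a standard error without any analytic derivatives:

```r
# Sketch for part (c)(i): direct maximization of l(theta) with optim(),
# no analytic derivatives supplied.
x <- c(1, 2, 1, 2, 1, 1, 2, 1, 3, 1)
negll <- function(th)
  -(sum(x) * log(th) - sum(log(x)) - length(x) * log(-log(1 - th)))
fit <- optim(0.5, negll, method = "L-BFGS-B",
             lower = 1e-6, upper = 1 - 1e-6, hessian = TRUE)
fit$par                  # m.l.e. of theta
sqrt(1 / fit$hessian)    # standard error from the numerical Hessian
```

For (c)(ii), nlm() accepts analytic derivatives supplied as the "gradient" and "hessian" attributes of the objective function's return value.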

(d) If the prior π(θ) is taken as a Beta(2, 2), construct and execute a Metropolis procedure to estimate the posterior mean and its variance.

See the notes “Metropolis algorithm applied to logseries data” on the “code for homework” page for some notes on part (d).
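A hedged sketch of one possible Metropolis procedure for part (d); the random-walk window (0.2), chain length, and burn-in are arbitrary choices here, not values taken from the class notes:

```r
# Random-walk Metropolis sketch for part (d).
set.seed(580)
x <- c(1, 2, 1, 2, 1, 1, 2, 1, 3, 1)
logpost <- function(th) {
  if (th <= 0 || th >= 1) return(-Inf)                 # outside the support
  sum(x) * log(th) - length(x) * log(-log(1 - th)) +   # log likelihood (+ const.)
    dbeta(th, 2, 2, log = TRUE)                        # Beta(2,2) log prior
}
B <- 20000; chain <- numeric(B); th <- 0.5
for (b in 1:B) {
  cand <- th + runif(1, -0.2, 0.2)                     # symmetric proposal
  if (log(runif(1)) < logpost(cand) - logpost(th)) th <- cand
  chain[b] <- th
}
keep <- chain[-(1:2000)]                               # discard burn-in
mean(keep)   # estimated posterior mean
var(keep)    # estimated posterior variance
```

Because the proposal is symmetric, the acceptance ratio reduces to the ratio of posterior densities, so the normalizing constant never needs to be computed.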

2. Let y_1, . . ., y_n be a random sample drawn from a mixture of two normal densities with means µ_1 and µ_2 and common variance σ². Let the unknown proportion of the observations drawn from population I be α. Thus the parameter vector of the model is θ = (α, µ_1, µ_2, σ²)′. Let the probability that observation i belongs to population I be denoted by

p_i^(t) = α^(t) φ(y_i; µ_1^(t), σ^(t)²) / [ α^(t) φ(y_i; µ_1^(t), σ^(t)²) + (1 − α^(t)) φ(y_i; µ_2^(t), σ^(t)²) ]

for i = 1, . . ., n. Derive the EM algorithm updates α^(t+1), µ_1^(t+1), µ_2^(t+1), and σ^(t+1)² in terms of p_i^(t).

Bowmaker et al. (1985) analyze data on the peak sensitivity wavelengths for individual microspectrophotometric records on a small set of monkeys' eyes. Data for one monkey are given below:

529.0, 530.0, 532.0, 533.1, 533.4, 533.6, 533.7, 534.1, 534.8, 535.3, 535.4, 535.9, 536.1, 536.3, 536.4,
536.6, 537.0, 537.4, 537.5, 538.3, 538.5, 538.6, 539.4, 539.6, 540.4, 540.8, 542.0, 542.8, 543.0, 543.5,
543.8, 543.9, 545.3, 546.2, 548.8, 548.7, 548.9, 549.0, 549.4, 549.9, 550.6, 551.2, 551.4, 551.5, 551.6,
552.8, 552.9, 553.2

Write an R program to compute the maximum likelihood estimates via the EM algorithm.

Use the initial estimates α^(0) = 0.5, µ_1^(0) = 525, µ_2^(0) = 550 and σ^(0) = 3.5.
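A sketch of one possible implementation; the M-step formulas below are the standard two-component normal-mixture updates and should agree with your own derivation from problem 2 — check them against your algebra before relying on this:

```r
# EM sketch for the two-component normal mixture on the eyes data.
y <- c(529.0, 530.0, 532.0, 533.1, 533.4, 533.6, 533.7, 534.1, 534.8, 535.3,
       535.4, 535.9, 536.1, 536.3, 536.4, 536.6, 537.0, 537.4, 537.5, 538.3,
       538.5, 538.6, 539.4, 539.6, 540.4, 540.8, 542.0, 542.8, 543.0, 543.5,
       543.8, 543.9, 545.3, 546.2, 548.8, 548.7, 548.9, 549.0, 549.4, 549.9,
       550.6, 551.2, 551.4, 551.5, 551.6, 552.8, 552.9, 553.2)
n <- length(y)
alpha <- 0.5; mu1 <- 525; mu2 <- 550; sig <- 3.5   # given initial estimates
for (t in 1:500) {
  # E-step: p_i = P(observation i is from population I | current parameters)
  f1 <- alpha * dnorm(y, mu1, sig)
  f2 <- (1 - alpha) * dnorm(y, mu2, sig)
  p  <- f1 / (f1 + f2)
  # M-step: weighted updates of the four parameters
  alpha <- mean(p)
  mu1   <- sum(p * y) / sum(p)
  mu2   <- sum((1 - p) * y) / sum(1 - p)
  sig   <- sqrt(sum(p * (y - mu1)^2 + (1 - p) * (y - mu2)^2) / n)
}
round(c(alpha = alpha, mu1 = mu1, mu2 = mu2, sigma2 = sig^2), 3)
```

A fixed 500-iteration loop is used only for brevity; monitoring the change in the observed-data log likelihood is a better stopping rule.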

Note: Remember that this project requires individual work; if you have questions, please do not hesitate to ask me.

Due Thursday, April 30, 2009

