Parameter Redundancy Workshop

Maple Practical II
First open the Maple worksheet MaplePartII.mw.
This worksheet again contains procedures, some of which are new. Each new one will be explained
when it is used. They can all be activated again by pressing <enter>; a shortcut is the execute all
button ‘!!!’ on the top menu bar.
Note that you can save time by copying and pasting (<ctrl>+<c>, <ctrl>+<v>) from Maple Practical I
code and previously typed code.
To enter a Greek letter, type its name in words, e.g. for α type alpha.
Example 1a: CJS model – Exploring Exhaustive Summaries
We shall again look at the CJS model, which has recapture probability matrix

P = ( φ1p2   φ1(1-p2)φ2p3   φ1(1-p2)φ2(1-p3)φ3p4 )
    ( 0      φ2p3           φ2(1-p3)φ3p4         )
    ( 0      0              φ3p4                 )
We shall examine all the exhaustive summaries given in the slides. (Reminder: An exhaustive
summary is a vector that uniquely describes a model).
First input the parameters and the recapture probabilities using the code:
>
>
Then we shall examine each of the exhaustive summaries in turn.
The first of the exhaustive summaries was the means. The procedure MuP will find the means of
any recapture or recovery probability matrix. (Enter κ by typing kappa.)
>
Then find the derivative matrix, check that the rank is 5 and find the estimable parameter combinations.
Reminder: the code to do this is:
>
>
>
Now find the derivative matrix, check that the rank is 5 and find the estimable parameter combinations
for each exhaustive summary in turn.
The second of the exhaustive summaries was the non-zero entries of the recapture probability matrix:
>
The procedure logvector finds the natural logarithm of a vector. The third exhaustive summary was
the natural logarithm of the non-zero entries of the recapture probability matrix.
>
The procedure LoglikP finds the log-likelihood exhaustive summary for any recapture or recovery
matrix. The final log-likelihood exhaustive summary can be found using the code:
>
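The worksheet's own Maple commands are not reproduced here, but the underlying calculation is easy to sketch. The following Python/SymPy fragment (an illustration, not the worksheet code; all names are ours) builds the non-zero-entries exhaustive summary for the CJS model above, forms the derivative matrix and checks its rank:

```python
import sympy as sp

# CJS parameters: survival probabilities phi1-phi3, recapture probabilities p2-p4
phi1, phi2, phi3, p2, p3, p4 = sp.symbols('phi1 phi2 phi3 p2 p3 p4', positive=True)
theta = [phi1, phi2, phi3, p2, p3, p4]

# exhaustive summary: the non-zero entries of the recapture probability matrix P
kappa = sp.Matrix([
    phi1*p2,
    phi1*(1 - p2)*phi2*p3,
    phi1*(1 - p2)*phi2*(1 - p3)*phi3*p4,
    phi2*p3,
    phi2*(1 - p3)*phi3*p4,
    phi3*p4,
])

# derivative matrix: differentiate each summary term with respect to each parameter
D = kappa.jacobian(theta)

# evaluate at a generic rational point so the rank computation is exact
point = {phi1: sp.Rational(1, 2), phi2: sp.Rational(1, 3), phi3: sp.Rational(2, 5),
         p2: sp.Rational(1, 7), p3: sp.Rational(3, 11), p4: sp.Rational(5, 13)}
print(D.subs(point).rank())  # 5: one fewer than the 6 parameters, so deficiency 1
```

The rank is 5 whichever exhaustive summary is used, since they all describe the same model: φ3 and p4 are only estimable through their product φ3p4.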
Example 1b: CJS model – missing data
Does the CJS still have deficiency 1 if the recapture data looked like this?

N = ( 67   0    0 )              N = ( 67   0    0 )
    (  0  103   0 )   or this?       (  0  103   3 )
    (  0   0   91 )                  (  0   0   91 )
This can be examined using the log-likelihood exhaustive summary; use the following code to get the new
exhaustive summary for the first case:
>
Now find its derivative matrix and rank, and repeat for the second case.
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
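To see how the calculation changes, here is a SymPy sketch (illustrative names, not the worksheet code), taking the first data set to have non-zero counts only on the diagonal and the second to add a single count in cell (2,3): terms of the log-likelihood exhaustive summary whose cell counts are zero simply drop out, which can change the rank.

```python
import sympy as sp

phi1, phi2, phi3, p2, p3, p4 = sp.symbols('phi1 phi2 phi3 p2 p3 p4', positive=True)
theta = [phi1, phi2, phi3, p2, p3, p4]

# first data set: only the three diagonal counts are non-zero,
# so only the diagonal recapture probabilities enter the log-likelihood
kappa1 = sp.Matrix([sp.log(phi1*p2), sp.log(phi2*p3), sp.log(phi3*p4)])

# second data set: the extra count in cell (2,3) adds one more term
kappa2 = kappa1.col_join(sp.Matrix([sp.log(phi2*(1 - p3)*phi3*p4)]))

point = {phi1: sp.Rational(1, 2), phi2: sp.Rational(1, 3), phi3: sp.Rational(2, 5),
         p2: sp.Rational(1, 7), p3: sp.Rational(3, 11), p4: sp.Rational(5, 13)}
print(kappa1.jacobian(theta).subs(point).rank())  # 3
print(kappa2.jacobian(theta).subs(point).rank())  # 4
```

With 6 parameters these ranks correspond to deficiencies of 3 and 2 respectively, so missing cells can push the deficiency well beyond 1.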
Example 1c: CJS model - Reparameterisation
A possible reparameterisation for the CJS model is
The Maple code for this is:
>
First we check this is a unique reparameterisation
>
To be unique the rank should be 5, as the reparameterisation is of length 5.
Next we rewrite the original exhaustive summary in terms of the reparameterisation. Here we are
using the ln(P) exhaustive summary, as this was the simplest. Note that we actually rewrite in terms
of ssi, as si has already been assigned a value in the Maple code.
>
Our new parameters are the ssi:
>
The derivative matrix Ds and its rank can be calculated using:
>
>
The rank should be equal to 5, again showing us that the CJS has 5 estimable parameters and that
we have a reduced-form exhaustive summary.
Now use s as an exhaustive summary to check (yet again) that the model has 5 estimable parameters
and that its estimable parameter combinations are as we expect.
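As the worksheet's reparameterisation is not reproduced above, here is one choice that works, sketched in SymPy (the vector s below is our own illustrative assumption and may differ from the slides): s = (φ1p2, φ2p3, φ3p4, φ1(1-p2), φ2(1-p3)).

```python
import sympy as sp

phi1, phi2, phi3, p2, p3, p4 = sp.symbols('phi1 phi2 phi3 p2 p3 p4', positive=True)
theta = [phi1, phi2, phi3, p2, p3, p4]

# an illustrative length-5 reparameterisation of the CJS model
s = sp.Matrix([phi1*p2, phi2*p3, phi3*p4, phi1*(1 - p2), phi2*(1 - p3)])

# uniqueness check: the derivative of s with respect to theta should have rank 5
point = {phi1: sp.Rational(1, 2), phi2: sp.Rational(1, 3), phi3: sp.Rational(2, 5),
         p2: sp.Rational(1, 7), p3: sp.Rational(3, 11), p4: sp.Rational(5, 13)}
print(s.jacobian(theta).subs(point).rank())  # 5, so this reparameterisation is unique

# every entry of the ln(P) exhaustive summary is a product of entries of s,
# e.g. phi1*(1 - p2)*phi2*p3 = s[3]*s[1]
assert sp.simplify(phi1*(1 - p2)*phi2*p3 - s[3]*s[1]) == 0
```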
Example 1d: CJS model – PLUR decomposition
Now consider adding survival covariates to the CJS model, with φi = 1/{1 + exp(a + bxi)}.
The parameters are now:
>
We use the above exhaustive summary, s, now evaluated at φi = 1/{1 + exp(a + bxi)}.
>
Then find the derivative matrix and its rank. Is this model full rank?
…………………………………………………………………………………………………….
The procedure PLURdecomp will find a modified PLUR decomposition and return the matrices U
and R and the determinant of U in results, respectively. Use the code (for derivative matrix D1):
>
When is the model parameter redundant?
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
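The PLUR idea can also be illustrated directly in SymPy (our own sketch, assuming the logistic form φi = 1/{1 + exp(a + bxi)}; the Maple PLURdecomp output will differ in form). Instead of forming U explicitly, we watch the rank of the derivative matrix change with the covariate values, which is exactly the information det(U) encodes:

```python
import sympy as sp

a, b, p2, p3, p4, x1, x2, x3 = sp.symbols('a b p2 p3 p4 x1 x2 x3')
theta = [a, b, p2, p3, p4]

# survival through the logistic covariate model phi_i = 1/(1 + exp(a + b*x_i))
f1, f2, f3 = [1/(1 + sp.exp(a + b*x)) for x in (x1, x2, x3)]

kappa = sp.Matrix([
    f1*p2,
    f1*(1 - p2)*f2*p3,
    f1*(1 - p2)*f2*(1 - p3)*f3*p4,
    f2*p3,
    f2*(1 - p3)*f3*p4,
    f3*p4,
])
D1 = kappa.jacobian(theta)

def rank_at(xvals, evals):
    # replace each exp(a + b*x_i) by an exact rational, then fix x_i and the p's,
    # so the rank is computed in exact arithmetic
    subs = [(sp.exp(a + b*xi), e) for xi, e in zip((x1, x2, x3), evals)]
    subs += [(x1, xvals[0]), (x2, xvals[1]), (x3, xvals[2]),
             (p2, sp.Rational(1, 7)), (p3, sp.Rational(3, 11)), (p4, sp.Rational(5, 13))]
    return D1.subs(subs).rank()

half, third, fifth = sp.Rational(1, 2), sp.Rational(1, 3), sp.Rational(1, 5)
print(rank_at((1, 2, 3), (half, third, fifth)))  # 5: full rank at these distinct covariates
print(rank_at((1, 1, 1), (half, half, half)))    # 4: redundant when x1 = x2 = x3
```

For these distinct covariate values the model is full rank; when the covariate values coincide the rank drops to 4 and the model is parameter redundant, which is where det(U) vanishes.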
Example 2: Ring-recovery – nested models
Now consider again the model for the recovery of dead animals, where the parameters are
[φ1,1  φ1,2  φa  λ1  λa] and the probabilities of recovery are

P = ( (1-φ1,1)λ1   φ1,1(1-φa)λa   φ1,1φa(1-φa)λa )
    ( 0            (1-φ1,2)λ1     φ1,2(1-φa)λa   )
Enter the probabilities again (you could copy and paste from Workshop I) and find its derivative
matrix and rank using the ln(P) exhaustive summary.
Find the modified PLUR decomposition, using the code (for derivative matrix D1):
>
The following code will find where the determinant is zero:
>
To find the deficiency of U at the point φ1,1 = φ1,2 use the code:
>
What is the determinant of U and when does it equal zero? What does this tell us about any nested
models?
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
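The determinant calculation can also be sketched directly in SymPy (illustrative symbol names, with the recovery probabilities written out as we read them above). Since P, L and R are full rank, det(D1) vanishes exactly where det(U) does, and here it factorises to reveal the nested model:

```python
import sympy as sp

# ring-recovery parameters: first-year survivals phi11, phi12, adult survival phia,
# first-year and adult recovery probabilities lam1, lama
phi11, phi12, phia, lam1, lama = sp.symbols('phi11 phi12 phia lam1 lama', positive=True)
theta = [phi11, phi12, phia, lam1, lama]

# ln(P) exhaustive summary: logs of the non-zero recovery probabilities
kappa = sp.Matrix([sp.log(v) for v in (
    (1 - phi11)*lam1,
    phi11*(1 - phia)*lama,
    phi11*phia*(1 - phia)*lama,
    (1 - phi12)*lam1,
    phi12*(1 - phia)*lama,
)])
D1 = kappa.jacobian(theta)

det = sp.factor(D1.det())
print(det)  # the numerator contains the factor (phi11 - phi12)
print(sp.simplify(det.subs(phi11, phi12)))  # 0: the rank drops when phi11 = phi12
```

The determinant is zero exactly when φ1,1 = φ1,2, so the full model is full rank but that nested model is parameter redundant.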
Example 3: Multi-site models – reparameterisation
Hunter and Caswell examine multi-state models for seabirds. The calculation of the probability that
an individual released in stage i at time r is next captured in stage j at time c in a vector form is
carried out here using the procedure parray( , N) (for N sampling occasions). The main inputs into
this procedure are the transition matrix and the recapture matrix. Here we shall look at a 4 state
model for breeding success and failure. The transition and recapture matrices are

Φ = ( σ1β1γ1       σ2β2γ2       σ3β3γ3       σ4β4γ4     )
    ( σ1β1(1-γ1)   σ2β2(1-γ2)   σ3β3(1-γ3)   σ4β4(1-γ4) )
    ( σ1(1-β1)     0            σ3(1-β3)     0           )
    ( 0            σ2(1-β2)     0            σ4(1-β4)    )

and

Π = ( p1  0   0   0 )
    ( 0   p2  0   0 )
    ( 0   0   0   0 )
    ( 0   0   0   0 )
A little more detail on this multi-state model can be found in the slides.
First we input the parameters and the transition and recapture matrices. (Note the use of A rather
than Φ for the transition matrix, and P for the recapture matrix.)
>
>
>
Next we set up our exhaustive summary, which consists of the probabilities of being recaptured for
5 years' worth of data. We use the code:
>
This should have 40 entries; to look at the first 8 of these use the code:
>
If we tried to calculate the rank of the derivative matrix directly, we would be waiting a very long
time (and probably end up crashing the computers!). Instead we use reparameterisation to find a
simpler exhaustive summary. Our reparameterisation is:
>
First check this is a reduced-form exhaustive summary. This should be rank 14 as required.
>
Then we rewrite the original exhaustive summary in terms of s. As the original exhaustive summary
repeats after the 16th entry, we only examine the first 16 terms.
>
>
Next find the derivative matrix and its rank:
>
>
>
Therefore how many estimable parameters are there in the original model? ……………….
s is not an exhaustive summary, but we can find one using:
>
The (reduced-form) exhaustive summary is then:
>
>
>
You can now use this reduced-form exhaustive summary (sre) to find a derivative matrix with
respect to the parameters. This can be used to check again the number of estimable parameters.
What are the estimable parameter combinations?
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
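The reason a reparameterisation shortens the calculation is the chain rule: Dθ(κ) = Ds(κ)·Dθ(s), so the large derivative matrix is the product of two much simpler ones. A SymPy sketch on the small CJS model of Example 1c (with our own illustrative s, not the seabird model) shows the two-stage calculation giving the same rank as the direct one:

```python
import sympy as sp

phi1, phi2, phi3, p2, p3, p4 = sp.symbols('phi1 phi2 phi3 p2 p3 p4', positive=True)
s1, s2, s3, s4, s5 = sp.symbols('s1 s2 s3 s4 s5', positive=True)
theta = [phi1, phi2, phi3, p2, p3, p4]
svars = [s1, s2, s3, s4, s5]

# illustrative CJS reparameterisation and the summary rewritten in terms of it
s_of_theta = sp.Matrix([phi1*p2, phi2*p3, phi3*p4, phi1*(1 - p2), phi2*(1 - p3)])
kappa_of_s = sp.Matrix([s1, s4*s2, s4*s5*s3, s2, s5*s3, s3])

# chain rule: D_theta(kappa) = D_s(kappa) evaluated at s(theta), times D_theta(s)
D = kappa_of_s.jacobian(svars).subs(dict(zip(svars, s_of_theta))) * s_of_theta.jacobian(theta)

point = {phi1: sp.Rational(1, 2), phi2: sp.Rational(1, 3), phi3: sp.Rational(2, 5),
         p2: sp.Rational(1, 7), p3: sp.Rational(3, 11), p4: sp.Rational(5, 13)}
print(D.subs(point).rank())  # 5, the same rank as the direct derivative matrix
```

For the seabird model the same factorisation is what makes the rank computable at all.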
Hunter and Caswell (2008) consider constrained models, where parameters of the same type are set
equal; for example {σ1 = σ2 = σ3 = σ4} or {σ1 = σ2 = σ3 = σ4, p1 = p2}. Determining possible
parameter redundancy under such constraints just involves forming the derivative matrix by
differentiating the exhaustive summary (sre) evaluated at the reduced parameter set. The
procedure constraints will do this for a given exhaustive summary and a given set (of at least 2)
constraints and their parameters.
>
>
>
Now explore more of the possible constraint models, formed from setting parameters equal to each
other, such as those given in the table on slide 18.
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
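The calculation behind the constraints procedure is just differentiation at the reduced parameter set. As a small self-contained illustration (using the CJS model of Example 1 rather than the seabird model, with our own names), constraining all recapture probabilities to be equal gives:

```python
import sympy as sp

phi1, phi2, phi3, p = sp.symbols('phi1 phi2 phi3 p', positive=True)
theta = [phi1, phi2, phi3, p]  # reduced parameter set under p2 = p3 = p4 = p

# exhaustive summary evaluated at the constrained parameters
kappa = sp.Matrix([
    phi1*p,
    phi1*(1 - p)*phi2*p,
    phi1*(1 - p)*phi2*(1 - p)*phi3*p,
    phi2*p,
    phi2*(1 - p)*phi3*p,
    phi3*p,
])

point = {phi1: sp.Rational(1, 2), phi2: sp.Rational(1, 3),
         phi3: sp.Rational(2, 5), p: sp.Rational(1, 7)}
print(kappa.jacobian(theta).subs(point).rank())  # 4: this constrained model is full rank
```

Applying a constraint can therefore remove a deficiency: the unconstrained CJS model has deficiency 1, but with a common recapture probability all 4 remaining parameters are estimable.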
Newt’s Challenge
In Rachel’s talk she described multi-state analysis of Great Crested Newts. Here we shall look at
parameter redundancy of this multi-state mark-recapture model. There are two states: state 1,
breeding, and state 2, non-breeding. The newts are only ever captured in the breeding state, so the
breeding state is observable and the non-breeding state is unobservable. The probabilities of being
recaptured for the first time can be found using the procedure parray, as in Example 3.
First consider a model with the following parameterisation (with N sampling occasions):
φt: annual survival, which is dependent on time but not on state (t = 1,…,N – 1)
ψ1,1: transition from state 1 to state 1 (breeding to breeding)
ψ2,1: transition from state 2 to state 1 (non-breeding to breeding)
p1: recapture probability in state 1 (p2 = 0)
The transition and probability matrices are:
t 2,1 
 t 1,1
 p 0
 
and    1

.

(
1


)

(
1


)
0
0
t
1
,
1
t
2
,
1




The parameters are [1  2 ...  N 1  1,1  2,1 p1 ]
(NB the different notation to Rachel’s talk is to be compatible with Maple)
Is this model parameter redundant for N = 3, 4 and 5? Is there a general result for all N?
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
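A direct SymPy sketch for N = 3 (our own construction of the first-recapture probabilities, which is what we assume parray computes; transition matrix columns index the state moved from):

```python
import sympy as sp

# newt model, N = 3: time-dependent survival, constant transitions, capture in state 1 only
ph1, ph2, psi11, psi21, p1 = sp.symbols('phi1 phi2 psi11 psi21 p1', positive=True)
theta = [ph1, ph2, psi11, psi21, p1]

def Phi(phi):
    # column = state moved from, row = state moved to
    return sp.Matrix([[phi*psi11, phi*psi21],
                      [phi*(1 - psi11), phi*(1 - psi21)]])

Pi = sp.Matrix([[p1, 0], [0, 0]])  # newts are only captured in the breeding state
e1 = sp.Matrix([1, 0])             # releases are in state 1
I2 = sp.eye(2)

# probability of first recapture at occasion c after release at occasion r:
# capture at c, no capture at any occasion in between
q12 = (Pi*Phi(ph1)*e1)[0]
q13 = (Pi*Phi(ph2)*(I2 - Pi)*Phi(ph1)*e1)[0]
q23 = (Pi*Phi(ph2)*e1)[0]
kappa = sp.Matrix([q12, q13, q23])

point = {ph1: sp.Rational(1, 2), ph2: sp.Rational(1, 3), psi11: sp.Rational(1, 5),
         psi21: sp.Rational(2, 7), p1: sp.Rational(3, 11)}
print(kappa.jacobian(theta).subs(point).rank())  # 3
```

For N = 3 the summary has only 3 entries for 5 parameters, so the rank can be at most 3 and the model must be parameter redundant; repeat the construction for N = 4 and 5 to see how the deficiency behaves as N grows.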
What about the nested model where transition to the breeding state is the same regardless of the
state the newt was in previously (ψ1,1 = ψ2,1): is this parameter redundant? What is the deficiency?
…………………………………………………………………………………………………….
Now consider the most general model where all parameters are dependent on state and time.
The transition and probability matrices are:
t , 2 t , 2,1 
0
 t ,1 t ,1,1
p
 
and    t ,1

.
 0 0
t ,1 (1  t ,1,1 ) t , 2 (1  t , 2,1 )
The parameters are [1,1 ...  N 1,1 1, 2 ...  N 1, 2  1,1,1 ...  N 1,1,1  1, 2,1 ...  N 1, 2,1 p 2,1 p N ,1 ]
How many parameters are estimable for the general model for N = 3, 4, 5 and 6 recapture occasions?
(Hint N = 3 and N = 4 should be possible directly. N = 5 and N = 6 require using reparameterisation.)
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….
…………………………………………………………………………………………………….