Course on Study Design
Hands-On Session – Implementation of model-based predictions
using simulation/reestimation
Model-based design of dose finding studies
Introduction:
This exercise aims to illustrate simulation/reestimation-based evaluation of proposed designs.
Based on models developed from data on a parallel group phase 2a (PoC) study (0, 5, 30 mg
daily), different designs are evaluated with respect to their ability to precisely estimate the
parameters of a longitudinal model, as well as to allow discrimination between models, both
longitudinal models and models using only the last time point. The project team has planned
to run a dose finding study with 5 doses (0, 5, 15, 30, 60 mg) and 60 subjects per arm, using
the same parallel group design as in phase 2a.
Files provided:
Data set:
data1.csv = data from the original PoC trial (N=90, 30 per arm)
data11.csv = as data1.csv but with the proposed design (N=300, 60 per arm)
data17.csv = as data11.csv but N=3000
data19.csv = 1 observation per subject (TIME=3) for 10 dose levels and N=10000
Model files:
run10.mod = model implementing an Emax model
run11.mod = as run10.mod but with the data set of the proposed phase 2b design
run12.mod = model with linear drug effect
run13.mod = model without drug effect
run14.mod = model for last time point only, implementing an Emax model
run15.mod = model for last time point only, implementing a linear model
run16.mod = model for last time point only, without drug effect
run19.mod = model for last time point change from baseline only
Tasks:
1) Explore an Emax model as an alternative to describe the Phase 2a data
a. In the original model development the final model was selected with a linear
drug effect, (partially) based on parsimony. For the design of the dose finding
study it is of interest to also explore other model structures and how
informative the designs are with respect to more complex models. Review
run10.mod, then estimate the model and review its output.
[PsN]
execute run10.mod
sumo run10.lst
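For orientation, the drug-effect part of an Emax model is typically written as effect =
E0 + EMAX*dose/(ED50 + dose). The R sketch below illustrates this typical (population)
dose-response with placeholder parameter values; the actual implementation and estimates
are those in run10.mod, so treat the numbers here as illustrative only.
[R]
## Minimal sketch of the typical dose-response implied by an Emax model.
## The parameter values below are illustrative placeholders, not the
## estimates from run10.mod.
emax_effect <- function(dose, e0, emax, ed50) {
  e0 + emax * dose / (ed50 + dose)
}
## typical effect at the phase 2a doses, using the placeholder parameters
emax_effect(c(0, 5, 30), e0 = 0, emax = 30, ed50 = 20)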
b. The output from run10.mod indicates that there is considerable uncertainty
in ED50. Please explore this uncertainty further using the log-likelihood
profile (llp) method in PsN. To explore the confidence interval on ED50
(=THETA(5)) the following command is used. What is the confidence
interval for ED50?
[PsN]
llp run10.mod -thetas=5
c. Applying llp to each parameter is somewhat cumbersome, so instead we explore
the parameter distributions in a bootstrap of run10.mod. This bootstrap also
generates the parameter uncertainty used later in a simulation-reestimation (SSE)
evaluation. Histograms of the parameter estimates can be visualized using
Xpose.
[PsN]
bootstrap run10.mod -samples=100 -dir=boot10
[Xpose]
boot.hist(results.file="./boot10/raw_results_run10.csv",
incl.ids.file="./boot10/included_individuals10.csv")
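As a complement to the histograms, nonparametric percentile intervals can be read directly
from the bootstrap raw results. The R sketch below shows the idea; the exact column name
holding ED50 depends on the parameter labels in your raw_results file, so inspect the header
and adjust.
[R]
## Sketch: percentile-based 95% interval for ED50 from the bootstrap results.
boot_res <- read.csv("boot10/raw_results_run10.csv")
names(boot_res)   # locate the ED50 column (its name depends on your labels)
## assuming the column is called ED50; the first row is usually the fit to
## the original data set rather than a bootstrap sample, so it is dropped
quantile(boot_res$ED50[-1], probs = c(0.025, 0.5, 0.975), na.rm = TRUE)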
2) Evaluate the proposed design using stochastic simulations and estimation (sse)
a. For the evaluation of the proposed design we will perform a stochastic
simulation and estimation exercise, using the PsN functionality sse. We
will simulate from run11.mod, incorporating parameter uncertainty through the
parameter vectors from the preceding bootstrap. For analysis, we investigate
Emax, slope and step drug-effect models. We will do so both for a longitudinal
analysis using the 4 observations per subject (MEM; models 11, 12 and 13)
and later for models that utilize only end-of-treatment change from baseline
(CFB; models 14, 15 and 16). [40 simulations only, for runtime
considerations]
[PsN]
sse run11.mod -samples=40 -dir=sseMEM -seed=1234
-alt=run12.mod,run13.mod
-rawres_input=boot10/raw_results_run10.csv
b. Inspect the sse_results.csv file and determine whether the parameters of the
model used for simulation were well recovered in the estimation. What was the
power to identify an Emax model over the slope and step models?
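The power here amounts to counting how often, over the simulated data sets, the Emax model
fits significantly better than the reduced (slope or step) model in a likelihood-ratio sense.
The R sketch below shows that calculation under the usual chi-square approximation; the
vectors of objective function values would have to be extracted from the sse output, whose
exact layout depends on your PsN version, so this is illustrative only.
[R]
## Sketch: likelihood-ratio based power from per-sample objective function
## values. ofv_full and ofv_reduced are vectors with one OFV per simulated
## data set, extracted from the sse output.
lrt_power <- function(ofv_full, ofv_reduced, df = 1, alpha = 0.05) {
  dofv <- ofv_reduced - ofv_full
  mean(dofv > qchisq(1 - alpha, df = df), na.rm = TRUE)
}
## usage, assuming the Emax model has one more estimated parameter than the
## slope model:
# lrt_power(ofv_emax, ofv_slope, df = 1)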
c. The power to identify an Emax model (in contrast to the slope or step models)
is likely linked to the true ED50 value. We can estimate the power when data
were simulated with ED50 < 60, that is, within the range of doses tested, using
the command below.
[PsN]
sse run11.mod -samples=40 -dir=sseMEMlowED50 -seed=1234
-alt=run12.mod -rawres_input=boot10/raw_results_run10.csv
-in_filter=5_ED50.lt.60
d. In the sse above, each simulated data set was generated with a separate
population parameter vector and resulted in a different estimated parameter
vector. For each simulation parameter vector we can compute the expected
typical individual difference from placebo at end of study (TIME=3) for a
certain dose; this gives an idea of the probability of reaching a certain
desired clinical effect (let us set this desired effect at 20). In parallel,
we can compute the corresponding estimated typical individual response (from
the sse). We can thus form a two-by-two table with the true effect being
above or below 20, and similarly for the estimated effect, as sketched below.
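A minimal R sketch of that tabulation is given below. It assumes the typical end-of-study
difference from placebo at a given dose can be computed as EMAX*dose/(ED50 + dose) from each
parameter vector, and that the simulation ("true") and estimated parameter values have been
extracted from the sse raw results; column names will depend on your parameter labels, so
adjust accordingly.
[R]
## Sketch: two-by-two table of true vs estimated typical effect at TIME=3
## being above or below the desired clinical effect of 20, for a chosen dose.
typical_effect <- function(dose, emax, ed50) emax * dose / (ed50 + dose)
## usage: sim_emax, sim_ed50, est_emax, est_ed50 are assumed vectors (one
## element per sse sample) taken from the sse raw results
# true_eff <- typical_effect(60, sim_emax, sim_ed50)
# est_eff  <- typical_effect(60, est_emax, est_ed50)
# table(true = true_eff >= 20, estimated = est_eff >= 20)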
e. Now use the same simulations to investigate the properties of models that
use change from baseline (CFB) data at treatment end (TIME=3) only.
[PsN]
sse run11.mod -samples=100 -dir=sseCFB -seed=1234
-alt=run14.mod,run15.mod,run16.mod
-rawres_input=raw_results_run10.csv
-no-estimate_simulation
3) Evaluate the proposed design using Monte Carlo mapped power (mcmp)
a. Update run10.mod to a model that contains the final estimates from that run.
Then change $DATA in run17.mod to point to data17.csv. This data set is as
data11.csv but with 10 times more subjects.
[PsN]
update run10.mod -out=run17.mod
b. With run17.mod as both simulation model and full model, compare the power to
detect a significant difference between an Emax and a step (run13.mod)
model when the analysis uses all data per individual (MEM). How many
subjects are needed for 80% power at a 5% significance level?
[PsN]
mcmp -full_model=run17.mod -reduced_model=run13.mod
-stratify_on=ARM -dir=mcmpMEM
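The mcmp output itself reports the power versus sample size, but the underlying Monte Carlo
mapped power idea can be sketched in a few lines of R: individual delta-OFV contributions
(full versus reduced model, evaluated on the rich simulated data set) are resampled to the
sample size of interest, summed, and compared with the chi-square critical value. The sketch
below is illustrative only and assumes a vector ind_dofv of per-individual delta-OFV values,
which mcmp derives internally.
[R]
## Sketch of the Monte Carlo mapped power principle.
## ind_dofv: assumed vector of per-individual delta-OFV values (full vs
## reduced model) from the rich simulated data set.
mcmp_power <- function(ind_dofv, n, nsim = 1000, df = 2, alpha = 0.05) {
  crit <- qchisq(1 - alpha, df = df)
  mean(replicate(nsim, sum(sample(ind_dofv, n, replace = TRUE)) > crit))
}
## usage: scan sample sizes to find where power first reaches 80%
# sapply(c(100, 200, 300, 400), function(n) mcmp_power(ind_dofv, n))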
c. With run17.mod as simulation model, compare the power to detect a
significant difference between the Emax (run14.mod) and step (run16.mod)
models when the analysis uses only one change-from-baseline (CFB)
datum per individual.
[PsN]
mcmp -simulation_model=run17.mod -full_model=run14.mod
-reduced_model=run16.mod -stratify_on=ARM -dir=mcmpCFB
4) Evaluate the proposed design using sse and point estimates from the PoC study
a. Copy run17.mod to run18.mod and change $DATA to data11.csv. With
run18.mod as simulation model, we can perform an sse to investigate the
expected precision of the parameter estimates under the proposed design.
[PsN]
sse run18.mod -samples=100 -dir=sse18
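One way to summarize the expected precision is to compute, across the sse samples, the bias
and spread of each parameter's estimates relative to its simulation value. The R sketch below
shows this for a single parameter; the estimate vector would be extracted from the sse18 raw
results, whose column names depend on your parameter labels.
[R]
## Sketch: empirical bias and relative standard error for one parameter.
param_summary <- function(est, true_value) {
  c(bias_pct = 100 * (mean(est, na.rm = TRUE) - true_value) / true_value,
    rse_pct  = 100 * sd(est, na.rm = TRUE) / true_value)
}
## usage: ed50_est is an assumed vector of ED50 estimates across the sse
## samples, ed50_sim the value used for simulation
# param_summary(ed50_est, true_value = ed50_sim)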
b. The resulting set of parameter vectors can be used to generate a plot of the
uncertainty in the mean effect versus the dose given. Note that the
raw_results_run18sse.csv file is generated by opening the
sse18/raw_results_run18.csv file, deleting the first column*, duplicating the
second row*, and then saving it in the main directory as
raw_results_run18sse.csv (this step can also be scripted; see the [R] sketch at
the end of this task). The model file (run19.mod) uses a data file (data19.csv)
with many (10) dose levels and 1000 subjects per dose level. [*This needs to be
done because the vpc command expects a rawres_input file from a bootstrap
procedure, not an sse procedure. The two procedures generate slightly different
raw_results_runX.csv files.]
[PsN]
vpc run19.mod -samples=100 -idv=DOSE -flip_comments
-rawres_input=raw_results_run18sse.csv -dir=vpc_MEM300
[Xpose]
xpose.VPC(vpc.info="vpc_MEM300/vpc_results.csv",
vpctab="vpc_MEM300/vpctab19", PI.real=NULL, type="n", ylim=c(0,60))
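The manual editing of the raw results file described in b. can also be scripted. The R sketch
below follows the same steps (drop the first column, duplicate the second row of the file,
save under the new name); it assumes the first field of each row never contains a quoted
comma, so verify the result against the manual procedure.
[R]
## Sketch: reshape the sse raw results into the bootstrap-like layout
## expected by vpc -rawres_input.
lines <- readLines("sse18/raw_results_run18.csv")
lines <- sub("^[^,]*,", "", lines)           # delete the first column
lines <- append(lines, lines[2], after = 2)  # duplicate the second row
writeLines(lines, "raw_results_run18sse.csv")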