Methods for Evaluating Effectiveness of Implementation Strategies Using a Randomized Implementation Trial

C Hendricks Brown
Director, Prevention Science and Methodology Group
Director, Social Systems Informatics
University of Miami Miller School of Medicine
chbrown@med.miami.edu
Acknowledgements
Co-Authors
Patti Chamberlain, OSLC
John Landsverk, Rady Children’s Hospital
Lisa Saldana, Center for Research to Practice
Larry Palinkas, USC
Mitsu Ogihara, U Miami
Prevention Science and Methodology Group
Acknowledgements
• NIMH R01 MH076158 Chamberlain PI, Community
Development Teams to Scale-Up MTFC in California
• NIMH P30MH074678-03 Landsverk, PI, Advanced
Center to Improve Pediatric Mental Health Care
• NIDA R01 DA019984 Poduska, PI, A Follow-up of
Classroom Services to Prevent Drug Use
• NIDA R21DA024370 Poduska, PI, Scaling-up Prevention
Services for Early Drug Abuse Risk in School Systems
• NIMH/NIDA R01-MH040859, Brown PI, Methods for
MH/DA Prevention and Early Intervention, Prevention
Science and Methodology Group (PSMG)
Outline of Concepts and Methods
1. Methods Problems and New Directions for (Prevention) Implementation Science
   - Measurement to Monitor Implementation Stage
   - Models to Represent and Analyze Implementation
   - Testing to Evaluate Implementation Strategies
2. Roll-Out Implementation Trials: CAL-40 Experience
   - Testing: Randomized Implementation Trial
   - Measure: Stages of Implementation Completion (SIC)
   - Model: Directed Action (DATI)
3. Conclusions
Implementation Research
Involves the use of strategies to adopt and
integrate evidence-based health interventions
and change practice patterns within specific
settings – Chambers 2008
Implementation Science
Producing general, replicable knowledge
regarding the success (or failure) of
implementation strategies
1. Methods Problems and New Directions for (Prevention) EXPERIMENTAL Implementation Science

[Figure: timeline of trial designs across three levels of intervention. Individual interventions in medicine: the clinical trial (1940s) through adaptive interventions at the individual level (1970s). Group-level interventions in prevention: multilevel and growth modeling (2000s) and the place-based longitudinal trial. System-level interventions for implementation, involving interactions within and across systems: designs still to be determined (????).]
If Randomized Trials Are to Play an
Important Part in Implementation
• Correspondences with Existing Trials
  - Research Questions → Testable Hypotheses
  - Efficacy, Effectiveness, or Implementation?
  - Trial Conduct and Protocols
  - Population → Sample Selection
  - Random Assignment
  - Evaluation
• Extensions of Existing Trials
• Real Differences with Existing Trials
Changing Research Questions: Effectiveness vs Implementation

[Diagram contrasting the two questions: both involve a System to Support Adoption and an Intervention; effectiveness research focuses on the Intervention, implementation research on the System to Support Adoption.]
Intervention Research Phases, Including Implementation

[Figure: phases of intervention research, from Efficacy → Effectiveness → Adoption → Fidelity → Sustainability → Going to Scale → Systemwide → Sustaining Sustainability; the phases from adoption onward fall within the domain of implementation science.]

O'Connell, Boat & Warner, 2009 NAS; Landsverk J, Brown CH, Chamberlain P, et al.; Aarons et al., in press
Phases of Implementation Research
and Related Hypotheses
• Adoption – increase quantity and speed of adopting interventions in communities
  Does Strategy A increase adoption over B?
• Implementation Fidelity – improve the quality of program delivery
  Does Strategy A increase fidelity, fit, or successful adaptation over B?
• Sustainability – maintain quality and expand the quantity of program delivery
  Does Strategy A improve quality of delivery by "Intervention Agents" over time compared to B?
Implementation Method: Testing
• There is a role for randomized implementation
trials, especially at this early stage of the
science
Why Trials? There Are Very Few Broad Street Pump Handles Left to Remove

John Snow's map of London. Snow proposed his theory in 1849; in the 1854 outbreak, removal of the Broad Street pump handle led to an immediate reduction in cholera deaths, from roughly 500 to about 0.
Dealing with "Sample Size"
• Even with 40 randomized counties, a large effect is still needed
• "Roll-Out Trials" – eventually all communities get the active intervention
• "Cumulative Trials" – the implementation strategy evolves over time; combined analyses reflect these changes
Brown et al., 2008 ARPH; Brown et al., 2006 Clinical Trials
Randomized Trials Are Well Represented in Implementation
Prevention:
• PROSPER, Communities that Care
Child welfare/mental health implementation: 338 papers
• 9 of 338 studies had a comparison group
  – 8 of 9 used a randomized trial
Quality Improvement in Health Care
• Cochrane Collaboration Effective Practice and Organization of Care Review Group (EPOC) reviews
  – 57% exclusively randomized trials
Landsverk, Brown, Chamberlain et al. (accepted for publication); Landsverk, Brown, Rolls Reutz et al. (2011)
2. Methods Illustrations from the CAL-40 Implementation Trial
Test Implementation of Multidimensional Treatment Foster Care in 40 Counties in California

Objective: Test the effectiveness of the Community Development Team (CDT), a theory-driven model to promote the adoption, implementation, and sustainability of the evidence-based Multidimensional Treatment Foster Care (MTFC) intervention in California counties not already using MTFC.

Method: Randomize counties into 6 equivalent clusters; 3 receive CDT, the other 3 receive standard implementation.

Measures: Time it takes to adopt, recruit staff, train, place youth in MTFC homes, and receive certification.
Outcome(s) of the CAL-40 Randomized Implementation Trial
How fast do counties in the two conditions adopt, implement with fidelity, and sustain the intervention? (Survival analysis)
Primary Outcome: Time a county takes to achieve certification of sustainability.
Secondary Outcome: Timing in attaining Stages of Implementation Completion (SIC).
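The survival framing lends itself to standard time-to-event tools. Below is a minimal sketch, assuming the Python lifelines package and entirely made-up county data (column names such as days_to_certification are hypothetical), that compares time-to-certification across the two conditions with Kaplan-Meier estimates and a log-rank test.

```python
# Sketch: comparing time-to-certification across conditions.
# Assumes the `lifelines` package; data and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# One row per county: days until certification (or censoring) and
# whether certification was observed before the study ended.
counties = pd.DataFrame({
    "condition": ["CDT"] * 4 + ["IND"] * 4,
    "days_to_certification": [410, 520, 700, 650, 580, 720, 800, 760],
    "certified": [1, 1, 0, 1, 1, 0, 0, 1],  # 0 = censored
})

kmf = KaplanMeierFitter()
for name, grp in counties.groupby("condition"):
    kmf.fit(grp["days_to_certification"], grp["certified"], label=name)
    print(name, "median time to certification:", kmf.median_survival_time_)

cdt = counties[counties.condition == "CDT"]
ind = counties[counties.condition == "IND"]
result = logrank_test(cdt.days_to_certification, ind.days_to_certification,
                      event_observed_A=cdt.certified,
                      event_observed_B=ind.certified)
print("log-rank p-value:", result.p_value)
```

A full analysis would also account for the clustering of counties within CDT groups, for example with a frailty or cluster-robust survival model.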
Two-Arm Trials: Effectiveness vs Implementation

[Diagram of two two-arm trials, each arm measured on Youth outcomes:
- Effectiveness trial: Control condition (existing implementation supports for county, agency, group home) vs. MTFC intervention (MTFC implementation supports for county, agency, clinicians, parent).
- Implementation trial: MTFC intervention with standard implementation supports for the county vs. MTFC intervention with CDT implementation supports for the county.]
Randomly Assign 40 Counties
Inclusion/Exclusion Criteria: all counties included except
• Early adopters
• Counties with too few placements
• One where a court case was occurring (LA)
Counties balanced on fixed factors (SES, size, EPSDT), then randomized to TIME (3 cohorts) and CONDITION of implementation strategy (Community Development Team, CDT, vs. Independent)
Account for county-level clustering in the CDT condition
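A rough sketch of this balance-then-randomize logic, with entirely hypothetical covariate values: sort counties on the fixed factors, deal them into 6 equivalent clusters, then randomly assign the clusters to the 6 cohort-by-condition cells.

```python
# Sketch: balanced randomization of 40 counties to cohort x condition.
# County records and the blocking rule are hypothetical illustrations.
import random

random.seed(2024)

# Hypothetical county records: (name, SES index, size, EPSDT rate)
counties = [(f"county_{i:02d}", random.random(),
             random.randint(50, 500), random.random())
            for i in range(40)]

# Sort on the fixed factors, then deal counties round-robin into
# 6 clusters so each cluster spans the covariate range.
counties.sort(key=lambda c: (c[1], c[2], c[3]))
clusters = [counties[k::6] for k in range(6)]

# Randomly assign the 6 equivalent clusters to the 6 cells of the
# design: 3 cohorts (start times) x 2 conditions.
cells = [(cohort, cond) for cohort in (1, 2, 3)
         for cond in ("CDT", "IND")]
random.shuffle(cells)
for cluster, (cohort, cond) in zip(clusters, cells):
    for name, *_ in cluster:
        print(name, "-> cohort", cohort, cond)
```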
Initial CAL-40 Design

[Diagram: 40 CA counties, randomized to CDT vs. Standard conditions across 3 cohorts; while cohort 1 runs, 26 counties are wait-listed; while cohort 2 runs, 13 counties remain wait-listed.]
Intervention Assignment
• Randomly assigned to condition and timing (cohort)
• Roll-out trial, dynamic wait-listed design (Brown et al., 2006 Clinical Trials)
REMARK: NEARLY IMPOSSIBLE TO HAVE A TRUE CONTROL
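A toy sketch of a dynamic wait-listed (roll-out) schedule, with hypothetical cohort start months and county labels: every county eventually receives the intervention, and only the timing is randomized.

```python
# Sketch: a dynamic wait-listed (roll-out) schedule in which every
# county eventually starts the intervention; timing is randomized.
# Cohort start months and county labels are hypothetical.
import random

random.seed(7)
counties = [f"county_{i:02d}" for i in range(40)]
random.shuffle(counties)

start_months = {1: 0, 2: 12, 3: 24}   # cohort -> months after baseline
cohort_sizes = {1: 12, 2: 14, 3: 14}

schedule, pos = {}, 0
for cohort, size in cohort_sizes.items():
    for name in counties[pos:pos + size]:
        schedule[name] = start_months[cohort]
    pos += size

# Until its start month arrives, a county contributes wait-listed
# (control) time; afterward it contributes intervention time.
print(sorted(schedule.items())[:5])
```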
Issues in the CA-MTFC Design
1. Acceptance of the design was complete.
2. Some counties were not ready to take part: counties from the next cohort were moved up, but remained in the same implementation condition.
Consort Diagram

Assessed for eligibility, n=58. Excluded, n=18:
- Already had implemented MTFC, n=9
- Fewer than 6 youth per year, n=8
- Los Angeles County, n=1

Formed 6 equivalent clusters by matching background, n=40. Randomized the clusters to 3 cohorts and 2 conditions, n=40.

Cohort 1, n=12 (CDT n=6; IND n=6): accepted n=4 in each condition; declined n=2 in each condition; declined slots expected to be filled by cohort 3 counties (n=2) or filled by cohort 2 counties (n=0); received, n=4; 2 counties moved up.

Cohort 2, n=14 (CDT n=7; IND n=7): accepted n=7 in each condition; moved to cohort 1, n=2 (CDT) and n=0 (IND); expected to receive, n=7 per condition; invited to 'go early', n=3 per condition (2 counties accepted the invitation in one condition, none in the other).

Cohort 3, n=14 (CDT n=7; IND n=7): CDT accepted n=4, declined n=1, pending n=2; IND accepted n=6, declined n=0, pending n=1; expected to be moved to cohort 2, n=2; expected to be filled by declined cohort 1 counties, n=2 per condition; expected to receive, n=9 (CDT) and n=7 (IND).
EBP Advice Network by Implementation Stage & Agency

[Network diagram; nodes are numbered individuals.]
Implementation stage (node color): blue = implementing (1), red = considering (2), green = declined to participate (3), gray = not applicable (8), white = missing data (0).
Agency (node shape): square = child welfare (1), up triangle = mental health (2), diamond = probation (3), plus = not applicable (4), circle = missing data (0).
Combination of network survey and interview (Source: Palinkas 2011)
Exogenous Factors
Running randomized trials during a recession.
Solution: Expand the trial by adding 13 counties in a second state (Ohio), using equivalent inclusion/exclusion criteria and random assignment.
Types of Protocol Deviations
(Deviation / Example / Response / Consequence)

1. No need to impose protocol rules: interactions of counties in different conditions
   Example: Counties talk to other counties assigned to different strategies
   Response: Allowed and measured
   Consequence: Potential reduction in statistical power

2. Protocols were followed
   Example: County was not ready to start in its assigned cohort
   Response: Randomly and blindly selected a county in the SAME condition in the next cohort
   Consequence: So far little imbalance; little indication of differential replacement bias

3. Protocols were not followed; no protocols available
   Example: Stagnancy of county governments' budgets
   Response: Extend trial to another state with the same inclusion/exclusion criteria
   Consequence: Statistical power
Measurement: SIC
Ability to measure implementation milestones with the Stages of Implementation Completion (SIC):
- Whether they are achieved
- How long it takes to achieve them
- With what quality they are achieved
- In what quantity they are achieved
Measurement: Stages of Implementation Completion (SIC)

8 Stages (stage: agents involved):
1. Engagement: System Leader
2. Considering feasibility: System Leader, Agency
3. Planning/readiness: System Leader, Agency
4. Staff hired and trained: Agency, Practitioners
5. Fidelity monitoring process in place: Practitioners, Child/Family
6. Services and consultation begin: Practitioners, Child/Family
7. Fidelity, competence, & adherence: Practitioners, Child/Family
8. Sustainability (certification): System, Agency, Practitioner

Chamberlain et al., 2010
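One way to make the SIC concrete in code: a small record per stage holding start and completion dates, from which achievement and time-to-completion fall out directly. The class and field names below are hypothetical illustrations, not part of the SIC instrument.

```python
# Sketch: representing SIC milestones for one county site.
# Class and field names are hypothetical.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class StageRecord:
    stage: int                      # 1..8, per the SIC
    name: str
    started: Optional[date] = None
    completed: Optional[date] = None

    @property
    def achieved(self) -> bool:
        return self.completed is not None

    @property
    def days_to_complete(self) -> Optional[int]:
        if self.started and self.completed:
            return (self.completed - self.started).days
        return None

site = [
    StageRecord(1, "Engagement", date(2008, 1, 10), date(2008, 2, 1)),
    StageRecord(2, "Considering feasibility", date(2008, 2, 1)),
]
print([(s.name, s.achieved, s.days_to_complete) for s in site])
```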
Contemplation/Pre-Adoption (Wang, Saldana et al., Implementation Science, 2010)
Factors affecting the time to the decision to consider adoption
Sub-Stages
Stage 3: Readiness Planning
- Date of cost/funding plan review (start)
- Date of staff sequence, timeline, and hire plan review
- Date of Foster Parent recruitment review
- Date of referral criteria review
- Date of communication plan review
- Date of second in-person meeting held
- Date written implementation plan complete (completion)
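Given dated sub-stages like these, the proportion complete and elapsed time are simple to compute; a minimal sketch with hypothetical dates:

```python
# Sketch: Stage 3 duration and proportion of sub-stages completed.
# All dates are hypothetical.
from datetime import date

substages = {
    "cost/funding plan review": date(2009, 3, 2),          # start
    "staffing/timeline/hire plan review": date(2009, 3, 20),
    "foster parent recruitment review": date(2009, 4, 6),
    "referral criteria review": None,                       # not yet done
    "communication plan review": None,
    "second in-person meeting": None,
    "written implementation plan complete": None,           # completion
}

done = [d for d in substages.values() if d is not None]
proportion_complete = len(done) / len(substages)
elapsed = (max(done) - min(done)).days
print(f"{proportion_complete:.0%} complete, {elapsed} days elapsed so far")
```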
[Figure: Counties' Completion of Substages. Fraction complete (0.0 to 1.0) on the vertical axis, by SIC stage (1 to 8) on the horizontal axis.]
Three Interrelated Characteristics in
Monitoring Implementation
First Sign in a Window
Cheap
Fast
Accurate
Next Sign
Cheap
Fast
Accurate
Pick any
Two
Cost, Speed, Accuracy are Interrelated
Thermodynamics: Ideal Gas Law (Clausius-Clapeyron)
P = pressure, V = volume, T = temperature
PV = rT, so T = PV / r
Three Interrelated Characteristics in Monitoring Implementation
Speed – time to milestone attainment
Quality – how well a program is implemented
Quantity – how much is implemented
Functional Relationship like PV = rT
• T – Time (days)
• F – Fidelity or Quality (e.g., 20% or 80%)
• Q – Quantity (e.g., 1st facilitator vs. 50%)

T = fctn(F, Q) + error

[3-D surface: Time T as a function of Quantity Q and Fidelity F]
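As a toy illustration of estimating such a functional relationship, the sketch below fits a linear surface T = b0 + b1*F + b2*Q by ordinary least squares; the data points and the linear functional form are assumptions for illustration only.

```python
# Sketch: fitting T = f(F, Q) + error as a linear surface.
# The observations are made up for illustration.
import numpy as np

# Hypothetical observations: fidelity (0-1), quantity (0-1), days.
F = np.array([0.2, 0.8, 0.5, 0.9, 0.3, 0.7])
Q = np.array([0.1, 0.5, 0.5, 0.9, 0.4, 0.2])
T = np.array([400, 220, 300, 180, 360, 310])

# Design matrix with intercept: T ~ b0 + b1*F + b2*Q
X = np.column_stack([np.ones_like(F), F, Q])
coef, *_ = np.linalg.lstsq(X, T, rcond=None)
b0, b1, b2 = coef
print(f"T = {b0:.0f} + {b1:.0f}*F + {b2:.0f}*Q")
```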
Differential Goals of Implementation Strategies

[Figure positioning different implementation efforts relative to the true goal: CAL-40, Good Behavior Game implementation, CDC-DEBI.]
Modeling: Directed Action
• Represent the Implementation Strategy
through intentional steps
• Logic Model → Agent-Based Modeling
Directed Action: DATI Diagram

[Diagram: Director (D) → Action (A) → Target (T) → Intent (I)]
Computational/Simulation Modeling
Ability to describe what the implementation strategy is:
DATI – Director, Action, Target, Intention
Multilevel, Systemic Logic Model for Implementation

[Diagram: implementation condition (CDT vs. Standard) → County Plan for Implementation → MTFC Support Training Institute → Clinical Support → Staff select and train Foster Parents → Foster Parent parenting practices → Youth: fewer runaways]
2. Measurement of the Implementation Process
A. Directed Action: Cognitive Science → Computer Science
Who does what to whom for what intended purpose?
DATI: Director (D), Action (A), Target (T), Intention (I)
Director (D) → Action (A) → Target (T) → Intention (I)

MTFC Intervention
Director: Clinical Support Team (Intervention Agent)
Action: Deliver MTFC Support Protocol
Target: Foster Parent
Intention: Improve Parental Monitoring

Parent Actions
Director: Foster Parent
Action: Improve Parental Monitoring
Target: Youth
Intention: Decrease Runaways
Director (D) → Action (A) → Target (T) → Intention (I)

Agency Services
Director: Agency
Action: Provide MTFC Services
Target: Clinical Support Team
Intention: Deliver MTFC Support Protocol

MTFC Intervention
Director: Clinical Support Team
Action: Deliver MTFC Support Protocol
Target: Foster Parent
Intention: Improve Parental Monitoring
Director (D) → Action (A) → Target (T) → Intention (I)

MTFC Support Training
Director: MTFC Training Organization
Action: Train Staff in Implementing MTFC Support
Target: Clinical Support Team
Intention: Deliver MTFC Support Protocol

Agency Services
Director: Clinical Support Team
Action: Deliver CDT Implementation
Target: Foster Parent
Intention: Improve Parental Monitoring
Director (D) → Action (A) → Target (T) → Intention (I)

CDT Program
Director: California Institute of Mental Health Broker
Action: Community Development Team
Target: County Social Services
Intention: Arrange Training of Agency

County Implementation Plan
Director: County Social Services
Action: Arrange Training of Agency
Target: Agency
Intention: Train Staff to Provide MTFC Support
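Across these DATI examples, one unit's intention becomes the next unit's action. That chaining is easy to represent directly; the sketch below uses a hypothetical DATI dataclass (names are illustrative, not from the source) and checks the linkage.

```python
# Sketch: chaining DATI units so one unit's intention becomes the
# next unit's action, as in the CDT -> county -> agency examples.
# Class and variable names are hypothetical.
from dataclasses import dataclass

@dataclass
class DATI:
    director: str
    action: str
    target: str
    intention: str

chain = [
    DATI("CIMH Broker", "Community Development Team",
         "County Social Services", "Arrange Training of Agency"),
    DATI("County Social Services", "Arrange Training of Agency",
         "Agency", "Train Staff to Provide MTFC Support"),
]

# Check the linkage: intention of step k should match action of k+1.
for prev, nxt in zip(chain, chain[1:]):
    assert prev.intention.lower() == nxt.action.lower(), (prev, nxt)
    print(f"{prev.director} -> {nxt.director}: {prev.intention}")
```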
Conclusions
• Randomized Implementation Trials (RITs) are likely to be used, especially in this first phase of research.
• Randomize to time of implementation, and also to type of implementation strategy, since communities are not likely to tolerate a non-active control.
• Major outcomes of RITs are Time, Quality, and Quantity; potentially generic measures like the SIC could be used.
• There is substantially less control in RITs, so modifications in the design may need to happen.
• Modeling of implementation strategies will need objects like DATIs to represent and simulate implementation effects.
Related Papers
Aarons GA, Hurlburt M, Horwitz SM. Advancing a Conceptual Model of Evidence-Based Practice Implementation in Public Service Sectors. Administration and Policy in Mental Health and Mental Health Services Research. In press.
Brown CH, Wyman PA, et al. (2007). "The role of randomized trials in testing interventions for the prevention of youth suicide." International Review of Psychiatry 19(6): 617-631.
Brown CH, Wyman PA, et al. (2006). "Dynamic wait-listed designs for randomized trials: new designs for prevention of youth suicide." Clinical Trials 3(3): 259-271.
Brown, Kellam, Muthen, Wang, Kaupert, Ogihara, Valente, McManus, Pantin, Szapocznik (in preparation). Partnerships for Effectiveness and Implementation Research: Experiences of the Prevention Science and Methodology Group.
Chamberlain P, Saldana L, Brown CH, Leve LD. Implementation of MTFC in California: A Randomized Trial of an Evidence-Based Practice. In M Roberts-DeGennaro & SJ Fogel (Eds.), Empirically Supported Interventions for Community and Organizational Change. Lyceum Books. In press.
Chamberlain P, Saldana L, Brown H, Leve L (2010). Implementation of MTFC in California: A Randomized Trial of an Evidence-Based Practice. In M Roberts-DeGennaro & SJ Fogel (Eds.), Using Evidence to Inform Practice for Community and Organizational Change (pp. 218-234). Chicago: Lyceum.
Chamberlain P, Marsenich L, Sosna T, et al. (accepted for publication). Three collaborative models for scaling up evidence-based programs.
Landsverk J, Brown C, Rolls Reutz J, Palinkas L, Horwitz S (2011). Design Elements in Implementation Research: A Structured Review of Child Welfare and Child Mental Health Studies. Administration and Policy in Mental Health and Mental Health Services Research: 1-10.
References
Ideker T, Sharan R (2008). Protein networks in disease. Genome Research 18: 644-652.
Landsverk, Brown, Chamberlain, Palinkas, Horwitz (in preparation). Design and Analysis in Dissemination and Implementation Research.
Luke DA, Harris JK, Shelton S, et al. (2010). Systems analysis of collaboration in 5 national tobacco control networks. AJPH 100: 1290-1297.
O'Connell, Boat & Warner (2009). Preventing Mental, Emotional, and Behavioral Disorders Among Young People: Progress and Possibilities. NAS.
Palinkas and Soydan (2010). Translation and Implementation of Evidence-Based Practice in Social Work: A Strategy for Research.
Sosna T, & Marsenich L (2006). Community Development Team Model: Supporting the model-adherent implementation of programs and practices. Sacramento, CA: California Institute for Mental Health Publication.
Valente (2010). Social Networks and Health: Models, Methods and Applications.
Wang, Saldana, Brown, Chamberlain (2010). Factors that influenced county system leaders to implement an evidence-based program: a baseline survey within a randomized controlled trial. Implementation Science.