Center for the Study of Healthcare Provider Behavior

Implementation Science and Quality Improvement Research:
Methods for Translating Research into Practice

Approaches to Evaluating QI Interventions

Elizabeth M. Yano, PhD, MSPH
VA Greater Los Angeles HSR&D Center of Excellence
UCLA School of Public Health

AcademyHealth, Chicago, IL, June 2009
Overview
Approaches to evaluating implementation of QI interventions
– Evaluation methods for different stages
– Moving from understanding the evidence to national rollout studies
Case examples
– VA primary care
– Depression collaborative care models (TIDES)
Designs for Developing Evidence Base
Does a new treatment work under optimal conditions?
Hypothesis-generating, safety studies
Randomized efficacy trials vs. placebo or vs. best available treatment
– Patients randomized to study arms within each site, provider, or clinical practice
– Patients selected to reduce complexity
– Aim to factor out patient, provider and
contextual variables (organizational and area)
– Study ensures adherence to the protocol
Adapted from Rubenstein LV et al. Provider behavior design methods
Designs for Developing Evidence Base
Randomized Efficacy/Effectiveness Trials
Randomized at the patient level
– Complex or unselected patient populations
– Usual care or unselected providers
– Diverse clinical practices or organizations
– Treatment is discrete
Aim is still to factor out patient and provider variables
Organizational or area context typically not factored in (or designed for)
Adapted from Rubenstein LV et al. Provider behavior design methods
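To make this design concrete, below is a minimal sketch of patient-level randomization blocked within each site; the names and data are illustrative assumptions, not from any cited study.

```python
# Minimal sketch of patient-level randomization blocked within site.
# All names and data are illustrative assumptions, not study values.
import random

def randomize_within_sites(patients, seed=42):
    """Assign patients to intervention/control in balanced blocks
    within each site, keeping arms comparable site by site."""
    rng = random.Random(seed)
    by_site = {}
    for pid, site in patients:               # group patient IDs by site
        by_site.setdefault(site, []).append(pid)
    assignments = {}
    for site, pids in by_site.items():
        rng.shuffle(pids)                    # random order within site
        for i, pid in enumerate(pids):       # alternate -> balanced arms
            assignments[pid] = "intervention" if i % 2 == 0 else "control"
    return assignments

# Toy usage: six patients across two clinics.
cohort = [(1, "clinic_A"), (2, "clinic_A"), (3, "clinic_A"),
          (4, "clinic_B"), (5, "clinic_B"), (6, "clinic_B")]
print(randomize_within_sites(cohort))
```

Blocking within site keeps the arms balanced inside each practice, so provider and organizational context is held comparable across arms, as the slide describes.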
Methods on the Path to Implementation:
Building Implementation Evidence
Identify the Target Problem
– Epidemiological studies
– Literature synthesis
– Expert panels
Define Best Practices
– Literature synthesis, meta-analysis
– Expert panels
Adapted from Rubenstein LV et al. Provider behavior design methods
Methods on the Path to Implementation:
Identifying Treatment Problems & Causes
Assess care variations
– Epidemiologic designs (e.g., cohort studies)
– Quality measurement
Assess determinants of care variations
– Qualitative hypothesis-generating studies
– Epidemiologic designs (e.g., cross-sectional)
– Literature synthesis
Adapted from Rubenstein LV et al. Provider behavior design methods
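As a concrete illustration of the "assess care variations" step via quality measurement, here is a minimal sketch; the data frame and the screening indicator are invented for the example.

```python
# Minimal sketch of profiling care variation across sites with a
# quality measure; the data and the screening indicator are invented.
import pandas as pd

# One row per patient: site, and whether the recommended process of
# care (e.g., a cancer screening) was documented.
df = pd.DataFrame({
    "site":     ["A", "A", "A", "B", "B", "B", "C", "C"],
    "screened": [1,    1,   0,   1,   0,   0,   1,   1],
})

# Per-site rate plus denominator exposes the variation to investigate.
variation = (df.groupby("site")["screened"]
               .agg(rate="mean", n="size")
               .sort_values("rate"))
print(variation)
```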
Improving Delivery of Efficacious
Treatments
Develop and test prototype interventions
– Literature review regarding variations and determinants
– Literature review on human behavior theories (patients & providers), organizational theories
Implementation interventions include
– Efficacious treatments AND
– Improved methods of treatment delivery
Adapted from Rubenstein LV et al. Provider behavior design methods
Approaches to Early Interventions
Aim to test in favorable environments
Aim for either the strongest intervention (multi-component black box) or the most feasible
Ensure adherence to protocol
– Personnel paid for by the study are involved in carrying out the intervention at the local level
Adapted from Rubenstein LV et al. Provider behavior design methods
Formative Evaluations for Early Interventions
Assess ways to improve safety, acceptability, feasibility
– Qualitative data on implementation
– Quality improvement (e.g., industrial QI), demonstration project design
– Monitoring outcomes, possible adverse
effects for each patient
Adapted from Rubenstein LV et al. Provider behavior design methods
Effectiveness Evaluations for Early Interventions
Assess program effects on process and
outcomes
– Randomized quasi-experiments or rigorous pre-post designs (see the sketch after this slide)
Randomize patients within providers if possible
Qualitative data
– Measure intervention fidelity
Quantitative data
– Outcome, process, subgroups, costs
Adapted from Rubenstein LV et al. Provider behavior design methods
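One hedged sketch of such an effectiveness evaluation: a difference-in-differences analysis of a pre-post design with a comparison group, on simulated data. The treated-by-post interaction term estimates the program effect; all variable names and effect sizes are assumptions.

```python
# Hedged sketch of a pre-post design with a comparison group
# (difference-in-differences); data and effect sizes are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),  # intervention vs. comparison site
    "post":    rng.integers(0, 2, n),  # before vs. after implementation
})
# Simulated symptom outcome: a 2-point improvement (drop) only in
# treated sites after implementation.
df["outcome"] = 50 - 2 * df["treated"] * df["post"] + rng.normal(0, 5, n)

# The treated:post interaction is the estimated program effect.
model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```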
Develop Intervention Implementation Strategies
Are there still quality deficits/need for QI?
What are the best interventions?
– Meta-analysis, meta-regression (worked example after this slide)
– Expert panels
– Marketing approaches
What groups can be approached for
implementation?
– Diverse clinical practices within similar types of
organizations ideal
Adapted from Rubenstein LV et al. Provider behavior design methods
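For the meta-analysis bullet above, a minimal worked example of fixed-effect pooling by inverse-variance weighting; the per-study effects and standard errors are invented.

```python
# Worked example of fixed-effect meta-analytic pooling by
# inverse-variance weighting; effects and SEs below are invented.
import numpy as np

effects = np.array([0.10, 0.15, 0.08, 0.12])  # per-study effect sizes
ses     = np.array([0.05, 0.07, 0.04, 0.06])  # their standard errors

weights   = 1.0 / ses**2                      # precision weights
pooled    = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
print(f"pooled effect {pooled:.3f} (SE {pooled_se:.3f})")
```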
Evaluate Effectiveness of Intervention Implementation
Policy evaluation
– Pre-post
– Natural experiments
– Large-scale demonstrations (e.g., Medicare)
Adapted from Rubenstein LV et al. Provider behavior design methods
Features of Encouragement Designs
Randomized encouragement designs
– Randomized at the provider or organizational levels (see the sketch after this slide)
– Intervention is implemented by organizations or providers themselves, without direct action by the study
– Minimal study dollars to support the intervention
– Qualitative data on fidelity, organizational variables
Encouragement approaches have variation in
implementation by design
Adapted from Rubenstein LV et al. Provider behavior design methods
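A minimal sketch of the cluster-level assignment these designs use, with hypothetical site names: whole sites, not patients, are randomized, and the sites then implement the intervention themselves.

```python
# Minimal sketch of organization-level (cluster) randomization with
# hypothetical site names; sites, not patients, are assigned.
import random

def randomize_sites(sites, seed=7):
    """Split sites evenly into encouragement vs. control arms."""
    rng = random.Random(seed)
    shuffled = list(sites)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {site: ("encouragement" if i < half else "control")
            for i, site in enumerate(shuffled)}

print(randomize_sites([f"site_{i}" for i in range(8)]))
```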
Desirable Features for Encouragement Designs
Maximize the number of intervention vs. control
sites
Evaluate the effect of the program on random individuals entering the practice
– Weighting and imputation
– ? Identify program patients to the practice
Document training, materials, intervention
adherence
Propensity score approaches (see the sketch below)
Source: Yano, et al. The evolution of changes in primary care delivery underlying VA's quality transformation. Am J Public Health, 2007;97(12):2151-2159.
Adapted from Rubenstein LV et al. Provider behavior design methods
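To illustrate the propensity score bullet above, a sketch of inverse-probability-of-treatment weighting on simulated site data; the covariates and coefficients are assumptions, not study values.

```python
# Sketch of inverse-probability-of-treatment weighting (IPTW) on
# simulated site data; covariates and coefficients are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "size":     rng.normal(0, 1, n),    # standardized practice size
    "academic": rng.integers(0, 2, n),  # academic affiliation (0/1)
})
# Simulated uptake depends on observed practice characteristics.
p_true = 1 / (1 + np.exp(-(0.5 * df["size"] + 0.8 * df["academic"])))
df["treated"] = rng.binomial(1, p_true)

# Estimate the propensity of uptake, then weight by its inverse.
ps = smf.logit("treated ~ size + academic", data=df).fit(disp=0).predict(df)
df["iptw"] = np.where(df["treated"] == 1, 1 / ps, 1 / (1 - ps))
print(df["iptw"].describe())
```

Weighting by the inverse of the estimated uptake probability rebalances observed site characteristics across arms when uptake itself is not randomized.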
After Encouragement, Institutionalization
Ability to involve all relevant stakeholders
– Informatics
– Human resources management
– Ongoing education and training resources
– Provider behavior support
Ability to sustain intervention financially
– Reimbursement policy
Ability to ensure ongoing QI/adaptation
– Performance measures, changing evidence
Adapted from Rubenstein LV et al. Provider behavior design methods
Evolving Research-Clinical Partnership
From researcher control to control by others
– Need for tools to support their control and "ownership" in ways that support fidelity
– Need for processes/procedures for ongoing adaptation (but within view of evidence base)
– Need for tools/materials to orient new team members, new leaders
Need for ongoing consultation as evidence changes, new challenges emerge
Case Example: VA Primary Care
VA historically hospital-based, specialty oriented
Reputation for poor care
Institute of Medicine panel
– What to do with the VA…
Clinton health care reform
– VA "wake-up call"
Case Example: VA Primary Care
Identify the target problem
– Surveyed VA outpatient/primary care programs
Virtually no primary care
Walk-in clinics, charts rarely available, very long waits, limited generalist staffing, no teams
Define best practices
– Literature synthesis showing primary care models, features associated with quality
– Policy directive to implement primary care based on evidence (e.g., teams)
Sources: Rubenstein et al., VA Survey of Outpatient Services (1993); Yano et al., Arch
Internal Med (1995); VA Primary Care Directive (1994).
Case Example: VA Primary Care
Assess care variations
– Measured adoption of primary care models
and features via organizational survey
– VA primary care quality measures collected
through chart review and patient surveys
Chronic disease quality, preventive practice
Patient ratings of access, continuity, coordination
Sources: Soban & Yano J Ambul Care Manage (2005); Jackson et al., Am J Managed
Care (2002); Yano et al., HSR (2008).
Case Example: VA Primary Care
Assess determinants of care variations
– Site visits of early adopters
– Cross-sectional studies of differences by urban/rural, academic vs. not, by gender
Develop and test prototype interventions
– Consensus development conference to
identify optimal primary care model
– Pilot model demonstration project
Sources: Weeks et al., J Rural Health (2002); Yano et al., Women's Health Issues (2003); Goldzweig et al., Am J Manage Care (2004); Rubenstein et al., Acad Med (1995).
Case Example: VA Primary Care
Identify successful model characteristics
– Cross-sectional analyses of effectiveness of different primary care models and features
Monitor implementation (“fidelity”)
– Ongoing organizational surveys assessing
changes in primary care delivery over time
Sources: VA Survey of Primary Care Practices (1999-2000); VA Clinical Practice
Organizational Survey (2007), VA Primary Care Survey (2009).
Case Example: VA Primary Care
Patient-centered medical homes (PCMH)
– Consensus development meetings
– Demonstration project(s)
– Literature synthesis
Assess care variations and determinants
– Assess level of PCMH implementation
– Area and organizational factors related to
PCMH implementation
Evolution of VA Primary Care Delivery
[Figure: Percent of VA facilities with a primary care program, 1993-1999, showing the spread of separate PC budgets, PC-based QI, patient-PCP assignment, and PC teams.]
Source: Yano, et al. The evolution of changes in primary care delivery underlying
VA’s quality transformation. Am J Public Health, 2007;97(12):2151-2159.
Primary Care Features Associated with Differences in Quality
Colorectal cancer screening rates
– Sufficiency of resources for clinical support
arrangements, practice autonomy, size
Breast and cervical cancer screening rates
– Better in VAs with stronger PC-subspecialist coordination (timely consult results)
– Worse in VAs with strict PC assignment
– Better in VAs with women's health clinics
Diabetic control (HbA1c)
– Better in VAs with chronic care model components
Sources: Yano, et al. (2006); Jackson, Yano, et al (2004); Goldzweig, Yano, et al. (2004).
Case Example: Depression
Identify target problem
– Epidemiologic studies on prevalence, impact
of depression and location in primary care
Define best practices (guidelines)
– Literature synthesis/meta-analysis showing
CBT and antidepressants equally effective
– Expert panel methods to develop guidelines
Adapted from Rubenstein LV et al. Application of QUERI evaluation methods.
Case Example: Depression
Assess care variations
– Quality measures based on medical record
and survey
– Worse care for minorities, managed care
identified in large studies
Develop intervention models and evaluate
effectiveness
– Provider behavior and QI theory for design
– Randomized trials of collaborative care
Adapted from Rubenstein LV et al. Application of QUERI evaluation methods.
Case Example: Depression
Collaborative Care
Identify successful model characteristics
and develop implementation models
– Qualitative research on models, organizations, predictors of success and barriers
– Quasi-experiments based on diffusion, provider behavior change, and QI theory
– Literature synthesis and meta-analysis
Adapted from Rubenstein LV et al. Application of QUERI evaluation methods.
Case Example: Depression
Collaborative Care
Identify successful implementation models,
implement as routine policies/procedures
– Quality improvement theories used to engage
organizations and their leaders
– Quality improvement-type measurement
– Qualitative research on organizations/models
– Policy analysis and theory to understand/foster
policy uptake, incentive changes for spread
Evaluate system performance measures
Adapted from Rubenstein LV et al. Application of QUERI evaluation methods.
VA Research-to-Practice Implementation:
Trajectory Toward National Rollout
[Diagram: Design progression for the depression collaborative care model, from a 20+ year efficacy/effectiveness evidence base (single and multi-component interventions, multiple settings) through TIDES and WAVES (process and outcomes evaluation), COVES (cost assessment, stakeholder analysis, formative evaluation), and ReTIDES (impact evaluation, cost-effectiveness, national "bridge" interviews, process tools), building a clinical-research partnership toward national rollout with ongoing performance monitoring.]
PHQ-9 Score Baseline to 24 Weeks
[Figure: Mean PHQ-9 scores at baseline, 4-6, 8-12, 16, 20, and 24 weeks.]
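For reference, a minimal sketch of how the PHQ-9 outcome in the figure is scored: nine items rated 0-3 sum to a 0-27 total, with conventional severity cutoffs at 5, 10, 15, and 20.

```python
# Minimal sketch of PHQ-9 scoring: nine items rated 0-3 sum to a
# 0-27 total; severity bands use the conventional cutoffs.
def phq9_total(items):
    """Sum nine 0-3 item responses into a 0-27 depression score."""
    assert len(items) == 9 and all(0 <= x <= 3 for x in items)
    return sum(items)

def phq9_severity(total):
    """Map a total score to its conventional severity band."""
    for cutoff, label in [(20, "severe"), (15, "moderately severe"),
                          (10, "moderate"), (5, "mild")]:
        if total >= cutoff:
            return label
    return "minimal"

print(phq9_total([2, 1, 2, 1, 1, 0, 1, 0, 1]))  # -> 9
print(phq9_severity(9))                         # -> mild
```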
Qualitative Evaluation
Semi-structured interviews during field visits
– Regional (VISN) and medical center managers
– Providers, administrators, and consumers
– Focused on perspectives on the care model
and the implementation process
Documentation of process, timeline, and
costs associated with adaptation and
implementation
Source: Kirchner J, Liu C-F, Parker LE, Ritchie M, Fickel JJ, Yano EM. Cost and value
evaluation (COVES).
Costs of Implementation
Broad range of involvement for both researcher
consultants and participating VISNs/sites
– 128 persons contributed 3,764 hours with a total cost
of $296,346 over 4 years
Partner        People   Hours   Cost
VISNs/Sites        86    1,190   $83,581
Researchers        42    2,573   $212,765
Data sources: Project records, ~20,000 project emails, personnel survey.
Source: Liu, Rubenstein, et al. HSR 2009.
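A small worked-arithmetic sketch from the table above (the code layout is mine; the people/hours/cost figures are the slide's), giving cost per contributed hour and hours per person by partner group.

```python
# Worked arithmetic from the cost table above; the dict layout is
# mine, the people/hours/cost figures are the slide's.
contributions = {
    "VISNs/Sites": {"people": 86, "hours": 1190, "cost": 83581},
    "Researchers": {"people": 42, "hours": 2573, "cost": 212765},
}
for partner, row in contributions.items():
    print(f"{partner}: ${row['cost'] / row['hours']:.0f}/hour, "
          f"{row['hours'] / row['people']:.0f} hours/person")
# VISNs/Sites: $70/hour, 14 hours/person
# Researchers: $83/hour, 61 hours/person
```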
PC Provider Penetration
[Figure: Percent of PCPs who started referrals in the first 6 months and consults per PCP FTE, by site: A1-A2 (Network #1), B1-B3 (Network #2), C1-C2 (Network #3).]
Evaluating Variable Implementation
Study designs need to account for variable
implementation (designed & unintentional)
Be cognizant of SQUIRE guidelines
Data sources include patients, providers, managers, and the intervention team
Track organizational outputs (time, costs)
Analysis needs to be at multiple levels (see the sketch below)
– Implementation measures should map to them
Triangulation across data sources key
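One hedged sketch of multi-level analysis, on simulated data (all names and magnitudes are assumptions): a mixed-effects model with a random intercept per site estimates the implementation effect while respecting the clustering of patients within sites.

```python
# Hedged sketch of multi-level analysis on simulated data: a random
# intercept per site respects patient clustering within sites.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
sites = np.repeat(np.arange(10), 30)          # 30 patients per site
site_effect = rng.normal(0, 2, 10)[sites]     # site-level variation
implemented = (sites < 5).astype(int)         # half the sites adopt
df = pd.DataFrame({
    "site": sites,
    "implemented": implemented,
    "outcome": 50 - 3 * implemented + site_effect + rng.normal(0, 5, 300),
})

# Fixed implementation effect, random intercept for each site.
model = smf.mixedlm("outcome ~ implemented", df, groups=df["site"]).fit()
print(model.params["implemented"])
```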