Rapid Cycle Evaluation of Health System Innovation

November 14, 2012
1:00-2:30 PM ET
Marsha Gold, Sc.D., Mathematica Policy Research
David Dyjack, Dr.PH., C.I.H., National Association of County and City Health Officials (NACCHO)
AcademyHealth: Improving Health & Health Care
AcademyHealth is a leading national organization serving the fields of health services
and policy research and the professionals who produce and use this important work.
Together with our members, we offer programs and services that support the
development and use of rigorous, relevant and timely evidence to:
1. Increase the quality, accessibility and value of health care,
2. Reduce disparities, and
3. Improve health.
A trusted broker of information, AcademyHealth
brings stakeholders together to address the current
and future needs of an evolving health system,
inform health policy, and translate evidence into action.
Learn More
• Join our Public Health Systems Research (PHSR) Interest Group
  – PHSR examines the organization, financing, and delivery of public health services within communities, and the impact of those services on population health outcomes
  – Visit www.AcademyHealth.org/InterestGroups
• Sign up for our newsletter, Methods Minute
  – Email HSRMethods@AcademyHealth.org
Be Active
• Visit http://my.AcademyHealth.org
• Follow @PHSR_AH
• Submit to our Call for Abstracts for the 2013 Annual Research Meeting
  – Themes: Methods Research, Public Health
The audio and slide presentation will be delivered directly to your computer
• Speakers or headphones are required to hear the audio portion of the webinar.
• If you do not hear any audio now, check your computer’s speaker settings and volume.
• If you need an alternate method of accessing audio, please submit a question through the Q&A pod.
Technical Assistance
• Live technical assistance: call Adobe Connect at (800) 422-3623
• Refer to the ‘Technical Assistance’ box in the bottom left corner for tips to resolve common technical difficulties.
Questions may be submitted at any time during the presentation
To submit a question:
1. Click in the Q&A box on the left side of your screen
2. Type your question into the dialog box and click the Send button
Accessing PowerPoint Presentations
• The PowerPoint presentation used during this webinar can be found in the “Downloadable Files” pod.
• Select the file from the list and click “Save to my computer.”
Moderator
David Dyjack, Dr.PH., National Association of County and City Health Officials (NACCHO)

Faculty
Marsha Gold, Sc.D., Mathematica Policy Research
Objectives
• Review evaluation expectations associated with “rapid cycle health system innovation”
• Identify ways to shape innovations to help support their ability to generate rapid feedback on implementation progress and early results
• Identify the challenges in answering “what works, where and for whom,” and how these questions can be addressed
• Address these issues in real-life situations
• Identify unique challenges in applying this approach to public health settings
Rapid Cycle Evaluation of Health System Innovation
AcademyHealth Webinar
November 15, 2012
Marsha Gold, Sc.D., Senior Fellow
Session Goals
• To review evaluation expectations associated with “rapid cycle health system innovation”
• To identify how to shape innovations so that they generate rapid feedback on implementation progress and early results
• To identify and address challenges in answering “what works, where, and for whom,” more or less rapidly
• To help participants prepare to address these issues in real-life situations
• To identify challenges unique to public health settings
Topics To Be Discussed
• Why the interest in rapid cycle change
• First principles: practices that are key to any successful evaluation
• Defining evaluation goals and constraints, and making appropriate trade-offs
• Impact analysis: techniques for rigorous analysis of outcomes as a measure of intervention success
The Push for Innovation and Rapid Cycle Feedback
The Push for Rapid Innovation
• Health care costs continue to rise while pressure on resources grows
• Push for more efficient, effective systems
• Growing interest in the public health approach as a means of shifting the perspective from treating individuals to leveraging social determinants of health and to community- and population-based prevention and health promotion
• Tremendous interest in rapid learning, often in the context of limited resources (including data)
Applications in Multiple Contexts
• Clinical care
• Population-level preventive health
• Enhancing administrative processes
• Answering policymakers’ need for accountability: was it implemented, did it work, any problems?
Context: PDSA Orientation
• Rapid cycle change draws on experience with smaller-scale change efforts: P(lan), D(o), S(tudy), A(ct) cycles and the IHI Breakthrough Series
• Empowering those at the operational work-unit level is key to learning from workflow redesign and to generating rapid feedback and learning
• Identify key indicators: measure over time, keep a practice diary, and adjust practice incrementally as necessary (a minimal sketch follows this list)
• Example: simple actions to increase immunization level
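To make the indicator-tracking idea concrete, here is a minimal Python sketch of logging a key indicator across PDSA cycles. The class names, immunization rates, and diary entries are all hypothetical illustrations, not something specified in the webinar.

```python
# A minimal sketch of tracking a key indicator across PDSA cycles
# (the immunization-rate figures below are invented for illustration).
from dataclasses import dataclass, field

@dataclass
class PDSACycle:
    plan: str          # the small change being tested
    indicator: float   # measured immunization rate after the change
    notes: str = ""    # "practice diary" entry: what was observed

@dataclass
class PracticeDiary:
    baseline: float
    cycles: list = field(default_factory=list)

    def study(self, cycle: PDSACycle) -> float:
        """Record the cycle and compare its measurement with the previous one."""
        previous = self.cycles[-1].indicator if self.cycles else self.baseline
        self.cycles.append(cycle)
        return cycle.indicator - previous

diary = PracticeDiary(baseline=0.62)  # 62% immunized before any change
delta = diary.study(PDSACycle(plan="Send reminder postcards",
                              indicator=0.66,
                              notes="Uptake rose; no-shows unchanged"))
print(f"Change in immunization rate this cycle: {delta:+.2%}")
```

If the indicator moves in the wrong direction, the next cycle's plan is adjusted incrementally, which is the "Act" step of the loop.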
Challenges for More Broad-Based Innovations
• Complex interplay of systems and organizational dynamics: multi-level change and multiple influences
• Did the change really lead to improvement? What else could explain it? What would have happened otherwise?
• Evaluation goals could differ at different points: refine concept, show feasibility, encourage spread and learn about how to achieve it, test permanent change
RE-AIM Framework: Five Steps to Translating Research into Practice
• Reach your intended target population
• Effectiveness or efficacy
• Adoption by target staff, settings, or institutions
• Implementation consistency, costs, and adaptations made during delivery
• Maintenance of intervention effects in individuals and settings over time
First Principles: Important Practices in Any Evaluation
Define the Innovation Logic
• What is being changed (the “innovation”)?
• What outcomes are sought as a result of the change, in what population/organizations, and over what period?
• Why do we think the action will result in change (logic, evidence to support it)?
• What else could influence change?
• How complex and multi-layered are the processes?
Simple Logic Model
[Figure: a simple logic model for an immunization effort. Inputs (personnel, resources) are directed at a target population through activities/processes and how they will change (outreach, immunization clinics), producing outcomes (number immunized, population reached).]
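One lightweight way to keep a logic model explicit during rapid-cycle work is to write it down as a plain data structure. The sketch below is a hypothetical Python encoding of the immunization example above; the field names and entries are illustrative, not part of the original slides.

```python
# A minimal sketch of the immunization logic model as a data structure
# (all field names and values are illustrative).
logic_model = {
    "inputs": ["personnel", "resources"],
    "target_population": "community members due for immunization",
    "activities": ["outreach", "immunization clinics"],
    "outputs": ["number immunized"],
    "outcomes": ["population reached"],
    "external_influences": ["other immunization sources", "competing media messages"],
}

# Walk the causal chain the slide describes: inputs -> activities -> outputs -> outcomes
for stage in ("inputs", "activities", "outputs", "outcomes"):
    print(f"{stage}: {', '.join(logic_model[stage])}")
```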
[Figure: key elements of organizational transformation to deliver high-quality patient care. Source: Lukas, C., S. Holmes, A. Cohen, J. Restuccia, I. Cramer, M. Shwartz, and M. Charns. “Transformational Change in Health Care Systems: An Organizational Model.” Health Care Management Review 32(4): 309-320, 2007.]
Factors That Complicate the Logic Model – I
• Available resources constrain outreach and the modalities used (power of intervention)
• Subgroups respond differently to outreach (targeting, reach)
• People may try to come but leave before getting immunized (e.g., because of vaccine availability, transportation, logistical barriers, or illness), or may not return for follow-up (implementation success)
Factors That Complicate the Logic Model – II
• Individual results are not the same as results for the population (e.g., other sources of immunizations, data gaps)
• Environmental complexity (e.g., competing messages in the press on the value of prevention)
• Intermediate and ultimate outcomes may differ (e.g., effectiveness of immunization)
Key Thoughts for Effective Approaches
• Identifying the level(s) within the system where change is sought
• Conceptualizing models of change that build on evidence and account for messy realities
• Distinguishing between intention and reality (what really changed and when, what didn’t)
• Measuring outcomes relevant to the time frame in ways that help “control” for other influences
• Thinking about replicability and applicability
Concerns to Consider
• What is the level(s) of the organization(s) for innovation?
• What population is the target, and with what intensity of intervention over what time?
• Who needs to make this happen? Who can keep it from happening?
• What tools can be used to create alignment across participants and processes and with goals?
• What is the “theory of action”/logic model?
Defining Evaluation Goals, Constraints, and Trade-Offs
Conceptualizing “Intervention” and “Goals”
“Would you tell me, please, which way I ought to go from here?” asked Alice.
“That depends a good deal on where you want to get to,” said the Cat.
“I don’t care where,” said Alice.
“Then it doesn’t matter which way you go,” said the Cat.
-- Lewis Carroll
Design and Formative Feedback
• Research synthesis: what do we know about whether this is likely to work?
• Conceptualization: logic models and other ways of clarifying intervention and learning
• Process data: early indicators of implementation and characteristics
• Interviews, diaries, etc.: why is implementation rolling out as it is?
• Short-term outcomes: early evidence for whether results are as expected
Role of Program Manager Versus Evaluator (or Not)
[Figure: the manager-evaluator collaboration continuum. As influence shifts from manager-dominated, through collaboration, to evaluator-dominated, the type of evaluation shifts from relevance and progress, to efficiency and effectiveness, to impact. Source: Veney and Kaluzny, Figure 1-3, “Manager-evaluator collaboration.”]
Documenting and Learning from Innovation
• Efficiency: investing in shared metrics and approaches for cross-site learning
  – Characteristics of innovations
  – Characteristics of context
  – Common metrics of success
• Realistic expectations: implementation always takes longer than expected, and more so if the context is complex
• Minimize barriers that slow or drain momentum
Did It Work, Where, Why, and for Whom?
• Summative or impact evaluation
  – Examines outcomes and impacts
  – If success differs from what was expected, it could be because (1) the innovation was not implemented, (2) implementation was poor or faulty, or (3) the underlying theory was wrong
• Could work, but later, only for a short time, only in a few places, or only under the right conditions—or might not yield much
Evidence to Support Broad-Based Program Change
• What will/should be the standard of evidence?
• CQI (Plan-Do-Study-Act) versus traditional HSR standards—balancing potential risks/rewards
If You Really Want to Know If It Worked (“Impact Analysis”)
Traditional Criteria for “Good” Design – I
• Clarity of questions to be addressed
• Defined standard of proof
  – Design that can achieve this standard with appropriate counterfactual and power (a power sketch follows below)
• Internal validity
  – Design with data for appropriate comparisons to rule out alternative explanations for results
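As a hedged illustration of checking power at the design stage, the Python sketch below uses statsmodels; the effect size, sample sizes, and alpha are assumed numbers chosen only for the example, not values from the webinar.

```python
# A minimal sketch of a design-stage power check using statsmodels
# (effect size, sample sizes, and alpha below are illustrative assumptions).
from statsmodels.stats.power import NormalIndPower

analysis = NormalIndPower()

# Power to detect an assumed standardized effect size of 0.10
# with 1,500 subjects per arm at alpha = 0.05.
power = analysis.solve_power(effect_size=0.10, nobs1=1500, alpha=0.05,
                             ratio=1.0, alternative="two-sided")
print(f"Power with 1,500 subjects per arm: {power:.2f}")

# Invert the question: how many subjects per arm for 80% power?
n = analysis.solve_power(effect_size=0.10, power=0.80, alpha=0.05)
print(f"Subjects needed per arm for 80% power: {n:.0f}")
```

Running the numbers before launch is what lets a design "achieve this standard": an underpowered pilot cannot distinguish a real effect from noise no matter how rapid the feedback.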
Criteria for Good Design – II
• External validity
  – Findings applicable to other circumstances of interest
• Feasible with time, resources, data
Five Questions to Ask
• What is the question? What relationship is being examined?
• What else might affect or explain observed relationships? How does the design control for or take these into account?
• What’s the comparison?
• What are the measures? What are the data?
• How big an effect is necessary for success?
• To what other times, places, populations, or circumstances will/can the results be applied?
Historical Approaches to Impact Evaluation
• Careful definition of target population for the purpose of judging success
• One or more comparison groups “otherwise similar” to serve as benchmark
• Metrics typically constructed from centralized data files, existing or new
• Long time frame to distinguish short-term effects from stable long-term effects
Common Designs
• Random assignment (patients, organizations) with a control group
• Quasi-experimental designs (see the sketch after this list)
  – Matched comparisons
  – Regression-adjusted comparisons
  – Discontinuity designs
• Retrospective multivariate analysis using available data
• Can consider staging, starting with groups where the intervention is most likely to be effective
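As one concrete illustration of the quasi-experimental options above, here is a minimal difference-in-differences sketch using a regression-adjusted comparison in statsmodels. The dataset is simulated, and every variable name and effect size is an assumption made only for the example.

```python
# A minimal difference-in-differences sketch with simulated data
# (variable names and effect sizes are illustrative, not from the webinar).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),  # 1 = intervention site
    "post": rng.integers(0, 2, n),     # 1 = after the innovation
    "age": rng.normal(45, 12, n),      # covariate for regression adjustment
})
# Simulated outcome with a true intervention effect of +3.0
df["outcome"] = (50 + 2.0 * df["treated"] + 1.5 * df["post"]
                 + 3.0 * df["treated"] * df["post"]
                 + 0.1 * df["age"] + rng.normal(0, 5, n))

# The coefficient on treated:post is the difference-in-differences estimate:
# the change in the treated group beyond the change in the comparison group.
model = smf.ols("outcome ~ treated * post + age", data=df).fit()
print(f"DiD estimate: {model.params['treated:post']:.2f} "
      f"(SE {model.bse['treated:post']:.2f})")
```

The interaction term is what supplies the counterfactual logic: the comparison group's before-after change stands in for what would have happened to the treated group absent the innovation.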
Likely Reality of Innovation Testing
• “Broad-based national or state” demonstrations across widely divergent organizations
• “Bottom up” innovation with variation in detail across sites
• “Contaminated” comparison groups, with attrition that may be differential
• Desire for rapid feedback on the “right direction” even though some effects may take time to surface
How HSR Can Contribute
• Support for advance planning, conceptualization, and common metrics
• Techniques that take advantage of variation
• Ways to use (and improve) operational data
• Innovative comparisons that address problems
• Integrated analysis of process and outcomes to enhance understanding of “what works, for whom, and where,” with enough evidence to convince audiences and help them apply results
Realism on Standards of Evidence
• Trade-offs between Type I and Type II errors
• Moving too quickly versus too slowly on improvements: how “good” are things now, and what risks are there in change?
• Congressional history
  – Legislators have acted before evaluations are done
  – They have also failed to act despite evaluation results showing what was or was not successful
• Minimum: trend key indicators over time and against a counterfactual, and identify fidelity of implementation (a minimal sketch follows below)
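A hedged sketch of that minimum standard, trending a key indicator over time against a comparison-group counterfactual; the quarterly immunization rates below are invented for illustration.

```python
# A minimal sketch of trending a key indicator against a counterfactual
# comparison group over time (all rates below are invented).
import pandas as pd

rates = pd.DataFrame({
    "quarter": ["2012Q1", "2012Q2", "2012Q3", "2012Q4"] * 2,
    "group": ["intervention"] * 4 + ["comparison"] * 4,
    "immunization_rate": [0.62, 0.64, 0.68, 0.71,   # intervention sites
                          0.61, 0.62, 0.63, 0.63],  # "otherwise similar" sites
})

trend = rates.pivot(index="quarter", columns="group",
                    values="immunization_rate")
trend["gap_vs_counterfactual"] = trend["intervention"] - trend["comparison"]
print(trend)  # a widening gap suggests movement in the "right direction"
```

Paired with fidelity data on what was actually implemented and when, even this simple comparison guards against crediting the intervention for a trend that both groups share.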
Theory versus Reality
• The relevance of “case studies” and qualitative insight to complex processes (Greenhalgh et al. 2011)
• Demonstrating the feasibility of implementation is an important and potentially powerful outcome
Priorities for Attention by Implementers
The utility of pilots is enhanced by high-quality evidence that:
• Clarifies in advance what is to be tested and why
• Provides ongoing, consistent, and timely measures of intended and unintended changes
• Includes appropriate analysis to aid in attribution
• Gives “enough” information on context and implementation to allow suitable spread to be assessed by a diverse group of stakeholders
Closing Points
Ultimate Policy/Research Challenges
• Distinguish useful initiatives from efforts that mainly preserve the status quo or are actually harmful
• Avoid stifling innovation that could improve the system because “no data are good enough”
• Appropriate trade-offs are likely to vary across innovations at different stages or with different risks/rewards
• Be nimble when it is clear things aren’t working, but do not jump too fast at short-term results
Front Line Contributions
• Recognize the value of solid conceptualization
• Support metric development and sharing
• Document progress and reasons for change
• Be open to partnering and learning
• Be realistic (though that may be in tension with buy-in)
Collaborate Early
• If evaluation is important, it needs to be considered while the program is being developed:
  – Design features to facilitate evaluation
  – Data requirements to facilitate evaluation
  – Collection of baseline data
  – Bringing the evaluator in early is important, though contractual requirements could interfere
  – An evaluation perspective can be brought in even if an evaluator has not been selected
• If evaluation objectives are important, policymakers will have to support them
Small Steps Can Make a Big Difference
• Fuzzy logic leads to confusing results
• Be clear on what is being changed and what outcomes are of interest
• Capture data on key parameters of change
• Acknowledge reality: pace/nature of implementation, environment of change, competing processes
• Seek help to put results in the context of existing studies/knowledge, and pursue more rigorous analysis where warranted
For More Information
Please contact:
– Marsha Gold, mgold@mathematica-mpr.com
Some Sources of Additional Guidance – I
• M. Gold, D. Helms, and S. Guterman. “Identifying, Monitoring and Assessing Promising Innovation.” New York: The Commonwealth Fund, 2011.
• B. Dowd and R. Town. “Does X Really Cause Y?” AcademyHealth. Available at www.academyhealth.org.
• W.K. Kellogg Foundation. “Logic Model Development Guide.” Updated January 2004. Available online at www.wkkf.org.
• R. Pawson and N. Tilley. Realistic Evaluation. London: Sage Publications, 1997.
Some Sources of Additional Guidance – II
• M. Hargreaves. “Evaluating System Change: A Planning Guide.” Mathematica Policy Research, April 2010. Available at www.mathematica-mpr.com.
• M.Q. Patton. Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications, 2008.
• D.T. Campbell and J.C. Stanley. Experimental and Quasi-Experimental Designs for Research. Chicago: Rand McNally and Co., 1972.
• T. Greenhalgh, J. Russell, R.E. Ashcroft, and W. Parsons. “Why National eHealth Programs Need Dead Philosophers: Wittgensteinian Reflections on Policymakers’ Reluctance to Learn from History.” Milbank Quarterly, vol. 89, no. 4, 2011, pp. 533-563.
Survey
Please fill out a brief evaluation of this webinar. The survey will pop up at the end of the webinar, or can be accessed here: https://www.surveymonkey.com/s/phsrrapidcycle

Thank You!
Please remember to take a minute and fill out our brief survey: https://www.surveymonkey.com/s/phsrrapidcycle
www.academyhealth.org/phsr