Implementation and process evaluation: developing our approach

Neil Humphrey, University of Manchester
Gemma Moss, Institute of Education
Ann Lendrum, University of Manchester
Starting activity
• What is your understanding of the terms ‘process’ and ‘implementation’?
• Try to come up with working definitions for each in your group
• It may be helpful to think about how these terms could be applied and understood in relation to an intervention you are evaluating
Introduction
• Understanding ‘what works’ is important, but it is equally important to know “why various programs do or do not work, for whom and under what conditions they work, what is needed to scale up proven programs, and what policy supports are needed to scale them up without losing their effectiveness” (Slavin, 2012, p. xv)
What do we mean by implementation and process evaluation?
• Put very simply…
– Assessment of outcomes in trials answers the question: does it work?
– Implementation and process evaluation (IPE) helps us to understand how and why
• IPE within trials explores “the implementation, receipt and setting of an intervention” and helps “in the interpretation of outcomes” (Oakley et al., 2006, p. 413)
– Studying how the intervention is implemented (including how and why this varies)
– Ascertaining the views of key participants (e.g. pupils, teachers) on critical issues (e.g. social validity)
– Distinguishing between different intervention components
– Investigating contextual factors that may influence the achievement of expected outcomes
What do we mean by implementation and process evaluation?
• IPE can help to clarify whether assumptions in the intervention design about the causal links from intervention to impact work out in practice
• By paying attention to the social processes involved in making the intervention happen, IPE can help us to
– Know what happened in an intervention
– Establish the internal validity of the intervention and strengthen conclusions about its role in changing outcomes
– Understand the intervention better – how different elements fit together, how users interact, et cetera
– Provide ongoing feedback that can enhance subsequent delivery
– Advance knowledge on how best to replicate intervention effects in real-world settings (Domitrovich & Greenberg, 2000)
A worked example
• Secondary SEAL national evaluation (Humphrey, Lendrum & Wigelsworth, 2010)
– Outcomes strand
• Pre-test–post-test control group design (41 schools, c.8,000 pupils)
• Primary outcomes were social and emotional skills, behaviour, mental health
– IPE strand
• Longitudinal case studies in 9 SEAL schools
• Interviews and focus groups, observations, document analysis
• Analysis of data from the outcomes strand indicated that SEAL had no measurable impact
• Analysis of data from the IPE strand helped us to understand why this was the case
– Issues with the programme theory (or lack thereof)
– Implementation failure
– Lack of understanding and resistance among staff
[Diagram: theory – implementation – evaluation]
Why is IPE important?
• Interventions are rarely (if ever!) delivered exactly as planned
• Variability in implementation has been consistently shown to predict variability in outcomes
• Interventions do not happen in a vacuum – so understanding context and the social processes within them is crucial
[Chart: teacher-rated SDQ peer problems by participant responsiveness (low/moderate/high PR), PATHS vs. control]
[Chart: InCAS Reading scores by dosage (low/moderate/high), PATHS vs. control]
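
The pattern in these charts can be checked directly during analysis. Below is a minimal sketch of such a moderation check in Python, assuming a dataset with hypothetical columns group, dosage_level and outcome; the data and column names are invented for illustration and are not from the PATHS trial.

# Hypothetical sketch: does the intervention effect vary with the level of
# implementation? Data and column names are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "group": ["PATHS"] * 6 + ["Control"] * 6,
    "dosage_level": ["low", "moderate", "high"] * 4,
    "outcome": [1.2, 2.1, 3.3, 0.9, 2.0, 3.1,   # intervention pupils
                1.0, 1.1, 1.2, 0.9, 1.0, 1.1],  # control pupils
})

# Mean outcome per trial arm and dosage band -- the pattern the charts show.
print(df.groupby(["group", "dosage_level"])["outcome"].mean())

# A simple interaction model: is the PATHS-vs-control difference larger at
# higher dosage? A real analysis would use the full pupil-level data, adjust
# for baseline scores and account for clustering by school.
model = smf.ols("outcome ~ group * dosage_level", data=df).fit()
print(model.summary())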
Putting the I in IPE
• Aspects of implementation (a toy record structure covering these is sketched below)
– Fidelity/adherence
– Dosage
– Quality
– Participant responsiveness
– Programme differentiation
– Programme reach
– Adaptation
– Monitoring of comparison conditions
• Factors affecting implementation
– Preplanning and foundations
– Implementation support system
– Implementation environment
– Implementer factors
– Programme characteristics
See Durlak and DuPre (2008), Greenberg et al. (2005), Forman et al. (2009)
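
As a concrete (and purely illustrative) way of holding these aspects together, the sketch below defines a simple record type that could log them for each observed session; the field names and rating scales are assumptions, not a published coding scheme.

# Hypothetical sketch: one record per observed session, with a field for
# each implementation aspect listed above. Field names and scales are
# illustrative assumptions, not a standard instrument.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ImplementationRecord:
    school_id: str
    session: int
    fidelity: float        # proportion of core components delivered (0-1)
    dosage_minutes: int    # time spent delivering the session
    quality: int           # observer rating of delivery quality, e.g. 1-5
    responsiveness: int    # participant engagement rating, e.g. 1-5
    differentiation: str   # distinctiveness from other programmes in school
    reach: float           # proportion of target pupils present (0-1)
    adaptations: List[str] = field(default_factory=list)  # changes made, and why
    comparison_activity: str = ""  # what the comparison condition received

record = ImplementationRecord(
    school_id="school_01",
    session=3,
    fidelity=0.8,
    dosage_minutes=45,
    quality=4,
    responsiveness=3,
    differentiation="no overlapping SEL programme running",
    reach=0.9,
    adaptations=["plenary shortened to fit the timetable"],
    comparison_activity="usual PSHE lesson",
)
print(record)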
Researching IPE activity
• Think about a trial you have recently completed
• Did you have an IPE built into your trial?
• If YES…
– What information did you collect, and why?
– What did the data generated tell you about the intervention, the context in which it was being implemented, and the interaction between the two?
– What is one useful piece of information you were able to feed back to the intervention designers/implementers/participants/funders about the process of implementation?
• If NO…
– What were the main findings of the trial?
– What conclusions did you draw as a result?
– What could IPE have added?
Developing our approach to IPE
• General approach – quantitative, qualitative or both?
– 62% quant, 21% qual, 17% both in health promotion research (Oakley, 2005)
• Where to target resources?
– Intervention, context, and the interaction between the two
• Which aspects to assess when examining implementation?
– Assessment of implementation focuses predominantly on fidelity and dosage, but this can lead to a Type III error (concluding that an intervention is ineffective when in fact it was never properly implemented)
• What you see and what people tell you
– Implementer self-report or independent observation? (a toy agreement check between the two sources is sketched below)
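
On that last point: self-reports are cheaper but are often found to run high relative to independent observation, so one option is to observe a subsample of sessions and check agreement between the two sources. A toy sketch of such a check follows; the fidelity ratings are invented for illustration.

# Hypothetical sketch: comparing implementer self-reported fidelity with
# independent observation on a subsample of sessions. Numbers are invented.
import numpy as np

self_report = np.array([0.90, 0.85, 0.95, 0.70, 0.80])  # implementer ratings
observed    = np.array([0.70, 0.65, 0.80, 0.55, 0.60])  # observer ratings

r = np.corrcoef(self_report, observed)[0, 1]
bias = (self_report - observed).mean()

# A high correlation with a positive bias would suggest that self-reports
# rank sessions consistently but overstate absolute fidelity.
print(f"correlation: {r:.2f}, mean self-report inflation: {bias:.2f}")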
Benefits of IPE
• For evaluators
– explains and illuminates findings from the impact evaluation, strengthening the conclusions impact evaluators draw
• For intervention designers
– clarifies whether assumptions embedded in the programme about “what works” are warranted
– indicates how the programme might need to adapt to unforeseen contextual factors that influence implementation
– enables designers to engage with programme participants’ perspectives on the programme, giving participants a voice
• For EEF, helps to clarify
– the relative effectiveness of different “theories of change”
– the necessary and sufficient conditions under which different programme logics might work best
Sources of further information and support
• Some reading
– Lendrum, A. & Humphrey, N. (2012). The importance of studying the implementation of school-based interventions. Oxford Review of Education, 38, 635-652.
– Durlak, J.A. & DuPre, E.P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327-350.
– Kelly, B. & Perkins, D. (Eds.) (2012). Handbook of implementation science for psychology in education. Cambridge: Cambridge University Press.
– Oakley, A. et al. (2006). Process evaluation in randomised controlled trials of complex interventions. British Medical Journal, 332, 413-416.
• Organisations
– Global Implementation Initiative: http://globalimplementation.org/
– UK Implementation Network: http://www.cevi.org.uk/ukin.html
• Journals
– Implementation Science: http://www.implementationscience.com/
– Prevention Science: http://link.springer.com/journal/11121