Making Contribution Claims
IPDET 2011
Ottawa
John Mayne, Ph D
Advisor on Public Sector Performance
Adjunct Professor, University of Victoria
john.mayne@rogers.com
The context
• An intervention is expected to contribute to
certain desired results
• The desired results have been observed to
occur
• No single factor likely ‘caused’ the results;
there are several players involved
• Alternative approaches (such as RCTs, quasi-experiments) not available or possible
• But there is a need to say something useful
about the contribution the intervention is
making: Are you making a difference?
Theory-based Evaluations
• Growing acceptance of the need for
theory-based approaches
• To better design interventions
• To understand what works where and when
• Numerous approaches
• Realist evaluation
• Theory of change approaches
A results chain
• Activities (how the program carries out its work)
  Examples: negotiating, consulting, inspecting, drafting legislation
• Outputs (goods and services produced by the program)
  Examples: checks delivered, advice given, people processed, information provided, reports produced
• Immediate outcomes (the first-level effects of the outputs)
  Examples: actions taken by the recipients, or behaviour changes
• Intermediate outcomes (the benefits and changes resulting from the outputs)
  Examples: satisfied users, jobs found, equitable treatment, illegal entries stopped, better decisions made
• End outcomes / impacts (the final or long-term consequences)
  Examples: environment improved, stronger economy, safer streets, energy saved
• External factors also influence these results
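A minimal sketch, not from the slides, of how the generic results chain above could be captured as a simple data structure (for example, as the starting point for a monitoring framework). The level names, definitions and examples come from the slide; everything else is illustrative.

```python
# Illustrative only: the generic results chain, as an ordered list of levels.
RESULTS_CHAIN = [
    {"level": "activities",
     "definition": "how the program carries out its work",
     "examples": ["negotiating", "consulting", "inspecting", "drafting legislation"]},
    {"level": "outputs",
     "definition": "goods and services produced by the program",
     "examples": ["checks delivered", "advice given", "people processed",
                  "information provided", "reports produced"]},
    {"level": "immediate outcomes",
     "definition": "the first-level effects of the outputs",
     "examples": ["actions taken by the recipients", "behaviour changes"]},
    {"level": "intermediate outcomes",
     "definition": "the benefits and changes resulting from the outputs",
     "examples": ["satisfied users", "jobs found", "equitable treatment",
                  "illegal entries stopped", "better decisions made"]},
    {"level": "end outcomes (impacts)",
     "definition": "the final or long-term consequences",
     "examples": ["environment improved", "stronger economy",
                  "safer streets", "energy saved"]},
]

for step in RESULTS_CHAIN:
    print(f"{step['level']}: {step['definition']}")
```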
Results chain links
• The same results chain as above, now read as a set of links between levels
• For each link, ask why the next level of results will come about: for example, why will these outputs lead to these immediate outcomes?
• External factors bear on every link
Theories of change
• A results chain with embedded
assumptions, risks and other
explanatory factors identified
• An explanation of what has to happen
for the results chain to work
• Example: an anti-smoking campaign expected to lead to a reduction in smoking
  Assumptions: target is reached, message is heard, message is convincing, no other major influences at work
  Risks: target not reached, poor message, peer pressure to smoke very strong
  Other explanatory factors: reduction due to trend pressure or price increases
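One way to make the "results chain plus embedded assumptions" idea concrete is to annotate each causal link with its assumptions, risks and other explanatory factors. The sketch below is my illustration, not the author's; it simply encodes the anti-smoking example from the slide in that form.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CausalLink:
    """One link in a theory of change: 'cause' is expected to lead to 'effect',
    provided the assumptions hold; risks and other explanatory factors are
    recorded so they can be examined when assessing the contribution."""
    cause: str
    effect: str
    assumptions: List[str] = field(default_factory=list)
    risks: List[str] = field(default_factory=list)
    other_explanatory_factors: List[str] = field(default_factory=list)

smoking_link = CausalLink(
    cause="anti-smoking campaign",
    effect="reduction in smoking",
    assumptions=["target is reached", "message is heard",
                 "message is convincing", "no other major influences at work"],
    risks=["target not reached", "poor message",
           "peer pressure to smoke very strong"],
    other_explanatory_factors=["reduction due to trend pressure",
                               "price increases"],
)

print(f"{smoking_link.cause} -> {smoking_link.effect}")
print("Assumptions to verify:", smoking_link.assumptions)
```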
A Generic Theory of Change
• The results chain, read from the bottom up: Activities and Outputs → Reach & Reaction → Changes in knowledge, attitudes, skills, opportunities and incentives → Behaviour changes → End Results
• Unintended effects can arise at every level
• External influences act on the whole chain
  Assumptions: How do external factors influence the realization of the intervention's ToC?
  Risks: Risks to the links in the ToC not occurring as expected
  Other explanatory factors: socio-economic factors; other interventions
• Activities and Outputs → Reach & Reaction
  Assumptions: How, and to what extent, are the intervention's outputs expected to reach people? What has to happen? What contextual factors influence these processes?
  Risks: Risks to the link not occurring
• Reach & Reaction → Changes in knowledge, attitudes, skills, opportunities and incentives
  Assumptions: How is the intervention expected to enhance knowledge, attitudes, skills, opportunities and/or incentives? What has to happen? What factors influence these processes?
  Risks: Risks to the link not occurring
  Other explanatory factors: other interventions; self-learning
• Changes in knowledge, attitudes, skills, opportunities and incentives → Behaviour changes
  Assumptions: How are changes in knowledge, attitudes, skills, opportunities and/or incentives expected to change behaviour? What has to happen? What factors influence these processes?
  Risks: Risks to the link not occurring
  Other explanatory factors: peer or trend pressure; other interventions
• Behaviour changes → End Results
  Assumptions: How are behavioural changes in the target population expected to influence the desired end result? What has to happen? What factors influence these processes?
  Risks: Risks to the link not occurring
  Other explanatory factors: socio-economic factors
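To show how the generic chain and the link-level questions fit together, here is an illustrative sketch (mine, not the author's). The stage names and assumption questions are paraphrased from the slide; the structure of the code is an assumption about one convenient way to hold them.

```python
# Illustrative only: the generic theory-of-change stages and the
# assumption question to probe at each link between them.
STAGES = [
    "Activities and Outputs",
    "Reach & Reaction",
    "Changes in knowledge, attitudes, skills, opportunities and incentives",
    "Behaviour changes",
    "End Results",
]

LINK_QUESTIONS = [
    "How, and to what extent, are the outputs expected to reach people?",
    "How is the intervention expected to enhance knowledge, attitudes, "
    "skills, opportunities and/or incentives?",
    "How are those changes expected to change behaviour?",
    "How are behavioural changes expected to influence the desired end result?",
]

for lower, upper, question in zip(STAGES, STAGES[1:], LINK_QUESTIONS):
    print(f"{lower} -> {upper}")
    print(f"  Assumption to examine: {question}")
    print("  Also record: risks to the link, other explanatory factors, "
          "unintended effects")
```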
Addressing causality
• “The only way to deal with causality is to
use a counterfactual”
• NOT TRUE
• Philosophy of science discusses several
alternative perspectives on causality
• Successionist (Hume & Mill's Methods of Agreement and Difference)
• Generative (mechanistic, or process
causality)
Addressing Causality
• The gold standard debate (RCTs et al)
• Intense debate underway, especially in
development impact evaluation
• In concept, RCTs may be great.
• In practice, RCTs have problems and
often limited applicability
• Then what do we do?
Causal Questions
1. Has the intervention caused the result?
o What would have happened without the
intervention?
2. Has the intervention made a difference?
o What contribution has the intervention
made?
3. Why has the result occurred?
o What role did the intervention play?
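As a point of reference, question 1 is the counterfactual question of the potential-outcomes tradition. A standard textbook formalization (not from the slides) is:

```latex
% Standard potential-outcomes notation (textbook convention, not from the slides).
% Y_i(1): unit i's outcome with the intervention; Y_i(0): its outcome without it.
\tau_i = Y_i(1) - Y_i(0) \quad \text{(unit-level effect, never directly observed)}
\qquad
\mathrm{ATE} = \mathbb{E}[Y(1)] - \mathbb{E}[Y(0)] \quad \text{(what an RCT estimates)}
```

Questions 2 and 3, by contrast, ask how the observed result came about and what the intervention contributed, which is where contribution analysis is aimed.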
Mechanistic Causation
• Process or generative causality
• Tracing the links in the theory between
events
• The alternative to successionist (counterfactual) approaches, sometimes called variation causality
• Everyday causality: auto mechanic, air
crashes, forensic work, doctors
Contribution analysis:
the theory
• There is a theory behind the
intervention with expected results
• The activities of the intervention were
implemented as planned
• The intervention theory is supported
by evidence; the sequence of results is
being realized, assumptions are holding
• Other influencing factors have been
assessed and accounted for
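Read together, these four bullets amount to an evidence checklist. The toy sketch below is my illustration, not the author's; the condition names are invented shorthand for the bullets above.

```python
# Illustrative only: the four conditions behind a contribution claim,
# treated as an all-or-nothing evidence checklist (names are invented).
conditions = {
    "reasoned theory of change with expected results": True,
    "activities implemented as planned": True,
    "theory supported by evidence (results realized, assumptions holding)": True,
    "other influencing factors assessed and accounted for": True,
}

if all(conditions.values()):
    print("Reasonable to claim the intervention is contributing to the results.")
else:
    weak = [name for name, met in conditions.items() if not met]
    print("Contribution claim not yet warranted; evidence needed on:", weak)
```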
The Contribution Claim
Therefore,
• It is reasonable to conclude that the
intervention is making a difference—it
is contributing to (influencing) the
desired results
• This is taking a mechanistic approach
to causality: understanding &
confirming the causal mechanisms at
work in an intervention
Contribution analysis:
the practice
1. Set out the attribution problem
2. Critically develop the expected theory
of change
3. Gather the existing evidence
4. Assess the contribution story
5. Seek out additional evidence
6. Revise & strengthen the contribution
story
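The six steps are usually described as iterative, with the later evidence-gathering steps repeated until the contribution story is credible enough for its intended use. The sketch below is an illustration under that reading; the function, parameter names and thresholds are all invented.

```python
# Illustrative sketch of the six steps, with steps 4-6 iterated until the
# contribution story is judged credible (thresholds here are arbitrary).
CA_STEPS = [
    "1. Set out the attribution problem to be addressed",
    "2. Critically develop the expected theory of change",
    "3. Gather the existing evidence on the theory of change",
    "4. Assess the contribution story and its weaknesses",
    "5. Seek out additional evidence",
    "6. Revise and strengthen the contribution story",
]

def contribution_analysis(evidence_strength, credible_at=0.8, max_rounds=3):
    for step in CA_STEPS[:4]:
        print(step)
    for _ in range(max_rounds):
        if evidence_strength >= credible_at:
            break
        print(CA_STEPS[4])
        print(CA_STEPS[5])
        evidence_strength += 0.2   # stand-in for new evidence strengthening the story
    return evidence_strength >= credible_at

print("Claim warranted?", contribution_analysis(evidence_strength=0.5))
```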
Developments in CA
• EES Prague Conference
• CA Forum website http://www.intevalgroup.org/-Forum-.html
• Upcoming special issue of the journal Evaluation on CA
• DFID work on alternative methods
Main Messages
• Results chains and the like should not be seen as theories of change
• Counterfactuals are neither necessary nor sufficient for ‘proving’ causality
• Key impact evaluation questions should
be: Why has the result occurred? What
has been the intervention’s contribution?
• Contribution analysis (and related
approaches) produce contribution claims
Some References
• Mayne, J. (2011). Addressing Cause and Effect in Simple and Complex Settings through Contribution Analysis. In R. Schwartz, K. Forss, and M. Marra (Eds.), Evaluating the Complex. Transaction Publishers.
• Mayne, J. (2008). Contribution Analysis: An Approach to Exploring Cause and Effect. ILAC Brief. Available at http://www.cgiarilac.org/files/publications/briefs/ILAC_Brief16_Contribution_Analysis.pdf
• Mayne, J. (2001). Addressing Attribution through Contribution Analysis: Using Performance Measures Sensibly. Canadian Journal of Program Evaluation, 16(1), 1-24. See also http://www.oagbvg.gc.ca/domino/other.nsf/html/99dp1_e.html
• Funnell, S. and P. Rogers (2011). Purposeful Program Theory. Jossey-Bass.