“Every program for improving the lives of poor people in developing countries begins with an intuition about what will work. However, the hopes and good intentions of program implementers, coupled with the human tendency to seek only confirming evidence, lead to programs being initiated and even replicated without learning whether they actually work. In the arena of social change, the null hypothesis is more than an abstract statistical concept. It reflects the reality that many intuitively obvious theories do not in fact produce their intended outcomes. The Evaluation Working Group’s proposals offer the beginning of a remedy to this pervasive problem.”
—Paul Brest, Former President, The William and Flora Hewlett Foundation
Impact Evaluation? Where have we been? Where are we going?
William D. Savedoff, Senior Fellow
“Impact Evaluation: How Can We Learn More? Better?” CGD-3ie Conference
Washington, DC, July 17, 2013
Overview
• Motivation
• What happened? (from one perspective)
• What followed?
• Where are we?
• Where are we going?
Motivation: Why did we ever talk about an “Evaluation Gap”?
Everyone has their own story . . .
Results from Millions Saved (2002)
Experts nominated 56 public health interventions:
• Excluded: 27 because impact could not be documented
• Excluded: 12 because too early or too small in scale
• Included: 17 identified and documented
Evaluation Gap Working Group
On methods:
– Start with the question and choose the method that is most “appropriate, feasible and rigorous to answer it”
Recommendations:
– Strengthen existing initiatives and complementary forms of evaluation
– Create a new independent facility to promote more and better-quality studies
Knowledge is a public good
How to generate collective action?
• Split the impact evaluation process from program approval and implementation so it can:
– be selective,
– concentrate financial & technical resources, and
– be perceived as independent & credible
• Link the impact evaluation process to program design so that:
– questions will be relevant,
– data collection will be appropriate, and
– conclusions can be rigorous
Founding of 3ie, 2009
Our vision
• Improving lives through impact evaluation
Our mission
• Increase development effectiveness through better use of evidence in developing countries
Our strategy
• We aim to:
– Generate new evidence of what works
– Synthesise and disseminate this evidence
– Build a culture of evidence-based policy-making
– Develop capacity to produce and use impact evaluations
Source: www.3ieimpact.org
[Figure: Impact Evaluations by Year, 1985–2011. Vertical axis: number of impact evaluations per year (0–140). Annotated milestones: OECD Expert Group on Aid Evaluation; First Progresa Studies; JPAL Starts; Evaluation Gap Working Group; World Bank DIME; World Bank SIEF; 3ie Founded.]
Source: 3ie Database of Impact Evaluations
[Figure: Selected Impact Evaluations by Year of Publication, 1994–2012, from a Journal of Economic Literature search. Annual counts, axis 0–12.]
Source: Tally from a search in the Journal of Economic Literature using the terms “impact evaluation” and “development”; 55 of 198 results were tentatively identified as original impact evaluations relevant to development policy questions. Illustrative only, as a cross-check on the previous slide.
[Figure: Impact Evaluations by Sector, 1985–2013. Counts per sector, axis 0–180. Sectors shown, in order: Social Protection; Health; Not specified; Education; Multisector; Water & Sanitation; Agriculture & Rural Development; Public Sector; Private Sector; Finance; Environment; Urban Development; Information & Communication; Energy; Economic Policy.]
Source: 3ie Database of Impact Evaluations
[Figure: Impact Evaluations, Social Development Subsectors, 1985–2013. Counts per subsector, axis 0–100: Conditional Cash Transfers; Microfinance; Community Driven Development.]
Source: 3ie Database of Impact Evaluations
[Figure: Impact Evaluations by Methodology, 1985–2013. Annual counts, axis 0–80, x-axis 1985–2011, by method: RCT; Difference-in-Differences; Instrumental Variable; Regression Discontinuity.]
Source: 3ie Database of Impact Evaluations
Where should we be going?
• More is better
– The question comes first; then choose appropriate, feasible, rigorous methods
– Micro studies won’t answer every question, but there are plenty of questions for them to answer
• Fitting evidence together: more systematic reviews, and more thinking & synthesis
• Public goods: clustering, replication, triangulation, standards, pre-registration, funding, networking, linking, teaching . . . ?
Extra Slides
[Figure: IEs by Evaluation Method, 1985–2013. Total counts, axis 0–400, by method: RCT; unspecified; DiD; IV; RDD.]