Evaluation Methods
Training and Capacity Building Programs
Nidhi Khattri
Independent Evaluation Group
November 17, 2008
IEG’s Mandate
► The World Bank’s independent evaluation
function established about 35 years ago
► The goals:
• learning from experience
• accountability for the achievement of objectives
IEG’s Independence: Direct Report to the Board
► Direct reporting to the Board of Executive Directors
► Headed by a Director-General, Evaluation (DGE)
• Appointed by the Board
• Cannot hold any other World Bank Group position after the current one
► Evaluations to the Committee on Development
Effectiveness (CODE)
► Evaluation content not negotiated with CODE/Board
IEG’s Links to Bank Management
► Bank management has opportunity to comment
► Draft Bank Management Response accompanies
evaluation
► IEG responds to Bank management comments at
Board meetings
► IEG’s Management Action Record (MAR) reports on
management’s progress on actions noted in
management response
IEG’s Evaluation Products
► Project Evaluations
• Project assessments (ICR Reviews, PPARs)
• Impact studies (e.g., Bangladesh Health, Ghana Education)
► Sector and Thematic Evaluations
• Often linked to policy revision (e.g., forestry – altered sector policy)
► Country Evaluations
• Country Assistance Evaluations
• Country Impact Reviews (IEG-IFC)
• Reviews of CAS Completion Reports
► Global and Regional Program Evaluations
► Corporate Evaluations
• Annual Review of Development Effectiveness (which now includes the Annual Report on Operations Evaluation)
Evaluation Approaches
► Based on evaluation products
► Primarily Objectives-Based for Projects and Programs
• Outcome
• Risk to Development Outcome
• Bank Performance
• Borrower Performance
• Monitoring and Evaluation (M&E) Quality
Outcome
► The extent to which the operation’s major
relevant objectives were achieved, or are
expected to be achieved, efficiently
► Outcome = Relevance + Efficacy + Efficiency
Bank Performance
► The extent to which services provided by the
Bank ensured quality at entry of the operation
and supported effective implementation through
appropriate supervision
► Bank Performance = Quality at Entry + Quality of
Supervision
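The two formulas above indicate which sub-ratings feed each composite; they are not a literal arithmetic rule. As a rough sketch only, the Python snippet below rolls sub-ratings up with a hypothetical six-point scale and a simple averaging rule, neither of which the deck specifies:

```python
# Illustrative only: a hypothetical roll-up of IEG-style sub-ratings.
# The six-point scale and the averaging rule are assumptions for this
# sketch, not IEG's actual rating criteria.

SCALE = {
    "Highly Unsatisfactory": 1, "Unsatisfactory": 2,
    "Moderately Unsatisfactory": 3, "Moderately Satisfactory": 4,
    "Satisfactory": 5, "Highly Satisfactory": 6,
}

def composite(*sub_ratings: str) -> float:
    """Average the numeric values of the given sub-ratings."""
    return sum(SCALE[r] for r in sub_ratings) / len(sub_ratings)

# Outcome <- Relevance, Efficacy, Efficiency
outcome = composite("Satisfactory", "Moderately Satisfactory", "Moderately Satisfactory")

# Bank Performance <- Quality at Entry, Quality of Supervision
bank_performance = composite("Moderately Satisfactory", "Satisfactory")

print(f"Outcome: {outcome:.1f}  Bank Performance: {bank_performance:.1f}")
```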
Monitoring and Evaluation (M&E) Quality
► M&E design—the extent to which the project was designed to collect appropriate (input, output, outcome, and impact) data given project objectives and given already available data
► M&E implementation—the extent to which appropriate data was actually collected using appropriate collection methods (to ensure data quality)
► M&E utilization—the extent to which appropriate data was used to inform decision-making and resource allocation
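As a minimal sketch of how these three dimensions might be recorded and rolled up for a single project review — assuming a hypothetical four-point scale and a "weakest dimension" rule that the deck does not specify:

```python
# Minimal sketch: recording the three M&E quality dimensions for one
# project review. The four-point scale and the "overall = weakest
# dimension" rule are assumptions for illustration, not IEG criteria.
from dataclasses import dataclass

ME_SCALE = {"Negligible": 1, "Modest": 2, "Substantial": 3, "High": 4}

@dataclass
class MEQuality:
    design: str          # was the project set up to collect the right data?
    implementation: str  # was that data actually collected, reliably?
    utilization: str     # did the data inform decisions and resource allocation?

    def overall(self) -> str:
        # Assume the overall rating is capped by the weakest dimension.
        return min((self.design, self.implementation, self.utilization),
                   key=ME_SCALE.get)

print(MEQuality("Substantial", "Modest", "Negligible").overall())  # -> Negligible
```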
Corporate Evaluation: Annual Review of
Development Effectiveness (ARDE)
► Annual meta-evaluations that provide a comprehensive
assessment of the Bank’s development effectiveness
► Draw on IEG’s recent project, sector, thematic, country,
and global evaluations
► Synthesize lessons that can be used to increase the
development effectiveness of World Bank assistance
► Highlight the findings of recent IEG evaluations around a
common theme
Recent IEG Evaluations around Training and
Capacity Building
► Using Training to Build Capacity (2007)
• Average amount of client training estimated at $720 million per year (90% through projects, the rest through WBI)
• Key component in 60% of investment projects, particularly in social,
rural, public sectors
• 37 (incl. 8 WBI) training programs (Bangladesh, Burkina Faso, Mexico,
Tunisia), 6 country surveys of 550 trainees, comparison with other
DTIs, etc.
► Capacity Building in Africa (2005)
• 25% of Bank lending to Africa; $9 billion between 1995 and 2004
► Public Sector Reform (2008)
• 1/6 of Bank lending and advisory support – and increasing
Methodology in Evaluation of Training
► Key Questions:
• To what extent did Bank-financed training have an impact on the
capacity of target organizations?
• What factors contribute to successful training?
• To what extent are such factors present in Bank-financed
training?
► Main Methods – using objectives-based methodology
• Survey of training participants
• Survey of training institutions
• In-depth field reviews
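A hypothetical illustration of the participant-survey step: tabulating responses against the first key question. The program names echo examples mentioned later in the deck, but the field names, response categories, and sample records below are invented.

```python
# Hypothetical sketch of tabulating participant-survey responses.
# Field names, categories, and the sample records are invented.
from collections import Counter

responses = [
    {"program": "Procurement reform", "performance_change": "substantial"},
    {"program": "Procurement reform", "performance_change": "modest"},
    {"program": "Community groups", "performance_change": "substantial"},
    {"program": "Community groups", "performance_change": "none"},
]

by_program: dict[str, Counter] = {}
for r in responses:
    by_program.setdefault(r["program"], Counter())[r["performance_change"]] += 1

for program, counts in by_program.items():
    share = counts["substantial"] / sum(counts.values())
    print(f"{program}: {share:.0%} report substantial change in work performance")
```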
Training Results in Capacity Building Only When
Certain Conditions are Met
Input: Training → Output: Learning → Outcome: Workplace behavior change → Impact: Enhanced institutional or organizational capacity
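A toy model of that chain: capacity gains materialize only if every link holds, so a break at any transition caps the result at the preceding stage. The boolean "conditions" below are stand-ins for the design factors discussed on the following slides, not an IEG scoring device.

```python
# Toy model of the results chain above: capacity gains require every
# link to hold; a break caps the result at the preceding stage.

CHAIN = [
    "Training (input)",
    "Learning (output)",
    "Workplace behavior change (outcome)",
    "Enhanced institutional or organizational capacity (impact)",
]

def furthest_stage(transitions_held: list[bool]) -> str:
    """Return the last stage reached, given whether each transition held."""
    stage = 0
    for held in transitions_held:
        if not held:
            break
        stage += 1
    return CHAIN[stage]

# Learning occurred, but behavior did not change in the workplace:
print(furthest_stage([True, False, True]))  # -> Learning (output)
```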
Effectiveness of Training: What is the Evidence?
Outputs
• Most training resulted in demonstrable learning
• But: Individual learning gains poor predictor of impact
• Project-based training lacked basic results measures
Performance Outcomes
• About half of trainees surveyed reported substantial positive changes in work performance
Impact on Capacity
• 10 project-based programs had significant impact (e.g., Procurement reform, Community Groups, Exporters, SME)
• Best combined project funds + outside expertise
• WBI programs not rated due to lack of data
Other Findings
• Type of training provider not correlated with success
• Good training outcomes in both higher- and low-capacity environments
Training Design: What Works?
Targeting of Training Content
• Adequate diagnosis of capacity gaps associated with strong client commitment / involvement
• Training needs assessment the norm in highly rated programs, but often subject to funding constraints (WBI)
• Good participant selection requires engagement and supervision (e.g., IMF Institute, MASHAV)
• Poor targeting most important cause of lack of impact
Pedagogy
• Generally high marks for design & teaching standards
• But course length and topic coverage need to be better matched with capacity building goals (ITCILO, JICA)
Workplace Transfer
• Allow time for practical learning techniques / action plans (InWent, JICA, MASHAV)
• Provide systematic follow-on support (Motorola, InWent)
When Training Works: What Matters Most?
Workplace Context
• Support by managers and peers is key driver for successful workplace implementation (~90% of feedback)
• But: 1/3 of trainees didn’t have adequate resources, incentives, or organizational support to apply what they learned
• Focus needs assessment on organizational bottlenecks and whether training is indeed the right tool
Incentives
• Lack of institutional incentives a recurring problem in civil service and highly decentralized training programs
• Stronger incentives at work in training programs for community and farmer groups and private sector firms
Client Commitment
• High-level support for training key for workplace transfer
• Stand-alone training has limited ability for diagnosis, dialogue, influence and follow-up
Key Internal Inhibitors for Effective Training?
Results Orientation
• At design, most training programs fail to specify training objectives and expected performance outcomes, reflecting lack of broader capacity needs assessments
• At completion, performance evaluation rarely done
• No feedback / accountability loop >> no improvements
Standards
• Lack of established standards for training design and implementation undercuts quality assurance
Access to Expertise
• Team leaders for project-based training lack adequate in-house support and voiced demand for more
Alignment
• WBI earns high marks for Country Team consultations, but collaboration at task level remains rare; risk of diffusion of program
Persistent Capacity Gaps in Africa Despite
Substantial Inputs (IEG, 2005)
Key Finding
• 40% of sampled lending operations achieved CB goals, with better outcomes in roads than in health / education
Obstacles
• CB lacked a clear results framework (only 1/3 of projects clear about relationships among individual, organizational, and institutional aspects of capacity)
• Weak diagnostics of political economy and available country capacity
• High fragmentation of efforts; supply-driven TA
• Training not embedded in broader HR strategies
Recommended Actions
• Strengthen knowledge base, operational framework, M&E
• Develop sector-specific guidance
• Promote country-led approach
• Re-assess role / modalities of training
Recent Changes
• Capacity Building has moved to center stage in AFR CASs
• Focus shifting beyond individual skills to institutional support, leadership, donor harmonization and better coherence
Mixed Outcomes on Public Sector Reform (IEG, 2008)
Successful Themes
• Public Financial Management: good diagnostics, indicators, joint undertaking with governments (PEFA)
• Tax Administration: strong MoF motivation, good TA
• Transparency: widened access to information
Reform Challenges
• Civil Service and Administrative Reforms: lack good models and indicators, but too important to ignore
• Government-wide anti-corruption: political commitment and a strong judiciary are key
Recommended Actions
• Recognize complex political and sequencing issues; focus on basics first
• Prioritize anti-corruption efforts on the most harmful aspects
• Underpin civil service reforms with better diagnosis
Opportunities Going Forward?
Guidelines
• Review & adopt good practice standards for training programs based on a realistic results framework
• Pool evaluation findings to expand practices / lessons
• Encourage systematic use in decision-making
Learning & Quality
• Pool expertise around core training and capacity building management practices
Awareness
• Identify case studies of influential training programs
Scope & Scale
• Pilot new forms of collaboration
• Bridge the gap between training and other modes of capacity building
Frontier
• New modalities / increased demand for client-led training
• How training can contribute to development objectives
THANK YOU!
IEG Website: http://www.worldbank.org/ieg