Evaluation processes, methods and tools

Evaluation methods and tools
(Focus on delivery mechanism)
Jela Tvrdonova, 2014
Content
• The evaluation process
• Setting up the evaluation system
• Direct and indirect programme effects and their separation
• Evaluation design
• Evaluation methods
• Securing data
• Answering evaluation questions
• Summing up – key issues to be addressed

The evaluation process
• Ongoing
• Periodical:
  ◦ Ex-ante
  ◦ Mid-term
  ◦ Ex-post
Setting up the evaluation system
• Administrative tasks and institutional set-up (steering group, monitoring committee, evaluation managers etc.)
• Terms of reference (for the independent evaluator)
• Preparation of the evaluation

Setting up the evaluation system
Phases of the evaluation – evaluation tasks:
• Structuring (overlaps with preparation)
• Observing
• Analysing
• Judging
Structuring
• Review the intervention logic of the different measures to be evaluated
• Review other topics to be evaluated (e.g. the delivery mechanism)
• Set up the evaluation framework and design
Review intervention logic
• Review objectives, inputs, measures, expected outputs, results and impacts
• Define key terms
• Assess:
  ◦ Relevance
  ◦ Coherence
  ◦ Effectiveness
  ◦ Efficiency
  ◦ Intended and unintended factors
  ◦ EU policy objectives
  ◦ Context and its description
  ◦ SWOT and needs assessment
  ◦ Complementarity
[Figure: RDP intervention logic. Inputs feed measures and projects and their management and implementation (measure level), producing outputs and results against the operational and specific objectives of the EU/MS (axis level) and impacts against the overall objectives (programme level); relevance, efficiency, effectiveness and coherence are the linking assessment criteria. Source: EENRD 2014]
Review other topics
• Identify the evaluation need
• Define key terms
• Establish benchmarks where possible
Set up the evaluation framework
• Define programme-specific evaluation questions, judgment criteria and indicators (see the sketch after this list)
• Link the intervention logic with evaluation questions and indicators
• Remember intended and unintended factors of the intervention logic
• Identify direct and indirect programme effects
• Consider contextual factors
• Choose the evaluation design and methods to answer the evaluation questions
• Screen data and information sources and ensure their availability
• Decide on the collection of additional data and information to fill data gaps
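To make the first item concrete, here is a minimal sketch of how programme-specific evaluation questions, judgment criteria and indicators can be linked in one structure. All names, baselines and targets are hypothetical, not taken from an actual RDP.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str        # indicator name, e.g. an output or result indicator
    unit: str        # measurement unit
    baseline: float  # baseline value, e.g. from the ex-ante evaluation
    target: float    # target level set in the programme

@dataclass
class EvaluationQuestion:
    text: str                     # the programme-specific question
    judgment_criteria: list[str]  # what would count as success
    indicators: list[Indicator] = field(default_factory=list)

# Hypothetical entry linking one question to a criterion and an indicator
eq = EvaluationQuestion(
    text="To what extent has the delivery mechanism reduced the "
         "administrative burden on beneficiaries?",
    judgment_criteria=["Application processing time has decreased"],
    indicators=[Indicator("average processing time", "days",
                          baseline=60.0, target=30.0)],
)
print(eq.text, "->", eq.indicators[0].name)
```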
Observing
• Create the tools needed for the quantitative and qualitative analysis: interview guides, questionnaires, queries for extractions from databases (see the sketch after this list), requests for maps, guidelines for case studies, focus groups and any other data collection instrument the contractor deems appropriate
• Collect the data and qualitative information needed to answer each evaluation question: databases, studies, people to be interviewed, appropriate case study areas etc.
• Describe the process of programme implementation, the composition of the programme, its priorities, target levels and budget
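As an illustration of a query for extraction from a monitoring database, a minimal sketch using pandas. The file name and the column names (measure, status, region, amount_paid) are assumptions made for this example only.

```python
import pandas as pd

# Hypothetical monitoring export: one row per supported project
monitoring = pd.read_csv("monitoring_export.csv")

# Query for extraction: completed projects under one measure,
# summarising the support paid per region
completed = monitoring.query("measure == 'M04' and status == 'completed'")
summary = completed.groupby("region")["amount_paid"].agg(["count", "sum"])
print(summary)
```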
Analysing
• Analyse all available information in order to assess the effects and impacts of measures, focus areas and the programme in relation to the programme's objectives and target levels.
• To assess the progress made, the link to the baselines provided in the context of the ex-ante evaluation has to be established.
• Impacts are identified as net contributions to the achievement of the programme's objectives.
In this respect evaluators have to:
• Establish appropriate typologies of measures and/or beneficiaries in order to reduce the complexity of the empirical analysis.
• Process and synthesise the available data and information and, where necessary, handle data gaps by modelling or other extrapolations. Apply a measurement against the counterfactual as well as against target levels (see the sketch after this list).
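A minimal sketch of a measurement against the counterfactual, using a difference-in-differences comparison between supported and comparable non-supported holdings. All figures are invented for illustration only.

```python
import numpy as np

# Hypothetical panel: an outcome (e.g. farm income, kEUR) before and after
# the programme, for supported (treated) and comparable non-supported
# (control) holdings
treated_before = np.array([20.0, 22.0, 19.0])
treated_after  = np.array([26.0, 28.0, 24.0])
control_before = np.array([21.0, 20.0, 18.0])
control_after  = np.array([23.0, 22.0, 20.0])

# Naive estimate: the before/after change among beneficiaries only
naive = (treated_after - treated_before).mean()

# Difference-in-differences: subtract the change in the control group,
# i.e. an approximation of what would have happened anyway (counterfactual)
did = naive - (control_after - control_before).mean()

print(f"gross (naive) effect: {naive:.1f}")  # 5.7
print(f"net (DiD) effect:     {did:.1f}")    # 3.7
```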
Judging
• Answer all evaluation questions (common and programme-specific)
• Assess the impact, effectiveness and efficiency of the programme
• Assess measures with respect to their balance within the programme
• Judge the degree to which the programme contributes to achieving the objectives set out in the national and Community strategies
• Identify the factors which contributed to the success or failure of the programme
• Draft conclusions and recommendations based on the findings
• Identify possible adjustments necessary to improve rural policy interventions
Evaluation phases and key activities
• Structuring: setting the intervention logic per measure, focus area and programme; setting up the evaluation framework and design
• Observing: developing tools; collecting data – primary, secondary and monitoring
• Analysing: analysing with various methods – naive and advanced, qualitative and quantitative
• Judging: developing judgments; answering the evaluation questions
Evaluation methods – qualitative
Qualitative approaches are useful during three stages of an impact evaluation:
• When designing an impact evaluation: focus groups and interviews with key informants help to develop hypotheses
• In the intermediate stage, before the quantitative impact evaluation: they give quick insights into what is happening in the programme
• In the analysis stage: evaluators can apply qualitative methods to provide context and explanations for the quantitative results (triangulation)

The applicability of qualitative methodologies for constructing valid counterfactuals is considered rather limited, but possible.
Qualitative methods
• Interviews
• Focus groups
• Surveys
• Case studies
• Field observations
• Literature reviews
• Other qualitative approaches
Criteria for the selection of evaluation methods
• Credibility
• Rigour
• Reliability
• Robustness
• Validity
• Transparency
• Practicability
Also:
• Ability to explain causality
• Ability to eliminate a possible selection bias (see the sketch after this list)
• Ability to isolate the effect of the programme from other factors
• Ability to take potential indirect effects into account
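To illustrate what eliminating selection bias can involve, a minimal sketch of nearest-neighbour matching on a single observable characteristic (farm size). Real applications match on many covariates or on propensity scores; all numbers here are invented.

```python
import numpy as np

# Hypothetical cross-section: larger farms self-select into the programme,
# so a raw comparison of group averages would be biased
treated_size    = np.array([50.0, 80.0, 120.0])       # ha, beneficiaries
treated_outcome = np.array([30.0, 45.0, 60.0])
control_size    = np.array([20.0, 55.0, 85.0, 115.0]) # non-beneficiaries
control_outcome = np.array([12.0, 26.0, 40.0, 52.0])

# Compare each beneficiary with the most similar non-beneficiary
# (nearest neighbour on farm size) instead of the control-group average
effects = []
for size, outcome in zip(treated_size, treated_outcome):
    j = int(np.argmin(np.abs(control_size - size)))  # closest match
    effects.append(outcome - control_outcome[j])

print(f"matched average effect: {np.mean(effects):.1f}")  # 5.7
```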
Answering evaluation questions
• Evidence-based answers
• Answers related to the contextual environment – netting out (see the sketch after this list)
• Sound methodology and data
• Drafting conclusions and recommendations
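A minimal worked example of netting out: the gross change observed among beneficiaries is corrected for deadweight (what would have happened anyway) and displacement (gains offset by losses elsewhere in the programme area). All figures are invented.

```python
# Hypothetical netting-out of a gross programme effect
# (e.g. jobs created in the programme area)
gross_effect = 10.0  # observed change among beneficiaries
deadweight   = 3.0   # part that would have occurred without support
displacement = 1.5   # gains offset by losses elsewhere in the area

net_effect = gross_effect - deadweight - displacement
print(f"net effect: {net_effect:.1f}")  # 5.5
```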