Using a results-based evaluation methodology in a public sector environment that focuses on “activities”
Terms of Reference - Examples
• To conduct an evaluation of the effectiveness and impact of the DLGTA support over the past five years, using a results-based management (RBM) approach.
• Investigation into the performance and management of the community libraries recapitalisation programme.
• “The Department of Economic Development Agency micro-level monitoring and evaluation system”.
• Evaluation of the Soul Buddyz clubs.
What are we faced with at the commencement of the monitoring and evaluation assignment?
Planning Statements
Strategic Goals - Examples
• To invest in and ensure the provision of quality social welfare services to children, including those in need of care and protection.
• Facilitate integrated and responsive governance in a developmental state.
• Promote health, and prevent and manage illness or conditions, with emphasis on poverty, lifestyle, trauma and violence, and psychosocial factors.
Planning Statements
Strategic Objectives - Examples

Strategic Objective: Ensure that municipalities meet basic needs of communities.
Indicator: Capacity for better implementation of donor-funded projects.

Strategic Objective: Design and deliver innovative multimodal programmes.
Indicators: Executive development learning framework; lateral contribution; transformation.
Planning Statements
Measurable Objectives - Examples
• Reduce infant and under-five morbidity and mortality.
• To ensure effective leadership, management and administrative
support to the department through the continuous refinement
of organisational strategy and structure, in compliance with
appropriate legislation and best practice.
• Conduct campaigns on key nutrition priorities, such as obesity.
• To promote growth, social development and poverty reduction
through sound fiscal and financial policies, and the effective,
efficient and appropriate allocation of public funds.
Results Statements
Outcomes - Examples

Outcome: Enhanced service delivery
Indicator: Regulatory and support mechanisms for Municipal Councils and ward committees reformed and implemented

Outcome: Maturity in the M&E system for cooperative governance
Indicator: A coordinated and functional M,R&E system for Provincial and Local Government

Outcome: Coordinated M&E system for cooperative governance
Indicator: Regulatory framework for M&E implemented
Results Statements
Outputs - Examples

Output: Improved quality and quantity of measurable objectives and trend-based performance indicators by departments and public entities
Indicator: Publication and implementation of a standard operating procedures manual

Output: Local government budget framework
Indicator: Integrity of the framework: fiscal sustainability, structure and trends in fiscal indicators

Output: Infrastructure delivery improvement programme
Indicator: Technical assistants deployed in targeted provincial departments

Output: Enhanced coordination across government for effective implementation of the MFMA
Indicator: An agenda that ensures alignment of the activities of national and provincial government departments with agreed priorities
Where do we start?
• Review all planning documents – these include:
– Strategic Plans
– Annual Performance Plans
– Business Plans
– Operational Plans
• Map the planning statements, the indicators, targets, activities, etc. (a minimal mapping sketch follows this list)
• Identify all available data:
– Quarterly Reports
– Annual Reports
– Baseline data
– Reports from special studies conducted
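One way to make the mapping step concrete is to hold each planning statement, with its indicators and targets, in a small results-matrix structure and flag any statement that arrives at the evaluation without measurable hooks. The sketch below is a minimal illustration in Python; the statement, indicator and empty target list are placeholders echoing the earlier example slides, not entries from an actual plan.

```python
from dataclasses import dataclass, field

@dataclass
class PlanningStatement:
    """One row of the results matrix, mapped from the planning documents."""
    level: str   # "goal", "strategic objective" or "measurable objective"
    text: str
    indicators: list = field(default_factory=list)
    targets: list = field(default_factory=list)

# Illustrative entry, echoing an example objective from the slides above.
results_matrix = [
    PlanningStatement(
        level="strategic objective",
        text="Ensure that municipalities meet basic needs of communities",
        indicators=["Capacity for better implementation of donor-funded projects"],
        targets=[],  # no target located in the plans yet
    ),
]

# Flag statements that cannot be measured as they stand.
for statement in results_matrix:
    if not statement.indicators:
        print(f"Gap: '{statement.text}' has no indicator")
    if not statement.targets:
        print(f"Gap: '{statement.text}' has no target")
```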
Evaluation Parameters
Results Matrix
Reconstructing the Intervention Logic
• Check whether the planning statements (goal, strategic objectives and measurable objectives) could be reformulated as results statements (output, outcome and impact), so that they can be measured accordingly.
• Check whether the sum of the planned components/support is sufficient to produce the intended result.
• Explicitly describe the planning assumptions and minimise the risk of failure (did we assume too much?).
• Map the indicators detailed in the Strategic Plans, APPs and Operational Plans to the remapped results matrix.
• In trying to establish the model of evaluation and the methodology to be employed, consider the following:
– remapping of the planning levels using the parameters governing logic models;
– establishing whether there are linkages between the activities, the outputs, the outcomes and the impacts (a sketch of this chain follows the diagram below).
Logic Model
[Diagram: the results chain – project resources provide the inputs for several parallel streams of activities and outputs, which feed direct outcomes; these converge into intermediate outcomes and, ultimately, the ultimate outcome.]
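The diagram’s chain can also be written down as a simple data structure, which makes the linkage check from the bullets above mechanical: every level from inputs through to the ultimate outcome must actually be populated before results higher up the chain can be attributed to the intervention. This is a minimal sketch; the level names follow the diagram, and every entry is an illustrative example loosely echoing the output and outcome slides, not a real programme logic.

```python
# Levels of the results chain, in the order shown in the diagram.
RESULTS_CHAIN = [
    "inputs", "activities", "outputs",
    "direct outcome", "intermediate outcome", "ultimate outcome",
]

# Illustrative programme logic; each level holds its planned entries.
logic_model = {
    "inputs": ["project resources"],
    "activities": ["deploy technical assistants to provincial departments"],
    "outputs": ["standard operating procedures manual published"],
    "direct outcome": ["improved quality of performance indicators"],
    "intermediate outcome": ["maturity in the M&E system"],
    "ultimate outcome": ["enhanced service delivery"],
}

# A break anywhere in the chain undermines attribution at higher levels.
for level in RESULTS_CHAIN:
    status = "ok" if logic_model.get(level) else "MISSING - broken link"
    print(f"{level:22} {status}")
```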
“Would you tell me, please,
which way I ought to go from here?”
“That depends a good deal
on where you want to get to,” said the Cat.
“I don’t much care where…” said Alice.
“Then it doesn’t matter which way you go,”
said the Cat.
“…so long as I get SOMEWHERE,”
Alice added as an explanation.
“Oh, you’re sure to do that,” said the Cat,
“if you only walk long enough.”
Evaluability Assessment
• Evaluability assessment is a brief preliminary study to determine whether
an evaluation would be useful and feasible.
• Establish the discrepancies between the rhetoric and the reality – according to Nay and Kay (1982), different levels of policy makers and programme managers may have different rhetorical models for the same programme, and they cling to these models because each is perceived as the approach that will bring about change.
• Assess the planning framework utilised and the evidence of logical linkages between the different levels of planning.
• Assess the appropriateness and plausibility of the strategic goals and strategic objectives in terms of the legislative mandate and core functions of the province.
Evaluability Assessment
• Assess the extent to which the intervention logic provides evaluators with operational benchmarks against which outputs, outcomes, impacts and assumptions can be evaluated.
• Check whether the necessary baseline and monitoring data are available.
• Explore the evaluation models to determine the degree to which they are relevant, appropriate and feasible in terms of:
– the availability of information – routine, ad hoc and research information;
– the logic used to construct the programmes;
– the timing and relevance of such an evaluation.
• Check the availability of key informants, such as planners, intervention staff and target group representatives.
• Assess the extent to which the evaluation questions can be answered, given the timing of the evaluation in relation to the current phase of the intervention cycle. (A condensed checklist sketch follows this list.)
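Since an evaluability assessment is essentially a short yes/no screen of the points above, it can be condensed into a checklist like the sketch below. The criteria are paraphrases of the bullets on these two slides, and the answers are illustrative, standing in for what a real preliminary study would find.

```python
# Illustrative evaluability screen; each criterion paraphrases a bullet above.
checks = {
    "operational benchmarks exist for outputs, outcomes and impacts": True,
    "baseline data are available": False,
    "monitoring data are available": True,
    "programme logic links activities to outcomes": True,
    "key informants are reachable (planners, staff, target groups)": True,
    "timing fits the current phase of the intervention cycle": False,
}

met = sum(checks.values())
print(f"{met}/{len(checks)} evaluability criteria met")
for criterion, ok in checks.items():
    if not ok:
        print(f"Not yet evaluable on: {criterion}")
# A full evaluation is useful and feasible only once the failed checks
# are addressed (e.g. reconstruct a baseline, or defer impact questions).
```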
The Evaluation
Measuring change and inferring causality
• Relevance
– The extent to which a development intervention conforms to the needs and priorities of target groups and the policies of recipient countries and donors.
• Sustainability
– The continuation or longevity of benefits from a development intervention after the cessation of development assistance.
• Efficiency
– The extent to which the costs of a development intervention can be justified by its results, taking alternatives into account.
– Assumes that there has been effectiveness.
• Other evaluation criteria
– Appropriateness – tailored to local needs and changing demands
– Coverage – evenness, comprehensiveness
– Connectedness – co-operation
– Coherence and alignment
Components of an Evaluation
• There are three general components to a comprehensive evaluation:
– Process evaluation: How was the strategy/programme implemented?
– Outcome evaluation: Did the strategy/programme meet its objectives?
– Impact evaluation: Was the ultimate goal of the strategy/programme achieved?
Every Operational Plan/Programme has…  |  Every evaluation should have…
Goals                                  |  Impact Indicators
Objectives                             |  Outcome Indicators
Activities                             |  Process Indicators
What is Process Evaluation?
• Process evaluation:
– Addresses how, and how well, the programme is functioning
• It can help to…
– Create a better learning environment
– Improve presentation skills
– Show accountability to funders
– Reflect the target populations
– Track service units
Process Evaluation cont’d
• Key questions in process evaluation:
– Who is served?
– What activities or services are provided?
– Where, when, and for how long is the programme delivered?
• Identify how a product or outcome is produced
• Identify strengths & weaknesses of a programme
• Create a detailed description of the programme
Every Strategic Plan/Operational Plan/Programme has…  |  Every programme evaluation should have…
Goals                                                 |  Impact Indicators
Objectives                                            |  Outcome Indicators
Activities                                            |  Process Indicators
Outcome Evaluation
• Outcome evaluation:
– Measures the extent to which a strategy/programme produces its intended improvements
– Examines effectiveness, goal attainment and unintended outcomes
– In simple terms: “What’s different as a result of your efforts?”
Outcome Evaluation cont’d
• Key questions in outcome evaluation:
– To what degree did the desired change(s) occur?
• Outcomes can be immediate, intermediate or longer-term
• Outcomes can be measured at the patient, provider, organisation or system level
Every Strategic Plan/Programme has…  |  Every evaluation should have…
Goals                                |  Impact Indicators
Objectives                           |  Outcome Indicators
Activities                           |  Process Indicators
Impact Evaluation
• “Impact” is sometimes used to mean “ultimate outcome”.
• Impact is perhaps better defined as a longer-term or ultimate outcome.
Results-Based Evaluation
Results-based evaluation is an assessment of a planned, ongoing or completed intervention to determine its relevance, efficiency, effectiveness, impact and sustainability. The intent is to incorporate lessons learned into the decision-making process.
Information (MIS) Use
• The focus should be on knowledge generation.