Monitoring and Evaluation Framework

SRSA M&E FRAMEWORK 2010
MONITORING AND EVALUATION FRAMEWORK
JANUARY 2011
TABLE OF CONTENTS

GLOSSARY OF TERMS
1. OVERVIEW
2. INTRODUCTION
3. PURPOSE OF THE M&E FRAMEWORK IN SRSA
4. THE OBJECTIVE OF THE M&E FRAMEWORK IS TO ARTICULATE THE ROLE THAT M&E WILL PLAY IN ENSURING THE EFFECTIVE M&E OF THE ORGANISATION BY:
5. M&E GUIDING PRINCIPLES
6. LEGISLATIVE FRAMEWORK
6.1 CONSTITUTION OF THE REPUBLIC OF SOUTH AFRICA, 1996
6.2 PUBLIC FINANCE MANAGEMENT ACT (PFMA), 1999
6.3 NATIONAL SPORT AND RECREATION AMENDMENT ACT, 1998 (AS AMENDED)
6.4 THE WHITE PAPER ON TRANSFORMING PUBLIC SERVICE DELIVERY (BATHO PELE WHITE PAPER), 1997
6.5 PUBLIC SERVICE ACT, 1994
6.6 POLICY FRAMEWORK FOR A GOVERNMENT-WIDE MONITORING AND EVALUATION SYSTEM (GWM&E), 2007
6.7 FRAMEWORK FOR MANAGING PROGRAMME PERFORMANCE INFORMATION (NATIONAL TREASURY, 2007)
6.8 FRAMEWORK FOR STRATEGIC PLANS AND ANNUAL PERFORMANCE PLANS (NATIONAL TREASURY, 2010)
6.9 IMPROVING GOVERNMENT PERFORMANCE: OUR APPROACH (PRESIDENCY PERFORMANCE MONITORING AND EVALUATION, 2009)
6.10 SOUTH AFRICAN STATISTICS QUALITY ASSESSMENT FRAMEWORK (SASQAF)
6.11 NATIONAL TREASURY GUIDELINES
6.12 PUBLIC AUDIT ACT, 25 OF 2004
6.13 DIVISION OF REVENUE ACT (DORA)
7. DEPARTMENTAL MANDATE
8. DESCRIPTION OF FUNCTIONAL RESPONSIBILITIES (REFERRED TO AS SRSA PROGRAMMES)
9. STAKEHOLDERS
10. SRSA'S APPROACH TO M&E
10.1 ROLES AND RESPONSIBILITIES FOR M&E IN SRSA
10.2 STRATEGIC PLANNING, MONITORING AND EVALUATION IN SRSA
10.2.1 Strategic planning
10.2.2 Monitoring and evaluation
10.2.3 Relationship between strategic planning, monitoring, evaluation and reporting
10.3 THE LOGIC MODEL
10.4 PERFORMANCE INDICATORS, BASELINES AND TARGETS
10.5 MONITORING APPROACH IN SRSA
10.6 EVALUATION APPROACH IN SRSA
10.6.1 Types of evaluations
10.7 EVALUATION REPORTING
10.8 DATA DISSEMINATION MANAGEMENT
10.9 DATA QUALITY MANAGEMENT
11. M&E FORUM
12. M&E CAPACITY BUILDING
13. REFERENCES
Tables
Table 1.1: Roles and responsibilities for M&E within SRSA
Table 1.2: Example of a logic model for SRSA
Table 1.3: Level of indicators for logic model
Table 1.4: Monitoring approach in SRSA
Table 1.5: Evaluation types and functions

Figures
Figure 1: Relationship between planning, monitoring, evaluation and reporting
Figure 2: Logic model
The Power of Measuring Results

• If you do not measure results, you cannot tell success from failure.
• If you cannot see success, you cannot reward it.
• If you cannot reward success, you are probably rewarding failure.
• If you cannot see success, you cannot learn from it.
• If you cannot recognize failure, you cannot correct it.
• If you can demonstrate results, you can win public support.

(Adapted from Osborne & Gaebler, 1992)
ABBREVIATIONS AND ACRONYMS
AG: Auditor-General
CFO: Chief Financial Officer
COO: Chief Operations Officer
DG: Director-General
GWME: Government-Wide Monitoring and Evaluation
M&E: Monitoring and Evaluation
NF: National Federation
NGO: Non-Governmental Organisation
NSRA: National Sport and Recreation Amendment Act
PFMA: Public Finance Management Act
SASQAF: South African Statistical Quality Assessment Framework
SRSA: Sport and Recreation South Africa
GLOSSARY OF TERMS
Accountability: An agency, organisation or individual’s obligation to demonstrate and
take responsibility for performance in light of agreed expectations (Public Finance
Management Act, 29 of 1999).
Activities: The processes or actions that use a range of inputs to produce the desired outputs and ultimately outcomes. In essence, activities describe "what we do". Their results can start to show in 0 to 1 year (Framework for Strategic Plans and Annual Performance Plans, 2010).
Baseline Data: The first measurement of an indicator. It sets the current condition
against which future change can be tracked (Handbook on Impact Evaluation:
Quantitative Methods and Practices, 2009).
Data Integrity: Established mechanisms should be in place to reduce the possibility that data collected, analysed and reported are manipulated for political or personal reasons (Handbook on Impact Evaluation: Quantitative Methods and Practices, 2009).
Data Precision: Data should be sufficiently accurate to present a fair picture of the
results achieved and enable SRSA to make confident management and policy decisions
(Handbook on Impact Evaluation: Quantitative Methods and Practices, 2009).
Data Reliability: Data should reflect stable and consistent data collection approaches,
including analysis, over time (Handbook on Impact Evaluation: Quantitative Methods and
Practices, 2009).
Data Timeliness: Data should be sufficiently current and available frequently enough to inform management decision-making at the appropriate levels (Handbook on Impact Evaluation: Quantitative Methods and Practices, 2009).
Data Validity: Data are valid to the extent that they clearly, directly and adequately represent the performance or result they are intended to measure. Unrepresentative samples and transcription errors could compromise the validity of reported data (Handbook on Impact Evaluation: Quantitative Methods and Practices, 2009).
Effectiveness: Measures the extent to which an objective has been achieved or how
likely it is to be achieved (Public Finance Management Act, 29 of 1999).
Efficiency: Assesses how inputs/resources are converted into outputs (Public Finance
Management Act, 29 of 1999).
Evaluation: The systematic and objective assessment of an ongoing or completed project, programme, or policy, including its design, implementation, and results. The aim is to determine the relevance and fulfilment of objectives, development efficiency, effectiveness, impact, and sustainability. An evaluation should provide information that is credible and useful, enabling the incorporation of lessons learned into the decision-making process of both recipients and donors (Handbook on Impact Evaluation: Quantitative Methods and Practices, 2009).
Formative evaluation: Intended to improve performance and is most often conducted
during the implementation phase of projects or programmes. It may also be conducted
for other reasons such as compliance, legal requirements or as part of a larger
evaluation initiative (International Programme for Development Evaluation Training
Handbook, 2007).
Impact evaluation: Examines whether underlying theories and assumptions were valid, what worked, what did not, and why. It can also be used to extract cross-cutting lessons from operating unit experiences and to determine whether modifications to the strategic results framework are needed (Handbook on Impact Evaluation: Quantitative Methods and Practices, 2009).
Impact: The results of achieving specific outcomes, such as reducing poverty and creating jobs. Impacts are "how we have actually influenced communities and target groups". Generally, changes start to occur in 5 to 10 years (Handbook on Impact Evaluation: Quantitative Methods and Practices, 2009).
Inputs: Resources that contribute to the production and delivery of outputs. Inputs are
"what we use to do the work". They include finances, personnel, equipment and
buildings (Framework for Strategic Plans and Annual Performance Plans, 2010).
Institutional Monitoring and Evaluation: Focuses on assessing and analysing the overall performance of the organisation to ascertain whether it is effectively and efficiently fulfilling its mandate and realising its full potential (Handbook on Impact Evaluation: Quantitative Methods and Practices, 2009).
Logic model: Illustrates the cause-effect relationship between activities and outputs through to the final results (outcomes and impacts). It is a visual way of expressing the rationale, thought process, or theory behind an organisation, programme, or initiative, and a representation of how the organisation or initiative is expected to lead to the results (International Programme for Development Evaluation Training Handbook, 2007).
Milestones: Used as indicators of how far you have travelled within a given time period
(performance) or how far you are from reaching your target destination (achievement). In
terms of programme monitoring the milestones describe the expected progress towards
the implementation of activities (events), delivery of outputs (progress) and achievement
of the outcome within a specific time period, e.g. a quarter or a year (Framework for Strategic Plans and Annual Performance Plans, 2010).
Monitoring and Evaluation (M&E): The combination of monitoring and evaluation
which together provide the knowledge required for: a) adaptive project management, b)
reporting and accountability responsibilities, c) learning and d) empowering the primary
stakeholders (Handbook on Impact Evaluation: Quantitative Methods and Practices,
2009).
Monitoring and Evaluation system (M&E system): A set of organisational
structures, management processes, standards, strategies, plans, indicators, information
systems, reporting lines and accountability relationships which enables institutions to
discharge their M&E functions effectively (Policy Framework for the Government-wide
Monitoring and Evaluation System, 2007).
Monitoring: Involves collecting, analysing, and reporting data on inputs, activities, outputs, outcomes and impacts, as well as external factors, in a way that supports effective management. It is a continuous function that uses the systematic collection of data on specified indicators to provide management and the main stakeholders of an ongoing development intervention with indications of the extent of progress, the achievement of objectives, and progress in the use of allocated funds. It reports on actual performance against what was planned or expected (Policy Framework for the Government-wide Monitoring and Evaluation System, 2007).
Outcomes: The medium-term results for specific beneficiaries that are the consequence
of achieving specific outputs. Outcomes should relate clearly to an institution’s strategic
goals set out in its plans. Outcomes are “what we want to achieve”. Outcomes are often
further categorised into immediate/direct outcomes and intermediate outcomes.
Generally changes start to occur in 2 to 5 years (Framework for Managing Programme
Performance Information, 2007).
Outputs: The final products, or goods and services produced for delivery. Outputs may
be defined as “what we produce or deliver”. Their results can start to be shown in 0 to 2
years (Framework for Managing Programme Performance Information, 2007).
Performance indicator: A pre-determined signal that a specific point in a process has been reached or a result achieved. The nature of the signal depends on what is being tracked and needs to be chosen very carefully. In management terms, an indicator is a variable used to assess the achievement of results in relation to the stated goals/objectives (Framework for Managing Programme Performance Information, 2007).
Performance information: Indicates how well an institution is meeting its aims and
objectives, and which policies and processes are working. It is key to effective
management, including planning, budgeting, implementation, monitoring and reporting.
Performance information also facilitates effective accountability, enabling legislators,
members of the public and other interested parties to track progress, identify the scope
of improvement and better understand the issues involved (Framework for Managing
Programme Performance Information, 2007).
Performance targets: Express a specific level of performance that the institution,
programme or individual aims to achieve within a given period (Framework for Managing
Programme Performance Information, 2007).
Plan: Contains the detailed prescription of actions towards the achievement of the objectives of the strategy (Framework for Strategic Plans and Annual Performance Plans, 2010).
Policy: A guiding principle used to set direction in an organisation. It can be a course of action to guide and influence decisions. It should be used as a guide to decision-making under a given set of circumstances, within the framework of objectives, goals and management philosophies as determined by senior management. Policies are statements of what government seeks to achieve through its work and why (Framework for Managing Programme Performance Information, 2007).
Service Delivery Monitoring and Evaluation: The monitoring and assessment of the quality of services rendered in the provinces (Handbook on Impact Evaluation: Quantitative Methods and Practices, 2009).
Summative evaluations: The evaluation studies conducted at the end of an intervention
to determine the extent to which the anticipated outcomes were produced (International
Programme for Development Evaluation Training Handbook, 2007).
1. OVERVIEW
When government is voted into office, an inevitable contract of accountability is
entered into between government and the citizens it serves. It is therefore incumbent
on government to render efficient and effective service. Government can be more
accountable and effective if monitoring and evaluation systems are put in place.
Government's approach to monitoring and evaluating its programmes is results-based. This approach moves away from traditional M&E, which places emphasis on inputs, activities and outputs. Results-based monitoring and evaluation can assist government by assessing its performance and identifying the factors that contribute to service delivery outcomes which bring about positive change in people's lives. The effectiveness and efficiency of government will in future receive more emphasis.
2. INTRODUCTION
Sport and Recreation South Africa (SRSA) is committed to improving service delivery and has therefore developed this monitoring and evaluation framework in line with the Policy Framework for the Government-wide Monitoring and Evaluation System (GWM&E), which emphasises effectiveness and the integration of services, and requires that M&E roles and responsibilities be reflected in each work plan and performance agreement. The GWM&E Policy Framework requires every government department to formally adopt an M&E strategy or framework that explains how the department will discharge its M&E function. It is from this premise that the SRSA M&E framework was developed.
This monitoring and evaluation framework is important to ensure that SRSA remains
accountable and effective in terms of performance information.
The M&E framework will also focus on service delivery monitoring and evaluation to assess the effectiveness and efficiency of the programmes and projects run by provincial departments, National Federations, public entities and NGOs funded by SRSA to deliver services on its behalf. The Directorate: Strategic Management, Monitoring and Evaluation will play an oversight role in assisting provincial and local departments, National Federations, public entities and NGOs to deliver quality services and to improve the services they render to the public.
This framework will serve as a centralised, department-wide M&E framework. It will guide how programme managers monitor and evaluate the implementation of their projects and programmes.
The M&E framework will promote the achievement of the vision and mission of SRSA as stated in the strategic plan, on which the framework is based. It will be used to monitor and evaluate organisational performance; this is called institutional monitoring and evaluation. The framework also aims to improve the current M&E system, improve performance and assist the department in obtaining an unqualified audit report annually.
The framework will be used to inform steps to be undertaken by SRSA to achieve
goals in service delivery and improve performance. It will explain how data will be
collected, when, by whom, where, as well as the data flow processes. The M&E
templates for collecting data and reporting also form a critical component of this
framework.
3. PURPOSE OF THE M&E FRAMEWORK IN SRSA
The purpose of the M&E framework is to monitor, assess and analyse the overall
performance of the department in terms of SRSA’s mandate and legislation. The
framework will serve to monitor, assess and analyse the quality of services delivered
by SRSA. It is also intended to improve services rendered by provincial departments,
National Federations, public entities and NGOs. It is intended to provide a step-by-step approach to the processes, procedures and methods for monitoring and evaluating organisational performance. The framework also serves as the key driver of a paradigm shift in organisational performance, moving away from an input/output-based system to a results-based approach.
4. THE OBJECTIVE OF THE M&E FRAMEWORK IS TO ARTICULATE THE ROLE THAT M&E WILL PLAY IN ENSURING THE EFFECTIVE M&E OF THE ORGANISATION BY:
• Monitoring and evaluating the implementation of the strategic plan.
• Providing clear M&E processes that enable the systematic collection, collation, processing, analysis and interpretation of data.
• Providing a basis for decision-making on improving the performance of SRSA.
• Promoting effective and efficient use of resources.
• Monitoring and evaluating service delivery within SRSA.
• Assessing overall satisfaction with the services rendered to SRSA clients.
• Playing an oversight role in monitoring and evaluating the projects and programmes of provincial departments, public entities and NGOs on a quarterly basis.
• Improving reporting systems within SRSA.
• Promoting a culture of continuous learning and improvement.
• Improving programmes by identifying those aspects that are working according to plan and those in need of mid-course correction.
• Disseminating best-practice findings for improved project and programme performance.
• Ensuring proper coordination and standardisation of the processes and procedures used for monitoring and evaluation.
• Evaluating the extent to which each programme is having, or has had, the desired impact.
5. M&E GUIDING PRINCIPLES
As underlying principles, this M&E framework will:
• Be accessible to and understood by all stakeholders involved in the implementation of the strategic plan within SRSA.
• Encourage a rights-based approach to ensure that SRSA clients enjoy their constitutional rights in terms of the services rendered by SRSA.
• Contribute to improved governance, such as accountability and transparency.
• Promote partnerships and work together with other programmes and sub-programmes to achieve strategic objectives.
• Maintain the highest standards of ethical behaviour, honesty and professional integrity.
• Not be biased towards any programme, and maintain objectivity in all reporting.
• Employ the logic model.
• Be integrated into existing systems and be acceptable to stakeholders. This will ensure proper data flows from sub-programme to programme managers, and finally to the Strategic Management, Monitoring and Evaluation Directorate.
• Ensure consistency in the use of M&E instruments and data collection methods, as well as the sourcing of information from different sources (triangulation), to ensure credible and valid data.
• Be methodologically sound.
• Provide a record of findings and recommendations that will be maintained, and whose implementation will be followed up.
• Create a culture of learning based on using monitoring and evaluation information as a basis for decision-making and accountability relationships for management and governance purposes.
6. LEGISLATIVE FRAMEWORK
There are legislative frameworks that drive and support the M&E framework of SRSA.
6.1 Constitution of the Republic of South Africa, 1996
The Constitution affirms the basic values and principles governing public administration, which must be complied with by all government departments in terms of section 195 of the Constitution of the Republic of South Africa, Act 108 of 1996, as amended. In terms of schedule 4 and section 44 of the Constitution, SRSA has been assigned the powers and functions to develop and implement national policies and programmes regarding sport and recreation in the country. South Africans have the right to receive quality services from SRSA. The M&E framework will serve as a tool to contribute to the quality of services rendered by provincial departments, NFs and NGOs.
6.2 Public Finance Management Act (PFMA), 1999
The PFMA ensures that scarce government resources are utilised in an effective,
efficient and economic manner. It emphasises the importance of moving away from an
input-based budgeting system to an output-based results-oriented system. In terms of
budgeting and financial management, the focus is on obtaining value-for-money from
each department within government for every rand spent.
The PFMA promotes flexibility in the management of resources by ensuring that accountability for the efficient and effective use of resources does not remain the preserve of treasury or finance alone, but is devolved to line managers, who in turn are accountable for their particular responsibilities. The PFMA governs accountability for performance management.
6.3 National Sport and Recreation Amendment Act, 1998 (as amended)
The Act provides for SRSA to enter into service level agreements with National Federations, enabling it to oversee, monitor and evaluate the implementation of policies by the National Federations in the country. The M&E framework will help National Federations to remain accountable for funds received from the department.
6.4 The White Paper on Transforming Public Service Delivery (Batho Pele White
Paper), 1997
The Batho Pele strategy on service delivery was developed to introduce a new approach that puts people at the centre of planning and delivering services, to improve the face of service delivery through increased commitment, personal sacrifice and dedication, and to improve the image of the Public Service. Services can be delivered effectively and improved if SRSA has an effective M&E framework in place and implements it.
6.5 Public Service Act, 1994
This Act ensures that there is improved governance through direct accountability and decision-making as close as possible to the point of service delivery. SRSA can be more accountable if the M&E framework is well understood and accessible to officials in the department. The M&E framework will support corporate governance in the department, including provinces and NGOs.
6.6 Policy Framework for a Government-Wide Monitoring and Evaluation
System (GWM&E), 2007
The Policy Framework for a Government-Wide Monitoring and Evaluation (GWM&E) system was developed to provide decision-makers in all government agencies and departments with easy access to reliable information that would contribute towards the management of their own processes, by indicating which of their practices and strategies worked well and which needed to be changed or improved. According to the GWM&E system, government's major challenge is to become more effective in service delivery.

The GWM&E system identifies three critical data terrains, namely programme performance information; social, economic and demographic statistics; and evaluations.
6.7 Framework for Managing Programme Performance Information (National
Treasury, 2007)
The framework states that the organisation’s performance should be measured to see
if it meets its aims and objectives. It promotes accountability and transparency by
providing parliament, provincial legislatures, municipal councils and the public with
timely, accessible and accurate performance information. It also defines roles and
responsibilities for managing programme performance information. National Treasury
developed the Framework for Managing Programme Performance Information to
provide guidance on the management of programme performance information. It
states that government institutions should develop M&E systems to collect, collate,
verify and store information. The aim of the Framework for Managing Programme
Performance Information is to support regular audits of non-financial performance
information in government institutions.
6.8 Framework for Strategic Plans and Annual Performance Plans (National
Treasury, 2010)
The Framework for Strategic Plans and Annual Performance Plans emphasises the importance of results-based management. It supports the Policy Framework for the Government-Wide Monitoring and Evaluation System, which encourages government to focus on results-based monitoring and evaluation.
6.9 Improving Government Performance: Our Approach (Presidency Performance Monitoring and Evaluation, 2009)
It states the importance of focusing more on the outcomes rather than the outputs to
improve service delivery. It also emphasises the establishment of delivery forums in
order to refine and provide more detail to the outputs, targets, indicators and key
activities for each outcome.
6.10 South African Statistics Quality Assessment Framework (SASQAF)
SASQAF emphasises the importance of using statistics to evaluate and measure the impact of policies, to estimate progress in meeting national priorities such as economic growth and job creation, and to assess the success of initiatives aimed at reducing scourges such as crime and poverty. SASQAF is used to evaluate the quality of statistics.
6.11 National Treasury Guidelines
National Treasury guidelines are aimed at improving the cost-efficiency of public spending in order to achieve the key outcomes targeted by government. The new outcomes approach to budgeting encompasses a new approach to the planning, budgeting and monitoring of service delivery in line departments. Strong focus is given to progress against identified outcomes as well as other departmental mandates.
6.12 Public Audit Act, 25 of 2004
This Act requires the Auditor-General's audit reports to reflect an opinion or conclusion on the reported information relating to the performance of the auditee against predetermined objectives. Auditees include constitutional institutions, departments, trading entities and other institutions, as indicated by sections 4(1) and 4(3) of the Act.
6.13 Division of Revenue Act, (DoRA)
The DoRA provides for the equitable division of revenue raised nationally among the national, provincial and local spheres of government. It is used to fund sport support services and mass participation programmes in provinces. It is therefore important that a monitoring and evaluation system be put in place to monitor and evaluate the implementation of the funded projects.
7. DEPARTMENTAL MANDATE
The mandate of SRSA is to create an enabling environment to ensure that as many
South Africans as possible have access to sport and recreation activities, especially
those from disadvantaged communities. Furthermore, the department will endeavour
to increase international sport successes by strengthening performances at all levels
of participation.
The Directorate: Strategic Management, Monitoring and Evaluation will ensure that
departmental programmes are monitored and evaluated according to the mandate of
the department.
8. DESCRIPTION OF FUNCTIONAL RESPONSIBILITIES (REFERRED TO AS
SRSA PROGRAMMES)
Programme 1: Administration
Purpose: Provide management, strategic and administrative support services.
Programme 2: Sport Support Services
Purpose: Support recognised sport and recreation bodies and public entities, and
monitor and report on their performance.
Programme 3: Mass Participation
Purpose: Create an enabling environment and provide support to increase the number
of participants in sport and recreation in South Africa.
Programme 4: International Liaison and Events
Purpose: Coordinate inter- and intra-governmental sport and recreation relations and support the hosting of identified major events.
Programme 5: Facilities Coordination
Purpose: Facilitate the provision and management of sustainable sport and recreation
facilities.
9. STAKEHOLDERS
The following key stakeholders contribute to SRSA processes and also benefit from this M&E framework:
• The Minister of SRSA: The M&E framework is intended to guide the Minister on the performance of the department. As the most senior decision-maker in SRSA, the Minister requires information of a certain type: strategic, analytical, outward-oriented and concerned with integration and optimising relationships.
• Portfolio Committee and members of Parliament: The framework is intended to inform members of Parliament on the performance of SRSA. In order to exercise oversight and ensure accountability by the department, the Portfolio Committee and the members of Parliament need accurate and reliable information and reports, provided consistently and on time.
• Programme managers and staff: The framework is intended to guide programme managers and staff on how to monitor progress made with regard to departmental goals and objectives as articulated in the strategic plan.
• NGOs delivering sport and recreation services: The framework is intended to guide NGOs on how to render a quality service by optimally utilising the resources received from SRSA.
• National Treasury: Accurate non-financial data will be submitted to National Treasury.
• Auditor-General: The framework will be of assistance to the Auditor-General because it will contribute to improved governance and an unqualified report.
• National Federations: The framework will guide the National Federations on how to monitor and evaluate the implementation of their projects.
• Clients of SRSA: The framework is intended to guide and encourage SRSA to apply the Batho Pele principles when dealing with its clients.
10. SRSA’s APPROACH TO M&E
10.1 Roles and responsibilities for M&E in SRSA
Effective management of performance information requires a clear understanding of
different responsibilities involved in managing performance. A number of officials play
a key role in ensuring that monitoring, reporting and programme evaluation are
competently undertaken within the department.
Table 1.1: Roles and responsibilities for M&E within SRSA

Minister
• Uses M&E findings on institutional performance to ensure that the desired outcomes and impact are achieved.
• Provides Parliament and Cabinet with detailed reports.

Director-General/Accounting Officer
• Provides strategic leadership and management, as well as overall administrative, governance and performance oversight.
• Is accountable for the organisation's performance.
• Provides strategic support to the Ministry and serves as an interface between the department and Parliament.
• Ensures that departmental strategies and goals feed into the broader government objectives and priorities.
• Provides strategic leadership in intergovernmental programmes within the sector as well as sector partnerships.

Chief Operations Officer
• Assists the Director-General in providing strategic support and leadership of the department.
• Serves as the focal point for performance information in the department.
• Ensures that systems are in place for the audit of performance information.

Chief Financial Officer
• Ensures that budget allocation is linked to performance information.
• Accounts for the efficient and effective use of the department's financial expenditure.
• Prepares financial reports.

Strategic and Executive Support
• Renders strategic support to the Minister, Deputy Minister and Director-General.
• Coordinates strategic planning, monitoring and evaluation responsibilities in the department.
• Ensures that the Strategic Management and M&E sub-units sit together to develop indicators.

Strategic Management, Monitoring and Evaluation
• Updates/revises the M&E framework as required to reflect the department's changing M&E needs and environment.
• Prioritises M&E deliverables based on the M&E capacity/M&E staff within the unit.
• Monitors and evaluates targeted projects of SRSA.
• Follows the monitoring approach set out in Table 1.4.

SRSA programmes
• Monitor and evaluate the implementation of projects and programmes/sub-programmes in their areas of responsibility. (Routine monitoring of programmes is every manager's function.)
• Follow the monitoring approach set out in Table 1.4.

Internal Audit
• Assesses the adequacy and effectiveness of internal controls and systems put in place by the department to collect, monitor and report performance information.
• Evaluates the quality of performance information reported in the QSRM reports and the annual report for consistency with predetermined objectives and KPIs.
• Evaluates compliance with relevant legislation.
10.2 Strategic planning, monitoring and evaluation in SRSA
There is a relationship between strategic planning and monitoring and evaluation.
10.2.1 Strategic planning
Strategic planning is a process by which SRSA establishes its purpose and objectives, and formulates actions designed to achieve these objectives within the desired time-scale. The SRSA strategic plan will focus on strategic outcome-oriented goals, strategic goal indicators and strategic objectives. SRSA will use its strategic plan as a tool to promote and plan the progressive implementation of its legislative mandate, policies and programmes.
10.2.2 Monitoring and evaluation
SRSA will use monitoring and evaluation as a management tool to monitor and
evaluate the implementation of the strategic plan to determine whether it is achieving
its intended strategic objectives (outputs) and the desired strategic goals (outcomes).
10.2.3 Relationship between strategic planning, monitoring, evaluation and
reporting
SRSA, like any other department, develops and/or updates its five-year strategic plan annually. The M&E framework assists the department to monitor and evaluate the implementation of the strategic plan. Reports with findings are compiled and evaluated to provide feedback on the successful implementation of the strategic plan. The process becomes a cycle. It should be noted that planning, monitoring and evaluation also complement each other.
The figure below shows the relationship between planning, monitoring, evaluation and
reporting:
Figure 1: Relationship between strategic planning, monitoring, evaluation and reporting

[Figure 1 is a timeline diagram spanning fiscal years 2009 to 2021. It aligns the five-year electoral cycle and election mandates with the planning and budgeting instruments (five-year strategic plans, and annual MTEF budgets and Annual Performance Plans), in-year reporting (monthly financial reports and quarterly performance reports), end-year reporting (annual reports with annual financial statements) and long-term reporting (end-term performance reviews).]
10.3 The Logic model
The M&E framework will adopt the logic model because it promotes the results-based
approach.
Figure 2: Logic model

• IMPACTS (What do we aim to change?): The developmental results of achieving specific outcomes.
• OUTCOMES (What do we wish to achieve?): The medium-term results for specific beneficiaries that are the consequence of achieving specific outputs.
• OUTPUTS (What do we produce or deliver?): The final products, or goods and services produced for delivery.
• ACTIVITIES (What do we do?): The processes or actions that use a range of inputs to produce the desired outputs and ultimately outcomes.
• INPUTS (What do we use to do the work?): The resources that contribute to the production and delivery of outputs.

(Outcomes and impacts are the results the organisation manages towards achieving; inputs, activities and outputs are what it plans, budgets, implements and monitors.)
The figure indicates the shift away from outputs towards the ultimate results, namely outcomes and impacts. The aim is to assess the relevance, effectiveness and sustainability of the programmes and policies within the organisation. SRSA will adopt this model: it will move away from outputs as the final result and pay more attention to the outcomes and impacts of its programmes.
The figure further shows the components of a logic model, namely inputs, activities, outputs, outcomes and impacts, and indicates the logical relationship between them. The inputs, activities and outputs can be used as measures of efficiency, whereas the results (outcomes) can be used to evaluate programme effectiveness.
It should be noted that outcomes and impacts are regarded as results. Results are the changes or differences that follow from the project outputs, and are usually expressed with a direction of change (e.g. increased, decreased, enhanced, improved, maintained). The Directorate: Strategic Management, Monitoring and Evaluation will ensure that SRSA employs this logical relationship. This will help in clarifying the objectives of a given project, programme or policy, and in identifying the causal links between inputs, activities, outputs, outcomes and impact.
The logic model provides a clear and logical argument demonstrating how project, programme and policy activities will deliver the intended outcomes, noting the important causal mechanisms. Managers will also be able to differentiate between "what they do" and the "results" (outcomes).
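To make the causal chain concrete, the following sketch represents one results chain from Table 1.2 as a simple data structure. It is an illustration only, not part of the framework: the Level class and all field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Level:
    """One level of the logic model (e.g. inputs, outputs)."""
    name: str       # e.g. "Outputs"
    question: str   # the guiding question, e.g. "What do we produce or deliver?"
    items: list     # entries at this level, e.g. ["Sport facilities built"]

# A hypothetical results chain based on the sport-facilities row of Table 1.2.
chain = [
    Level("Inputs", "What do we use to do the work?",
          ["Budget approved", "Staff employed"]),
    Level("Activities", "What do we do?",
          ["Build sport facilities", "Sport training takes place"]),
    Level("Outputs", "What do we produce or deliver?",
          ["Sport facilities built", "Sport training received"]),
    Level("Outcomes", "What do we wish to achieve?",
          ["Improved participants' skills", "Sport facilities utilised"]),
    Level("Impacts", "What do we aim to change?",
          ["Better international performances"]),
]

# Walk the causal chain from inputs through to impacts.
for level in chain:
    print(f"{level.name} ({level.question}): {'; '.join(level.items)}")
```

Ordering the levels explicitly in this way makes the cause-effect direction of the model visible at a glance, which is the point the figure conveys visually.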
SRSA will use the logic model to link the resources, services, products, outcomes and impacts of projects, programmes and policies in a hierarchy. The table below explains the components of a logic model as stated in Figure 2.
Table 1.2: Example of a logic model for SRSA

Inputs: Budget approved; staff employed.
Activities: Build sport facilities; sport training takes place.
Outputs: Sport facilities built; sport training received.
Outcomes: Improved participants' skills; sport facilities utilised.
Impact: Better international performances.

Inputs: Budget approved; staff employed.
Activity: Training of coaches.
Output: Coaches trained.
Outcome: Improved coaching skills.
Impact: Better international performances.

Inputs: Budget approved; staff employed.
Activity: SRSA funds SASCOC for team preparation and team delivery.
Outputs: Teams prepared; teams delivered.
Outcome: Improved team performance.
Impact: International performance.

Inputs: Budget approved; staff employed.
Activity: SRSA funds NFs to develop clubs.
Output: Clubs developed.
Outcome: More opportunities to participate in sport.
Impact: Healthy lifestyle through sport.

Inputs: Budget approved; staff employed.
Activity: Establishing mass participation programmes.
Outputs: Mass participation programmes established; mass participation programmes implemented.
Outcome: Lifelong participation in mass participation programmes.
Impact: Active and empowered nation.
Table 1.3: Level of indicators for the above-mentioned logic model for SRSA

Input indicators: Budget approved; number of staff employed.
Activity indicators: Number of sport facilities in the process of being built; number of sport training sessions taking place.
Output indicators: Number of sport facilities built; sport training received.
Outcome indicators: Number of participants whose skills have improved; number of sport facilities utilised.
Impact indicator: Number of participants whose international performances have improved.

Input indicators: Budget approved; number of staff employed.
Activity indicator: Number of coaches receiving training.
Output indicator: Number of coaches trained.
Outcome indicator: Number of coaches whose skills have improved.
Impact indicator: Number of participants whose international performances have improved.

Input indicators: Budget approved; number of staff employed.
Activity indicator: Number of teams in the preparation process due to funds from SRSA to SASCOC.
Output indicators: Number of teams prepared; number of teams delivered.
Outcome indicator: Number of teams whose performance has improved.
Impact indicator: Number of participants whose international performances have improved.

Input indicators: Budget approved; number of staff employed.
Activity indicator: Number of clubs in the process of being developed.
Output indicator: Number of clubs developed.
Outcome indicator: Increase in the number of participants within the sporting codes.
Impact indicators: Number of participants living an active life; decrease in cholesterol levels; decrease in obesity rates.

Input indicators: Budget approved; number of staff employed.
Activity indicator: Number of mass participation programmes in the process of being implemented.
Output indicator: Number of mass participation programmes implemented.
Outcome indicator: Increase in the level of physical fitness of people.
Impact indicator: Number of healthy people in South Africa.
10.4 Performance indicators, baselines and targets
• The Directorate: Strategic Management, Monitoring and Evaluation will also develop indicators, if the need arises, to meet the requirements of M&E at an operational level.
• Performance indicators need to be "SMART" (specific, measurable, achievable, realistic, trackable), relevant, clear (precise and unambiguous), economic (available at a reasonable cost), adequate (providing a sufficient basis to assess performance), monitorable, valid and reliable; a minimal sketch of such an indicator record follows this list.
• The M&E framework will also make use of the existing baselines and targets covered in the departmental strategic plan.
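As an illustration only, the sketch below shows how an indicator with its baseline and target might be captured as a structured record. The Indicator class and the on_track check are hypothetical names under my own assumptions, not tools prescribed by this framework.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    """A hypothetical record for one performance indicator."""
    name: str                       # precise, unambiguous statement of what is measured
    unit: str                       # e.g. "facilities", "participants"
    baseline: float                 # first measurement, against which change is tracked
    target: float                   # level of performance aimed for in the period
    actual: Optional[float] = None  # latest measured value, if any

    def on_track(self) -> bool:
        """True once the latest actual value has reached the target."""
        return self.actual is not None and self.actual >= self.target

# Hypothetical example drawn from the indicators in Table 1.3.
facilities = Indicator(
    name="Number of sport facilities built",
    unit="facilities",
    baseline=0,
    target=20,
    actual=14,
)
print(facilities.on_track())  # False: 14 of the assumed target of 20 built
```

Recording baseline, target and actual together in one record makes the comparison that monitoring reports rely on explicit and auditable.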
10.5 Monitoring approach in SRSA
The Directorate: Strategic Management, Monitoring and Evaluation will play an
oversight role to ensure that the programmes/sub-programmes of SRSA are efficiently
monitored. Routine monitoring of programmes/sub-programmes is every manager’s
function.
The table below explains the monitoring approach in SRSA.
Table 1.4: Monitoring approach in SRSA

The Directorate: Strategic Management, Monitoring and Evaluation
• Develops reporting templates to meet different reporting requirements; for example, prepares the quarterly programme performance report template for the departmental programmes (see Annexure A). [Annually or when the need arises]
• Verifies the accuracy of identified data collected by programme managers. [Quarterly or when the need arises]
• Prepares a programme information evaluation report on QSRM reports, to be presented during the QSRM meetings. [By the 20th of the month following each quarter]
• Consolidates quarterly programme performance reports from the various programmes and submits the executive summary report to the Minister for his/her information. [Quarterly]
• Compiles the performance information as part of the annual report. [30 April, at the end of each financial year]
• Facilitates the development of performance indicators and targets at a strategic and operational level within the department. [Strategic planning sessions]
• The M&E sub-unit shall further develop indicators, if a need arises, to meet the requirements of M&E at operational level. [When the need arises]
• Provides technical support to managers as far as monitoring and evaluation is concerned. [When the need arises]
• Facilitates proper reporting by the different managers. [Quarterly]
• Ensures that information collected from the departmental programmes is disseminated to top management. [Quarterly]
• Keeps a decision register of key strategic issues. [Quarterly]
• Makes recommendations on the improvement of SRSA service delivery. [When the need arises]
• Plays an oversight role by randomly selecting projects at provincial level and paying site visits to monitor their implementation. The aim of this process is to check the extent to which service delivery levels comply with national norms and standards. [Quarterly]
• Assists in designing monitoring tools for the departmental programmes/sub-programmes. [Whenever necessary]

SRSA programmes/sub-programmes managers
• Directors populate the approved quarterly programme performance reporting template and submit it to their Chief Directors. [No later than the 10th of the month following each quarter]
• Chief Directors consolidate, verify and sign off their quarterly programme performance reports in the approved template and submit them to the Directorate: Strategic Management, Monitoring and Evaluation. [No later than the 15th of the month following each quarter]
• Chief Directors submit the quarterly programme performance reports accompanied by a list of complementary documents containing relevant or additional information, and make these documents available or email them to the Directorate: Strategic Management, Monitoring and Evaluation. [Quarterly]
• Chief Directors update their QSRM reports where the Directorate: Strategic Management, Monitoring and Evaluation finds gaps in the reports. [No later than the 30th of the month following each quarter]
• The quarterly programme performance reports are presented by the relevant Chief Directors. [Quarterly]
• Chief Directors provide the Directorate: Strategic Management, Monitoring and Evaluation with accurate information as far as annual report inputs are concerned. [No later than 30 April of each financial year]

Provinces, public entities, NFs and NGOs
• Monitor the implementation of their projects and compile project reports. [Quarterly]
• Analyse and verify their project reports to ensure accurate information before those reports are forwarded to SRSA. [Quarterly]
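The fixed reporting dates in the table can also be expressed programmatically. The sketch below computes the submission deadlines for a given quarter; it assumes an April-to-March financial year, and the function name and dictionary keys are my own illustrative labels, not part of the framework.

```python
import datetime

def quarter_deadlines(fy_start_year: int, quarter: int) -> dict:
    """Hypothetical reporting deadlines for one quarter.

    Assumes an April-to-March financial year, so quarter 1 runs
    April-June and reports fall due in the month following the
    quarter (July, October, January or April respectively).
    """
    month_after = {1: 7, 2: 10, 3: 1, 4: 4}[quarter]
    # Q3 and Q4 reports fall due in the next calendar year.
    year = fy_start_year if quarter <= 2 else fy_start_year + 1
    return {
        "Directors submit to Chief Directors": datetime.date(year, month_after, 10),
        "Chief Directors submit to the Directorate": datetime.date(year, month_after, 15),
        "QSRM programme information evaluation report": datetime.date(year, month_after, 20),
        "Chief Directors update QSRM reports": datetime.date(year, month_after, 30),
    }

# Example: deadlines for quarter 1 of the 2010/11 financial year.
for step, due in quarter_deadlines(2010, 1).items():
    print(f"{step}: {due}")
```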
10.6 Evaluation approach in SRSA
The Directorate: Strategic Management, Monitoring and Evaluation will design evaluation tools to be used to evaluate, for instance, the impact, sustainability, relevance and effectiveness of programmes/sub-programmes. The Directorate will further evaluate identified projects and programmes of SRSA, working as a team with the relevant programme/sub-programme managers.
10.6.1 Types of evaluations
a) Impact evaluation
It is the responsibility of the M&E unit to ensure that impact evaluations (impact assessment studies) are conducted for identified departmental projects and programmes.

Impact assessment is essentially about change and therefore asks the following questions:
• What is the impact of the project/programme on the lives of the people?
• Whether, to what extent and how are goals achieved over time?
• What are the relevance of the objectives, the efficiency, effectiveness, impact and sustainability, so that lessons learnt can be incorporated into the decision-making process?
• What is the impact of resource allocation?
Impact evaluations may be conducted on identified programmes, according to the strategic themes of the department. The following are examples of strategic themes to be considered:
• The impact of the mass participation programme, with emphasis on children, youth, women, older persons, people with disabilities and people living in rural communities.
• The impact of sport and recreation in disadvantaged communities.
• The extent to which SRSA contributes towards transformation.
• The impact of SRSA on nation building.
• The extent to which SRSA contributes towards the healthy lifestyle of the nation.
• The extent to which SRSA supports recognised sport and recreation bodies.
b) Programme evaluations/formative/mid-term evaluations
The Directorate: Strategic Management, Monitoring and Evaluation will evaluate programmes of the department halfway through the five-year period of a strategic plan and at the end of the five years. The purpose of this exercise is to determine the extent to which programme activities or services achieve their intended results, or to assess whether programmes are reaching the intended strategic goal of the department. For example, SRSA may consider assessing the results of the Mass Participation Programme.
c) Process evaluation
Process evaluations take place once activities are under way and focus on tracking the efficiency of the department or a given programme. They provide information on the extent to which planned services are being realised, how well services are being provided, in what timeframe, at what cost, and with what result. Process evaluations analyse how efficiently inputs (money, time, equipment, personnel, etc.) are being used in the creation of outputs (products, results, etc.). They help organisations analyse what they planned to do versus what they are actually achieving, and are used to make adjustments or refinements in tactics or implementation strategies. Process evaluations are often conducted informally (staff meetings, etc.) at regular intervals during the programme year to assess progress towards achieving the results. They need to be based on performance data (results from indicator data collection) as well as staff observation of projects and programmes.
The table below explains the evaluation types and their functions.

Table 1.5: Evaluation types and their functions

Impact evaluation
• Frequency of conducting evaluation study: Every 5 years, at the end of the strategic plan period.
• Questions asked: What is the impact of the mass participation programme on participants? Does the mass participation programme bring change in the lives of people?
• Tools used: Interviews, questionnaires.
• Methods used: Face-to-face interviews, focus groups.
• Responsible person: The Directorate: Strategic Management, Monitoring and Evaluation (SMME) or external evaluators.

Outcome evaluation at mid-term of the strategic plan, or at the end of the programme
• Frequency of conducting evaluation study: Mid-term of the strategic plan, i.e. the outcome evaluation will be conducted halfway through a strategic cycle.
• Questions asked: Does the mass participation programme bring institutional change, behavioural change, new knowledge and increased skills?
• Tools used: Questionnaires.
• Methods used: Survey.
• Responsible person: Directorate: SMME or external evaluators.

Process evaluation, i.e. during the implementation of the programme
• Frequency of conducting evaluation study: After year one of the strategic plan, or while the particular programme is still in its implementation phase.
• Questions asked: Is the programme reaching its set target? What are the challenges identified? Are proposed interventions working? Were projects reported according to guidelines? Do we improve with time?
• Tools used: Quarterly data/quarterly reports.
• Methods used: Data analysis.
• Responsible person: SRSA programme/sub-programme managers working as a team with the Directorate: SMME.
10.7 Evaluation reporting
The Directorate: Strategic Management, Monitoring and Evaluation will perform the following reporting functions:
• Report the results of impact evaluations conducted for identified programmes.
• Use evaluation findings to enhance evidence-based decision-making and accountability in SRSA, and feed them back into policy development or review mechanisms.
• Evaluate the results of services rendered by SRSA.
10.8 Data dissemination management
The Directorate: Strategic Management, Monitoring and Evaluation will manage the information in the following manner:
o The (research) reports will be emailed to stakeholders.
o The research reports will also be published on the SRSA website.
o The information will be stored at the SRSA information centre.

The Directorate: Strategic Management, Monitoring and Evaluation will assist the SRSA information centre in locating and keeping documents such as:
• Copies of progress reports and of any relevant progress or evaluation reports on projects/programmes by other donor agencies, NGOs, etc.
• Copies of reports of all the surveys and research conducted in the domain of the project/programme.
• Copies of the periodic SRSA newsletter and other newsletters and printed media relevant to the project.
• Copies of relevant course materials and tools developed for the capacity building project, such as training manuals.
• Quarterly and annual activity reports produced.
10.9 Data quality management
The Directorate: Strategic Management, Monitoring and Evaluation has an oversight role in terms of the following functions:
• To ensure that records/documents are kept to check the quality of data.
• To check the quality of data on a quarterly basis.
• To apply the following five key criteria when auditing data, so that data quality is assured across data sources, data collection, data analysis and data reporting (a minimal sketch of such a check follows this list):
o Validity: Programme managers should produce data that are clear, direct and adequately represent the result (output/outcome). For example, if the output is the development of a policy, then the actual information should be relevant to the development of that policy.
o Reliability: Programme managers should make sure that the same tool yields the same results or findings. For example, the same questionnaire should reveal the same results and findings if used more than once.
o Integrity: Programme managers should be honest when providing information, for instance when compiling the quarterly performance information report. "Copy and paste" of incorrect and old information is not acceptable.
o Precision: Accurate information is needed.
o Timeliness: Programme managers should have sufficient and current information when compiling (quarterly) reports.
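As a hedged illustration, the five criteria above could be captured as a simple pre-submission checklist. The sketch below uses hypothetical names throughout and is not a tool prescribed by this framework.

```python
# The five data-quality criteria from section 10.9, expressed as a
# minimal checklist. All names here are hypothetical illustrations.
CRITERIA = ["validity", "reliability", "integrity", "precision", "timeliness"]

def quality_review(checks):
    """Return the criteria that a quarterly report fails to meet."""
    return [c for c in CRITERIA if not checks.get(c, False)]

# Example review of one hypothetical quarterly report.
report_checks = {
    "validity": True,     # data represent the intended output/outcome
    "reliability": True,  # the same tool yields the same findings on repeat use
    "integrity": False,   # stale figures were copied from last quarter's report
    "precision": True,    # figures verified against source records
    "timeliness": True,   # data current at the time of compilation
}
print(quality_review(report_checks))  # -> ['integrity']
```

A report that returns an empty list passes all five criteria; anything else names the criterion needing correction before submission.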
11. M&E FORUM
The Directorate: Strategic Management, Monitoring and Evaluation will look at establishing an M&E forum, for which terms of reference will be developed. The objectives of the forum are to:
• Lead and guide the successful implementation of M&E initiatives.
• Improve the quality and performance of monitoring and evaluation by sharing outcomes and lessons learned.
• Guide and support M&E activities by providing a platform for interaction and information sharing.
• Build partnerships between directorates and other institutions.
• Increase the dissemination and use of M&E.
• Inform and train managers on M&E practices.
• Communicate M&E activities and initiatives.
• Ensure that best practice models are shared.
• Draw on research expertise for technical assistance in the area of research.

The composition of the forum should be:
• Directorate: Strategic Management, Monitoring and Evaluation.
• Director: Internal Audit.
• A representative from each programme/sub-programme.
• Representatives from provincial departments responsible for sport and recreation.
• Other relevant external stakeholders, e.g. National Treasury, the Presidency, etc.

Forum activities will include, among others, the following:
• Checking the quality of M&E documents and commenting and providing inputs on M&E reports.
• Interacting around and supporting research design, the formulation of research questions and the structuring of terms of reference.
• Ensuring that the department incorporates M&E findings in its policies, legislation and strategies.
• Ensuring the implementation of evaluation and other recommendations.
12. M&E CAPACITY BUILDING
According to the World Bank, monitoring and evaluation capacity building is the integrated development of skills, resources and infrastructure, and the internal shift towards an M&E culture in an organisation. The World Bank argues that capacity building in general, and in M&E in particular, is far more than just training, and requires complementary improvements in four major directions. These four pillars of M&E capacity building are improvements in:
• Institutional capacity: a move from less efficient to more efficient accountability rules and incentives.
• Organisational capacity: the tailoring and adaptation of the organisational architecture of M&E government entities to the new rules and incentives.
• Information and communication systems and technology capacity: using systems and ICT for better and more timely information on results.
• Human capacity: training in M&E, targeted at the sort of skills that are suited to the particular institutional and organisational context, and thus will actually be used and reinforced after they are imparted.

The M&E framework will address the importance of the above-mentioned capacity building to ensure the improvement of services. The Directorate: Strategic Management, Monitoring and Evaluation will ensure that M&E capacity building takes place in the department. This should include, among others, M&E training courses and capacity building through recruitment, as the M&E sub-unit currently has limited capacity.
13. REFERENCES
Bristol-Myers, S. 2002. M&E Capacity Building Workshop Handbook: Step by Step Guide to Monitoring and Evaluation. Manto Management.

Constitution of the Republic of South Africa, 108 of 1996.

Division of Revenue Act (DoRA).

IFAD (International Fund for Agricultural Development), 2009. A Guide for Project M&E: Managing for Impact in Rural Development. Rome: IFAD.

Improving Government Performance: Our Approach. Presidency Performance Monitoring and Evaluation, 2009.

IPDET (International Program for Development Evaluation Training) Handbook, 2007. Washington, D.C.: The World Bank.

Khandker, S.R., Koolwal, G.B. & Samad, H.A. 2009. Handbook on Impact Evaluation: Quantitative Methods and Practices. Washington, D.C.: The World Bank.

Kusek, J.Z. & Rist, R.C. 2004. Ten Steps to a Results-Based Monitoring and Evaluation System: A Handbook for Development Practitioners. Washington, D.C.: The World Bank.

National Sport and Recreation Amendment Act, 1998 (as amended).

National Treasury, 2007. Framework for Managing Programme Performance Information, GWM&E. Pretoria.

National Treasury, 2009. Treasury Guidelines: Preparation of Expenditure Estimates for the 2011 Medium Term Expenditure Framework. Pretoria: National Treasury.

National Treasury, 2010. Framework for Strategic Plans and Annual Performance Plans. Pretoria.

OECD, 1999. Improving Evaluation Practices: Best Practice Guidelines for Evaluation.

Policy Framework for the Government-wide Monitoring and Evaluation System, 2007.

Posavac, E.J. & Carey, R.G. 1997. Programme Evaluation: Methods and Case Studies (5th edition). New Jersey: Prentice Hall.

Public Audit Act, 25 of 2004.

Public Finance Management Act, 29 of 1999.

Public Service Act, 1994.

Public Service Commission, 2008. Basic Concepts in Monitoring and Evaluation.

Rabie, B. & Ackron, J. 2009. Presentation on "Introduction to Public Sector Monitoring and Evaluation". The School of Public Management & Planning, University of Stellenbosch.

Statistics South Africa, 2007. South African Statistical Quality Assessment Framework (SASQAF), First Edition. Pretoria.

White Paper on Transforming Public Service Delivery (Batho Pele White Paper), 1997.
ANNEXURES

Annexure A: Quarterly Programme Performance Reporting Template
Programme name:
Programme objectives:
Columns: Outcome/output | Indicator | Milestone (e.g. Quarter 1) | Actual | Reasons for deviation | Corrective measure

Annexure B: Monitoring Plan Template
Columns: Outcome/output | Indicator | Baseline | Target | Disaggregation | Data collection tool/method | Data source | Reporting frequency | Responsible person | Data storage | Level of indicator

Annexure C: Evaluation Plan Template
Columns: Type of evaluation | Frequency of conducting evaluation study | What questions do we ask | Tools used | Methods used | Responsible person