Administrative Office of the President
National Monitoring and Evaluation Assessment
Report
January 2016
Technically and Financially Supported by
the German Development Cooperation/ GIZ
Disclaimer: The views expressed in this publication are those of the authors and do not
necessarily represent the views of GIZ or AOP.
Acknowledgement
The assessment team would like to express its gratitude and special thanks to Mr. Abdul
Subhan Raouf, Director General for Monitoring, Evaluation and Audit (AOP) and his team.
Assistance provided by Mr. Mohammad Hashem, Policy Advisor (OPAF-GIZ) is also greatly
appreciated. We would like to acknowledge the assistance and cooperation of all government
officials and M&E representatives of the entities, who extended a helping hand in providing
timely data and information for this assessment.
We also owe special thanks to Dr. Wolfram Fischer (GIZ), who provided technical
assistance and guidance in carrying out this assessment.
We are also particularly thankful to those who provided their assistance during data
collection.
Sincerely,
The assessment team
Samandar Mahmodi
Akmal Samsor
Acronyms

AKDN - Aga Khan Development Network
ALCS - Afghanistan Living Conditions Survey
ANDS - Afghanistan National Development Strategy
AOP - Administrative Office of the President
CSO - Central Statistics Organization
CSO - Civil Society Organization
DfID - Department for International Development (UK)
GIZ - Gesellschaft für Internationale Zusammenarbeit
IARCSC - Independent Administrative Reform and Civil Service Commission
IDLG - Independent Directorate of Local Governance
ILO - International Labor Organization
M&E - Monitoring and Evaluation
MAIL - Ministry of Agriculture, Irrigation and Livestock
MDGs - Millennium Development Goals
MICS - Multiple Indicator Cluster Survey
MoD - Ministry of Defense
MoE - Ministry of Education
MoEc - Ministry of Economy
MoF - Ministry of Finance
MoI - Ministry of Interior
MoICT - Ministry of Information and Communication Technology
MoLSAMD - Ministry of Labor, Social Affairs, Martyrs and Disabled
MoPW - Ministry of Public Works
MoUD - Ministry of Urban Development
MoWA - Ministry of Women Affairs
NGO - Non-Governmental Organization
NRVA - National Risks and Vulnerability Assessment
RIMU - Reform Implementation Management Unit
SDGs - Sustainable Development Goals
TNA - Training Needs Assessment
UNDP - United Nations Development Program
UNICEF - United Nations Children's Fund
USAID - United States Agency for International Development
Table of Contents

Acknowledgement
Acronyms
Executive Summary
1. Introduction
   1.2. Afghanistan Outlook
   1.3. The Role of Monitoring and Evaluation
2. Methodology and Approach
   2.1. Assessment Objective
   2.2. Tasks
   2.3. Deliverables
   2.4. Duration
   2.5. Stakeholders' Analysis Process
        2.5.1. Brainstorming Session
        2.5.2. Literature Review
   2.6. M&E Systems Assessment within Identified Institutions
   2.7. Scope of the Assessment
   2.8. Performance Assessment Domains
   2.9. Data Entry, Analysis and Recommendations
   2.10. Data Validation Workshop
3. Findings
   3.1. Findings of the Literature Review
   3.2. Findings of the Assessment
   3.3. Performance Scores
   3.4. Governmental Entities' M&E System Performance
   3.5. Non-Governmental Entities' M&E System Performance
   3.6. M&E Performance by Entity
        3.6.1. Ministry of Communication and Information Technology (MoCIT)
        3.6.2. Ministry of Economy
        3.6.3. Ministry of Urban Development
        3.6.4. Ministry of Women Affairs
        3.6.5. Independent Administrative Reform and Civil Service Commission
        3.6.6. Ministry of Agriculture, Irrigation and Livestock
        3.6.7. Ministry of Public Health
        3.6.8. Ministry of Education
        3.6.9. Independent Directorate of Local Governance
        3.6.10. Ministry of Public Works
        3.6.11. Ministry of Labor, Social Affairs, Martyrs and Disabled
        3.6.12. Ministry of Defense
        3.6.13. Kabul Municipality
        3.6.14. Ministry of Finance
        3.6.15. Ministry of Interior
4. Conclusion
5. Recommendations
   1. Assessment of Organizational Structure with M&E Functions
   2. Human Capacity for M&E
   3. M&E Plans
   4. M&E Advocacy, Communications and Culture
   5. Routine Program Monitoring
   6. Surveys and Surveillance
   7. M&E Database
   8. Supervision and Data Auditing
   9. Evaluation and Research
   10. Data Dissemination and Use
   11. Additional Recommendations
6. Annexes: Assessment Tool
Executive Summary
The Administrative Office of the President (AOP) is committed to facilitating the leadership
of the President's Office in fulfilling the National Unity Government's responsibilities:
assuring citizens' rights and social order, hearing the public's complaints and problems, and
safeguarding social, cultural and religious values. Its mission is to strengthen principles such
as good governance, social justice, accountability, the rule of law and the fight against
corruption. AOP's mandate is to ensure timely decisions by the Presidential Office and the
cabinet, and their effective implementation, by providing essential services and an enabling
environment.
To achieve its mission, AOP, with the support of the national M&E working group, initiated
the development of a national policy for monitoring and evaluation in order to advocate for
and support monitoring and evaluation at the government level, improve service delivery by
the Afghan government at national and sub-national levels, help government entities make
smarter decisions supported by real data, and provide data for measuring national
development goals and indicators.
An assessment was carried out to appraise the current M&E capacity of the key
stakeholders, so that its findings can be used in the development of a National
Monitoring & Evaluation Policy (NMEP) for the Government of Afghanistan. A
commonly used instrument, the "12 Components M&E Systems Strengthening Tool", was
used to assess the performance of M&E systems within the selected entities. Fifteen
governmental and eight non-governmental entities were selected; the assessment consisted of
primary data collection from the selected entities and an in-depth review of the relevant
literature. Data was collected across the following 10 performance domains: (1)
Organizational Structures with M&E Functions; (2) M&E Human Capacity; (3) M&E Plans;
(4) M&E Advocacy, Communications and Culture; (5) Routine Program Monitoring; (6)
Surveys and Surveillance; (7) M&E Databases; (8) Supervision and Data Auditing; (9)
Evaluation and Research; and (10) Data Dissemination and Use. Each domain was scored on
a four-point scale: 1 represents a "No" or zero performance; 2 represents less than 50%
performance; 3 represents more than 75% performance; and 4 represents 100% performance.
Key findings of the study are presented below under each area that was assessed during the
assessment process.
1. Organizational Structures with M&E Functions:
• 73% of government entities, versus 88% of non-governmental entities, have a unit to
undertake M&E-related tasks, while 27% of government entities and 12% of
non-governmental entities do not. This was the highest-performing domain (mean
score = 2.59) of the M&E system in government entities.
• 36% of government entities, versus 88% of non-governmental entities, have M&E units
with a written and approved mandate; 43% of government entities have written but NOT
approved mandates, while 21% have neither a written nor an approved mandate.
• M&E functions at the entity level are not well coordinated, because monitoring and
evaluation is not considered the main purpose of an M&E unit. The lack of a coordination
mechanism, the presence of independent or parallel structures for M&E functions, fear of
M&E, and room for data manipulation are some of the key factors identified during the
assessment of government entities.
• In most government entities (64%), M&E operates under the policy and planning
directorate or deputy ministry; in the remaining cases this unit either operates
independently or under other deputy ministries.
• 73% of government entities have a clearly defined description of the tasks undertaken by
different sections, while in 27% of cases there is overlap or confusion about the tasks of
different sections of the M&E unit.
• Most government entities (71%) need external support to undertake their M&E
functions; however, this support is rarely provided on time.
2. Monitoring and Evaluation Capacity:
• At the national level, on average 19 persons in governmental entities, versus 14 persons
in non-governmental entities, carry out M&E functions within each entity (excluding
MoI). Variability between entities is high (standard deviation = 12.2 persons).
• 47% of M&E posts in government entities were vacant at the time of data collection,
which shows that about half of the human resources needed to undertake M&E tasks
are unavailable. The vacancy rate in non-governmental entities is only 10%.
• Except for MoE and MoPH, none of the assessed government entities have staff at the
sub-national level to undertake M&E tasks.
• A proper in-service capacity building system is non-existent, or only partly present,
within government entities.
• Capacity development plans based on annual performance appraisals are not well
connected with the technical capacity constraints of the M&E staff.
• 72.7% of government entities reported that the curricula used in pre-service education do
not cover topics related to M&E capacity building; in only 27.3% of cases (private
institutions) does the pre-service curriculum incorporate M&E topics.
3. M&E Plans:
• About 46% of government entities, versus 86% of non-government entities, have an
entity-level national strategic plan; the remainder either do not have one (15.4%) or have
an unapproved or outdated plan (38%). The national strategic plans were based on the
ANDS and/or the MDGs, and at the time of the assessment neither was considered a
valid strategic direction (the ANDS ended in 2013, while the MDGs have officially been
replaced by the SDGs).
• 40% of government entities, versus 83% of non-governmental entities, have M&E plans;
the other entities either do not have one or have only a partial M&E plan.
• In government entities, activity plans developed by the M&E unit are treated as M&E
plans (an M&E plan is a logical framework, not an activity plan).
• Performance monitoring, carried out by the HR directorate at the end of each year, is
not connected with the strategic plan in government entities.
• In government entities, data is collected from the field on donor/NGO request or in
response to exceptional needs such as emergencies, outbreaks, political pressure or
complaints. Evidently, only field visits conducted by the M&E staff are considered real
M&E activities.
• Lack of resources (human, financial, transportation) is cited as the main constraint in
implementing M&E plans in government entities; however, the team observed that the
absence of well-formulated M&E plans and the lack of coordination with other units are
the main constraints on carrying out monitoring functions within the entity.
4. M&E Advocacy, Communications and Culture
• In 43% of government entities, actors in decision-making processes use information
products generated by the M&E unit.
• 53% of government entities, versus 83% of non-government entities, reported strong
support for M&E at the entity level.
• In 47% of government entities, versus 86% of non-government entities, M&E
information is requested and utilized during planning, revision and/or budgeting
processes.
5. Routine Program Monitoring
• In government entities, routine program data is mostly collected by the MIS unit. In
most entities, MIS is not connected to the M&E unit or operates in parallel to it.
• The M&E unit is rarely involved in the design, implementation and routine monitoring
of projects; it is asked to monitor only when the project monitoring team is unable to
perform its functions.
• Except for MoPH and MoE, government entities do not collect data on their routine
service delivery, while 88% of non-government entities have clear guidelines for routine
program monitoring.
6. Surveys and Surveillance
• Most M&E system activity in government entities is focused on tracking outputs; an
inventory of mandate-level surveys (to track outcomes and/or impact) is therefore
non-existent. Only 8% of government entities, versus 80% of non-government entities,
have a complete list of their mandate-level surveys.
• Nationally representative surveys, which collect outcome and impact level data (such
as NRVA, MICS, ADHS, AHS, Animal Census, etc.) are funded by different donors
and are conducted by international research organizations with minimal involvement
of the M&E units of the governmental entities.
7. M&E Databases
• Only 33% of government entities have databases for electronically capturing and storing
data; manual data collection and storage is the most common method. Among
non-governmental entities, 37% have functional databases for electronically capturing
and storing data, in 37% databases are partly present, and 26% do not have a database.
• Among the government entities with functional databases, 43% have structures,
mechanisms and procedures for transmitting, entering and extracting data; this
percentage is 28.6% for non-governmental entities.
• Written quality control mechanisms for accurate capturing of data are available only in
21% of the government entities and 50% of the non-governmental entities.
• Data stored in the databases are only accessible by specific people within the entity or
access is provided to a limited number of stakeholders of the entity.
8. Supervision and Data Auditing
• 80% of government entities, versus 43% of non-governmental entities, do not have
guidelines and tools for M&E supervision. This is the lowest-performing domain (mean
score = 1.49) of the M&E system within government entities.
• Supportive supervision of the M&E process is non-existent within government entities,
and weak within non-governmental entities.
• About 80% do not have protocols for data auditing to verify and ensure data reliability
and validity.
• Among all government entities, only MoPH has a system to audit its data; the others
lack a proper system for auditing data before it is fed into the M&E system. Most
non-governmental entities, by contrast, have data auditing protocols, although in 15% of
cases the feedback from data auditing is not shared with the data generators.
9. Evaluation and Research
• 80% of government entities do not have a team, committee or board responsible for
coordinating and approving new research and evaluations; among the non-governmental
entities assessed, the figure is 40%.
• Only 23% of government entities, versus 80% of non-governmental entities, have
conducted research or an evaluation of their mandate-level programs in the last two
years.
• Program/project evaluations are conducted by donor-funded organizations with minimal
involvement of the relevant government entity's M&E unit.
10. Data Dissemination and Use
• 46% of government entities, versus 86% of non-governmental entities, have a list of the
relevant stakeholders with contact details so that data and reports can be shared with
them.
• Only 23% of government entities, versus 75% of non-governmental entities, have plans
in place for data use and dissemination, while other entities disseminate data without
any proper strategy.
• 33% of government entities, versus 86% of non-governmental entities, provide online
access to part of their M&E data, either through their website or through other social
media channels.
11. M&E Partnership and M&E Plan Costing:
• At the national level, no entity or mechanism exists to provide stewardship and to
coordinate and align M&E functions. The same is evident at the entity level.
• At both the national and entity levels, resource allocation for M&E functions to achieve
M&E goals is non-existent.
The mean score across all governmental entities is 2.04, which means that, on average, the
ministries are performing below 50%. Individually, the performance scores in descending
order are: MoPH (2.86), MoE (2.75), IDLG (2.62), IARCSC (2.51), MoF (2.42), MoEc
(2.24), MoLSAMD (2.22), MoPW (2.17), MoI (2.08), MAIL (1.97), MoWA (1.69), MoICT
(1.66), MoD (1.31), MoUD (1.10) and Kabul Municipality (1.05), the lowest-ranked entity.
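This overall mean can be verified directly from the fifteen entity scores listed above; as a worked check:

    \[
    \bar{s} = \frac{2.86 + 2.75 + 2.62 + 2.51 + 2.42 + 2.24 + 2.22 + 2.17 + 2.08 + 1.97 + 1.69 + 1.66 + 1.31 + 1.10 + 1.05}{15}
            = \frac{30.65}{15} \approx 2.04
    \]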
The following measures are recommended to improve the performance of M&E systems at
national and sub-national levels. The recommendations are grouped by the areas evaluated
during the assessment process.
1. Organizational Structure:
a. Establishment of M&E directorates within government entities;
b. Linking of M&E with planning within every government entity;
c. Structuring the units performing M&E functions under an M&E directorate to align
the tasks performed by each unit;
d. Establishment of sub-national level M&E functions in order to obtain more accurate
data from the provinces and districts. This will also improve coordination between
the provincial institutions and the ministries.
2. Human Capacity:
a. Speed up the process of recruiting qualified M&E personnel to fill the current
capacity gap within government entities.
b. Strengthen in-service M&E capacity development to improve quality of
monitoring and evaluation, have more accurate data, and use data for decision
making.
c. Advocate for changes in the pre-service curriculum (M&E capacity building). This
can be done by integrating critical thinking and monitoring and evaluation topics
into the curricula of government and non-government universities and institutes, or
by adding a new subject to the curriculum to develop students' skills in monitoring
and evaluation.
3. M&E Plans:
a. Develop results-oriented, outcome- and impact-based national strategic plans; as
stated above, more than 50% of government entities did not have a strategic plan at
the time of the assessment, or their plans were focused more on outputs.
b. Develop M&E plans based on national strategic plans.
c. Entities should carry out M&E functions based on national M&E plans, meaning
that ministries and government offices should develop their plans based on national
indicators and objectives.
4. Advocacy, Communication and Culture:
a. National and entity-level advocacy strategies for M&E need to be developed and
made part of M&E plans.
b. A culture of M&E for learning, transparency and accountability should be
promoted (M&E is not a controlling tool).
5. Routine Program Monitoring:
a. Establish/strengthen routine program monitoring (routine service delivery
monitoring).
b. Monitoring and evaluation should be carried out against the indicators identified for
each project or program in government entities, which means ministries and donor
agencies should clearly define the indicators for which M&E data is to be collected.
6. Surveys and Surveillance:
a. Build the capacity of M&E units to provide stewardship and to coordinate
mandate-level surveys;
b. Conduct mandate-level surveys on a regular basis;
c. Establish surveillance systems, specifically at MoE, MAIL and MoI.
7. M&E Databases:
a. Promote a culture of electronic data storage and data use; establish data
management systems at the entity level.
b. Improve access to stored data at the entity level.
8. Supervision and Data Auditing:
a. Establish entity-level mechanisms for data verification.
b. Establish a system of supportive supervision to improve M&E processes.
9. Evaluation and Research:
a. Establish evaluation/research boards within the entities.
b. Introduce a policy of mandatory evaluations of key priority programs.
10. Data Dissemination and Use:
a. Introduce a policy of mandatory online availability of entity-level data.
b. Promote a culture of data-based decision making.
Additional Recommendations:
a. Establish, at the national level, an entity/mechanism to provide stewardship,
coordination and alignment for M&E functions.
b. Establish entity-level mechanisms (taskforce, coordination committee) to provide
stewardship, coordination and alignment for M&E functions.
c. Introduce a policy at the national and entity levels to establish proper resource
allocation for M&E functions.
In addition to the above, the following recommendations were drawn during the data
validation workshop, in which representatives of government and non-government entities
participated.
1. Terminologies related to monitoring and evaluation should be clearly defined and
standardized.
2. The policy for monitoring and evaluation should be developed in the next stage and
should be given legislative support.
3. Strong follow-up on monitoring and evaluation is needed, supported by the Office of
the President.
4. Advocacy for M&E at entity and national levels is a must and can only help to
empower monitoring and evaluation.
5. The non-governmental entities have robust processes and tools for "supervision and data
auditing" which can be utilized by the governmental entities. For example, USAID's data
quality audit tools can be used for quality assurance of the data collected by the
governmental entities. The tools developed by USAID for assessing and developing
human capacity can likewise be used by governmental entities to improve performance in
the "Human Capacity for M&E" domain.
6. Coordination between the units carrying out M&E in every entity needs to be improved;
further collaboration and cooperation, by forming a national M&E board, would be a
major step toward strengthening the role of monitoring and evaluation.
7. Explicitly advocate for an open data policy in order to provide access to data and reports
at entity and national level and to improve the planning process.
8. The recommendations proposed by the experts should be turned into a more practical
action plan in order to improve M&E step by step.
9. The report and recommendations should be shared with the ministers in order to gain
more support for timely action and to ensure M&E is considered in future planning for
each entity.
10. The roles of AOP and the entities with regard to monitoring and evaluation should be
clearly defined. For example, the format of the data requested by AOP from entities
should be clearly defined, practical and focused on M&E data rather than descriptive
reports.
1. Introduction
The Administrative Office of the President (AOP) is committed to facilitating the effective
leadership of the President's Office in fulfilling the National Unity Government's
responsibilities: assuring citizens' rights and social order, hearing the public's complaints and
problems, and safeguarding social, cultural and religious values.
Its mission is to strengthen principles such as good governance, social justice,
accountability, the rule of law and the fight against corruption. It will do so through the
follow-up and enforcement of Presidential orders, decrees and decisions based on the Afghan
constitution and international treaties and commitments; the institutionalization of respect for
human rights; reform of government policies and strategies; and monitoring, evaluation and
reporting to the nation, so that the Afghan people also become part of the efforts for
development, economic growth, lasting peace and social harmony.
AOP's mandate is to ensure timely decisions by the Presidential Office and the cabinet, and
their effective implementation, by providing essential services and an enabling environment.
Results-based monitoring and evaluation is key to improved service delivery, accountability,
transparency, good governance, and effectiveness, efficiency and sustainability in
programming and policy implementation. Monitoring and evaluation plays an ever more
important role in development, and the declaration of 2015 as the International Year of
Evaluation made it more important still. Development agencies, donors and governments
worked throughout 2014 and 2015 to develop a comprehensive global evaluation agenda
(EvalAgenda2020); the agenda was launched in Kathmandu, in the Parliament of Nepal, on
25 November 2015. This launch followed the endorsement of the Sustainable Development
Goals (SDGs) by the United Nations on 25 September 2015 as the replacement for the
Millennium Development Goals (MDGs).
1.2. Afghanistan Outlook

Demography: Afghanistan is a landlocked country bordered by six countries and located
between South and Central Asia, with a geographical area of 652,225 square kilometers. The
population of Afghanistan is close to 32 million, based on a 2014 estimate
(www.worldbank.org).

Political situation: Afghanistan is currently going through a political transformation, with
development priorities mostly related to security, peace building and humanitarian work to
ensure the fulfillment of basic human rights. Politically, Afghanistan has rarely remained
stable, with changes of government resulting from a series of wars. Conflict and insecurity
have been the major setback to Afghanistan's development. The results of war, the
destruction of core state institutions and a heavily war-torn economy led to unrivaled levels
of absolute poverty, national ill health, large-scale illiteracy and the almost complete
disintegration of gender equity (ANDS, 2008). During the last decade, especially after 2001,
Afghanistan has attempted to establish democratic practices amidst persistent security
challenges.
Economy: The growth of the Afghan economy is among the slowest in the region, with
annual growth estimated at two percent in 2014 (down from 3.7 percent in 2013) (World
Bank, 2015). The decline in the growth rate is a key concern for a country where growth
averaged around nine percent during 2003-2012 (World Bank, 2015). The decrease in growth
is attributed to: (a) political uncertainty combined with weak reform progress, leading to low
confidence among investors and consumers; (b) the drawdown in aid, affecting growth in
non-agricultural sectors (manufacturing, construction and services), even though the
agricultural harvest in 2014 was strong for the third year in a row; and (c) the poor security
situation, which diverts the development budget to security costs and mandated social benefit
spending.
Development: The development context of Afghanistan highlights a number of challenges for
the country to overcome. According to World Bank data from 2011, around 36 percent of the
Afghan population lives below the poverty line, and life expectancy at birth is 61 years (as of
data updated in 2013). The Government of Afghanistan formulated the Afghanistan National
Development Strategy (ANDS) for the period 2008-2013, which has come to an end without
much change in the development indicators.
Social sector: The usual effect of a sluggish national economy and poor security situation is
clearly visible in education, health and livelihoods, reflected in the poor quality of services
and outcomes. Nevertheless, there were tremendous improvements during the last decade.
The education sector has expanded considerably: school enrolment increased from 191,000
to 3.75 million during 2001-2012, while the number of teachers grew from 20,000 to more
than 187,000 (World Bank, based on government estimates). However, the net enrollment
rate at primary level remained low, at around 57 percent (66% for boys, 48% for girls) in
2014 (CSO, 2014). Based on World Bank figures, health services have also expanded over
the years: the number of functioning health facilities increased from 496 in 2002 to more
than 2,000 in 2012. Data from household surveys (between 2003 and 2011) also show a
significant decline in maternal and child mortality. The under-five mortality rate and infant
mortality rate dropped from 257 and 165 per 1,000 live births to 97 and 77 respectively. The
maternal mortality ratio was 327 per 100,000 live births in 2011, compared with 1,600 in
2002. As reported in the Afghanistan Living Conditions Survey (ALCS) 2014, access to safe
drinking water and improved sanitation has improved in the last five years: the proportion of
households with access to a drinking water facility is 67% (compared to 43% in 2011), and
households with access to improved sanitation have reached 15% (from 8% in 2011) (CSO,
2014). The percentage of the population with access to electricity is among the lowest in the
world, at 43 percent in 2012 (http://data.worldbank.org/indicator/EG.ELC.ACCS.ZS). A
large section of the population also faces food insecurity due to poor access, a volatile
security situation and seasonal difficulties. As of 2014, around 23 percent of the population is
considered food insecure (compared to 11% in 2011) (CSO, 2014).
Employment: With the labor force increasing by over 400,000 each year and a stagnant
business sector, Afghanistan has a growing need to generate employment opportunities for its
new labour market entrants, along with those who are already unemployed or underemployed
(ILO, 2012). Based on the recent Afghanistan Living Conditions Survey (ALCS 2013/14),
around 24 percent of the economically active population is unemployed (CSO, 2014), an
increase from the rate of 9 percent in 2011. Unemployment among women is even higher, at
38 percent compared to 19 percent for men (CSO, 2014). Women's mobility outside the
home is limited for cultural reasons, especially in rural areas, while their educational
attainment is very low. The majority of the population of Afghanistan depends on agriculture
for income and livelihood: sixty percent of Afghans rely on agriculture for their livelihoods
and their families' sustenance, and the sector accounts for about 40 percent of Afghanistan's
gross domestic product (https://www.usaid.gov/afghanistan/agriculture).
1.3. The Role of Monitoring and Evaluation

In defining the term monitoring, one needs to be exposed to a number of associated concepts.
Monitoring is the continuous assessment of a program or project in relation to the agreed
implementation schedule. In addition, and even more importantly, monitoring supports the
process of permanent observation of whether the desired outcomes and the final objective of
a policy, program or project can be achieved.
It is also an excellent management tool which, if used properly, provides continuous
feedback on project, program or policy implementation and assists in identifying potential
successes and constraints to facilitate timely decisions. Unfortunately, in many projects this
role is barely understood, which has a negative impact on project, program or policy
implementation (https://www.irbnet.de/daten/iconda/CIB8942.pdf).
Monitoring is not only concerned with the transformation of inputs into outputs, but can also
take the following forms:

Physical and financial monitoring: the progress of project or program activities is measured
against established schedules and indicators of success.

Process monitoring: identifying the factors accounting for the progress of activities or the
success of output production.

Outcome and impact monitoring: measuring the initial responses and reactions to project
activities and their immediate short-term effects.

Projects are monitored so as to:
• Assess the stakeholders' understanding of the project;
• Minimize the risk of project failure;
• Promote systematic and professional management; and
• Assess progress in implementation.
In many developing countries, one tends to find the following aspects in the monitoring and
evaluation of projects:
• There is a dominant use of external consultants in monitoring and evaluation.
• There is a dominant use of donor procedures and guidelines in monitoring.
• Sustainability is often not taken into account.
• Monitoring is sometimes used to justify past actions.
• Concerns of stakeholders are normally not included.
• Lessons learned are not incorporated.
One needs to recognize the role played by the various stakeholders in monitoring. These
players include the financiers, implementing agencies, project teams, interested groups such
as churches, environmentalists, etc. It should further be recognized that, to be an effective
management tool, monitoring should be regular but should take into account the risks
inherent in the project/program and its implementation.
In addition to the areas mentioned above, monitoring and evaluation plays a crucial role in:
• Providing evidence for policy development;
• Identifying program interventions that have or have not worked well;
• Ensuring transparency and accountability;
• Improving service delivery;
• Justifying donor and aid funding;
• Supporting timely and evidence-based decision-making processes;
• Improving programs and interventions;
• Identifying strengths, weaknesses, opportunities and threats;
• Helping the government tackle corruption;
• Improving good governance.
2. Methodology and Approach
This section of the report details the methodology and approach used in conducting the M&E
stakeholders' assessment for the Administrative Office of the President.

2.1. Assessment Objective

The main objective of the assessment is to provide the M&E and Audit general directorate
with the necessary background information on M&E stakeholders in Afghanistan.
2.2. Tasks

The tasks that were part of the assessment included, but were not limited to:
• Desk review of existing studies, reports and other relevant documents in the field of
monitoring and evaluation.
• Interviews with key stakeholders, briefly assessing their current tasks and relevant
capacities, using the National M&E network that is currently being established.
• An M&E stakeholder analysis to identify key partners and stakeholders, mapping key,
important and less important players and their interests, power and influence.
• Review of the major Afghan government institutions (including but not limited to
MoEc, MoF, IARCSC, IDLG, MoE, MAIL, MoCIT, MoPW, MoD, MoWA, MoI and
MoLSAMD): which standard M&E systems and tools currently exist within government
and donor systems.
• Review of the M&E systems of major donors (World Bank, DFID, USAID and up to
five other donors): which standard M&E systems and tools they use, and how these are
aligned with and support government M&E.
2.3. Deliverables

Expected deliverables of this assignment are as follows:
• A detailed stakeholders' analysis report of 40-50 pages;
• Practical and implementable policy recommendations for the National M&E Policy;
• A policy outline or policy framework.
2.4. Duration

The assignment started on 1 December 2015 and continued until the end of January 2016.

2.5. Stakeholders' Analysis Process

The following major activities were part of the methodology for conducting the M&E
stakeholders' assessment for the AOP.
2.5.1. Brainstorming Session

Following a brainstorming session between the assessment team and the AOP M&E team,
the government and non-governmental entities were selected and added to the scope of the
assignment.
2.5.2. Literature Review

An in-depth review of relevant documents and literature on M&E systems was conducted.
The desk review provided information on the following:
1. History and structure of M&E systems and M&E activities;
2. Current status of M&E systems and M&E activities;
3. Existing documentation related to M&E capacity;
4. Existing documentation about gaps in M&E capacity.
The following general and specific documents were reviewed:

1. Global Evaluation Agenda v3. Relevance: provides global context on evaluation and its
importance. Source: available on the EvalPartners and IOCE websites
(http://www.mymande.org/evalyear/global_evaluation_agenda_2016-2020).

2. National Evaluation Policies' Global Mapping report (2013). Relevance: a mapping of
115 countries; the document provides information on which countries have National
Evaluation Policies (NEPs), with links to 20 evaluation policies. Source: copy available
(http://www.pfde.net/index.php/publications-resources/global-mapping-report-2015).

3. Review of at least five National Evaluation Policies, including those of Sri Lanka, Nepal,
Malaysia, India and Pakistan. Relevance: the policies provide an in-depth understanding
of how NEPs are developed in other countries with a similar context. Source: links
provided in the NEP mapping document
(http://performance.gov.in/sites/all/document/files/pmes/pmes.pdf,
http://www.smes.org.np/).

4. Review of M&E documents and SOPs of five major donors in Afghanistan. Relevance:
provides context on how donors approach and conduct M&E, and how these are aligned
with government requirements and policies. Source: can be obtained with the help of
AOP.

5. Review of M&E documents and systems for at least five ministries and AOP. Relevance:
provides information on the current status and nature of M&E in government agencies in
Afghanistan, and identifies capacity and institutional gaps in implementing
comprehensive M&E systems. Source: can be obtained with the help of AOP.

6. Review of at least two documents on how M&E has helped other governments improve
performance, preferably from Sri Lanka, India, Malaysia and Nepal, which have NEPs
enacted. Relevance: provides background and evidence on the successes and failures of
NEPs in other countries, and can help in formulating a policy framework for Afghanistan.
Source: can be obtained from the Internet, as well as from IOCE (http://www.ioce.net).

7. Other documents identified during the review and recommended by AOP. Relevance:
general background. Sources: AOP, IOCE, EvalPartners, UNICEF, World Bank, USAID
and UNDP (https://www.usaid.gov/evaluation, http://web.undp.org/evaluation/).
2.6. M&E Systems Assessment within Identified Institutions

The team of experts proposed a standard instrument, the 12 Components Monitoring and
Evaluation System Strengthening Tool
(http://www.unaids.org/sites/default/files/sub_landing/files/1_MERG_Assessment_12_Components_ME_System.pdf),
for the assessment of M&E systems within the selected entities. The team and the AOP office
agreed to assess the selected entities against 10 of the 12 performance domains, analyzing
performance relative to the M&E of each institution. The performance domains captured
relevant monitoring topics and specific aspects of interest to the AOP, and included
additional areas and aspects identified in the problem tree analysis.
2.7. Scope of the Assessment

The assessment process encompassed the following government and non-government
entities:
• Independent Directorate of Local Governance (IDLG)
• Ministry of Finance (MoF)
• Ministry of Economy (MoEc)
• Independent Administrative Reform and Civil Service Commission (IARCSC)
• Ministry of Women Affairs (MoWA)
• Ministry of Education (MoE)
• Ministry of Interior (MoI)
• Ministry of Defense (MoD)
• Ministry of Information and Communication Technology (MoICT)
• Ministry of Urban Development (MoUD)
• Ministry of Labor, Social Affairs, Martyrs and Disabled (MoLSAMD)
• Ministry of Agriculture, Irrigation and Livestock (MAIL)
• Ministry of Public Works (MoPW)
• Ministry of Public Health (MoPH)
• Kabul Municipality
• Department for International Development (DFID)
• United States Agency for International Development (USAID)
• Gesellschaft für Internationale Zusammenarbeit (GIZ)
• Japan International Cooperation Agency (JICA)
• The United Nations Children's Fund (UNICEF)
• Care International
• Swedish Committee for Afghanistan
• Danish Committee for Aid to Afghan Refugees (DACAAR)

Note: Despite several attempts, the assessment team was unable to conduct interviews with
UNDP, AKDN, the World Bank and UNESCO.
2.8. Performance Assessment Domains

The identified government entities, donor agencies and INGOs were assessed against the
following performance domains using the 12 Components Monitoring and Evaluation
System Strengthening Tool:
1. Organizational Structures with M&E
2. Human Capacity for M&E
3. M&E Plans
4. M&E Advocacy, Communications and Culture
5. Routine Program Monitoring
6. Surveys and Surveillance
7. M&E Databases
8. Supervision and Data Auditing
9. Evaluation and Research
10. Data Dissemination and Use

The remaining two components, M&E Partnerships and the Costed M&E Work Plan, were
not included in the assessment for the following reasons:
• Time limitations made it impossible to assess the costed M&E work plan domain for the
identified entities.
• The desk review found that no formal M&E partnership existed at the national level, so
this domain was excluded from the assessment process.
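Scoring under the tool is simple aggregation of the 1-4 domain ratings. As a minimal illustration (a hypothetical sketch in Python; no such script was part of the assessment, and the entity names and ratings shown are invented), per-entity and per-domain mean scores of the kind quoted in the findings chapter can be computed as follows:

    from statistics import mean

    # The ten performance domains assessed (section 2.8).
    DOMAINS = [
        "Organizational Structures with M&E",
        "Human Capacity for M&E",
        "M&E Plans",
        "M&E Advocacy, Communications and Culture",
        "Routine Program Monitoring",
        "Surveys and Surveillance",
        "M&E Databases",
        "Supervision and Data Auditing",
        "Evaluation and Research",
        "Data Dissemination and Use",
    ]

    # Hypothetical 1-4 ratings for two entities, one value per domain.
    ratings = {
        "Entity A": [3, 2, 2, 3, 2, 1, 2, 1, 2, 2],
        "Entity B": [4, 3, 3, 3, 3, 2, 2, 2, 2, 3],
    }

    # Per-entity mean across the ten domains (the report quotes, e.g., 2.86 for MoPH).
    entity_means = {entity: mean(scores) for entity, scores in ratings.items()}

    # Per-domain mean across entities (the report quotes, e.g., 2.59 for the
    # organizational-structures domain across government entities).
    domain_means = {
        domain: mean(ratings[entity][i] for entity in ratings)
        for i, domain in enumerate(DOMAINS)
    }

    print(entity_means)
    print(domain_means)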
2.9. Data Entry, Analysis and Recommendations

A standard database was designed for entry and cleaning of the M&E stakeholders'
assessment data. The assessment team reviewed the data before it was entered into the
database; the review included supplementary documents (i.e. copies of strategic plans, M&E
guidelines, annual appraisals of M&E staff, screenshots of the databases used for storing
training lists and M&E data, and data storage and retrieval guidelines) shared in printed or
digital form during the in-depth interviews. These documents were also used to check
whether the ratings given by the selected entities were realistic; ratings were adjusted after
the supplementary document review.
After data entry, the assessment team cleaned the data and analyzed it using statistical
applications widely used in research (SPSS and STATA). Data was analyzed for each entity
and every domain; the findings were drawn as spider/radar charts (in Microsoft Excel) for
each assessed domain of the M&E systems. Recommendations were drawn by the
assessment team from the findings of the data analysis phase and are listed for each domain
individually.
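The spider/radar charts mentioned above were drawn in Microsoft Excel from the SPSS/STATA output. Purely to illustrate the chart type, the sketch below renders one entity's ten domain scores as a radar chart using Python and matplotlib; the abbreviated labels follow section 2.8, and the score values are hypothetical rather than taken from the assessment data:

    import numpy as np
    import matplotlib.pyplot as plt

    # The ten performance domains from section 2.8, abbreviated for labeling.
    domains = ["Structure", "Human capacity", "Plans", "Advocacy",
               "Routine monitoring", "Surveys", "Databases",
               "Supervision", "Evaluation", "Dissemination"]
    # Hypothetical domain scores for one entity on the tool's 1-4 scale.
    scores = [3, 2, 2, 3, 2, 1, 2, 1, 2, 2]

    # Spread the domains evenly around the circle, then close the polygon
    # by repeating the first point at the end.
    angles = np.linspace(0, 2 * np.pi, len(domains), endpoint=False).tolist()
    closed_scores = scores + scores[:1]
    closed_angles = angles + angles[:1]

    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    ax.plot(closed_angles, closed_scores, linewidth=1.5)
    ax.fill(closed_angles, closed_scores, alpha=0.25)
    ax.set_xticks(angles)
    ax.set_xticklabels(domains, fontsize=8)
    ax.set_ylim(0, 4)  # the assessment rates every domain on a 1-4 scale
    ax.set_title("M&E system performance by domain (illustrative)")
    fig.savefig("entity_radar.png", dpi=150, bbox_inches="tight")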
2.10. Data Validation Workshop

The assessment team presented its findings and recommendations in two separate workshops,
on January 11 and 12, 2016, to representatives of government and non-government entities.
Participants shared their input on the findings, the recommendations included in the
executive summary and the detailed report, and the next steps.
3. Findings
In this section the findings of the literature review and the findings of the M&E stakeholder
assessment are presented.
3.1. Findings of the Literature Review
M&E principles, practices and standards can be used in all public institutions to improve the
effectiveness and impact of government programs and policies. M&E is an integral part of
results-oriented public sector management. Efficient and effective public sector management
is structured around National Development Goals and has four main features: (1) presence of
core result attributes, (2) focus on common results, (3) interdependency among the
components, and (4) effective vertical and horizontal linkages. It is therefore critical to have
an integrated national M&E system covering all aspects of public sector management, i.e.
measuring performance (efficiency and effectiveness) at national, ministerial and program
levels, to assist public sector managers, decision-makers and the country in moving toward
its national goals.
The M&E systems of three countries, namely India, Malaysia and South Africa, are briefly
presented below to document how M&E principles, practices and standards are used to
improve the impact and effectiveness of programs and policies in those countries, and how
their integrated M&E systems were developed and implemented.
Indian Performance Monitoring and Evaluation System (PMES):

At the beginning of each financial year, with the approval of the Minister concerned, each
department prepares a Results-Framework Document (RFD) consisting of the priorities set
out by the Ministry concerned, the agenda spelt out in the manifesto (if any), the President's
Address, and announcements/agendas spelt out by the Government from time to time. After
six months, the achievements of each ministry/department are reviewed by a Committee on
Government Performance and the goals are modified, taking into account the priorities at that
point in time. At the end of the year, all ministries/departments review and prepare a report
listing their achievements against the agreed results in the prescribed format. This report has
to be finalized by 1 May each year.
Initially, 59 out of a total of 84 ministries/departments were covered under PMES; by 2013,
about 80 ministries/departments were covered by the RFD policy. The Performance
Management Division (PMD) of the Cabinet Secretariat first designed the system for
performance monitoring and evaluation, then built the capacity of the ministries/departments
regarding PMES, and then implemented the system gradually.
Initially, PMD reviewed international best practices and designed guidelines and checklists
for preparing Results-Framework Documents, and created a task force of independent
experts to review the quality of RFDs. Then, to build capacity for the implementation of
PMES, PMD conducted around 40 intensive, hands-on training programs on RFD for around
3,000 senior officers in collaboration with IIM Ahmedabad and the IAS Academy at
Mussoorie. PMD also organized meetings of the independent task force experts with
ministries/departments to review their respective RFDs, and ensured that the revised RFDs
were consistent with the suggestions of the independent experts. Finally, PMD received
RFDs approved by the concerned ministers for four years (2009-10, 2010-11, 2011-12 and
2012-13) and prepared results against achievements for the RFDs for 2009-10, 2010-11 and
2011-12 (Cabinet Secretariat, Government of India: Performance Monitoring and Evaluation
System for Government Departments).
Malaysian Integrated Results Based Management System:

The Malaysian national M&E system is highly sustainable, as it is closely supported by a
national policy and has been institutionalized within ministries and agencies. The
institutionalization of the M&E system within the government was achieved through the
budgetary process as well as the national development planning process. Measured results
are utilized for planning, implementing, monitoring and reporting on organizational
performance, with systematic links to personnel performance, and are important for resource
allocation decisions by the central budget office (World Bank data:
http://data.worldbank.org/indicator/NY.GNP.PCAP.CD;
https://www.cia.gov/library/publications/the-world-factbook/geos/sf.html).
In 2009, the government adopted the Integrated Results Based Management (IRBM) system,
commencing with the 10th Malaysia Plan 2011-2015. The Prime Minister has taken
leadership of the M&E system's implementation in Malaysia. The national M&E function is
integrated in two bodies, namely the Ministry of Finance (MoF) and the Implementation and
Coordination Unit (ICU) under the Prime Minister's Department; the former is responsible
for program evaluations under the budget and the latter for project evaluations under the
Malaysia Development Plan.
The IRBM consists of five key components: two primary and three complementary or
support components. The primary components are the Results-Based Budgeting System
(RBB) and the Results-Based Personnel Performance System (PPS). The three support
components are the Results-Based Monitoring and Evaluation (M&E) System, the
Management Information System (MIS), and an Enabling E-Government (EG) System. The
RBB and PPS are integrated into a framework known as the "Integrated Performance
Management Framework (IPMF)". The IPMF is mandated as the strategic planning
framework under IRBM; therefore, all ministries and departments are required to prepare
their strategic plans for resource allocation using IPMF as part of the RBB system.
IRBM cascades down from the national level to the sector level and on to the implementation
levels. The national M&E system closely determines the M&E systems in the ministries; in
fact, a ministry's M&E system is a sub-set of the integrated M&E system across government.
The national M&E system is an integrated system called My-Results (MyRoL); it covers all
ministries and agencies, is linked to all other M&E systems within ministries and agencies,
and contributes to the national medium-term development plan. However, ministries and
agencies are allowed to retain their relevant legacy systems for their own purposes, as the
internal management of programs within ministries and agencies is an internal matter
(Malaysian Government: www.malaysia.govt.my).
The Implementation and Coordination Unit (ICU) under the Prime Minister's Department
develops various project-monitoring systems and performs periodic collection and analysis
of financial and physical data to meet the various requirements of the development plan; it
identifies the problems encountered in implementation and the reasons for any gaps between
planned and actual performance. The ICU ensures that effective feedback on project
implementation is provided to top management on a timely and regular basis for remedial or
corrective action. In addition, the ICU provides advice, consultation and technical support for
planning, implementation, monitoring and project management to all government agencies;
prepares reports on the outcome evaluation of development programs/projects (i.e.
performance reports) for use by all government agencies; monitors and coordinates
performance indicators at national, ministry and agency levels; conducts selected program
evaluations and submits and presents evaluation reports to the National Action Working
Committee and the National Action Council; and ensures that programs and projects are
implemented consistently.
The Ministry of Finance, under the political leadership and management of the Prime Minister, is responsible for budgeting and taxation. The ministry formulates and implements fiscal and monetary policies; formulates financial management and accounting processes, procedures and standards to be implemented by all government institutions; manages the acquisition and disbursement of federal government loans from domestic and external sources; monitors that Ministry of Finance Incorporated companies are managed effectively; and monitors the financial management of ministries, government departments and statutory bodies.
The ICU and MoF are supported by M&E agencies (universities, research institutions) in implementing the IRBM system 15.
South African National M&E System:
South Africa had various building blocks of M&E, but no centrally driven system. During the 2000s, interest in M&E grew, and the M&E role in the presidency began to strengthen. In 2005, the cabinet approved a plan for the development of a government-wide M&E system. It was envisaged as a "system of systems" in which each department would have a functional monitoring system from which the necessary information could be extracted. In 2007, a policy framework was published to guide the government-wide M&E system 16, which included the need for frameworks for program performance information, statistical data quality and evaluation, and in the process sought to strengthen the links between the Presidency, the Treasury, and the national statistics agency. Policy frameworks were developed for these elements between 2007 and 2011.
14 Malaysian Government: www.malaysia.govt.my
15 Malaysia Evaluation Society: http://mes.org.my
16 Presidency, Office of. 2007. "Policy Framework for the Government-Wide Monitoring and Evaluation System." Pretoria.
The Ministry of Performance Monitoring and Evaluation was created in the presidency in 2009, and the Department of Performance M&E (DPME) in January 2010. "Improving Government Performance: Our Approach" 17 outlines the basis for the outcomes approach, including: a focus on a limited number of cross-government outcomes (which eventually became 12) so that efforts to promote change could be focused; moving government to an impact focus, rather than a focus on just conducting activities, which in many cases did not translate into impacts on citizens; a performance agreement to be signed by ministers, including high-level targets against these outcomes, which the president would monitor; the development of cross-government plans (delivery agreements) to deliver these outcome targets, with a results-based management structure and indicators and targets at the different levels; the use of existing coordination structures as "implementation forums" focused on achieving these outcomes; and regular monitoring and reporting to cabinet on progress against the delivery agreements.
The DPME has introduced a number of initiatives since its establishment, including a focus
on 12 government priority outcomes; the assessment of the quality of management
performance of national and provincial departments; a new system of monitoring front-line
services; a national evaluation system; and a municipal performance assessment tool. These
tools have contributed to a major increase in the availability of evidence for policy and
decision-making.
3.2. Findings of the Assessment:
This section of the report presents the major findings from the national assessment of the monitoring and evaluation system in Afghanistan. These findings are drawn from in-depth interviews, review of the systems and the relevant literature.
Overall, data were collected from 23 entities (governmental and non-governmental). The data show that 73% (n=11) of the governmental entities had a unit to undertake M&E-related tasks, while 27% (n=4) of the governmental and 12% (n=1) of the non-governmental entities did not have such a unit. Tables 1 and 2 below list the entities and the names of their M&E units.
17 2009. "Improving Government Performance: Our Approach." Pretoria, South Africa.

Table 1: Governmental Entities/Ministries with Presence of an M&E Unit

Name of Entity | M&E Unit Present | Name of the M&E Unit
IARCSC | Yes | M&E Directorate
IDLG | Yes | M&E Directorate
Kabul Municipality | No | NA
MAIL | Yes | M&E Directorate
MoICT | No | NA
MoD | No | NA
MoE | Yes | Research and Evaluation Unit
MoEc | Yes | General Directorate of Results-based Monitoring
MoF | Yes | RIMU M&E Unit
MoI | Yes | General Directorate of M&E
MoLSAMD | Yes | M&E Directorate
MoPH | Yes | EHIS General Directorate
MoUD | No | NA
MoWA | Yes | Policy Implementation Monitoring Unit
MoPW | Yes | Research and Evaluation Unit
Table 2: Non-Governmental Entities with Presence of an M&E Unit

Name of Entity | M&E Unit Present | Name of the M&E Unit
Care International | Yes | Program Quality Unit
DACAAR | Yes | Program Reporting and Monitoring Department
DFID | No | NA
GIZ/MEC | Yes | MEC
JICA | Yes | Evaluation Department
SCA | Yes | M&E Unit
UNICEF | Yes | Social Policy Planning and M&E Section
USAID | Yes | M&E Team
The data show that, on average, there are 19 posts within a governmental entity to carry out M&E functions (excluding MoI). About 47% of the M&E posts were vacant at the time of data collection, meaning that about half of the human resources needed to undertake M&E tasks within governmental entities were not available. The table below shows summary statistics of the available human resources within the governmental entities.
Table 3: Average M&E Posts Filled and Vacant in the Government Entities (MoI Excluded)

M&E Human Resource (Gov. Entities) | Mean | Std. Deviation | Min | Max
Total Number of Posts | 19 | 12.2 | 4 | 41
Number of Posts (in Tashkeel) | 17.4 | 12.6 | 0 | 41
Number of Posts (Supported by Donors) | 1.4 | 1.9 | 0 | 5
Number of Posts Filled | 10.2 | 8.9 | 0 | 29
Number of Posts Vacant | 7.9 | 7.0 | 0 | 22
However, when the data were analyzed including MoI, about 67% of the posts within the governmental entities were vacant at the time of the assessment: of the 149 M&E posts within the MoI, only one post was filled, while the remaining 148 were reported vacant.
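To make the effect of this outlier concrete, the short sketch below (in Python) recomputes the vacancy rate with and without MoI. Apart from MoI's figures (149 posts, 1 filled), which come from the assessment data, the per-entity numbers are hypothetical stand-ins, so the printed rates illustrate the mechanism rather than reproduce the report's 47% and 67%.

# Illustrative only: per-entity (total_posts, filled_posts) pairs.
# MoI's figures (149 posts, 1 filled) come from the assessment;
# the other entries are hypothetical stand-ins.
entities = {
    "MoI": (149, 1),
    "MoE": (18, 10),    # hypothetical
    "MoPH": (25, 15),   # hypothetical
    # ... remaining entities omitted for brevity
}

def vacancy_rate(data):
    total = sum(posts for posts, _ in data.values())
    filled = sum(filled for _, filled in data.values())
    return (total - filled) / total

with_moi = vacancy_rate(entities)
without_moi = vacancy_rate({k: v for k, v in entities.items() if k != "MoI"})
print(f"Vacancy rate incl. MoI: {with_moi:.0%}; excl. MoI: {without_moi:.0%}")

A single severely understaffed entity of MoI's size can shift the aggregate vacancy rate by roughly twenty percentage points, which is why both figures are reported here.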
It is evident from the data that within non-governmental entities there are on average about 14 posts to carry out M&E-related functions. The majority (90%) of these posts were filled, while only 10% were vacant at the time of the assessment.
Table 4: Average M&E Posts Filled and Vacant in the Non-governmental Entities

M&E Human Resource (Non-Gov. Entities) | Mean | Std. Deviation | Min | Max
Number of Posts | 13.8 | 8.3 | 6 | 30
Number of Posts Filled | 8.1 | 5.7 | 0 | 17
Number of Posts Vacant | 1.4 | 1.9 | 0 | 4
Data about the presence of M&E staff at the sub-national level (provincial and district levels) were also collected. The data show that, except for the Ministry of Education and the Ministry of Public Health, none of the governmental entities have staff at the sub-national level to undertake M&E tasks. National and sub-national M&E staff of governmental entities are mostly occupied with monitoring and/or evaluating output-level activities. The data from non-governmental entities show that, at the sub-national level, on average three employees are responsible for monitoring and evaluating project activities.
3.3. Performance Scores
As mentioned in the methodology section, ten of the twelve performance domains of the M&E system within each entity were assessed; this section presents the performance scores for those ten domains. First, the performance scores of governmental and non-governmental entities are presented as two separate groups; second, the performance score of each entity is presented individually.
A 4-point scale is used to assess the performance of each domain: 1 represents a "No" or zero performance; 2 represents less than 50% performance; 3 represents 75% performance; and 4 represents 100% performance.
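As a minimal sketch of how these scores aggregate (assuming simple unweighted averaging across the ten domains, which the entity-level figures in this report appear to reflect; the domain scores below are hypothetical), an entity's mean performance score can be computed as follows:

# A minimal sketch assuming unweighted averaging of the ten domain
# scores on the 4-point scale; the scores below are hypothetical.
DOMAINS = [
    "Organizational Structures", "M&E Human Capacity", "M&E Plans",
    "M&E Advocacy, Communications and Culture", "Routine Program Monitoring",
    "Surveys and Surveillance", "M&E Databases",
    "Supervision and Data Auditing", "Evaluation and Research",
    "Data Dissemination and Use",
]
hypothetical_scores = [3, 2, 2, 1, 3, 1, 2, 1, 2, 3]
scores = dict(zip(DOMAINS, hypothetical_scores))
mean_score = sum(scores.values()) / len(scores)
print(f"Mean performance score: {mean_score:.2f}")  # 2.00 for these inputs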
3.4. Governmental Entities' M&E System Performance
The figure below (figure 1) presents the mean performance scores of the M&E systems within governmental entities. It can be observed that MoPH has the highest performance score, while Kabul Municipality has the lowest. The score presented in the figure is the average score of each entity across the 10 performance domains: (1) Organizational Structures; (2) M&E Human Capacity; (3) M&E Plans; (4) M&E Advocacy, Communications and Culture; (5) Routine Program Monitoring; (6) Surveys and Surveillance; (7) M&E Databases; (8) Supervision and Data Auditing; (9) Evaluation and Research; (10) Data Dissemination and Use.
Figure 1: Mean Performance Scores of the M&E System (Governmental Entities)
Similar findings were reported by an assessment carried out by USAID in September 2014 in its report Rapid Assessment of GoIRA's M&E Systems. The report shows that MoPH, with a mean score of 2.70, was the highest performing governmental ministry, while MoEc and MAIL received mean scores of 2.2 and 1.2 respectively. The USAID report also shows that MoMP was the lowest performing ministry (mean performance score of 0.0), as "the assessment team discovered during interviews at MoMP and USAID that the MoMP has no monitoring system. Each performance domain, therefore, received a score of zero." The USAID report states, "there is little institutional monitoring capacity at MAIL, and data quality is often poor. The ministry has over 1,000 indicators, most of which are not well defined. MAIL collects data on 32 indicators that are mostly at the output level". The report also describes the performance of the M&E system of the MoEc: "Part of the MoEc's mandate is to monitor progress in meeting development priorities. The ministry has not performed this function to the fullest for several reasons. The Afghanistan National Development Strategy (ANDS) ended in 2013. The sheer number of expected outcomes (86) and development indicators (276) significantly impeded the MoEc's ability to monitor the performance of line ministries. Responsibility for monitoring national development outcomes and outputs is split between the MoEc and MoF, respectively, defying the need for a coherent national "results chain". Strained relations and poor coordination between the two parties further complicates the matter. While the MoEc is responsible for monitoring achievement toward outcomes, national programs focus on outputs that are tied to annual budget allocations. The MoEc Directorates responsible for monitoring are unable to influence monitoring and reporting practices of line ministries".
The average performance score of the 14 governmental entities (Kabul Municipality excluded) in each domain was calculated and is presented in figure 2 below:
Figure 2: Average M&E Performance Domains Score (Governmental Entities)
The highest performing domain (mean score = 2.59) of the M&E system is "organizational structure with M&E functions", which means that most governmental entities (73%) have a unit to carry out M&E functions and the units have a written mandate to execute their functions: 36% of the entities had a written and approved mandate, while 43% had a written mandate that was not approved by an authorized person; a small percentage (21%) of the entities were functioning without any written and approved mandate.
In most of the entities (73%), the tasks undertaken by the different sections of the M&E unit are clearly defined, while in 27% of the cases overlap or confusion about tasks existed. Within their current capacity, most of the governmental entities (71%) needed external support to undertake M&E functions; however, the support was rarely provided in a timely manner.
The lowest performing domain (mean score = 1.49) of the M&E system is "supervision and data auditing", which means that most of the entities (80%) did not have guidelines and tools for supervision, most of the entities (80%) did not have protocols for data auditing, and 71% of the entities did not share the results of data auditing with the data collection section/unit. Among all governmental entities, only MoPH had a proper system to audit its data, through a process called third-party monitoring. No other entity had a proper system for auditing the data fed into its M&E system: data generated from the field was simply assumed to be of adequate quality, stored in the system, and used for decision-making purposes.
The "Human Capacity for M&E" domain was the second lowest performing domain, as mechanisms for proper in-service capacity building (training needs assessment (TNA), a capacity building plan, a database for tracking trainees, an inventory of training providers, and capacity building through supervision or on-the-job training) were either non-existent or only partly present within the assessed entities. The pre-service curricula used in learning institutions (i.e. colleges, universities and/or technical schools) are not supportive in building M&E-related human capacity: 72.7% of entities reported that the curricula used in pre-service education do not include topics related to M&E, while in 27.3% of the cases the pre-service curricula build M&E capacity (these were the curricula used in private institutions). In some cases, subjects related to monitoring and evaluation have been dropped from the curriculum, as reported by one participant in the data validation workshop who also worked as a lecturer at a public university.
The human resources directorate within each entity, in collaboration with the relevant technical unit, directs the annual staff appraisal process, and capacity building plans are developed accordingly. However, the objectives used in the annual performance appraisal are not based on the objectives of the entity's national strategic plan, and proper plans for filling the identified capacity gaps are not developed or not properly implemented. In addition, the staff performance appraisal process (led by the HR unit) is not linked with the other M&E activities of the entity, i.e. performance monitoring is not linked with service delivery monitoring or routine monitoring of the entity.
The "M&E Databases" domain has a mean score of 1.96, which means that only 33% of the entities had databases for electronically capturing and storing data; in most cases, data are collected and stored manually. Of the entities with M&E databases, 43% had structures, mechanisms and procedures for transmitting, entering and extracting data from these databases. Written quality control mechanisms for accurate capturing of data were available in only 21% of the governmental entities.
Most M&E system activities are focused on tracking the "outputs" of a given entity; an inventory of mandate-level surveys (to track outcomes and/or impact) was therefore non-existent at 58% of the entities. Only 8% of the entities have a complete list of mandate-level surveys. The surveys are highly dependent on donor funding, and the entities were not able to carry out mandate-level surveys on their own. With the exception of MoPH, functional surveillance systems were non-existent among the entities.
The mean score for "Data Dissemination and Use" is 2.04. Nearly half (46%) of the governmental entities have a list of relevant stakeholders, including contact details, that can be used for sharing data with those stakeholders. Only 23% of the entities have plans in place for data use and dissemination; elsewhere, data are disseminated without a proper strategy. Only 33% of the entities provide online access to part of their M&E data, either through the entity website or through social media channels.
As mentioned above, M&E systems within governmental entities are mostly focused at the output level; the "Evaluation and Research" domain of the M&E system is therefore performing below 50%. Most entities (80%) do not have a team, committee or board responsible for coordinating and approving new research and evaluations. Only 23% of the entities have conducted a research study or evaluation of their mandate-level programs in the last two years.
"Routine Program Monitoring" data are mostly collected by the MIS unit, which in most entities is not well connected with the entity's M&E unit. In most governmental entities, the MIS unit collects data on routine program monitoring or routine service delivery. Except for MoPH and MoE, governmental entities do not collect data on their routine service delivery. For example, MAIL does not collect data on treatment, vaccination or awareness services provided to livestock keepers in given periods of time, and IDLG does not collect data on services provided by the district and provincial offices.
In about 43% of the entities, stakeholders use the information products generated by the M&E system in decision-making processes; in the remaining entities, information products are either not generated or not used. However, in 47% of the entities, M&E information is requested and utilized during planning, revision and/or costing processes. 53% of the governmental entities reported the existence of strong support for monitoring and evaluation within the entity.
About 46% of the governmental entities had an entity-level national strategic plan, while the others either did not have one (15.4%) or had an unapproved or outdated plan (38%). The national strategic plans were based on the ANDS and/or the MDGs, and at the time of the assessment neither was considered a valid strategic direction (the ANDS ended in 2013, while the MDGs have officially been replaced by the SDGs).
The national strategic plans were not accompanied by well-formulated M&E plans. Although 40% of the entities had an M&E plan, the other entities either did not have one or had only a partial plan.
Activity plans developed by the M&E units to conduct a certain number of data collection visits were considered M&E plans by these entities. Data were collected from the field based on donor/NGO requests or on a needs basis (i.e. emergency, outbreak, political pressure or complaint). Lack of resources (human, financial, transportation) was cited as the main constraint on implementing M&E plans; however, the team observed that the absence of a well-formulated M&E plan and the lack of coordination with other units were the main constraints on carrying out monitoring functions within the entity.
The M&E plans (i.e. M&E unit activity plans) were prepared in the first quarter of the year; they were not regularly updated and were treated as dead documents. As reported above, in most cases "output"-level data were collected; M&E units were not able to collect "outcome"- or "impact"-level data. Outcome and impact data are collected by a number of non-governmental agencies, with or without close collaboration and coordination with the M&E units of the governmental entities. Nationally representative surveys (i.e. NRVA, MICS, ADHS, AHS, Animal Census, etc.) that collect outcome- and impact-level data are funded by different donors and conducted by international research organizations with minimal involvement of the governmental entities' M&E units.
3.5. Non-Governmental Entities' M&E System Performance:
The graph below presents the mean performance scores across the 10 performance domains of the M&E systems within non-governmental entities. A comparison of the governmental and non-governmental entities shows that the M&E systems of non-governmental entities performed better than those of the governmental entities within the context of Afghanistan.
The comparatively better performance of M&E systems in non-governmental entities might be due to a number of reasons:
• Smaller size of the non-governmental entities
• Availability of more resources per person within the non-governmental entities
• Availability of better human capacity
• Years of experience in the development and implementation of M&E systems (M&E systems within governmental entities are a relatively new phenomenon), and
• Linkage of staff performance monitoring with the strategic direction of the entity or the goals of the project/program
However, the aim of this comparison is to document that tools and processes within a given performance domain are available and are used by non-governmental entities in the context of Afghanistan, and that these tools and processes can be adopted and implemented within governmental entities.
Figure 3: Average M&E Performance Domains Score (Non-Governmental Entities)
The mean score for "organizational structure with M&E functions" is 3.55, the highest among the 10 domains used during the assessment. Data for this domain show that 88% of the non-governmental entities assessed by the team have a unit to undertake M&E-related tasks, while 12% do not. All entities that had a unit to carry out M&E activities had a written and approved mandate. It was also found that M&E functions at the entity level are not well coordinated, because the M&E unit serves as a technical unit for tools development, project-based M&E systems are over-relied upon, and the M&E team reports to the project manager of the same project.
Non-governmental organizations scored second lowest (mean score 2.83) on the "human capacity for M&E" domain of the M&E assessment tool. Data for this domain show that there are on average 14 persons within a non-governmental entity to carry out M&E functions. About 10% of the M&E posts (compared with 47% in governmental entities) were vacant at the time of data collection. It was also found that a proper in-service capacity building system was only partly present within non-governmental entities.
Data for the "M&E plans" domain show that 86% of non-governmental entities (against 46% of governmental entities) had an entity-level strategic plan. In 83% of the entities, the M&E plan was developed in relation to the strategic plan. It was found that 40% of the non-governmental entities considered their M&E plan a dead document, not regularly updated and modified. 87% of non-governmental entities reported strong support for M&E at the entity level. It was also found that in 86% of the entities M&E information is utilized during planning, revision and/or costing processes.
Data related to "routine program monitoring" show that 88% of non-governmental entities had guidelines for routine program monitoring, mostly due to donor requirements. It was evident from the data that 86% of non-governmental entities had data quality assurance guidelines, while 14% lacked such guidelines for routine program monitoring. In 75% of cases data were checked and corrected before aggregation and further use.
80% of non-governmental entities had an inventory of mandate-level surveys to track outcomes and/or impact. It was also found that nationally representative surveys (i.e. NRVA, MICS, ADHS, AHS, Animal Census, etc.), which collect outcome- and impact-level data, are funded by different donors and conducted by international research organizations with minimal involvement of the M&E units of the governmental entities.
Non-governmental entities scored low (mean score 1.96) on the "M&E databases" domain of the M&E assessment tool. Data for this domain show that only 37% of non-governmental entities had functional databases for electronically capturing and storing M&E data; in 37% of entities such databases were partly present, while 26% did not have a database for M&E data storage. Of the entities that had databases, 28.6% had structures, mechanisms and procedures for transmitting, entering and extracting data. Written quality control mechanisms for accurate capturing of data were available in only 50% of non-governmental entities. It was also found that data stored in the databases are accessible only to specific people within the entity and were not, or only partly, accessible to the entity's stakeholders.
43% of the non-governmental entities that were part of the assessment did not have guidelines and tools for M&E supervision, i.e. regular checking of M&E processes in such a way that the supervisor offers suggestions for improvement. Most of the entities had data auditing protocols; however, in 15% of the cases feedback from data auditing was not shared with the data generators.
Data for the "evaluations and research" domain of the M&E tool show that 60% of non-governmental entities have a team, committee or board responsible for coordinating and approving new research and evaluations. During the last two years, 80% of non-governmental entities conducted a research study or evaluation of their programs. It was also found that program/project evaluations were conducted by research organizations with donor funding and minimal involvement of the governmental M&E units.
Most non-governmental entities (86%) keep a list of relevant stakeholders with contact details for information and data sharing. 75% of the entities have plans in place for data use and dissemination, while the remaining entities disseminate data without a proper strategy. 86% of the entities provide online access to part of their M&E data, either through their websites or through other social media channels.
The figure below (figure 4) compares how governmental and non-governmental entities scored in each domain.
Figure 4: Comparison of Governmental and Non-governmental entities
Non-governmental entities have robust processes and tools for "supervision and data auditing" which could be utilized by the governmental entities. For example, USAID's data quality audit tools could be used for quality assurance of data collected by governmental entities. Likewise, tools developed by USAID for the assessment and development of human capacity could be utilized by governmental entities to improve performance in the "human capacity for M&E" domain.
The tools and processes used by the different non-governmental entities are attached in the annex of this report.
3.6. M&E Performance by Entity
This section of the report provides details on the performance of each entity's monitoring and evaluation system, based on the 10 performance domains assessed during this assessment.
3.6.1. Ministry of Information and Communication Technology (MoICT)
The overall mean performance score of the MoICT M&E system is 1.6 (Figure 5), which means that the M&E system within the ministry is performing below 50% of its potential. During the in-depth interview it was found that the ministry recently decided to establish a separate directorate for monitoring and evaluation; the posts have not yet been approved or filled, and the scope of work and guidelines for this unit have not been developed or approved. MoICT does not currently have a monitoring and evaluation unit/directorate and has no mechanism for data collection, analysis and reporting to ensure that the entity-level mandate is assessed periodically. There is no database or list where stakeholder details are kept for use in data sharing.
Since there is no operational M&E unit at the ministry, the subsequent performance domains are non-existent. It was also found that MoICT does not have a current strategic plan; the former strategic plan expired in 2010. It was also reported that a single person from the planning department is wholly responsible for putting the strategic plan documents together.
A separate department, Statistics and Analysis, develops the reports that are shared with the AOP. These reports are mostly based on the one-page documents submitted by the provincial directorates.
3.6.2. Ministry of Economy
The overall mean performance score of the M&E system within the Ministry of Economy is 2.2, which means that the M&E system within the ministry is performing above 50% but below 75% of its potential.
The Ministry of Economy has a monitoring and evaluation directorate, the General Directorate of Policy and Results-based Monitoring, which reports to the Minister. Under it sits a second directorate named Results-based Monitoring. The General Directorate of Policy and Results-based Monitoring has 34 posts, while the M&E directorate has 13 posts, six of which are vacant.
During the data validation workshop it was reported that data collection happens passively: data are reported using the formats shared by the Ministry, and there is no active data verification mechanism in place to check data quality.

MoEc has a strategic plan, which is not approved; monitoring plans are also developed but are not regularly updated. The M&E directorate performs its functions based on a comprehensive manual for results-based monitoring.
3.6.3. Ministry of Urban Development
The average performance score of the M&E system within the Ministry of Urban Development is 1.1, which means that there are virtually no systems and mechanisms in place for monitoring and evaluation at this Ministry. Since there is no M&E directorate, the remaining performance domains are non-existent.
3.6.4. Ministry of Women Affairs
The average performance score of the M&E system within the Ministry of Women Affairs is 1.7, less than 50%, which means that the M&E system within this Ministry is barely functioning. The M&E unit within MoWA operates under the policy and planning directorate; it has four posts, one of which is vacant. There is no approved strategic plan for the Ministry of Women Affairs, the M&E activities are very fragmented, and the unit does not operate on the basis of standard operating procedures. M&E plans are developed based on activities and are at the output level; they are not regularly updated.

Human capacity for M&E within MoWA is very limited, in terms of expertise as well as numbers. Supervision and data auditing are not carried out, and there is no database to store data electronically.
3.6.5. Independent Administrative Reform and Civil Service Commission
The average performance score for the M&E system in the IARCSC is 2.5, above 50% but below 75%. The IARCSC has a passive monitoring and evaluation system in which data are in most cases reported through forms issued by the commission, without active involvement of the commission's M&E staff.

There are several gaps within the M&E system, including but not limited to the lack of routine program monitoring, supervision and data auditing. There are no procedures or guidelines for data collection, recording, collating and reporting at the commission. Entity-level guidelines and tools for supervision of M&E do not exist at all. Evaluation and research are also non-existent, and data dissemination and use is very limited. There are no guidelines that support data analysis, presentation and use within the entity. Nevertheless, a supportive environment for M&E is present within the commission.

The data collection tools used by the commission mostly collect qualitative data, which is often challenging to analyze and use for planning.
3.6.6. Ministry of Agriculture, Irrigation and Livestock
The average performance score for the M&E system within the Ministry of Agriculture, Irrigation and Livestock (MAIL) is 1.97, which reveals that the M&E system within the ministry is performing at less than 50% of its potential. Across the 10 performance domains, some exist to an extent, while others are non-existent.
Routine Program Monitoring, Surveys and Surveillance, Supervision and Data Auditing, and Evaluation and Research each scored 1.0, indicating that these performance domains are non-existent at the Ministry. Human Capacity for M&E, M&E Databases, and M&E Advocacy, Communications and Culture scored 1.88, 1.75 and 1.33 respectively, indicating that these domains of the M&E system in MAIL are performing only partly. However, Assessment of Organizational Structure with M&E Functions, Data Dissemination and Use, and M&E Plans scored 3.0, 3.0 and 2.20 respectively, reflecting that these domains are performing relatively better than the others.
MAIL has two separate units, MIS and Statistics and Marketing, which perform part of the M&E functions and are structured under the policy and planning directorate, while the M&E unit is independent and reports directly to the Minister. During the in-depth interview it was found that there is no formal coordination mechanism (i.e. committee, board, taskforce) to coordinate and align M&E functions within MAIL.

The M&E unit has a written mandate, which has not yet been approved. It was also found that MAIL does not have a current strategic plan, and an entity M&E plan was therefore non-existent. A functional animal disease surveillance system did not exist, and regular animal census data were not collected; the census was last conducted in 2003 by UNFAO.
3.6.7. Ministry of Public Health
The average performance score for the M&E system within the Ministry of Public Health is 2.86, revealing that the M&E system within the ministry is mostly performing higher than 50% but less than 75%. The majority of the M&E performance domains exist, while some are only partly functioning.
Most importantly, the Ministry of Public Health has a strategic plan, and M&E plans are developed according to it, though they are rarely updated. Moreover, MoPH is one of the few ministries with a sub-national monitoring and evaluation presence, and it has a relatively strong management information system.
M&E Advocacy, Communications and Culture, Routine Program Monitoring, Surveys and Surveillance, M&E Databases, Supervision and Data Auditing, Evaluation and Research, and Data Dissemination and Use scored 3.33, 3.75, 3.0, 3.25, 3.33, 3.0 and 3.33 respectively, scores that are relatively higher than those of other governmental entities and show that the performance of these domains is higher than 75% but less than 100%.

MoPH scored 1.63 on the human capacity domain, revealing that this domain exists only to some extent or is partly functioning, while Assessment of Organizational Structure with M&E Functions and M&E Plans scored 2.67 and 2.60 respectively.
3.6.8. Ministry of Education
The overall mean performance score of the M&E system within the Ministry of Education is 2.75, revealing that the M&E system within the ministry is performing partly to mostly, higher than 50% but less than 75%. Across the 10 performance domains, the majority exist, while the others are partly functioning.
It was reported during the in-depth interview that surveys are conducted regularly; however, a surveillance system is non-existent. Assessment of Organizational Structure with M&E Functions, M&E Plans, M&E Databases, Evaluation and Research, and Routine Program Monitoring scored 3.33, 3.25, 3.25, 3.2 and 3.0 respectively; the performance of these domains was higher than 75%.
At the time of the assessment, 8 out of 18 government posts in the M&E unit were vacant, i.e. 44% of the total posts. It was also reported that the previous strategic plan of MoE is outdated, while a new strategic plan is under development.

However, the performance of M&E Advocacy, Communications and Culture (2.67) and Supervision and Data Auditing (2.67) falls within the 50% to 75% range, while Human Capacity for M&E (2.0) and Data Dissemination and Use (2.0) are under 50%.
3.6.9. Independent Directorate of Local Governance
The average performance score for the M&E system within the Independent Directorate of Local Governance is 2.62, which shows that the M&E system within IDLG is performing higher than 50% but lower than 75%. Across the 10 performance domains, some are functioning partly or mostly, while others are barely functional or non-existent.
M&E Advocacy, Communications and Culture, M&E Plans, and Routine Program Monitoring scored 4.0, 3.60 and 3.25 respectively, while the Assessment of Organizational Structure with M&E Functions and Surveys and Surveillance domains scored 3.0. These high scores reflect that the corresponding performance domains are present at IDLG.

Some performance domains, such as M&E Databases (2.75), Data Dissemination and Use (2.33) and Evaluation and Research (2.0), performed higher than 50% but lower than 75%. On the other hand, Human Capacity for M&E (1.88) and Supervision and Data Auditing (1.67) are non-existent or only partly existent.
It was found during the in-depth interview process that IDLG does not have a current strategic plan; instead, 100-day plans are developed for the provinces, on the basis of which reports are received from the governors' offices. There is no data verification mechanism to check the validity and quality of the data reported by each province. In addition, data on routine service delivery (i.e. distribution of national identity cards, passport services, driving licenses, visits to the governor's office, services provided by the provincial directorates of each ministry) are non-existent.
Data collected by IDLG are often in qualitative format, which is far more challenging to analyze and use for future decision-making, as the M&E unit lacks human capacity. Other challenges reported by the M&E unit during the in-depth interview process were the increased frequency of reports required from the provincial offices, reports being provided in different formats to various ministries, and the lack of human capacity. It was also found that there is no centralized database where data regarding each province can be stored and accessed by stakeholders whenever needed; databases existed only at the entity level, where a limited number of individuals had access to them.
3.6.10. Ministry of Public Work
The M&E system within the Ministry of Public Work has an average performance score of 2.17 out of a possible 4, which shows that the M&E system within the ministry is performing higher than 50% but lower than 75%. Across the 10 performance domains used during the assessment, some exist, while others are non-existent.
Supervision and Data Auditing, M&E Plans, and Assessment of Organizational Structure with M&E Functions have the highest performance scores of 3.67, 3.60 and 3.33 respectively, while M&E Advocacy, Communications and Culture and Routine Program Monitoring have scores of 2.33 and 2.25 respectively, which are higher than 50% but less than 75%. However, Human Capacity for M&E, M&E Databases, Evaluation and Research, Surveys and Surveillance, and Data Dissemination and Use are non-existent at MoPW.
It was reported during the in-depth interview process that guidelines for routine monitoring do not exist, nor are routine monitoring data stored in an electronic database. Coordination between donor agencies and MoPW was reported as one of the main challenges: for example, in some cases donor agencies did not confirm with MoPW before constructing district- and village-level roads and informed the ministry only after completion of the project.
3.6.11. Ministry of Labor, Social Affairs, Martyrs and Disabled
The average performance score for the M&E system within MoLSAMD is 2.22, which means that the M&E system within the ministry is performing above 50% but below 75%.
The M&E directorate reports to the general directorate of policy and planning, which reports to the Minister. There are three sub-departments within the M&E directorate: the data analysis section, the projects implementation section and the action plans monitoring section. However, within the ministry there are two other units involved in monitoring and evaluating the ministry's mandate-level activities: the Labor Market Survey Directorate, reporting to the general directorate of Human Resources, and the statistics unit, reporting to the provincial liaison office. The M&E directorate, the Labor Market Survey Directorate and the statistics unit are not connected with one another through established mechanisms; each functions independently in its own sphere of influence. The HR directorate carries out performance monitoring of MoLSAMD staff without coordination or collaboration with the M&E unit, and staff performance monitoring is not connected with the strategic directions of the ministry.
The organizational structure of the M&E directorate comprised 23 posts at the time of the assessment; 11 posts were vacant, meaning that about 50% of the positions in the current organizational structure were unfilled.
The M&E staff have not received any M&E-related training since 2013, the year the M&E directorate was established. The M&E plan within the directorate is an annual action plan developed by the three sub-departments of the M&E directorate; it is not in line with the strategic plan of the Ministry. It was also found that the Ministry does not have an updated and approved strategic plan.
Even though strong support for M&E is present within the entity, the M&E directorate is not involved in outcome- and impact-level monitoring of the ministry's mandate-level activities, nor has it conducted any outcome- or impact-level studies in the past. Data within the unit are stored manually; electronic capturing of data is not practiced.
3.6.12. Ministry of Defense
The average performance score of the M&E system within the Ministry of Defense is 1.31, which is interpreted to mean that the M&E system's performance is minimal. The data show that an internal audit unit and an inspector general office are present within the ministry, responsible for audits that are mostly financial. There is no M&E unit to monitor the performance of the ministry's personnel or the activities carried out by its different departments. The ministry does not have a strategic plan to outline its strategic directions.
3.6.13. Kabul Municipality
The average performance score of the M&E system within Kabul Municipality is 1.05, which means that a functional M&E system is non-existent within the municipality. The municipality has construction control and internal audit sections, which partly monitor activities within the entity. The municipality has a partly developed strategic plan; however, an M&E system to monitor the entity-level strategic plan is non-existent. The municipality reported that it monitors all projects implemented by the entity; however, a proper system with well-defined tools for monitoring different projects is not in place.

The municipality also monitors projects that are contracted out; these are monitored through traditional techniques (i.e. surprise visits to the sites and use of observations). A well-defined mechanism for monitoring projects is not present.

The municipality has a design unit, which usually formulates plans for the city; an implementation unit, which is responsible for implementing those plans; and a building control unit, which monitors the quality of construction.
3.6.14. Ministry of Finance
The average performance score for the M&E system within MoF is 2.43. The Reform Implementation Management Unit (RIMU) carries out the M&E function within the MoF. RIMU is not part of the formal structure of MoF; it is funded by the World Bank for the implementation of reforms within the ministry. RIMU reports directly to the Deputy Minister of Administration, MoF, and also gathers, compiles and reports relevant data from the deputy ministries at MoF.
Data are mostly stored manually rather than electronically in a centralized database system; data reporting happens in specific, predefined formats.

The unit has not performed an evaluation of the programs implemented or funded by MoF, nor has it evaluated the financial policies of MoF. The MoF is said to have a strategic plan, which is monitored through an M&E plan; however, the assessment team was not able to obtain a copy of either the strategic plan or the M&E plan. The unit is mostly focused on output-level monitoring; it does not conduct outcome- or impact-level monitoring.
Within MoF, strong support is present for the establishment of an effective and efficient M&E system; however, over the years MoF has not been able to establish a monitoring and evaluation unit that is formally part of the MoF structure.

Staff performance is monitored on an annual basis using the system introduced by the civil service commission; nevertheless, performance monitoring is not linked to the strategic plan of MoF.
3.6.15. Ministry of Interior
The average performance score for the M&E system within the Ministry of Interior is 2.08, which reveals that the M&E system within MoI is performing partly, i.e. at about 50%. Across the 10 performance domains, some exist to an extent, while others are non-existent.
It is also important to note that the General Directorate for Monitoring and Evaluation within MoI is a newly established directorate, only 5-6 months old; a Tashkeel of 149 posts for this directorate has been approved but not yet filled. During the assessment it was found that only the director position was filled. In this short time, the M&E directorate has been able to develop certain documents (i.e. draft manuals, staff job descriptions, and M&E plans for six directorates of MoI) in order to streamline M&E activities.
Assessment of Organizational Structure with M&E Functions, M&E Plans, M&E Advocacy, Communications and Culture, and Routine Program Monitoring each scored 3.0.

Human Capacity for M&E, M&E Databases, Data Dissemination and Use, Surveys and Surveillance, Supervision and Data Auditing, and Evaluation and Research scored from 1 to 1.60 on average, indicating that these performance domains of the M&E system within MoI are non-existent or only partly performing.
4. Conclusions:
There is a striking difference between governmental and non-governmental entities across the performance domains assessed. Governmental entities have very fragmented monitoring and evaluation systems, while non-governmental entities have more established ones; possibilities therefore exist to transfer knowledge from non-governmental entities to governmental entities. Moreover, the M&E systems of some governmental entities are far stronger than those of other governmental entities, so knowledge transfer within government is also possible and could pave the way for improving M&E systems in those governmental entities that lack proper M&E systems and practices.

Monitoring and evaluation structures vary from entity to entity: different names are used for the monitoring units, and their reporting lines to superiors also differ. There is considerable scope to streamline these systems.
• It is evident that there are many challenges within the national M&E system in Afghanistan; this assessment revealed an overall average performance score of 2.0 for governmental entities, which means the national M&E system is only partly functioning and is very fragmented.
• Based on the 10 performance domains included in this assessment, the Organizational Structures with M&E Functions domain scored highest compared to the other domains, meaning that organizational structures for M&E mostly exist, while the subsequent domains are performing very low.
5. Recommendations
Based on the findings of the assessment, it is imperative to make some practical recommendations for the improvement of the national monitoring and evaluation system within governmental entities (i.e. at the national level). It is important to note that these recommendations should be directly implemented or taken into account in the development of a National M&E Policy for Afghanistan.
One of the key recommendations that can improve government effectiveness in general is the introduction of a strong national monitoring and evaluation policy that is evidence-based, practical and implementable over the short to long term. The recommendations below are intended to help develop an M&E policy that is more effective and more practical.
A number of practical and evidence-based recommendations have been made for the development and implementation of a National Monitoring and Evaluation Policy (NMEP) for the government of Afghanistan. These recommendations are based on the detailed assessment of M&E systems at different governmental and non-governmental entities, as well as an in-depth review of the relevant documentation and literature from Afghanistan and the region.
1. Assessment of Organizational Structure with M&E functions
• The assessment revealed that some of the governmental entities do not have functioning M&E units at all.
It is recommended that the government take steps to establish M&E units within all governmental entities. The existence of M&E units is crucial for timely tracking of progress and for improving service delivery, effectiveness and efficiency.
In addition, it is obvious that functioning M&E units in all governmental entities require a coordination mechanism (i.e. a taskforce, committee, special M&E working group or board) in order to streamline M&E activities at the national level.
• During the assessment it was found that almost half of the M&E posts within the governmental entities were vacant. Human resources play a vital role in effective results-based monitoring and evaluation of interventions, and a lack of human resources results in deficiencies in the overall M&E system.
Therefore it is highly recommended that qualified personnel with M&E expertise be recruited to fill the vacant M&E positions within governmental entities. However, recruitment should take place only on the basis of actual manpower requirements, which are laid down in, or can be identified from, the monitoring plans of the individual governmental entities.
• Inconsistency in the type, structure and nature of M&E functions was massive across the different governmental entities. In some cases M&E units report to the ministers, while in other entities they report to deputy ministers or to directorates; in fact, M&E structures in some entities are regarded as so unimportant that they are placed as vulnerable units within directorates, understaffed and highly dependent. Moreover, in some governmental entities, M&E units are embedded in the planning and policy directorates.
It is therefore recommended that M&E structures be streamlined within governmental entities, including changes to their reporting lines to the decision-making authorities within the entities.
• Interestingly, some governmental entities have different units that carry out M&E functions, but these units are not interconnected, so duplication of effort and lack of synergy tend to be very high. For example, in some ministries separate sections/units exist for MIS, Research and Evaluation, Statistics, and Monitoring and Evaluation, although all of these are closely related and should operate under one unit with stronger synergy.
It is therefore highly recommended that substantial changes be made to the structure of M&E directorates to bring all these different sections under one unit.
• Except for MoPH and MoE, there is no monitoring and evaluation happening at the sub-national level; M&E structures, personnel and M&E plans are non-existent there.
It is highly recommended that M&E be planned and established at the sub-national level, with structures, plans and actual data collection. Once the M&E system is established at the sub-national level, it will be far less cumbersome to feed data into the central monitoring and evaluation systems.
2. Human Capacity for M&E
• The assessment revealed that systematic training needs assessments for national and sub-national M&E staff are not carried out in most of the governmental entities; capacity building plans for M&E personnel do not exist in many of the governmental entities, nor are they connected with the entities' strategic plans. Human capacity plays a crucial role in the effective development and implementation of monitoring and evaluation systems.
It is therefore recommended that governmental entities conduct comprehensive and systematic training needs assessments and develop capacity building plans for their M&E units; the plans should be aligned with the capacity gaps of the employees. Capacity development for M&E staff can be coordinated and implemented by the civil service commission.
• It was found that there is almost no formal M&E education embedded in higher education curricula; some respondents cited that M&E is studied as a topic under management in private universities, but in public universities it is not included in the curricula at all.
It is highly recommended that M&E be integrated into the national curricula of certain fields of study, such as economics, business administration, engineering, social sciences, management and political science, as well as development studies. It is also recommended that specialized trainings be offered to national M&E staff through the civil service institute.
3. M&E Plans
• Almost half of the entities do not have a national strategic plan, or their strategic plan is unapproved or outdated. National strategic plans provide the basis for effective monitoring and evaluation; if strategic plans are not available, monitoring and evaluation cannot happen as intended.
Hence, it is recommended that governmental entities develop results-oriented national or thematic strategic plans, on the basis of which comprehensive monitoring and evaluation plans should be developed and implemented. It is evident that M&E plans are only as good as the strategic plans they rest on. Moreover, these planning processes should be worked out jointly by planners and M&E experts.
• It was noticed that entities with M&E plans rarely update them. M&E plans are meant for steering and managing the implementation of policies and interventions, yet in most of the governmental entities the M&E plan was treated as a dead document. Additionally, the M&E plans are focused on output-level results.
It is recommended that planners and implementers regularly update the M&E plans in steering meetings in order to take implementation decisions or re-planning initiatives.
4. M&E Advocacy, Communications and Culture
• The assessment revealed that many of the M&E directorates in the governmental entities do not get the necessary support within the entity for monitoring and evaluation. Data collection and reporting is basically everyone's responsibility at all levels of an entity. The M&E directorates or units are the hands and eyes of senior management, because they are responsible for collecting and compiling all information concerning the implementation of policies, programs and projects.
It is therefore recommended that M&E responsibilities be explicitly defined for key staff at the entity level, and that the entity's top management support M&E in obtaining verifiable information that is accurate, relevant and easy to understand from the different directorates. Enhanced coordination between the other directorates and the monitoring and evaluation unit/directorate is consequently a must.
• Monitoring and evaluation units in the governmental entities rarely produce useful reports (i.e. information products that could provide evidence for decision-making and planning processes, as well as for informing the public).
It is therefore recommended that M&E units within the entities produce reliable and comparable reports and information products so that effective, timely and evidence-based decisions can be made.
• In most of the governmental entities, M&E is perceived as a controlling mechanism, and staff members are afraid to provide data to M&E.
It is therefore recommended to initiate awareness sessions within the entities to sensitize the environment for data collection whenever needed by M&E. Monitoring and evaluation is the main vehicle for steering policies, programs and projects and can be used as a tool for knowledge building; it should therefore not be seen as a controlling mechanism.
5. Routine Program Monitoring
Over half of the governmental entities did not produce routine program monitoring reports based on their M&E plans. Routine program monitoring reports help assess policy or program implementation progress and show where one stands against pre-defined milestones.
It is recommended that routine program monitoring take place regularly through steering meetings, with the reporting frequency defined by each entity. The results of the steering meetings should be used for further decision-making and planning initiatives; they may even be made public in order to reinforce transparency. A minimal illustration of tracking progress against milestones is sketched below.
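As an illustration only (not part of the assessment findings), the following minimal Python sketch shows how reported values might be compared against pre-defined milestone targets ahead of a steering meeting. The indicator names, figures and the 90% threshold are assumptions invented for the example.

    # Hypothetical example: compare reported values against milestone targets.
    # Indicator names, figures and the 90% threshold are invented.
    indicators = {
        "schools_rehabilitated": {"target": 120, "actual": 78},
        "teachers_trained": {"target": 500, "actual": 455},
    }

    def progress_report(indicators):
        """Print each indicator's progress against its milestone target."""
        for name, values in indicators.items():
            pct = 100 * values["actual"] / values["target"]
            status = "on track" if pct >= 90 else "needs attention"
            print(f"{name}: {values['actual']}/{values['target']} ({pct:.0f}%) - {status}")

    progress_report(indicators)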
6. Surveys and Surveillance
Governmental entities do not actively conduct surveys to establish baselines or to assess progress. Conducting baseline assessments and periodic surveys would help the entities measure the impact of their programs and projects. Currently such surveys are conducted by donor agencies, although this falls within the government entities' responsibilities and scope of work.
It is therefore recommended that the capacity of government be strengthened to carry out such surveys in order to assess the impact of its interventions, and that trust be built with donors so that findings from government surveys are used for decision-making purposes. A minimal example of a baseline-to-follow-up comparison is sketched below.
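As a worked illustration under invented figures, the sketch below computes the change of a single indicator between a baseline survey and a follow-up survey; the values and variable names are hypothetical.

    # Hypothetical illustration: measure change between a baseline survey and
    # a follow-up survey for one indicator. Figures are invented.
    baseline_value = 42.0   # e.g. % of households with access to a service
    followup_value = 55.5   # same indicator measured in a later survey round

    absolute_change = followup_value - baseline_value
    relative_change = 100 * absolute_change / baseline_value

    print(f"Absolute change: {absolute_change:+.1f} percentage points")
    print(f"Relative change: {relative_change:+.1f}% against the baseline")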
7. M&E Database
Data storage, extraction, accessibility and reporting are crucial components of monitoring and evaluation systems; effective data management systems improve data consistency, accuracy, timeliness and quality. The findings show clear deficits in comprehensive MIS and data automation and management systems: many governmental entities still collect and store data manually, which makes the data hard to use for analysis and future decision-making. Physical security of data is also a major challenge within the governmental entities; for example, there is no functional filing system for document storage, or documents are lost to unforeseen causes such as fire or rain.
It is recommended that government entities introduce computerized databases and management information systems for M&E. These systems should be capable of capturing data at the national and sub-national levels; one possible layout for such an indicator store is sketched below.
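For illustration, the following minimal sketch shows one possible relational layout for an M&E indicator store capturing national and sub-national data, using Python's built-in sqlite3 module. All table and column names are assumptions made for the example, not a prescribed design.

    # Minimal sketch, not a specification: a possible relational layout for
    # an M&E indicator store. Names are assumptions for illustration.
    import sqlite3

    conn = sqlite3.connect("me_indicators.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS indicator (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL,            -- e.g. 'teachers_trained'
        unit TEXT NOT NULL             -- e.g. 'persons', 'percent'
    );
    CREATE TABLE IF NOT EXISTS observation (
        id           INTEGER PRIMARY KEY,
        indicator_id INTEGER NOT NULL REFERENCES indicator(id),
        province     TEXT,             -- NULL for national-level figures
        district     TEXT,             -- NULL for provincial/national figures
        period       TEXT NOT NULL,    -- e.g. '2016-Q1'
        value        REAL NOT NULL,
        reported_by  TEXT              -- data provider, useful for audit trails
    );
    """)
    conn.commit()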
8. Supervision and Data Auditing
Data collection is a complex activity, and errors are likely, especially when there are no guidelines or procedures to control, audit and assure the quality of data received from the field. Many of the assessed entities have no data auditing or data quality assurance guidelines.
It is therefore recommended that data quality assurance and supervision guidelines be developed and implemented. These guidelines could be stand-alone quality assurance guidelines or part of more comprehensive M&E guidelines or standard operating procedures. A few automatable checks of the kind such guidelines typically prescribe are sketched below.
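For illustration only, the sketch below implements three basic data quality checks (completeness, plausible range, duplicates) of the kind quality assurance guidelines commonly prescribe. The field names, sample records and range bound are assumptions invented for the example.

    # Hypothetical illustration of basic data quality checks. Field names,
    # sample records and the plausible-range bound are invented.
    records = [
        {"province": "Kabul", "period": "2016-Q1", "value": 120},
        {"province": "Herat", "period": "2016-Q1", "value": -5},   # out of range
        {"province": "Kabul", "period": "2016-Q1", "value": 120},  # duplicate
        {"province": "Balkh", "period": None,      "value": 88},   # incomplete
    ]

    def audit(records):
        """Return a list of (record index, problem) pairs found in the data."""
        problems, seen = [], set()
        for i, r in enumerate(records):
            if any(r[k] is None for k in ("province", "period", "value")):
                problems.append((i, "missing field"))
                continue
            if not (0 <= r["value"] <= 100000):   # assumed plausible range
                problems.append((i, "value out of range"))
            key = (r["province"], r["period"], r["value"])
            if key in seen:
                problems.append((i, "duplicate record"))
            seen.add(key)
        return problems

    for index, problem in audit(records):
        print(f"record {index}: {problem}")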
9. Evaluation and Research
Evaluation and research committees, boards or teams, whose main function would be to initiate, approve and manage research and evaluation assignments, do not exist in the governmental entities.
It is recommended that an entity level committee/board be established to act as an oversight group for research and evaluation. Members of the committee/board should not be involved in any way in evaluations within the same entity, and the committee/board will not conduct evaluations itself. It will be responsible for overseeing outsourced evaluation/research assignments; other responsibilities may include developing draft ToRs, reviewing research/evaluation protocols, tools and reports, and approving the final research/evaluation reports. Results of research and evaluations should be made public for reasons of transparency and accountability.
10. Data Dissemination and Use
The findings show that the culture of data-based decision-making is weak and that, in most cases, stored data is not available on an online platform.
It is therefore recommended that a policy of mandatory availability of data on an online platform be introduced, and that the culture of data-based decision-making be promoted.
Additional Recommendations
•
The assessment has made transparent that the Government of Afghanistan faces the challenge of harmonizing the many different practices of monitoring and evaluation detected.
It is therefore recommended, first, to formulate a "National Results-Based Monitoring and Evaluation (RBME) Policy Framework".
Secondly, as soon as the RBME Policy is approved, a program approach should be used for its implementation; the program plan itself then forms the basis for monitoring and evaluating the implementation of the policy.
In addition, it is recommended that the capacities of the national M&E working group be utilized in developing the program. Moreover, representatives of civil society (such as National Evaluation Societies and Voluntary Organizations for Professional Evaluation, VOPEs) should be part of the planning group as well.
Responsibility for the implementation of the RBME Policy should rest with the General Directorate of M&E and Audit of the Administrative Office of the President.
•
The findings also show that no entity or mechanism exists at the national level to coordinate, provide stewardship for and align M&E functions.
It is therefore recommended to establish a national level entity/mechanism for the coordination, steering and alignment of M&E functions (the General Directorate of M&E and Audit at AOP may play an effective role in this regard; alternatively, the national M&E working group may take over this function on behalf of AOP, while remaining responsible for reporting to AOP).
Besides, entity level coordination mechanisms (taskforces, committees) should be established and strengthened as well, so that M&E functions within the entities can perform better.
•
It is also evident that monitoring and evaluation consumes resources; however, the cost of M&E is significantly lower than the costs of inefficiency, corruption, lack of good governance and poor public services. To improve M&E systems nationwide, budget allocations need to be defined separately for human resources development, technology, routine data collection, research, evaluation, advocacy and communication.
It is therefore recommended that a sufficient budget for the monitoring and evaluation directorates be allocated in each entity.
•
As highlighted by EvalPartners and the International Organization for Cooperation in Evaluation (IOCE), National Evaluation Societies, Voluntary Organizations for Professional Evaluation (VOPEs) and M&E working groups play a vital role in advocating, promoting, standardizing and strengthening evaluation. They are in a position to disseminate international best practices and share regional knowledge among practitioners of evaluation. Moreover, such VOPEs and societies also work with governments, CSOs and the private sector to increase demand for the use of evaluation data.
It is therefore highly recommended that AOP pave the way for the required support to the national M&E working group, the Afghan Evaluation Society (AfES) and other active VOPEs in Afghanistan, value their recommendations and take advantage of their expertise in the implementation of the prospective National M&E Policy.
6. Annexes: Assessment Tool
National M&E Stakeholders’ Assessment
Assessment Tool / Instrument
Introduction
The M&E stakeholders' assessment aims to identify key M&E stakeholders and to assess their current M&E capacity, so that the findings can be used in the development of a National Monitoring & Evaluation Policy (NMEP) for the government of Afghanistan.
Part of the stakeholders' assessment is to identify M&E systems within selected governmental institutions, donor agencies, CSOs and UN agencies, in order to see how the M&E system is placed in these institutions, which performance domains are currently functioning, and how the system can be improved and strengthened by a prospective National Monitoring & Evaluation Policy.
This tool assesses the M&E system in the following performance domains: (1) Organizational Structures; (2) M&E Human Capacity; (3) M&E Plans; (4) M&E Advocacy, Communications and Culture; (5) Routine Program Monitoring; (6) Surveys and Surveillance; (7) M&E Databases; (8) Supervision and Data Auditing; (9) Evaluation and Research; and (10) Data Dissemination and Use.
For each of these ten performance domains, the focus is first on its availability or existence, then on its level of adequacy, its strengths and weaknesses, and finally on its practical use.
Informed Consent
This assessment is conducted for the General Directorate of M&E and Audit of the Administrative Office of the President (AOP). Its primary objective is an in-depth analysis of stakeholders and their M&E systems, in order to develop an evidence-based NMEP for Afghanistan that can be implemented effectively, based on the realities on the ground. The assessment is financially and technically supported by GIZ.
The assessment tool will be shared with the M&E unit, which can complete the questionnaire. The M&E unit will then be visited by a team of consultants to review the questionnaire, collect the documents mentioned in it, and conduct an in-depth interview with the M&E unit. The assessment will take around an hour; we will ask you questions on the 10 performance domains and will also ask you to provide the necessary documentation (where required) to support the responses given by you/your entity/ministry.
The final analysis will generalize the results, with specific highlights of certain M&E system strengths and weaknesses.
Your support in this process is highly appreciated; please let us know should you have any questions.
Thank you,
The Assessment team
Demographic/General Information

1. Name of the entity being assessed (please write the name of your ministry and your unit): ____
3. Name of person(s) completing the questionnaire or being interviewed: ____
4. Position / Title: ____
5. Contact Number: ____
6. Email Address: ____
7. Relevance of the position/person to the entity's M&E system
   Answers: Yes / No
8. Does the person represent the entity or the M&E system of the entity?
   Answers: Represents the entity / Represents the M&E system / Both
9. Do you want to be contacted for verification of the data/information provided?
   Answers: Yes / No
Assessment Tool rating/functioning criteria
A 5-point scale is used, as follows:
1 = Yes, completely (all (100%) requirements are met)
2 = Mostly (at least 75% of requirements are met)
3 = Partly (less than 50% of requirements are met)
4 = No-Not at all (no (0%) requirements are met)
5 = Not applicable (the requirement does not apply to the entity)
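For illustration only, the small Python helper below maps a percentage of requirements met onto this scale. Note that the criteria as written leave the band between 50% and 75% unassigned; mapping that band to "Partly" is an assumption of this sketch, as is handling rating 5 via a flag.

    # Hypothetical helper mapping 'percent of requirements met' to the tool's
    # 5-point scale. The 50-75% band is unspecified in the criteria; treating
    # it as 'Partly' (3) is an assumption of this sketch.
    def rate(pct_met, applicable=True):
        """Return the 1-5 rating for a requirement-completion percentage."""
        if not applicable:
            return 5        # Not applicable
        if pct_met >= 100:
            return 1        # Yes, completely
        if pct_met >= 75:
            return 2        # Mostly
        if pct_met > 0:
            return 3        # Partly (assumed to cover everything below 75%)
        return 4            # No-Not at all

    assert rate(100) == 1 and rate(80) == 2 and rate(40) == 3 and rate(0) == 4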
Assessment of Organizational Structure with M&E Functions

1. Is there an M&E unit within the entity?
   Answers: Yes / No
2. If yes, what is it called? (Name of the unit)
3. To whom does this unit/division/directorate report? (Please share the organogram of the M&E unit.)
4. How many posts (full time, part time) are there within the M&E unit/directorate at the national level?
   Answers: Total number of posts: ____; Posts in Tashkeel: ____; Posts supported by donors: ____; Posts filled: ____; Posts vacant: ____
5. How many posts (full time, part time) are there typically within the M&E unit/directorate at the provincial level?
   Answers: Total number of posts: ____; Posts in Tashkeel: ____; Posts supported by donors: ____; Posts filled: ____; Posts vacant: ____
6. How many posts (full time, part time) are there typically within the M&E unit/directorate at the district level?
   Answers: Total number of posts: ____; Posts in Tashkeel: ____; Posts supported by donors: ____; Posts filled: ____; Posts vacant: ____
7. M&E responsibilities are clearly defined in job descriptions?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the JDs for key M&E staff.
8. Is external support in M&E required for your entity to meet certain M&E needs and requirements?
   Answers: Yes / No
9. If yes, is the external need met on a timely basis?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
10. Does the entity have a written mandate, in the form of an SOP, manual, guideline or policy, to execute its M&E functions?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of it.
11. What is the name of the unit responsible for the provision of routine ____________ information?

Human Capacity for M&E

1. A training need assessment (TNA) has been conducted among national M&E staff during the last one (1) year?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the TNA report.
2. A training need assessment (TNA) has been conducted among sub-national (provincial and district) M&E staff during the last one (1) year?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, provide a copy of the TNA reports.
3. A capacity building plan is present for the national M&E staff?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, provide a copy of the capacity building plan.
4. A capacity building plan is present for the sub-national M&E staff?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, provide a copy of the capacity building plan.
5. Relevant national curricula build M&E human capacity relative to the M&E system in colleges, universities and/or technical schools?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, provide specific examples of courses taught in the universities which build M&E capacity.
6. Human capacity relative to M&E is being built through routine supervision and on-the-job training?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide reports of supervisory visits or on-the-job training reports from the last 6 months.
7. There is an entity level database or register of who is receiving M&E training, to avoid duplication and assure complementarity?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the database or register.
8. The entity has a database of trainers and other technical service providers capable of building M&E capacity?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the list of trainers or entities.
M&E Plan

1. A national strategic plan for ______________ is available? (Write the name of the ministry/entity.)
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the national strategic plan.
2. There is an entity level M&E plan available?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the national M&E plan.
3. After its development, the national M&E plan is updated regularly?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide the meeting minutes of the last 2 review meetings in which the plan was updated.
4. Section/unit specific M&E plans (i.e. for reporting, data collection, analysis, database, etc.) exist?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide the organogram of the unit.
5. Reports are prepared according to the M&E plan, and clearly reflect progress against a pre-defined set of performance targets?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, kindly provide copies of the last three reports.

M&E Advocacy, Communications and Culture

1. There are people who strongly advocate for and support M&E within the agency/entity?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
2. M&E system information products (reports, website content, newsletters, maps, tables, charts, etc.) are being generated and are useful?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, kindly provide a copy of reports, newsletters or charts.
3. The Minister, Deputy Ministers and General Directors request related information before and/or during reviews, planning and costing processes?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of such information provided to the Minister, Deputy Ministers and General Directors.

Routine Program Monitoring

1. Entity level guidelines exist that document the procedures for recording, collecting, collating and reporting program monitoring data?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, provide a copy of such guidelines.
2. Entity level guidelines exist that provide instructions on how data quality should be maintained?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, provide a copy of such guidelines.
3. Routine monitoring reports are prepared and disseminated, and these are based on the routine monitoring plans?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of such reports.
4. Officers responsible for receiving reports from lower levels systematically verify their completeness and timeliness and identify obvious mistakes before aggregating the data?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide any possible evidence of such practices.
Surveys and Surveillance

1. An inventory of the entity's mandate-level surveys exists and has been updated in the past 12 months?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide the list of surveys and the reports of the conducted surveys.
2. Every how many years are mandate-level surveys or assessments conducted that aim to assess the performance or measurement of the entity's mandate?
   Answers: _______________ years (Please provide mandate-level assessment reports or survey reports.)

M&E Databases

1. A database (or databases) for electronically capturing and storing data generated for/by the M&E system is available and functional?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a replica of the database, or data exported from the database in an Excel spreadsheet.
2. Structures, mechanisms, procedures and a timeframe for transmitting, entering, extracting and merging data from/to the database exist?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of such procedures.
3. IT equipment and supplies are available for maintaining the M&E database(s)?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide an inventory list of the IT equipment and supplies present at the M&E unit of the entity.
4. Quality control mechanisms are in place to ensure that data are accurately captured?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the QC guidelines.
Supervision and Data Auditing

1. Entity level guidelines and tools for supervision of M&E exist (standalone or as a chapter/module of more comprehensive supervision guidelines)?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the guidelines.
2. A protocol for data auditing or quality assurance exists?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the protocol.
3. Data auditing results have been recorded and feedback provided to units/sections or to those who collected the data?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of such a feedback report.

Evaluation and Research

1. An inventory (register/database) of research and evaluation institutions and their activities relevant to the entity exists and has been updated in the past 12 months?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of such an inventory.
2. A mandated entity level team/committee/board and procedures exist which are responsible for coordinating and approving new research and evaluations?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide the procedures, the ToR of the committee and the list of committee members.
3. The entity conducted a research/evaluation of its mandate level program(s) in the past 1-2 years?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please share the reports of the research or evaluations.
4. Procedures exist for the mandated team to coordinate and/or implement research and/or evaluations?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of such procedures.
5. A research and evaluation plan exists within the entity to direct research/evaluation assignments?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of the research plan.
Data Dissemination and Use

1. A list of the relevant stakeholders of the entity, along with their contact details (email, phone), is available and updated?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide the list of the stakeholders.
2. Information products are regularly disseminated to the data providers?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide documents showing how and when the information products are shared with data providers.
3. Information products are regularly sent to a wide variety of stakeholders other than the data providers?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide documents showing how and when the information was shared with other stakeholders.
4. There are clear guidelines and procedures to support the analysis, presentation and use of data at the entity level?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide a copy of such guidelines.
5. Stakeholders have access to the data/information products in the public domain (online) or through a data center?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, please provide the online link where the information is available.
6. Explicit arrangements/plans are in place for data use and dissemination?
   Answers: Yes, completely / Mostly / Partly / No-Not at all / Not Applicable
   If 1 or 2, kindly provide a copy of the data use and data dissemination plans.

Checklist of Required Documents
(For each document, mark Provided or Not Provided and add remarks.)

1. JDs of key M&E staff
2. Written M&E mandate (policy, manual, guideline, SOPs)
3. National M&E staff TNA report
4. Sub-national M&E staff TNA report
5. National M&E capacity building plan
6. Sub-national M&E capacity building plan
7. Copy of the higher education M&E related curricula relevant to the entity
8. Supervisory visit or on-the-job training reports
9. Copy of the trainee database showing staff trained in M&E from within the entity unit
10. List of capable M&E trainers or entities present in the market
11. Entity level national strategic plan
12. Entity level M&E plan
13. M&E plan review meeting minutes
14. Organogram of the M&E unit
15. Copies of the latest 3 M&E reports
16. Copies of M&E information products (reports, newsletters, charts, maps)
17. Copy of information provided to higher authorities
18. Copy of guidelines for program monitoring data
19. DQA guidelines
20. Documented data cleaning and validation procedures
21. List of surveys needed by the entity that should be conducted to assess the mandate of the entity
22. Replica of the M&E database
23. Procedures or guidelines for database management
24. Inventory list of M&E related IT equipment
25. QC guidelines for database and data management
26. Supervision guidelines for supervising M&E staff/activities
27. DQA feedback reports
28. Inventory of available research and evaluation institutions
29. Research/evaluation review board procedures, ToR and list of members
30. Reports of research conducted by the entity
31. Research and evaluation plan of the entity
32. List of the M&E stakeholders of the entity
33. Entity level dissemination plan for M&E findings
34. Data analysis, data presentation and data use guidelines
35. Online link to access M&E information