TERMS OF REFERENCE for the Mid-Term Review of the UNEP/UNDP/GEF project: Implementing Integrated Water Resource and Wastewater Management in Atlantic and Indian Ocean SIDS

I. PROJECT BACKGROUND AND OVERVIEW

1. Project General Information

Project summary
GEF SEC Project ID: 2706
Sub-programme: n/a
Expected Accomplishment(s): N/A
Project Type: Full-Size Project
Focal Area(s): International Waters
GEF OP # / GEF Strategic Priority/Objective: GEF-4 IW SP3 – Balancing overuse and conflicting uses of water resources in transboundary surface and groundwater basins
UNEP approval date: October 2010
GEF CEO Endorsement date: 28 December 2010
Expected start date: January 2011
Actual start date: UNEP: 20 May 2012; UNDP: the UNDP project became effective on 17 October 2012, when three out of six Government signatures were secured
Planned completion date: UNEP: May 2016; UNDP: October 2016
Actual completion date: –
Planned project budget at approval: US$ 49,707,535
GEF allocation: UNDP: US$ 4,500,000; UNEP: US$ 5,200,000; Total: US$ 9,700,000
PDF GEF cost: US$ 290,000
Expected FSP co-financing: US$ 39,422,535
PDF co-financing: US$ 295,000
First disbursement: UNEP: 23 July 2012; UNDP: July 2012
Date of last Steering Committee meeting: 2 July 2014
Date of financial closure: UNEP: 31 December 2016; UNDP: within 12 months after operational closure (operational closure: 17 October 2016)
Mid-term review/evaluation (planned date): December 2014
Mid-term review/evaluation (actual date): August – October 2015

2. Project Rationale

The geographic scope of this regional project covers the Indian and Atlantic Oceans, focusing on six Small Island Developing States (SIDS): two in the Atlantic Ocean (Cabo Verde and São Tomé & Principe) and four in the Indian Ocean (Comoros, Maldives, Mauritius and Seychelles). These SIDS share geographically similar features and fundamentally similar problems with regard to water management and conservation, land-based sources of pollution, and environmental flow issues relating to habitat and ecosystem protection. In acknowledgement of the vulnerability and particular needs of SIDS, the project on implementing integrated water resources and wastewater management in Atlantic and Indian Ocean SIDS (AIO IWRM) was formulated to address sustainable water management in the six participating SIDS. The project was designed to contribute to sustainable development in the Atlantic and Indian Ocean SIDS through improvements in water resource and environmental management. The project is consistent with the GEF IV strategic objective for International Waters, namely 'to play a catalytic role in addressing trans-boundary water concerns by assisting countries to utilize the full range of technical assistance, economic, financial, regulatory and institutional reforms that are needed'. It does so by supporting and building on existing political commitments and by promoting sustainable water use and improved water management now, making it easier to address future challenges as climatic variability and change place further pressure on water resources. The project will specifically contribute to the achievement of the MDG targets for water supply and sanitation as spelled out in the national sustainable development strategies, and in particular to the MDG target of setting in motion processes towards National IWRM Plans.
More specifically, the project delivers outcomes under GEF IV Strategic Program III (SP-3) by working with national governments and communities to address their needs for safe drinking water and the other socio-economic benefits of sustainable and safe water resources, including balancing environmental requirements with livelihood needs.

3. Project Objectives and Components

The Goal of the project is to 'contribute to sustainable development in the Indian and Atlantic Ocean Small Island Developing States through improvements in natural resource and environmental management'. The overall Objective is to 'accelerate progress on WSSD targets and IWRM and WUE plans and water supply and sanitation MDGs for the protection and utilization of groundwater and surface water in the participating countries'. This will be based on best practices and demonstrations of IWRM approaches. The project will deliver across a range of MDG targets using IWRM approaches (with particular focus on MDG 7: Ensure environmental sustainability) as the wider development entry point, and will help countries utilize the full range of technical, economic, financial, regulatory and institutional measures needed to operationalize sustainable development strategies for waters and their drainage basins (both surface water and groundwater).

The project consists of five components:
- Component C1 focuses on country-driven and country-designed demonstration activities for sustainable water management, using Ridge-to-Reef IWRM approaches to deliver significant environmental stress reduction benefits. The demonstration projects will act as catalysts for replicating and scaling up approaches to improve national water resources management and, regionally, to support the Atlantic and Indian Ocean SIDS in reducing land-based pollutants entering the ocean.
- Component C2 aims to develop an IWRM and WUE Regional and National Indicator Framework for improved national and regional sustainable development, using water as an entry point. It seeks to ensure that countries can monitor IWRM implementation at the national level based on improved collection of gender-disaggregated data and on indicator feedback and action.
- Component C3 focuses on policy, legislative and institutional reform for IWRM and WUE, supporting institutional change and re-alignment to enact National IWRM Plans and WUE strategies, including appropriate financing mechanisms, and supporting and building further political will to endorse IWRM policies and plans.
- Component C4 provides a capacity building and sustainability programme for IWRM and WUE, including knowledge exchange, learning and replication.
- Component C5 covers project management, responsible for the efficient implementation of the project.

Each component and its related outcomes and outputs are presented in Table 1 below.

Table 1: Expected outcomes and outputs from the logframe

Outcome 1: IWRM and WUE demonstrated through targeted on-the-ground project interventions.
- Outcome 1.1 (Cabo Verde): Protection of groundwater resources, stabilization of coastal terrains and promotion of productive activities in coastal areas through the integrated planning and management of wastewater collection, treatment and reuse, demonstrated in Tarrafal on the island of Santiago.
  1.1.1 Improved wastewater management systems, resulting in more treated wastewater as an alternative water source and reduced untreated wastewater discharge to the ground;
  1.1.2 Increased use of treated wastewater for irrigation, resulting in less groundwater abstraction and increased crop productivity by the communities;
  1.1.3 Awareness raised on water use efficiency for domestic use as well as for the tourism sector.
- Outcome 1.2 (Comoros): Improved water source protection through IWRM planning and management in Mutsamudu on the island of Anjouan.
  1.2.1 Water resource assessment and monitoring systems established;
  1.2.2 Water quality improved through solid waste management and water source protection;
  1.2.3 Reservoir protected from the effects of small-scale farming practices;
  1.2.4 Watershed management plan for Mutsamudu developed;
  1.2.5 Awareness raised on IWRM and catchment management and its contribution to the MDGs and gender empowerment.
- Outcome 1.3 (Maldives): Protection of the freshwater lens of Thoddoo Island from salinization and agro-chemical pollution, with improved drought-season aquifer yield.
  1.3.1 Sustainable and innovative groundwater extraction system (infiltration gallery system) established and operational;
  1.3.2 Agricultural practices improved, reducing groundwater contamination;
  1.3.3 Sustainable groundwater extraction systems accepted nationally as relevant to island potable water supply and planned for replication nationally;
  1.3.4 Groundwater quality monitoring system established and operational, run by gender-balanced trained locals.
- Outcome 1.4 (Mauritius): The protection and sustainable utilization of the Northern Aquifer of Mauritius demonstrated through the integrated planning and management of wastewater collection, treatment and reuse.
  1.4.1 Water resources assessment conducted to determine and monitor the safe yield and water quality of the aquifer;
  1.4.2 Improved protection of groundwater and lagoon water quality through improved wastewater treatment and management systems;
  1.4.3 Reduced stress on the aquifer through improved water demand management and dissemination of best practices, aiming for replication at the national level and beyond;
  1.4.4 Capacity strengthened and awareness raised among government, the private sector and civil society for aquifer protection against over-extraction and contamination, with special focus on climate change and gender empowerment.
- Outcome 1.5 (São Tomé & Principe): Integrated river basin management plan for the Rio Provaz Basin developed to enable equitable water resources allocation and protection, contributing to sustainable economic development, public health and environmental protection.
  1.5.1 Quality and quantity of water resources in the Rio Provaz basin assessed;
  1.5.2 Institutional capacity (cross-sectoral coordination) strengthened and decentralized (municipal) water management fostered through the development and implementation of a Basin Water Resources Allocation and Protection Strategy;
  1.5.3 Water pollution reduced through improved wastewater treatment systems (piloting ECOSAN wastewater management), solid waste collection and disposal, and residential sanitation in poor communities;
  1.5.4 Awareness raised of IWRM at the basin level to strengthen community participation in IWRM and to ensure sustainability.
- Outcome 1.6 (Seychelles): Protection of a coastal gravel aquifer through integrated land and water management measures (water demand management, land use, flood management), demonstrated on the island of La Digue.
  1.6.1 Water abstraction reduced through water demand management measures, including rainwater harvesting, household water tanks to reduce peak water demand, wastewater reuse, improved metering and tariff reform;
  1.6.2 Groundwater availability and quality improved through improved septic tank management, wastewater collection and treatment, prevention of seawater intrusion, solid waste collection, and groundwater recharge.

Outcome 2: IWRM and WUE indicators, baselines, targets and monitoring protocols discussed, agreed and adopted into long-term monitoring programmes at national and 'regional' levels.
  2.1 Inventories of national monitoring practices related to IWRM, WUE and the environment;
  2.2 Indicator Framework including process, stress reduction, environmental and socio-economic status, WUE, catalytic, governance and cross-cutting indicators; gender-disaggregated data and participatory monitoring protocols agreed nationally and 'regionally';
  2.3 Baselines and targets established at national and 'regional' levels for the Indicator Framework;
  2.4 Indicator Framework and monitoring protocols tested and in use at demonstration sites and at national and 'regional' levels;
  2.5 Institutional capacity for monitoring strengthened.

Outcome 3: SIDS employ new plans, policy tools and approaches in implementing IWRM commitments.
  3.1 SIDS IWRM Diagnostic Analyses strengthened and IWRM Road Maps developed;
  3.2 National IWRM plans and WUE strategies developed and endorsed, with attention to sustainability, financial mechanisms and replication strategies for demonstration projects;
  3.3 Functioning IWRM partnerships established or strengthened within SIDS at national and other levels (e.g. national inter-sectoral committees, apex bodies, catchment committees, water user groups as relevant) and among SIDS.

Outcome 4: Strengthened capacity allows stakeholders and institutions in SIDS to fulfil their role in local, national and regional IWRM processes and exchange best practices.
  4.1 Awareness created on roles and responsibilities in IWRM across governments, civil society, education systems and the private sector;
  4.2 Targeted trainings and a communications platform strengthen stakeholder groups' capacity to fulfil their mandate in IWRM, including apex bodies and water champions (men and women);
  4.3 Twinning or exchange programmes promote learning and the transfer of experience in support of IWRM implementation;
  4.4 Replicable practices from demonstration projects and national IWRM processes identified and promoted.

Outcome 5: Project implemented effectively and efficiently to the satisfaction of partners.
  5.1 Capable human resources and efficient systems support project implementation;
  5.2 Monitoring, consultation and advisory mechanisms support project implementation.

4. Executing Arrangements

The project is implemented by UNEP and UNDP and executed by UNOPS, involving two UNOPS business units. UNDP is implementing the targeted IWRM demonstration projects in all six countries under Component 1 (C1) through the UNOPS Water and Energy Cluster (UNOPS WEC), based in Copenhagen, Denmark. UNEP is implementing the IWRM Regional and National Indicator Framework, Component 2 (C2); policy, legislative and water sector reforms, Component 3 (C3); and regional and national awareness raising, capacity building and networking, Component 4 (C4). The UNEP components are implemented through the UNOPS East Africa Hub (UNOPS EAH), formerly the Kenya Operational Hub (KEOH), based in Nairobi, Kenya. UNEP and UNDP are jointly responsible for Project Management and Coordination, Component 5 (C5).

Within UNEP, the Division of Environmental Policy Implementation (DEPI) is responsible for project implementation under the direct oversight of the UNEP/GEF Task Manager, Africa. Within UNDP, the Mauritius Country Office has delegated authority to oversee the implementation of the UNDP component, with technical and oversight support provided by the Regional Technical Advisor for Water and Ocean Governance. The GEF Implementing Agencies, UNEP and UNDP, provide project oversight to ensure that the project meets its goals. The Executing Agency, UNOPS, ensures that the project is implemented within the available resources and assumes the overall fiduciary responsibility for the project budget and expenditure.

The Regional Project Coordination Unit (RPCU) is constituted by UNOPS (both the East Africa Hub (EAH) and the Water and Energy Cluster (WEC)). UNOPS, through the RPCU, provides day-to-day operational support and management. The RPCU is based in Nairobi, Kenya, hosted by the UNEP Division of Environmental Policy Implementation (DEPI).

The Regional Project Steering Committee (RPSC) provides strategic guidance and direction to the project. It comprises one representative (the project's national focal point) from each of the six participating countries and representatives from UNEP, UNDP and UNOPS. The RPSC reviews progress, provides strategic guidance to the project and the RPCU, and approves annual work plans and budgets. The diagram below (Fig. 1) illustrates the project's structures and arrangements.
[Fig. 1: Project Organizational Chart – diagram linking the Regional Steering Committee; the six countries and their National IWRM Focal Points; UNEP (working through UNOPS/EAH) and UNDP (working through UNOPS/WEC); the Regional Project Coordination Unit, providing regional technical assistance for governance coordination (assistants to the IWRM Focal Points) and for the demonstration projects (Demonstration Project Managers); the National Steering Committees and the six country governance structures; and the national IWRM governance activities, national IWRM demonstration projects and national integrated water resources management stakeholders.]

5. Project Cost and Financing

The total project cost at the time of approval was US$ 49,122,535, with the GEF allocation accounting for US$ 9,700,000 (20%) and co-financing for US$ 39,422,535 (80%); adding the PDF preparation costs of US$ 290,000 (GEF) and US$ 295,000 (co-financing) gives the US$ 49,707,535 planned project budget quoted in the project summary. Table 2 below shows the breakdown per component and funding source at CEO endorsement.

Table 2: Approved Project Cost per Component and Funding Source
- C1 – IWRM and WUE demonstrated through targeted on-the-ground project interventions (implemented by UNDP/UNOPS WEC): GEF allocation US$ 4,320,000 (11%); co-financing US$ 33,872,881 (89%); total US$ 38,192,881.
- C2 – IWRM and WUE indicators, baselines, targets and monitoring protocols discussed, agreed and adopted into long-term monitoring programmes at national and 'regional' levels (implemented by UNEP/UNOPS KEOH): GEF allocation US$ 657,300 (59%); co-financing US$ 450,000 (41%); total US$ 1,107,300.
- C3 – SIDS employ new plans, policy tools and approaches in implementing IWRM commitments (implemented by UNEP/UNOPS EAH): GEF allocation US$ 1,556,300 (75%); co-financing US$ 530,000 (25%); total US$ 2,086,300.
- C4 – Strengthened capacity allows stakeholders and institutions in SIDS to fulfil their role in local, national and regional IWRM processes and exchange best practices (implemented by UNEP/UNOPS EAH): GEF allocation US$ 2,206,400 (82%); co-financing US$ 483,500 (18%); total US$ 2,689,900.
- C5 – Project implemented effectively and efficiently to the satisfaction of partners (implemented by UNEP/UNOPS EAH and UNDP/UNOPS WEC): GEF allocation US$ 960,000 (UNEP/EAH: 780,000; UNDP/WEC: 180,000) (19%); co-financing US$ 4,086,154 (81%); total US$ 5,046,154.
- Total: GEF allocation US$ 9,700,000 (20%); co-financing US$ 39,422,535 (80%); total US$ 49,122,535.

6. Implementation Issues

The logframe for the overall project has not been revised since the start of the project, and no budget revisions have been necessary. The logframe for each demonstration project under Component 1 has been reviewed, and the necessary adjustments have been proposed and approved by the project steering committee. The project duration is 48 months, starting on 16 May 2012 (UNEP) and October 2012 (UNDP), with completion expected by May 2016 (UNEP) and October 2016 (UNDP), respectively. The start of the UNEP-implemented components was delayed due to changes in the Executing Agency (EA) arrangements (originally the UNEP DEPI Marine and Coastal Division was to be the project's EA). The UNDP-implemented pilot demonstration activities are scheduled to close in June 2016, with operational closure of Component 1 scheduled for October 2016. Considering delays in project delivery under Components 2–4, a review of the project implementation plan for those components was carried out at the beginning of 2015 by the current UNEP Task Manager, with proposals for accelerated development of the outputs of the UNEP components. Challenges faced to date in the implementation of Components 2 to 4 of this project have resulted in less than 20% delivery of outputs and a considerable loss of momentum. Besides logistical difficulties (frequent changes of project managers, incomplete hiring processes for in-country IWRM experts to assist focal points nationally, and a change in UNEP Task Manager), there has been a general delay in the implementation process.
The risks and the associated mitigation plan presented in the 2014 PIR are shown in Table 3 below.

Table 3: Risk mitigation plan from the 2014 PIR (Rank: importance of risk; Risk statement: potential problem, i.e. condition and consequence; Action to take: action planned/taken to handle the risk; Who: person(s) responsible for the action; Date: date by which the action needs to be or was completed)
- Rank 1. Risk statement: delayed recruitment of GCAs; consequence: Components 2 and 3 seriously delayed. Action to take: post vacancy announcements (VAs), where necessary, for recruitment. Who: PM. Date: mid September 2014.
- Rank 2. Risk statement: delayed recruitment of the communication expert; consequence: various project components delayed. Action to take: combine the VA with web-editor responsibilities. Who: PM. Date: mid September 2014.
- Rank 3. Risk statement: delayed training of GCAs; consequence: Indicator Framework delayed, as well as governance-related activities. Action to take: recruitment of consultants or an institution to conduct training. Who: PM. Date: Q4.
FY13 rating: Medium. FY14 rating: Medium. Comments/narrative justifying the current FY rating and any changes (positive or negative) in the rating since the previous reporting period: the delays in recruitment and the extended inception period have meant that there are risks associated with delivery within the expected timeframe. Efforts must be taken to redress the delays.

Risk 1 shown in the table had been mitigated at the time of preparation of the current document: one GCA was recruited in São Tomé, and at a later stage the title of the position was changed to IPSA (IWRM Policy Support Analyst). Four of the remaining countries had advertised the position as of 30 April 2015.

II. TERMS OF REFERENCE FOR THE EVALUATION

1. Objective and Scope of the Evaluation

In line with the implementing agencies' guidelines (the UNEP Evaluation Policy [1], the UNEP Programme Manual [2] and the UNDP Midterm Review guidelines [3]), the Mid-Term Review (MTR) of the project "Implementing Integrated Water Resources and Wastewater Management in Atlantic and Indian Ocean SIDS" is undertaken approximately halfway through project implementation to analyse whether the project is on track, what problems or challenges the project is encountering, and what corrective actions are required. The MTR will assess project performance to date (in terms of relevance, effectiveness and efficiency) and determine the likelihood of the project achieving its intended outcomes and impacts, including their sustainability. The evaluation has two primary purposes: (i) to provide evidence of results to meet accountability requirements, and (ii) to promote operational improvement, learning and knowledge sharing through results and lessons learned among UNEP, UNDP, UNOPS and Government partners. The evaluation will therefore identify lessons of operational relevance for project implementation and future project formulation. It will focus on the following sets of key questions, based on the project's intended outcomes, which may be expanded by the consultants as deemed appropriate:
a. Quality of project design;
b. Achievement of main objectives and effectiveness of the programme;
c. Efficiency of the implementation;
d. Sustainability of the effects;
e. Key cross-cutting issues;
f. Co-ordination, complementarity and coherence.

[1] http://www.unep.org/eou/StandardsPolicyandPractices/UNEPEvaluationPolicy/tabid/3050/language/en-US/Default.aspx
[2] http://www.unep.org/QAS/Documents/UNEP_Programme_Manual_May_2013.pdf
[3] http://web.undp.org/evaluation/documents/guidance/GEF/mid-term/Guidance_Midterm%20Review%20_EN_2014.pdf

2. Overall Approach and Methods

The Mid-Term Review of the project will be conducted by external and independent consultants recruited and contracted by UNOPS. The MTR will be an in-depth review/evaluation using a participatory approach whereby key stakeholders are kept informed and consulted throughout the evaluation process. Both quantitative and qualitative evaluation methods will be used to determine project achievements against the expected outputs, outcomes and impacts. It is highly recommended that the consultant(s) maintain close communication with the project team and promote information exchange throughout the evaluation process in order to increase their (and other stakeholders') ownership of the review findings. The MTR must provide evidence-based information that is credible, reliable and useful.

The MTR will review all relevant sources of information, including documents prepared during the preparation phase (e.g. the PIF, the Project Document, project reports including quarterly progress reports, Annual Project Review/PIRs, project budget revisions and lessons learned reports), national strategic and legal documents, and any other materials that the evaluator considers useful for this evidence-based review. The evaluator will review the baseline GEF focal area Tracking Tool submitted to the GEF at CEO endorsement and the midterm GEF focal area Tracking Tool, which will be completed before the MTR field mission begins. The evaluator is expected to follow a collaborative and participatory approach, ensuring close engagement with the implementing and executing agencies, the Project Team and government counterparts (including the GEF Operational Focal Point). Stakeholder involvement should include interviews with stakeholders. The findings of the evaluation will be based on the following:

(a) A desk review (see also Annex 4) of:
1. Relevant background documentation, inter alia the Diagnostic Analysis report and the Hotspot Analysis reports;
2. Project design documents (including minutes of the project design review meeting at approval);
3. Annual Work Plans and Budgets or equivalent, revisions to the project (Project Document Supplement), and the logical framework and its budget;
4. Engagement Agreement documents, partner institutions' capacity assessment reports, and Project Cooperation Agreements (with countries, for demonstration project implementation);
5. Project reports, including inception phase reports, the inception workshop report, quarterly progress reports, annual progress reports (Project Implementation Reports, PIRs), financial reports, consultants' reports, progress reports from collaborating partners, project coordination minutes, National Steering Committee minutes, Regional Steering Committee minutes, relevant correspondence, mission TORs and reports, and monitoring reports;
6. Project outputs: consultancy reports, etc.;
7. Evaluations/reviews of similar projects;
8. The implementation review plan (produced for the 5 June 2015 meeting with countries in Nairobi, Kenya).

(b) Interviews (individual or in group) with:
1. UNEP Task Manager;
2. UNDP Regional Technical Adviser;
3. Regional Project Coordination Unit;
4. UNEP Fund Management Officer;
5. UNDP Country Offices in Cabo Verde, Comoros, Maldives, Mauritius, São Tomé & Principe and Seychelles;
6. UNOPS East Africa Hub;
7. UNOPS Water and Energy Cluster;
8. Participating beneficiary Government teams and partners in Cabo Verde, Comoros, Maldives, Mauritius, São Tomé & Principe and Seychelles;
9. National Steering Committees;
10. Regional Steering Committee;
11. Relevant UNEP Sub-programme Coordinators;
12. Relevant resource persons.

(c) Field visits: to beneficiary countries, and participation in the Regional Steering Committee meeting.

(d) Other data collection tools.

The final MTR report should describe the full MTR approach taken and the rationale for the approach, making explicit the underlying assumptions, challenges, strengths and weaknesses of the methods and approach of the review.

3. Key Evaluation Principles

Evaluation findings and judgements should be based on sound evidence and analysis, clearly documented in the evaluation report. Information will be triangulated (i.e. verified from different sources) to the extent possible, and where verification is not possible, the single source will be mentioned. The analysis leading to evaluative judgements should always be clearly spelled out. The evaluation will assess the project with respect to a minimum set of evaluation criteria grouped in six categories: (1) strategic relevance; (2) attainment of objectives and planned results, which comprises the assessment of outputs achieved, effectiveness and likelihood of impact; (3) sustainability and replication; (4) efficiency; (5) factors and processes affecting project performance, including preparation and readiness, implementation and management, stakeholder participation and public awareness, country ownership and drivenness, financial planning and management, UNEP supervision and backstopping, and project monitoring and evaluation; and (6) complementarity with the UNEP and UNDP strategies and programmes. The evaluation consultants may propose other evaluation criteria as deemed appropriate.

Ratings. All evaluation criteria will be rated on a six-point scale. Annex 2 provides guidance on how the different criteria should be rated and how ratings should be aggregated for the different evaluation criterion categories.

Baselines and counterfactuals. In attempting to attribute any outcomes and impacts to the project intervention, the evaluators should consider the difference between what has happened with, and what would have happened without, the project. This implies that there should be consideration of the baseline conditions, trends and counterfactuals in relation to the intended project outcomes and impacts. It also means that there should be plausible evidence to attribute such outcomes and impacts to the actions of the project. Sometimes, adequate information on baseline conditions, trends or counterfactuals is lacking. In such cases this should be clearly highlighted by the evaluators, along with any simplifying assumptions that were made to enable the evaluator to make informed judgements about project performance.

The "Why?" question. As this is a Mid-Term Evaluation, particular attention should be given to identifying implementation challenges and risks to achieving the expected project objectives and sustainability. Therefore, the "Why?" question should be at the front of the consultants' minds all through the evaluation exercise.
This means that the consultants need to go beyond the assessment of "what" the project performance was, and make a serious effort to provide a deeper understanding of "why" the performance was as it was, i.e. of the processes affecting the attainment of project results (the factors and processes affecting project performance under category (5) above). This should provide the basis for the lessons that can be drawn from the project. In fact, the usefulness of the evaluation will be determined to a large extent by the capacity of the consultants to explain "why things happened" as they happened and are likely to evolve in this or that direction, which goes well beyond the mere review of "where things stand" at the time of evaluation. A key aim of the evaluation is to encourage reflection and learning by project staff and key project stakeholders. The consultant should consider how reflection and learning can be promoted, both through the evaluation process and in the communication of evaluation findings and key lessons.

Communicating evaluation results. Once the consultant(s) have obtained evaluation findings, lessons and results, the Evaluation Office will share the findings and lessons with the key stakeholders. Evaluation results should be communicated to the key stakeholders in a brief and concise manner that encapsulates the evaluation exercise in its entirety. There may, however, be several intended audiences, each with different interests and preferences regarding the report. It is therefore planned that the consultant will participate in the Regional Steering Committee to interact and provide feedback. If considered necessary, a webinar or conference calls with relevant stakeholders may be planned.

4. Evaluation Criteria

i. Strategic Relevance
The evaluation will assess, in retrospect, whether the project's objectives and implementation strategies were consistent with global, regional and national environmental issues and needs. It will assess whether the project was in line with the GEF International Waters focal area's strategic priorities and operational programme(s). The evaluation will also assess the project's relevance in relation to UNEP's and UNDP's mandates and its alignment with their respective policies and strategies at the time of project approval. The magnitude and extent of any contributions and the causal linkages should be fully described.

The MTR will assess the following four categories of project progress.

5. Project Strategy

ii. Project Design
Review the project design, specifically:
- Review the problem addressed by the project and the underlying assumptions. Review the effect of any incorrect assumptions or changes to the context on achieving the project results as outlined in the Project Document.
- Review the relevance of the project strategy and assess whether it provides the most effective route towards the expected/intended results. Were lessons from other relevant projects properly incorporated into the project design?
- Review how the project addresses country priorities.
- Review country ownership. Was the project concept in line with the national sector development priorities and plans of the country (or of the participating countries, in the case of multi-country projects)? Were the executing agencies' roles well chosen and developed?
- Review decision-making processes: were the perspectives of those who would be affected by project decisions, those who could affect the outcomes, and those who could contribute information or other resources to the process taken into account during the project design process?
- Review the extent to which relevant gender issues were raised in the project design (see Annex 9 of the Guidance for Conducting Midterm Reviews of UNDP-Supported, GEF-Financed Projects for further guidelines).
- If there are major areas of concern, recommend areas for improvement.

Analyse the Results Framework/Logframe, specifically:
- Undertake a critical analysis of the project's logframe indicators and targets, assess how "SMART" the midterm and end-of-project targets are (Specific, Measurable, Attainable, Relevant, Time-bound), and suggest specific amendments/revisions to the targets and indicators as necessary.
- Are the project's objectives and outcomes or components clear, practical and feasible within its time frame?
- Examine whether progress so far has led to, or could in the future catalyse, beneficial development effects (e.g. income generation, gender equality and women's empowerment, improved governance, etc.) that should be included in the project results framework and monitored on an annual basis.
- Ensure that the broader development and gender aspects of the project are being monitored effectively. Develop and recommend SMART 'development' indicators, including sex-disaggregated indicators and indicators that capture development benefits.

iii. Progress towards Results
The project implementation (particularly that of the UNEP components) is heavily delayed due to a number of factors:
a. A project implementation review for accelerated project delivery and increased synergies with the UNDP component was discussed informally with countries on 5 June 2015 at UNEP in Nairobi. The following is a summary of the challenges and remediation actions listed in the project implementation review (see the Annex for further details).
b. In its fourth year of operation (since May 2012), the SIDS AIO Project faces a number of delays. Besides logistical difficulties (frequent changes of project managers, incomplete hiring processes for in-country IWRM experts to assist focal points nationally, and changes of UNEP Task Manager in 2014/2015), there has been a general delay in the implementation process.
c. The proposed implementation review and workplan were developed and discussed informally with the participating countries at an ad hoc meeting at UNEP in Nairobi on 5 June 2015.
d. The project is 70% of the way through its total duration (planned end mid-2016), with less than 20% of the outputs achieved but with 80% of the budget for the implementation of the UNEP components remaining. With timing being the main limiting factor for achieving a turnaround of the project and the successful mainstreaming of IWRM policies and monitoring frameworks into the respective national processes, the meeting recommended that a one-year (no-cost) extension would be the basis on which accelerated implementation would stand a chance of success.
e. The success of the implementation of the UNEP components (2–4) will depend on the support provided to mainstreaming IWRM into the different national processes, on targeted capacity building to enhance the related policy and monitoring frameworks, and on the inter-ministerial coordination necessary to endorse and sustain IWRM strategies and Road Maps. An update of the national diagnostic analyses in the six countries will provide the new baseline, reflecting changes in water-related policy processes over the last year.
f. In line with this, a proposal for accelerating the implementation process over the coming months was made, which includes (i) reinforcement of the Regional PCU with a new Project Coordinator and a communication expert; (ii) accelerating the hiring process for the national IWRM experts to assist the focal points in the implementation of the UNEP components; (iii) hiring of IWRM experts and trainers for sub-regional training workshops; and (iv) hiring of an IWRM monitoring expert to be in charge of the activities under Component 2 (IWRM indicator framework and monitoring, national and regional dimensions).
g. Regarding the sustainability of the project outputs and the effective impact of activities, it was proposed to build on the success of the demonstration activities (Component 1, UNDP) and for UNEP to continue financing the demonstration activities through the extension of the project beyond mid-2016 into 2017. This would facilitate the upscaling of demonstration activities to the national process, create synergies and cross-fertilisation across project components, and ensure replication of activities.
h. The question of the role of the Nairobi and Abidjan Convention secretariats will be revisited in order to ensure optimal integration of the outputs into regional frameworks in the long term (especially in view of the contribution to existing regional monitoring frameworks).

iv. Progress Towards Outcomes Analysis
Review the logframe indicators against the progress made towards the end-of-project targets using the Progress Towards Results Matrix (shown as Table 4 below), rate the progress using the six-point rating scale, and make recommendations for the areas rated marginally unsatisfactory or unsatisfactory (or unlikely to be achieved or highly unlikely to be achieved).
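A minimal illustrative sketch of this tallying step is given below, ahead of the matrix itself. It is not part of the TOR methodology: the six-point scale labels are assumed from common GEF practice (Annex 2 gives the authoritative scale and aggregation rules), and the indicator ratings shown are hypothetical placeholders rather than actual MTR findings.

```python
# Illustrative sketch only: scale labels are assumed (see Annex 2 for the authoritative
# definitions) and the ratings below are hypothetical placeholders, not MTR findings.

SIX_POINT_SCALE = [
    "Highly Satisfactory",
    "Satisfactory",
    "Moderately Satisfactory",
    "Marginally Unsatisfactory",
    "Unsatisfactory",
    "Highly Unsatisfactory",
]

# Per the paragraph above, areas rated marginally unsatisfactory or worse
# require recommendations.
NEEDS_RECOMMENDATIONS = set(SIX_POINT_SCALE[3:])

# Hypothetical example ratings keyed by indicator (placeholders only).
example_ratings = {
    "1.1 m3 of wastewater collected for treatment (Cabo Verde)": "Moderately Satisfactory",
    "2. Indicator Framework adopted into long-term monitoring": "Unsatisfactory",
    "3. National IWRM plans and WUE strategies endorsed": "Marginally Unsatisfactory",
}

for indicator, rating in example_ratings.items():
    if rating in NEEDS_RECOMMENDATIONS:
        print(f"Recommendation required: {indicator} (rated {rating})")
```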
Table 4: Progress Towards Results Matrix (achievement of outcomes against end-of-project targets). For each project strategy (outcome) the matrix records the indicators, the baseline levels, the level reported in the first PIR (self-reported), the mid-term targets and the end-of-project targets; the final columns (midterm level and assessment, achievement rating, and justification for the rating) are to be completed by the MTR.

Outcome 1.1 (Cabo Verde): Protection of groundwater resources, stabilization of coastal terrains and promotion of productive activities in coastal areas through the integrated planning and management of wastewater collection, treatment and reuse, demonstrated in Tarrafal on the island of Santiago.
- Indicator: m3 of wastewater collected for treatment (or number of households connected to the wastewater treatment system). Baseline: limited connection to the sewerage system (typically 41%); system currently operating at 10% capacity. First PIR level: studies on wastewater facility improvement completed, with a draft report and technical design details available by June 2014; contractor identified and on the ground. Mid-term target: design of the sewerage extension system in place. End-of-project target: extended sewerage system covering 100% of target households; system operating at 75% of its capacity.
- Indicator: m3 of treated wastewater used for irrigation; ha of farmland under treated water. Baseline: none; 0 ha under farming using treated wastewater. First PIR level: not initiated; the current focus is to increase the volume of treated wastewater. Mid-term/end-of-project targets: 5 ha under drip irrigation; volume of treated wastewater used to be determined (TBD by December 2013).
- Indicator: number of farmers trained for the micro-irrigation system, with gender-disaggregated data. Baseline: 110 farmers currently cultivating in Colonato. First PIR level: not initiated by June 2014. Mid-term target: 130 farmers using micro-irrigation. End-of-project target: 150 farmers using micro-irrigation.
- Indicator: number of trees planted as a natural barrier against salinization. Baseline: none. First PIR level: a tree nursery established, to be planted during the next rainy season. Mid-term target: sites identified; 50,000 trees planted. End-of-project target: 250,000 trees planted, with at least 70% survival.
- Indicator: number of awareness-raising campaigns conducted. Baseline: none. First PIR level: major awareness campaigns undertaken, with two major events captured by national television; an awareness-raising strategy developed (to be translated and shared with the other countries as a good example); discussions held with different stakeholders, including research institutions and a university, on their involvement. Mid-term target: awareness-raising plan established; at least one major campaign every quarter. End-of-project target: two major awareness-raising campaigns every quarter in year 3; at least 10 major campaigns by the end of the project.

Outcome 1.2 (Comoros): Improved water source protection through IWRM planning and management in Mutsamudu on the island of Anjouan.
- Indicator: water resources assessed at identified monitoring points; water resource monitoring operational. Baseline: none; no water quality or quantity monitoring data collected. First PIR level: TORs to procure the services of a consultant developed and discussed. Mid-term target: regular water resources monitoring programme established. End-of-project target: water resources monitoring programme adopted and operational.
- Indicator: solid waste collection system established and operational; volume/weight of solid waste collected; volume/area of the monitoring point covered in solid waste; number of households serviced by the solid waste collection system. Baseline: no collection system; amount of solid waste in the River Mutsamudu not monitored. First PIR level: extensive discussion on the solid waste collection system with all stakeholders; a day set aside by the municipality and the Governor of Anjouan for solid waste collection in the city of Anjouan; several river-cleaning and clean-up campaigns undertaken with all stakeholders, with the support of the Governor of Anjouan and the Army. Mid-term target: a solid waste collection system in place and functional; 30% reduction in solid waste observed at the monitoring point upstream of the water supply intake. End-of-project target: 50% reduction in solid waste observed at the monitoring point upstream of the water supply intake.
- Indicator: catchment management committee established. Baseline: no committee exists. First PIR level: an interim multi-stakeholder committee is in place and actively involved, pending completion of the water resource assessment. Mid-term target: committee established with clear TORs and operational. End-of-project target: a functional catchment management committee, with clear sustainability arrangements.
- Indicator: watershed management plan. Baseline: no plan exists; no data collection or analysis exists to provide a basis for a management plan. First PIR level: formation of the interim water committee is a major step towards the establishment of the watershed management plan (2014); an initial meeting with farmers was held and follow-up meetings agreed. Mid-term target: consultation with landowners and catchment stakeholders initiated; watershed surveys conducted; survey data available for watershed zone mapping. End-of-project target: consultation with landowners and catchment stakeholders completed; watershed surveys conducted; watershed zone map produced; management plan endorsed by stakeholders.

Outcome 1.3 (Maldives): Protection of the freshwater lens of Thoddoo Island from salinization and agro-chemical pollution, with improved drought-season aquifer yields. First PIR level (reported across the indicators): project design endorsed by Government; vacancy announcement for the Demonstration Project Manager advertised.
- Indicator: groundwater quality baseline established. Baseline: limited baseline water quality data. Target: 50% of wells on farming plots and 50% of household wells monitored.
- Indicator: number of plots receiving water supply from the infiltration gallery system, with metering. Baseline: no water provided from the system; all groundwater extracted from individual wells, with no metering. Target: 50% of agricultural plots on the island irrigated from the gallery system.
- Indicator: reduced groundwater salinity (%) and electrical conductivity. Baseline: currently elevated salinity and electrical conductivity. Target: salinity level below 700 μS/cm, a 30% reduction from the average baseline salinity level.
- Indicator: reduced nitrates and phosphates in groundwater. Baseline: limited data available on groundwater quality. Target: 30% reduction from the baseline data.
- Indicator: groundwater quality monitoring system established. Baseline: non-existent. Target: gender-balanced training given to farmers; 20 shallow boreholes established for groundwater quality monitoring.

Outcome 1.4 (Mauritius): The protection and sustainable utilization of the Northern Aquifer of Mauritius demonstrated through the integrated planning and management of wastewater collection, treatment and reuse.
- Indicator: water quality baseline developed; vulnerability of the aquifer to pollution and extraction assessed. Baseline: limited hydrogeological data on the Northern Aquifer. First PIR level: data collection ongoing by the Water Resources Unit; analysis and vulnerability maps to be completed through a consultancy. Mid-term target: required scientific baseline data agreed on, and plans for data collection in place; hydrogeological data collection protocols endorsed. End-of-project target: scientific baseline reports on hydrogeological data, land use and polluting activities categorized and compiled; vulnerability map produced; protection measures in place at sensitive areas.
- Indicator: increased m3 of treated wastewater re-used as an alternative water resource (for recharge, irrigation, etc.) (co-financed). Baseline: 1,500 m3 per day injected into boreholes. First PIR level: not initiated by June 2014. Target: 2,500 m3 per day used as alternative water resources (irrigation, recharge, etc.).
- Indicator: impact assessment of the effectiveness of groundwater recharge using treated wastewater against saline intrusion. Baseline: effectiveness of the current practice unknown. First PIR level: not initiated by June 2014. Mid-term target: impact assessment of the effectiveness of groundwater recharge using treated wastewater against saline intrusion undertaken and recommendations made. End-of-project target: aquifer effectively protected from saline intrusion using the results of the assessment; salinity monitored.
- Indicator: best practices for water demand management captured and disseminated to a percentage of stakeholder bodies. Baseline: limited awareness among policy makers of the importance of protecting the groundwater and the lagoon. First PIR level: not initiated by June 2014. Mid-term target: a plan for the dissemination of best practices on water demand management developed; 10% of the public receive best-practice guidance on pollution prevention and effective water consumption. End-of-project target: 35% of the public receive best-practice guidance on pollution prevention and effective water consumption; water demand issues for communication to policy makers identified.
- Indicator: number of briefings produced by a percentage of stakeholders on water resource management and climate change/gender empowerment. Baseline: limited awareness about saving water and about polluting activities. Mid-term target: plans and approach for the identification of issues for policy makers established; 50% of the IWRM coordination mechanism aware of climate change issues. End-of-project target: 100% of the IWRM coordination mechanism aware of climate change issues; system for gender-disaggregated data on water management established and gender-disaggregated data on water management becoming available.

Outcome 1.5 (São Tomé & Principe): Integrated river basin management plan for the Rio Provaz Basin developed to enable equitable water resources allocation and protection, contributing to sustainable economic development, public health and environmental protection.
- Indicator: area surveyed and reported. Baseline: no formal inventories of land or water resources or of their use exist. First PIR level: not initiated by June 2014. Mid-term target: basin zoning undertaken and data and information needs agreed on. End-of-project target: 100% of the catchment area surveyed.
- Indicator: water resources level, flow and quality data; groundwater resources potential established. Baseline: no regular data collection or quality assurance. Mid-term target: data needs and gaps agreed on; regular data collection system on water resources level, flow and quality established for both surface water and groundwater. End-of-project target: robust quantity and quality data sets collected; all major risk types assessed for groundwater extraction; groundwater resources potential established.
- Indicator: basin water management committee; Basin Water Resources Allocation and Protection Strategy. Baseline: no stakeholder consultations on water resources management to date; no catchment management committee; no water resources management strategies at basin or national level. First PIR level: stakeholder consultations started; an interim catchment committee has been established; consultancy recruitment is ongoing. Mid-term target: stakeholders for the catchment committee identified and membership agreed on; gender-integrated catchment management committees established. End-of-project target: gender-integrated catchment management committees established and operational; water resources management strategy developed in a participatory manner and endorsed by the basin stakeholders; the process at the demonstration basin informs the national-level IWRM process.
- Indicator: number of households using Ecosan. Baseline: no Ecosan technology currently used. First PIR level: fact-finding for EcoSan and solid waste disposal under way in collaboration with an STP-based international NGO. Mid-term target: strategies for the promotion of Ecosan established. End-of-project target: 80% of constructed Ecosan units still in use at the end of the project (number of units TBD); lessons learned produced, aiming for further promotion.

Outcome 1.6 (Seychelles): Protection of a coastal gravel aquifer through integrated land and water management measures (water demand management, land use, flood management), demonstrated on the island of La Digue.
- Indicator: number of households, business and community buildings with rainwater storage tanks. Baseline: rainwater harvesting practiced only marginally. First PIR level: delayed; wide consultation with all stakeholders on the approach for rainwater harvesting, and awareness campaign started. Mid-term target: strategy for rainwater harvesting established; 40% of targeted buildings using rainwater at domestic and commercial levels. End-of-project target: 100% of targeted buildings using rainwater at domestic and commercial levels.
- Indicator: m3 of re-used effluent. Baseline: limited level of wastewater reuse practiced. First PIR level: planning with two hotels initiated. Mid-term target: landscape irrigation schemes using treated wastewater at two hotels. End-of-project target: landscape irrigation schemes using treated wastewater at five hotels, using 100 m3/day.
- Indicator: % reduction in peak water pressure requirements through the installation of a number of household water storage tanks. Baseline: limited number of households with potable water storage tanks. First PIR level: mandatory installation discussed with the local planning office and the La Digue advisory council. Mid-term target: initial discussions and consultations on mainstreaming the adoption of potable water storage tanks for all new buildings in progress. End-of-project target: mandatory installation of potable water storage tanks for all new buildings adopted by land planning in La Digue; 10% reduction in system peak pressures.
- Indicator: % reduction in leakage in the water supply distribution system. Baseline: no leakage detection and reduction programme; bulk metering only at the water treatment plant. First PIR level: the project is working closely with the Public Utilities Corporation; data have been analysed and mitigation measures agreed on. Mid-term target: a system of leak identification established; 40% of leaks fixed; a number of metres of damaged pipes replaced; 50% reduction in m3 of water lost (numbers TBD during the inception). End-of-project target: district meters installed and monitored; 100% of leaks fixed; new leaks kept to a minimum; all damaged pipes replaced; 70% reduction in m3 of water lost (numbers TBD by December 2014).
- Indicator: surface water salinity in marsh outlets and groundwater salinity; aquifer recharge capacity of the marshland restored. Baseline: inflow of seawater at high-tide conditions; no tidal flaps installed; development pressure has reduced the natural buffering capacity of the marshland. First PIR level: 2 tidal reverse valves installed; the project has been working with a local NGO promoting conservation and restoration of the marshland; site evaluation undertaken and mitigation approach agreed on; in addition, communities and authorities have been sensitised to the negative impacts and mitigation measures. Mid-term target: 2 tidal flaps installed; seawater flows inland into the marshes reduced. End-of-project target: 4 tidal flaps installed; no seawater flows inland into the marshes; reinstatement of the marshland.
- Indicator: volume of waste oils and batteries collected. Baseline: existing collection programmes not effective – no collection. Mid-term target: collection system in place. End-of-project target: 70% of households using the collection system.

Outcome 2: IWRM and WUE indicators, baselines, targets and monitoring protocols discussed, agreed and adopted into long-term monitoring programmes at national and 'regional' levels. Baselines: inadequate monitoring of water resources levels, flows and quality; inadequate land and catchment characterization; poor understanding of water demands and water abstraction rates; negligible water pollution monitoring; poor awareness of IWRM processes. First PIR level: delayed (all areas). Mid-term and end-of-project targets: water resources monitoring plans with long-term tracking indicators established; national IWRM monitoring inventories and capacity assessments; survey, database and assessment of water resources; key stakeholders trained on monitoring tools and monitoring plan implementation; key stakeholders using data for water resources and infrastructure management; development of the Indicator Framework; development of national and regional baseline indicators and targets; testing of the Indicator Framework; IWRM indicator development and monitoring capacity building.

Outcome 3: SIDS employ new plans, policy tools and approaches in implementing IWRM commitments. Baselines: sectoral policies and planning; sectoral legislation and regulation; no integrated resources, infrastructure and governance approaches; minimal inter-sectoral coordination; no inter-agency coordination mechanisms at technical or political levels; weak government and non-government inter-engagement. First PIR level: delayed (all areas). Mid-term and end-of-project targets: national IWRM diagnostic report baselines strengthened; IWRM integrated into sectoral policies, plans, legislation and regulation; national IWRM policies and plans developed; water resources, infrastructure and governance approaches reflected in all IWRM approaches; completion of IWRM legislative reviews and amendments; effective, formalized inter-agency and inter-sectoral coordination and cooperation; national IWRM coordination (apex) bodies established; IWRM partnerships established; effective government and civil society IWRM partnership.

Outcome 4: Strengthened capacity allows stakeholders and institutions in SIDS to fulfil their role in local, national and regional IWRM processes and exchange best practices. Baselines: weak government and non-government inter-engagement; little knowledge of integrated water and land management approaches; minimal access to external IWRM approaches for SIDS; weak networks of local and national stakeholders; little knowledge exchange and transfer from regional SIDS and inter-regional SIDS. First PIR level: delayed (all areas). Mid-term and end-of-project targets: improved awareness among stakeholders of IWRM; IWRM awareness materials produced and campaigns delivered; increased stakeholder IWRM capacity; IWRM training modules and courses delivered; multi-media IWRM knowledge base developed, functional and readily available to IWRM stakeholders; networks of national partners exchanging information and knowledge on IWRM; networking of IWRM partners; international and inter-regional SIDS partnerships sharing IWRM information; inter-regional SIDS coordination and knowledge sharing.

Outcome 5: Project implemented effectively and efficiently to the satisfaction of partners. Baselines: limited in-country capacity to manage multi-sectoral interventions; limited in-country capacity to facilitate multi-stakeholder governance processes. First PIR level: delayed. Mid-term and end-of-project targets: effective management and delivery of demonstration projects; project management training; effective facilitation of the national IWRM governance reform process; stakeholder engagement; effective monitoring of IWRM delivery; project monitoring and evaluation, and reporting.

In addition to the progress towards outcomes analysis:
- Compare and analyse the GEF Tracking Tool at the baseline with the one completed right before the Mid-Term Review.
- Identify the remaining barriers to achieving the project objective in the remainder of the project.
- By reviewing the aspects of the project that have already been successful, identify ways in which the project can further expand these benefits.

6. Project Implementation and Adaptive Management

i) Management Arrangements: Review the overall effectiveness of project management as outlined in the Project Document. Have changes been made, and are they effective? Are responsibilities and reporting lines clear? Is decision-making transparent and undertaken in a timely manner? Recommend areas for improvement. Review the quality of execution by the Executing Agency/Implementing Partner(s) and recommend areas for improvement. Review the quality of support provided by the GEF Partner Agency (UNDP) and recommend areas for improvement.

ii) Work Planning: Review any delays in project start-up and implementation, identify the causes and examine whether they have been resolved. Are work-planning processes results-based? If not, suggest ways to re-orientate work planning to focus on results. Examine the use of the project's results framework/logframe as a management tool and review any changes made to it since project start.

iii) Finance and co-finance: Consider the financial management of the project, with specific reference to the cost-effectiveness of interventions. Review the changes to fund allocations as a result of budget revisions and assess the appropriateness and relevance of such revisions. Does the project have the appropriate financial controls, including reporting and planning, that allow management to make informed decisions regarding the budget and allow for a timely flow of funds? Informed by the co-financing monitoring table to be filled out, provide commentary on co-financing: is co-financing being used strategically to help achieve the objectives of the project? Is the Project Team meeting with all co-financing partners regularly in order to align financing priorities and annual work plans?
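To inform the co-financing commentary described above, the committed amounts from Table 2 can be compared with whatever has materialized to date. The sketch below is illustrative only: the committed figures are those at CEO endorsement, while the "materialized" figures and the helper function are hypothetical placeholders to be replaced with the values reported in the project's co-financing monitoring table.

```python
# Illustrative sketch only: committed co-financing figures are taken from Table 2;
# the "materialized to date" amounts below are hypothetical placeholders, to be
# replaced with the values reported in the project's co-financing monitoring table.

COMMITTED = {            # co-financing committed at CEO endorsement (US$)
    "C1 (UNDP/UNOPS WEC)": 33_872_881,
    "C2 (UNEP/UNOPS EAH)": 450_000,
    "C3 (UNEP/UNOPS EAH)": 530_000,
    "C4 (UNEP/UNOPS EAH)": 483_500,
    "C5 (UNEP & UNDP)": 4_086_154,
}

MATERIALIZED = {         # hypothetical figures for illustration only
    "C1 (UNDP/UNOPS WEC)": 12_000_000,
    "C2 (UNEP/UNOPS EAH)": 100_000,
    "C3 (UNEP/UNOPS EAH)": 150_000,
    "C4 (UNEP/UNOPS EAH)": 90_000,
    "C5 (UNEP & UNDP)": 1_000_000,
}

def materialization_rates(committed, materialized):
    """Return the share of committed co-financing reported as materialized, per component."""
    return {k: materialized.get(k, 0) / v for k, v in committed.items()}

if __name__ == "__main__":
    for component, rate in materialization_rates(COMMITTED, MATERIALIZED).items():
        print(f"{component}: {rate:.0%} of committed co-financing materialized")
    total_rate = sum(MATERIALIZED.values()) / sum(COMMITTED.values())
    print(f"Overall: {total_rate:.0%} of US$ {sum(COMMITTED.values()):,} committed")
```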
iv) Project-level Monitoring and Evaluation Systems: Review the monitoring tools currently being used: Do they provide the necessary information? Do they involve key partners? Are they aligned or mainstreamed with national systems? Do they use existing information? Are they efficient? Are they cost-effective? Are additional tools required? How could they be made more participatory and inclusive? Examine the financial management of the project monitoring and evaluation budget. Are sufficient resources being allocated to monitoring and evaluation? Are these resources being allocated effectively?

v) Stakeholder Engagement: Project management: Has the project developed and leveraged the necessary and appropriate partnerships with direct and tangential stakeholders? Participation and country-driven processes: Do local and national government stakeholders support the objectives of the project? Do they continue to have an active role in project decision-making that supports efficient and effective project implementation? Participation and public awareness: To what extent have stakeholder involvement and public awareness contributed to progress towards achievement of the project objectives?

vi) Reporting: Assess how adaptive management changes have been reported by the project management and shared with the Project Board. Assess how well the Project Team and partners undertake and fulfil GEF reporting requirements (e.g. how have they addressed poorly-rated PIRs, if applicable?). Assess how lessons derived from the adaptive management process have been documented, shared with key partners and internalized by partners.

vii) Communications: Review internal project communication with stakeholders: Is communication regular and effective? Are there key stakeholders left out of communication? Are there feedback mechanisms when communication is received? Does this communication with stakeholders contribute to their awareness of project outcomes and activities and to their investment in the sustainability of project results? Review external project communication: Are proper means of communication established, or being established, to communicate the project's progress and intended impact to the public (is there a web presence, for example? Did the project implement appropriate outreach and public awareness campaigns?)?

For reporting purposes, write one half-page paragraph that summarizes the project's progress towards results in terms of its contribution to sustainable development benefits, as well as global environmental benefits.

7. Sustainability

Sustainability is understood as the probability of continued long-term project-derived results and impacts after the external project funding and assistance ends. The evaluation will identify and assess the key conditions or factors that are likely to undermine or contribute to the persistence of benefits. Some of these factors might be direct results of the project, while others will be contextual circumstances or developments that are not under the control of the project but that may condition the sustainability of benefits. The evaluation should ascertain to what extent follow-up work has been initiated and how project results will be sustained and enhanced over time. The reconstructed ToC will assist in the evaluation of sustainability, as the drivers and assumptions required to achieve higher-level results are often similar to the factors affecting the sustainability of these changes.
The MTR will validate whether the risks identified in the Project Document, Annual Project Review/PIRs and the ATLAS Risk Management Module are the most important, and whether the risk ratings applied are appropriate and up to date. If not, explain why. In addition, assess the following risks to sustainability:

i. Financial risks to sustainability: What is the likelihood of financial and economic resources not being available once the GEF assistance ends? (Consider that potential resources can come from multiple sources, such as the public and private sectors, income-generating activities, and other funding that could provide adequate financial resources for sustaining the project's outcomes.)

ii. Socio-economic risks to sustainability: Are there any social or political risks that may jeopardize sustainability of project outcomes? What is the risk that the level of stakeholder ownership (including ownership by governments and other key stakeholders) will be insufficient to allow for the project outcomes/benefits to be sustained? Do the various key stakeholders see that it is in their interest that the project benefits continue to flow? Is there sufficient public/stakeholder awareness in support of the long-term objectives of the project? Are lessons learned being documented by the Project Team on a continual basis and shared with/transferred to appropriate parties who could learn from the project and potentially replicate and/or scale it up in the future?

iii. Institutional framework and governance risks to sustainability: Do the legal frameworks, policies, governance structures and processes pose risks that may jeopardize sustenance of project benefits? While assessing this parameter, also consider whether the required systems/mechanisms for accountability, transparency and technical knowledge transfer are in place.

iv. Environmental risks to sustainability: Are there any environmental risks that may jeopardize sustenance of project outcomes?

8. Catalytic Role and Replication

The MTR will assess the catalytic effect already played by the project and its methodology. The catalytic role of interventions is embodied in their approach of supporting the creation of an enabling environment and of investing in pilot activities which are innovative and show how new approaches can work. UNEP also aims to support activities that upscale new approaches to a national, regional or global level, with a view to achieving sustainable global environmental benefits. The evaluation will assess the catalytic role played by this project, namely to what extent the project has:
(a) catalyzed behavioural changes in terms of use and application, by the relevant stakeholders, of capacities developed;
(b) provided incentives (social, economic, market-based, competencies etc.) to contribute to catalyzing changes in stakeholder behaviour;
(c) contributed to institutional changes, for instance institutional uptake of project-demonstrated technologies, practices or management approaches;
(d) contributed to policy changes (on paper and in implementation of policy);
(e) contributed to sustained follow-on financing (catalytic financing) from Governments, the private sector, donors etc.;
(f) created opportunities for particular individuals or institutions ("champions") to catalyze change (without which the project would not have achieved all of its results).
Replication is defined as lessons and experiences coming out of the project that are replicated (experiences are repeated and lessons applied in different geographic areas) or scaled up (experiences are repeated and lessons applied in the same geographic area but on a much larger scale and funded by other sources). The evaluation will assess the approach adopted by the project to promote replication effects and determine to what extent actual replication has already occurred, or is likely to occur in the near future. What are the factors that may influence replication and scaling up of project experiences and lessons?

9. Efficiency

i) The evaluation will assess the cost-effectiveness and timeliness of project execution. It will describe any cost- or time-saving measures put in place in attempting to bring the project as far as possible in achieving its results within its (severely constrained) secured budget and (extended) time. It will also analyse how delays, if any, have affected project execution, costs and effectiveness. Wherever possible, the costs and time-over-results ratios of the project will be compared with those of other similar interventions.

ii) The evaluation will give special attention to efforts by the project teams to make use of/build upon pre-existing institutions, agreements and partnerships, data sources, synergies and complementarities with other initiatives, programmes and projects etc. to increase project efficiency. For instance, [insert relevant examples for the project being evaluated].

10. Factors and Processes Affecting Project Performance

I. Preparation and readiness. This criterion focuses on the quality of project design and preparation. Were project stakeholders4 adequately identified and were they sufficiently involved in project development and ground-truthing, e.g. of the proposed timeframe and budget? Were the project's objectives and components clear, practicable and feasible within its timeframe? Were potentially negative environmental, economic and social impacts of the project identified? Were the capacities of executing agencies properly considered when the project was designed? Was the project document clear and realistic enough to enable effective and efficient implementation? Were the partnership arrangements properly identified and the roles and responsibilities negotiated prior to project implementation? Were counterpart resources (funding, staff, and facilities) and enabling legislation assured? Were adequate project management arrangements in place? Were lessons from other relevant projects properly incorporated in the project design? What factors influenced the quality-at-entry of the project design, choice of partners, allocation of financial resources etc.? Were any design weaknesses mentioned in the Project Review Committee minutes at the time of project approval adequately addressed?

4 Stakeholders are the individuals, groups, institutions, or other bodies that have an interest or 'stake' in the outcome of the project. The term also applies to those potentially adversely affected by the project.

II. Project implementation and management. This includes an analysis of the implementation approaches used by the project, its management framework, the project's adaptation to changing conditions and responses to changing risks including safeguard issues (adaptive management), the performance of the implementation arrangements and partnerships, the relevance of changes in project design, and the overall performance of project management. The evaluation will:
(a) Ascertain to what extent the project implementation mechanisms outlined in the project document have been followed and were effective in delivering project milestones, outputs and outcomes.
Were pertinent adaptations made to the approaches originally proposed?
(b) Evaluate the effectiveness and efficiency of project management and how well the management was able to adapt to changes during the life of the project.
(c) Assess the role and performance of the teams and working groups established and the project execution arrangements at all levels.
(d) Assess the extent to which project management responded to direction and guidance provided by the UNEP Task Manager, the UNDP Regional Technical Adviser and project steering bodies, including the national and regional steering committees.
(e) Identify operational and political/institutional problems and constraints that influenced the effective implementation of the project, and how the project tried to overcome these problems.

III. Stakeholder participation, cooperation and partnerships. The evaluation will assess the effectiveness of mechanisms for information sharing and cooperation with other UNEP and/or UNDP projects and programmes, external stakeholders and partners. The term stakeholder should be considered in the broadest sense, encompassing both project partners and target users of project products. The Theory of Change (ToC) and stakeholder analysis should assist the evaluator in identifying the key stakeholders and their respective roles, capabilities and motivations in each step of the causal pathways from activities to achievement of outputs, outcomes and intermediate states towards impact. The assessment will look at three related and often overlapping processes: (1) information dissemination to and between stakeholders, (2) consultation with and between stakeholders, and (3) active engagement of stakeholders in project decision-making and activities. The evaluation will specifically assess:
(a) the approach(es) and mechanisms used to identify and engage stakeholders (within and outside UNEP/UNDP) in project design and at critical stages of project implementation. What were the strengths and weaknesses of these approaches with respect to the project's objectives and the stakeholders' motivations and capacities?
(b) How was the overall collaboration between the different functional units of UNEP and/or UNDP involved in the project? What coordination mechanisms were in place? Were the incentives for internal collaboration in UNEP/UNDP adequate?
(c) Was the level of involvement of the Regional, Liaison and Out-posted Offices in project design, planning, decision-making and implementation of activities appropriate?
(d) Has the project made full use of opportunities for collaboration with other projects and programmes, including opportunities not mentioned in the Project Document? Have complementarities been sought, synergies been optimized and duplications avoided?
(e) What was the achieved degree and effectiveness of collaboration and interactions between the various project partners and stakeholders during design and implementation of the project? This should be disaggregated for the main stakeholder groups identified in the inception report.
(f) To what extent has the project been able to take up opportunities for joint activities, pooling of resources and mutual learning with other organizations and networks?
In particular, how useful are partnership mechanisms and initiatives in building stronger coherence and collaboration between participating organisations?
(g) How did the relationship between the project and the collaborating partners (institutions and individual experts) develop? Which benefits stemmed from their involvement for project performance, for UNEP and for the stakeholders and partners themselves? Do the results of the project (strategic programmes and plans, monitoring and management systems, sub-regional agreements etc.) promote participation of stakeholders, including users, in environmental decision-making?

IV. Communication and public awareness. The evaluation will assess the effectiveness of any public awareness activities that were undertaken during the course of implementation of the project to communicate the project's objective, progress, outcomes and lessons. This should be disaggregated for the main stakeholder groups identified in the inception report. Did the project identify and make use of existing communication channels and networks used by key stakeholders? Did the project provide feedback channels?

V. Country ownership and driven-ness. The evaluation will assess the degree and effectiveness of involvement of government/public sector agencies in the project, in particular those involved in project execution and those participating in project steering committees (regional and/or national levels):
(a) To what extent have Governments assumed responsibility for the project and provided adequate support to project execution, including the degree of cooperation received from the various public institutions involved in the project?
(b) How and how well did the project stimulate country ownership of project outputs and outcomes?
(c) [Any other project-specific questions]

VI. Financial planning and management. Evaluation of financial planning requires an assessment of the quality and effectiveness of financial planning and control of financial resources throughout the project's lifetime. The assessment will look at actual project costs by activity compared to budget (variances), financial management (including disbursement issues), and co-financing (see Annex 3). The evaluation will:
(a) Verify the application of proper standards (clarity, transparency, audit etc.) and the timeliness of financial planning, management and reporting, to ensure that sufficient and timely financial resources were available to the project and its partners;
(b) Assess other administrative processes, such as recruitment of staff, procurement of goods and services (including consultants), and preparation and negotiation of cooperation agreements, to the extent that these might have influenced project performance;
(c) Present the extent to which co-financing has materialized as expected at project approval (see Table 1). Report country co-financing to the project overall, and to support project activities at the national level in particular;
(d) Provide a breakdown of final actual costs and co-financing for the different project components (see tables in Annex 3). Describe the resources the project has leveraged since inception and indicate how these resources are contributing to the project's ultimate objective. Leveraged resources are additional resources (beyond those committed to the project itself at the time of approval) that are mobilized later as a direct result of the project.
Leveraged resources can be financial or in-kind and they may be from other donors, NGOs, foundations, governments, communities or the private sector.

VII. Analyse the effects on project performance of any irregularities in procurement, use of financial resources and human resource management, and the measures taken by the implementing and executing agencies to prevent such irregularities in the future. Determine whether the measures taken were adequate.

VIII. Supervision, guidance and technical backstopping. The purpose of supervision is to verify the quality and timeliness of project execution in terms of finances, administration and achievement of outputs and outcomes, in order to identify and recommend ways to deal with problems which arise during project execution. Such problems may be related to project management but may also involve technical/institutional substantive issues in which UNEP has a major contribution to make.

IX. The evaluator should assess the effectiveness of supervision, guidance and technical support provided by the different supervising/supporting bodies, including:
(a) the adequacy of project supervision plans, inputs and processes;
(b) the realism and candour of project reporting and the emphasis given to outcome monitoring (results-based project management);
(c) how well the different guidance and backstopping bodies played their role and how well the guidance and backstopping mechanisms worked. What were the strengths in guidance and backstopping and what were the limiting factors?

X. Monitoring and evaluation. The evaluation will include an assessment of the quality, application and effectiveness of project monitoring and evaluation plans and tools, including an assessment of risk management based on the assumptions and risks identified in the project document. The evaluation will assess how information generated by the M&E system during project implementation was used to adapt and improve project execution, achievement of outcomes and sustainability. M&E is assessed on three levels:

(a) M&E Design. The evaluator should use the following questions to help assess the M&E design aspects:
- Arrangements for monitoring: Did the project have a sound M&E plan to monitor results and track progress towards achieving project objectives? Have the responsibilities for M&E activities been clearly defined? Were the data sources and data collection instruments appropriate? Was the time frame for the various M&E activities specified? Was the frequency of the various monitoring activities specified and adequate? How well was the project logical framework (original and possible updates) designed as a planning and monitoring instrument?
- SMART-ness of indicators: Are there specific indicators in the logframe for each of the project objectives? Are the indicators measurable, attainable (realistic) and relevant to the objectives? Are the indicators time-bound?
- Adequacy of baseline information: To what extent has baseline information on performance indicators been collected and presented in a clear manner? Was the methodology for the baseline data collection explicit and reliable? For instance, was there adequate baseline information on pre-existing accessible information on global and regional environmental status and trends, and on the costs and benefits of different policy options for the different target audiences? Was there sufficient information about the assessment capacity of collaborating institutions and experts etc.
to determine their training and technical support needs? To what extent did the project engage key stakeholders in the design and implementation of monitoring? Which stakeholders (from the groups identified in the inception report) were involved? If any stakeholders were excluded, what was the reason for this? Did the project appropriately plan to monitor risks associated with Environmental, Economic and Social Safeguards?
- Arrangements for evaluation: Have specific targets been specified for project outputs? Has the desired level of achievement been specified for all indicators of objectives and outcomes? Were there adequate provisions in the legal instruments binding project partners to fully collaborate in evaluations?
- Budgeting and funding for M&E activities: Determine whether support for M&E was budgeted adequately and was funded in a timely fashion during implementation.

(b) M&E Plan Implementation. The evaluation will verify that:
- the M&E system was operational and facilitated timely tracking of results and progress towards project objectives throughout the project implementation period;
- PIR reports were prepared (the realism of the Task Manager's assessments will be reviewed);
- half-yearly (and, for the UNDP-implemented component, quarterly) Progress and Financial Reports were complete and accurate;
- risk monitoring (including safeguard issues) was regularly documented;
- information provided by the M&E system was used during the project to improve project performance and to adapt to changing needs.

11. Evaluation Consultant Role and Specific Tasks

For this evaluation, one international consultant will be hired by UNOPS to carry out the evaluation in line with UNEP and UNDP guidelines as set out in these terms of reference. By signing the Individual Contractor Agreement (ICA) with UNOPS, the consultant certifies that he/she has not been associated with the design and implementation of the project in any way which may jeopardize their independence and impartiality towards project achievements and project partner performance. In addition, the consultant will not have any future interests (within six months after completion of the contract) with the project's executing or implementing units. The consultant will be responsible for the overall management of the evaluation, in close consultation with UNEP and UNDP, and will ensure timely delivery of its outputs as set out in these TORs. More specifically, the consultant will:
- Conduct a preliminary desk review and introductory interviews with project staff;
- Draft the reconstructed Theory of Change of the project;
- Prepare the evaluation framework;
- Develop the desk review and interview protocols;
- Draft the survey protocols (partner survey and user survey);
- In consultation with the project coordination team, plan and develop a review schedule;
- Travel to and visit the participating countries to undertake interviews and data collection; travel may include meetings with the national and/or regional steering committees;
- Prepare and submit the deliverables set out in these terms of reference, including the inception report, the draft MTR report and the final MTR report.
The consultant will ensure timely data collection supported by the PCU, analyse the data, prepare the main report for the evaluation, and ensure that all evaluation criteria and questions are adequately covered.
Evaluation Consultant Qualifications and Selection Criteria

The consultant should have extensive evaluation experience, including of large, regional or global programmes and using a Theory of Change approach, and a broad understanding of large-scale, consultative assessment processes and the factors influencing the use of assessments and/or scientific research for decision-making. The required qualifications, experience and competencies include:
- Advanced university degree in water resources management, international development, environmental sciences, monitoring and evaluation, or other relevant fields;
- Demonstrated international consulting experience and extensive evaluation experience, including of large, regional or global programmes and using a Theory of Change approach;
- Broad understanding of Integrated Water Resources Management, its theory, application and implementation processes;
- Excellent understanding of, or experience in, the role of projects that aim to catalyse policy and institutional reforms;
- Knowledge of the UN system, and specifically of the strategic priorities and portfolios of UNEP, UNDP and the GEF;
- Past experience in evaluating the UNDP, UNEP or GEF IW portfolio is considered an advantage;
- Excellent writing skills in English;
- Proficiency in French and/or Portuguese is considered an advantage;
- Attention to detail and respect for deadlines;
- Respect for cultural diversity and appreciation of different political, policy and institutional settings;
- Minimum 15 years of professional experience; an additional 5 years of professional experience in the relevant field can substitute for an advanced university degree.

The fee of the evaluator will be agreed on a deliverable basis and paid upon acceptance of the expected key deliverables by UNEP and UNDP.

Evaluation Deliverables: The following are the key MTR deliverables:
a. Inception report
b. Draft mid-term review report
c. Final main report incorporating comments received from evaluation stakeholders as appropriate, including a "response to comments" annex
d. 2-page bulletin summarising project findings (see template in Annex 2)

The evaluator's attention is drawn to the following important notes on the deliverables.

Inception Report: The evaluator will prepare an inception report containing a thorough review of the project context, project design quality, a draft reconstructed Theory of Change of the project, the evaluation framework and a tentative evaluation schedule. It is expected that a large portion of the desk review will be conducted during the inception phase. It will be important to acquire a good understanding of the project context, design and process at this stage. The review of design quality will cover the following aspects (see Annex 5 for the detailed project design assessment matrix):
1. Strategic relevance of the project;
2. Preparation and readiness;
3. Financial planning;
4. M&E design;
5. Complementarity with UNEP and UNDP strategies and programmes;
6. Sustainability considerations and measures planned to promote replication and up-scaling.

The inception report will also include a stakeholder analysis identifying key stakeholders, networks and channels of communication. This information should be gathered from the Project Document and discussions with the project team. The evaluation framework will present in further detail the overall evaluation approach. It will specify, for each evaluation question under the various criteria, what the respective indicators and data sources will be.
The evaluation framework should summarize the information available from project documentation against each of the main evaluation parameters. Any gaps in information should be identified and methods for additional data collection, verification and analysis should be specified. Evaluations/reviews of other large assessments can provide ideas about the most appropriate evaluation methods to be used.

Effective communication strategies help stakeholders understand the results and use the information for organisational learning and improvement. While the evaluation is expected to result in a comprehensive document, content is not always best shared in a long and detailed report; it is often best presented in a synthesised form using any of a variety of creative and innovative methods. The evaluator is encouraged to make use of multimedia formats in the gathering of information, e.g. videos, photos and sound recordings. Together with the full report, the evaluator will be expected to produce a 2-page summary of key findings and lessons.

The inception report will also present a tentative schedule for the overall evaluation process, including a draft programme for the country visits and a tentative list of people/institutions to be interviewed. The PCU may assist in developing a proposed travel schedule that is as effective as possible in terms of time and costs. The proposed travel schedule must be approved by UNEP and UNDP before a field mission begins. All financial matters related to field trips (tickets, DSA and terminal expenses) will be handled through the UNOPS EAH office in Kenya. The inception report will be submitted to the respective Task Managers (TMs) in UNDP and UNEP for review and approval before any further data collection and analysis is undertaken.

The Mid-term Review Report: The Mid-term Review Report should be brief (no longer than 40 pages, excluding the executive summary and annexes), to the point and written in plain English. The report should follow the annotated Table of Contents outlined in Annex 1. It must explain the purpose of the evaluation, exactly what was evaluated and the methods used (with their limitations). The report will present evidence-based and balanced findings, consequent conclusions, lessons and recommendations, which will be cross-referenced to each other. The report should be presented in a way that makes the information accessible and comprehensible. Any dissident views in response to evaluation findings will be appended in a footnote or annex as appropriate. To avoid repetition in the report, the author will use numbered paragraphs and make cross-references where possible.

Submission of the draft MTR report: The evaluator will submit the draft report for review by various stakeholders, including UNEP, UNDP, UNOPS and beneficiaries and stakeholders in the participating countries. It is envisaged that a Regional Project Steering Committee meeting will be organised from 25-28 October 2015 in Maldives, where the evaluator will have a chance to present the draft Mid-Term Review Report and preliminary findings to the Steering Committee members face-to-face. Steering Committee members can provide feedback on any errors of fact, and may highlight the significance of such errors for any conclusions, either during the meeting, through a separate bilateral meeting with the evaluator, or through the submission of written comments before and after the presentation. It is also very important that stakeholders provide feedback on the proposed recommendations and lessons.
Comments would be expected no later than one week after the presentation of the draft MTR report by the evaluator at the RPSC meeting. Any comments or responses to the draft report will be sent to the evaluator for consideration in preparing the final draft report, along with the evaluator's own views.

Submission of the final MTR report: The final report shall be submitted by email to the respective Task Managers in UNDP and UNEP and shared with the appropriate units. The final evaluation report will be made available through the GEF, UNDP and UNEP Evaluation Office websites (e.g. www.unep.org/eou). The evaluator will submit the final report no later than one week after receipt of stakeholder comments. The evaluator will prepare a response to comments, listing those comments not or only partially accepted by them that could therefore not or only partially be accommodated in the final report, and will explain why those comments have not or have only partially been accepted, providing evidence as required. This response to comments will be shared with the interested stakeholders to ensure full transparency.

At the end of the evaluation process, UNEP and UNDP, as the implementing agencies, will prepare a Recommendations Implementation Plan (or Management Response) in the format of a table, to be completed and updated at regular intervals by the agency focal points for the project (the UNEP Task Manager and the UNDP Regional Technical Adviser). UNEP and UNDP will jointly be responsible for the implementation of the recommendations.

Schedule of Payment:
- Signature of contract: travel expenses/DSA
- Inception report: 20% of fees
- Submission and approval of the draft evaluation report: 30% of fees
- Submission and approval of the final evaluation report: 50% of fees

12. Schedule of the Evaluation

The table below presents the tentative schedule for the evaluation:
- 29 July 2015: Application closes
- 7 August 2015: Select MTR consultant
- 15 August 2015: Contract issued and the MTR exercise starts
- 20 August 2015: Preparation and submission of the MTR Inception Report, including a detailed work plan indicating all mission schedules, for approval
- 1 – 30 September 2015: Document review; MTR mission: stakeholder meetings, interviews, field visits
- By 5 October 2015: Mission wrap-up meeting and presentation of initial findings (earliest end of MTR mission)
- By 9 October 2015: Submission of the draft report (in English) for circulation for comments
- 12 – 23 October 2015: Translation of the draft report into French and Portuguese for circulation for comments (arranged by the PCU); review of the first draft by stakeholders and submission of written comments directly to the evaluator, with copy to the PCU
- 25 – 28 October 2015: Presentation of the draft report by the consultant at the Regional Project Steering Committee meeting
- By 6 November 2015: Deadline for the submission of comments on the draft report by stakeholders to the evaluator (in English, French or Portuguese)
- By 13 November 2015: Translation of comments, as necessary (translator arranged by the PCU; the evaluator to deal directly with the translator)
- By 20 November 2015: Submission of the final report (in English), incorporating comments, by the evaluator to UNEP and UNDP
- By 4 December 2015: Preparation and issue of the Management Response by UNEP and UNDP; submission of the final MTR report and the Management Response to the GEF (by UNEP)
- By 9 December 2015: Final MTR report and Management Response posted online (by UNEP and UNDP); expected date of full MTR completion
Preparation & Issue of Management Response by UNEP and UNDP Submission of the final MTR report and the Management Response to GEF (by UNEP) Final MTR report and Management Response posted online (by UNEP and UNDP) Expected date of full MTR completion 13. Logistical Arrangements This Mid-term Evaluation will be undertaken by one independent evaluation consultant to be contracted by UNOPS. The consultant will work under the overall guidance provided by the UNEP Task Manager and the UNDP Regional Technical Advisor. The project team will, where possible, provide logistical support (introductions, meetings etc.) allowing the consultants to conduct the evaluation as efficiently and independently as possible. It is, however, the consultant’s individual responsibility to arrange for their travel, visa, obtain documentary evidence, plan meetings with stakeholders, organize online surveys, and any other logistical matters related to the assignment. Annex 1. Annotated Table of Contents of the main evaluation deliverables Annex 2. Evaluation Ratings Annex 3. Project costs and co-financing tables Annex 4. Documentation list for the evaluation to be provided by the UNEP, UNDP and Project Team Annex 5. Template for the assessment of the quality of project design Annex 6. Introduction to Theory of Change / Impact pathways, the ROtI Method and the ROtI Results Score sheet (old version – A new version is under development) Annex 7. Stakeholder Analysis for the Evaluation Inception Report. Annex 1. Annotated Table of Contents of the main evaluation deliverables INCEPTION REPORT Section Notes Data Sources Max. number of pages 1 1. Introduction Brief introduction to the project and evaluation. 2. Project background Summarise the project context and rationale. How has the context of the project changed since project design? Background information on context 3 3.Stakeholder analysis See notes in Annex 7 1 4. Review of project design Summary of project design strengths and weaknesses. Complete the Template for assessment of the quality of project design (Annex 5 of the Terms of Reference). Project document Project preparation phase. TM/PM Project document and revisions 5. Reconstructed Theory of Change The Theory of Change should be reconstructed, based on project documentation. It should be presented with one or more diagrams and explained with a narrative (see Annex 6). The evaluation framework will contain: Detailed evaluation questions (including new questions raised by review of project design and ToC analysis) and indicators Data Sources It will be presented as a matrix, showing questions, indicators and data sources. Description of the approach and methods that the consultant will use to promote reflection and learning through the evaluation process. Project document narrative, logical framework and budget tables. Other project related documents. Review of all project documents. 2 pages of narrative + diagram(s) Review of project documents, stakeholder analysis, discussions with the Evaluation Manager, Task Manager and Project Coordinator Discussion with project team on logistics. 1 6. Evaluation framework 7. Learning, Communication and outreach 8. Evaluation schedule - - Updated timeline for the overall evaluation (dates of travel and key evaluation milestones, based on the indicative schedule included in the TOR) Proposed schedule for field visits, accompanied by a proposed itinerary and cost estimates 2 + completed matrix provided in annex of the inception report 5 2 9. 
Annexes:
A. Completed matrix of the overall quality of project design
B. List of individuals and documents consulted for the inception report
C. List of documents and individuals to be consulted during the main evaluation phase

MAIN REPORT

Project Identification Table: an updated version of Table 1 (page 1) of these TORs.

Executive Summary: Overview of the main findings, conclusions and recommendations of the evaluation. It should encapsulate the essence of the information contained in the report to facilitate dissemination and distillation of lessons. The main points for each evaluation parameter should be presented here (with a summary ratings table), as well as the most important lessons and recommendations. Maximum 4 pages.

I. Introduction: A very brief introduction, mentioning the name of the evaluation and project, project duration, cost, implementing partners and the objectives of the evaluation; and the objectives, approach and limitations of the evaluation.

II. The Project
A. Context: Overview of the broader institutional and country context in relation to the project's objectives, including changes during project implementation. Factors to address include: the complexity of the project implementation arrangements (number of partners/components, geographical scope, ambitiousness of objective); the proportion of the Project Manager's time/workplan available to the project; the ease or difficulty of the project's external operating environment (climate, infrastructure, political/economic stability, socio-cultural factors); and the perceived capacity/expertise of executing partners.
B. Objectives and components
C. Target areas/groups
D. Milestones/key dates in project design and implementation
E. Implementation arrangements
F. Project financing: estimated costs and funding sources, and FMOs
G. Project partners
H. Changes in design during implementation
I. Reconstructed Theory of Change of the project

III. Evaluation Findings
This chapter is organized according to the evaluation criteria presented in section II.4 of the TORs and provides factual evidence relevant to the questions asked, together with sound analysis and interpretation of such evidence. This is the main substantive section of the report. Ratings are provided at the end of the assessment of each evaluation criterion.
A. Strategic relevance
B. Achievement of outputs
C. Effectiveness: attainment of project objectives and results
   i. Direct outcomes from the reconstructed TOC
   ii. Likelihood of impact using ROtI and based on the reconstructed TOC
   iii. Achievement of the project goal and planned objectives
D. Sustainability and replication
E. Efficiency
F. Factors affecting performance

IV. Conclusions and Recommendations
A. Conclusions: This section should summarize the main conclusions of the evaluation, told in a logical sequence from cause to effect. It is suggested to start with the positive achievements and a short explanation of why these could be achieved, and then to present the less successful aspects of the project with a short explanation why. The conclusions section should end with the overall assessment of the project. Avoid presenting an "executive summary"-style conclusions section. Conclusions should be cross-referenced to the main text of the report (using the paragraph numbering). The overall ratings table should be inserted here (see Annex 2).
B. Lessons Learned: Lessons learned should be anchored in the conclusions of the evaluation. In fact, no lessons should appear which are not based upon an explicit finding of the evaluation.
Lessons learned are rooted in real project experiences, i.e. based on good practices and successes which could be replicated, or derived from problems encountered and mistakes made which should be avoided in the future. Lessons learned must have the potential for wider application and use. Lessons should briefly describe the context from which they are derived and specify the contexts in which they may be useful.

C. Recommendations: As for the lessons learned, all recommendations should be anchored in the conclusions of the report, with proper cross-referencing. Recommendations are actionable proposals on how to resolve concrete problems affecting the project or the sustainability of its results. They should be feasible to implement within the timeframe and resources available (including local capacities), specific in terms of who would do what and when, and set a measurable performance target. In some cases, it might be useful to propose options and briefly analyse the pros and cons of each option. It is suggested, for each recommendation, to first briefly summarize the finding it is based upon, with a cross-reference to the section in the main report where the finding is elaborated in more detail. The recommendation is then stated after this summary of the finding. Recommendations should be SMART: Specific, Measurable, Achievable, Result-oriented and Time-bound.

Annexes: These may include additional material deemed relevant by the evaluator but must include:
1. Response to stakeholder comments received but not (fully) accepted by the evaluator
2. Evaluation TORs (without annexes)
3. Evaluation programme, containing the names of locations visited and the names (or functions) and contacts (email) of people met
4. Bibliography
5. Summary co-finance information and a statement of project expenditure by activity (see Annex 3 of these TORs)
6. Evaluation findings and lessons: a short and simple presentation of evaluation findings and lessons ensures that the information is easily accessible to a wide range of audiences (use the 2-page template provided in Annex 2)
7. Any other communication and outreach tools used to disseminate results (e.g. PowerPoint presentations, charts, graphs, videos, case studies, etc.)
8. Brief CV of the consultant

Important note on report formatting and layout: Reports should be submitted in Microsoft Word ".doc" or ".docx" format. Use of styles (headings etc.), page numbering and numbered paragraphs is compulsory from the very first draft report submitted. The consultant should make sure to gather media evidence, especially photographs, during the assignment and insert a sample in the final report in the appropriate sections. All media collected during the assignment shall become the property of UNEP and UNDP, which shall ensure that the authors are recognised as copyright owners. The consultant grants permission to UNEP and UNDP to reproduce the photographs in any size or quantity for use in official publications. The consultant shall seek permission before taking any photographs in which persons are recognisable and shall inform them that the photographs may be used in UNEP and UNDP official publications. Examples of UNEP mid-term evaluation reports are available at www.unep.org/eou.

Annex 2. Evaluation Ratings

The evaluation will provide individual ratings for the evaluation criteria.
Most criteria will be rated on a six-point scale as follows: Highly Satisfactory (HS); Satisfactory (S); Moderately Satisfactory (MS); Moderately Unsatisfactory (MU); Unsatisfactory (U); Highly Unsatisfactory (HU). Sustainability is rated from Highly Likely (HL) down to Highly Unlikely (HU). In the conclusions section of the report, the ratings will be presented together in a table, with a brief justification cross-referenced to the findings in the main body of the report. The ratings table covers the following criteria, each with a summary assessment and a rating (HS to HU, or HL to HU for the sustainability sub-criteria):

A. Strategic relevance
B. Achievement of outputs
C. Effectiveness: attainment of project objectives and results
   1. Achievement of direct outcomes
   2. Likelihood of impact
   3. Achievement of project goal and planned objectives
D. Sustainability and replication
   1. Financial
   2. Socio-political
   3. Institutional framework
   4. Environmental
   5. Catalytic role and replication
E. Efficiency
F. Factors affecting project performance
   1. Preparation and readiness
   2. Project implementation and management
   3. Stakeholder participation and public awareness
   4. Country ownership and driven-ness
   5. Financial planning and management
   6. UNEP supervision and backstopping
   7. Monitoring and evaluation (a. M&E design; b. budgeting and funding for M&E activities; c. M&E plan implementation)
Overall project rating

Rating for effectiveness: attainment of project objectives and results. An aggregated rating will be provided for the achievement of direct outcomes as determined in the reconstructed Theory of Change of the project, the likelihood of impact, and the achievement of the formal project goal and objectives. This aggregated rating is not a simple average of the separate ratings given to the evaluation sub-criteria, but an overall judgement of project effectiveness by the consultant.

Ratings on sustainability. All the dimensions of sustainability are deemed critical. Therefore, the overall rating for sustainability will be the lowest rating on the separate dimensions.

Ratings on financial planning and management. An aggregated rating will be provided based on an average of the various component ratings listed in the table below. Please include this table as an annex in the main report. The components to be rated (HS to HU), each with supporting evidence/comments, are:
- Attention paid to compliance with procurement rules and regulations
- Contact/communication between the PM and FMO
- PM and FMO knowledge of the project financials
- FMO responsiveness to financial requests
- PM and FMO responsiveness to addressing and resolving financial issues
- Were the following documents provided to the evaluator (Y/N): A. an up-to-date co-financing table; B. a summary report on the project's financial management and expenditures during the life of the project to date; C. a summary of financial revisions made to the project and their purpose; D. copies of any completed audits
- Availability of project financial reports and audits
- Timeliness of project financial reports and audits
- Quality of project financial reports and audits
- FMO knowledge of partner financial requirements and procedures
- Overall rating

Ratings of monitoring and evaluation. The M&E system will be rated on M&E design, M&E plan implementation, and budgeting and funding for M&E activities (the latter sub-criterion is covered in the main report under M&E design). M&E plan implementation will be considered critical for the overall assessment of the M&E system; thus, the overall rating for M&E will not be higher than the rating on M&E plan implementation.

Overall project rating. The overall project rating should consider parameters A-E as being the most important, with C and D in particular being very important.
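The aggregation rules above can be illustrated with a short sketch. The snippet below is a minimal, illustrative example only and is not part of these TORs or of the required deliverables: it assumes a simple numeric encoding of the six-point scale, and all component ratings, names and helper functions shown (SCALE, to_score, to_rating) are hypothetical. In practice, sustainability sub-criteria are rated on the likelihood scale (HL to HU) and the final ratings remain a matter of evaluator judgement.

```python
# Minimal sketch (not part of the TOR): illustrates the rating aggregation
# rules described above, using a hypothetical numeric encoding of the
# six-point scale. All example ratings below are invented for illustration.

SCALE = ["HU", "U", "MU", "MS", "S", "HS"]  # worst (0) to best (5)

def to_score(rating: str) -> int:
    return SCALE.index(rating)

def to_rating(score: float) -> str:
    return SCALE[round(score)]

# Sustainability: all dimensions are critical, so the overall rating is the
# lowest (worst) rating among the separate dimensions. (The TOR actually uses
# the likelihood scale HL-HU for these; one generic scale is used here.)
sustainability = {"Financial": "MS", "Socio-political": "S",
                  "Institutional framework": "MU", "Environmental": "S"}
sustainability_overall = min(sustainability.values(), key=to_score)

# Financial planning and management: an average of the component ratings.
financial_components = {"Procurement compliance": "S",
                        "PM-FMO communication": "MS",
                        "Timeliness of financial reports": "MU"}
financial_overall = to_rating(
    sum(to_score(r) for r in financial_components.values()) / len(financial_components)
)

# M&E: the overall rating cannot be higher than the rating on
# M&E plan implementation (a cap, applied to the evaluator's judgement).
me_proposed = "S"              # evaluator's overall judgement across sub-criteria
me_plan_implementation = "MU"
me_overall = min([me_proposed, me_plan_implementation], key=to_score)

print(sustainability_overall, financial_overall, me_overall)  # -> MU MS MU
```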
M&E plan implementation will be considered critical for the overall assessment of the M&E system. Thus, the overall rating for M&E will not be higher than the rating on M&E plan implementation. Overall project rating. The overall project rating should consider parameters ‘A-E’ as being the most important with ‘C’ and ‘D’ in particular being very important. Annex 3. Project costs and co-financing tables Project Costs Component/subcomponent/output Estimated cost at design Actual Cost Expenditure ratio (actual/planned) Co-financing UNEP own Financing (US$1,000) Co financing (Type/Source) Planne d Actua l Government Other* Total (US$1,000) (US$1,000) (US$1,000) Planne d Actua l Planne d Actua l Planne d Total Disbursed (US$1,000 ) Actua l Grants Loans Credits Equity investments In-kind support Other (*) Total s * This refers to contributions mobilized for the project from other multilateral agencies, bilateral development cooperation agencies, NGOs, the private sector and beneficiaries. Annex 4. Documentation list for the evaluation to be provided by the UNEP, UNDP and Project Team Project design documents (Approved Project Identification Form (PIF), Project Document and Annexes, minutes from the project validation meeting, GEF IW Tracking Tool completed prior to the GEF CEO Endorsement, etc.) GEF IW Tracking Tool completed prior to the MTR process starts, cleared by UNEP and UNDP Project supervision plan, with associated budget Correspondence related to project Supervision mission reports Steering Committee meeting documents, including agendas, meeting minutes, and any summary reports Project progress reports, including financial reports submitted Project Implementation Reviews (PIRs) Management memos related to project Other documentation of supervision feedback on project outputs and processes (e.g. comments on draft progress reports, etc.). Project revision and extension documentation Specific project outputs: guidelines, manuals, training tools, software, websites, press communiques, posters, videos and other advertisement materials etc. Any other relevant document deemed useful for the evaluation Annex 5. Template for the assessment of the quality of project design General guideline: The original project document, the TOC-D, and the RTOC-D are key sources of information for completing this assessment. 1. Project Document Project preparation and readiness 1 Does the project document provide a description of stakeholder consultation during project design process? 2 Does the project document include a clear stakeholder analysis? Are stakeholder needs and priorities clearly understood and integrated in project design? (see annex 9) 3 Does the project document entail a clear situation analysis? 4 Does the project document entail a clear problem analysis? 5 Does the project document entail a clear gender analysis? Relevance 6 Is the project document i) clear in terms of relevance to: 7 8 9 10 Is the project document i) clear in terms of relevance Global, Regional, Sub-regional and National environment al issues and needs? ii) UNEP mandate iii) the relevant GEF focal areas, strategic priorities and operational programme(s)? (if appropriate) iv) Stakeholder priorities and needs? Gender equity Addressed by PRC Evaluation Comments Rating Addressed by PRC Evaluation Comments Rating 11 to cross-cutting issues ii) 12 iii) Intended Causality 13 14 15 16 17 18 19 20 21 22 23 24 Results SouthSouth Cooperatio n Bali Strategic Plan and Are the outcomes realistic? 
14. Are the causal pathways from project outputs [goods and services] through outcomes [changes in stakeholder behaviour] towards impacts clearly and convincingly described?
15. Is there a clearly presented Theory of Change or intervention logic for the project?
16. Is the timeframe realistic? What is the likelihood that the anticipated project outcomes can be achieved within the stated duration of the project?
17. Are activities appropriate to produce outputs?
18. Are activities appropriate to drive change along the intended causal pathway(s)?
19. Are impact drivers and assumptions clearly described for each key causal pathway?
20. Are the roles of key actors and stakeholders clearly described for each key causal pathway?
21. Is the ToC-D terminology (result levels, drivers, assumptions etc.) consistent with UNEP definitions (Programme Manual)?

Efficiency
22. Does the project intend to make use of / build upon pre-existing institutions, agreements and partnerships, data sources, synergies and complementarities with other initiatives, programmes and projects etc. to increase project efficiency?

Sustainability / Replication and Catalytic effects
23. Does the project design present a strategy / approach to sustaining outcomes / benefits?
24. Does the design identify social or political factors that may influence positively or negatively the sustenance of project results and progress towards impacts?
25. Does the design foresee sufficient activities to promote government and stakeholder awareness, interests, commitment and incentives to execute, enforce and pursue the programmes, plans, agreements, monitoring systems etc. prepared and agreed upon under the project?
26. If funding is required to sustain project outcomes and benefits, does the design propose adequate measures / mechanisms to secure this funding?
27. Are financial risks adequately identified, and does the project describe a clear strategy on how to mitigate the risks (in terms of the project's sustainability)?
28. Does the project design adequately describe the institutional frameworks, governance structures and processes, policies, sub-regional agreements, legal and accountability frameworks etc. required to sustain project results?
29. Does the project design identify environmental factors, positive or negative, that can influence the future flow of project benefits? Are there any project outputs or higher-level results that are likely to affect the environment, which, in turn, might affect the sustainability of project benefits?
30. Does the project design foresee adequate measures to promote replication and upscaling / does the project have a clear strategy to promote replication and upscaling?
- Are the planned activities likely to generate the level of ownership by the main national and regional stakeholders necessary to allow for the project results to be sustained?

Learning, Communication and outreach
- Has the project identified appropriate methods for communication with key stakeholders during the project life? Are plans in place for dissemination of results and lesson sharing?
- Do learning, communication and outreach plans build on an analysis of existing communication channels and networks used by key stakeholders?

Risk identification and Social Safeguards
31. Are all assumptions identified in the ToC presented as risks in the risk management table? Are risks appropriately identified in both the ToC and the risk table?
32. Is the risk management strategy appropriate?
33. Are potentially negative environmental, economic and social impacts of the project identified?
34. Does the project have adequate mechanisms to reduce its negative environmental footprint?
35. Have risks and assumptions been discussed with key stakeholders?

Governance and Supervision Arrangements
36. Is the project governance model comprehensive, clear and appropriate? (Steering Committee, partner consultations etc.)
37. Are supervision / oversight arrangements clear and appropriate?

Management, Execution and Partnership Arrangements
38. Have the capacities of partners been adequately assessed?
39. Are the execution arrangements clear, and are roles and responsibilities within UNEP clearly defined?
40. Are the roles and responsibilities of external partners properly specified?

Financial Planning / budgeting
41. Are there any obvious deficiencies in the budgets / financial planning (coherence of the budget, do figures add up etc.)?
42. Has the budget been reviewed and agreed to be realistic with key project stakeholders?
43. Is the resource utilization cost-effective? How realistic is the resource mobilization strategy?
44. Are the financial and administrative arrangements, including flows of funds, clearly described?

Monitoring
45. Does the logical framework:
   - capture the key elements of the Theory of Change for the project?
   - have 'SMART' indicators for outcomes and objectives?
   - have appropriate 'means of verification'?
   - include milestones that are appropriate and sufficient to track progress and foster management towards outputs and outcomes?
46. Is there baseline information in relation to key performance indicators?
47. How well has the method for the baseline data collection been explained?
48. Has the desired level of achievement (targets) been specified for indicators of outputs and outcomes?
49. How well are the performance targets justified for outputs and outcomes?
50. Has a budget been allocated for monitoring project progress in implementation against outputs and outcomes?
51. Does the project have a clear knowledge management approach?
52. Have mechanisms for involving key project stakeholder groups in monitoring activities been clearly articulated?

Evaluation
53. Is there an adequate plan for evaluation?
54. Has the time frame for evaluation activities been specified?
55. Is there an explicit budget provision for the mid-term review and terminal evaluation? Is the budget sufficient?

2. Project alignment with the Sub-programme
Each item below is assessed in terms of whether it was addressed by the PRC, together with evaluation comments and a rating.
1. Does the project form a coherent part of the programme framework?
2. Is the relevance of the project in terms of SP higher-level results clearly described?
3. How well have linkages with other projects in the same Programme Framework been described?
4. Where linkages with other SPs are mentioned, are they well articulated?
5. If the project is a pilot, is it clear why the pilot is relevant to higher-level SP results?
6. Are the designed activities relevant in terms of contributing to / producing the identified PoW Output(s)? (Based on project design only)
7. Are output indicators appropriate to measure contribution to / delivery of the PoW Output(s)?
8. What is the likelihood that the project's contribution towards the PoW output(s) will be achieved within the duration of the PoW? (Consider also funding, timing, staffing etc.)
9. Are the intended results likely to contribute to the stated EA? (Based on design only)
10. Is the pathway from project outputs to EA contribution clearly described?
11. Are the indicators appropriate to measure contribution to the EA?
12. What is the likelihood that the project's contribution towards the EA will be achieved within the duration of the PoW? (Consider also funding, timing, staffing etc.)
13. Do project milestones track progress to the PoW output and all the way to the EA?
3. Project approval process (specific to the project under review) (Evaluation Comments)
1. What were the main issues raised by PRC that were addressed?
2. What were the main issues raised by PRC that were not addressed?
3. Were there any major issues not flagged by PRC?
Annex 6. Introduction to Theory of Change / Impact pathways, the ROtI Method and the ROtI Results Score sheet (old version – a new version is under development)
Terminal evaluations of projects are conducted at, or shortly after, project completion. At this stage it is normally possible to assess the achievement of the project's outputs. However, the possibilities for evaluating the project's outcomes are often more limited, and the feasibility of assessing project impacts at this time is usually severely constrained. Full impacts often accrue only after considerable time-lags, and it is common for there to be a lack of long-term baseline and monitoring information to aid their evaluation. Consequently, substantial resources are often needed to support the extensive primary field data collection required for assessing impact, and there are concomitant practical difficulties because project resources are seldom available to support the assessment of such impacts when they have accrued – often several years after completion of activities and closure of the project. Despite these difficulties, it is possible to enhance the scope and depth of information available from Terminal Evaluations on the achievement of results through rigorous review of project progress along the pathways from outcome to impact. Such reviews identify the sequence of conditions and factors deemed necessary for project outcomes to yield impact, and assess the current status of, and future prospects for, results. In the evaluation literature these relationships are variously described as 'Theories of Change', 'Impact Pathways', 'Results Chains', 'Intervention logic' and 'Causal Pathways' (to name only some). Theory of Change (ToC) / impact pathways Figure 1 shows a generic impact pathway which links the standard elements of project logical frameworks in a graphical representation of causal linkages. When specified in more detail, for example including the key users of outputs, the processes (the arrows) that lead to outcomes, and details of performance indicators, analysis of impact pathways can be invaluable as a tool for both project planning and evaluation. Figure 1. A generic results chain, which can also be termed an 'Impact Pathway' or Theory of Change. The pathways summarise causal relationships and help identify or clarify the assumptions in the intervention logic of the project. For example, in Figure 2 below the eventual impact depends upon the behaviour of the farmers in using the new agricultural techniques they have learnt from the training.
The project design for the intervention might be based on the upper pathway, assuming that the farmers can now meet their needs from more efficient management of a given area, thereby reducing the need to expand the cultivated area and ultimately reducing pressure on nearby forest habitat. The evidence gathered in the evaluation may, in some locations, follow the lower of the two pathways: the improved farming methods offer the possibility of increased profits and create an incentive for farmers to cultivate more land, resulting in clearance or degradation of the nearby forest habitat. Figure 2. An impact pathway / ToC for a training intervention intended to aid forest conservation. The GEF Evaluation Office has recently developed an approach to assess the likelihood of impact that builds on the concepts of Theory of Change / causal chains / impact pathways. The method is known as Review of Outcomes to Impacts (ROtI)5 and has three distinct stages: a. Identifying the project's intended impacts b. Review of the project's logical framework c. Analysis and modelling of the project's outcomes-impact pathways: reconstruction of the project's Theory of Change. The identification of the project's intended impacts should be possible from the 'objectives' statements specified in the official project document. The second stage is to review the project's logical framework to assess whether the design of the project is consistent with, and appropriate for, the delivery of the intended impact. The method requires verification of the causal logic between the different hierarchical levels of the logical framework, moving 'backwards' from impacts through outcomes to the outputs; the activities level is not formally considered in the ROtI method6. The aim of this stage is to develop an understanding of the causal logic of the project intervention and to identify the key 'impact pathways'. In reality such processes are often complex: they might involve multiple actors and decision-processes and are subject to time-lags, meaning that project impact often accrues long after the completion of project activities. The third stage involves analysis of the 'impact pathways' that link project outcomes to impacts. The pathways are analysed in terms of the 'assumptions' and 'drivers' that underpin the processes involved in the transformation of outputs to outcomes to impacts via intermediate states (see Figure 3). Project outcomes are the direct intended results stemming from the outputs, and they are likely to occur either towards the end of the project or in the short term following project completion. Intermediate states are the transitional conditions between the project's direct outcomes and the intended impact. They are necessary changes expected to occur as a result of the project outcomes, which are expected, in turn, to result in impact. There may be more than one intermediate state between the immediate project outcome and the eventual impact. When mapping outcomes and intermediate states it is important to include reference to the stakeholders who will act on, or be affected by, the change. Drivers are defined as the significant external factors that, if present, are expected to contribute to the realization of the intended impacts and that can be influenced by the project / project partners and stakeholders. Assumptions are
the significant external factors that, if present, are expected to contribute to the realization of the intended impacts but are largely beyond the control of the project / project partners and stakeholders. The drivers and assumptions are considered when assessing the likelihood of impact, and the sustainability and replication potential of the project.
5 GEF Evaluation Office (2009). ROtI: Review of Outcomes to Impacts Practitioners Handbook. http://www.gefweb.org/uploadedFiles/Evaluation_Office/OPS4/Roti%20Practitioners%20Handbook%2015%20June%202009.pdf
6 Evaluation of the efficiency and effectiveness in the use of resources to generate outputs is already a major focus within UNEP Terminal Evaluations.
Since project logical frameworks do not often provide comprehensive information on the processes by which project outputs yield outcomes and eventually lead, via 'intermediate states', to impacts, the impact pathways need to be carefully examined and the following questions addressed:
o Are there other causal pathways that would stem from the use of project outputs by other potential user groups?
o Is (each) impact pathway complete? Are there any missing intermediate states between project outcomes and impacts?
o Have the key drivers and assumptions been identified for each 'step' in the impact pathway?
Figure 3. A schematic 'impact pathway' showing intermediate states, assumptions and impact drivers7 (adapted from GEF EO 2009)
7 The GEF frequently uses the term "impact drivers" to indicate drivers needed for outcomes to lead to impact. However, in UNEP it is preferred to use the more general term "drivers" because such external factors might also affect change processes occurring between outputs and outcomes.
In ideal circumstances, the Theory of Change of the project is reconstructed by means of a group exercise involving key project stakeholders. The evaluators then facilitate a collective discussion to develop a visual model of the impact pathways using cards and arrows taped on a wall. The component elements (outputs, outcomes, intermediate states, drivers, assumptions, intended impacts etc.) of the impact pathways are written on individual cards and arranged and discussed as a group activity. Figure 4 below shows the suggested sequence of the group discussions needed to develop the ToC for the project.
Figure 4. Suggested sequencing of group discussions (from GEF EO 2009)
In practice, there is seldom an opportunity for the evaluator to organise such a group exercise during the inception phase of the evaluation. The reconstruction of the project's Theory of Change can then be done in two stages. The evaluator first carries out a desk-based identification of the project's impact pathways, specifying the drivers and assumptions, during the inception phase of the evaluation, and then, during the main evaluation phase, discusses this understanding of the project logic in group discussions or individual interviews with key project stakeholders. Once the Theory of Change for the project is reconstructed, the evaluator can assess the design of the project intervention and collate evidence through the evaluation process that will inform judgements on the extent and effectiveness of implementation. Performance judgements are made while noting that project contexts can change and that adaptive management is required during project implementation. The Review of Outcomes towards Impact (ROtI) method requires ratings for the outcomes achieved by the project and the progress made towards the 'intermediate states' at the time of the evaluation.
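When reconstructing the Theory of Change at the desk, it can help to record each pathway element together with the stakeholders, drivers and assumptions attached to it. The sketch below is purely illustrative (a minimal Python representation assumed for this purpose; the class names and example values are hypothetical and are not part of the ROtI guidance):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PathwayStep:
    """One step in an impact pathway (e.g. an outcome or an intermediate state)."""
    description: str
    stakeholders: List[str] = field(default_factory=list)  # who acts on / is affected by the change
    drivers: List[str] = field(default_factory=list)       # external factors the project can influence
    assumptions: List[str] = field(default_factory=list)   # external factors beyond the project's control

@dataclass
class ImpactPathway:
    """A causal pathway from outputs, via outcomes and intermediate states, to impact."""
    outputs: List[str]
    outcomes: List[PathwayStep]
    intermediate_states: List[PathwayStep]
    intended_impact: str

# Hypothetical example, loosely based on the training intervention in Figure 2.
pathway = ImpactPathway(
    outputs=["Farmers trained in improved agricultural techniques"],
    outcomes=[PathwayStep(
        description="Farmers apply the new techniques on their existing plots",
        stakeholders=["farmers"],
        drivers=["extension services follow up with trained farmers"],
        assumptions=["market prices do not reward expansion of the cultivated area"])],
    intermediate_states=[PathwayStep(
        description="Pressure to clear nearby forest habitat is reduced",
        stakeholders=["farmers", "local land-use authorities"])],
    intended_impact="Forest habitat conserved",
)
```

In the group exercise shown in Figure 4 the same elements are simply written on cards; the structure above only mirrors that sequence.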
According to the GEF guidance on the method: "The rating system is intended to recognize project preparation and conceptualization that considers its own assumptions, and that seeks to remove barriers to future scaling up and out. Projects that are a part of a long-term process need not at all be "penalized" for not achieving impacts in the lifetime of the project: the system recognizes projects' forward thinking to eventual impacts, even if those impacts are eventually achieved by other partners and stakeholders, albeit with achievements based on present day, present project building blocks." For example, a project receiving an "AA" rating appears likely to deliver impacts, while for a project receiving a "DD" rating this would be very unlikely, owing to low achievement of outcomes and the limited likelihood of achieving the intermediate states needed for eventual impact (see Table 1).
Table 1. Rating scale for outcomes and progress towards 'intermediate states'
Outcome rating:
D: The project's intended outcomes were not delivered.
C: The project's intended outcomes were delivered, but were not designed to feed into a continuing process after project funding.
B: The project's intended outcomes were delivered, and were designed to feed into a continuing process, but with no prior allocation of responsibilities after project funding.
A: The project's intended outcomes were delivered, and were designed to feed into a continuing process, with specific allocation of responsibilities after project funding.
Rating on progress toward intermediate states:
D: No measures taken to move towards intermediate states.
C: The measures designed to move towards intermediate states have started, but have not produced results.
B: The measures designed to move towards intermediate states have started and have produced results, which give no indication that they can progress towards the intended long-term impact.
A: The measures designed to move towards intermediate states have started and have produced results, which clearly indicate that they can progress towards the intended long-term impact.
Thus a project will end up with a two-letter rating, e.g. AB, CD, BB etc. In addition, the rating is given a '+' notation if there is evidence of impacts accruing within the life of the project. The possible rating permutations are then translated onto the usual six-point rating scale used in all UNEP project evaluations in the following way.
Table 2. How the ratings for 'achievement of outcomes' and 'progress towards intermediate states' translate into ratings for the 'overall likelihood of impact achievement' on a six-point scale.
Highly Likely: AA, AB, BA, CA, BB+, CB+, DA+, DB+
Likely: BB, CB, DA, DB, AC+, BC+
Moderately Likely: AC, BC, CC+, DC+
Moderately Unlikely: CC, DC, AD+, BD+
Unlikely: AD, BD, CD+, DD+
Highly Unlikely: CD, DD
Projects that achieve documented changes in environmental status during the project's lifetime receive the positive impact notation '+', which moves the double-letter rating up one space on the six-point scale shown in Table 2. The ROtI method provides a basis for comparisons across projects through application of a rating system that can indicate the expected impact. However, it should be noted that while this provides a relative scoring for all projects assessed, it does not imply that the results from projects can necessarily be aggregated. Nevertheless, since the approach yields greater clarity in the 'results metrics' for a project, opportunities where aggregation of project results might be possible can be identified more readily. The translation rule is illustrated in the sketch below.
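As a minimal illustrative sketch of the translation described in Table 2 (the function name and structure below are assumptions made for illustration, not part of the GEF or UNEP guidance), the two-letter rating and its optional '+' can be mapped onto the six-point scale as follows:

```python
# Base position of each two-letter ROtI rating on the six-point scale,
# taken from Table 2 (1 = Highly Unlikely ... 6 = Highly Likely).
SIX_POINT = ["Highly Unlikely", "Unlikely", "Moderately Unlikely",
             "Moderately Likely", "Likely", "Highly Likely"]

BASE_LEVEL = {
    "AA": 6, "AB": 6, "BA": 6, "CA": 6,
    "BB": 5, "CB": 5, "DA": 5, "DB": 5,
    "AC": 4, "BC": 4,
    "CC": 3, "DC": 3,
    "AD": 2, "BD": 2,
    "CD": 1, "DD": 1,
}

def likelihood_of_impact(rating: str) -> str:
    """Translate a ROtI rating such as 'BB' or 'CC+' to the six-point scale.

    A trailing '+' (documented impact within the project lifetime) moves the
    rating up one space on the scale, as stated in the text above.
    """
    has_plus = rating.endswith("+")
    level = BASE_LEVEL[rating.rstrip("+")]
    if has_plus:
        level = min(level + 1, 6)
    return SIX_POINT[level - 1]

# e.g. likelihood_of_impact("CB") -> "Likely"; likelihood_of_impact("CB+") -> "Highly Likely"
```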
The ROtI Results Score sheet is structured as follows:
Results rating of project entitled: ____________________
Outputs: 1. 2. 3.
Outcomes: 1. 2. 3. | Rating (D – A) | Rating justification:
Intermediate states: 1. 2. 3. | Rating (D – A) | Rating justification:
Impact (GEBs): 1. 2. 3. | Rating (+) | Rating justification:
Overall rating:
Scoring Guidelines
The achievement of Outputs is largely assumed. Outputs are concrete things such as training courses held, numbers of persons trained, studies conducted, networks established, websites developed, and many others. Outputs reflect where and for what project funds were used. They are not rated: projects generally succeed in spending their funding. Outcomes, on the other hand, are the first level of intended results stemming from the outputs: not so much the number of persons trained, but how many of those trained then demonstrated that they had gained the intended knowledge or skills; not a study conducted, but one that could change the evolution or development of the project; not so much a network of NGOs established, but a network that showed potential for functioning as intended. A sound outcome might be genuinely improved strategic planning in SLM stemming from workshops, training courses, and networking.
Examples:
Funds were spent, outputs were produced, but nothing in terms of outcomes was achieved. People attended training courses but there is no evidence of increased capacity. A website was developed, but no one used it. (Score = D)
Outcomes achieved but are dead ends; no forward linkages to intermediate states in the future. People attended training courses, increased their capacities, but all left for other jobs shortly after, or were not given opportunities to apply their new skills. A website was developed and was used, but achieved little or nothing of what was intended because users had no resources or incentives to apply the tools and methods proposed on the website in their jobs. (Score = C)
Outcomes plus implicit linkages forward. Outcomes achieved and have implicit forward linkages to intermediate states and impacts. Collaboration, as evidenced by meetings and decisions made among a loose network, is documented and should lead to better planning. Improved capacity is in place and should lead to desired intermediate outcomes. Providing implicit linkages to intermediate states is probably the most common case when outcomes have been achieved. (Score = B)
Outcomes plus explicit linkages forward. Outcomes have definite and explicit forward linkages to intermediate states and impacts. An alternative energy project may result in solar panels installed that reduce reliance on local wood fuels, with the outcome quantified in terms of reduced C emissions. Explicit forward linkages are easy to recognize in being concrete, but are relatively uncommon. (Score = A)
Intermediate states: The intermediate states indicate achievements that lead to Global Environmental Benefits (GEBs), especially if the potential for scaling up is established.
'Outcomes' scored C or D: if the outcomes above scored C or D, there is no need to score intermediate states, since their achievement is then not possible.
In spite of outcomes, implicit linkages and follow-up actions, the project dead-ends. Although the outcomes achieved have implicit forward linkages to intermediate states and impacts, the project dead-ends.
Outcomes turn out to be insufficient to move the project towards intermediate states and the eventual achievement of GEBs. Collaboration, as evidenced by meetings among participants in a network, never progresses further. The implicit linkage based on follow-up never materializes. Although the outcomes involve, for example, further participation and discussion, such actions do not take the project forward towards the intended intermediate impacts. People have fun getting together and talking more, but nothing, based on the implicit forward linkages, actually eventuates. (Score = D)
The measures designed to move towards intermediate states have started, but have not produced results; barriers and/or unmet assumptions may still exist. In spite of sound outputs and explicit forward linkages, there is limited possibility of achieving the intermediate states because barriers were not removed or assumptions were not met. This may be the fate of several policy-related, capacity-building and networking projects: people work together, but fail to develop a way forward towards concrete results, or fail to successfully address inherent barriers. The project may increase ground cover and/or carbon stocks, may reduce grazing or GHG emissions, and may have project-level recommendations regarding scaling up; but because barriers remain or key assumptions prove mistaken, scaling up remains limited and unlikely to be achieved at larger scales. Barriers can be policy and institutional limitations; (mis-)assumptions may have to do with markets or public–private sector relationships. (Score = C)
Barriers and assumptions are successfully addressed. Intermediate state(s) planned or conceived have feasible, direct and explicit forward linkages to impact achievement; barriers and assumptions are successfully addressed. The project achieves measurable intermediate impacts, and works to scale up and out, but falls well short of scaling up to global levels, such that achievement of GEBs still lies in doubt. (Score = B)
Scaling up and out over time is possible. Measurable intermediate state impacts are achieved, and scaling up to global levels and the achievement of GEBs appear to be well within reach over time. (Score = A)
Impact: actual changes in environmental status. 'Intermediate states' scored B or A: measurable impacts achieved at a globally significant level within the project life-span. (Score = '+')
Annex 7. Stakeholder Analysis for the Evaluation Inception Report
The evaluator should request the project team to provide a list of key stakeholders, and evidence of stakeholder mapping and analysis. If the project is unable to provide this, or if the evaluation team feels the information provided is not complete, the evaluator should develop the stakeholder map based on evidence provided in the project document (using methods described in the Programme Manual or other stakeholder mapping techniques of their choice). The purpose of stakeholder analysis in the preparation of the evaluation inception report is:
1. To understand which individuals or groups are likely to have been affected by, or to have affected, the activities of the project.
2. To ensure that the evaluation methodology includes mechanisms for the participation of key stakeholder groups in the process.
3. To enable the evaluation to identify and make use of key channels of communication between the project and its stakeholders (and between the stakeholders themselves).
In the review of project design the evaluator should assess whether the project addresses the following issues (as specified by UNEP's Quality Assessment Section8):
- Have all stakeholders9 who are affected by, or who could affect (positively or negatively), the project been identified and explained in the stakeholder analysis?
- Did the main stakeholders participate in the design stages of the project and did their involvement influence the project design?
- Are the economic, social and environmental impacts on the key stakeholders identified, with particular reference to the most vulnerable groups10?
- Have the specific roles and responsibilities of the key stakeholders been documented in relation to project delivery and effectiveness?
- For projects operating at country level, are the stakeholder roles country specific? Is there a lead national or regional partner for each country/region involved in the project?
In the review of project outputs and outcomes, the evaluation should consider:
- Were outputs accessible to all the relevant stakeholder groups?
- Have desired outcomes and impacts occurred amongst all stakeholder groups (and if not, why might this be)?
- Have there been any unanticipated outcomes or impacts, with particular reference to the most vulnerable groups?
In the review of factors affecting performance, the evaluation should consider the participation of key stakeholders: what were the roles and responsibilities of key stakeholders, and how did their performance affect the achievement of project outputs and outcomes?
8 See the Quality Assessment Section's Matrix for Project Review. Information on stakeholder analysis can also be found in UNEP's Programme Manual.
9 Stakeholders can be governmental and non-governmental, including business and industry. Project beneficiaries are often representatives of civil society and are defined within UNEP as belonging to the nine Major Groups defined in Agenda 21: Business and Industries, Children & Youth, Farmers, Indigenous People and their communities, Local Authorities, NGOs, the Scientific & Technological Community, Women, and Workers and Trade Unions.
10 Vulnerable groups such as: women, children, youth, elderly people, indigenous peoples, local communities, persons with disabilities and people living below the poverty line.