American Evaluation Association Annual Conference - 2013
Managing Evaluations for Consistently High Quality
Molly Hageboeck

USAID M&E Projects Overseas Managed by MSI
• Pakistan
• Afghanistan
• Colombia
• Uganda
• South Sudan
• Kenya
• Ethiopia

Keys to Success
• Evaluations are projects – they can be managed
• Identify key intervention points – quality checkpoints
• Create tools for exerting quality control at the checkpoints
• Share the tools with clients and evaluation teams
-- Field handbook
-- New website MSI built for USAID E3 to improve M&E, which includes evaluation management tools

MSI Evaluation Management Checkpoints for USAID's Process

Stages 1–2: Decision to Evaluate to Issuance of SOW
• Decision to evaluate
• Evaluation Manager assigned
• Evaluation parameters defined (type, timing)
• Development partner input (as appropriate)
• Evaluation design/plan developed (USAID, initial version)
• Evaluation dissemination/utilization plan developed by USAID (initial version – includes a list of what the evaluation team needs to provide to USAID)
• Design/plan reviewed/approved (USAID, initial version)
• SOW drafted
• SOW reviewed and approved (Quality Control checkpoint)
• Solicitation issued (if external evaluators are to be involved)

Stage 3: Proposal Review to Approval for Data Collection to Begin
• Proposals reviewed / team selected
• Team inception report on performance monitoring findings, if required by SOW (Quality Control checkpoint)
• Team planning meeting (TPM)
• Initial meetings with development partners
• Detailed evaluation design/plan developed/refined by team
• Evaluation design/plan (or modifications) approved (Quality Control checkpoint)
• Register the evaluation with USAID/Washington

Support During Data Collection and Analysis
• Weekly status review with the team against the field work plan and schedule
• Troubleshooting as needed to assist the evaluation team in the field

Stage 4: Initial Evaluation Results Briefing to Final Report
• Initial briefing (on completeness) of evaluation findings, conclusions and recommendations (Quality Control checkpoint)
• Approval to proceed to drafting the report (if approval is required by SOW)
• Submission of draft report
• Oral briefing on draft report (if required by SOW)
• Review of draft report – feedback to team (Quality Control checkpoint)
• Evaluation dissemination/utilization plan updated/expanded by USAID (final version)
• Submission of final report
• Review/acceptance of final report and other deliverables

Stage 5: Dissemination of Final Report to Assessment of Evaluation Influence
• Dissemination of evaluation report and executive summary (per dissemination/utilization plan)
• Formal evaluation review meeting
• Evaluation review minutes disseminated
• Follow-up on implementation status of accepted recommendations (per dissemination/utilization plan)
• Follow-up on impact of evaluation (per utilization plan)

Quality Checkpoint 1: Evaluation Statement of Work (SOW)
Common Problems
• Management purpose is not clear/transparent
• Evaluation questions – too many, not matched to purpose, not feasible
• There isn't always an opportunity to comment on or negotiate the SOW
Solution
• Help your clients improve the SOWs they prepare
• Tool: MSI Checklist for Developing/Reviewing Evaluation SOWs – built in about 2000; given to USAID in 2010

Quality Checkpoint 2: Written Review of Existing Information Before Final Design
Common Problems
• Late receipt of project reports/performance data
• Team reviews often cursory – important data not extracted and shared
Solutions
• Ask for reports when the SOW is issued
• Develop/require a structured desk review product within a short time frame
• Tool: MSI Desk Review Template – the first deliverable from teams, before the final design

Quality Checkpoint 3: Final Evaluation Design/Plan Prior to Field Work
Common Problems
• The field team did not prepare the proposal-stage design – and may not follow it
• Teams too often start field work without a final design, a data collection and analysis plan (including a sampling plan), and all necessary instruments
Solutions
• Require a detailed evaluation design, formally reviewed/approved on a question-by-question basis from the actual team, including all instruments, before they get the keys to the jeep
• Provide teams with a structured format to get started
• Tool: MSI "Getting to Answers" Matrix – built in about 2005; given to USAID in 2010

Example 1: "Getting to Answers" Matrix – a blank template with one numbered row per evaluation question and the following columns:
• Evaluation Question
• Type of Answer Needed – Descriptive, Comparative (normative), or Cause-and-Effect, marked for each question
• Data Collection Method(s)
• Data Source(s)
• Sampling or Selection Criteria
• Data Analysis Method(s)

Quality Checkpoint 4: Post-Field Work and Analysis, Pre-Draft Briefing
Common Problems
• Teams start writing before they work out a clear flow of findings, conclusions and recommendations grounded in their evaluation evidence
• Many reports are not well supported by evidence
• Many mix up findings, conclusions and recommendations – and confuse readers
Solutions
• Required oral briefing, in bullets, to ensure all questions have been addressed and the findings–conclusions–recommendations (F-C-R) flow is logical
• Block remaining LOE (level of effort) until this step is passed – the team may need to gather more data before it writes

Quality Checkpoint 5: Structured, Quality-Focused Review of Draft Report
Common Problems
• Clients tend to review draft evaluation reports for substance, often skipping over structural and professional quality aspects
• Quality fine points may not get attention until the final stage – when all LOE has been spent
• Or they remain missed until a meta-evaluation finds the flaws
Solutions
• Evaluation quality review checklist – shared with teams the day they start and with all members of draft report review teams
• Checklist-based feedback to the team – and repeat use of the checklist on the final report to verify that improvements have been made
• Tool: MSI Checklist for Reviewing Evaluation Reports – built in about 2000; given to USAID in 2010

Current "News" on MSI's Evaluation Management System
• An update of the MSI Handbook for Field Teams is underway
• A recent meta-evaluation for USAID of 2009–2012 evaluations found problems that greater internal use of an evaluation management system and its associated tools would have caught – and a recommendation to strengthen internal evaluation management practices in USAID has been provided