Level Best
An Approach to Evaluation for Smaller and Grassroots Organizations
By Marcia Festen and Marianne Philbin
With special thanks to the Pratt Richards Group for additional slides and content

Learning Objectives
• Become more familiar with basic principles of evaluation
• Learn an approach that will make ongoing evaluation easier

Our Discussion Today
• Based on the premises, principles, and steps outlined in Level Best
• 5-Step Framework
• Worksheet

The Challenge
"It is easy to tell if you are succeeding in business—you make money. In philanthropy, measuring performance can be fiendishly tricky and take a lot longer."
Warren Buffett, quoted in "The New Powers in Giving," The Economist, June 2006

Why Evaluate Your Program?
• To learn, plan, and improve. With limited resources, you need to be as strategic as possible.
• To better articulate the progress you're making toward your goals.

But First…
Pick a card.

Your Experience
• What are the best and worst experiences you've had with evaluation?
• Your assumptions, anxieties, realities?

A Few Basic Principles
Evaluation becomes less daunting when you know that…

It's Not Research
• Monitoring – ongoing, regular measurement of program implementation or of the results of a service.
• Research – the systematic process of collecting data in a controlled environment in order to prove or disprove a hypothesis.

Definition of Evaluation
Evaluation is the systematic process of asking questions and then collecting information to help answer those questions, in order to improve the work of your organization and, often, to tell the story of change.

Shift Your Thinking
…from "prove" to "improve"
…from "judge" to "learn"
…from "after" to "before"
• Rather than thinking of evaluation as the test that follows the work, begin to think of it as the measures you put in place beforehand to help you run your programs.
• How will you evaluate the work you're planning for this year, or next year?

Many Methods, Much Jargon
Process evaluation, outcome evaluation, summative, formative, participatory, logic model, quasi-experimental design, etc.
Just start by asking:
• What do we want to know? Which is generally: ARE WE MAKING PROGRESS TOWARD OUR GOALS?

Quick Exercise
• "I wish I knew whether…" _______________________.

Level Best Evaluation Steps
1. Planning… why and what you will evaluate
2. Asking… the one or two key questions you want answered
3. Tracking… the activities you conduct and the signs that you're making progress toward your goals
4. Learning… from what you track and what it tells you
5. Using… the insight you gain to shape your next program

Limit It
• Rolling evaluation: our term for an evaluation process in which smaller organizations choose one or two questions to ask, or areas to evaluate, per year, and build evaluation learning year by year, over time.

For Example
"Although we are running three programs as part of our Leadership Development project, this year we will focus evaluation on our Parent Training Program. The purpose of this particular evaluation is to help us determine whether or not to invest more in this program in 2007."

Step 1: Planning
• Identify a project or area you'd like to evaluate. It can be… a priority, an idea, a particular program, a strategy within a program, an aspect of your process…
Worksheet

Avoid the BIGGEST Mistake
Know what you want to evaluate? First step: be absolutely clear on its GOAL.
You cannot evaluate something for which no goal has been set!
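For groups that prefer to keep their Step 1 notes in a file rather than on paper, here is a minimal sketch (in Python, purely as an illustration) of what the planning output might look like: the area chosen for this year's rolling evaluation, its goal statement, and why it is being evaluated now. The program name, goal wording, and the check_plan helper are hypothetical, not part of the Level Best worksheet itself.

```python
# A minimal, hypothetical sketch of a Step 1 planning record.
# All names and wording below are illustrative placeholders.
def check_plan(plan: dict) -> None:
    """Refuse to proceed if no goal has been set for the chosen program."""
    if not plan.get("goal"):
        raise ValueError("Set a clear GOAL first: you cannot evaluate "
                         "something for which no goal has been set.")

plan = {
    "project": "Parent Training Program",  # the area chosen to evaluate this year
    "goal": "Trained parents take on leadership roles in their schools",  # hypothetical goal statement
    "why_evaluate_now": "To help decide whether to invest more in this program next year",
}

check_plan(plan)
for field, value in plan.items():
    print(f"{field}: {value}")
```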
You Can't Just Stroll and Gather…
• It is a common misperception to assume that there is something simply "there" that will become apparent if you "point the camera" and shoot, or go looking in a general way.

How the Goal Shapes the Evaluation
Example: "The Kids Will Keep Their Rooms Clean"

For Every Program
* GOAL or DESIRED CHANGE:
* Rationale:
* SIGNS OF SUCCESS – We'll agree we've made progress toward this goal IF:
* STEPS to get us there:

Worksheet
• Fill in as we go
• Take a moment now to work together to fill in Boxes B, C, and D
• (For now, skip A!)

So: Remind Everyone of the PROJECT'S GOAL
What is the desired overall outcome for this project?
Write down the "goal statement" in Box C.
Worksheet

Why Evaluate This Now?
• Box D: Write down one purpose you could imagine having for the particular evaluation you might conduct.
Worksheet

Must-Have #1
Agreement as to the CONDITION that your organization or program is addressing (is your youth theater program trying to change kids, or to change boring local theater?). Clarity as to the CHANGE you hope to make.
The question you ask will change depending on the condition you see yourself addressing.
Worksheet

Identifying the "Condition" Your Program Addresses
For example…

Must-Have #2
A clear GOAL STATEMENT for any program or area you want to evaluate; the goal statement typically refers to the effect you want to have on the condition.

Step 2: Asking
• Once you determine the purpose of the evaluation, brainstorm with your team three or four questions you'd love to be able to answer better.
• Select one or two to explore in this evaluation. Put the others on a "calendar" for next year or the following year.

Types of Evaluation Questions
• In process evaluation, the guiding questions focus on the quality of a program's components or implementation.
• In outcome evaluation, the guiding questions focus on the extent to which a program is achieving its desired results.

"You"
You do work. When you evaluate how well you do what you do, it's called a PROCESS EVALUATION.

"Them"
Your work has results. It influences what people do or believe. When you evaluate what they do, it's called an OUTCOME EVALUATION.

Impact
Lots of work produces multiple outcomes over time. This equals IMPACT.

"Theory of Change"
The "IF" part of the statement often refers to what your organization will do (process); the "THEN" refers to what you hope will happen as a result (outcome). In some circles, this is referred to as your "Theory of Change."

So… Are We? / Did They?
• IF we offer reading tutors, THEN more people will be able to read.
• IF we educate teens effectively about sexually transmitted diseases, THEN fewer will engage in unsafe practices.

Theory of Change
IF we… [Strategies] THEN… [Immediate Outcomes], which will lead to… [Intermediate Outcomes] and, over the long term, to [Community or Social Change]: the intended IMPACT. (See the short sketch at the end of this step.)

What's Most Important
Not whether you're doing process or outcome evaluation… simply: What do you want to know?

All Flows from the Question
• You must ask a specific question: what you track, consider, and learn depends on the question you ask.

You Can Ask/Evaluate Anything
"Are teachers benefiting from the training we offer them? (How do we know?)"
"Do visitors like our exhibits? (How do we know?)"
"Are we serving girls as well as we serve boys in this program? (How do we know?)"
But you must ask something!
• Evaluation is NOT about collecting massive amounts of information and then attempting to sort through it later to see "what it says."
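Before moving on to tracking, here is an optional, minimal sketch of how an organization that likes structured files might record its IF/THEN statement and the one or two questions chosen for this year's evaluation. The reading-tutors strategy and outcome come from the example above; the impact wording, the question text, and the TheoryOfChange and EvaluationPlan names are hypothetical illustrations, not part of the framework.

```python
# A minimal sketch of recording an IF/THEN ("theory of change") statement
# together with this year's evaluation questions. Names and wording that do
# not appear in the deck above are hypothetical placeholders.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TheoryOfChange:
    if_we: str            # what the organization will do (process)
    then: str             # what you hope will happen as a result (outcome)
    intended_impact: str  # the long-term community or social change

@dataclass
class EvaluationPlan:
    program: str
    theory: TheoryOfChange
    questions_this_year: List[str] = field(default_factory=list)  # keep it to one or two
    questions_shelved: List[str] = field(default_factory=list)    # revisit in a later cycle

plan = EvaluationPlan(
    program="Reading Tutors",  # hypothetical program name
    theory=TheoryOfChange(
        if_we="offer reading tutors",
        then="more people will be able to read",
        intended_impact="higher adult literacy in the community",  # hypothetical
    ),
    questions_this_year=["Are the people we tutor reading more on their own? (How do we know?)"],
    questions_shelved=["Are we recruiting and retaining enough volunteer tutors?"],
)

print(f"IF we {plan.theory.if_we}, THEN {plan.theory.then}.")
print("This year's question(s):", *plan.questions_this_year, sep="\n  ")
```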
Step 3: Tracking
• List two or three things that, if they occurred, would be signs of progress: examples that show you're making progress toward the desired outcome for the project, or that provide information to help you answer the question you've posed.

Indicators
What information would tell you if you're making progress toward your desired outcome? What would you track to help you answer the evaluation question you posed? That's an "indicator."

Program Example
• Desired Outcome: Our theater's mainstage audience engagement increases.
• How We'll Know We're Making Progress #1: Mainstage attendance as a percent of total seat capacity will increase to 65% for the 2010-11 season.
• How We'll Know We're Making Progress #2: 90% of mainstage audience members completing the end-of-season survey will be "very" or "somewhat" satisfied with the season.
(A short worked sketch of computing these two indicators follows below.)

Organizational Example
• Desired Outcome: Board engagement improves.
• How we'll know we're making progress #1: 80% of Board members will contribute $500 or more in FY2011.
• How we'll know we're making progress #2: Each quarterly Board meeting will have 75% attendance or better in FY2011.

Golden Rule
In evaluation we want to measure what we can control.

So… Measure What Is in YOUR Control
Not everything that you want to achieve in your program will be measurable.

Track What's Closest to What YOU Do
The goal of our tutoring program is to help students…
-- someday get jobs
-- graduate from high school
-- improve their homework completion rate
-- attend the homework helper program
-- be referred to this program by a teacher
Keep drilling down; ask "What needs to happen before that can happen?" You can't track it all.

S.M.A.R.T. Indicators
• Specific
• Measurable
• Attainable
• Relevant
• Time-bound

Which Indicators to Track
• Refer back to the CONDITION you're hoping to change or address in the first place.

Use What You Have
• Think creatively: how might you be able to tell that you are achieving your desired results?
• Chances are, you're already "evaluating" on some level. What kinds of information are you already routinely gathering?
There are no laws; there is no one right way. There is no evaluation police.

Peer Feedback
• Volunteers: Share your question.
• Colleagues: What might your colleague track in order to answer that question?

Don't Overdo It
"Most of the time, what funders and nonprofits really want to know is if an intervention can have positive outcomes given the right conditions, and if the results are worth the investment -- and they only need to know these answers 'beyond a reasonable doubt'…. Usually, this doesn't require a great deal of time or money. It does, however, require being very clear about what you want to know, and why you want to know it."
(Report from the Gill Foundation)

Sources of Information
• Agency Records
• Questionnaires/Surveys
• Interviews
• Focus Groups
• Direct Observation

Who Does It?
Another good reason to have an Evaluation Working Group is to help think through the evaluation implementation. Who is going to:
• Collect the data?
• Enter it?
• Compile it for review?
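For groups that compile their tracked records in a spreadsheet or a short script, here is a minimal sketch of how the two indicators from the theater Program Example above might be computed. The indicator definitions and targets come from the example; every number and record below is made up purely for illustration.

```python
# A minimal sketch of computing the two example indicators from simple
# tracked records. Every number and record below is made up for illustration.

# Indicator 1: mainstage attendance as a percent of total seat capacity
# (the example target is 65% for the season).
performances = [
    {"tickets_sold": 310, "seat_capacity": 450},
    {"tickets_sold": 295, "seat_capacity": 450},
    {"tickets_sold": 388, "seat_capacity": 450},
]
attendance_pct = 100 * sum(p["tickets_sold"] for p in performances) \
    / sum(p["seat_capacity"] for p in performances)

# Indicator 2: percent of end-of-season survey respondents who were
# "very" or "somewhat" satisfied (the example target is 90%).
survey_responses = ["very", "somewhat", "very", "not satisfied", "somewhat", "very"]
satisfied = sum(1 for r in survey_responses if r in ("very", "somewhat"))
satisfied_pct = 100 * satisfied / len(survey_responses)

print(f"Mainstage attendance: {attendance_pct:.0f}% of capacity (target: 65%)")
print(f"Satisfied survey respondents: {satisfied_pct:.0f}% (target: 90%)")
```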
Step 4: Learning
Instead of thinking in terms of data analysis, it is helpful to think in terms of data interpretation. You are looking for patterns and trends; you are looking for correlations, not proof of causation.
What did you learn? Do your findings speak for themselves?

What Does Our Data Tell Us?
1. How successful were we in reaching our desired outcomes/goals?
2. What might we do next year to improve our programs/services?
3. How might these improvements lead to greater impact in the future?

Challenges
And if you don't like the results?

Step 5: Using
Plug your new knowledge back into the cycle!

The Evaluation Cycle
Program Planning → Program Implementation → Program Evaluation → back to Program Planning

Use Evaluation Results to:
• Make your case
• Plan your next move
• Gain perspective
• Identify resources needed

Do NOT Use Results to:
• Punish staff (though evaluation can identify staff/management issues that need to be addressed)
• Distract from other issues or cover up a problem
• Spend over budget
• Overconfidently dismiss alternate views or approaches
• Prove to the world how good you are (while it's good to toot your horn and show your results, don't do it at the expense of learning how to be even better)

Where to Use Evaluation Findings
Decision-making moments:
• Staff meetings
• Board meetings
• Annual planning
• Strategic planning
• Budget planning
Show-and-tell moments:
• Proposals and reports to funders
• Annual report
• Board meetings

Rolling Evaluation
Don't forget! The purpose of rolling evaluation is to isolate one or two key questions to ask PER year. So when you have successfully completed one cycle, begin again by revisiting some of the questions you "shelved" last year.

Level Best Evaluation Steps
1. Planning… identifying your evaluation goals
2. Asking… the one or two key questions you want answered
3. Tracking… the activities you conduct and the signs that you're making progress toward your goals
4. Learning… from what you track and what it tells you
5. Using… the insight you gain to shape your next program