Sheila Bell, MA
GSPIA 2117 Public Program Evaluation
REPORT WRITING

Class Objectives
Students will learn about:
- General components of a report
- The role of the audience in reporting
- The core message and how to write one
- Useful ways to report data
- Recommendations and lessons learned

Why Report Results?
- Improve your program
- Promote changes when necessary
- Crystallize ideas about findings
- Involve stakeholders
- Rally support for your program

A Good Evaluation Report
- Presents the findings
- Draws conclusions about the findings
- Is prepared with the audience in mind:
  - Who will be reading this?
  - What are the most important messages for this audience?

Writing Effective Reports
- Revisit your evaluation questions
- Know your audience and target reports specifically to them
- Simplify: make sure the core message is effective
- Focus on actions: give guidance about next steps

Writing Effective Reports
- Be aggressive! Look for opportunities to report results; don't wait for the audience to ask for more
- Know your audience and target reports specifically to them; know what they are looking for
- Simplify: get to the point! Make sure the core message is noticed, creates interest, and is followed up with details
- Focus on actions: move beyond general information and give some guidance about what to do next

Components of a Report
- Restate the evaluation questions
- Explain the program being evaluated: participants, activities, and desired impact
- Identify the evaluation activities: data collection methods, sources, and person responsible
- Describe the results of the evaluation data: relate to standards, compare, quote, categorize, clarify, and determine significance
- State conclusions, recommendations, and/or next steps

Communicating and Reporting Formats
Least interactive:
- Short written communications (memos and e-mail, postcards)
- Interim reports
- Final reports
- Executive summaries
- Newsletters, bulletins, briefs, brochures
- News media communications
- Web-site communications
Potentially interactive:
- Verbal presentations
- PowerPoint presentations and transparencies
- Flip charts
- Video presentations
- Posters
- Photography
- Cartoons
- Poetry
- Drama
Most interactive:
- Working sessions
- Synchronous electronic communications (chat rooms, teleconferences, videoconferences, web conferences)
- Personal discussions

Different Types of Audiences
Primary (want to use your results):
- Funders
- Board of directors
- Current partner agencies
- Staff
Secondary (associated with the program, have an interest):
- Staff
- Clients, present and future
- Potential partner agencies
- Community members

Consider Your Audience
- What background do they have in the subject or your program?
- What will they want to know?
- How much time and/or interest will they have?
- What will they do with it?

Audience Characteristics
Audience: ____________________
For each characteristic, circle the response that best describes this audience.
- How accessible? Easily / With Some Effort / With Substantial Effort / Don't Know
- Reading ability? High Level / Mid Level / Low Level / Non-Reader / Don't Know
- Familiarity with program? Very Familiar / Somewhat Familiar / Not Familiar / Don't Know
- Attitude toward/interest level in program? Positive/High / Neutral / Negative/Low / Don't Know
- Role in decision making about program or evaluation? Crucial / Important / Minor / No Role / Don't Know
- Familiarity with R&E in general? Very Familiar / Somewhat Familiar / Not Familiar / Don't Know
- Attitude toward/interest level in the evaluation? Positive/High / Neutral / Negative/Low / Don't Know
- Experience using evaluation findings? Substantial / Some / None / Don't Know
Purpose of Core Messages
Core messages:
- Capture and maintain the audience's attention
- Use an appropriate tone for the audience
- Are meaningful, have value, and are clear to the audience
- Put strong supporting evidence at the beginning of the message
- Come with information about how the different messages from an evaluation relate to each other
- Make the ultimate action the audience must take clear and relatively easy

A Successful Core Message
The audience is able to answer the following questions:
- What? The message
- So what? The relevance of the message
- Now what? The action related to the message

Core Message in 3 Easy Steps
1. Select 2-3 major evaluation findings
2. Summarize each finding in 1 or 2 sentences
3. Add one data point to each finding
FINDING + DATA POINT = MESSAGE

Creating Messages (worksheet)
Audience: ____________________
Characteristics: ____________________
Finding:
Message:
Data to support:
Relevance for this audience (So what?):
Action implications (Now what?):

REMEMBER...
If a tree falls in a forest and no one is there to hear it, does it make noise?
You have to DISSEMINATE your findings if you want others to take notice!

Presenting Data in Reports
- Make findings interesting: aim for simplicity and ease of interpretation
- Be clear about definitions
- Make comparisons carefully and appropriately

How to Present Effective Graphics
1. Write a message sentence
2. Decide what type of comparison this message implies
3. Experiment with several different graphics to show this comparison
4. Select the graphic that conveys the message best
5. Actually construct the graphic
6. Pilot-test the graphic and revise if necessary
7. Insert the final graphic into the report or briefing
(See the sketch after the chart examples below.)

Describing Process Data: Bar Charts
[Figure: grouped bar chart comparing counts for boys and girls; y-axis 0-180.]
Bar charts are useful for visually displaying data that can be used for comparison.

Describing Process Data: Bar Charts
[Figure: horizontal bar chart, "Number of Program Activities," with bars for Probation Officer Meetings, Court Appearances, Crisis Intervention, Teacher Meetings, and Children Tutored; x-axis 0-350.]

Describing Process Data: Line Charts
[Figure: line chart, "Number of Children Tutored," tracking School A and School B across the 1st through 4th quarters; y-axis 0-250.]

Describing Process Data: Pie Charts
[Figure: pie chart, "Staff Hours for Violence Prevention Program," with segments for Tutoring Hours, Preparation and Documentation, Teacher Meetings, Crisis Intervention, Court Appearances, and Probation Officer Meetings; the two largest segments are roughly 61% and 34%, and the remaining segments are 2% or less.]
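The seven chart-construction steps above are tool-agnostic; the slides do not prescribe software. As one illustration, here is a minimal sketch in Python with matplotlib (an assumption, not part of the original deck) that walks those steps for a horizontal bar chart like the "Number of Program Activities" example. The message sentence and the activity counts are hypothetical placeholders, since the exact values behind the slide are not recoverable.

```python
# Hedged sketch of the chart-construction steps; assumes Python with matplotlib installed.
# Counts are hypothetical placeholders, not the values from the original slide.
import matplotlib.pyplot as plt

# Step 1: write a message sentence; it becomes the chart title.
message = "Tutoring made up the largest share of program activities"

# Step 2: the message implies an item-to-item comparison, so a horizontal bar chart fits.
activities = ["Probation Officer Meetings", "Court Appearances",
              "Crisis Intervention", "Teacher Meetings", "Children Tutored"]
counts = [40, 60, 90, 150, 320]  # illustrative values only

# Steps 3-5: experiment with layouts, select one, and construct the graphic.
fig, ax = plt.subplots(figsize=(7, 4))
ax.barh(activities, counts)
ax.set_xlabel("Number of program activities")
ax.set_title(message)
fig.tight_layout()

# Steps 6-7: pilot-test with a colleague, revise, then insert into the report or briefing.
fig.savefig("program_activities.png", dpi=200)
```

The same workflow applies to the line chart and pie chart slides; only the type of comparison chosen in step 2 and the plotting call change.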
Q52: Which statement best describes your employment situation?

Response                                                    Employed (n=48)   Unemployed (n=354)   All respondents (n=402)
Dissatisfied with status, want to change it                 10% (n=5)         30% (n=105)          27% (n=110)
Neither satisfied nor dissatisfied, unsure of next steps    10% (n=5)         11% (n=40)           11% (n=45)
Satisfied with status, don't want to change it              80% (n=38)        59% (n=209)          61% (n=247)

Several graphic representations of the same responses:
[Figure: side-by-side pie charts for Employed (n=48) and Unemployed (n=354), each divided into the three response categories.]
[Figure: 100% stacked bar chart with one bar each for Employed (n=48), Unemployed (n=354), and All respondents (n=402), segmented by response category.]
[Figure: grouped bar chart with the three response categories on the x-axis and paired bars for Employed (n=48) and Unemployed (n=354); y-axis 0-100%.]
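To make the comparison of representations concrete, here is a similar hedged sketch, again assuming Python and matplotlib, that redraws two of the figures above (side-by-side pie charts and a 100% stacked bar chart) from the Q52 percentages reported in the table; the grouped-bar version would follow the same pattern.

```python
# Hedged sketch: two graphic representations of the Q52 responses from the table above.
# Assumes Python with matplotlib; the percentages come from the reported table.
import matplotlib.pyplot as plt

labels = ["Dissatisfied, want to change it",
          "Neither satisfied nor dissatisfied, unsure of next steps",
          "Satisfied, don't want to change it"]
employed = [10, 10, 80]     # percent of Employed respondents (n=48)
unemployed = [30, 11, 59]   # percent of Unemployed respondents (n=354)

fig, axes = plt.subplots(1, 3, figsize=(13, 4))

# Representation 1: side-by-side pie charts, one per employment group.
axes[0].pie(employed, autopct="%1.0f%%")
axes[0].set_title("Employed (n=48)")
axes[1].pie(unemployed, autopct="%1.0f%%")
axes[1].set_title("Unemployed (n=354)")

# Representation 2: 100% stacked bars, one bar per employment group.
groups = ["Employed\n(n=48)", "Unemployed\n(n=354)"]
bottoms = [0, 0]
for label, pair in zip(labels, zip(employed, unemployed)):
    axes[2].bar(groups, pair, bottom=bottoms, label=label)
    bottoms = [b + v for b, v in zip(bottoms, pair)]
axes[2].set_ylabel("Percent of respondents")
axes[2].set_title("Q52 responses by employment status")
axes[2].legend(fontsize=7)

fig.tight_layout()
fig.savefig("q52_representations.png", dpi=200)
```

Showing reviewers several versions like this supports step 3 of the graphics checklist: experiment with different graphics, then select the one that conveys the message best.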
Discussion
- Where you interpret your results
- Interpretation goes beyond the data: it adds context, determines meaning, and teases out substantive significance based on deduction or inference

Discuss/Interpret Your Results
- What do the results mean?
- What is the significance of the findings?
- Why did the findings turn out this way?*
- What are the possible explanations of the results?*
* Internal and external validity issues are discussed here.

Quantitative Data Problems You May Find
- Ceiling effect: questions were too easy, or participants already knew the material
- Lack of results: not enough time between pre- and post-test, or changes were not measured accurately
- Missing data: data tools were confusing or too long, or were administered incorrectly

Qualitative Data Problems You May Find
- Missing data: data tools were confusing or too long, or were administered incorrectly
- Inconsistent data across focus groups or interviewers: poor training or lack of practice
- Another reason for problems: real differences across or within the groups

Conclusions/Recommendations/Reflections and Lessons Learned
- This final section adds ACTION to analysis, interpretation, and judgment
- Only recommendations grounded in the data ought to be formulated

Conclusions/Recommendations/Reflections and Lessons Learned
- An important section: needed to get the support of stakeholders, to act on the recommendations, and to influence policy or practice
- May be one of the only sections read

Conclusions
- State major conclusions about the effectiveness of the program as a whole and its sub-parts
- Comment on the firmness of the conclusions
- Reserve judgment on any aspects of the program where data were missing or inaccurately recorded, or where parts of the program were new or revised
- Was anything overlooked that would make this evaluation an incomplete picture of the program?

Recommendations and Lessons Learned
- Recommendations: individual statements, based on the data results, that prescribe what should be done in the future and by whom
- Lessons learned: statements, based on the evaluation and the evaluator's expertise, that clarify the knowledge and value obtained and that can be used in the future

Successful Recommendations and Lessons Learned
- Are based on evaluation results
- Are noteworthy
- Can be implemented or acted on
- Are ample (recommendations)
- Are encompassing (lessons learned)

Recommendations
- Supported by the data
- Specific to parts of the program: strengths, weaknesses, suggestions, improvements
** This section is often crucial for program refinement and improvement.

Writing Effective Recommendations
- Draw possible recommendations from a wide variety of sources
- Work closely with agency personnel throughout the process
- Consider the larger context within which the recommendations must fit

Writing Effective Recommendations
- Generally, offer realistic recommendations
- Decide carefully whether to be general or specific
- Think twice before recommending fundamental changes
- Show the future implications of recommendations

Writing Effective Recommendations
- Make recommendations easy to understand
- Stay involved after recommendations have been accepted
- If a recommendation is not accepted, look for other opportunities to recommend it again

Recommendation Tracking
Track for each recommendation:
- Who is responsible for investigating, researching options, and implementing it
- What progress has been made: steps taken, outcomes, and next steps
- The length of time from when the recommendation was made to when action was completed
(One possible record format is sketched below.)

www.socialresearchmethods.net/kb/pecycle.php
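The Recommendation Tracking slide lists what to record but not a format; a spreadsheet with one row per recommendation works just as well. As a purely illustrative sketch in Python, with all field names my own rather than from the slides, one possible record structure covering the three tracked items might look like this:

```python
# Illustrative recommendation-tracking record; field and method names are assumptions,
# not taken from the slides, which only list what to track.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class RecommendationRecord:
    recommendation: str
    responsible: str                                    # who investigates, researches options, implements
    steps_taken: List[str] = field(default_factory=list)   # progress made so far
    outcomes: List[str] = field(default_factory=list)
    next_steps: List[str] = field(default_factory=list)
    date_made: Optional[date] = None
    date_completed: Optional[date] = None

    def days_to_completion(self) -> Optional[int]:
        """Length of time from when the recommendation was made to when action was completed."""
        if self.date_made and self.date_completed:
            return (self.date_completed - self.date_made).days
        return None
```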