CAPP Evaluation: Implementing Evidence-Based Programs in NYS
Jane Powers, ACT for Youth Center of Excellence, 2011
A presentation for Comprehensive Adolescent Pregnancy Prevention (CAPP) providers in New York State

Overview
• Review basic concepts of evaluation
• The science of implementation
• CAPP evaluation: implementing EBPs
• Evaluation partnership with the COE

1) You care about youth
2) You want to make a difference in their lives
3) You want to know whether you have made a difference in their lives

My question is, "Are we making a difference?"
Program evaluation can help us answer that question: Are we making a difference?

What Is Program Evaluation?
"Program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming." (Michael Quinn Patton, 1997)

Evaluation Terminology
Types of Program Evaluation
• PROCESS
• OUTCOME
http://www.actforyouth.net/youth_development/evaluation/

Process Evaluation
Focuses on:
• What happened in the program
  – Who got how much of what?
  – Was the program implemented as planned?
• Participant reactions to the program

Examples of Process Questions
• Which youth are participating in our program? (neighborhood, RHY, LGBTQ, FC)
• Who are we not reaching?
• How many sessions were offered? What % of participants attended all of the sessions?
• What program activities were conducted?
• Were any adaptations made to the EBP?

Outcome Evaluation
• Focuses on whether the program made a difference
• Answers the question: SO WHAT? What difference does the program make for participants, individuals, groups, families, and the community?

Examples of Outcome Questions
• Have adolescents increased their knowledge of different types of birth control?
• Have adolescents learned how to use a condom?
• Have attitudes toward condom use changed?
• Are parents more knowledgeable about adolescent sexuality?
• Do parents talk to their kids about contraception?

Process Data Are Foundational
In order to get good outcome data, you must first collect good process data.

CAPP Initiative
Goal 1: Promote healthy sexual behaviors and reduce the practice of risky sexual behaviors among adolescents.
Core Strategy 1: Provide comprehensive, age-appropriate, evidence-based, and medically accurate sexuality education to promote healthy sexual behaviors.

We know a lot about what works to prevent teen pregnancy:
EBP →
• Increase age of first intercourse
• Increase use of condoms
• Decrease number of sexual partners
• Decrease frequency of sex
→ Decrease teen pregnancy; promote adolescent sexual health

The Prevention Research Cycle (Feedback Loop)
1) Identify the problem or disorder and determine its extent
2) Identify risk and protective factors associated with the problem
3) Develop the intervention and conduct efficacy trials
4) Conduct large-scale effectiveness trials of the intervention
5) Implement the program in the community and conduct ongoing evaluation
Reproduced from Fig. 1, "The interactive systems framework for dissemination and implementation" (p. 174), in Wandersman et al., 2008.

Just DO IT!
What do we know about implementing EBPs in communities?

Taking EBPs to Scale
• Very little is known about the processes required to effectively implement EBPs on a national scale (Fixsen et al., 2005).
• Research to support the implementation activities that are being used is even rarer.
• While many EBPs have yielded positive outcomes in research settings, the record at the local level of "practice" is mixed (Wandersman, 2009; Lesesne et al., 2008).

What Do We Know about Implementation?
Durlak and DuPre (2008):
• The level of implementation influences program outcomes.
• If EBPs are not implemented with fidelity and quality, they are not likely to produce the outcomes observed in research.
• Achieving good implementation increases the chances of program success and yields stronger benefits for participants.

Factors Affecting Implementation
• Community level
• Facilitator characteristics
• Program characteristics
• Organizational capacity
• Training and TA

The Need to Document Implementation
• Assessment of implementation is critical in program evaluation.
• Evaluations that lack carefully collected implementation data are incomplete.
• Our understanding of program outcomes rests on knowing how the intervention was delivered.

The Fidelity Tension
• Program developers and prevention researchers are concerned that changes in the implementation of an EBP will dilute its effectiveness.
• Community leaders and practitioners are concerned that "one size does not fit all."
(U.S. Department of Health and Human Services, 2002)

HELP NEEDED!!!

Data Collection Tools for CAPP Evaluation of Implementation
• Fidelity Checklist: individualized; keeps track of what you did and your successes/challenges
• Attendance Record: who you reached, where, and dosage

Fidelity Checklist
Demographic Survey
Attendance Record

After you have completed an entire cycle of the EBP (i.e., ALL of the EBP sessions or modules):
1) Send all the completed evaluation tools (except the Brief Demographic Survey!) to the Center of Excellence.
2) Clip all completed documents together so that we can keep track of individual EBP cycles. This includes:
   • Fidelity Checklist (one per EBP cycle)
   • Attendance Record (one per EBP cycle, with all names removed)
3) Mail these documents to:
   Amy Breese
   Cornell University
   ACT for Youth Center of Excellence
   Beebe Hall
   Ithaca, NY 14853

Questions? Amanda Purington: ald17@cornell.edu or 607-255-186
Comments? Jane Powers: jlp5@cornell.edu or 607-255-3993