Making Your Family Treatment Drug Court Evaluation Work
Beth L. Green, Ph.D., Vice President, NPC Research, green@npcresearch.com
Co-Principal Investigator, SAMHSA/CSAT National Family Treatment Drug Court Evaluation
www.npcresearch.com

What Is Evaluation?
The systematic collection and use of program-related information.
– Uses include: program improvement, accountability, program management, outcome evaluation, and program and policy development.
Evaluation should help to inform and improve programs as they develop, not focus only on whether the programs "worked" or "didn't work."

What Is Evaluation?
Good evaluation requires carefully thinking through:
– The questions that need to be answered,
– The type of program being evaluated, and
– How information will be generated and used.

What Do We Know About FTDCs?
Despite the huge increase in the number of family treatment drug courts (FTDCs), drug court evaluations have focused almost entirely on adult drug courts. Little is known about whether and how FTDCs work.
Preliminary results from NPC's study of FTDCs suggest they can be effective in:
– Reducing the length of time it takes parents to enter treatment
– Increasing the time parents spend in treatment
– Increasing the likelihood that parents complete treatment
– Increasing the number of families reunified
– Other child welfare (and other) outcomes are less clear
But results vary by site and program model, and results are preliminary.

Family Treatment Drug Courts Require Cross-System Evaluation
Cross-system evaluation means evaluating a project or program that is a collaboration of two or more agencies, departments, or systems.
It is not just evaluating a single agency or program, but how agencies work together.
Family treatment drug courts involve the court system, child welfare system, and treatment system, and sometimes other systems.

Challenges to Conducting Cross-System Evaluations
Multiple stakeholders. Multiple questions. Multiple information needs.
Balancing the multiples:
– Coming to consensus about evaluation questions
– Prioritizing which questions can be answered, when, and with what degree of certainty
– Keeping the scope of the evaluation within resource constraints (time, staffing, & money)
Evaluating the usually complex, cross-system collaborative process.

Five Steps for Successful Evaluation
1. Identify Stakeholders
2. Develop a Logic Model
3. Develop a Data Collection Plan
4. Collect & Manage Information
5. Analyze & Report Information

Step 1: Identify Key Stakeholders

Involving Stakeholders From Multiple Systems
Shared ownership is key to program and evaluation success: make sure everyone has at least one of his/her key issues addressed in the evaluation.
Include stakeholders who have authority to:
– Implement data collection plans,
– Grant access to data, and
– Make changes in the program based on information.
Decide who's in charge, who makes final decisions, and who's responsible for implementing and overseeing the evaluation.

Step 2: Develop a Logic Model

Elements of a Logic Model
The logic model lays out what the program is expected to achieve and how it is expected to work, based on an expected chain of events that links:
A. Program Activities – what are you doing & how often? (sometimes called "Inputs")
B. Target Population – who are you doing it for?
C. Theory of Change – why or how do you think activities lead to outcomes?
D. Short-Term Outcomes – what changes do you expect in participants or systems first?
E. Long-Term Outcomes – what changes do you expect in participants or systems later?
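As a concrete illustration, the short Python sketch below encodes a hypothetical FTDC logic model as a plain data structure, following the A–E elements and using the hearing-frequency example that appears later in the theory-of-change step. Every element value is an assumption chosen for illustration.

```python
# Hypothetical sketch of an FTDC logic model as a plain data structure.
# The keys follow the A-E framework above; all values are illustrative only.

ftdc_logic_model = {
    "program_activities": [
        "Weekly (rather than quarterly) drug court review hearings",
        "Coordinated case planning across court, child welfare, and treatment",
    ],
    "target_population": (
        "Parents with an open child welfare case and an identified substance "
        "abuse problem who meet the court's eligibility criteria"
    ),
    "theory_of_change": (
        "If hearings are more frequent, then relapse is detected sooner, "
        "treatment can be intensified more quickly, and parents are more "
        "likely to complete treatment"
    ),
    "short_term_outcomes": [
        "Faster entry into treatment",
        "More time spent in treatment",
    ],
    "long_term_outcomes": [
        "Higher treatment completion rates",
        "More families reunified",
    ],
}

# Writing the model down this explicitly makes it easy to check that every
# expected outcome is linked to at least one activity and a stated rationale.
```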
Benefits of a Logic Model
1. Develops shared understanding of the program across the systems
2. Helps to lay bare assumptions about how the program is expected to work and what outcomes are expected
3. Helps to restrain over-promising about what evaluation can do
4. Provides a framework for evaluation questions and data-gathering
5. Promotes communication between system stakeholders

When You Go Home, Remember:
Involve stakeholders from all relevant systems in developing the program's logic model. This helps to build buy-in and shared understanding.
For more information and tools for building logic models, go to: http://casat.unr.edu/westcapt. Select "Planning and Best Practices" and then "Step 7: Evaluation."

Building Your Logic Model: Step 2A: Write a Detailed Program Description
Program Activities: Describe the things your FTDC is doing that are different from "business as usual."
– For today, pick 1 or 2 things that your FTDC is doing differently.
Target Population: Identify your target population, including eligibility or exclusionary criteria – who decides who is in the program, and how do they get in?

Building Your Logic Model: Step 2B: State Expected Outcomes
Defining Long- and Short-Term Outcomes:
– Short-Term Outcomes: the immediate program effects that you expect to achieve during or soon after the program is completed
– Long-Term Outcomes: the long-term or ultimate effects of the program (3 months, 6 months, 1 year out)

Building Your Logic Model: Step 2B: State Expected Outcomes
Don't confuse outcomes with outputs:
– Outcomes refer to changes produced (in individuals, communities, or systems) by your program. Example: an outcome might be increasing family reunification rates.
– Outputs refer to the number of opportunities your program has to create these changes, in the form of clients served, activities implemented, etc. Example: an output might be the number of clients served each year.

Building Your Logic Model: Step 2B: State Expected Outcomes
There is no right number of outcomes. Programs have more influence over immediate outcomes and less influence over longer-term outcomes. Long-term outcomes, however, should be within the scope of the program's purpose and target audience.

Building Your Logic Model: Step 2C: State the "Theory of Change"
A statement explaining why activities should lead to outcomes. Use an if-then format:
– If we do this activity, then what changes happen that will lead to the short-term outcomes?
– If we increase the frequency of hearings, then we will know sooner if parents relapse.
– If we know sooner that parents relapse, then we will be able to increase the intensity of their treatment more quickly, and then parents will be more likely to successfully complete treatment.

Step 2D: Write Evaluation Questions Using Your Logic Model
Evaluation questions can:
Describe:
1. What's happening with services
2. What's happening with individuals
Compare:
1. Changes over time in individuals
2. Changes vs. "services as usual"
3. What things were like before and after the FTDC started
Today: Write one descriptive evaluation question that you think is important to address and one comparative evaluation question. Comparative questions require a reference point – better than X, improved compared to X, etc.
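To show what a descriptive and a comparative question look like once numbers are attached, here is a small hypothetical Python sketch. The question wording, group sizes, and counts are all invented for illustration; the comparison group supplies the reference point that a comparative question requires.

```python
# Hypothetical sketch: one descriptive and one comparative evaluation question,
# answered from toy counts. All figures are invented for illustration.

# Descriptive question: "What proportion of FTDC parents enter treatment
# within 30 days of the CPS report?"
ftdc_entered_30_days = 42
ftdc_parents_total = 60
ftdc_pct = 100 * ftdc_entered_30_days / ftdc_parents_total
print(f"FTDC parents entering treatment within 30 days: {ftdc_pct:.0f}%")

# Comparative question: "Is that proportion higher than under services as usual?"
# The comparison group provides the required reference point.
comparison_entered_30_days = 25
comparison_parents_total = 58
comparison_pct = 100 * comparison_entered_30_days / comparison_parents_total
print(f"Comparison parents entering treatment within 30 days: {comparison_pct:.0f}%")
print(f"Difference (FTDC minus comparison): {ftdc_pct - comparison_pct:.0f} percentage points")
```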
Prioritizing Evaluation Questions
Ultimately, you may want to generate a "long list" of possible evaluation questions that can be considered. But this list must be prioritized.
To prioritize, review the logic model and consider:
– What questions MUST you answer (to meet reporting or other requirements)?
– What questions would you LIKE to answer, and WHY (how will you use the information)?
– What changes or outcomes are realistic?
– What are your resources for gathering and compiling information?

Remember!
No evaluation can answer every question. Be thoughtful and narrow your questions to those that are most important to answer.
For today, pick one evaluation question to work with.

Step 3: Develop a Data Collection Plan

Step 3A: Identify Data to Gather
Review your research question(s):
– What information is needed to answer the questions?
– What information is already collected somewhere, and how can you get it?
– What new information will need to be collected?
YOUR PRIME DIRECTIVE: AVOID DUPLICATION OF PAPERWORK & DATA COLLECTION!

Possible Types of Data
1. Administrative data – computerized or written case files
   a. Court records
   b. CPS records
   c. Treatment data
2. Other sources of data
   a. Stakeholders – interviews, surveys, focus groups
   b. Parents – interviews, surveys, focus groups
   c. Service providers – can provide their own information and information about parents

Key Data Collection Decisions
Who will you collect information about? Which participants, and for how long?
Will you collect data on clients who "drop out" of the program?
– If not, you risk the accusation of "creaming."
– If so, be prepared to commit resources to try to find and locate parents who aren't participating in your program.

Key Data Collection Decisions, cont'd.
If evaluating outcomes, what is your evaluation design?
– Experimental control group design (random assignment)
– Quasi-experimental comparison group design
– Pre-post design
– Longitudinal designs
– Performance measurement design

Gaining Access to Data for Evaluation
HIPAA allows data to be shared for program evaluation purposes through MOAs: a Memorandum of Agreement signed by all systems, showing agreement to participate in the evaluation and willingness to share data. A "waiver of authorization" clause can sometimes be used.
Get participant releases for data-sharing anyway, if possible. This would include agreement to share court, child welfare, and treatment data for evaluation purposes.
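Once agreements and releases are in place, the practical step is linking records from the three systems into one analysis file. The sketch below is a hypothetical illustration using pandas; the file names, column names, and client-identifier scheme are assumptions for illustration, not a prescribed format.

```python
# Hypothetical sketch: linking court, child welfare, and treatment records
# on a shared client identifier once MOAs and releases allow data-sharing.
# File names and column names are invented for illustration.
import pandas as pd

court = pd.read_csv("court_records.csv")          # e.g., client_id, ftdc_acceptance_date
child_welfare = pd.read_csv("cps_records.csv")    # e.g., client_id, cps_report_date, placement_date
treatment = pd.read_csv("treatment_records.csv")  # e.g., client_id, tx_entry_date, tx_completed

# Build one analysis row per client; keep everyone in the court file even if
# a matching record is missing in another system, so dropouts are not lost.
analysis = (
    court.merge(child_welfare, on="client_id", how="left")
         .merge(treatment, on="client_id", how="left")
)

print(analysis.head())
```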
Step 3B: Use Your Logic Model to Develop a Data Collection Plan
Today, identify possible data sources to answer your evaluation question.
When you go home, make a WRITTEN plan for:
– What information to collect/compile
– How and when data will be collected and compiled
– Ensuring data-sharing agreements and/or proper signed releases are in place
– Where centralized data will be stored
– Who will take responsibility for ensuring data are recorded, compiled, and reported

Step 4: Collect & Manage Information

Managing Data Collection Across Systems
Someone (an individual person or agency) must be responsible for each agency, and someone must be responsible across agencies.
If line staff are collecting additional data, you will need supervisor support to make sure this happens.
It is important to build in "checks" along the way to make sure needed data are being collected:
– Quarterly reports of key information
– Data review and quality control
– Nothing is more true than the old maxim "garbage in, garbage out."

A Simple Management Information System (MIS)
Client-level information will ultimately be most useful.
A simple Excel spreadsheet can be your "Management Information System" (MIS).
Keep the key information needed to answer your research questions.
Develop a system to ensure information "flows" into (and out of) the MIS.

Suggested (Minimal) Data for an FTDC MIS
Data elements should link to the evaluation plan:
Client identifiers that will allow tracking into the child welfare and treatment systems
Key dates:
– CPS report
– Jurisdictional/dispositional hearings
– Drug court acceptance
– Drug court hearings
– Treatment entry & completion
– Drug court drop-out or graduation
– Permanent placement date
– Date(s) of child placements
Key (limited) client or family issues (drug of choice, particular service plan issues, child issues, key demographics, etc.)
Key outcome variables:
– Final placement decision
– Treatment completion status
– Length of time spent in treatment
– Length of time child spent in foster care
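As a hypothetical illustration of how the key dates above become the suggested outcome variables, this Python sketch computes time to treatment entry, time in treatment, and a completion rate from a few invented MIS rows. The column names and values are assumptions for illustration, not a required spreadsheet format.

```python
# Hypothetical sketch: deriving outcome variables from key dates in a
# minimal client-level MIS. Column names and all dates are invented.
import pandas as pd

mis = pd.DataFrame(
    {
        "client_id": ["A01", "A02", "A03"],
        "cps_report_date": ["2005-01-10", "2005-02-03", "2005-03-15"],
        "tx_entry_date": ["2005-02-01", "2005-02-20", None],  # None = never entered
        "tx_exit_date": ["2005-07-15", "2005-04-01", None],
        "tx_completed": [True, False, False],
    }
)
for col in ["cps_report_date", "tx_entry_date", "tx_exit_date"]:
    mis[col] = pd.to_datetime(mis[col])

# Outcome variables suggested above
mis["days_to_tx_entry"] = (mis["tx_entry_date"] - mis["cps_report_date"]).dt.days
mis["days_in_tx"] = (mis["tx_exit_date"] - mis["tx_entry_date"]).dt.days

print(mis[["client_id", "days_to_tx_entry", "days_in_tx", "tx_completed"]])
print("Average days to treatment entry:", mis["days_to_tx_entry"].mean())
print("Treatment completion rate:", 100 * mis["tx_completed"].mean(), "%")
```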
Step 5: Analyze & Report Information

Involve Your Stakeholders
Be sure the data analysis is answering your research questions.
Be sure stakeholders from all systems review the data to provide interpretations.

Know Your Audience
Define your audience: Who will see the information? What do they care about?
Keep it simple: presented data should be readable and understandable:
– Percentages & percent changes
– Number of events
– Averages
For today: how would you analyze and present the data to answer your evaluation questions?

Remember!
Good evaluation does not have to be complicated. Distinguish what you need to know from what would be nice to know.
Good cross-system evaluation will require time and energy to plan and implement.
Good cross-system evaluation, like cross-system programs, requires higher levels of collaboration, stakeholder participation, and coordination than typical evaluations.

For more information about the FTDC National Evaluation, contact:
Beth L. Green, Ph.D., Principal Investigator, green@npcresearch.com
Sonia Worcel, M.A., M.P.P., Project Director, worcel@npcresearch.com
For evaluation tools and "step by step" guides, go to: http://casat.unr.edu/westcapt. Select "Planning and Best Practices" and then "Step 7: Evaluation."