Research and Evaluation: Grantee Panel
2014 AmeriCorps State and National Symposium

Objectives
• Share examples of how programs are building evidence at various stages of the evidence continuum
• Highlight diverse grantees and subgrantees
  – Small and large
  – Focus area representation
  – Multi-site/multi-issue programs
  – Rural

Building Evidence of Effectiveness (a continuum from Evidence Informed to Evidence Based)
1. Identify a strong program design
   - Gather evidence supporting the intervention
   - Design/adopt a strong program
   - Develop a Logic Model
   - Create Implementation Materials
2. Ensure effective implementation [Performance Measures – Outputs]
   - Pilot implementation
   - Document program process(es)
   - Ensure fidelity in implementation
   - Evaluate the program's quality and efficiency
   - Establish continuous process improvement protocols
3. Assess the program's outcomes [Performance Measures – Outcomes]
   - Develop indicators for measuring outcomes
   - Conduct pre-/post-intervention evaluation to measure outcomes
   - Conduct process evaluation
4. Obtain evidence of positive program outcomes
   - Examine the linkage between program activities and outcomes
   - Perform multiple pre- and post-evaluations (time series design)
   - Conduct independent (unbiased) outcome evaluation(s)
   - Conduct meta-analysis of various studies
5. Attain strong evidence of positive program outcomes
   - Establish a causal linkage between program activities and intended outcomes/impact (e.g., conduct a quasi-experimental evaluation using a comparison group, an evaluation with random assignment (RCT), regression analysis, or another appropriate study design)
   - Conduct multiple independent evaluations using strong study designs
   - Measure cost effectiveness compared to other interventions addressing the same need

Panelists
• Fa’iz Rab, AmeriCorps Program Manager, Lutheran Family Services
• Julie McClure, Director, Napa County Office of Education Community Programs
• Amy Foss, Vice President, Conservation Legacy
• Karen Domerski, General Counsel, Jumpstart

Nebraska Economic Opportunity Corps (AmeriCorps State and National)
Support | Educate | Empower
The Nebraska Economic Opportunity Corps (NEOC) is an AmeriCorps State program of Lutheran Family Services of Nebraska, Inc. Our mission is to improve the economic opportunities of at-risk youth, new Americans, and veterans through education and employment programs that empower individuals towards self-sufficiency.

NEOC Performance Measures: Self-Sufficiency Matrix
• Outcomes are measured through an adapted version of the Snohomish County Matrix developed by the Snohomish County Self-Sufficiency Taskforce in 2004.*
  – The assessed domains are tied to factors impacting improved employment and education prospects.
  – Beneficiaries are scored on a five-level scale from “in-crisis” to “thriving.”
*The taskforce was a collaborative partnership of the county’s human services community, including the Snohomish County Community Action Division of the Human Services Department, United Way of Snohomish County, and community partners.

NEOC Self-Sufficiency Matrix

Benefits of the Matrix
• Flexibility – Can be customized for each host site’s intervention method.
• Collaboration – The design encourages a collaborative effort among providers in defining improvement.
• Case Management – Can be used for both long-term and short-term interventions.
• Self-Assessment – Keeps the member focused on the goal of self-sufficiency.
• Management – Assists in determining what is or is not working.
• Communication – Helps communicate the provision of service.
• Measurement – Measures beneficiary progress (see the illustrative scoring sketch below).
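The slides describe the five-level matrix but not how domain scores are rolled up into reported outcomes. As a rough illustration only, the Python sketch below shows one way pre-/post-intervention assessments on such a matrix could be tallied and compared; the domain names and the simple averaging rule are assumptions for the example, not details of the NEOC adaptation.

```python
# Illustrative sketch only: domain names and the averaging rule are hypothetical,
# not taken from the NEOC adaptation of the Snohomish County Matrix.

# Five-level scale, from 1 ("in-crisis") to 5 ("thriving")
LEVELS = range(1, 6)

# Hypothetical assessed domains tied to employment and education prospects
DOMAINS = ["employment", "education", "income", "housing", "transportation"]

def overall_score(assessment: dict) -> float:
    """Average the 1-5 level recorded for each domain in one assessment."""
    for domain in DOMAINS:
        if assessment.get(domain) not in LEVELS:
            raise ValueError(f"{domain} must be scored 1-5")
    return sum(assessment[d] for d in DOMAINS) / len(DOMAINS)

def domain_changes(pre: dict, post: dict) -> dict:
    """Per-domain change between a pre- and a post-intervention assessment."""
    return {d: post[d] - pre[d] for d in DOMAINS}

# Example beneficiary: intake vs. follow-up assessment (made-up numbers)
pre = {"employment": 1, "education": 2, "income": 1, "housing": 3, "transportation": 2}
post = {"employment": 3, "education": 2, "income": 2, "housing": 3, "transportation": 3}

print(overall_score(pre), overall_score(post))  # 1.8 2.6
print(domain_changes(pre, post))                # employment +2, income +1, transportation +1
```

A per-domain change report like this is one way to support the "counting assessments" and "basic math" reporting challenges described in the next slides.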
Challenges
• Lack of understanding of how to complete the Matrix
• Technology failures
• Collecting reports
• Basic math issues
• Counting assessments

Overcoming Challenges
• More training and communication of requirements
• Researched and experimented with different reporting methods, which led to using Google for Nonprofits

Google for Nonprofits
• Free service: Nonprofits qualify for business-level service at no cost.
• Improved communication: Multiple tools, including email, chat, and Google Hangouts.
• Less paper: Assessment forms are included on a separate tab for easy access.
• Security: Encryption of mobile devices using our account.
• Reduces reporting obstacles: Members may complete assessments through their phones.

Example Google Sheet

Going Forward
• Improve methods to capture outcomes.
  – Work out any issues that arise from this year.
• Create a process and assessment of beneficiaries’ perspectives on the services provided.
  – Potentially using online survey methods or randomized follow-up calls from program staff.

CalSERVES: AmeriCorps Programs of Napa County Office of Education

CalSERVES AmeriCorps Programs
• After School: Local Direct Service, 34 FT/60 HT Members
• Collective Impacts: Local Direct Service, 20 FT/36 HT Members
• PREP: State-wide Capacity Building, 40 FT Members
• VIP: State-wide Capacity Building, 100 FT Members

What We Did
• Develop Evaluation Goals and Questions
• Employ an Evaluation Firm that Understands AMC
• Determine the Method of Evaluation
• Determine Strategy for Comparison Group
• Conduct Analysis
• Report Findings
• Discuss Program Changes and Next Steps for Research

Evaluation Questions
• Does participation in the CalSERVES VIP program increase organizational capacity to utilize volunteers compared to organizations that did not take part in the program?
• Does participation in the CalSERVES VIP program improve organizational capacity concerning volunteer recruitment, training, and retention compared to organizations that did not take part in the program?
• Does participation in the CalSERVES VIP program improve organizational capacity to create and sustain successful volunteer programs compared to organizations that did not take part in the program?

VCA Instrument
• To measure the effect of program participation on organizational and volunteer capacity over time, data were collected using the Volunteer Capacity Assessment (VCA), a survey that includes three sections with items focused on organizational capacity, volunteer recruitment, and the elements of a successful volunteer program.
• The VCA is designed to assess how well a non-profit or educational organization is prepared to recruit, train, and utilize volunteers to achieve its mission and goals. The instrument asks organizations to report how well their practices align with the best practices listed on the survey.

Propensity Score Matching
List of Characteristics Used in Propensity Score Matching
(A generic sketch of the matching logic appears after the conclusion below.)

What the Results Showed
Average Pre/Mid VCA Scores and Average Post VCA Scores, Matched Comparison

Conclusion
“Overall, the partner sites that participated in the CalSERVES VIP program reported strong, positive changes over time on the items included in the VCA instrument. Organizations that did not participate in the program reported much smaller (and sometimes negative) changes on the VCA items. Because the matching procedure diminishes the likelihood that factors other than participation in the program influenced the change in reported scores over time, the evidence strongly suggests that the CalSERVES VIP program positively impacts organizations’ capacity concerning volunteer programs.”

Challenges
• Finding the right evaluation firm
• Willingness of comparison sites to participate
• Resources to fund the evaluation

Successes
• Kept costs reasonable using a comparison group design.
• Able to develop and implement a comparison group design that allowed for some real understanding of impact.
• Able to use the data to inform the program and determine next steps.

Next on Our Research Agenda
• Examine the types of organizations (both in size and scope of work) that benefit the most from the CalSERVES VIP program.
• Explore the specific types of services that are most beneficial to these organizations.

Conservation Legacy: A Progressive and Phased Approach
Amy Foss

Overview – Our Projects and Past Evaluations
What problems do our land management agencies face?
• Multiple uses that impact the land; how to maintain the natural resources that communities and humans use; preservation; habitat and species protection

Our Projects/Intervention
• Trail improvement, forest fuels reduction, habitat restoration, invasive species removal, surveys, and research

Our Evaluation History
• Measurement of member outcomes via the Public Lands Service Coalition Study
• Project-specific measurement of outcomes via agency experts
• Challenges/what we are pursuing: delving deeper into these questions and researching additional ways to make these measurements more evidence based

Challenges We Face
– Outcomes are so varied and broad that it is not possible to measure everything we do.
– We don’t have a singular activity with a singular output.
– Outcomes are not immediate; they can take years to materialize.
– We don’t know where we are going to end up!

Phased Approach to New Evaluation

Three Phases
• Phased approach – three phases over multiple years
• An independent research team working in partnership with us throughout the three phases
  – The research team consists of professors and graduate students at NC State and BYU.

NC State and BYU
– NC State is the lead.
– The researchers are also working on similar studies, including one focused on the community impacts of Rails to Trails.

Phase 1: Literature Review
Two goals of the literature review:
– First, it was designed to provide a summary of the scientific evidence about the impacts of conservation activities similar to those conducted by conservation corps.
– Second, it was designed to identify potential strategies and indicators to measure impacts in future research and evaluation with conservation corps.

Phase 2
• Find a few things that conservation corps do from the literature review, then use similar measurement tools (as were used in the literature review) and apply them to our activities to determine possible impacts.
• This will be a trial of different measurement approaches to determine the best possible connections, impacts, and outcomes that are reliable.
• Identify how to measure multi-year impacts or changes in condition in partnership with land management agencies.
• Determine what is feasible and cost effective without being overly burdensome.

Phase 3
• Expand upon Phase 2 and incorporate additional conservation corps in different areas across the country.
• Conduct a final assessment of the results.
• Figure out the final implementation costs and determine what is realistic.
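The propensity score matching and matched pre/post VCA comparison referenced above are only named on the slides, not shown. As a generic sketch of the technique (in Python, with made-up organization characteristics, participation flags, and VCA scores, not the evaluation's actual data or code), nearest-neighbor matching on an estimated propensity score followed by a comparison of pre-to-post gains might look like this:

```python
# Illustrative sketch of propensity score matching; all data below are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200

# Made-up organization characteristics (stand-ins for the matching characteristics
# listed on the slide, e.g. size, budget, existing volunteer base)
X = rng.normal(size=(n, 3))
# Participation (treatment) is more likely for some kinds of organizations
treated = rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))
pre = 50 + 5 * X[:, 0] + rng.normal(0, 3, n)        # pre/mid VCA score (simulated)
post = pre + 2 + 8 * treated + rng.normal(0, 3, n)  # post VCA score; simulated program effect = 8

# 1. Estimate each organization's propensity to participate from its characteristics.
propensity = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Match each participating organization to the non-participant with the closest propensity.
t_idx = np.where(treated)[0]
c_idx = np.where(~treated)[0]
matches = c_idx[np.abs(propensity[c_idx][None, :] - propensity[t_idx][:, None]).argmin(axis=1)]

# 3. Compare the average pre-to-post change for participants vs. their matched comparisons.
gain_treated = (post - pre)[t_idx].mean()
gain_matched = (post - pre)[matches].mean()
print(f"avg gain, participants: {gain_treated:.1f}")
print(f"avg gain, matched comparison: {gain_matched:.1f}")
print(f"estimated program effect: {gain_treated - gain_matched:.1f}")
```

The point of the matching step, as the conclusion above notes, is to reduce the chance that pre-existing differences between organizations, rather than program participation, explain the change in reported scores.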
Our Goal
• A multi-year approach to finding the best possible measurement tools, using experts in the field and researchers to help us.
• It will never be possible to have a fully experimental evaluation.
• We don’t know exactly where we are going to end up!

Contact Info
Amy Foss
Conservation Legacy
amy@conservationlegacy.org
970-749-1151

Jumpstart Learning Collaborative
• Jumpstart provides small grants ranging between US $7,500 and $10,000 for promising research that addresses significant questions about the processes and impact of Jumpstart.
• Provides researchers with the opportunity to investigate the influence of Jumpstart (1) on the lives of young children who live in low-income and high-stress communities, and (2) on the adult volunteers who implement the Jumpstart program.
• Up to three grants will be awarded each year.
• Funded studies may be carried out using any research method or approach as long as the focus of the project is on examining the Jumpstart program. Priority will be given to applications that use rigorous research designs and methodologies.

Research Questions
1. In what ways does Jumpstart contribute to children’s kindergarten readiness, specifically social-emotional and executive functioning skills?
2. In what ways does Jumpstart support the development of children’s phonological awareness, oral language, and/or books and print knowledge?
3. In what ways does Jumpstart’s implementation fidelity (e.g., fidelity to curriculum, Corps member training) affect (1) Jumpstart children’s kindergarten readiness and (2) Corps members’ learning, knowledge, and overall Jumpstart experience?
4. In what ways does Jumpstart’s Corps member training affect children’s kindergarten readiness?
5. Does the Jumpstart program impact children from different demographic backgrounds in different ways (e.g., dual language learners or entering language levels)?
6. What impact does Jumpstart have on our Corps members?

Examples of Funded Studies

A Proposal to Study the Impact of Participating in Jumpstart on Corps Members
• Examines the impact that participation in Jumpstart has on Corps members’ civic engagement, knowledge of child development, work readiness, and teamwork and collaboration skills, when compared to a similar group of students who are not involved in Jumpstart.

Proposed Evaluation Study of Full-Class Delivery of Jumpstart in Chicago Sites
• A quasi-experimental evaluation: researchers will compare changes over time in a large sample of preschoolers receiving the full-class delivery model to demographically matched children attending similar preschool programs but not receiving Jumpstart.

Assessment of Language, Literacy, and Social-Emotional Development Among Children Attending Preschools Served by Jumpstart
• This study will pilot a comprehensive assessment battery, including direct measures of the cognitive and social skills targeted by Jumpstart.

A Multi-system Approach to Examining Effects of Jumpstart on Children’s Stress Response
• The purpose of this project is to explore the effects of Jumpstart involvement on children’s stress physiology. The researchers hypothesized that enrollment in Jumpstart would decrease children’s levels of cortisol and alpha-amylase across the preschool day.
Jumpstart Research Consortium
Bi-annual Convening of Researchers

Goals of the Research Consortium
• To leverage learning from external research studies for program improvement
• To expand our knowledge on establishing and maintaining university research partnerships
• To determine avenues for future research through the Jumpstart Learning Collaborative

What questions do you have?