Third-Party Evaluation Studies as a Basis for Determining Program Effectiveness and Improvement Needs
Center for Research and Reform in Education, Johns Hopkins University
Steven M. Ross, Ph.D., Professor and Evaluation Director

The Johns Hopkins and EIA Partnership
JHU has an MOU with EIA to market our evaluation services to its members
Dr. David Andrews, Dean of the School of Education (SOE), is on the EIA Board of Directors
The SOE believes strongly that the future success of 21st-century education must include “nontraditional forces,” such as social entrepreneurs (TFA, NewSchools Venture Fund, and others) and the $4 billion education industry, as key components of American (and international) education
SOE is also building online courses on educational leadership, entrepreneurism, education organizations, and educational policy for rising executives in member companies
SOE and EIA are working to endow an institute on private-sector education at SOE
We want SOE to be the place where education companies go for research and development and instruction

The Center for Research and Reform in Education: Evaluation Services
Independent studies of program implementation, products, and outcomes
Literature reviews and research papers on selected topics
Best Evidence Encyclopedia (BEE)

Recent and Ongoing CRRE Evaluations
Parent Engagement and Partnership Program (EIA)
Middlebury Interactive Program (EIA)
JUMP Math in NYC
Middle School Matters
National Institute of School Leaders
Three principal preparation programs
Women’s Initiative Fellowship Program
High school reform in Minnesota
Social-emotional learning in Northern Ireland
The Leader in Me Program in two schools
Pre-K and K early literacy
English language learners in Texas

Types of Evaluation Studies

Simplest and Least Costly
◦ Case Study. Example: Examining a middle school’s use of a new computer program for supplementing math instruction
◦ Survey/Interview Study. Example: How 325 principals who participated in online leadership training react to the program and apply the skills taught
◦ Achievement Profile Study. Example: Descriptive analysis of posted state assessment scores for 25 schools before and after using a new after-school program in E/LA

Medium Rigor and Cost
◦ Mixed-Methods Control Group Study. Example: Program Schools A and B are compared on district science assessments to Control Schools C and D
◦ Quantitative Control Group Study. Example: Using statistical controls, comparisons are made on school-level AP chemistry scores between 26 program schools and 50 control schools
◦ Qualitative Control Group Study. Example: Through observations, interviews, and surveys, teaching methods and student engagement are compared at two schools receiving professional development in project-based learning and two control schools

Most Rigorous and Costly (often funded by federal grants)
◦ Mixed-Methods Randomized Comparison Study. Example: 10 schools randomly selected to use a new program are compared on student-level test scores and qualitative measures to 10 schools randomly selected to serve as control sites
◦ Mixed-Methods Matched Comparison Study. Example: 10 schools that elected to use a new program are compared on student-level test scores and qualitative measures to 10 matched schools serving as control sites

What Determines Rigor?
Multiple measures (triangulation)
Standardized (unbiased/objective) measures
Treatment-control group comparisons
Equivalent comparison groups

What Determines Cost?
Accessibility of data
Cooperativeness of participants
Travel

The Best Study for You: Major Considerations
What questions do you want to answer?
How quickly do you need the answers?
What resources are available to fund the study?
How accessible are participants and data?

Steven M. Ross
Evaluation Director, CRRE
sross19@jhu.edu
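
As a minimal illustration of the matched comparison design described in the study types above, the sketch below pairs each program school with the control-pool school whose prior achievement is closest, then compares mean outcomes across the pairs. All school names, scores, and the `match_and_compare` helper are invented for the example; a real evaluation would match on multiple covariates using standard statistical software.

```python
# Hypothetical sketch of a matched comparison study: greedy nearest-neighbor
# matching on a single prior-achievement covariate, without replacement,
# followed by a mean program-minus-control outcome difference.

def match_and_compare(program, pool, key="prior", outcome="score"):
    """Pair each program school with its closest unmatched control school."""
    available = list(pool)
    pairs = []
    for p in program:
        # Closest control school on the matching covariate.
        best = min(available, key=lambda c: abs(c[key] - p[key]))
        available.remove(best)  # match without replacement
        pairs.append((p, best))
    # Average outcome gap across the matched pairs.
    diff = sum(p[outcome] - c[outcome] for p, c in pairs) / len(pairs)
    return pairs, diff

# Invented school-level data for illustration only.
program_schools = [
    {"name": "A", "prior": 610, "score": 652},
    {"name": "B", "prior": 580, "score": 640},
]
control_pool = [
    {"name": "C", "prior": 612, "score": 645},
    {"name": "D", "prior": 579, "score": 628},
    {"name": "E", "prior": 540, "score": 600},
]

pairs, mean_diff = match_and_compare(program_schools, control_pool)
for p, c in pairs:
    print(f"{p['name']} matched with {c['name']}")
print(f"Mean program-minus-control difference: {mean_diff:.1f}")
```

Matching on prior achievement is what makes the comparison groups approximately equivalent, which is the rigor criterion listed above; without it, a simple difference in outcomes could just reflect which schools chose the program.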