Catching Students Early: The Importance of Early Warning Systems in Supporting Students At Risk of Dropping Out
Amy Peterson
Rural Dropout Prevention Project
American Institutes for Research
North Carolina Department of Public Instruction

Agenda
• What is an EWS, and why is it important?
• What does an EWS look like in NC?
• How can we think about implementing an EWS in our schools and districts?

What is an EWS and why is it important?

IES Practice Guide on Dropout Prevention: Recommendations
Diagnostic
• Use data systems that identify individual students at high risk of dropping out.
Targeted Interventions
• Assign adult advocates to at-risk students.
• Provide academic support and enrichment.
• Implement programs to improve students’ behavior.
Schoolwide Interventions
• Personalize the learning environment.
• Provide rigorous and relevant instruction.
Source: Dynarski et al., 2008

How do we know if someone might drop out? Early Warning Systems (EWS) can give us a good idea.
An EWS:
• Uses readily available data
• Relies on indicators that predict students’ likelihood of meeting specific academic goals or outcomes (e.g., high school graduation, college readiness)
• Is a tool for educators to target student interventions in a timely way
• Provides information to educators so that adults can take action to support at-risk students

Early Warning Indicators
Commonly used early warning indicators include the following:
– Attendance
– Behavior or disciplinary incidents
– Course performance, especially the number of Fs
– Grade point average (GPA)

How Do We Know These Are Important?
• Research from several U.S. school districts (Chicago, Baltimore, Philadelphia) provides a strong foundation for defining early warning signs that students might drop out
– Consortium on Chicago School Research: http://ccsr.uchicago.edu/publications/on-trackgraduation
– Everyone Graduates Center: http://www.every1graduates.org/
• Local adaptation is key

High School Graduation Outcome: Freshman Absences
[Chart: Graduation rates, by freshman absences — percentage who graduated in four years plotted against days absent per semester (0–4 through 40+).]
Source: Allensworth and Easton, 2007

High School Graduation Outcome: Freshman Course Failure
[Chart: Graduation rates, by freshman course failures — percentage who graduated in four years plotted against semester course failures (0 through more than 8).]
Source: Allensworth and Easton, 2007

High School Graduation Outcome: Freshman GPA
[Chart: Graduation rates, by freshman GPA — percentage who graduated in four years plotted against freshman GPA (0.0 through 3.5+).]
Source: Allensworth and Easton, 2007

High School Graduation Outcome: Chicago’s On-Track Indicator
Number of semester core course failures by credits accumulated freshman year:
• 2 or more core course failures: Off-Track, whether the student earned fewer than 5 credits or 5 or more credits
• 0 or 1 core course failures: Off-Track with fewer than 5 credits; On-Track with 5 or more credits
Students are on-track if they:
1. Have not failed more than one semester-long core course, AND
2. Have accumulated enough credits for promotion to the 10th grade

High School Graduation Outcome: Freshman On-Track Status
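To make the on-track rule concrete, here is a minimal sketch in Python of how it could be computed from two values per student: the number of semester-long core-course failures and the credits earned in the freshman year. The function name, inputs, and the 5-credit promotion threshold (Chicago’s) are illustrative assumptions, not an actual NC or PowerSchool implementation.

```python
def is_on_track(core_course_failures: int, credits_earned: float,
                credits_for_promotion: float = 5.0) -> bool:
    """Chicago-style freshman on-track check.

    A student is on-track if they have failed no more than one
    semester-long core course AND have accumulated enough credits for
    promotion to 10th grade (5.0 is Chicago's threshold; substitute the
    local promotion requirement).
    """
    return core_course_failures <= 1 and credits_earned >= credits_for_promotion


# One core failure but enough credits: still on-track
print(is_on_track(core_course_failures=1, credits_earned=5.5))  # True
# Two core failures: off-track regardless of credits
print(is_on_track(core_course_failures=2, credits_earned=6.0))  # False
```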
High School Graduation Outcome: Middle Grade Indicators
Sixth-grade students demonstrating at least one “flag” had only a 10 percent to 20 percent likelihood of graduating from high school in five years.
• Engagement: 80 percent or lower attendance rate
• Course performance: failing mathematics or English
• Behavior: unsatisfactory behavior grade
Source: Balfanz, 2009

High School Graduation Outcome: EWS Indicators and Dropout Thresholds for Middle Grades and High School
• Incoming indicator — Middle grades and high school: previous-year EWS tool exit indicator or locally validated indicators of risk
• Attendance — Middle grades: missed 20 percent or more of instructional time; High school: missed 10 percent or more of instructional time
• Course performance — Middle grades: failure in an English language arts (ELA) or mathematics course; High school: failure in one or more courses, or a GPA of 2.0 or lower (on a 4-point scale)
• Behavior — Middle grades and high school: locally validated thresholds
• End-of-year indicator — Middle grades and high school: EWS exit indicator or locally validated indicators of risk
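As one way to see how these thresholds operate, the sketch below checks a student against the high school thresholds listed above (attendance, course failure, and GPA). The record fields and the attendance calculation are hypothetical, and behavior is omitted because its thresholds are locally validated; this is an illustration, not the PowerSchool risk-report logic.

```python
def high_school_ews_flags(days_enrolled: int, days_absent: int,
                          courses_failed: int, gpa: float) -> list[str]:
    """Return the high school EWS indicators a student trips, using the
    thresholds listed above. Flags are symptoms to be reviewed by the
    EWS team, not root causes."""
    flags = []
    if days_enrolled > 0 and days_absent / days_enrolled >= 0.10:
        flags.append("Attendance: missed 10% or more of instructional time")
    if courses_failed >= 1:
        flags.append("Course performance: failed one or more courses")
    if gpa <= 2.0:
        flags.append("Course performance: GPA of 2.0 or lower")
    return flags


# Hypothetical student: 11 absences in 90 enrolled days, no failures, 2.8 GPA
print(high_school_ews_flags(days_enrolled=90, days_absent=11,
                            courses_failed=0, gpa=2.8))
# ['Attendance: missed 10% or more of instructional time']
```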
What does this look like in North Carolina?

NC EWS System
• Did you know about the risk reports in PowerSchool?
• How familiar are others in your school with the available risk reports? Are they supported by administration?
• Do you use a different method for EWS?

Challenges
• What challenges have you had with using the PowerSchool risk reports?
• Are there specific barriers?
– e.g., inability to include number grades

How can we implement this in our schools and districts?

How Can We Use EWS Data to Improve Outcomes? The Early Warning Intervention and Monitoring System (EWIMS) Seven-Step Implementation Process
Step 1: Establish roles and responsibilities
Step 2: Use the EWS tool
Step 3: Review the EWS data
Step 4: Interpret the EWS data
Step 5: Assign and provide interventions
Step 6: Monitor students
Step 7: Evaluate and refine the EWIMS process

Steps 1 & 2: Setting Up the Structure for Using the EWS
• Getting the right people together
– Ensuring they understand the indicators
– Ensuring they can make decisions
– Getting people comfortable with using the tool and accessing the data and reports
• Developing processes
– How frequently will you review data?
– What are different team members’ roles?
– How will information be communicated within and outside of the team?

Step 3: Review the EWS Data
• Review and monitor EWS indicators to:
– Identify students at risk
– Understand patterns in student engagement and academic performance
• Questions to ask about EWS data:
– Student-level patterns: What do your data tell you about individual students who are at risk?
– School-level patterns: What do your data tell you about how the school is doing?

Example 1: Student-Level Report

Exploring Data to Identify Students or Groups of Students of Concern
Review the big-picture EWS data:
• What do indicator-level data tell us?
• What do the EWS data say across levels?
– the school
– groups of students
– individual students
• What patterns begin to emerge within your data?

Symptom Versus Cause

Step 4: Interpret the EWS Data
• The EWS team must look beyond the indicators:
– Indicators are observable symptoms, not root causes
– Examine additional data from sources beyond the EWS indicators to determine root causes
• Looking at data beyond EWS indicators can:
– Help identify individual and common needs among groups of students
– Raise new questions and increase understanding of why students fall off-track for graduation

Activity: Interpret the Data
• Consider the example data from Donald:
– What are some other data sources you might consider?

Potential Underlying Factors
1. Individual-level factors
a. Academic
b. Physical
c. Social
d. Emotional/Psychological
2. Classroom- and/or school-level factors
3. Family-level factors
4. Other outside factors
a. Pragmatic/logistical issues
b. Outside of school

Identify Additional Data to Determine the Underlying Cause of Risk
• What data are you currently collecting that can supplement EWS data to determine the underlying cause of risk?
• What additional information could you collect to better understand the underlying causes of risk?
• Are there gaps in the data you have available?
• How will additional data be collected?

Avoiding Too Much Information (TMI)
• Are some data sources looking at the same thing?
• Are some data sources likely to yield more or better information than others?
• Should some data sources be collected and reviewed before others?
• What are the most valuable or efficient sources we can use?

What Is a Root Cause?
• From the Savannah River Project (a nuclear power station): a root cause is “the most basic cause that can reasonably be identified, that we have control to fix, and for which effective recommendations for prevention can be implemented.”
• From Total Quality Schools by Joseph C. Fields: a root cause is “the most basic reason the problem occurs.”
• From the Clark County School District: “Root cause—the deepest underlying cause, or causes, of positive or negative symptoms within any process that, if dissolved, would result in elimination, or substantial reduction, of the symptom.” (Clark County School District, 2012; Preuss, 2003)

Because … Why?
[Diagram: root cause analysis chain — starting from the primary concern, repeatedly ask “Why?” and answer “Because …,” citing data sources at each step, until a probable root cause is reached.]
(Adapted from Clark County School District, 2012)

Confirm the Likely Root Cause
• What have you learned from this new data or evidence?
• What do you now believe is the likely cause(s) of risk?
• What do the student(s) need (define the problem to be solved)?

Step 5: Assign and Provide Interventions
• Match individual students to specific interventions after gathering information about:
– Potential root causes for individual students who are flagged as at risk
– The available academic and behavioral supports and dropout prevention programs in the school, district, and community
• A tiered approach can be used to match students to interventions based on their individual needs.

Tiered Approach to Dropout Prevention
• Tier 1: Universal
• Tier 2: Targeted
• Tier 3: Individualized
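Before turning to the example interventions that follow, here is a deliberately simple triage sketch showing how a team might use the count of EWS flags as a starting point for thinking about tier placement. The rule is purely illustrative and is not part of the EWIMS guide; in practice the team assigns interventions based on root causes and available supports, not on flag counts alone.

```python
def suggested_starting_tier(num_ews_flags: int) -> str:
    """Illustrative triage only; the EWS team makes the actual decision.

    Tier 1 (universal) supports reach every student; students with flags
    are candidates for Tier 2 (targeted) or Tier 3 (individualized) supports.
    """
    if num_ews_flags == 0:
        return "Tier 1: universal supports only"
    if num_ews_flags == 1:
        return "Tier 2: consider targeted supports"
    return "Tier 3: consider individualized, intensive supports"


for flags in (0, 1, 3):
    print(flags, "flag(s) ->", suggested_starting_tier(flags))
```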
Example of Interventions Aligned with EWS Indicators (the ABCs)
Universal (all students)
• Attendance: Every absence brings a response. Create a culture that says attending every day matters. Positive social incentives for good attendance. Data tracking by teacher teams.
• Behavior: Teach, model, and expect good behavior. Positive social incentives and recognition for good behavior. Advisory. Data tracking by teacher teams.
• Course failures: Research-based instructional programs. In-classroom support to enable active and engaging pedagogies. Data tracking by teacher teams.
Targeted (15 to 20 percent of students)
• Attendance: Two or more unexcused absences in a month brings a brief daily check by an adult. Attendance team (teacher, counselor, administrator, parent) investigates and problem solves (why isn’t the student attending?).
• Behavior: Two or more office referrals brings involvement of the behavior team. Simple behavior checklist that students bring from class to class, checked each day by an adult. Mentor assigned.
• Course failures: Elective extra-help courses, tightly linked to the core curriculum; preview upcoming lessons and fill in knowledge gaps. Targeted, reduced class size for students whose failure is rooted in social–emotional issues.
Intensive (5 to 10 percent of students)
• Attendance: Sustained one-on-one attention and problem solving. Appropriate social service and community supports.
• Behavior: In-depth behavioral assessment (why is the student misbehaving?). Behavior contracts with family involvement. Appropriate social service or community supports.
• Course failures: One-on-one tutoring.
Source: Mac Iver & Mac Iver, 2009, p. 23

Dropout Prevention Intervention Mapping Inventory
Source: Therriault et al., 2013, Appendix B

Finding & Evaluating Evidence on Intervention Programs: Where?
• What Works Clearinghouse
• Collaborative for Academic, Social, & Emotional Learning
• National Dropout Prevention Center
• Best Evidence Encyclopedia

Finding & Evaluating Evidence on Strategies: Where?
• Doing What Works
• National High School Center
• Center on Instruction
• National Center on Intensive Intervention

Step 6: Monitor Students and Interventions
• Identify students whose needs are not being met and/or students who may no longer be struggling
– Are they reaching short- and long-term goals?
– Are they making progress at an acceptable rate?
• Identify whether the instruction/intervention needs to be adjusted or changed
• Determine whether some interventions are more effective than others
• Communicate with appropriate stakeholders and solicit their involvement

Measuring Progress at the Secondary Level
• Will vary depending on the areas you target with interventions (e.g., attendance, behavior, academics)
• Frequency of data collection may vary across tiers and data sources

Examples of Progress Monitoring at the Secondary Level
• Ongoing formal and informal formative assessment
– Benchmark assessments
– Quizzes, end-of-unit tests
– Common writing prompts
– Grades
– Attendance
– Teacher-developed curriculum-based measures (e.g., algebra)
– Maze passages
– Time-sampling for behavior
– Office referrals
• More information on academic tools: The High School Tiered Interventions Initiative: Progress Monitoring, http://www.rti4success.org/video/high-school-tiered-interventions-initiative-progress-monitoring

Students Monitoring Their Own Progress
Source: The Doing What Works Library. (2010). Doing what works: Taking ownership. Washington, DC: U.S. Department of Education. Retrieved from http://dwwlibrary.wested.org/media/taking-ownership
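For an attendance-focused intervention, progress monitoring can be as simple as recomputing the attendance rate over a rolling window of school days and comparing it with the goal the team set. The 20-day window and 90 percent goal in the sketch below are illustrative assumptions, not prescribed values.

```python
def attendance_rate(present_by_day: list[bool], window: int = 20) -> float:
    """Share of the most recent `window` school days the student attended."""
    recent = present_by_day[-window:]
    return sum(recent) / len(recent) if recent else 0.0


def meeting_attendance_goal(present_by_day: list[bool], goal: float = 0.90,
                            window: int = 20) -> bool:
    """True if recent attendance meets the goal set when assigning the intervention."""
    return attendance_rate(present_by_day, window) >= goal


# Hypothetical history: absent 3 of the last 20 days -> 85%, below a 90% goal
history = [True] * 17 + [False] * 3
print(round(attendance_rate(history), 2), meeting_attendance_goal(history))
# 0.85 False
```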
Step 7: Evaluate and Refine the EWIMS Process
• Continuous improvement cycle
• Refine the EWIMS implementation process:
– During the school year
– At the end of a school year
• Identify short- and long-term needs and solutions:
– Student needs
– School climate
– Organizational needs (school and/or district)

Questions

Additional Resources
• North Carolina DPI Dropout Prevention and Intervention: http://www.ncpublicschools.org/dropout/
• North Carolina DPI MTSS Wiki: http://mtss.ncdpi.wikispaces.net/
• Center on Response to Intervention at AIR: http://www.rti4success.org/
– Tiered Interventions in High Schools: Using Preliminary ‘Lessons Learned’ to Guide Ongoing Discussion: http://www.rti4success.org/resource/tiered-interventions-highschools-using-preliminary-lessons-learned-guide-ongoing

Additional Resources
• National Center on Intensive Intervention: http://www.intensiveintervention.org/
• Early Warning Systems in Education: www.earlywarningsystems.org
• What Works Clearinghouse: Dropout Prevention Practice Guide: http://ies.ed.gov/ncee/wwc/PracticeGuide.aspx?sid=9

References
• Allensworth, E. M., & Easton, J. Q. (2007). What matters for staying on-track and graduating in Chicago public high schools: A close look at course grades, failures, and attendance in the freshman year. Chicago, IL: Consortium on Chicago School Research at the University of Chicago. Retrieved from http://ccsr.uchicago.edu/sites/default/files/publications/07%20What%20Matters%20Final.pdf
• Alliance for Excellent Education. (2013). High school state cards: North Carolina. Retrieved from http://all4ed.org/wp-content/uploads/2013/09/NorthCarolina_hs.pdf
• Balfanz, R. (2009). Putting middle grades students on the graduation path (Policy and practice brief). Baltimore, MD: Johns Hopkins University, Everyone Graduates Center. Retrieved from https://www.amle.org/portals/0/pdf/articles/Policy_Brief_Balfanz.pdf
• Dynarski, M., Clarke, L., Cobb, B., Finn, J., Rumberger, R., & Smink, J. (2008). Dropout prevention: A practice guide (NCEE 2008–4025). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ies.ed.gov/ncee/wwc/PracticeGuide.aspx?sid=9
• Hernandez, D. (2012). Double jeopardy: How third-grade reading skills and poverty influence high school graduation. Baltimore, MD: The Annie E. Casey Foundation. Retrieved from http://gradelevelreading.net/wp-content/uploads/2012/01/Double-Jeopardy-Report-030812-forweb1.pdf
• Johnson, J., Showalter, D., Klein, C., & Lester, C. (2014). Why rural matters 2013–2014: The condition of rural education in the 50 states. Rural School and Community Trust. Retrieved from http://www.ruraledu.org/articles.php?id=3181
• Mac Iver, M. A., & Mac Iver, D. J. (2009). Beyond the indicators: An integrated school-level approach to dropout prevention. Arlington, VA: George Washington University, Center for Equity and Excellence in Education, Mid-Atlantic Equity Center. Retrieved from http://maec.ceee.gwu.edu/sites/default/files/Dropout%20report%208.11.09.pdf
• National High School Center, National Center on Response to Intervention, and Center on Instruction. (2013). Tiered interventions in high schools: Using preliminary “lessons learned” to guide ongoing discussion. Washington, DC: American Institutes for Research.
Retrieved from http://www.rti4success.org/resource/tiered-interventions-high-schools-using-preliminary-lessons-learnedguide-ongoing
• North Carolina Department of Public Instruction. (n.d.). North Carolina DPI MTSS Wiki. Retrieved from http://mtss.ncdpi.wikispaces.net/
• Stetser, M. C., & Stillwell, R. (2014). Public high school four-year on-time graduation rates and event dropout rates: School years 2010–11 and 2011–12. First look (NCES 2014-391). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubs2014/2014391.pdf
• Therriault, S., O’Cummings, M., Heppen, J., Yerhot, L., & Scala, J. (2013). High school early warning intervention monitoring system implementation guide. Washington, DC: National High School Center, U.S. Department of Education. Retrieved from http://www.earlywarningsystems.org/wpcontent/uploads/documents/EWSHSImplementationguide2013.pdf

Thank You
Amy Peterson
ampeterson@air.org