Purposes of Assessment
Knappa School District, 4/19/10

Purposes of Assessment
1. Screening
2. Progress Monitoring
3. Diagnostic

The Importance of Data

Screening

Good Screening Tools
Screening tools are designed to:
1. Compare all students to the same grade-level standard
2. Be efficient, standardized, reliable, and valid
3. Accurately identify those who are on track to meet grade-level expectations and those who will need more support
4. Evaluate the quality of your schoolwide instructional system
Screening tools are NOT designed to:
1. Identify specific instructional levels for individual students
2. Take a large amount of instructional time or vary between classrooms
3. Identify WHY some students will need more support
4. Evaluate individual teachers

1. Compare all students to the same grade-level standard
ALL students are assessed against the grade-level standard at least 3 times a year, regardless of instructional level.
"If you don't know where you are going, you will wind up somewhere else." ~ Yogi Berra

2. Be efficient, standardized, reliable, and valid
"Curriculum-Based Measures (CBM) are usually composed of a set of standard directions, a timing device, a set of materials, scoring rules, standards for judging performance, and a record form or chart."
These are also called General Outcome Measures (GOM).

Essential Features of CBM Tools
• Robust indicator of academic health
• Brief and easy to administer
• Can be administered frequently
• Must have multiple, equivalent forms (if the metric isn't the same, the data are meaningless)
• Must be sensitive to growth

What are Screening Tools?
Screening tools:
• DIBELS
• Oral Reading Fluency
• Maze
• EasyCBM
Not screening tools:
• Quick Phonics Screener
• QRI-IV
• DRA2
• Running Records
• Report cards
• Meeting OAKS standards
• Core curriculum weekly tests on skills that are learned

OAKS
• Passing OAKS 3rd grade (204) places a child in the 17th percentile.
• Passing 3rd grade is not enough.
• By 5th grade, if these students stay at the same percentile they will fail OAKS.
• By 10th grade, if these students stay at the same percentile they will fail OAKS.
• To pass OAKS in 10th grade, a student would score a 236, which falls at the 37th percentile.
Based on 2008-2009 ODE percentiles.

Reading CBM: DIBELS (Dynamic Indicators of Basic Early Literacy Skills)
Grade levels: K-2
Time to administer: 1-2 minutes per measure
Skills measured: Phonemic awareness, phonics
Strengths: Free; quick and easy to administer; highly predictive of later reading; research-based benchmarks
Weaknesses: Limited to early literacy skills; can only be given to 1 student at a time

Reading CBM: ORF (Oral Reading Fluency)
Grade levels: 1-6 (DIBELS)
Time to administer: 1 minute per passage
Skills measured: Oral reading fluency and accuracy
Strengths: Free; quick and easy to administer; highly predictive of later reading; research-based benchmarks; indicator of reading comprehension
Weaknesses: Can only be given to 1 student at a time; not a comprehensive comprehension measure

Oral reading fluency and comprehension activity
• Find a partner. One person will be the Test Administrator (paper A). The other will be the Test Taker (paper B).
• Read your directions to yourself. Test Administrator, give the test and score it (a scoring sketch follows below).
• Test Taker, read the passage aloud and be prepared to answer questions.
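For reference during scoring, here is a minimal sketch of how a one-minute ORF administration is typically turned into the two numbers the next slides ask you to examine together: words correct per minute and accuracy. The function name and the sample counts are hypothetical illustrations, not part of the DIBELS materials.

    # Minimal sketch of ORF scoring, assuming a standard 1-minute administration.
    def score_orf(words_attempted, errors):
        """Return (words correct per minute, accuracy percent) for a 1-minute read."""
        words_correct = words_attempted - errors            # WCPM for a 1-minute timing
        accuracy = 100.0 * words_correct / words_attempted  # percent of attempted words read correctly
        return words_correct, round(accuracy, 1)

    # Hypothetical example: a student attempts 112 words and makes 6 errors.
    wcpm, accuracy = score_orf(words_attempted=112, errors=6)
    print(f"{wcpm} words correct per minute, {accuracy}% accuracy")
    # 106 words correct per minute, 94.6% accuracy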
Oral reading fluency and comprehension activity

Oral Reading Errors per Page at Different Levels of Accuracy
                              95%     98%     99%
The Secret Life of Bees       18.5    7.4     3.6
My Brother Sam is Dead        15      6       3
The Magic School Bus          6       2.4     1.2

Why Use Fluency Measures for Screening?
• Oral reading fluency and accuracy in reading connected text is one of the best indicators of overall reading comprehension (Fuchs, Fuchs, Hosp, & Jenkins, 2001).
• Always examine fluency AND accuracy.
• Students MUST be accurate with any skill before they are fluent.
Oral reading fluency (ORF) does not tell you everything about a student's reading skill, but a child who cannot read fluently and accurately cannot fully comprehend written text and will need additional support.

Fluency Non-Example

Reading CBM: Maze
Grade levels: 1-8 (AIMSweb); 3-6 (DIBELS)
Time to administer: 3-5 minutes
Skills measured: Reading comprehension
Strengths: Free (if using DIBELS); quick and easy to administer; can be administered to groups of students; indicator of reading comprehension
Weaknesses: Time-consuming to score; only a "surface" comprehension measure; students make less growth than on ORF

3. Accurately identify those who are on track and those who will need more support
We must identify struggling students BEFORE they fall too far behind.
Good, Simmons, & Smith (1998)

4. Evaluate the quality of your schoolwide instructional system
• Are 80% of your students proficient?
• Are 80% of students reaching benchmarks and "on track" for the next goal?
If not, then the core curriculum needs to be addressed.

Improving Your Core
• 90-minute reading block structure
• Fidelity
• Explicit, systematic instruction with feedback and application
• Professional development

What does your screening data look like?

Talk to your neighbor about…
• 1 thing you already knew
• 1 thing that was new to you
• 1 unanswered question you still have

Progress Monitoring

Progress Monitoring Tools
• Brief & easy
• Sensitive to growth
• Frequent
• Equivalent forms

What are Progress Monitoring Tools?
Progress monitoring tools:
• DIBELS
• Oral Reading Fluency
• Maze
• EasyCBM
Not progress monitoring tools:
• Quick Phonics Screener
• QRI-IV
• DRA2
• Running Records
• Report cards
• Meeting OAKS standards
• Core curriculum weekly tests on skills that are learned

Progress Monitoring: The GPS for Educators!
Where are we? What is our goal? What course should we follow? How are we doing?
[Diagram: "We are Here," "Our Goal," the desired course, and the actual course.]

Purpose of Progress Monitoring
Answers the questions:
• Are the children learning? How can we tell?
• Are they making enough progress?
• Can we remove some of our supports?
• Do we need to change or intensify our supports?

How often do you progress monitor students?
Determined by district decision rules and level of need.
Best practice recommendations:
– Intensive: 1x per week
– Strategic: 1x or 2x per month

How do we know if a student is making adequate progress?
[Graph: correct words per minute (10-60) plotted for December through June scores, with decision rules applied.]

Questions to Consider
• How many data points below the line before you make a change in instruction/intervention? (See the sketch after this list.)
• What do you change?
  – Group size?
  – Time?
  – Curriculum?
  – Other factors?
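To make the "data points below the line" question concrete, here is a minimal sketch of one commonly cited decision rule: draw an aimline from a baseline score to the year-end goal and flag a change when several consecutive scores fall below it. The run length, baseline, goal, and scores below are assumptions for illustration only; your district's decision rules govern in practice.

    # Sketch of an aimline decision rule. Assumptions: the run length, baseline,
    # goal, and scores are hypothetical, not district values.
    def aimline(baseline, goal, n_checks):
        """Evenly spaced expected scores rising from the baseline to the goal."""
        step = (goal - baseline) / n_checks
        return [baseline + step * (i + 1) for i in range(n_checks)]

    def needs_change(scores, expected, run_length=4):
        """True once `run_length` consecutive scores fall below the aimline."""
        run = 0
        for actual, target in zip(scores, expected):
            run = run + 1 if actual < target else 0
            if run >= run_length:
                return True
        return False

    # Hypothetical student: baseline of 20 correct words per minute,
    # goal of 50 by the last of eight checks.
    scores = [22, 24, 23, 26, 25, 27, 26, 28]
    expected = aimline(baseline=20, goal=50, n_checks=len(scores))
    print(needs_change(scores, expected))  # True: consider changing group size, time, or curriculum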
Progress Monitoring: Phonics for Reading
[Graph: successive progress monitoring scores for a student receiving Phonics for Reading: 27, 31, 30, 35, 25, 32, 34, 38.]

We do not use progress monitoring data to…
…select specific short-term instructional goals
…take a lot of time away from instruction
…diagnose educational problems
…assign grades to students
…evaluate teachers

Progress monitoring data tell us WHEN a change is needed.
Progress monitoring data do not necessarily tell us WHAT change is needed.

Talk to your neighbor about…
• 1 thing you already knew
• 1 thing that was new to you
• 1 unanswered question you still have

Diagnostic
Answers the question: Why?

Diagnostic assessments may include:
• In-curriculum assessments:
  – Quick Phonics Screener (in Triumphs)
  – Weekly assessment data
  – Unit and benchmark assessment data
• Survey Level Assessments
• Error analysis or Running Records
• Any formal or informal assessment that answers the question: Why is the student having a problem?

Treasures Weekly Assessments
• Weekly Assessment
• Weekly Leveled Assessment

Organizing Weekly Assessment Data
[Table: a class record of Treasures weekly assessment scores for Unit 3, Weeks 1-5. Each week records Vocabulary (3 points possible), Reading Comprehension (2 points), and Constructed Response (2 points) for every student, with a total correct out of 35 and a percent score per student, class averages for each column, and the comprehension skill assessed each week: Make Inferences, Plot and Setting, Cause and Effect, Make Inferences, and Sequence.]

Diagnostic Assessment Questions
"Why is the student not performing at the expected level?"
"What is the student's instructional need?"

We do not use diagnostic data…
…for all students
…to monitor progress towards a long-term goal
…to compare students to each other

Talk to your neighbor about…
• 1 thing you already knew
• 1 thing that was new to you
• 1 unanswered question you still have

Final Questions/Comments
Jon Potter
jpotter@ttsd.k12.or.us
(503) 431-4149