Benchmark Screening:
What, Why and How
A module for pre-service and in-service
professional development
MN RTI Center
Author: Lisa H. Stewart, PhD
Minnesota State University Moorhead
www.scred.k12.mn.us (click on RTI Center)
MN RTI Center Training Modules

This module was developed with funding from the MN legislature. It is part of a series of modules available from the MN RTI Center for use in preservice and inservice training:

1. RTI Overview (Kim Gibbons & Lisa Stewart)
2. Measurement and RTI Overview (Lisa Stewart)
3. Curriculum Based Measurement and RTI (Lisa Stewart)
4. Universal Screening (Benchmarking), two parts: What, Why and How; Using Screening Data (Lisa Stewart)
5. Progress Monitoring, two parts: What, Why and How; Using Progress Monitoring Data (Lisa Stewart & Adam Christ)
6. Evidence-Based Practices (Ann Casey)
7. Problem Solving in RTI (Kerry Bollman)
8. Differentiated Instruction (Peggy Ballard)
9. Tiered Service Delivery and Instruction (Wendy Robinson)
10. Leadership and RTI (Jane Thompson & Ann Casey)
11. Family Involvement and RTI (Amy Reschly)
12. Five Areas of Reading (Kerry Bollman)
13. Schoolwide Organization (Kim Gibbons)
Overview

This module is Part 1 of 2.

Module 1: Benchmark Screening: What, Why and How
- What is screening?
- Why screen students?
- Criteria for screeners / what tools?
- Screening logistics

Module 2: Using Benchmark Screening Data
Assessment: One of the Key Components in RTI
- Curriculum and Instruction
- Assessment
- School-Wide Organization & Problem-Solving Systems (teams, process, etc.)
Adapted from Logan City School District, 2002
Assessment and Response to Intervention (RTI)

A core feature of RTI is identifying a measurement system:
- Screen large numbers of students
- Identify students in need of additional intervention
- Monitor students of concern more frequently (1 to 4x per month, typically weekly)
- Use diagnostic testing for instructional planning to help target interventions as needed
Why Do Screening?

Activity:
- What does it mean to "screen" students?
- Why is screening so important in a Response to Intervention system? (e.g., what assumptions of RTI require a good screening system?)
- What happens if you do NOT have an efficient, systematic screening system in place in the school?
Screening is part of a problem-solving system
- Helps identify at-risk students in a PROACTIVE way
- Gives feedback to the system about how students progress throughout the year at a gross (3x per year) level
  - If students are on track in the fall, are they still on track in the winter?
  - What is happening with students who started the year below target? Are they catching up?
- Gives feedback to the system about changes from year to year
  - Is our new reading curriculum having the impact we were expecting?
What Screening Looks Like in a Nutshell
- School decides on brief tests to be given at each grade level and trains staff in the administration, scoring, and use of the data
- Students are given the tests 3x per year (Fall, Winter, Spring)
- A person or team in each building is assigned to organize data collection
- All students are given the tests for their grade level within a short time frame (e.g., 1-2 weeks or less); some tests may be group administered, others individually administered
  - Benchmark testing: about 5 minutes per student, desk to test (individually administered)
  - Administered by special ed, reading, or general ed teachers or paras
- Scores are entered into a computer/web-based reporting system by clerical staff
- Reports show the spread of student skills and list student scores, etc., to use in instructional and resource planning
Example Screening Data: Spring Gr 1 Oral Reading Fluency
- 10/51 (20%) high risk
- 22/51 (43%) some risk
- 19/51 (37%) low risk: on or above target

Class lists then identify specific students (and scores) in each category.

DRAFT May 27, 2009
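Category counts like those above are easy to recompute from raw scores. The sketch below is illustrative only: the helper names and the cut scores (20 and 40 words correct per minute) are assumptions for the example, not actual benchmark targets.

```python
def categorize(score, some_risk_cut=20, low_risk_cut=40):
    """Place one words-correct-per-minute score into a risk category.
    Cut scores here are placeholders, not real benchmarks."""
    if score < some_risk_cut:
        return "high risk"
    elif score < low_risk_cut:
        return "some risk"
    return "low risk"

def risk_summary(scores, **cuts):
    """Return (count, rounded percent) of students in each risk category."""
    counts = {"high risk": 0, "some risk": 0, "low risk": 0}
    for s in scores:
        counts[categorize(s, **cuts)] += 1
    n = len(scores)
    return {cat: (c, round(100 * c / n)) for cat, c in counts.items()}
```

A data manager could run `risk_summary` on each classroom's scores to produce the slide's three-way breakdown.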
Screening Data

Gives an idea of what the range of student skills is like in your building and how much growth over time students are making.
Screening Data can be linked to Progress Monitoring

The goal is to have a cohesive system. If possible, use the same measures for both screening and progress monitoring (e.g., CBM).

- Screen ALL students 3x per year (F, W, S)
- Strategic support and monitoring for students at some risk
- Intensive support and monitoring for students at extreme risk
A Smart System Structure

School-Wide Systems for Student Success

Academic Systems:
- Intensive, Individual Interventions (5-10%): individual students; assessment-based; high intensity; of longer duration
- Targeted Group Interventions (10-15%): some students (at-risk); high efficiency; rapid response
- Universal Interventions (75-85%): all students; preventive, proactive

Behavioral Systems:
- Intensive, Individual Interventions (5-10%): individual students; assessment-based; intense, durable procedures
- Targeted Group Interventions (10-15%): some students (at-risk); high efficiency; rapid response
- Universal Interventions (75-85%): all settings, all students; preventive, proactive
Terminology Check
- Screening: collecting data on all or a targeted group of students in a grade level or in the school
- Universal Screening: same as above, but implies that all students are screened
- Benchmarking: often used synonymously with the terms above, but typically implies universal screening done 3x per year, with data interpreted using criterion target or "benchmark" scores
"Benchmark" Screening
- Schools typically use cutoff or criterion scores to decide whether a student is at risk. Those scores or targets are also referred to as "benchmarks," hence the term "benchmarking."
- Some states or published curricula also use the term benchmarking, but in a different way (e.g., to refer to documentation of achieving a specific state standard) that has nothing to do with screening.
What to Measure for Screening?
Create a "Measurement Net"

Measures are given at the beginning, middle, and end of the year; early literacy "subskill" measures drop out as reading develops:

- Kindergarten: Reading: Letter Names, Rhyming, and Letter Sounds, with Phoneme Segmentation and Nonsense Words added as the year progresses; Math: Test of Early Numeracy; Writing: ---
- Grade 1: Reading: Letter Names, Phoneme Segmentation, Nonsense Words, and Word Use, with Oral Reading added by winter and spring; Math: Test of Early Numeracy, then Math CBM; Writing: Written Expression CBM by end of year
- Grade 2: Reading: Nonsense Words (fall only), Word Use, and Oral Reading; Math: Math CBM; Writing: Written Expression CBM
- Grades 3-4: Reading: Oral Reading; Math: Math CBM; Writing: Written Expression CBM
- Grades 5-6: Reading: Maze; Math: Math CBM; Writing: Written Expression CBM

Note. All measures are fluency measures using standardized administration and scoring. Reading measures are a combination of CBM, DIBELS, and IGDIs measures.
How do you decide what Measures to Use for Screening?

There are lots of ways to measure reading in the schools:
- Measures of Academic Progress (MAP)
- Guided Reading (Leveled Reading)
- Statewide accountability tests
- Published curriculum tests
- Teacher-made tests
- General Outcome Measures (the Curriculum-Based Measurement "family")
- STAR Reading
- Etc.

Not all of these are appropriate. Some are not reliable enough for screening; others are designed for another purpose and are not valid or practical for screening all students 3x per year.
Characteristics of An Effective Measurement System for RTI
- Valid
- Reliable
- Simple
- Quick
- Inexpensive
- Easily understood
- Can be given often
- Sensitive to growth over short periods of time

Credit: K. Gibbons, M. Shinn
Effective Screening Measures
- Sensitive: identifies at-risk students who really are at risk
- Specific: students who "pass" really do go on to do well
- Practical: brief and simple (cheap is nice too)
- Do no harm: if a student is identified as at risk, will they get help, or is it just a label?

Reference: Hughes & Dexter, RTI Action Network
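In measurement terms, "sensitive" and "specific" correspond to the true-positive and true-negative rates of the screener, which can be checked once outcome data come in. A minimal sketch; the counts in the example are made up, not data from any real screener.

```python
def sensitivity(true_pos, false_neg):
    """Of students who truly were at risk, what share did the screener flag?"""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Of students who truly were on track, what share did the screener pass?"""
    return true_neg / (true_neg + false_pos)

# Hypothetical example: 18 at-risk students flagged, 2 missed;
# 70 on-track students passed, 10 flagged unnecessarily.
print(sensitivity(18, 2))   # 0.9
print(specificity(70, 10))  # 0.875
```

A school could compute both rates each year from spring outcome data to judge whether its fall cut scores are creating too many false positives or false negatives.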
Buyer Beware!

Many tools may make claims about being a good "screener."
Measurement and RTI: Screening
- Reliability coefficients of at least r = .80; higher is better, especially for screening specificity
- Well-documented predictive validity
- Evidence that the criterion (cut score) being used is reasonable and creates neither too many false positives (students identified as at risk who aren't) nor too many false negatives (students who are at risk but aren't identified as such)
- Brief, easy to use, affordable, and results/reports are accessible almost immediately

National Center for RTI Review of Screening Tools
www.rti4success.org

Note: The Center only reviews tests that are submitted; if a tool is not on the list, that doesn't mean it is bad, just that it wasn't reviewed.
RTI, General Outcome Measures and Curriculum Based Measurement
- Many schools use Curriculum Based Measurement (CBM) general outcome measures for screening and progress monitoring
  - You don't "have to" use CBM, but many schools do
- The most common CBM tool in Grades 1-8 is Oral Reading Fluency (ORF)
  - A measure of reading rate (# of words correct per minute on a grade-level passage) and a strong indicator of overall reading skill, including comprehension
- Early literacy measures are also available, such as Nonsense Word Fluency (NWF), Phoneme Segmentation Fluency (PSF), Letter Name Fluency (LNF), and Letter Sound Fluency (LSF)
Why GOMs/CBM?
- They typically meet the criteria needed for RTI screening and progress monitoring
  - Reliable, valid, specific, sensitive, practical
  - Also, some utility for instructional planning (e.g., grouping)
- They are INDICATORS of whether there might be a problem, not diagnostic!
  - Like taking your temperature or sticking a toothpick into a cake
  - Oral reading fluency is a great INDICATOR of reading decoding, fluency, and reading comprehension
  - Fluency-based, because automaticity helps discriminate between students at different points of learning a skill
GOM…CBM… DIBELS… AIMSweb…
CBM Oral Reading Fluency
- Give 3 grade-level passages using standardized administration and scoring; use the median (middle) score
- 3-second rule (tell the student the word and point to the next word)
- Discontinue rule (stop after 0 correct in the first row; if <10 correct on the 1st passage, do not give the other passages)

Errors: hesitation for >3 seconds; incorrect pronunciation for context; omitted words; words out of order; skipped row
Not errors: repeated sounds; self-corrects; insertions; dialect/articulation differences
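The median-of-three scoring rule above can be sketched in a few lines. The helper name is hypothetical; the rule itself (three grade-level passages, take the middle words-correct-per-minute score) is as stated on the slide.

```python
import statistics

def orf_benchmark_score(words_correct):
    """Median of the three grade-level passage scores, per the
    benchmark administration rule (three passages, middle score)."""
    if len(words_correct) != 3:
        raise ValueError("benchmark ORF uses exactly three passages")
    return statistics.median(words_correct)

# A student reading 42, 55, and 48 words correct per minute
# gets a benchmark score of 48.
print(orf_benchmark_score([42, 55, 48]))  # 48
```

The median is used rather than the mean so a single unusually easy or hard passage does not distort the student's score.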
Fluency and Comprehension

The purpose of reading is comprehension. A good measure of overall reading proficiency is reading fluency, because of its strong correlation with measures of comprehension.
Screening Logistics
- What materials?
- When to collect?
- Who collects it?
- How to enter and report the data?
What Materials?
- Use a computer- or PDA-based testing system
- OR -
- Download reading passages, early literacy probes, etc. from the internet
  - Many sources of CBM materials are available free or at low cost: AIMSweb, DIBELS, Edcheckup, etc.
  - Often organized as "booklets" for ease of use
  - Can use plastic covers and markers for scoring to save copy costs
Screening Materials in K and Gr 1
- Screening measures will change slightly from Fall to Winter to Spring
- Early literacy "subskill" measurement is dropped as reading develops
- Downloaded materials and booklets
K and Gr 1 Measures: AIMSweb Early Literacy and R-CBM (ORF)

Kindergarten:
- Fall: Letter Naming, Letter Sounds, Rhyming, Alliteration, Picture Naming
- Winter: Letter Naming, Letter Sounds, Nonsense Words, Phoneme Segmentation, Picture Naming
- Spring: Letter Naming, Letter Sounds, Nonsense Words, Phoneme Segmentation, Picture Naming

Grade 1:
- Fall: Letter Naming, Letter Sounds, Nonsense Words, Phoneme Segmentation, Word Use Fluency (optional)
- Winter: Nonsense Words, R-CBM, Phoneme Segmentation, Word Use Fluency (optional)
- Spring: Nonsense Words, R-CBM, Phoneme Segmentation, Word Use Fluency (optional)

Color key: General Literacy Risk Factor = Black, Alphabetic Principle = Green, Phonemic Awareness = Purple, Vocabulary = Blue, Fluency with Connected Text & Comprehension = Red
Gr 2 to 12: AIMSweb Early Literacy and CBM Measures

Grade 2:
- Fall: Nonsense Word Fluency, R-CBM, R-Maze (optional), Word Use Fluency (optional)
- Winter: R-CBM, R-Maze (optional), Word Use Fluency (optional)
- Spring: R-CBM, R-Maze (optional), Word Use Fluency (optional)

Grade 3:
- Fall, Winter, Spring: R-CBM, R-Maze (optional), Word Use Fluency (optional)

Grades 4-12+:
- Fall, Winter, Spring: R-CBM, R-Maze (optional)
Screening Logistics: Timing
- Typically 3x per year: Fall, Winter, Spring
  - Have a district-wide testing window! (All grades and schools collect data within the same 2-week period.)
  - In fall, kindergartners are sometimes tested right away and again a month later, or testing is delayed a little while
- Benchmark testing: about 5 minutes per student (individually administered)
  - In the classroom
  - At stations in a commons area, lunchroom, etc.
Screening Logistics: People
- Administered by trained staff
  - Paras, special ed teachers, reading teachers, general ed teachers, school psychologists, speech-language staff, etc.
  - Good training is essential!
- A measurement person is assigned in each building to organize data collection
- Data are either collected electronically or entered into a web-based data management tool by clerical staff
Screening Logistics Math Quiz 
- If you have a classroom with 25 students, and administering the screening measures takes approx. 5 min. per student (individual assessment time)…
- How long would it take 5 people to "screen" the entire classroom?
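One way to check your answer is a quick calculation. This sketch assumes testers work in parallel and ignores transition time between students; the function name is just for illustration.

```python
import math

def screening_minutes(n_students, minutes_per_student=5, n_testers=1):
    """Total wall-clock minutes to screen a class, assuming testers
    work in parallel and transitions between students take no time."""
    students_per_tester = math.ceil(n_students / n_testers)
    return students_per_tester * minutes_per_student

# 25 students at 5 minutes each, split across 5 testers:
print(screening_minutes(25, 5, 5))  # 25
# The same class with a single tester:
print(screening_minutes(25, 5, 1))  # 125
```

So five trained staff can screen a classroom in about one class period's worth of time, which is why spreading data collection across many people matters.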
Remember: Garbage IN… Garbage OUT…

Make sure your data are reliable and valid indicators or they won't be good for nuthin…
- Training
- Assessment integrity checks/refreshers
- Well-chosen tasks/indicators
Use Technology to Facilitate Screening
Using Technology to Capture Data
- Collect the data using technology such as a PDA
  - Examples: http://www.wirelessgeneration.com/, http://www.aimsweb.com
- Students take the test on a computer
  - Example: STAR Reading, http://www.renlearn.com/sr/
Using Technology to Organize and Report Data
- Enter data into a web-based data management system
- Data get back into the hands of teachers and teams quickly, in meaningful reports for problem solving
- Examples:
  - http://dibels.uoregon.edu
  - http://www.aimsweb.com
  - http://www.edcheckup.com
Screening is just one part of an overall assessment system for making decisions

*Decision Tree for Screening, Instructional Decision-Making, & Progress Monitoring with DIBELS

Did the student meet or exceed the Low Risk/Benchmark goals on the most recent DIBELS testing?
- YES: The next progress check is the regularly scheduled DIBELS testing for all students.
- NO: Did the student fall into the "Some risk" category or the "At-risk" category? An intervention plan may be needed.
  - Some risk: Do other data (e.g., OS, BMRR, DRA) indicate some concern?
    - NO: Make sure a good curriculum is in place in the classroom and consider monitoring monthly.
    - YES: Put the student in strategic instruction (e.g., small group with supplemental curricula). Be SURE TO CONTINUE TO USE DATA to make changes as needed. Monitor monthly.
  - At risk: Do other data (e.g., OS, BMRR, DRA) indicate a high level of concern? (It is important here to get good info.)
    - NO: Put the student in strategic instruction and monitor monthly.
    - YES: Put the student in intensive instruction (e.g., 1:1 or very small group with supplemental and direct instruction curricula). Be SURE TO CONTINUE TO USE DATA to make changes as needed. Monitor weekly!

*Note: The concept and content of this model was provided by Dr. Lisa Stewart of MSUM
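The branches of the decision tree can be sketched as a small function. This is a hypothetical rendering for illustration: the function name and return strings are mine, and the at-risk/no-concern branch (fall back to strategic instruction) reflects one reading of the original flowchart.

```python
def dibels_next_step(category, other_data_concern):
    """Sketch of the DIBELS decision tree. 'category' is the most recent
    DIBELS risk category; 'other_data_concern' says whether other data
    (e.g., OS, BMRR, DRA) confirm the concern."""
    if category == "low risk":
        return "Regularly scheduled DIBELS benchmark testing"
    if category == "some risk":
        if other_data_concern:
            return "Strategic instruction; monitor monthly"
        return "Check core curriculum; consider monitoring monthly"
    if category == "at risk":
        if other_data_concern:
            return "Intensive instruction; monitor weekly"
        # Assumption: without confirming data, fall back to strategic support.
        return "Strategic instruction; monitor monthly"
    raise ValueError(f"unknown category: {category}")
```

Writing the tree out this way makes it easy to see that every screening outcome leads to a concrete next step, which is the point of the slide.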
Remember: Screening is part of a problem-solving system
- Helps identify at-risk students in a PROACTIVE way
- Gives feedback to the system about how students progress throughout the year at a gross (3x per year) level
  - If students are on track in the fall, are they still on track in the winter?
  - What is happening with students who started the year below target? Are they catching up?
- Gives feedback to the system about changes from year to year
  - Is our new reading curriculum having the impact we were expecting?
Build in Time to USE the Data!

Schedule data "retreats" or grade-level meeting times immediately after screening so you can look at and USE the data for planning.
Common Mistakes
- Not enough professional development and communication about why these measures were picked, what the scores do and don't mean, the rationale for screening, etc.
- Low or questionable quality of administration and scoring
- Too much reliance on a small group of people for data collection
- Teaching to the test
- Limited sample of students tested (e.g., only Title students!)
- Slow turnaround on reports
- Data are not used
Using Screening Data: See Module 2!
Articles available with this module
- Stewart & Silberglitt (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology V (pp. 225-242). NASP Publications.
- NRCLD RTI Manual (2006). Chapter 1: School-wide screening. Retrieved 6/26/09 from http://www.nrcld.org/rti_manual/pages/RTIManualSection1.pdf
- Jenkins & Johnson. Universal screening for reading problems: Why and how should we do this? Retrieved 6/23/09 from the RTI Action Network site: http://www.rtinetwork.org/Essential/Assessment/Universal/ar/ReadingProblems
- Kovaleski & Pederson (2008). Best practices in data analysis teaming. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology V. NASP Publications.
- Ikeda, Neessen, & Witt (2008). Best practices in universal screening. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology V (pp. 103-114). NASP Publications.
- Gibbons, K. (2008). Necessary assessments in RTI. Retrieved 6/26/09 from http://www.tqsource.org/forum/documents/GibbonsPaper.doc
RTI Related Resources
- National Center on RTI: http://www.rti4success.org/
- RTI Action Network (links for Assessment and Universal Screening): http://www.rtinetwork.org
- MN RTI Center: http://www.scred.k12.mn.us/ (click on link)
- National Center on Student Progress Monitoring: http://www.studentprogress.org/
- Research Institute on Progress Monitoring: http://progressmonitoring.net/
RTI Related Resources (Cont'd)
- National Association of School Psychologists: www.nasponline.org
- National Association of State Directors of Special Education (NASDSE): www.nasdse.org
- Council of Administrators of Special Education: www.casecec.org
- Office of Special Education Programs (OSEP) toolkit and RTI materials: http://www.osepideasthatwork.org/toolkit/ta_responsiveness_intervention.asp
Key Sources for Reading Research, Assessment and Intervention…
- University of Oregon IDEA (Institute for the Development of Educational Achievement) Big Ideas of Reading site: http://reading.uoregon.edu/
- Florida Center for Reading Research: http://www.fcrr.org/
- Texas Vaughn Gross Center for Reading and Language Arts: http://www.texasreading.org/utcrla/
- American Federation of Teachers reading resources (What Works 1999 publications): http://www.aft.org/teachers/pubs-reports/index.htm#reading
- National Reading Panel: http://www.nationalreadingpanel.org/
Recommended Sites with Multiple Resources
- Intervention Central (by Jim Wright, school psych from central NY): http://www.interventioncentral.org
- Center on Instruction: http://www.centeroninstruction.org
- St. Croix River Education District: http://scred.k12.mn.us
Quiz

1.) A core feature of RTI is identifying a(n) _________ system.

2.) Collecting data on all or a targeted group of students in a grade level or in the school is called what?
  A.) Curriculum
  B.) Screening
  C.) Intervention
  D.) Review
Quiz (Cont'd)

3.) What is a characteristic of an effective measurement system for RTI?
  A.) Valid
  B.) Reliable
  C.) Simple
  D.) Quick
  E.) All of the above
Quiz (Cont'd)

4.) Why screen students?

5.) Why would general education teachers need to be trained on the measures used if they aren't part of the data collection?
Quiz (Cont'd)

6.) True or False? If possible, the same tools should be used for screening and progress monitoring.

7.) List at least 3 common mistakes when doing screening and how they can be avoided.
The End 

Note: The MN RTI Center does not endorse any particular product. Examples used are for instructional purposes only.

Special Thanks:
- Thank you to Dr. Ann Casey, director of the MN RTI Center, for her leadership
- Thank you to Aimee Hochstein, Kristen Bouwman, and Nathan Rowe, Minnesota State University Moorhead graduate students, for editing work, writing quizzes, and enhancing the quality of these training materials