Progress Monitoring

Progress Monitoring in an RTI Model
April 2011
Basic Goal of Assessment
• "The ultimate goal of assessment is to identify problems with instruction and to lead to instructional modifications. A good share of present-day assessment activities consist of little more than meddling… We must use assessment data to improve instruction… The only way to determine the effectiveness of instruction is to collect data."
• Ysseldyke and Algozzine (1995)
Curriculum-Based Measurement: Introduction
Objectives
• Learn levels of assessment & associated purposes
• Increase knowledge and skills in curriculum-based measurement (CBM)
• Apply knowledge of CBM to diagnostic assessment practices
◦ Learn can't do/won't do assessment
◦ Learn survey-level assessment
Objectives (continued)
• Learn goal-setting formula
• Apply graphing techniques
• Analyze data
A + B = pC
Problem Solving Process
[Diagram: a problem-solving cycle with DATA at its center]
• Define the Problem – defining the problem / directly measuring behavior
• Problem Analysis – validating the problem; identifying variables that contribute to the problem
• Develop Plan
• Implement Plan – implement as intended; progress monitor; modify as necessary
• Evaluate – Response to Intervention (RtI)
Response to Intervention
RtI is the practice of (1) providing high-quality instruction/intervention matched to student needs and (2) using learning rate over time and level of performance to (3) make important educational decisions (Batsche et al., 2005).
Problem solving is the process that is used to develop effective instruction/interventions.
Anita Archer
• Relentless in education
• Ability and outcome
Formative vs. Summative
• "When the cook tastes the soup, that's formative. When the guests taste the soup, that's summative." - Robert Stake
Benchmarking & Progress Monitoring
What is the difference?
• Benchmark: aids in identification of students at risk; administered 3 times a year (fall, winter, spring)
• Progress Monitoring: used to track individual students' learning, plan instruction, and provide feedback to students; administered weekly or biweekly
Why CBM?
CBM – What is It?
• Formative assessment
• Measure of student performance over time
• An analysis of a specific skill for an individual student
• Tool for:
◦ Identifying struggling students
◦ Setting goals
◦ Aligning instruction with desired outcomes
◦ Providing diagnostic information
◦ Progress monitoring
◦ IEP development
What is the Goal of CBM?
• The goal is two-fold:
1. Monitor student progress
2. Inform instruction / teacher practice
Why Should I Do It?
• Measure of class-wide performance
• An alternative to other assessment procedures – often replaces costly, time-consuming, disruptive practices
• Quick & easy
• Established reliability & validity
• Direct, low-inference measures
• Can be easily summarized & presented
◦ Parents, students, colleagues
B D I P
• B – Benchmarking
◦ Norms
• D – Diagnostic Assessment
◦ Can't Do/Won't Do
◦ Survey Level
• I – Intervention Selection
◦ Standard treatment (Tier II)
◦ Who, where, frequency, duration, materials
• P – Progress Monitoring & Analysis
◦ Data collection
◦ Data management
◦ Next steps
Assessment and MTSS
[Figure: three-tier MTSS triangle – Tier 1, Tier 2, Tier 3]
Adapted from Burns & Riley-Tillman (2010)
Assessment and MTSS
• Tier III – Identify discrepancy for individual. Identify causal variable. Implement individual intervention.
• Tier II – Identify discrepancy for individual. Identify category of problem. Assign small-group solution.
• Tier I – Identify discrepancy between expectation and performance for class or individual.
Adapted from Burns & Riley-Tillman (2010)
How Does It Fit Together?
Step 1 – Universal Screening: all students at a grade level (fall, winter, spring)
Step 2 – Additional Diagnostic Assessment
◦ Intensive (5%): individual diagnostic
◦ Targeted (15%): group diagnostic
◦ Universal (80%): none
Step 3 – Instruction/Intervention
◦ Intensive: individual instruction
◦ Targeted: small group, differentiated by skill
◦ Universal: continue with core instruction
Step 4 – Results/Monitoring
◦ Intensive: weekly
◦ Targeted: 2x/month
◦ Universal: benchmarks, grades, classroom assessments, Utah CRT
B - Benchmarking
• What - An indicator used to identify the
expected understandings and skills needed for
content standards by grade level - consider
norms
• When - typically 3x/year - predetermined
intervals (e.g., a mid-year benchmark).
• Who - all students
• How - timed leveled probes
What is Reading CBM?
• One-minute probe (e.g., DIBELS, 6-Minute Solution)
• Administered individually
• Provide intervention and progress monitor at instructional level
• Different measures
◦ Oral Reading Fluency (ORF)
◦ Maze (comprehension)
◦ Early reading (Initial Sound, Phoneme Segmentation, Nonsense Word, Letter Naming Fluency)
(See Chapters 3 & 4 in The ABCs of CBM)
Reading CBM – How?
• Select appropriate material for probe
• Place probe in front of and facing the student
• Keep a copy for the examiner (on clipboard)
• Provide directions
• Start timer
• Have student perform task for allotted time (1 minute for reading tasks)
• Score probe
• Display data on graph/chart
Video Clips – Examples
Scoring Reading Probe
• Oral Reading Fluency:
◦ Mark as correct:
• # of words read correctly in one minute
◦ Mark as incorrect:
• Misread words
• Omissions
• Hesitations – words supplied by the assessor after 3 seconds
• Reversals – two or more words not read in order
(see page 146 in The ABCs of CBM; a scoring sketch follows)
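A minimal scoring sketch in Python, assuming the examiner has already tallied total words attempted and total errors (misreads, omissions, hesitations, reversals); the function name and the proration rule for early finishes are illustrative, not from the source:

def words_correct_per_minute(words_attempted: int, errors: int,
                             seconds: int = 60) -> float:
    """Score a one-minute ORF probe as words correct per minute.
    If the student finishes the passage early, prorate to a full minute."""
    words_correct = words_attempted - errors
    return words_correct * 60 / seconds

# Example: 87 words attempted with 4 errors in a full minute -> 83.0 WCPM
print(words_correct_per_minute(87, 4))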
What is Math CBM?
• Math CBM can be broken into three areas:
◦ Early numeracy
◦ Computation
◦ Concepts and applications
• Focus will be on administering, scoring, and using CBM for computation
Math CBM – How?
• Math CBM is conducted by having students answer computation problems for two minutes
• Count correct digits – NOT correct problems
• Use standardized procedures
• Have necessary materials
◦ Different but equivalent math sheets
◦ Directions for administration
◦ Writing utensils
◦ Stopwatch
◦ Quiet environment
◦ Scoring rules & procedures
Math CBM – How? (cont.)
• Select appropriate material for probe
• Place probe in front of and facing the student
• Can be administered individually or to an entire class
• Provide directions
• Start timer
• Have student perform task for allotted time (2 minutes for math tasks)
• Score probe
Curriculum-Based Measurement:
Introduction
Administration Practices
• First-time administration
◦ Have three equivalent math sheets
◦ Recommend doing it in one session
◦ Median score of the three samples is used for baseline (first data point on graph)
• Every time
◦ Standardized procedures (see pg 156)
◦ Count correct digits
◦ Record on graph
◦ Use for decision making
Scoring Procedures
• Correct answers . . .
◦ Correct answers get credit for the longest method used
◦ Correct digits for unfinished problems
◦ Correct digits for reversed or rotated digits
◦ Correct digits for placeholders – any symbol!
• Errors
◦ Mark with slash (/)
(a digit-counting sketch follows)
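A minimal sketch of correct-digit counting in Python; digits are compared by place value, right-aligned. It illustrates only the basic digit comparison, not the full rule set above (e.g., credit for the longest method or placeholder symbols); the names are illustrative:

def correct_digits(student_answer: str, correct_answer: str) -> int:
    """Count digits in the student's answer that match the correct answer
    at the same place-value position (right-aligned)."""
    return sum(s == c for s, c in zip(reversed(student_answer),
                                      reversed(correct_answer)))

def score_probe(responses: list) -> int:
    """Total correct digits across all attempted (student, correct) pairs."""
    return sum(correct_digits(s, c) for s, c in responses)

# Example: student wrote 352 for 342 and 18 for 18 -> 2 + 2 = 4 correct digits
print(score_probe([("352", "342"), ("18", "18")]))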
Expected Growth Rates & Norms
• Benchmarks – Table 7.1 (p. 109)
• Norms – Table 7.3 (p. 111)
• Growth Rates – Table 7.1 (p. 109)
◦ greater progress is possible
• If a student doesn't make adequate progress, it doesn't mean she lacks the ability to learn math – it means instruction needs to be changed!
Math CBM Sheets
• Math often has a specific scope & sequence that can vary from state to state, school to school, or curriculum to curriculum
• Educators often create math sheets linked to the state's core – progress monitoring aligned with the outcome measure!
• Math sheets should have different problems but equivalent difficulty
• Can be purchased or created
Math Sheets (cont.)
• Items on a sheet are not in the order presented in the curriculum
• Similar item types are grouped diagonally – assists when looking for patterns in student responses (see page 101)
• Consider mixed vs. single skill
• Single-skill sheets are helpful for short-term planning – used to gain some diagnostic information & assist in decision making
CBM – Types & Their Purpose
(see page 15)
• General Outcome Measures (GOMs) – used to sample performance across several goals by using "capstone" tasks that are complex
• Skills-Based Measures (SBMs) – used to screen, progress monitor, & do survey-level assessment where "capstones" are not available
• Mastery Measures (MM) – used on parts of the curriculum that contain discrete & easily identified sets of items
Diagnostic – Can't Do/Won't Do
• Purpose
◦ Determine motivation vs. skill deficit
• Technique
◦ Administer the same probe – add an incentive
◦ Timing – soon after benchmark/screener
• Decision Rules (see the sketch below)
◦ >= 15% increase = motivation (Witt & Beck, 1999)
◦ < 15% increase = skill deficit
◦ Consider both
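A minimal sketch of this decision rule in Python, using the >= 15% threshold from Witt & Beck (1999); the function and variable names are illustrative:

def cant_do_wont_do(baseline_score: float, incentive_score: float,
                    threshold: float = 0.15) -> str:
    """Compare a standard probe score to the same probe readministered
    with an incentive; classify the deficit as motivational or skill-based."""
    gain = (incentive_score - baseline_score) / baseline_score
    if gain >= threshold:
        return "motivation (won't do)"   # performance jumped with incentive
    return "skill deficit (can't do)"    # incentive made little difference

# Example: 40 WCPM on the benchmark, 48 WCPM with an incentive -> 20% gain
print(cant_do_wont_do(40, 48))  # motivation (won't do)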
Diagnostic – Survey-Level Assessment
Purposes
• To determine the appropriate instructional placement level for the student
◦ The highest level of materials at which the student can be expected to benefit from instruction
• To provide baseline data, or a starting point, for progress monitoring
◦ In order to monitor progress toward a future goal, you need to know how the student is currently performing
Survey-Level Assessment – Reading
1. Start with grade-level passages (probes).
2. Administer 3 separate probes (at the same level of difficulty) using standard CBM procedures.
3. Calculate the median score (i.e., the middle score).
4. Is the student's score within the instructional range? (see the sketch after this list)
◦ Yes – this is the student's instructional level.
◦ No – if above range (too easy), administer 3 probes at the next level of difficulty.
◦ No – if below range (too hard), administer 3 probes at the previous level of difficulty.
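A minimal sketch of this search in Python; the instructional-range table and the probe-administration callback are hypothetical placeholders (real ranges come from published norms such as those in The ABCs of CBM):

from statistics import median

# Illustrative instructional ranges (WCPM) by grade level -- an assumption,
# substitute your own norm table.
INSTRUCTIONAL_RANGE = {1: (10, 30), 2: (30, 60), 3: (60, 90), 4: (90, 120)}

def survey_level(administer_probes, start_grade: int) -> int:
    """Move up or down grade levels until the median of three probes falls
    within the instructional range for that level.
    administer_probes(grade) must return a list of three probe scores."""
    grade = start_grade
    while 1 <= grade <= max(INSTRUCTIONAL_RANGE):
        score = median(administer_probes(grade))
        low, high = INSTRUCTIONAL_RANGE[grade]
        if low <= score <= high:
            return grade                        # instructional level found
        grade += 1 if score > high else -1      # too easy: up; too hard: down
    return max(1, min(grade, max(INSTRUCTIONAL_RANGE)))  # clamp at the ends

# Example: a 4th grader scoring below range moves down until scores fit.
scores_by_grade = {4: [55, 50, 58], 3: [70, 68, 75]}
print(survey_level(lambda g: scores_by_grade[g], 4))  # 3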
Reading CBM – Tasks by Grade
• Kindergarten – Letter sound fluency (LSF)
• Grade 1 – Oral reading fluency (ORF) and/or word identification fluency (WIF)
• Grade 2 – Oral reading fluency (ORF)
• Grade 3 – Oral reading fluency (ORF)
• Grade 4+ – Oral reading fluency (ORF) & mazes
See Chapters 3 & 4
Reading CBM – Norms & Growth
(see pages 47 & 49)
• Norms
◦ Compare a student's score to the performance of others in her grade or at her instructional level
◦ Data collected on thousands of students – numbers are very similar
• Growth Rates
◦ Provide an indication of the average number of words per week we would expect students to improve
◦ Not necessarily new words – students reading the same words at a faster rate each week
Survey-Level Assessment – Math
1. Start with grade-level math worksheets (probes).
2. Administer 3 separate probes (at the same level of difficulty) using standard CBM procedures.
3. Calculate the median score (i.e., the middle score).
4. Is the student's score within the instructional range?
◦ Yes – this is the student's instructional level.
◦ No – if above range (too easy), administer 3 probes at the next level of difficulty.
◦ No – if below range (too hard), administer 3 probes at the previous level of difficulty.
Math CBM – Tasks by Grade
• Grade 1 – Addition, subtraction
• Grade 2 – Addition, subtraction
• Grade 3 – Addition, subtraction, multiplication, division
• Grade 4+ – Multiplication, division
• All grades – Mixed math skills
• Secondary – consider common assessments based on state or district core
What if Student Data Don't Reflect Adequate Growth?
• It is our obligation to fix the problem!
• We do NOT lower expectations!
◦ Build up prerequisite skills
◦ Increase length of daily lesson
◦ Alter the way we respond when an error is made
"Learning is a result of instruction, so when the rate of learning is inadequate, it doesn't always mean there is something wrong with the student. It does mean the instruction needs to be changed to better meet the student's needs." (p. 47)
How Often Should Data Be Collected?
Three considerations:
◦ 1. Purpose – screening vs. benchmarking
◦ 2. Importance of task – learning to read vs. learning Roman numerals
◦ 3. Significance of problem – as the student's difficulty increases, so does the need for effective instruction and more frequent monitoring
I – Intervention
P – Progress Monitoring
How to Set & Graph Goals
1. End-of-year benchmarks
2. Norms – levels of performance
3. Rate of progress – goal setting (see the sketch below)
◦ (# of weeks x growth rate) + median baseline = goal
• Students with the greatest deficits need the steepest slopes – more intense & effective interventions
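A minimal sketch of the formula in Python; the numbers in the example are invented for illustration:

from statistics import median

def set_goal(baseline_scores: list, weeks: int, growth_rate: float) -> float:
    """goal = (# of weeks x expected weekly growth rate) + median baseline.
    growth_rate comes from a norms table (e.g., WCPM gained per week)."""
    return weeks * growth_rate + median(baseline_scores)

# Example: baseline probes of 42, 38, 45 WCPM, 30 weeks to the goal date,
# expected growth of 1.0 WCPM per week -> goal of 72 WCPM
print(set_goal([42, 38, 45], weeks=30, growth_rate=1.0))  # 72.0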
CBM – Easy to Display
• One of the main benefits of CBM is that the data are easily displayed in graphs & charts
• Standard graph for CBM: line graph
◦ Vertical axis = skill being measured
◦ Horizontal axis = time or sessions
Graphing – What's Included
• Baseline
• Y-axis label
• X-axis label
• Aim line – goal
• Data points
• Intervention line
• Intervention line label
• Change line
• Change line label
• Level of material being used
• Student demographics
(a plotting sketch follows)
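A minimal plotting sketch in Python with matplotlib, showing most of the elements listed above; all data values are invented for illustration:

import matplotlib.pyplot as plt

weeks = list(range(1, 11))
wcpm = [42, 44, 43, 47, 49, 52, 55, 54, 58, 61]   # weekly probe scores
baseline, goal, goal_week = 42, 72, 30

fig, ax = plt.subplots()
ax.plot(weeks, wcpm, "o-", label="Data points")
ax.plot([0, goal_week], [baseline, goal], "--", label="Aim line (goal)")
ax.axvline(x=5, color="gray", linestyle=":", label="Intervention change")
ax.set_xlabel("School week")                       # X-axis label
ax.set_ylabel("Words correct per minute (WCPM)")   # Y-axis label
ax.set_title("Student A - Grade 3 passages")       # material level & student
ax.legend()
plt.show()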
Decision Rules: What is a "Good" Response to Intervention?
• Positive Response
◦ Gap is closing
◦ Can extrapolate point at which target student(s) will "come in range" of target – even if this is long range
◦ Level of "risk" lowers over time
• Questionable Response
◦ Rate at which gap is widening slows considerably, but gap is still widening
◦ Gap stops widening but closure does not occur
• Poor Response
◦ Gap continues to widen with no change in rate
Response to Intervention
[Figure: performance over time, comparing an expected trajectory (aim line) with observed trajectories illustrating positive, questionable, and poor responses]
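A minimal sketch in Python of one way to operationalize these categories, comparing the student's observed trend (least-squares slope of weekly scores) to the slope of the aim line; the decision thresholds are illustrative, not from the source:

def slope(ys: list) -> float:
    """Least-squares slope of scores over equally spaced sessions."""
    n = len(ys)
    x_mean, y_mean = (n - 1) / 2, sum(ys) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(ys))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

def classify_response(scores: list, aim_slope: float) -> str:
    trend = slope(scores)
    if trend >= aim_slope:
        return "positive"       # gap is closing
    if trend > 0:
        return "questionable"   # improving, but gap still widening
    return "poor"               # flat or declining trend

# Example: gaining ~1.6 WCPM/week against an aim line of 1.0 -> positive
print(classify_response([40, 41, 43, 44, 46, 48], aim_slope=1.0))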
Decision Rules: Linking RtI to Intervention Decisions
Positive
• Continue intervention with current goal
• Continue intervention with goal increased
• Fade intervention to determine if student(s) have acquired functional independence
Decision Rules: Linking RtI to Intervention Decisions
Questionable
• Was intervention implemented as intended?
◦ If no – employ strategies to increase implementation integrity
◦ If yes – increase intensity of current intervention for a short period of time and assess impact. If rate improves, continue. If rate does not improve, return to problem solving.
• Dual Discrepancy
Decision Rules: Linking RtI to Intervention Decisions
Poor
• Was intervention implemented as intended?
◦ If no – employ strategies to increase implementation integrity
◦ If yes –
• Is intervention aligned with the verified hypothesis? (Intervention Design)
• Are there other hypotheses to consider? (Problem Analysis)
• Was the problem identified correctly? (Problem Identification)
• Dual Discrepancy
Sample Graph
[Figure: sample progress-monitoring graph with intervention phases labeled "Peer Reading" and "Sight word practice"]
Using Data to Inform Instruction
• Data Point Analysis – "If-Then" Rules (see the sketch below)
◦ 3-4 successive data points above the aim line – move on (add "weight")
◦ 3-4 successive data points below the aim line – change intervention to boost learning
◦ 3-4 successive data points around the aim line – make no changes
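A minimal sketch of these rules in Python; the aim-line parameters and the four-point window are illustrative:

def aim_line_value(week: float, baseline: float, goal: float,
                   goal_week: float) -> float:
    """Expected score on the aim line at a given week."""
    return baseline + (goal - baseline) * week / goal_week

def if_then_rule(recent: list, baseline: float, goal: float,
                 goal_week: float) -> str:
    """recent: (week, score) pairs for the last 3-4 probes."""
    expected = [aim_line_value(w, baseline, goal, goal_week)
                for w, _ in recent]
    if all(s > e for (_, s), e in zip(recent, expected)):
        return "move on (raise the goal)"
    if all(s < e for (_, s), e in zip(recent, expected)):
        return "change the intervention"
    return "make no changes"

# Example: four probes all under the aim line -> change the intervention
probes = [(6, 44), (7, 45), (8, 45), (9, 46)]
print(if_then_rule(probes, baseline=42, goal=72, goal_week=30))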
Graph – Analyze
[Figure: example graphs for analysis]
Graphing the Data
• By hand
• 5 clicks in Excel
• www.InterventionCentral.com
• www.updc.org
◦ CBM Focus
◦ PM Focus
Evidence-Based Education (adapted from Burns & Riley-Tillman, 2010)
CBM Data & IEP Development
• CBM data are an excellent source for writing goals & objectives
• CBM data provide information on specific student skills (e.g., ORF, math facts)
• CBM data are sensitive to student improvement
• CBM data allow teachers to make instructional decisions
Planning & Using CBM
10 Steps for using CBM
• 1. Who will be using CBM?
• 2. Which skills of CBM will be implemented?
• 3. What materials will be used?
• 4. When will implementation start?
• 5. Who will train the staff?
• 6. Who will manage the materials?
• 7. Who will collect the data?
• 8. Where will the data be collected?
• 9. Who will manage the data once collected?
• 10. How will data be shared?
(Checklist for Using CBM – pp. 160-161)
Additional Resources
• Updc.org
• The ABCs of CBM
• One-Minute Academic Functional Assessment and Interventions: "Can't" Do It…or "Won't" Do It?
• Functional Assessments: A Step-by-Step Guide to Solving Academic and Behavior Problems
• Implementing Response-to-Intervention in Elementary and Secondary Schools (Burns & Gibbons, 2008)
• I've DIBEL'd, Now What?
• AIMSweb.com
• Interventioncentral.org
• Pre-Referral Intervention Manual
• School Problem Solving Teams
• What Works Clearinghouse: http://ies.ed.gov/ncee/wwc/
• www.Studentprogress.org
• www.CBMnow.org