Response to intervention

Response to Intervention
RTI Data Challenge:
Setting Individual RTI
Academic Goals Using
Research Norms for
Students Receiving ‘Off-Level’ Interventions
Source: Shapiro, E. S. (2008). Best practices in setting progress-monitoring goals for academic skill improvement. In A.
Thomas & J. Grimes (Eds.), Best practices in school psychology V
(pp. 141-157). Bethesda, MD: National Association of School
Psychologists.
www.interventioncentral.org
Response to Intervention
Setting Individual Student RTI Academic
Goals Using Research Norms
To set a goal for student academic performance, four
elements are needed:
• The student’s baseline academic performance. Prior to
starting the intervention, the teacher calculates
baseline performance by assessing the target student
several times with the academic measure that will be
used to measure that student’s progress once the
intervention begins.
• Estimate of ‘typical’ peer performance. The teacher
has a reliable estimate of expected or typical peer
performance on the academic measure that will be
used to measure the target student’s progress.
www.interventioncentral.org
2
Response to Intervention
Setting Individual Student RTI Academic
Goals Using Research Norms
To set a goal for student academic performance, four
elements are needed (cont.):
• Estimate of expected weekly progress. The teacher
selects a rate of weekly academic progress that the
target student is expected to attain if the intervention is
successful.
• Number of weeks for the intervention trial. The teacher
decides on how many weeks the RTI intervention will
last, as the cumulative, final academic goal can be
calculated only when the entire timespan of the
intervention is known.
www.interventioncentral.org
3
Response to Intervention
How to Set a Goal for an ‘Off-Level’ Intervention
1. Comparing Student Performance to Benchmarks and
Flagging Extreme Discrepancies. The student is
administered reading fluency probes equivalent to his or her
current grade placement and the results are compared to
peer norms. If the student falls significantly below the level of
peers, he or she may need additional assessment to
determine whether the student is to receive intervention and
assessment ‘off grade level’.
www.interventioncentral.org
4
Response to Intervention
Example of Progress-Monitoring Off-Level: Randy
In January, Mrs. Chandler, a 4th-grade teacher,
receives her classwide reading fluency screening
results. She notes that a student who has recently
transferred to her classroom, Randy, performed at 35
Words Read Correct (WRC) on the 1-minute
AIMSweb Grade 4 fluency probes.
Mrs. Chandler consults AIMSweb reading-fluency
research norms and finds that a reasonable minimum
reading rate for students by winter of grade 4 (25th
percentile) is 89 WRC.
www.interventioncentral.org
5
Response to Intervention
Example of Progress-Monitoring
Off-Level: Randy
AIMSweb Norms:
‘Typical’ reader (25th
percentile) in Gr 4 at
mid-year (winter
norms): 89 WRC
Target Student Randy:
35 WRC
Conclusion: Randy’s
grade-level
performance is in the
‘frustration’ range.
He requires a Survey-Level Assessment to find his optimal ‘instructional’ level.
Source: AIMSweb® Growth Table Reading-Curriculum Based Measurement: Multi-Year Aggregate: 2006-2007 School Year
www.interventioncentral.org
6
Response to Intervention
How to Set a Goal for an ‘Off-Level’ Intervention
2. Conducting a Survey Level Assessment (SLA). For
students with large discrepancies when compared to
benchmarks, the teacher conducts a SLA to determine the
student’s optimal level for supplemental intervention and
progress-monitoring.
• The teacher administers AIMSweb reading probes from successively earlier grade levels and compares the student’s performance to the benchmark norms for that grade level.
• The student’s ‘instructional’ level for intervention is the first grade level in which his reading-fluency rate falls at or above the 25th percentile according to the benchmark norms.
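The downward search just described can be expressed as a short loop. The sketch below (Python) is illustrative rather than part of the source materials: the function name and dictionaries are assumptions, and the sample values are the ones quoted in the Randy example on the following slides.

```python
# Illustrative Survey Level Assessment search (not from the source document).
# Starting from the student's enrolled grade, step down one grade at a time
# until the median score meets that grade's 25th-percentile benchmark.

def find_instructional_level(median_wrc_by_grade, p25_benchmark_by_grade):
    """Return the highest grade at which the student's median WRC reaches
    the 25th-percentile benchmark, or None if no tested grade qualifies."""
    for grade in sorted(median_wrc_by_grade, reverse=True):
        if median_wrc_by_grade[grade] >= p25_benchmark_by_grade[grade]:
            return grade
    return None

# Randy's survey-level medians (WRC) and the quoted 25th-percentile norms:
randy_medians = {4: 35, 3: 48, 2: 64}
p25_norms = {4: 89, 3: 69, 2: 53}
print(find_instructional_level(randy_medians, p25_norms))  # -> 2
```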
www.interventioncentral.org
7
Response to Intervention
Example of Progress-Monitoring Off-Level: Randy
Because Randy’s reading fluency rate is so far below
the grade-level norms (a gap of 54 WRC), his teacher
decides to conduct a Survey Level Assessment to find
the student’s optimal grade level placement for
supplemental reading instruction.
www.interventioncentral.org
8
Response to Intervention
Example of Progress-Monitoring Off-Level: Randy
9
Survey Level Assessment. The teacher conducts a Survey Level Assessment with Randy, assessing him using CBM reading fluency probes from successively earlier grades until he performs at or above the 25th percentile according to the AIMSweb norms.
On Grade 3-level probes, Randy attains a median score of 48 WRC. The AIMSweb winter norm (25th percentile) for a 3rd-grade student is 69 WRC. The student is still in the ‘frustration’ range, and the Survey Level Assessment continues.
On Grade 2-level probes, Randy attains a median score of 64 WRC. The AIMSweb winter norm (25th percentile) for a 2nd-grade student is 53 WRC. The student is now in the ‘instructional’ range, and the Survey Level Assessment ends.
Source: AIMSweb® Growth Table Reading-Curriculum Based Measurement: Multi-Year Aggregate: 2006-2007 School Year
www.interventioncentral.org
Response to Intervention
How to Set a Goal for an ‘Off-Level’ Intervention
3. Selecting a Progress-Monitoring Goal. To set a
progress-monitoring goal, the teacher looks up the
benchmark WRC for the 50th percentile at the
student’s off-level ‘instructional’ grade level previously
determined through the Survey Level Assessment.
www.interventioncentral.org
10
Response to Intervention
Example of Progress-Monitoring Off-Level: Randy
11
Goal-Setting. To find the progress-monitoring goal for Randy, his teacher looks up the benchmark WRC for the 50th percentile at Grade 2 (his off-level ‘instructional’ grade level), which is 79 WRC. This becomes the progress-monitoring goal for the student.
Source: AIMSweb® Growth Table Reading-Curriculum Based Measurement: Multi-Year Aggregate: 2006-2007 School Year
www.interventioncentral.org
Response to Intervention
How to Set a Goal for an ‘Off-Level’ Intervention
4. Translating a Progress-Monitoring Goal into
Weekly Increments.
The teacher’s final task before starting the progress-monitoring is to translate the student’s ultimate
intervention goal into ‘ambitious but realistic’ weekly
increments.
One useful method for determining weekly growth rates
is to start with research-derived growth norms and to
then use a ‘multiplier’ to make the expected rate of
weekly growth more ambitious.
www.interventioncentral.org
12
Response to Intervention
How to Set a Goal for an ‘Off-Level’ Intervention
4. Translating a Progress-Monitoring Goal into Weekly
Increments. (Cont.)
• The teacher first looks up the average rate of weekly student
growth supplied in the research norms. (NOTE: If available, a
good rule of thumb is to use the growth norms for the 50th
percentile at the ‘off-level’ grade at which the student is
receiving intervention and being monitored.)
• The teacher then multiplies this grade norm for weekly growth
by a figure between 1.5 and 2.0 (Shapiro, 2008). Because the
original weekly growth rate represents a typical rate of student
improvement, using this multiplier to increase the target
student’s weekly growth estimate is intended to accelerate
learning and close the gap separating that student from peers.
www.interventioncentral.org
13
Response to Intervention
Example of Progress-Monitoring Off-Level: Randy
14
Randy’s ultimate goal is 79 WRC (the 50th percentile norm for grade 2). During the Survey Level Assessment, Randy was found to read 64 WRC at grade 2. There is a 15-WRC gap to be closed to get Randy to his goal.
Determining Weekly Rate of Improvement (ROI). Randy is to be monitored on intervention at the 2nd-grade level. The teacher finds, according to AIMSweb norms, that a typical Grade 2 student (at the 50th percentile) has a rate of improvement of 1.1 WRC per week.
She multiplies the 1.1 WRC-per-week figure by 1.8 (teacher judgment) to obtain a weekly growth goal for Randy of about 2.0 additional WRC.
At 2 additional WRC per week on intervention, Randy should close the gap within about 8 instructional weeks.
Source: AIMSweb® Growth Table Reading-Curriculum Based Measurement: Multi-Year Aggregate: 2006-2007 School Year
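A minimal worked version of the calculation on this slide, using only the figures quoted above (Python; the variable names are illustrative):

```python
# Randy's off-level goal arithmetic, as described on the slide above.
baseline_wrc = 64      # grade-2 median from the Survey Level Assessment
goal_wrc = 79          # grade-2 50th-percentile benchmark
typical_roi = 1.1      # typical grade-2 growth in WRC per week (AIMSweb norms)
multiplier = 1.8       # teacher judgment, within the 1.5-2.0 range

gap = goal_wrc - baseline_wrc                 # 15 WRC to close
accelerated_roi = typical_roi * multiplier    # 1.98, i.e., about 2.0 WRC/week
weeks_needed = gap / accelerated_roi          # about 7.6 -> roughly 8 weeks
print(gap, round(accelerated_roi, 1), round(weeks_needed, 1))
```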
www.interventioncentral.org
Response to Intervention
How to Set a Goal for an ‘Off-Level’ Intervention
5. Advancing the Student to Higher Grade Levels for
Intervention and Progress-Monitoring
The teacher monitors the student’s growth in reading fluency
at least once per week (twice per week is ideal).
• When the student’s reading fluency exceeds the 50th
percentile in Words Read Correct for his or her ‘off-level’
grade, the teacher reassesses the student’s reading fluency
using AIMSweb materials at the next higher grade.
• If the student performs at or above the 25th percentile on
probes from that next grade level, the teacher advances the
student and begins to monitor at the higher grade level.
• The process repeats until the student eventually closes the
gap with peers and is being monitored at grade of placement.
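One way to read these decision rules is as a small function. The sketch below (Python) is an illustration, not part of the source; the function and dictionary names are assumptions, and the worked numbers match the Randy example on the next slide.

```python
# Illustrative decision rule for advancing the monitoring level (step 5).

def next_monitoring_grade(offlevel_wrc, monitor_grade, next_grade_wrc,
                          p50_benchmarks, p25_benchmarks):
    """Return the grade at which progress-monitoring should continue."""
    if offlevel_wrc < p50_benchmarks[monitor_grade]:
        return monitor_grade            # still below the 50th percentile: stay
    if next_grade_wrc >= p25_benchmarks[monitor_grade + 1]:
        return monitor_grade + 1        # advance intervention and monitoring
    return monitor_grade                # exceeded 50th %ile but not ready yet

# Randy after 7 weeks: 82 WRC at grade 2; 72 WRC on grade-3 reassessment probes.
p50 = {2: 79}   # grade-2 50th-percentile benchmark (winter norms)
p25 = {3: 69}   # grade-3 25th-percentile cut-off
print(next_monitoring_grade(82, 2, 72, p50, p25))  # -> 3
```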
www.interventioncentral.org
15
Response to Intervention
Example of Progress-Monitoring Off-Level: Randy
16
Advancing the Student to Higher Grade Levels for Progress-Monitoring. His teacher, Mrs. Chandler, notes that after 7 weeks of intervention, Randy is now reading 82 WRC, exceeding the 79 WRC for the 50th percentile of students in Grade 2 (winter norms).
Advancing the Student to Higher Grade Levels (Cont.). So Mrs. Chandler assesses Randy on AIMSweb reading fluency probes for Grade 3 and finds that he reads on average 72 WRC, exceeding the Grade 3 25th percentile cut-off of 69 WRC. Therefore, Randy is advanced to Grade 3 progress-monitoring and his intervention materials are adjusted accordingly.
Source: AIMSweb® Growth Table Reading-Curriculum Based Measurement: Multi-Year Aggregate: 2006-2007 School Year
www.interventioncentral.org
Response to Intervention
How to Set a Student Goal: Example (Cont.)
The reading intervention planned for Randy would last
8 instructional weeks. Mrs. Chandler consulted the
research norms and noted that a typical rate of growth
in reading fluency for a 4th-grade student is 0.9
additional words per week.
Mrs. Chandler adjusted the 0.9 word growth rate for
Randy upward by multiplying it by 1.5 because she
realized that he needed to accelerate his learning to
catch up with peers. When adjusted upward, the
weekly growth rate for Randy increased from 0.9 to
1.35 additional words per minute.
www.interventioncentral.org
18
Response to Intervention
How to Set a Student Goal: Example (Cont.)
Multiplying the expected weekly progress of 1.35
additional words by the 8 weeks of the intervention,
Mrs. Chandler found that Randy should acquire at
least 11 additional words of reading fluency by the
conclusion of the intervention.
She added the 11 words per minute to Randy’s
baseline of 70 words per minute and was able to
predict that—if the 8-week intervention was
successful—Randy would be able to read
approximately 81 words per minute.
www.interventioncentral.org
19
Response to Intervention
How to Set a Student Goal: Example (Cont.)
Because Randy would not be expected to fully close
the gap with peers in 8 weeks, Mrs. Chandler
regarded her intervention goal of 81 words per minute
as an intermediate rather than a final goal.
However, if the intervention was successful and the
student continued to add 1.35 words per week to his
reading fluency, he could be expected to reach an
acceptable level of fluency soon.
www.interventioncentral.org
20
Response to Intervention
How to Monitor a Student Off-Level
• Conduct a ‘survey level’ assessment of the student to find their
highest ‘instructional’ level (between 25th and 50th percentile).
• Student is monitored at ‘off level’ during the intervention (e.g.,
weekly). The ‘goal’ is to move the student up to the 50th percentile.
• Once per month, the student is also assessed at grade level to
monitor grade-appropriate performance.
• When the student moves above 50th percentile on off-level, the
interventionist tests the student on the next higher level. If the
student performs above the 25th percentile on the next level,
monitoring starts at the new, higher level.
www.interventioncentral.org
21
Response to Intervention
Sample Reading Fluency Norms
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33].
Eugene, OR: University of Oregon.
www.interventioncentral.org
22
Response to Intervention
Activity: Academic Goal-Setting
At your tables:
• Review the guidelines for academic goal-setting using
research-based norms.
• How can you promote the use of this goal-setting approach
in your school?
www.interventioncentral.org
23
Response to Intervention
How to Set a Student Goal
Determining Weekly Growth Rate: Method 2:
If research norms with ‘ambitious’ rates of student growth are
available, these can be used to determine the student’s
weekly expected Rate of Improvement.
www.interventioncentral.org
25
Response to Intervention
Example of Research-Based Norms for Weekly
Reading Fluency Growth
Predictions for Rates of Reading Growth by Grade
(Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993)
Increase in Correctly Read Words Per Minute for Each Instructional Week
Grade Level | Realistic Weekly Goal | Ambitious Weekly Goal
Grade 1     | 2.0                   | 3.0
Grade 2     | 1.5                   | 2.0
Grade 3     | 1.0                   | 1.5
Grade 4     | 0.85                  | 1.1
Grade 5     | 0.5                   | 0.8
Grade 6     | 0.3                   | 0.65
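For reference, the table above can be held as a simple lookup. A sketch (Python); the function and constant names are assumptions for illustration:

```python
# Fuchs et al. (1993) weekly growth norms (words correct per minute per week).
WEEKLY_GROWTH = {
    1: (2.0, 3.0),   # grade: (realistic weekly goal, ambitious weekly goal)
    2: (1.5, 2.0),
    3: (1.0, 1.5),
    4: (0.85, 1.1),
    5: (0.5, 0.8),
    6: (0.3, 0.65),
}

def weekly_goal(grade, ambitious=False):
    realistic, ambitious_rate = WEEKLY_GROWTH[grade]
    return ambitious_rate if ambitious else realistic

print(weekly_goal(4))                  # 0.85 (realistic)
print(weekly_goal(4, ambitious=True))  # 1.1 (ambitious)
```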
www.interventioncentral.org
26
Response to Intervention
RTI Lab: Creating District
Decision Rules for Analyzing
RTI Data to Determine LD
Eligibility
Jim Wright
www.interventioncentral.org
www.interventioncentral.org
Response to Intervention
RTI Data & LD Determination: Agenda…
1. Learning Disabilities in the Age of RTI:
Introduction
2. Analyzing Student Academic Risk: Performance
Level and Rate of Improvement
3. Evaluating a Student’s ‘Non-Responder’ Status: A
Comprehensive Checklist
4. Developing Your District’s Decision Rules for
Using RTI Data to Determine ‘Non-Response’
Status: First Steps
www.interventioncentral.org
Response to Intervention
“
The process by which public schools identify
students as learning disabled often appears to be
confusing, unfair, and logically inconsistent. In fact,
G. Reid Lyon of the National Institute of Child
Health and Human Development has suggested
that the field of learning disabilities is a sociological
sponge whose purpose has been and is to clean
up the spills of general education. (Gresham,
2001)
”
Source: Gresham, F. M.. (2001). Responsiveness to intervention: An alternative approach to the identification of learning
disabilities. Paper presented at the Learning Disabilities Summit, Washington DC.
www.interventioncentral.org
31
Response to Intervention
RTI & Special Education
Eligibility
www.interventioncentral.org
Response to Intervention
Special Education Eligibility & RTI:
Establishing Confidence at Every Link
• Special Education Eligibility Teams review
the CUMULATIVE RTI information collected in general
education (‘intervention audit’).
• If that Team lacks confidence in any one of the links in the
RTI chain, it will be difficult to identify the student as an RTI
‘non-responder’
• The goal of this workshop is to help schools to identify each
link in the RTI chain and to know how to measure the
quality of that link.
www.interventioncentral.org
33
Response to Intervention
RTI Assumption: Struggling Students Are ‘Typical’
Until Proven Otherwise…
RTI logic assumes that:
– A student who begins to struggle in general education is typical,
and that
– It is general education’s responsibility to find the instructional
strategies that will unlock the student’s learning potential
Only when the student shows through well-documented
interventions that he or she has ‘failed to respond to
intervention’ does RTI begin to investigate the possibility
that the student may have a learning disability or other
special education condition.
www.interventioncentral.org
34
Response to Intervention
RTI ‘Pyramid of
Interventions’
Tier 3
Tier 2
Tier 1
Tier 3: Intensive interventions.
Students who are ‘non-responders’ to Tiers 1 & 2 are
referred to the RTI Team for more
intensive interventions.
Tier 2: Individualized
interventions. Subset of
students receive interventions
targeting specific needs.
Tier 1: Universal interventions.
Available to all students in a
classroom or school. Can consist
of whole-group or individual
strategies or supports.
www.interventioncentral.org
35
Response to Intervention
What previous approach to diagnosing
Learning Disabilities does RTI replace?
Prior to RTI, many states used a ‘Test-Score Discrepancy Model’ to
identify Learning Disabilities.
• A student with significant
academic delays would be
administered a battery of
tests, including an
intelligence test and
academic achievement test(s).
• If the student was found to
have a substantial gap between a higher IQ score and lower
achievement scores, a formula was used to determine if that gap
was statistically significant and ‘severe’.
• If the student had a ‘severe discrepancy’ [gap] between IQ and
achievement, he or she would be diagnosed with a Learning
Disability.
www.interventioncentral.org
36
Response to Intervention
‘Dual-Discrepancy’: RTI Model of Learning Disability (Fuchs 2003)
[Figure labels: Avg Classroom Academic Performance Level; Target Student; Discrepancy 1: Skill Gap (Current Performance Level); Discrepancy 2: Gap in Rate of Learning (‘Slope of Improvement’)]
www.interventioncentral.org
37
Response to Intervention
Current NYS Definition of ‘Learning Disabled’
www.interventioncentral.org
38
Response to Intervention
RTI Information: What It Does and Doesn’t Do
• The primary purpose for the special education eligibility
team to evaluate general-education RTI information is
to rule out instructional explanations for the struggling
student’s academic concerns.
• RTI information does not in and of itself provide detailed
information to allow schools to draw conclusions about
a student’s possible neurological differences that make
up the construct ‘learning disabilities’.
• Therefore, RTI information allows for a rule-out (the
learning problem resides within the student, not the
classroom) but does not in and of itself provide positive
evidence of a learning disability.
www.interventioncentral.org
39
Response to Intervention
Using RTI Information to Identify the ‘Non-Responding’ Student: Goodbye, Gate
• As a special education eligibility team adopts a process for
evaluating a student’s RTI information as a ‘non-responder’ to
intervention as part of an evaluation for learning disabilities,
the team will discover that there is no longer a single
‘actuarial number’ or gate to determine ‘risk’ of LD in the
manner of a test score discrepancy analysis.
• Therefore, the special education eligibility team must have
confidence in the quality of the intervention and assessment
programs available to the struggling student in the general
education setting.
• Today’s workshop is about increasing that level of confidence.
www.interventioncentral.org
40
Response to Intervention
Team Activity: What Are Your Major
Challenges in Using RTI Data to Help to
Determine Special Education Eligibility?
What are the major challenge(s) that
your school or district faces as you
make the transition to using RTI data to
help to make special education
eligibility decisions?
www.interventioncentral.org
41
Response to Intervention
Evaluating a Student’s ‘Non-Responder’ Status: An RTI
Checklist
www.interventioncentral.org
Response to Intervention
Evaluating a Student’s ‘Non-Responder’
Status: An RTI Checklist
Interventions: Evidence-Based & Implemented With Integrity
• Tier 1: High-Quality Core Instruction
• Tier 1: Classroom Intervention
• Tier 2 & 3 Interventions: Minimum Number & Length
• Tier 2 & 3 Interventions: Essential Elements
• Tier 1, 2, & 3 Interventions: Intervention Integrity
www.interventioncentral.org
44
Response to Intervention
Evaluating a Student’s ‘Non-Responder’
Status: Activity
At your table: Review these ‘RTI Non-Responder’
elements.
• Tier 1: High-Quality Core Instruction
• Tier 1: Classroom Intervention
• Tier 2 & 3 Interventions: Minimum Number &
Length
• Tier 2 & 3 Interventions: Essential Elements
• Tier 1, 2, & 3 Interventions: Intervention Integrity
Select the element that you see as your school or
district’s greatest challenge. Brainstorm ideas to
positively address that challenge.
www.interventioncentral.org
45
Response to Intervention
Evaluating a Student’s ‘Non-Responder’
Status: An RTI Checklist
Academic Screenings: General Outcome Measures and Skill-Based Measures
• Selection of Academic Screening Measures
• Local Norms Collected via Gradewide Academic
Screenings at Least 3 Times Per Year
www.interventioncentral.org
51
Response to Intervention
Evaluating a Student’s ‘Non-Responder’
Status: An RTI Checklist
Dual Discrepancy Cut-Offs: Academic Skill Level and Student
Rate of Improvement
• Cut-point Established to Define ‘Severely Discrepant’
Academic Performance
• Cut-Off Criterion Selected to Define Discrepant Slope
www.interventioncentral.org
54
Response to Intervention
Evaluating a Student’s ‘Non-Responder’
Status: An RTI Checklist
Data Collection
• Use of Both ‘Off-Level’ and Enrolled Grade-Level
Benchmarks & Progress-Monitoring Measures to
Assess Student Skills and Growth
• Student Baseline Calculated
• Student Goal Calculated
• Regular Progress-Monitoring Conducted
www.interventioncentral.org
57
Response to Intervention
Evaluating a Student’s ‘Non-Responder’
Status: An RTI Checklist
Application of RTI Decision Rules to a Particular Student Case
• Despite the Tier 2/3 Interventions Attempted, the
Student’s Skills Continue to Fall Below the Boundary of
‘Severely Discrepant’ Academic Performance
• Despite the Tier 2/3 Interventions Attempted, the
Student’s Rate of Improvement (Slope) Continues to Be
Discrepant
www.interventioncentral.org
62
Response to Intervention
Special Education Eligibility Team & RTI
Information: Recommendations
• Create guidelines for general education to use to
determine whether a student is a ‘non-responder’ under
RTI. NOTE: Such guidelines are for the sole use of
general education and should not be interpreted as RTI
‘special education eligibility criteria’.
• Create a checklist for schools to collect, collate, and
‘package’ RTI information for presentation to the
Special Education Eligibility Team.
www.interventioncentral.org
64
Response to Intervention
Special Education Eligibility Team & RTI
Information: Recommendations (Cont.)
• RTI information should be reviewed prior to the initial LD
eligibility meeting. If there is questionable information, the
Special Education Eligibility Team should contact the
school to clarify questions.
• At the actual eligibility meeting, any concerns or questions
about the RTI information should be framed in neutral
terms and tied to the dual discrepancy RTI LD model.
Whenever possible, schools should not feel ‘blamed’ for
shortcomings of RTI information and should feel that the
identification process is transparent.
www.interventioncentral.org
65
Response to Intervention
Special Education Eligibility Team & RTI
Information: Recommendations (Cont.)
It should be an expectation that at eligibility meetings:
• The Special Education Eligibility Team can ask for
clarification of any building RTI information presented
• The Team is able to articulate how it interprets information
and why it reaches its decision.
www.interventioncentral.org
66
Response to Intervention
Evaluating a Student’s ‘Non-Responder’
Status: Activity
At your table: Discuss how your school or district may
use the document Evaluating a Student’s ‘Non-Responder’ Status: An RTI Checklist to:
• increase compliance at every link in the ‘RTI chain’
• develop specific decision rules for determining
whether a student referred for a possible Learning
Disability is a ‘non-responder’ to intervention
Be prepared to share the main points of your
discussion with the large group.
www.interventioncentral.org
67
Response to Intervention
Assessing Intervention Integrity
Jim Wright
www.interventioncentral.org
www.interventioncentral.org
Response to Intervention
Why Assess Intervention Integrity?
When a struggling student fails to respond adequately to a
series of evidence-based interventions, that student is
likely to face significant and potentially negative
consequences, such as failing grades, long-term
suspension from school, or even placement in special
education.
It is crucial, then, that the school monitor the integrity with
which educators implement each intervention plan so that
it can confidently rule out poor or limited implementation
of the intervention as a possible explanation for any
student’s ‘non-response’.
www.interventioncentral.org
69
Response to Intervention
Intervention Integrity Check: Direct Observation
Intervention integrity is best assessed through direct
observation (Roach & Elliott, 2008).
– The key steps of the intervention are defined and
formatted as an observational checklist.
– An observer watches as the intervention is
conducted and checks off on the checklist those
steps that were correctly carried out. The observer
then computes the percentage of steps correctly
carried out.
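The percentage-of-steps figure described above amounts to a simple ratio. A sketch (Python), with a hypothetical checklist:

```python
# Hypothetical observation checklist: True = step carried out correctly.
steps_observed = {
    "Reviews error-correction procedure with student": True,
    "Times a 3-minute repeated-reading passage": True,
    "Records words read correctly": False,
    "Delivers praise for beating the previous score": True,
}

pct_integrity = 100 * sum(steps_observed.values()) / len(steps_observed)
print(f"Steps correctly carried out: {pct_integrity:.0f}%")  # 75%
```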
www.interventioncentral.org
70
Response to Intervention
Limitations of Direct Observation as an Intervention Integrity Check
• Direct observations are time-consuming to conduct.
• Teachers who serve as interventionists may at least
initially regard observations of their intervention
implementation as evaluations of their job performance,
rather than as a child-focused RTI “quality check”.
• An intervention-implementation checklist typically does
not distinguish between--or differentially weight--those
intervention steps that are more important from those
that are less so. If two teachers implement the same 10-step intervention plan, for example, with one instructor
omitting a critical step and the other omitting a fairly
trivial step, both can still attain the same implementation
score of steps correctly completed.
Source: Gansle, K. A., & Noell, G. H. (2007). The fundamental role of intervention implementation in assessing response to
intervention. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Response to intervention: The science and
practice of assessment and intervention (pp. 244-251).
www.interventioncentral.org
71
Response to Intervention
‘Yes/No’ Step-by-Step Intervention Check
Each Step Marked ‘Negotiable’ or ‘Non-Negotiable’
Intervention Script Builder
www.interventioncentral.org
Response to Intervention
Supplemental Methods to Collect Data About Intervention Integrity
Teacher Self-Ratings: As a form of self-monitoring,
directing interventionists to rate the integrity of their own
interventions may prompt higher rates of compliance
(e.g., Kazdin, 1989). However, because teacher self-ratings tend to be ‘upwardly biased’ (Gansle & Noell,
2007, p. 247), they should not be relied upon as the sole
rating of intervention integrity. One suggestion for
collecting regular teacher reports on intervention
implementation in a convenient manner is to use Daily
Behavior Reports (DBRs; Chafouleas, Riley-Tillman, &
Sugai, 2007).
Sources:
Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention and
instruction. New York: Guilford Press.
Gansle, K. A., & Noell, G. H. (2007). The fundamental role of intervention implementation in assessing response to
intervention. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Response to intervention: The science and
practice of assessment and intervention (pp. 244-251).
Kazdin, A. E. (1989). Behavior modification in applied settings (4th ed.). Pacific Grove, CA: Brooks/Cole.
www.interventioncentral.org
73
Response to Intervention
Teacher
Intervention
Integrity Self-Rating
Intervention Contact
Log
www.interventioncentral.org
74
Response to Intervention
Supplemental Methods to Collect Data About Intervention Integrity
Intervention Permanent Products: If an intervention plan
naturally yields permanent products (e.g., completed
scoring sheets, lists of spelling words mastered,
behavioral sticker charts), these products can be
periodically collected and evaluated as another indicator
of intervention integrity (Gansle & Noell, 2007).
Source:
Gansle, K. A., & Noell, G. H. (2007). The fundamental role of intervention implementation in assessing response to
intervention. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Response to intervention: The science and
practice of assessment and intervention (pp. 244-251).
www.interventioncentral.org
75
Response to Intervention
Intervention Integrity: Verify Through a Mix of Information Sources
Schools should consider monitoring intervention integrity
through a mix of direct and indirect means, including
direct observation and permanent products (Gansle &
Noell, 2007), as well as interventionist self-ratings (Roach
& Elliott, 2008).
Source:
Gansle, K. A., & Noell, G. H. (2007). The fundamental role of intervention implementation in assessing response to
intervention. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Response to intervention: The science and
practice of assessment and intervention (pp. 244-251).
Roach, A. T., & Elliott, S. N. (2008). Best practices in facilitating and evaluating intervention integrity. In A. Thomas & J. Grimes
(Eds.), Best practices in school psychology V (pp. 195-208).
www.interventioncentral.org
76
Response to Intervention
‘Selecting Methods to Track Intervention Integrity’…
www.interventioncentral.org
77
Response to Intervention
‘Selecting Methods to Track Intervention Integrity’…
www.interventioncentral.org
78
Response to Intervention
‘Selecting Methods to Track Intervention Integrity’…
www.interventioncentral.org
79
Response to Intervention
‘Selecting Methods to Track Intervention Integrity’…
www.interventioncentral.org
80
Response to Intervention
Team Activity: Measuring ‘Intervention Follow-Through’
At your table:
• Brainstorm ways that your school or district can measure intervention integrity for math and writing interventions.
• What preparations are necessary to introduce these methods for measuring ‘intervention follow-through’ to your faculty?
www.interventioncentral.org
81
Response to Intervention
Student Academic Performance:
How to Determine ‘Discrepancy’
www.interventioncentral.org
Response to Intervention
‘Dual-Discrepancy’: RTI Model of Learning Disability (Fuchs 2003)
[Figure labels: Avg Classroom Academic Performance Level; Target Student; Discrepancy 1: Skill Gap (Current Performance Level); Discrepancy 2: Gap in Rate of Learning (‘Slope of Improvement’)]
www.interventioncentral.org
83
Response to Intervention
Definition: Parallax
“an apparent displacement or difference in the apparent
position of an object viewed along two different lines of
sight, and is measured by the angle or semi-angle of
inclination between those two lines.”
Source: Parallax. (2010, August 11). In Wikipedia, The Free Encyclopedia. Retrieved 09:33, August 13, 2010, from
http://en.wikipedia.org/w/index.php?title=Parallax&oldid=378400891
www.interventioncentral.org
84
Response to Intervention
Parallax and Academic Norms
When comparing the academic performance of a struggling
student to peers, schools should pay attention to both external
(research) academic norms and local norms. This dual
perspective can simultaneously compare the student’s skills to
‘national’ norms as well as determine how discrepant the
student’s skills are when compared to other children in his or
her classroom.
Source: Shapiro, E. S. (2008). Best practices in setting progress-monitoring goals for academic skill improvement. In A.
Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 141-157). Bethesda, MD: National Association of School
Psychologists.
www.interventioncentral.org
85
Response to Intervention
Defining ‘Discrepant’ Academic Performance: Do
We Use External Norms or Local Norms?
External (Research or Benchmark) Norms: Used to
compare the performance of a student or instructional
program to objective external/research/national norms.
External norms can help to answer these questions:
– Is the school’s core program successful (comparison of
local to research norms)?
– Is a child performing at a minimum level of competency
in the academic skill to allow us to predict future
success?
– What objective academic performance cut-off should be
set to determine student entry into and exit from Tier 2
and 3 intervention programs?
www.interventioncentral.org
86
Response to Intervention
Defining ‘Discrepant’ Academic Performance: Do
We Use External Norms or Local Norms?
Local Norms: Rank-ordered compilation of scores of students
within a particular grade level/school. Local norms are
used to help answer these questions:
– What is the typical range of student ability in the grade
level or school?
– How is a particular student performing relative to other
children in the grade level or school?
– How much effort must a teacher exert to instruct this
student relative to other students in the class?
www.interventioncentral.org
87
Response to Intervention
Baylor Elementary School : Grade Norms: Correctly Read Words Per Min : Sample Size: 23 Students
Group Norms: Correctly Read Words Per Min: Book 4-1: Raw Data
31 34 34 39 41 43 52 55 59 61 68 71 74 75 85 89 102 108 112
115 118 118 131
LOCAL NORMS EXAMPLE: Twenty-three 4th-grade students
were administered oral reading fluency Curriculum-Based
Measurement passages at the 4th-grade level in their school.
In their current number form, these data are not easy to
interpret.
So the school converts them into a visual display, a box-plot, to show the distribution of scores and to convert the scores to percentile form.
When Billy, a struggling reader, is screened in CBM reading
fluency, he shows a SIGNIFICANT skill gap when compared to
his grade peers.
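The conversion from raw scores to quartiles can be done directly from the local data set. A sketch (Python), using only the raw scores and Billy’s score quoted on these slides:

```python
import statistics

# Baylor Elementary grade-4 winter screening: 23 correctly-read-words scores.
scores = [31, 34, 34, 39, 41, 43, 52, 55, 59, 61, 68, 71, 74, 75,
          85, 89, 102, 108, 112, 115, 118, 118, 131]

q1, median, q3 = statistics.quantiles(scores, n=4)
print(q1, median, q3)   # 43, 71, 108: the quartiles shown in the box-plot

billy = 19
print(f"Billy reads {billy} WRC, below every score in the local sample")
```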
www.interventioncentral.org
88
Response to Intervention
Baylor Elementary School : Grade Norms: Correctly Read Words Per Min : Sample Size: 23 Students
January Benchmarking
Group Norms: Correctly Read Words Per Min: Book 4-1: Raw Data
31 34 34 39 41 43 52 55 59 61 68 71 74 75 85 89 102 108 112
115 118 118 131
National Reading Norms: 112 CRW Per Min
Group Norms: Converted to Box-Plot
Low Value = 31 | 1st Quartile = 43 | Median (2nd Quartile) = 71 | 3rd Quartile = 108 | Hi Value = 131
Billy = 19
[Box-plot of Correctly Read Words, Book 4-1, scale 0-160]
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33]. Eugene, OR: University of Oregon.
www.interventioncentral.org
89
Response to Intervention
How Do We Define ‘Discrepant Academic
Performance’ as One of the RTI ‘Dual Discrepancies’?
Using norms (benchmarks or local), the Special
Education Eligibility Team can set a cut-point, below
which a student is considered ‘severely discrepant’ from
their peers in a specific academic skill.
www.interventioncentral.org
90
Response to Intervention
How Do We Define ‘Discrepant Academic Performance’
as One of the RTI ‘Dual Discrepancies’? (Cont.)
Based on either external benchmarks or building-wide screening
results, schools need to quantify the lower and upper range of
academic performance that identifies a student as requiring Tier
2 or 3 supplemental intervention services.
A student with screening results:
– below the LOW cut-point is defined as having ‘severely
discrepant’ academic skills when compared with these local
norms and would benefit from Tier 3 intervention services.
– between the LOW and HIGH values is at lesser academic
risk and would benefit from Tier 2 intervention services.
– above the HIGH value does not require supplemental
interventions.
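As a sketch of the classification logic just described (Python; the cut-point values are illustrative placeholders, not district policy):

```python
LOW_CUT = 31    # screening score at the chosen low percentile (illustrative)
HIGH_CUT = 60   # screening score at the chosen high percentile (illustrative)

def tier_for_screening_score(score):
    if score < LOW_CUT:
        return 3    # 'severely discrepant' -> Tier 3 intervention services
    if score <= HIGH_CUT:
        return 2    # lesser academic risk -> Tier 2 intervention services
    return 1        # no supplemental intervention required

for s in (27, 45, 88):
    print(s, "-> Tier", tier_for_screening_score(s))   # 3, 2, 1
```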
www.interventioncentral.org
91
Response to Intervention
EasyCBM Norms: Selecting Performance ‘Cut-Points’ for Tier 2/3 Services
[Figure: RTI Tiers 1, 2, and 3 with performance cut-points of 51 WPM (20th percentile) and 31 WPM (10th percentile)]
Source: EasyCBM: (2010). Interpreting the EasyCBM progress monitoring test
results. Retrieved February 22, 2011, from
http://www.easycbm.com/static/files/pdfs/info/ProgMonScoreInterpretation.pdf
www.interventioncentral.org
Response to Intervention
Defining ‘Discrepant Academic Performance’: Example
Baylor Elementary School conducted a winter screening
of its 3rd-grade students in Oral Reading Fluency (ORF).
The school set the 10th percentile as being the low
academic cut-point, defining the cross-over to Tier 3
support, and the 30th percentile as being the high cut-point, defining a student who would need no
supplemental intervention. The Tier 2 range, then, was
10th to 30th percentile.
Baylor compared its local norm results to
research/external ORF norms (Tindal, Hasbrouck &
Jones, 2005).
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical
report #33]. Eugene, OR: University of Oregon.
www.interventioncentral.org
93
Response to Intervention
Example: Baylor Elementary School/Smithtown School District
Grade 3: Winter Screening: Oral Reading Fluency: Words Correct Per Minute
External/Research Norms: 10-30 %ile
36-69 WCPM = Tier 2 Range
Local (Building) Norms: 10-30 %ile
31-60 WCPM = Tier 2 Range
[Figure: box-plots of Correct Words Per Minute, scale 0-160]
www.interventioncentral.org
94
Response to Intervention
Using Screening Results to Define ‘Discrepant’ Academic
Performance: Local vs. External Fluency Norms
www.interventioncentral.org
95
Response to Intervention
Defining ‘Discrepant Academic Performance’: Example
Because the external/research norms were higher than its local
norms, Baylor School used those external norms to identify
students who needed Tier 2 and 3 services.
Using external/research norms prevented the school from
under-identifying students with serious academic deficits simply
because they happened to be in an underperforming school.
Of course, the school also realized, based on the low local
norms, that it needed to concentrate on improving core
instruction to increase the reading performance of all students.
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical
report #33]. Eugene, OR: University of Oregon.
www.interventioncentral.org
96
Response to Intervention
RTI & Special Education Eligibility: Sample Case
• A student is found to fall within the ‘some risk’ Tier 2 level in oral
reading fluency using external norms (student performed above
the 20th percentile).
• The student was found to fall at the 8th percentile using local
performance norms.
• The school believes that the student is learning disabled.
• How might you respond to this issue if you were a member of the
Special Education Eligibility Team at an initial eligibility meeting?
www.interventioncentral.org
97
Response to Intervention
Team Activity: What is the ‘Reachable, Teachable’ Range?
• What do you believe should be the
percentile cut-off (5%? 10%? 20%?) that
your district might set that would
demarcate the boundary between Tier 2
reading services (the student can be
maintained with strong core instruction
and supplemental RTI support) and Tier 3
(the student needs intensive support and
may eventually be a candidate for a
special education referral)?
www.interventioncentral.org
98
Response to Intervention
Student ‘Rate of Improvement’
(Slope): How to Determine
‘Discrepancy’
www.interventioncentral.org
Response to Intervention
Estimating Student Rate of Improvement: What Are the
Relative Advantages of External vs. Local Growth Norms?
• External (Research-Derived) Growth Norms: Provide a
general estimate of the expected academic growth of a
‘typical’ student that can be applied across many academic
settings. However, these norms may not be representative of
student performance at a particular school.
NOTE: Research-derived growth norms are likely to serve as
the primary comparison at the Special Education Eligibility
Team meeting of whether a student is ‘severely discrepant’ in
rate of learning/slope of improvement.
www.interventioncentral.org
100
Response to Intervention
Research-Derived Rates of Academic Growth/Slope: Example
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33].
Eugene, OR: University of Oregon.
www.interventioncentral.org
101
Response to Intervention
Estimating Student Rate of Improvement: What Are the
Relative Advantages of External vs. Local Growth Norms?
• Local Growth Norms: Provide an estimate of typical growth
for students within a particular school’s population. This
provides insight into current levels of student achievement
and the effectiveness of instruction in that building. However,
these results cannot easily be applied to other dissimilar
academic settings.
NOTE: Local growth norms may be used to identify students
in general education who are falling behind their classmates
in academic skills.
www.interventioncentral.org
102
Response to Intervention
Estimating Rates of LOCAL Academic Growth/Slope
Methods for estimating student academic growth can include:
• Calculation of typical rates of student progress from LOCAL academic fluency norms (e.g., fall, winter, and spring oral reading fluency norms collected in a school).
• All scores can be entered into Excel to generate individual student slopes (estimated rate of improvement) and mean/median slope for the group (see the sketch below).
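A spreadsheet is one option; the same slope arithmetic can also be scripted. A sketch (Python 3.10+), with hypothetical benchmark scores:

```python
import statistics

def weekly_slope(weeks, scores):
    """Least-squares slope: words gained per instructional week."""
    return statistics.linear_regression(weeks, scores).slope

benchmark_weeks = [0, 18, 36]              # fall, winter, spring screenings
students = {                               # hypothetical local scores
    "Student A": [42, 61, 80],
    "Student B": [35, 44, 58],
}

slopes = {name: weekly_slope(benchmark_weeks, s) for name, s in students.items()}
print(slopes)                              # individual rates of improvement
print(statistics.median(slopes.values()))  # median slope for the group
```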
Source: Hasbrouck, J., & Tindal, G. (2005). Oral reading fluency: 90 years of measurement. Eugene, OR: Behavioral
Research & Teaching. Retrieved from http://www.brtprojects.org/tech_reports.php.
www.interventioncentral.org
103
Response to Intervention
How Can a Student’s Rate of Improvement (Slope) Be
Analyzed to See If It is ‘Discrepant’?
There is not yet consensus in the research literature about
how to set specific criteria to judge whether a student’s actual
rate of academic progress is discrepant from expectations.
Two possible methods to determine slope discrepancy are
1. direct comparison of the student slope to the local or
external slope estimate—with a cut-off value to signal
that the student is discrepant.
2. calculation of the standard deviation for the slopes of
students in a local norming sample to determine whether
a student falls at least one standard deviation below the
mean slope of the group.
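Method 2 can be sketched as follows (Python); the slope values are hypothetical and stand in for a local norming sample:

```python
import statistics

local_slopes = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.4, 1.0, 1.1, 0.95]  # hypothetical
cutoff = statistics.mean(local_slopes) - statistics.stdev(local_slopes)

student_slope = 0.7
print(f"cut-off: {cutoff:.2f} words/week; "
      f"discrepant: {student_slope < cutoff}")   # discrepant: True
```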
www.interventioncentral.org
104
Response to Intervention
How Can a Student’s Rate of Improvement (Slope) Be
Analyzed to See If It is ‘Discrepant’? (cont.)
GROWTH QUOTIENT: Direct Comparison With Research,
Local Norms. The student slope can be divided by the local or external
slope estimate—with a cut-off value (e.g., 0.75 or below) selected to indicate
when this comparison value (quotient) is discrepant. NOTE: Any slope
comparison quotient <1.0 indicates that the student did not meet growth
expectations.
Example: A 4th-grade student, Darlene, was increasing reading fluency by
0.8 words per week on an intervention. Comparison external research norms
(Hasbrouck & Tindal, 2005) suggest that typical rates of student growth are
0.9 words per week.
Darlene’s rate of progress (0.8 words per week) divided by the research
growth norm (0.9 words per week) yields a quotient of approximately 0.89. This figure shows
that Darlene has fallen below the target for growth (< 1.0) and is making
progress at roughly 90 percent of the target rate.
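The growth quotient itself is a single division. A sketch (Python), using the figures from the Darlene example:

```python
def growth_quotient(student_slope, norm_slope):
    """Values below 1.0 mean the student grew more slowly than the norm."""
    return student_slope / norm_slope

print(round(growth_quotient(0.8, 0.9), 2))   # Darlene: 0.89, below target
```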
www.interventioncentral.org
106
Response to Intervention
How Can a Student’s Rate of Improvement (Slope) Be
Analyzed to See If It is ‘Discrepant’? (cont.)
Direct Comparison With Research, Local Norms.
Advantage: Calculating a quotient that divides the student’s actual rate of
growth by a comparison growth norm (either external or local slope estimate)
provides quick visual feedback about whether the student is exceeding,
meeting, or falling behind the growth represented by the comparison slope.
This quotient figure also gives a visual indication of the magnitude of the
student progress relative to the comparison slope.
Limitation: There are no research-generated guidelines for setting a ‘cut-point’
below which this ‘growth quotient’ figure would be seen as indicating a
significantly discrepant rate of student growth.
www.interventioncentral.org
107
Response to Intervention
Example: Direct Comparison With Research, Local Norms.
A student, Roy, was found with intervention to increase his reading fluency at a rate of 0.92 words per week.
When Roy’s slope was divided by external/research norms, the resulting quotient was 1.02.
When Roy’s slope was divided by local growth norms, the resulting quotient was 0.83.
At a Special Education Team Eligibility Meeting, the Team determined that there was not a disability but acknowledged that the student may be seen as ‘struggling’ in the context of his high-performing classroom.
www.interventioncentral.org
108
Response to Intervention
Setting Individual Student
RTI Academic Goals
Using Research Norms
Jim Wright
www.interventioncentral.org
www.interventioncentral.org
Response to Intervention
Setting Individual Student RTI Academic
Goals Using Research Norms
To set a goal for student academic performance, four
elements are needed:
• The student’s baseline academic performance. Prior to
starting the intervention, the teacher calculates
baseline performance by assessing the target student
several times with the academic measure that will be
used to measure that student’s progress once the
intervention begins.
• Estimate of ‘typical’ peer performance. The teacher
has a reliable estimate of expected or typical peer
performance on the academic measure that will be
used to measure the target student’s progress.
www.interventioncentral.org
110
Response to Intervention
Setting Individual Student RTI Academic
Goals Using Research Norms
To set a goal for student academic performance, four
elements are needed (cont.):
• Estimate of expected weekly progress. The teacher
selects a rate of weekly academic progress that the
target student is expected to attain if the intervention is
successful.
• Number of weeks for the intervention trial. The teacher
decides on how many weeks the RTI intervention will
last, as the cumulative, final academic goal can be
calculated only when the entire timespan of the
intervention is known.
www.interventioncentral.org
111
Response to Intervention
How to Set a Student Goal
1. The teacher collects at least 3 baseline observations
from the target student using alternate forms of the
progress-monitoring measure (e.g., CBM oral reading
fluency passages). The median baseline observation
is selected to serve as the student’s starting (baseline)
performance.
2. The teacher subtracts the student’s baseline from the
estimate of typical peer performance for that grade
level supplied by the research norms to calculate the
academic performance gap that is to be closed during
the intervention.
www.interventioncentral.org
112
Response to Intervention
How to Set a Student Goal
3. The teacher decides how many instructional weeks
the intervention will be in place (e.g., 8 weeks).
4. The teacher selects grade-appropriate norms for
academic growth per week supplied by the research
norms.
www.interventioncentral.org
113
Response to Intervention
How to Set a Student Goal
5. The teacher multiplies the grade norm for weekly
growth (selected in step 4) by a figure between 1.5
and 2.0 (Shapiro, 2008). Because the original weekly
growth rate represents a typical rate of student
improvement, the target student’s weekly growth
estimate should be adjusted upward to accelerate
learning and close the gap separating that student
from peers. Multiplying the original weekly growth rate
by an amount ranging between 1.5 and 2.0
accomplishes this adjustment.
www.interventioncentral.org
114
Response to Intervention
How to Set a Student Goal
6. The teacher next multiplies the weekly growth
figure by the total number of weeks that the
intervention will be in place. This figure represents
the minimum academic growth expected during the
intervention.
7. The teacher adds the expected academic growth
calculated in the previous step to the student baseline
calculated in step 1. This figure represents the final
student goal if the intervention is successful.
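Steps 1 and 4 through 7 can be collapsed into one small function. The sketch below (Python) is illustrative, not part of the source; the three baseline probe scores are hypothetical values whose median matches the 70 WPM used in the Randy example that follows.

```python
import statistics

def set_intervention_goal(baseline_probes, weekly_growth_norm, weeks,
                          multiplier=1.5):
    """Return (baseline, accelerated weekly goal, final goal)."""
    baseline = statistics.median(baseline_probes)            # step 1
    accelerated_weekly = weekly_growth_norm * multiplier      # steps 4-5
    expected_gain = accelerated_weekly * weeks                # step 6
    return baseline, accelerated_weekly, baseline + expected_gain  # step 7

# Hypothetical probes with a median of 70 WPM; 0.9 word/week grade-4 norm;
# an 8-week intervention trial (the Randy example).
print(set_intervention_goal([68, 70, 73], 0.9, 8))  # (70, 1.35, 80.8) ~ 81 WPM
```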
www.interventioncentral.org
115
Response to Intervention
How to Set a Student Goal
Recommendations for using this approach:
Research norms for student performance and
academic growth are the ‘gold standard’ in goal-setting, as they provide fixed, external standards for
proficiency that are not influenced by variable levels of
student skill in local classrooms. When setting
academic goals for struggling students, schools should
use research norms whenever they are available. In
particular, research norms should be used for high-stakes RTI cases that may be referred at some point
to the Special Education Eligibility Team.
www.interventioncentral.org
116
Response to Intervention
How to Set a Student Goal: Example
In December, Mrs. Chandler, a 4th-grade teacher,
collected three baseline assessments of a student,
Randy, in oral reading fluency using 4th-grade
passages. She found that Randy’s median reading
rate in these materials was 70 words per minute. Mrs.
Chandler consulted research norms in oral reading
fluency (Tindal, Hasbrouck & Jones, 2005) and
decided that a reasonable minimum reading rate for
students by winter of grade 4 (25th percentile) is 87
words per minute. Randy would need to increase his
reading rate by 17 words per minute to close this
academic achievement gap with peers.
www.interventioncentral.org
117
Response to Intervention
Sample Reading Fluency Norms
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33].
Eugene, OR: University of Oregon.
www.interventioncentral.org
118
Response to Intervention
How to Set a Student Goal: Example (Cont.)
The reading intervention planned for Randy would last
8 instructional weeks. Mrs. Chandler consulted the
research norms and noted that a typical rate of growth
in reading fluency for a 4th-grade student is 0.9
additional words per week.
Mrs. Chandler adjusted the 0.9 word growth rate for
Randy upward by multiplying it by 1.5 because she
realized that he needed to accelerate his learning to
catch up with peers. When adjusted upward, the
weekly growth rate for Randy increased from 0.9 to
1.35 additional words per minute.
www.interventioncentral.org
119
Response to Intervention
How to Set a Student Goal: Example (Cont.)
Multiplying the expected weekly progress of 1.35
additional words times the 8 weeks of the intervention,
Mrs. Chandler found that Randy should acquire at
least 11 additional words of reading fluency by the
conclusion of the intervention.
She added the 11 words per minute to Randy’s
baseline of 70 words per minute and was able to
predict that—if the 8-week intervention was
successful—Randy would be able to read
approximately 81 words per minute.
www.interventioncentral.org
120
Response to Intervention
How to Set a Student Goal: Example (Cont.)
Because Randy would not be expected to fully close
the gap with peers in 8 weeks, Mrs. Chandler
regarded her intervention goal of 81 words per minute
as an intermediate rather than a final goal.
However, if the intervention was successful and the
student continued to add 1.35 words per week to his
reading fluency, he could be expected to reach an
acceptable level of fluency soon.
www.interventioncentral.org
121
Response to Intervention
How to Monitor a Student Off-Level
• Conduct a ‘survey level’ assessment of the student to find their
highest ‘instructional’ level (between 25th and 50th percentile).
• Student is monitored at ‘off level’ during the intervention (e.g.,
weekly). The ‘goal’ is to move the student up to the 50th percentile.
• Once per month, the student is also assessed at grade level to
monitor grade-appropriate performance.
• When the student moves above 50th percentile on off-level, the
interventionist tests the student on the next higher level. If the
student performs above the 25th percentile on the next level,
monitoring starts at the new, higher level.
www.interventioncentral.org
122
Response to Intervention
Sample Reading Fluency Norms
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33].
Eugene, OR: University of Oregon.
www.interventioncentral.org
123
Response to Intervention
RTI Lab: Writing a Tier 2/3
Intervention Summary Report:
Guidelines
Jim Wright
www.interventioncentral.org
www.interventioncentral.org
Response to Intervention
Team Activity: Review a Sample Intervention Summary
Report
At your table:
• Review the sample Intervention Summary Report for Brian Haskell, a 5th grade student (pp. 1-4).
• What key points does the report contain that may support or hinder your effort to determine if the student is a ‘non-responder’ to intervention?
• How useful does your team find this Intervention Summary Report format to be?
www.interventioncentral.org
125
Response to Intervention
Tier 2/3 Intervention Summary Report: Sample
Introductory Paragraph
www.interventioncentral.org
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Introductory Paragraph. This opening section presents
essential background information about the RTI case,
including:
• Current grade level of the student
• Information about how the student was identified for
supplemental RTI intervention (e.g., student
performance on fall/winter/spring academic screening
placing them in the ‘at risk’ range)
• [If the student received an ‘off-level’ supplemental
intervention] information about the grade level selected
for the intervention and progress-monitoring
www.interventioncentral.org
127
Response to Intervention
Tier 2/3 Intervention Summary Report: Sample
Intervention Summary Paragraph/Section
www.interventioncentral.org
Response to Intervention
Sample Reading Fluency Norms
Source: Tindal, G., Hasbrouck, J., & Jones, C. (2005). Oral reading fluency: 90 years of measurement [Technical report #33].
Eugene, OR: University of Oregon.
www.interventioncentral.org
129
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs. This section
provides summary information about each intervention
plan or ‘trial’. A separate paragraph or section is written
for each intervention plan/trial. Every intervention
summary should include:
• Name and brief description of the instructional
program(s) or practices that makes up the intervention.
www.interventioncentral.org
130
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.).
• Details about intervention delivery, such as:
– Start and end dates of the intervention
– Total number of instructional weeks of the
intervention plan
– Number of sessions per week
– Length of each intervention session
– Intervention group size
www.interventioncentral.org
131
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.).
• Details about the student’s progress, such as:
– Student baseline at the start of the intervention. TIP:
When a student starting a new intervention plan was
previously monitored on an earlier, recent intervention
plan, baseline for the new intervention can easily be
computed by selecting the median value from the last
three progress-monitoring data points collected during
the previous intervention. (A minimal sketch of this
median calculation appears below.)
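As a quick illustration of that tip, here is a minimal sketch; the
progress-monitoring values are invented for the example.

# Minimal sketch: baseline for a new intervention taken as the median of
# the last three progress-monitoring data points from the prior intervention.
# The data values below are invented for illustration.
from statistics import median

prior_monitoring_wrc = [38, 41, 40, 44, 43, 47]   # weekly Words Read Correct
new_baseline = median(prior_monitoring_wrc[-3:])  # median of last 3 points
print(new_baseline)                               # -> 44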
www.interventioncentral.org
132
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.).
• Details about the student’s progress, such as:
– Weekly ‘accelerated’ goal for student improvement. This
‘accelerated’ goal can be calculated by taking a research-based
estimate of typical weekly student progress at a grade level and
multiplying that typical growth rate by a figure between 1.5 and
2.0 (Shapiro, 2008).
– Cumulative performance goal at the end of the intervention.
This end-goal is computed by multiplying the weekly accelerated
goal by the number of weeks that the intervention will take place.
That product is then added to the student’s baseline performance
to compute the cumulative goal. (A worked sketch of both
calculations appears below.)
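The two goal calculations can be expressed as a short worked example.
The growth-rate, multiplier, baseline, and week values below are
assumptions chosen only to illustrate the arithmetic, not published norms.

# Worked sketch of the goal-setting arithmetic described above.
# All numeric values are assumptions for illustration.
typical_weekly_growth = 0.9   # WRC gained per week from research norms (assumed)
acceleration_factor = 1.8     # between 1.5 and 2.0 (Shapiro, 2008)
baseline_wrc = 35             # student's baseline Words Read Correct (assumed)
intervention_weeks = 8        # planned length of the intervention trial (assumed)

accelerated_weekly_goal = typical_weekly_growth * acceleration_factor
cumulative_goal = baseline_wrc + accelerated_weekly_goal * intervention_weeks

print(round(accelerated_weekly_goal, 2))   # 1.62 WRC per week
print(round(cumulative_goal, 1))           # 48.0 WRC at the end of week 8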
www.interventioncentral.org
133
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.).
• Details about the student’s progress, such as:
– Typical weekly growth rates in the academic skill and
grade level in which the student is receiving a
supplemental intervention. These ‘typical’ weekly growth
rates are usually derived from research norms.
– Comparison of actual student performance to goals and
norms. The student’s actual weekly growth rate is
compared to both the accelerated weekly goal and the
typical peer rate of weekly progress. Additionally, the
student’s actual cumulative progress during the
intervention is compared to the original cumulative goal
to determine whether the intervention was successful.
(A minimal sketch of this comparison appears below.)
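A minimal sketch of that comparison follows. It uses an ordinary
least-squares slope as the ‘actual weekly growth rate’, which is one
common progress-monitoring convention but an assumption here, not a
method mandated by the report format; all data values are invented.

# Minimal sketch: compare the student's actual weekly growth rate to the
# accelerated weekly goal and to the typical peer growth rate.
# Data values are invented; the least-squares slope is an assumed method.
weekly_wrc = [35, 36, 38, 40, 41, 43, 45, 46]   # one data point per week
weeks = list(range(1, len(weekly_wrc) + 1))

# Ordinary least-squares slope: growth in WRC per week.
n = len(weeks)
mean_x = sum(weeks) / n
mean_y = sum(weekly_wrc) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, weekly_wrc)) / \
        sum((x - mean_x) ** 2 for x in weeks)

accelerated_weekly_goal = 1.62   # from the worked example above
typical_peer_rate = 0.9          # assumed research norm

print(round(slope, 2))                    # ~1.64 WRC per week
print(slope >= accelerated_weekly_goal)   # True: met the accelerated goal
print(slope >= typical_peer_rate)         # True: exceeded the typical peer rate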
www.interventioncentral.org
134
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.).
• Information about intervention integrity. The total possible
number of intervention sessions available to the student is
computed as the number of sessions per week multiplied by
the number of weeks of the intervention. The number of
intervention sessions actually attended by the student is also
presented. The number of sessions actually attended is divided
by the total number of possible sessions; this decimal is then
multiplied by 100 to yield a percentage of ‘intervention
integrity’. ‘Intervention integrity’ figures should exceed 80
percent. (A brief worked example appears below.)
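A brief worked example of the integrity calculation; the session counts
are invented for illustration.

# Worked sketch of the intervention-integrity percentage described above.
# Session counts are invented for illustration.
sessions_per_week = 4
intervention_weeks = 8
sessions_attended = 27

possible_sessions = sessions_per_week * intervention_weeks      # 32
integrity_pct = (sessions_attended / possible_sessions) * 100   # 84.375

print(round(integrity_pct, 1))   # 84.4
print(integrity_pct > 80)        # True: meets the 80-percent guideline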
www.interventioncentral.org
135
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Summary Paragraphs (Cont.).
• Number of data points collected to monitor student progress
during the intervention. For Tier 2 interventions, monitoring
information should be collected at least every two weeks.
Tier 3 interventions should be monitored at least weekly.
www.interventioncentral.org
136
Response to Intervention
Tier 2/3 Intervention Summary Report: Sample
Intervention Series Analysis Section
www.interventioncentral.org
Response to Intervention
Tier 2/3 Intervention Summary Report: Sample Intervention Series Analysis Section
www.interventioncentral.org
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Series Analysis: This concluding section
summarizes the findings of the several intervention
trials and reaches a conclusion about whether the
student was adequately responsive to general-education Tier 2/3 interventions. The section includes:
• An analysis of whether the student hit the accelerated
goal(s) for each of the interventions discussed.
www.interventioncentral.org
139
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Series Analysis (Cont.):
• An analysis of whether, during each intervention, the
student exceeded, met, or fell below the typical peer
growth norms at the grade level of the supplemental
intervention. This information will be useful in
determining whether a student has a significant
discrepancy in academic growth compared to typical
peer growth norms.
www.interventioncentral.org
140
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Series Analysis (Cont.):
• Summary of the student’s performance at his or her
current grade placement on recent schoolwide
academic screenings. Ideally, the student’s screening
results are presented with corresponding percentile
rankings. Students who continue to perform below the
10th percentile on school screenings at their grade of
record despite several intensive RTI interventions
demonstrate that the interventions have failed to
generalize to significant improvements in classroom
academic skills. Students meeting this profile can be
considered to have a severe discrepancy in academic
skills.
www.interventioncentral.org
141
Response to Intervention
Writing a Tier 2/3 Intervention Summary Report…
Intervention Series Analysis (Cont.):
• Conclusion about the student’s ‘response status’. Based
on the student’s response to intervention across the full
intervention history, the report reaches a conclusion
about whether the student meets criteria as a ‘non-responder’ to intervention.
www.interventioncentral.org
142
Response to Intervention
Team Activity: How Can You Create Intervention Summary
Reports in Your District?
At your table:
• Review the outline, example, and recommendations provided at
today’s workshop for writing an Intervention Summary Report.
• Discuss how you might develop a process in your own district to
create high-quality Intervention Summary Reports for use by the
Special Education Eligibility Team and others.
www.interventioncentral.org
143
Response to Intervention
RTI & Special Education
Eligibility: ‘Challenge’
Scenarios
Jim Wright
www.interventioncentral.org
www.interventioncentral.org
Response to Intervention
RTI & Special Education Eligibility: ‘Challenge’
Scenarios
• The school presents clear information suggesting that the
student has not responded adequately to general-education
reading interventions. The school believes that the student
should be designated Learning Disabled. In your review of
records, however, you note that the student passed the most
recent state ELA test.
• How might you respond to this issue at the meeting?
www.interventioncentral.org
145
Response to Intervention
RTI & Special Education Eligibility: ‘Challenge’
Scenarios
• When information is presented from the school on the student’s
response to a series of math intervention programs, you see that
the school has tried 3 interventions at Tiers 2 and 3. However,
there is little information about the second of the 3 interventions,
including what research-based strategies were used or what
progress-monitoring data were collected. The intervention is
instead described only as ‘the student saw the math intervention
teacher for 30 minutes 3 times per week.’
• How might you respond to this issue at the meeting?
www.interventioncentral.org
146
Response to Intervention
RTI & Special Education Eligibility: ‘Challenge’
Scenarios
• A student is referred to Special Education because she has
apparently failed to respond adequately to the school’s Tier 2
and 3 interventions.
As you review the student’s RTI information, you realize that the
student fell within the average range on the external reading
fluency performance norms, even though the classroom teacher
has made an emphatic case that the student has severely limited
reading skills in the classroom.
• How might you respond to this issue at the meeting?
www.interventioncentral.org
147
Response to Intervention
Planning Your District’s ‘Next Steps’ in Developing
RTI Decision Rules: Activity
At your table, review the content discussed at
today’s workshop:
• Dual Discrepancy: determining discrepant performance and
rate of growth
• RTI ‘Non-Responder’ Checklist
• Intervention integrity
• Intervention Summary Report
Decide on at least 3 key ‘next steps’ that your district
should adopt to move forward in developing RTI
‘decision rules’ for CSE. Be prepared to share the
main points of your discussion with the large group.
www.interventioncentral.org
148