22 April 2010: Data Collection

Response to Intervention
RTI: How to Collect Data to Understand and Fix Student Academic and Behavioral Problems
Jim Wright
www.interventioncentral.org
Data Collection: Defining Terms
Evaluation. “the process of using information collected through assessment to
make decisions or reach conclusions” (Hosp, 2008, p. 364).
Example: A student can be evaluated for problems in ‘fluency with text’ by
collecting information using various sources (e.g., CBM ORF, teacher interview,
direct observations of the student reading across settings, etc.), comparing those
results to peer norms or curriculum expectations, and making a decision about
whether the student’s current performance is acceptable.
Assessment. “the process of collecting information about the characteristics of
persons or objects by measuring them” (Hosp, 2008, p. 364).
Example: The construct ‘fluency with text’ can be assessed using various
measurements, including CBM ORF, teacher interview, and direct observations of
the student reading in different settings and in different material.
Measurement. “the process of applying numbers to the characteristics of objects
or people in a systematic way” (Hosp, 2008, p. 364).
Example: Curriculum-Based Measurement Oral Reading Fluency (CBM ORF) is
one method to measure the construct ‘fluency with text’.
Formal Tests: Only One Source of Student Assessment
Information
“Tests are often overused and misunderstood in and
out of the field of school psychology. When necessary,
analog [i.e., test] observations can be used to test
relevant hypotheses within controlled conditions.
Testing is a highly standardized form of observation.
…The only reason to administer a test is to answer
well-specified questions and examine well-specified
hypotheses. It is best practice to identify and make
explicit the most relevant questions before
assessment begins. …The process of assessment
should follow these questions. The questions should
not follow assessment.” (p. 170)
Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school
psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.
Relevant Academic Information: Sources and
Purpose
• Tier 1: Instructional information. Teachers conduct classroom assessments
(both formal and informal). Results are used to make day-to-day decisions
about pacing of instruction, to identify students who need additional
support, etc.
• Tier 1/Tier 2: Schoolwide screenings. Brief universal screenings are
administered to all students at a grade level to measure academic skills
that predict future school success. Results reflect the quality of core
instruction and drive recruitment for Tier 2 programs.
• Tier 3: Analytic/diagnostic instructional assessment. Struggling
students with more severe needs picked up in screenings may be
administered a more detailed assessment (using qualitative and/or
quantitative measures) to map out the pattern of deficits in basic academic
skills. Results are used to create a customized intervention plan that
meets that student’s unique needs.
Making Use of Existing (‘Extant’)
Data
Universal Screening at Secondary Schools: Using Existing
Data Proactively to Flag ‘Signs of Disengagement’
“Across interventions…, a key component to promoting
school completion is the systematic monitoring of all
students for signs of disengagement, such as attendance
and behavior problems, failing courses, off track in terms of
credits earned toward graduation, problematic or few close
relationships with peers and/or teachers, and then following
up with those who are at risk.”
Source: Jimerson, S., Reschly, A. L., & Hess, R. (2008). Best practices in increasing the likelihood of school completion. In A. Thomas & J.
Grimes (Eds.), Best practices in school psychology V (pp. 1085-1097). Bethesda, MD: National Association of School Psychologists. p. 1090.
Extant (Existing) Data (Chafouleas et al., 2007)
• Definition: Information that is collected by schools as a
matter of course.
• Extant data comes in two forms:
– Performance summaries (e.g., class grades, teacher
summary comments on report cards, state test
scores).
– Student work products (e.g., research papers, math
homework, PowerPoint presentation).
Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention
and instruction. New York: Guilford Press.
Advantages of Using Extant Data (Chafouleas et al., 2007)
• Information already exists and is easy to access.
• Students will not show ‘reactive’ effects during data
collection, as the information collected is part of the
normal routine of schools.
• Extant data is ‘relevant’ to school data consumers (such
as classroom teachers, administrators, and members of
problem-solving teams).
Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention
and instruction. New York: Guilford Press.
Drawbacks of Using Extant Data (Chafouleas et al., 2007)
• Time is required to collate and summarize the data (e.g., summarizing
a week’s worth of disciplinary office referrals).
• The data may be limited and not reveal the full dimension of the
student’s presenting problem(s).
• There is no guarantee that school staff are consistent and accurate in
how they collect the data (e.g., grading policies can vary across
classrooms; instructors may have differing expectations regarding
what types of assignments are given a formal grade; standards may
fluctuate across teachers for filling out disciplinary referrals).
• Little research has been done on the ‘psychometric adequacy’ of
extant data sources.
Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention
and instruction. New York: Guilford Press.
Mining Archival Data: What Are the ‘Early Warning Flags’
of Student Drop-Out?
A sample of 13,000 students in Philadelphia was tracked for 8
years. These sixth-grade early warning indicators were found to
predict student drop-out:
• Failure in English
• Failure in math
• Missing at least 20% of school days
• Receiving an ‘unsatisfactory’ behavior rating from at least one
teacher
Source: Balfanz, R., Herzog, L., & MacIver, D. J. (2007). Preventing student disengagement and keeping students on the graduation
path in urban middle grades schools: Early identification and effective interventions. Educational Psychologist, 42, 223-235.
What is the Predictive Power of These Early
Warning Flags?
Number of ‘Early Warning Flags’    Probability That Student
in Student Record                  Would Graduate
None                               56%
1                                  36%
2                                  21%
3                                  13%
4                                  7%
Source: Balfanz, R., Herzog, L., & MacIver, D. J. (2007). Preventing student disengagement and keeping students on the graduation
path in urban middle grades schools: Early identification and effective interventions. Educational Psychologist, 42, 223-235.
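As an illustration, the four early warning indicators and the graduation probabilities reported above can be combined into a simple screening routine. This is a minimal sketch only; the student-record field names below are hypothetical, not drawn from any real student information system.

```python
# Illustrative sketch: count the four Balfanz et al. (2007) sixth-grade
# early warning flags in a student record and look up the reported
# probability of graduating by flag count. Record keys are hypothetical.

GRADUATION_PROB = {0: 0.56, 1: 0.36, 2: 0.21, 3: 0.13, 4: 0.07}

def count_flags(record):
    """Count early warning flags present in one student's sixth-grade record."""
    flags = 0
    if record.get("english_grade") == "F":            # failure in English
        flags += 1
    if record.get("math_grade") == "F":               # failure in math
        flags += 1
    if record.get("attendance_rate", 1.0) <= 0.80:    # missed >= 20% of days
        flags += 1
    if record.get("unsatisfactory_behavior", False):  # rating from any teacher
        flags += 1
    return flags

student = {"english_grade": "F", "math_grade": "B",
           "attendance_rate": 0.75, "unsatisfactory_behavior": False}
print(count_flags(student), GRADUATION_PROB[count_flags(student)])  # 2 0.21
```

A school could run such a routine against extant attendance and grade data each marking period, which is the proactive use of existing data that this section describes.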
Grades & Other Teacher Performance Summary
Data (Chafouleas et al., 2007)
• Teacher test and quiz grades can be useful as a
supplemental method for monitoring the impact of
student behavioral interventions.
• Other data about student academic performance (e.g.,
homework completion, homework grades, etc.) can also
be tracked and graphed to judge intervention
effectiveness.
Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention
and instruction. New York: Guilford Press.
[Figure: sample progress-monitoring graph for student ‘Marc Ripley’, charting performance at 2-week intervals from 9/23/07 (2-Wk) through 12/05/07 (12-Wk) (from Chafouleas et al., 2007)]
Source: Chafouleas, S., Riley-Tillman, T.C., & Sugai, G. (2007). School-based behavioral assessment: Informing intervention
and instruction. New York: Guilford Press.
‘Elbow Group’ Activity: What Extant/Archival Data
Should Your RTI Team Review Regularly?
• Discuss the essential extant/archival data
that your RTI Team should review as
‘early warning indicators’ of students who
are struggling (see p. 20 of packet).
• What process should your school adopt to
ensure that these data are reviewed
regularly (e.g., every five weeks) to
guarantee timely identification of students
who need intervention assistance?
RIOT/ICEL Framework:
Organizing Information to Better
Identify Student Behavioral &
Academic Problems
Assessment Data: Reaching the ‘Saturation Point’
“…During the process of assessment, a point of
saturation is always reached; that is, the point
when enough information has been collected to
make a good decision, but adding additional
information will not improve the decision
making. It sounds simple enough, but the tricky
part is determining when that point has been
reached. Unfortunately, information cannot be
measured in pounds, decibels, degrees, or feet
so there is no absolute amount of information
or specific criterion for “enough” information.” (p. 373)
Source: Hosp, J. L. (2008). Best practices in aligning academic assessment with instruction. In A. Thomas & J. Grimes (Eds.),
Best practices in school psychology V (pp.363-376). Bethesda, MD: National Association of School Psychologists.
pp. 25-28
RIOT/ICEL Framework
Sources of Information
• Review (of records)
• Interview
• Observation
• Test
Focus of Assessment
• Instruction
• Curriculum
• Environment
• Learner
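The crossing of the two lists above can be pictured as a 4x4 grid in which each cell collects the evidence gathered from one source about one focus area. A minimal sketch of that grid as a data structure (the example finding is purely illustrative; this is not a published instrument):

```python
# A minimal sketch of the RIOT/ICEL matrix as a 4x4 grid for filing
# evidence: rows are information sources (RIOT), columns are assessment
# focuses (ICEL). Each cell holds a list of findings.

SOURCES = ["Review", "Interview", "Observation", "Test"]
FOCUSES = ["Instruction", "Curriculum", "Environment", "Learner"]

def empty_matrix():
    """Create an empty RIOT x ICEL grid."""
    return {src: {foc: [] for foc in FOCUSES} for src in SOURCES}

matrix = empty_matrix()
# File one finding: work samples (Review) that bear on Curriculum skills.
matrix["Review"]["Curriculum"].append(
    "Math worksheets show low accuracy on multi-digit subtraction")
```

Filling cells this way makes gaps visible at a glance: an empty row signals an information source not yet tapped, and an empty column signals an explanation not yet examined.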
RIOT/ICEL Definition
• The RIOT/ICEL matrix is an assessment guide that helps
schools decide efficiently what relevant information to
collect on student academic performance and
behavior—and how to organize that information to
identify probable reasons why the student is not
experiencing academic or behavioral success.
• The RIOT/ICEL matrix is not itself a data-collection
instrument. Instead, it is an organizing framework, or
heuristic, that increases schools’ confidence both in the
quality of the data that they collect and in the findings
that emerge from those data.
RIOT: Sources of Information
• Select Multiple Sources of Information: RIOT
(Review, Interview, Observation, Test). The top
horizontal row of the RIOT/ICEL table includes four
potential sources of student information: Review,
Interview, Observation, and Test (RIOT). Schools
should attempt to collect information from a range of
sources to control for potential bias from any one
source.
Select Multiple Sources of Information: RIOT (Review,
Interview, Observation, Test)
• Review. This category consists of past or present records
collected on the student. Obvious examples include report
cards, office disciplinary referral data, state test results, and
attendance records. Less obvious examples include student
work samples, physical products of teacher interventions
(e.g., a sticker chart used to reward positive student
behaviors), and emails sent by a teacher to a parent
detailing concerns about a student’s study and
organizational skills.
Select Multiple Sources of Information: RIOT (Review,
Interview, Observation, Test)
• Interview. Interviews can be conducted face-to-face, via
telephone, or even through email correspondence.
Interviews can also be structured (that is, using a predetermined series of questions) or follow an open-ended
format, with questions guided by information supplied by the
respondent. Interview targets can include those teachers,
paraprofessionals, administrators, and support staff in the
school setting who have worked with or had interactions with
the student in the present or past. Prospective interview
candidates can also consist of parents and other relatives of
the student as well as the student himself or herself.
Select Multiple Sources of Information: RIOT (Review,
Interview, Observation, Test)
• Observation. Direct observation of the student’s academic
skills, study and organizational strategies, degree of
attentional focus, and general conduct can be a useful
channel of information. Observations can be more structured
(e.g., tallying the frequency of call-outs or calculating the
percentage of on-task intervals during a class period) or less
structured (e.g., observing a student and writing a running
narrative of the observed events).
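The more structured end of this continuum, such as calculating the percentage of on-task intervals, reduces to simple arithmetic. A minimal sketch (the interval data below are made up for illustration):

```python
# Scoring a structured interval observation: each interval is marked
# on-task (True) or off-task (False), and the summary statistic is the
# percentage of intervals scored on-task.

def percent_on_task(intervals):
    """Percent of observation intervals scored as on-task."""
    return round(100 * sum(intervals) / len(intervals))

# Twenty 15-second intervals across a 5-minute observation (illustrative).
obs = [True, True, False, True, False, True, True, True, False, True,
       True, False, True, True, True, False, True, True, True, True]
print(percent_on_task(obs))  # 75
```

The same percentage computed for typical peers during the same class period gives a local comparison standard for judging whether the target student's score is actually discrepant.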
Select Multiple Sources of Information: RIOT (Review,
Interview, Observation, Test)
• Test. Testing can be thought of as a structured and
standardized observation of the student that is intended to
test certain hypotheses about why the student might be
struggling and what school supports would logically benefit
the student (Christ, 2008). An example of testing may be a
student being administered a math computation CBM probe
or an Early Math Fluency probe.
ICEL: Factors Impacting Student Learning
• Investigate Multiple Factors Affecting Student
Learning: ICEL (Instruction, Curriculum,
Environment, Learner). The leftmost vertical column of
the RIOT/ICEL table includes four key domains of
learning to be assessed: Instruction, Curriculum,
Environment, and Learner (ICEL). A common mistake
that schools often make is to assume that student
learning problems exist primarily in the learner and to
underestimate the degree to which teacher instructional
strategies, curriculum demands, and environmental
influences impact the learner’s academic performance.
The ICEL elements ensure that a full range of relevant
explanations for student problems is examined.
Investigate Multiple Factors Affecting Student Learning:
ICEL (Instruction, Curriculum, Environment, Learner)
• Instruction. The purpose of investigating the ‘instruction’
domain is to uncover any instructional practices that either
help the student to learn more effectively or interfere with
that student’s learning. More obvious instructional questions
to investigate would be whether specific teaching strategies
for activating prior knowledge better prepare the student to
master new information or whether a student benefits
optimally from the large-group lecture format that is often
used in a classroom. A less obvious example of an
instructional question would be whether a particular student
learns better through teacher-delivered or self-directed,
computer-administered instruction.
Investigate Multiple Factors Affecting Student Learning:
ICEL (Instruction, Curriculum, Environment, Learner)
• Curriculum. ‘Curriculum’ represents the full set of academic
skills that a student is expected to have mastered in a
specific academic area at a given point in time. To
adequately evaluate a student’s acquisition of academic
skills, of course, the educator must (1) know the school’s
curriculum (and related state academic performance
standards), (2) be able to inventory the specific academic
skills that the student currently possesses, and then (3)
identify gaps between curriculum expectations and actual
student skills. (This process of uncovering student academic
skill gaps is sometimes referred to as ‘instructional’ or
‘analytic’ assessment.)
Investigate Multiple Factors Affecting Student Learning: ICEL
(Instruction, Curriculum, Environment, Learner)
• Environment. The ‘environment’ includes any factors in the
student’s school, community, or home surroundings that can
directly enable their academic success or hinder that success.
Obvious questions about environmental factors that impact
learning include whether a student’s educational performance is
better or worse in the presence of certain peers and whether
having additional adult supervision during a study hall results in
higher student work productivity. Less obvious questions about
the learning environment include whether a student has a setting
at home that is conducive to completing homework or whether
chaotic hallway conditions are delaying that student’s transitioning
between classes and therefore reducing available learning time.
Investigate Multiple Factors Affecting Student Learning:
ICEL (Instruction, Curriculum, Environment, Learner)
• Learner. While the student is at the center of any questions
of instruction, curriculum, and [learning] environment, the
‘learner’ domain includes those qualities of the student that
represent their unique capacities and traits. More obvious
examples of questions that relate to the learner include
investigating whether a student has stable and high rates of
inattention across different classrooms or evaluating the
efficiency of a student’s study habits and test-taking skills. A
less obvious example of a question that relates to the learner
is whether a student harbors a low sense of self-efficacy in
mathematics that is interfering with that learner’s willingness
to put appropriate effort into math courses.
• The teacher collects
several student math
computation worksheet
samples to document
work completion and
accuracy.
• Data Source: Review
• Focus Areas: Curriculum
• The student’s parent tells
the teacher that her son’s
reading grades and
attitude toward reading
dropped suddenly in Gr 4.
• Data Source: Interview
• Focus: Curriculum,
Learner
• An observer monitors the
student’s attention on an
independent writing
assignment—and later
analyzes the work’s
quality and completeness.
• Data Sources:
Observation, Review
• Focus Areas:
Curriculum,
Environment, Learner
• A student is given a timed
math worksheet to
complete. She is then
given another timed
worksheet & offered a
reward if she improves.
• Data Sources: Review,
Test
• Focus Areas:
Curriculum, Learner
• Comments from several
past report cards describe
the student as preferring
to socialize rather than
work during small-group
activities.
• Data Source: Review
• Focus Areas:
Environment
• The teacher tallies the
number of redirects for an
off-task student during
discussion. She then designs a
high-interest lesson and continues
to track off-task behavior.
• Data Source:
Observation, Test
• Focus Areas: Instruction
Uses of RIOT/ICEL
The RIOT/ICEL framework is adaptable and can be used
flexibly. For example:
• The teacher can be given the framework to encourage fuller use of
available classroom data and examination of environmental and curriculum
variables impacting learning.
• The RTI Team case manager can use the framework when pre-meeting
with the teacher to better define the student problem and select
data to bring to the initial RTI Team meeting.
• Any RTI consultant working at any Tier can internalize the framework
as a mental guide to prompt fuller consideration of available data,
efficiency in collecting data, and stronger formulation of student
problems.
Activity: Use the RIOT/ICEL Framework
• Review the RIOT/ICEL matrix.
• Discuss how you might use the
framework to ensure that
information that you collect on a
student is broad-based, comes
from multiple sources, and
answers the right questions
about the identified student
problem(s).
• Be prepared to report out.
Breaking Down Complex
Academic Goals into Simpler
Sub-Tasks: Discrete
Categorization
Identifying and Measuring Complex Academic
Problems at the Middle and High School Level
• Students at the secondary level can present with a
range of concerns that interfere with academic success.
• One frequent challenge for these students is the need
to break complex global academic goals down into discrete
sub-skills that can be individually measured and tracked
over time.
Discrete Categorization: A Strategy for Assessing
Complex, Multi-Step Student Academic Tasks
Definition of Discrete Categorization: ‘Listing a number of
behaviors and checking off whether they were performed.’
(Kazdin, 1989, p. 59).
• Approach allows educators to define a larger ‘behavioral’ goal for
a student and to break that goal down into sub-tasks. (Each sub-task
should be defined in such a way that it can be scored as
‘successfully accomplished’ or ‘not accomplished’.)
• The constituent behaviors that make up the larger behavioral
goal need not be directly related to each other. For example,
‘completed homework’ may include as sub-tasks ‘wrote down
homework assignment correctly’ and ‘created a work plan before
starting homework’.
Source: Kazdin, A. E. (1989). Behavior modification in applied settings (4th ed.). Pacific Grove, CA: Brooks/Cole.
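Because each sub-task is scored yes/no, the daily summary statistic is just a count or percentage of sub-tasks checked off. A minimal sketch, using the homework sub-tasks named above as the checklist:

```python
# Discrete Categorization scoring: a larger goal is broken into yes/no
# sub-tasks; the daily score is the count and percentage checked off.

SUBTASKS = [
    "Wrote down homework assignment correctly",
    "Created a work plan before starting homework",
    "Completed homework",
]

def score_checklist(completed):
    """Return (number done, percent done) for the set of completed sub-tasks."""
    done = sum(1 for task in SUBTASKS if task in completed)
    return done, round(100 * done / len(SUBTASKS))

print(score_checklist({"Completed homework",
                       "Wrote down homework assignment correctly"}))  # (2, 67)
```

Graphing the daily percentage over time yields a simple progress-monitoring chart for a goal that would otherwise be too global to measure directly.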
Discrete Categorization Example: Math Study Skills
General Academic Goal: Improve Tina’s Math Study Skills
Tina was struggling in her mathematics course because of poor study skills. The RTI
Team and math teacher analyzed Tina’s math study skills and decided that, to study
effectively, she needed to:
 Check her math notes daily for completeness.
 Review her math notes daily.
 Start her math homework in a structured school setting.
 Use a highlighter and ‘margin notes’ to mark questions or areas of confusion in her
notes or on the daily assignment.
 Spend sufficient ‘seat time’ at home each day completing homework.
 Regularly ask math questions of her teacher.
Discrete Categorization Example: Math Study Skills
General Academic Goal: Improve Tina’s Math Study Skills
The RTI Team—with teacher and student input—created the following
intervention plan. The student Tina will:
 Approach the teacher at the end of class for a copy of class notes.
 Check her daily math notes for completeness against a set of teacher
notes in 5th period study hall.
 Review her math notes in 5th period study hall.
 Start her math homework in 5th period study hall.
 Use a highlighter and ‘margin notes’ to mark questions or areas of
confusion in her notes or on the daily assignment.
 Enter into her ‘homework log’ the amount of time spent that evening
doing homework and note any questions or areas of confusion.
 Stop by the math teacher’s classroom during help periods (T & Th only)
to ask highlighted questions (or to verify that Tina understood that
week’s instructional content) and to review the homework log.
Discrete Categorization Example: Math Study Skills
Academic Goal: Improve Tina’s Math Study Skills
General measures of the success of this intervention include (1) rate
of homework completion and (2) quiz & test grades.
To measure treatment fidelity (Tina’s follow-through with sub-tasks of the
checklist), the following strategies are used:
 Approached the teacher for copy of class notes. Teacher observation.
 Checked her daily math notes for completeness; reviewed math notes, started math
homework in 5th period study hall. Student work products; random spot check by study
hall supervisor.
 Used a highlighter and ‘margin notes’ to mark questions or areas of confusion in her notes
or on the daily assignment. Review of notes by teacher during T/Th drop-in period.
 Entered into her ‘homework log’ the amount of time spent that evening doing homework and
noted any questions or areas of confusion. Log reviewed by teacher during T/Th drop-in
period.
 Stopped by the math teacher’s classroom during help periods (T & Th only) to ask
highlighted questions (or to verify that Tina understood that week’s instructional content).
Teacher observation; student sign-in.
CBM: Developing a
Process to Collect
Local Norms/Screening
Data
Jim Wright
www.interventioncentral.org
RTI Literacy: Assessment & Progress-Monitoring
To measure student ‘response to instruction/intervention’ effectively,
the RTI model measures students’ academic performance and
progress on schedules matched to each student’s risk profile and
intervention Tier membership:
• Benchmarking/Universal Screening. All children in a grade level are
assessed at least 3 times per year on a common collection of
academic assessments.
• Strategic Monitoring. Students placed in Tier 2 (supplemental)
reading groups are assessed 1-2 times per month to gauge their
progress with this intervention.
• Intensive Monitoring. Students who participate in an intensive,
individualized Tier 3 intervention are assessed at least once per week.
Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools:
Procedures to assure scientific-based practices. New York: Routledge.
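The tiered schedule above can be sketched as a simple lookup. The yearly minimums are this author's illustrative arithmetic, assuming roughly a 9-month / 36-week school year, not figures from the cited source:

```python
# Minimal sketch mapping each RTI tier to its monitoring schedule
# (after Burns & Gibbons, 2008). Per-year minimums assume a ~9-month,
# ~36-week school year -- an illustrative assumption, not a standard.

def monitoring_schedule(tier):
    """Return (schedule label, minimum assessments per year) for a tier."""
    schedules = {
        1: ("Benchmarking/Universal Screening", 3),   # at least 3x per year
        2: ("Strategic Monitoring", 9),               # 1-2x/month, ~9 months
        3: ("Intensive Monitoring", 36),              # weekly, ~36 weeks
    }
    return schedules[tier]

print(monitoring_schedule(2))  # ('Strategic Monitoring', 9)
```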
Local Norms: Screening All Students (Stewart & Silberglit,
2008)
• Local norm data in basic academic skills are collected at least
3 times per year (fall, winter, spring).
• Schools should consider using ‘curriculum-linked’ measures
such as Curriculum-Based Measurement that will show
generalized student growth in response to learning.
• If possible, schools should consider avoiding ‘curriculum-locked’
measures that are tied to a single commercial
instructional program.
Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes
(Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
Local Norms: Using a Wide Variety of Data
(Stewart & Silberglit, 2008)
Local norms can be compiled using:
• Fluency measures such as Curriculum-Based
Measurement.
• Existing data, such as office disciplinary referrals.
• Computer-delivered assessments, e.g., Measures of
Academic Progress (MAP) from www.nwea.org
Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes
(Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
[Screenshot: Measures of Academic Progress (MAP), www.nwea.org]
Applications of Local Norm Data (Stewart & Silberglit, 2008)
Local norm data can be used to:
• Evaluate and improve the current core instructional
program.
• Allocate resources to classrooms, grades, and buildings
where student academic needs are greatest.
• Guide the creation of targeted Tier 2 (supplemental
intervention) groups.
• Set academic goals for improvement for students on
Tier 2 and Tier 3 interventions.
• Move students across levels of intervention, based on
performance relative to that of peers (local norms).
Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes
(Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
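A minimal sketch of how local norms support the uses listed above: convert each student's screening score to a percentile rank within the grade, then flag students at or below a cutoff as candidates for Tier 2 groups. Student names, scores, and the 25th-percentile cutoff are all illustrative; this is not a published scoring tool.

```python
# Applying local norms: percentile rank within the grade, then flag
# students at or below a cutoff as Tier 2 candidates.

def percentile_rank(score, all_scores):
    """Percent of grade-level scores at or below this score."""
    at_or_below = sum(1 for s in all_scores if s <= score)
    return round(100 * at_or_below / len(all_scores))

def flag_for_tier2(scores_by_student, cutoff_pct=25):
    """Return students scoring at or below the cutoff percentile."""
    all_scores = list(scores_by_student.values())
    return sorted(name for name, score in scores_by_student.items()
                  if percentile_rank(score, all_scores) <= cutoff_pct)

grade4_orf = {"Ana": 118, "Ben": 72, "Cara": 95, "Dev": 60,
              "Eli": 110, "Fay": 88, "Gus": 101, "Hana": 79}
print(flag_for_tier2(grade4_orf))  # ['Ben', 'Dev']
```

Because the cutoff is relative to local peers rather than a fixed score, the same routine also supports moving students across intervention levels as the grade-wide distribution shifts from one screening window to the next.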
Local Norms: Supplement With Additional
Academic Testing as Needed (Stewart & Silberglit, 2008)
“At the individual student level, local norm data are just the first
step toward determining why a student may be experiencing
academic difficulty. Because local norms are collected on brief
indicators of core academic skills, other sources of information
and additional testing using the local norm measures or other
tests are needed to validate the problem and determine why the
student is having difficulty. … Percentage correct and rate
information provide clues regarding automaticity and accuracy of
skills. Error types, error patterns, and qualitative data provide
clues about how a student approached the task. Patterns of
strengths and weaknesses on subtests of an assessment can
provide information about the concepts in which a student or
group of students may need greater instructional support,
provided these subtests are equated and reliable for these
purposes.” (p. 237)
Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes
(Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
pp. 2-5
pp. 17-24
Steps in Creating a Process for Local Norming Using CBM Measures
1. Identify personnel to assist in collecting data. A range
of staff and school stakeholders can assist in the school
norming, including:
• Administrators
• Support staff (e.g., school psychologist, school social
worker, specials teachers, paraprofessionals)
• Parents and adult volunteers
• Field placement students from graduate programs
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data.
University of Oregon: Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf
Steps in Creating a Process for Local Norming Using CBM Measures
2. Determine the method for screening data collection. The
school can have teachers collect data in the classroom or
designate a team to conduct the screening:
• In-Class: Teaching staff in the classroom collect the data over a
calendar week.
• Schoolwide/Single Day: A trained team of 6-10 sets up a testing
area, cycles students through, and collects all data in one school day.
• Schoolwide/Multiple Days: A trained team of 4-8 either goes to
classrooms or creates a central testing location, completing the
assessment over multiple days.
• Within-Grade: Data collectors at a grade level norm the entire grade,
with students kept busy with another activity (e.g., video) when not
being screened.
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data.
University of Oregon: Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf
Steps in Creating a Process for Local Norming Using CBM Measures
3. Select dates for screening data collection. Data
collection should occur at least three times per year, in
fall, winter, and spring. Consider:
• Avoiding screening dates within two weeks of a major
student break (e.g., summer or winter break).
• Coordinating screenings to avoid state testing periods
and other major scheduling conflicts.
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data.
University of Oregon: Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf
Steps in Creating a Process for Local Norming Using CBM Measures
4. Create a preparation checklist. Important preparation
steps include:
• Selecting the location of the screening
• Recruiting screening personnel
• Ensuring that training occurs for all data collectors
• Lining up data-entry personnel (e.g., for rapid computer
data entry)
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data.
University of Oregon: Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf
Local Norms: Set a Realistic Timeline for 'Phase-In' (Stewart & Silberglit, 2008)
“If local norms are not already being collected, it may be
helpful to develop a 3-5 year planned rollout of local
norm data collection, reporting, and use in line with
other professional development and assessment goals
for the school. This phased-in process of developing
local norms could start with certain grade levels and
expand to others.” p. 229
Source: Stewart, L. H. & Silberglit, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes
(Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
Team Activity: Discuss a Plan to Conduct an
Academic Screening in Your School or District
Directions:
• Review the relevant materials in your handout
that relate to school-wide screening tools:
– Elementary screening: literacy: pp. 2-5
– Middle and high school screening: pp. 17-34
• Discuss how you might create a building-wide
academic and/or behavioral screening process
for your school or expand/improve the one you
already have.
• Be prepared to report out to the larger group.
Monitoring Student Academic or
General Behaviors:
Daily Behavior Report Cards
Daily Behavior Report Cards (DBRCs) Are…
brief forms containing student behavior-rating
items. The teacher typically rates the student daily
(or even more frequently) on the DBRC. The
results can be graphed to document student
response to an intervention.
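Since DBRC results are graphed over time, a simple way to chart them is to average the item ratings into one data point per day. A hedged sketch (the 1-5 rating scale and the data are invented for illustration):

```python
# Sketch: summarize daily DBRC ratings for graphing.
# Each day the teacher rates several behavior items (here on a 1-5 scale);
# averaging the items gives one chartable point per day. Data are invented.

def daily_means(ratings_by_day):
    """ratings_by_day: dict of day -> list of item ratings -> dict of day -> mean."""
    return {day: sum(items) / len(items) for day, items in ratings_by_day.items()}

week = {
    "Mon": [3, 2, 4],
    "Tue": [3, 3, 4],
    "Wed": [4, 4, 5],
}
points = daily_means(week)  # one value per day, ready to plot as a time series
```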
http://www.directbehaviorratings.com/
Daily Behavior Report Cards Can Monitor…
• Hyperactivity
• On-Task Behavior (Attention)
• Work Completion
• Organization Skills
• Compliance With Adult Requests
• Ability to Interact Appropriately With Peers
Daily Behavior Report Card: Daily Version (sample card for student Jim Blalock, teacher Mrs. Williams, Rm 108, May 5)
Daily Behavior Report Card: Weekly Version (sample card for Jim Blalock, Mrs. Williams, Rm 108, week of 05/05/07-05/09/07)
Daily Behavior Report Card: Chart
Establishing RTI Guidelines to Diagnose Learning Disabilities: What Schools Should Know
Jim Wright
www.interventioncentral.org
Using RTI to Determine Special Education
Eligibility: Building the Foundation
• Ensure Tier 1 (Classroom) Capacity to Carry Out
Quality Interventions. The classroom teacher is the
‘first responder’ available to address emerging student
academic concerns. Therefore, general-education
teachers should have the capacity to define student
academic concerns in specific terms, independently
choose and carry out appropriate evidence-based Tier 1
(classroom) interventions, and document student
response to those interventions.
Tier 1 (Classroom) Interventions: Building Your
School’s Capacity
 Train Teachers to Write Specific, Measurable, Observable
'Problem Identification Statements.'
 Inventory Tier 1 Interventions Already in Use.
 Create a Standard Menu of Evidence-Based Tier 1
Intervention Ideas for Teachers.
 Establish Tier 1 Coaching and Support Resources.
 Provide Classroom (Tier 1) Problem-Solving Support to
Teachers.
 Set Up a System to Locate Additional Evidence-Based Tier 1
Intervention Ideas.
 Create Formal Guidelines for Teachers to Document Tier 1
Strategies.
 Develop Decision Rules for Referring Students from Tier 1 to
Higher Levels of Intervention.
Using RTI to Determine Special Education
Eligibility: Building the Foundation
• Collect Benchmarking/Universal Screening Data on
Key Reading and Math (and Perhaps Other)
Academic Skills for Each Grade Level. Benchmarking
data is collected on all students at least three times per
year (fall, winter, spring). Measures selected for
benchmarking should track student fluency and
accuracy in basic academic skills that are key to
success at each grade level.
Using RTI to Determine Special Education
Eligibility: Building the Foundation
• Hold ‘Data Meetings’ With Each Grade Level. After
each benchmarking period (fall, winter, spring), the
school organizes data meetings by grade level. The
building administrator, classroom teachers, and perhaps
other staff (e.g., reading specialist, school psychologist)
meet to:
– review student benchmark data.
– discuss how classroom (Tier 1) instruction should be
changed to accommodate the student needs revealed in the
benchmarking data.
– select students for Tier 2 (supplemental group)
instruction/intervention.
Tier 2: Supplemental (Group-Based) Interventions
Tier 2 interventions are typically delivered in small-group
format. About 15% of students in the typical school will
require Tier 2/supplemental intervention support.
Group size for Tier 2 interventions is limited to 4-6 students.
Students placed in Tier 2 interventions should have a shared
profile of intervention need.
The reading progress of students in Tier 2 interventions is
monitored at least 1-2 times per month.
Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and
secondary schools. New York: Routledge.
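The grouping rules above (a shared profile of intervention need, group size capped at 4-6 students) can be sketched as a simple partitioning step. The roster and need labels below are invented for illustration:

```python
# Sketch: form Tier 2 groups of at most 6 students who share an intervention
# need. Student records and the "need" labels are invented placeholders.

def form_tier2_groups(students, max_size=6):
    """students: list of (name, need) tuples -> list of (need, names) groups."""
    by_need = {}
    for name, need in students:
        by_need.setdefault(need, []).append(name)
    groups = []
    for need, names in by_need.items():
        # Split any oversized pool so each group stays within the size cap.
        for i in range(0, len(names), max_size):
            groups.append((need, names[i:i + max_size]))
    return groups

roster = [("A", "decoding"), ("B", "fluency"), ("C", "decoding"),
          ("D", "decoding"), ("E", "fluency")]
groups = form_tier2_groups(roster)
```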
Using RTI to Determine Special Education
Eligibility: Creating Decision Rules
• Establish the Minimum Number of Intervention
Trials Required Prior to a Special Education
Referral. Your district should require a sufficient
number of intervention trials to definitively rule out
instructional variables as possible reasons for student
academic delays. Many districts require that at least
three Tier 2 (small-group supplemental) / Tier 3
(intensive, highly individualized) intervention trials be
attempted before moving forward with a special
education evaluation.
Using RTI to Determine Special Education
Eligibility: Creating Decision Rules
• Determine the Minimum Timespan for Each Tier 2 or
Tier 3 Intervention Trial. An intervention trial should
last long enough to show definitively whether it was
effective. One expert recommendation (Burns &
Gibbons, 2008) is that each academic intervention trial
should last at least 8 instructional weeks to allow
enough time for the school to collect sufficient data to
generate a reliable trend line.
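The trend line that an 8-week trial supports can be fit with ordinary least squares over the weekly progress-monitoring scores. A sketch in plain Python (the 8-week data series is invented):

```python
# Sketch: fit a trend line (slope = units gained per week) to weekly CBM
# scores using plain least squares. The 8 data points are invented.

def trend_slope(scores):
    """scores: one score per week, in order. Returns slope per week."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

weekly_wcpm = [42, 44, 47, 46, 50, 53, 55, 58]  # 8 instructional weeks
slope = trend_slope(weekly_wcpm)  # about 2.25 WCPM gained per week
```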
Using RTI to Determine Special Education
Eligibility: Creating Decision Rules
• Define the Level of Student Academic Delay That Will
Qualify as a Significant Skill Discrepancy. Not all students
with academic delays require special education services;
those with more modest deficits may benefit from generaleducation supplemental interventions alone. Your district
should develop guidelines for determining whether a student’s
academic skills should be judge as significantly delayed when
compared to those of peers:
– If using local Curriculum-Based Measurement norms, set an
appropriate ‘cutpoint’ score (e.g., at the 10th percentile). Any student
performing below that cutpoint would be identified as having a
significant gap in skills.
– If using reliable national or research norms (e.g., reading fluency
norms from Hasbrouck & Tindal, 2004), set an appropriate ‘cutpoint’
score (e.g., at the 10th percentile). Any student performing below
that cutpoint would be identified as having a significant gap in skills.
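Either cutpoint rule can be applied mechanically once a norm sample exists. A sketch (the percentile method and scores are illustrative; a real implementation would follow the norm tables published for the chosen measure):

```python
# Sketch: flag students scoring below a norm-based cutpoint (e.g., the 10th
# percentile). The nearest-rank percentile method and data are illustrative.

def percentile_cutpoint(norm_scores, pct):
    """Return the score at roughly the given percentile of the norm sample."""
    ordered = sorted(norm_scores)
    idx = max(0, int(round(pct / 100.0 * (len(ordered) - 1))))
    return ordered[idx]

def flag_below(students, cutpoint):
    """students: dict of name -> score. Returns names scoring below the cut."""
    return [name for name, score in students.items() if score < cutpoint]

norms = [20, 35, 41, 48, 52, 55, 60, 63, 70, 78, 85]  # local norm sample
cut = percentile_cutpoint(norms, 10)
flagged = flag_below({"A": 18, "B": 50, "C": 33}, cut)
```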
Using RTI to Determine Special Education
Eligibility: Creating Decision Rules
• Define the Rate of Student Progress That Will
Qualify as a Significant Discrepancy in Rate of
Learning. The question of whether a student has made
adequate progress when on intervention is complex.
While each student case must be considered on its own
merits, your district can bring consistency to
the process of judging the efficacy of interventions by
discussing the following factors…
Using RTI to Determine Special Education
Eligibility: Creating Decision Rules
• Define the Rate of Student Progress That Will Qualify as a
Significant Discrepancy in Rate of Learning (Cont.).
– Define ‘grade level performance’. The goal of academic
intervention is to bring student skills to grade level. However,
your district may want to specify what is meant by ‘grade
level’ performance. Local CBM norms or reliable national or
research norms can be helpful here. The district can set a
cutpoint that sets a minimum threshold for ‘typical student
performance’ (e.g., 25th percentile or above on local or
research norms). Students whose performance is above the
cutpoint would fall within the ‘reachable, teachable range’ and
could be adequately instructed by the classroom teacher.
Estimate the academic skill gap between the
target student and typically performing peers.
There are three general methods for estimating the
'typical' level of academic performance at a grade level:
• Local Norms: A sample of students at a school are screened in
an academic skill to create grade norms (Shinn, 1989).
• Research Norms: Norms for 'typical' growth are derived from a
research sample, published, and applied by schools to their own
student populations (e.g., Shapiro, 1996).
• Criterion-Referenced Benchmarks: A minimum level, or
threshold, of competence is determined for a skill. The
benchmark is usually defined as a level of proficiency needed for
later school success (Fuchs, 2003).
Using RTI to Determine Special Education
Eligibility: Creating Decision Rules
• Define the Rate of Student Progress That Will Qualify as a
Significant Discrepancy in Rate of Learning (Cont.).
– Set ambitious but realistic goals for student improvement.
When an intervention plan is put into place, the school should
predict a rate of student academic improvement that is
ambitious but realistic. During a typical intervention series, a
student usually works toward intermediate goals for
improvement, and an intermediate goal is reset at a higher
level each time that the student attains it.
The school should be able to supply a rationale for how it set
goals for rate of student improvement.
• When available, research guidelines (e.g., in oral reading fluency) can
be used.
• Or the school may use local norms to compute improvement goals.
• Sometimes the school must rely on ‘expert opinion’ if research or
local norms are not available.
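Whichever source supplies the growth rate (research guidelines, local norms, or expert opinion), the goal computation itself is simple arithmetic. A sketch with invented figures:

```python
# Sketch: compute an intermediate improvement goal from a baseline score and
# an ambitious weekly growth rate. The rate and scores are illustrative only.

def improvement_goal(baseline, weekly_growth, weeks):
    """Projected score if the student gains weekly_growth units each week."""
    return baseline + weekly_growth * weeks

# E.g., an ORF baseline of 40 WCPM and an ambitious growth rate of 1.5
# WCPM/week give an 8-week intermediate goal of 52 WCPM; once the student
# attains it, the goal is reset at a higher level.
goal = improvement_goal(baseline=40, weekly_growth=1.5, weeks=8)
```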
Using RTI to Determine Special Education
Eligibility: Creating Decision Rules
• Define the Rate of Student Progress That Will Qualify as a
Significant Discrepancy in Rate of Learning (Cont.).
– Decide on a reasonable time horizon to ‘catch’ the student up
with his or her peers. Interventions for students with serious
academic delays cannot be successfully completed
overnight. It is equally true, though, that interventions cannot
stretch on without end if the student fails to make adequate
progress. Your district should decide on a reasonable span of
time in which a student on intervention should be expected to
close the gap and reach grade level performance (e.g., 12
months). Failure to close that gap within the expected
timespan may be partial evidence that the student requires
special education support.
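One way to sanity-check a proposed time horizon is to project how long gap closure would take at the student's current growth rate relative to peers. A sketch with invented figures:

```python
# Sketch: estimate weeks needed to close a skill gap. The student must grow
# faster than peers to catch up; all figures here are illustrative.

def weeks_to_close_gap(student_score, peer_score, student_rate, peer_rate):
    """Return projected weeks to catch up, or None if the gap will not close."""
    gap = peer_score - student_score
    closing_rate = student_rate - peer_rate  # extra growth per week
    if closing_rate <= 0:
        return None  # the gap is static or widening at these rates
    return gap / closing_rate

# Student at 40 WCPM gaining 2.0/week; peers at 70 WCPM gaining 1.0/week:
weeks = weeks_to_close_gap(40, 70, 2.0, 1.0)  # 30 weeks to reach peer level
```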
Using RTI to Determine Special Education
Eligibility: Creating Decision Rules
• Define the Rate of Student Progress That Will Qualify as a
Significant Discrepancy in Rate of Learning (Cont.).
– View student progress-monitoring data in relation to peer
norms. When viewed in isolation, student progress-monitoring
data tell only part of the story. Even if a student
shows modest progress, they may still be falling farther and
farther behind their peers in the academic skill of concern.
Your district should evaluate student progress relative to
peers. If the skill gap between the student and their peers (as
determined through repeated school-wide benchmarking)
continues to widen, despite the school’s most intensive
intervention efforts, this may be partial evidence that the
student requires special education support.
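The widening-gap check can be operationalized by computing the student-to-peer gap at each benchmark period and testing whether it grows. A sketch with invented benchmark data:

```python
# Sketch: test whether the gap between a student and the peer median widens
# across fall/winter/spring benchmarks. All data are invented for illustration.

def gap_series(student_scores, peer_medians):
    """Gap (peer median minus student score) at each benchmark period."""
    return [p - s for s, p in zip(student_scores, peer_medians)]

def gap_is_widening(gaps):
    """True if each successive benchmark shows a larger gap than the last."""
    return all(later > earlier for earlier, later in zip(gaps, gaps[1:]))

student = [30, 36, 41]   # fall, winter, spring scores
peers = [55, 68, 80]     # peer medians at the same benchmarks
gaps = gap_series(student, peers)
widening = gap_is_widening(gaps)  # modest growth, yet still losing ground
```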
Using RTI to Determine Special Education
Eligibility: Creating Decision Rules
• Define the Rate of Student Progress That Will Qualify as a
Significant Discrepancy in Rate of Learning (Cont.).
– Set uniform expectations for how progress-monitoring data
are presented at special education eligibility meetings. Your
district should adopt guidelines for schools in collecting and
presenting student progress-monitoring information at special
education eligibility meetings. For example, it is
recommended that curriculum-based measurement or similar
data be presented as time-series charts. These charts should
include trend lines to summarize visually the student’s rate of
academic growth, as well as a ‘goal line’ indicating the
intermediate or final performance goal toward which the
student is working.
Confidence in the Eligibility Decision (graphic: each step builds on the one before it, from foundation to top):
1. Ensure Tier 1 (Classroom) Capacity to Carry Out Quality Interventions.
2. Collect Benchmarking/Universal Screening Data for Each Grade Level.
3. Hold Data Meetings to Make Tier 2 Group Placements for Each Grade Level.
4. Establish the Minimum Number of Intervention Trials Required Prior to a Special Education Referral.
5. Determine the Minimum Timespan for Each Tier 2 or Tier 3 Intervention Trial.
6. Define the Level of Student Academic Delay That Will Qualify as a Significant Skill Discrepancy.
7. Define the Rate of Student Progress That Will Qualify as a Significant Discrepancy in Rate of Learning.
Curriculum-Based Measurement
Lab
“…One way I have used the Maze in the past at the
secondary level is as a targeted screener to determine an
instructional match between the student and the text
materials. By screening all students on one to three Maze
samples from the text and/or books that were planned for
the course, we could find the students who could not
handle the materials without support (study guides,
highlighted texts, alternative reading material). …This
assessment is efficient and it seems quite reliable in
identifying the potential underachievers, achievers, and
overachievers. The real payback is that success can be
built into the courses from the beginning, by providing
learning materials and supports at the students'
instructional levels.”
Lynn Pennington, Executive Director, SSTAGE
(Student Support Team Association for Georgia Educators)
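The instructional-match screen Pennington describes reduces to comparing each student's Maze score against cut scores. A sketch (the thresholds are invented placeholders; actual cut scores would come from the Maze measure's norms):

```python
# Sketch: classify students' instructional match from Maze screening scores.
# The cut scores (10 and 25 correct restorations) are invented placeholders.

def instructional_match(maze_score, frustration_cut=10, independent_cut=25):
    """Map a Maze score to a support category."""
    if maze_score < frustration_cut:
        return "needs support"   # study guides, alternative materials
    if maze_score < independent_cut:
        return "instructional"   # can handle the text with typical teaching
    return "independent"         # can handle the text without support

levels = {name: instructional_match(score)
          for name, score in {"A": 6, "B": 18, "C": 30}.items()}
```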
Team Activity: Exploring Data Tools on the Internet
Directions:
• Consider the free CBM and other data tools
demonstrated during this workshop.
• Discuss how your school might experiment
with or pilot the use of some of these
measures to discover whether they might be
useful universal screening tools or assessment
options for Tier 1 (classroom) practice.