D (as in data):
The critical letter in successful
implementation of RTI/SRBI
Drs. Sandy Chafouleas & Mike Coyne
CBER Research Collaborative
April 30, 2010
PURPOSE
YOU ASKED FOR IT…
A presentation on systems-level data-based
decision making and RtI/SRBI.
What is RTI?
Approach for redesigning & establishing
teaching & learning environments that are
effective, efficient, relevant, & durable for
all students, families & educators
• NOT program, curriculum, strategy,
intervention
• NOT limited to special education
• NOT new
What is “response to
intervention”?
BASIC QUESTION: How do we know if X is working?
• Foundations within data-based decision making
• Roots of data-based decision making come from the problem-solving model
• Process involved in “problem-solving” is ancient
– model became clearly articulated within psychology and then education through applied behavior analysis --- behavioral consultation
What is the problem?
Why is it occurring?
What should we do about it?
Did it work?
(Bergan, 1977; Bergan & Kratochwill, 1990; Tilly, 2009; Reschly & Bergstrom, 2009)
Foundations for RTI
Problem Solving Model (a continuous cycle):
Define the Problem → Develop a Plan → Implement Plan → Evaluate
“Train and Hope” Model (a continuous cycle):
REACT to Problem Behavior → Select & ADD Practice → Hire EXPERT to Train Practice → Expect, But HOPE for Implementation → WAIT for New Problem
REVIEW: Why do we need data?
Purposes of Assessment (emphasized by the National Center on Response to Intervention):
• Screening
• Progress Monitoring
• Diagnosis
• Evaluation
Why do I need data?
• At what level should the problem be solved? (Universal, Targeted, Select)
• What is the purpose of assessment? (Screening, Progress Monitoring, Evaluation, Diagnosis)
Which data do I need?
• Which tools are best matched to assess the behavior of interest? (Contextual relevance)
• What decisions will be made using these data? (Psychometric Adequacy)
• What resources are available to collect data? (Usability)
Which tools can answer these questions?
Adapted from Chafouleas, Riley-Tillman, & Sugai, 2007
Selecting Appropriate Data Sources
How do we balance this across RTI tiers?
• Measurement Concerns: Type of Assessment, Measurement Targets, Psychometric Properties
• Feasibility Concerns: Time, Staff Resources, Obtrusiveness
Adapted from Briesch & Volpe (2007)
What is available to guide
“data” decisions?
National Association of State
Directors of Special Education
www.nasdse.org
A LITTLE BACKGROUND ON
THE BLUEPRINTS…
Blueprint Format…
The Blueprints address the following
key points:
• There are critical components of RtI implementation that if not attended to
can render otherwise acceptable implementations ineffective.
• The school building is the unit of change in RtI. Multiple buildings within a
district can implement RtI, but their implementations will likely be
somewhat different.
• District-level supports must be systematically built in to support building-level implementation.
• State-level supports must be systematically built to support district- and
building-level implementation.
• Building change should be guided by the answers to key questions. By
answering a specific set of interrelated questions, using the scientific
research and site-based data, buildings can be assured that they are
implementing the major components of RtI. Specific mandated answers to
these questions should not be imposed uniformly across all buildings.
Source: NASDSE blueprint on RTI implementation (school building level)
Student outcome data are crucial to:
• make accurate decisions about the effectiveness of general and
remedial education instruction/interventions;
• undertake early identification/intervention with academic and
behavioral problems;
• prevent unnecessary and excessive identification of students with
disabilities;
• make decisions about eligibility for special programs, including
special education; and
• determine individual education programs and deliver and evaluate
special education services.
Source: NASDSE blueprint on RTI implementation (school building level)
Three “components” to
RTI implementation
1. Consensus building – where RtI concepts are communicated
broadly to implementers and the foundational “whys” are taught,
discussed and embraced.
2. Infrastructure building – where sites examine their
implementations against the critical components of RtI, find
aspects that are being implemented well and gaps that need to
be addressed. Infrastructure building centers around closing
these practice gaps.
3. Implementation – where the structures and supports are put in
place to support, stabilize and institutionalize RtI practices into a
new “business as usual.”
Source: NASDSE blueprint on RTI implementation (school building level)
“D” CONSIDERATIONS
RELEVANT TO COMPONENT 2:
INFRASTRUCTURE…
Component 2: Infrastructure
Action 1. Form a leadership team
Step 1: Assign roles.
– Data Mentor
– Content specialist
– Facilitator
– Staff liaison
– Instructional leader/resource allocation
Source: NASDSE blueprint on RTI implementation (school building level)
Source: CT SRBI document
WHO SERVES AS THE DATA
MENTOR IN YOUR SETTING?
What resources are needed to facilitate positive
outcomes for this person(s)?
Component 2: Infrastructure
Action 3: The leadership team will work through
ten basic questions to develop action plans.
Question 1: Is our core program sufficient?
– Identify screening tool, identify proficiency cut points, collect universal screening data, organize/summarize/display data, determine acceptable % proficiency, identify % of students meeting proficiency, make comparison, determine what works/doesn’t work (a scripted sketch of this arithmetic follows below)
Source: NASDSE blueprint on RTI implementation (school building level)
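To make Question 1’s data steps concrete, here is a minimal scripted sketch of the proficiency-summary arithmetic. All values (scores, cut point, acceptable percentage) are invented for illustration; real ones would come from your chosen screening tool and local decisions.

```python
# Hypothetical sketch: summarizing universal screening data against a
# proficiency cut point. Scores, cut point, and acceptable % are invented.

screening_scores = {"Ava": 41, "Ben": 27, "Cam": 35, "Dee": 19, "Eli": 44}
cut_point = 30          # assumed proficiency cut point for the chosen tool
acceptable_pct = 80.0   # assumed locally determined threshold for "sufficient"

n_proficient = sum(score >= cut_point for score in screening_scores.values())
pct_proficient = 100.0 * n_proficient / len(screening_scores)

print(f"{pct_proficient:.0f}% of students met the cut point")
print("Core program sufficient?", pct_proficient >= acceptable_pct)
```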
WHAT ARE THE KEY
FEATURES OF “GOOD”
SCREENING TOOLS?
SRBI… Tier I
Source: CT SRBI document
Example: School-wide behavior
using ODR
Tabular Display (2004-05, Enrollment 856):
Month   Aug  Sep  Oct  Nov  Dec  Jan  Feb  Mar  Apr  May  Jun
# ODR    68  123  138  123  133  123  117  108  135  112   54

Graphic Display: [bar chart of # ODR per month, 2004-05]
Display organization can vary by: referrals per day per month, behavior,
location, time of day, students with 3 ODRs, etc.
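As a hedged illustration of one reorganization named above, here is a sketch converting the monthly ODR totals into referrals per school day per month; the school-day counts are assumed, not from the source.

```python
# Hypothetical sketch: reorganizing the monthly ODR display into referrals
# per school day per month. ODR totals are from the slide's table; the
# school-day counts per month are assumed.

odr_by_month = {"Aug": 68, "Sep": 123, "Oct": 138, "Nov": 123, "Dec": 133,
                "Jan": 123, "Feb": 117, "Mar": 108, "Apr": 135, "May": 112,
                "Jun": 54}
school_days = {"Aug": 10, "Sep": 21, "Oct": 21, "Nov": 18, "Dec": 15,
               "Jan": 20, "Feb": 18, "Mar": 22, "Apr": 20, "May": 21,
               "Jun": 9}  # assumed school calendar

for month, total in odr_by_month.items():
    print(f"{month}: {total / school_days[month]:.1f} ODRs per school day")
```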
SRBI… Tier I
What are your Tier 1
assessment and data
analysis/decision
making strategies
across domains?
Source: NASDSE blueprint on RTI implementation (school building level)
Source: CT SRBI document
• Measurement Concerns: Type of Assessment, Measurement Targets, Psychometric Properties
• Feasibility Concerns: Time, Staff Resources, Obtrusiveness
Are those tools appropriate, relevant, and efficient?
Question 4: How will the sufficiency and effectiveness of the core program be monitored over time?
• Step 1: Determine key indicators of success.
• Next steps include: determine baseline, establish goals, develop the collection plan, schedule time to analyze data (a brief sketch follows below)
Source: NASDSE blueprint on RTI implementation (school building level)
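A minimal sketch of what those next steps might look like once data arrive: comparing a key indicator against its baseline and goal at each scheduled analysis point. Baseline, goal, and checkpoint values are invented for illustration.

```python
# Hypothetical sketch of Question 4's next steps: checking a key indicator
# of success against baseline and goal at each scheduled analysis point.
# Baseline, goal, and checkpoint values are invented.

baseline = 62.0   # assumed baseline: % of students meeting the indicator
goal = 80.0       # assumed locally established goal

checkpoints = {"Fall": 64.0, "Winter": 71.0, "Spring": 78.0}

for period, value in checkpoints.items():
    status = "goal met" if value >= goal else "below goal"
    print(f"{period}: {value:.0f}% ({value - baseline:+.0f} vs. baseline, {status})")
```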
Think back to the
previous question
on measures…
How are you
doing now?
Which of these
areas might need
further
discussion?
EXAMPLE:
APPLICATION OF DBR-SIS IN
TIER I – CLASSWIDE
Example Scale Formats for DBR
[Figure: examples of DBR scale formats]
Source: Chafouleas, Riley-Tillman, & Christ (2009)
Case Study Example:
Classwide Assessment
Riley-Tillman, Methe, & Weegar
(2009)
• Sample: First grade classroom
with 14 students
• Design: B-A-B-A
• Intervention: modeling and
prompting of silent reading
• Measures: researcher-completed SDO, teacher-completed DBR-SIS
• Conclusion: DBR data can be sensitive to classroom-level intervention effects, maps closely to resource-intensive SDO
Phase Means:
        B1   A1   B2   A2
DBR     72   45   63   42
SDO     68   49   61   50
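For readers who want the mechanics, here is a minimal sketch of computing phase means in a B-A-B-A design; the daily ratings below are invented, not the study’s raw data.

```python
# Hypothetical sketch: computing phase means from daily classwide ratings
# in a B-A-B-A design. The ratings below are invented, not the study data.

from statistics import mean

phases = {
    "B1": [70, 74, 72],  # intervention in place
    "A1": [44, 46, 45],  # intervention withdrawn
    "B2": [62, 64, 63],  # intervention reinstated
    "A2": [41, 43, 42],  # withdrawn again
}

for phase, ratings in phases.items():
    print(f"{phase}: phase mean = {mean(ratings):.0f}")
```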
Example: Tier I/II Classwide
Intervention in Middle School
Chafouleas, Sanetti, Jaffery &
Fallon (in prep)
• Sample: 8th grade, 2 teachers and 3 classrooms (17-24 students)
• Design: Multiple baseline across classrooms
• Intervention: Self-monitoring and a group contingency package, implemented over about 2 months
• Measures: student-completed DBR (teacher-checked), researcher-completed SDO
• Conclusion: Classwide intervention overall effective; think about target identification and need for supports based on baseline
DBR-SM and SDO Data Across Classes

Ms. S – Period 5        Baseline M (SD)   Phase 1 M (SD)   Phase 2 M (SD)
  DBR-SM Prepared.      7.9 (2.03)        7.6 (1.95)       8.8 (1.33)
  DBR-SM Engagement     6.4 (2.80)        6.8 (2.31)       8.0 (1.71)
  SDO Engagement        36.2 (12.51)      79.0 (5.08)      83.1 (.34)
  SDO Off-Task          70.4 (7.60)       30.7 (6.30)      21.7 (8.16)

Ms. B – Period 3
  DBR-SM Prepared.      9.6 (1.05)        9.9 (0.48)       9.9 (0.24)
  DBR-SM Engagement     8.6 (1.36)        9.3 (0.99)       9.6 (0.76)
  SDO Engagement        75.9 (5.68)       86.7 (2.36)      86.7 (5.87)
  SDO Off-Task          34.7 (4.58)       19.2 (5.53)      16.7 (6.41)

Ms. S – Period 1
  DBR-SM Prepared.      8.1 (1.90)        8.3 (1.35)       8.9 (0.92)
  DBR-SM Engagement     7.4 (2.02)        7.8 (1.59)       8.1 (1.35)
  SDO Engagement        57.9 (7.75)       71.0 (13.86)     80.6 (14.94)
  SDO Off-Task          47.5 (5.00)       34.6 (20.78)     28.9 (14.18)
REMEMBER… WE ARE STILL
IN TIER 1 (ALL STUDENTS)!
Question 6: For which students is the core
instruction sufficient or not sufficient? Why or
why not?
• This is where decision making moves to small group
and individual decision making.
• Plan for, and allocate, sufficient time for data
analysis.
• This step can be completed with varying levels of
rigor. Screening data can be used to address many
of these questions. The more serious the student
problems, the more in-depth the problem analysis
should be…
Source: NASDSE blueprint on RTI implementation (school building level)
SRBI…Tier II
What are your Tier II
assessment and data
analysis/decision
making strategies
across domains?
• Measurement Concerns: Type of Assessment, Measurement Targets, Psychometric Properties
• Feasibility Concerns: Time, Staff Resources, Obtrusiveness
Are those tools appropriate, relevant, and efficient?
Source: CT SRBI document
Example: Early Identification and
Monitoring using “Local” Norms
Chafouleas, Kilgus, & Hernandez (2009)
• Sample: full day K inclusive
classroom, 2 teachers and 22
students
• Measures: teacher-completed
DBR-SIS following am and
pm over Nov-March for ALL
students
• Conclusion: “Local” cut-score
comparisons can be useful in
examining individual student
performance. Periodic reassessment of all may be
needed to re-confirm
appropriate comparison
Target Behavior        Rating Time   FALL M (SD)    SPRING M (SD)
Academic Engagement    AM            8.72 (1.31)    9.40 (0.63)
                       PM            8.25 (2.03)    9.37 (0.88)
Disruptive Behavior    AM            1.30 (1.47)    0.60 (0.62)
                       PM            1.61 (2.08)    0.42 (0.52)
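A hedged sketch of the “local” cut-score idea: deriving a classwide norm from everyone’s DBR-SIS ratings and flagging students who fall well below it. The ratings and the one-standard-deviation rule are illustrative assumptions, not the study’s procedure.

```python
# Hypothetical sketch: a "local" cut score from classwide DBR-SIS ratings.
# Ratings are invented; the 1-SD rule is an assumed local decision, not the
# study's procedure.

from statistics import mean, stdev

engagement = {"S1": 9.2, "S2": 8.8, "S3": 6.1, "S4": 9.5, "S5": 8.9, "S6": 5.4}

ratings = list(engagement.values())
cut_score = mean(ratings) - stdev(ratings)  # flag if well below class norm

flagged = [s for s, r in engagement.items() if r < cut_score]
print(f"Local cut score: {cut_score:.2f}; flagged for follow-up: {flagged}")
```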
Example: Early Identification using “Cut-Points”
Example DBR-SIS with BESS Criterion
Kilgus, Chafouleas, Riley-Tillman, & Welsh (in prep)
• Purpose: To evaluate the diagnostic accuracy of all possible DBR-SIS cut scores (Disruptive Behavior, Academic Engagement, Compliance)
• Sample: Second grade teachers and randomly selected students in their classrooms
• Measures: teacher-completed DBR-SIS following am and pm over 1 week, BESS and SSiS Performance Screener
• Analyses: Diagnostic accuracy statistics
• Conclusion: DBR may provide efficient initial identification of potential risk, but may need to be confirmed through complementary measures. Findings suggest interpretation of DBR-SIS “cut-score” may be highly dependent on what is considered to be a “true” indicator of school-based behavioral difficulty.

Decision framework, with the condition established via the “gold standard” criterion:

                 Condition Positive           Condition Negative
Test Positive    TRUE Pos.                    FALSE Pos. (Type I error)
Test Negative    FALSE Neg. (Type II error)   TRUE Neg.

Sensitivity (SS) is computed within the condition-positive column and specificity (SP) within the condition-negative column; positive predictive value (PPP) within the test-positive row and negative predictive value (NPP) within the test-negative row.

Diagnostic accuracy by cut score:

Behavior       Cut Score   SS     SP     PPP    NPP
Disruptive     1.210       .917   .615   .373   .967
Behavior       1.530       .875   .698   .420   .957
               1.580       .833   .698   .408   .944
               1.845       .792   .771   .463   .937
Academic       7.165       .792   .844   .559   .942
Engagement     7.365       .833   .823   .541   .952
               7.895       .875   .771   .488   .961
               8.055       .917   .719   .449   .972
               8.410       .958   .677   .426   .985
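As a sketch of the diagnostic accuracy statistics named above, computed from a 2x2 table; the counts are invented, since the study’s raw data are not shown here.

```python
# Sketch of the diagnostic accuracy statistics named on the slide, computed
# from a 2x2 table. The counts are invented; the study's raw data are not
# shown here.

true_pos, false_neg = 22, 2    # criterion-positive students (e.g., BESS risk)
false_pos, true_neg = 30, 66   # criterion-negative students

sensitivity = true_pos / (true_pos + false_neg)   # SS
specificity = true_neg / (true_neg + false_pos)   # SP
ppp = true_pos / (true_pos + false_pos)           # positive predictive value
npp = true_neg / (true_neg + false_neg)           # negative predictive value

print(f"SS={sensitivity:.3f}  SP={specificity:.3f}  PPP={ppp:.3f}  NPP={npp:.3f}")
```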
SRBI… Tier III
What are your Tier
III assessment and
data analysis/decision
making strategies
across domains?
• Measurement Concerns: Type of Assessment, Measurement Targets, Psychometric Properties
• Feasibility Concerns: Time, Staff Resources, Obtrusiveness
Are those tools appropriate, relevant, and efficient?
Source: CT SRBI document
Example: Middle School
Behavior – Tier II/III
Chafouleas, Sanetti,
Jaffery & Fallon (in prep)
• Sample: Single “target” student
• Design: A-B
• Intervention: same SM/incentive implemented over about 2 months
• Measures: student-completed DBR (teacher-checked)
• Conclusion: “classwide-based intervention” insufficient; intervention needs modification and more intensive monitoring/feedback
REMEMBER THIS:
Three “components” to RTI implementation
1. Consensus building – where RtI concepts are communicated
broadly to implementers and the foundational “whys” are taught,
discussed and embraced.
2. Infrastructure building – where sites examine their
implementations against the critical components of RtI, find
aspects that are being implemented well and gaps that need to
be addressed. Infrastructure building centers around closing
these practice gaps.
3. Implementation – where the structures and supports are put in
place to support, stabilize and institutionalize RtI practices into a
new “business as usual.”
Source: NASDSE blueprint on RTI implementation (school building level)
“D” CONSIDERATIONS
RELEVANT TO
IMPLEMENTATION…
Objectives for School Level
Implementation
• The school builds its master calendar and master schedule around the
instructional needs of students.
• The needs of students with core, supplemental and intensive needs are
addressed appropriately in this structure.
• Supplemental and intensive instruction is in addition to, rather than
instead of, core instruction.
• Implementation supports are systematically built into the system and are
carried out as planned.
• Scheduled dates are identified for all assessments (screening, diagnostic
and progress monitoring).
• Scheduled dates are identified for decision-making about students’
instruction (flexible grouping).
• Sufficient expertise is available to assist the school in making data-based
decisions about students’ instruction.
• Successes, no matter how small, are celebrated by all involved.
• A project-level evaluation plan is created and put in place. Data are
collected over time.
Example: Tier I
Mrs. Brown, the elementary principal, calls you into her
office to share her frustration at the number of students
being sent to her office during lunch periods. As you look
around, there seem to be more 2nd grade students
sprawled around the main office than are actually eating
lunch in the cafeteria! So, you pull a summary of office
discipline referrals over the past month and find that 2nd
grade students from Mr. Smith’s class have had
substantially more referrals than other groups of students.
You propose a plan that entails teaching expectations for
cafeteria behavior to Mr. Smith’s students along with
instituting better monitoring of those students in the
cafeteria to support positive behavior and reduce
inappropriate behavior. In the end, you have saved the
day in Mrs. Brown’s eyes – without a significant
assessment effort or multiple individualized interventions.
Example: Tier II
For example, in the cafeteria scenario, perhaps your
discipline referral data indicate that 4 students comprise
70% of the total number of cafeteria referrals following
implementation of the intervention for Mr. Smith’s class.
Thus, this group of students might serve as the target for
secondary level prevention efforts.
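Here is a minimal sketch of how such a Tier II target group might be identified from referral counts; the names and counts are invented for illustration.

```python
# Hypothetical sketch: identifying the small group of students who account
# for most cafeteria referrals. Names and counts are invented.

from collections import Counter

referrals = Counter({"Student A": 9, "Student B": 7, "Student C": 6,
                     "Student D": 6, "Student E": 4, "Student F": 4,
                     "Student G": 4})

total = sum(referrals.values())
top_four = referrals.most_common(4)
share = 100.0 * sum(count for _, count in top_four) / total

print(f"Top 4 students account for {share:.0f}% of {total} referrals")
# These students could become the target group for Tier II supports.
```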
Example: Tier III
• In the cafeteria example, we may find that one of
Mr. Smith’s students continues to have difficulty.
Further exploration of school-wide data regarding
this student suggests behavioral problems exist
across multiple settings.
So, you may need to further investigate the reason for the
behavioral problems – such as by having multiple adults
complete a rating scale and also conducting SDO in various
settings. Those additional data, coupled with our existing
school-wide data and daily cafeteria behavior ratings, should
lead us to a better understanding of the reason for the problem
behavior as well as potential interventions that are likely to
work (i.e., are conceptually relevant).
How would this example work
in your building?
Who does the data collection and analysis?
Do you have a regular schedule for data
review?
Do you have a “map” of intervention
resources to readily link to conceptually
relevant intervention?
Summary Thoughts about “D”
PROBLEM-SOLVING
MODEL
What is the problem?
Why is it occurring?
What should we do about it?
Did it work?
STRATEGIC PLANNING
PROCESS (long-term):
1) Consensus
2) Infrastructure
3) Implementation
Source: NASDSE
Why do I need data?
• At what level should the problem be solved? (Universal, Targeted, Select)
• What is the purpose of assessment? (Screening, Progress Monitoring, Evaluation, Diagnosis)
Which data do I need?
• Which tools are best matched to assess the behavior of interest? (Contextual relevance)
• What decisions will be made using these data? (Psychometric Adequacy)
• What resources are available to collect data? (Usability)
Which tools can answer these questions?
Adapted from Chafouleas, Riley-Tillman, & Sugai, 2007
Further Resources
SRBI document
http://www.ctserc.org/s/index.php?option=com_content&view=article&id=320:srbi-connecticuts-framework-for-rti-executive-summary-and-fulldocument&catid=17:eip&Itemid=110
National Association of State Directors of Special Education, Inc.:
Response to Intervention Project
http://www.nasdse.org/Projects/ResponsetoInterventionRtIProject/tabid/411/Default.aspx
RTI Action Network (part of National Center for LD)
http://www.rtinetwork.org/
National Center on Response to Intervention
http://www.rti4success.org/
Questions, Comments,
Observations…
Thank you
mike.coyne@uconn.edu
sandra.chafouleas@uconn.edu