Using RtI to Advance Learning in Mathematics
Amanda VanDerHeyden
Education Research and Consulting, Inc.
© Amanda VanDerHeyden, Do Not Reproduce Without Permission
What do Families Want?
• Improved learning
• Transparent decisions
• Active system problem-solving
• Efficient use of resources
• What was my child’s score? What did you do differently? What effect did it have? What are we doing next?
New Assumptions with RtI
• Most children should successfully respond
to intervention.
• Most children in a class should score at
benchmark levels given adequate
instruction.
• Intervention failure should be a rare event.
Where it is not rare, implementation error
should be the first suspect.
• Instructing without assessment or
intervening without assessment data is
akin to driving without a map.
• With data, any solution becomes a
hypothesis to be tested.
• We need to focus more on supporting
solution implementation and evaluating
solutions to be sure they work.
• Effective teachers, administrators, and
schools are defined by the results they
produce.
Objectives Today
• Understand how to screen, reach
conclusions about proficiency, and correct
system and individual learning problems in
mathematics
Measurement Should
• Reynolds (1975): In today’s context the measurement technologies ought to become integral parts of instruction designed to make a difference in the lives of children and not just a prediction about their lives.
Four Purposes of Assessment
Screening: Which of my students are not meeting grade
level expectations given Universal Instruction?
Instructional decision making: What are the categories or
specific needs of my students who are struggling in math?
Monitoring Progress: What does the student’s growth look
like given the (supplemental) support s/he is receiving?
Program evaluation: How is the education system working
for students overall?
Assessment within MTSS/RtI
Type of Math Assessment
• General outcome measures (GOMs)
– Assess proficiency of global outcomes
associated with an entire curriculum
• Subskill mastery measures (SMMs)
– Assess learning of specific objective or skill
Mastery Model Measurement (CBA): each subskill is measured separately in sequence: letter naming fluency, isolated sound fluency, beginning sound fluency, ending sound fluency.

General Outcome Measurement (CBM): a single global measure, words read correctly per minute, reflects growth across all of those subskills.
Multi-Tiered Academic Interventions
Tier I: Universal screening and progress monitoring with quality core curriculum: all students
Tier II: Standardized interventions with small groups in general education: 15% to 20% of students at any time
Tier III: Individualized interventions with in-depth problem analysis in general education: 5% of students at any time
RTI and Problem-Solving
(Pyramid graphic: Tier I at the base, Tier II above it, Tier III at the apex.)
Problem Solving
• Tier I – Identify discrepancy between
expectation and performance for class or
individual (Is it a classwide problem?)
• Tier II – Identify discrepancy for individual.
Identify category of problem. Assign small
group solution. (What is the category of the
problem?)
• Tier III – Identify discrepancy for individual.
Identify causal variable. Implement
individual intervention. (What is the causal
variable?)
Tier I
Types of Math Knowledge
• Conceptual - the understanding that math
involves an interrelated hierarchical network
that underlies all math-related tasks
• Procedural - the organization of conceptual
knowledge into action to actually perform a
mathematical task (Hiebert & Lefevre, 1986).
• Which comes first?
– Sequence may be specific to the domain or the individual (Rittle-Johnson & Siegler, 1998; Rittle-Johnson, Siegler, & Wagner, 2001)
– But the two are clearly interrelated.
Challenges
• Ratio of 6:1
• NAEP data show improvements but not for
ethnic minorities and low SES students
• Lack of streamlined resources
• Insufficient instructional time allocated to
mathematics
• Math proficiency is related to post-graduation income and success in college
• Students who are not proficient and enroll in
remedial classes post-secondary are less likely
to graduate
If done “right,” potential to
• Improve outcomes
• Lower costs
• Address inequity in achievement and
placements
• Attain effects in places or programs that
have a long record of failures
Effects of RtI

(Chart: number of students served by year, ages 3-21, 1976-77 through 2008-09, for all students with disabilities and for students with SLD. Students with disabilities grew from roughly 3.7 million to a peak of about 6.7 million before declining to about 6.5 million; students with SLD grew from about 0.8 million to a peak of about 2.9 million before declining to about 2.5 million.)

SOURCE: U.S. Department of Education, Office of Special Education Programs, Annual Report to Congress on the Implementation of the Individuals with Disabilities Education Act, selected years, 1979 through 2006; and Individuals with Disabilities Education Act (IDEA) database, retrieved September 13, 2010, from http://www.ideadata.org/PartBdata.asp. National Center for Education Statistics, Statistics of Public Elementary and Secondary School Systems, 1977 and 1980; Common Core of Data (CCD), "State Nonfiscal Survey of Public Elementary/Secondary Education," 1990-91 through 2008-09. (This table was prepared September 2010.)
Students with SLD by Year (Ages 3-21)

(Chart: percent of students with disabilities diagnosed with SLD, 1976-77 through 2008-09. The share rose from 21.5% in 1976-77 to a peak of 46.3%, then declined through the 2000s to the high 30s. Source: same as the preceding chart.)
Slavin & Lake (2008)
• 87 of 256 reviewed studies met rigorous inclusion criteria
• 13 categorized as examining curricula (effect size +0.10)
• 36 categorized as computer-assisted instruction (effect size +0.19)
• 36 categorized as instructional process programs (effect size +0.33)
Conclusion
• If you want to change math learning
outcomes, you have to change the quality
of the instructional interaction between
student and teacher
• So what are the characteristics of quality
core instruction in mathematics?
Begin with Number Sense
• Numbers represent quantity and have
magnitude
• One number may be bigger than another
number or quantity
• Numbers have a fixed order with numbers later
in the sequence representing greater quantities
Griffin (2004)
– Begins with counting in sequence, counting objects,
comparing quantities, adding and subtracting
numbers. Leads to understanding of associative,
commutative, and distributive property and place
value.
Integrate Instruction for
• Procedural and operational fluency with
conceptual understanding
• e.g., emergence of the “count-on” strategy as
children’s understanding of ordinality and
associative property develop
– Estimate, discuss solutions, verify solutions, practice
application
Sequence Skills Logically and
Provide Adequate Instructional Time
• “a mile wide and an inch deep”
• Make tough decisions about which skills are
essential and ensure mastery of those skills
• The National Mathematics Advisory Panel (NMP) says:
– whole number addition/subtraction by grade 3
– multiplication/division by grade 5
– operations with fractions, decimals, percentages
– operations with positive/negative integers
– operations with positive/negative fractions
– solving percentages, ratios, and rates to balance equations
Monitor Student Progress
From VanDerHeyden (2009)
Match Instructional Strategy to
Learner Competence
New Skill
• Antecedent: cues and prompts; reduced task difficulty; narrowly defined task; increased discriminability/stimulus control
• Response: guided practice; monitor accuracy
• Consequence: immediate feedback; more elaborate feedback; repetition loop; ensure 100% correct responding
Match Instruction to Learner
Competence
Established Skill
• Antecedent: no extra or external cues; fade task difficulty; opportunities to respond; practice to mastery
• Response: monitor fluency
• Consequence: delayed corrective feedback; performance contingencies; goal setting; build fluency

Mastered Skill
• Antecedent: permit variation; range of task difficulty; application; increased range of stimuli
• Response: retention; application/generalization; response variation to build the response set
• Consequence: delayed feedback; elaborate feedback on application; improve maintenance
Student Competence | Goal of Intervention | Intervention Example
Acquisition task | Establish 100% correct responding | Cover, copy, and compare
Independent task | Build fluency | Flashcards; timed performance with incentives; response cards
Mastery task | Establish robust application | Guided practice intervention
What is Balanced Math Instruction?
Math proficiency requires:
• Ensuring acquisition of key concepts in math
• Building conceptual understanding to fluency
• Opportunities to predict, estimate, verify, and discuss solutions
• Opportunities to generalize skills to novel problems
Common Core Content
Standards
• Streamlined
• “Asking a student to understand
something means asking a teacher to
assess whether the child has understood
it.”
• Hallmark of understanding: student can
explain why a mathematical statement is
true or where a rule comes from.
Roadmap to Lesson Planning
• Do students understand? Can they do it?
• How will you
– Establish conceptual understanding?
– Build fluency?
– Provide applied practice and discussion?
Tier 1: Screening
• 3 times per year
• More frequently if problems are detected
• Probably two probes required
• Computation probes work well; consider state standards
• Math screening: 2 minutes, scored for digits correct per 2 minutes
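"Digits correct" gives partial credit for every correct digit rather than scoring whole problems right or wrong. A minimal sketch of such a scorer; the align-from-the-ones-place convention and the sample answers are illustrative assumptions, not details from these slides:

```python
def digits_correct(student_answer: str, correct_answer: str) -> int:
    """Count digits in the student's answer that match the correct answer,
    aligned from the rightmost (ones) place -- one common scoring convention."""
    s = student_answer.strip()[::-1]
    c = correct_answer.strip()[::-1]
    return sum(1 for a, b in zip(s, c) if a == b and a.isdigit())

# Probe score: total digits correct across all problems answered
# in the 2-minute administration (made-up answers for illustration).
probe = [("47", "47"), ("132", "182"), ("9", "9")]
score = sum(digits_correct(given, key) for given, key in probe)
```

Scoring by digits rather than whole problems makes the metric sensitive to partial knowledge and small week-to-week growth.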
Class-wide Screening
Feedback to Teachers
Tier 1 or 2: Class-wide Intervention
(Chart: each student's digits correct in two minutes across five sessions, baseline then class-wide intervention, plotted against the mastery and instructional-range criteria.)
No Class-wide Problem Detected
Tier 2: Can’t Do/Won’t Do Assessment
• Individually administered; 3-7 minutes per child
• Materials:
– Academic material the student performed poorly on during the class assessment
– Treasure chest: plastic box filled with tangible items
Decision Rule Following Can’t
Do/Won’t Do Assessment
Tier 3: Individual Intervention
• Conducted by classroom teacher
• Protocol based
• Follows adequate functional assessment
Response to Intervention

(Chart: number correct in reading before and during intervention, by intervention session; each dot is one day of intervention, with the class average shown for comparison.)
Response to Intervention

(Chart: a second example showing number correct before and during intervention against the class average.)
Using Screening Data to
Identify Class-wide and
System-wide Instructional
Problems
Step 1: Identify the need for Tier 1
or 2 Intervention
Screening tells you
• How is the core instruction working?
• What problems might exist that could be
addressed?
• Most bang-for-the-buck activity
• Next most high-yield activity is classwide
intervention at Tier 2.
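Whether screening flags a class-wide problem comes down to a decision rule. One commonly used rule, sketched below, flags a class-wide problem when the class median falls below the screening criterion; both the median rule and the criterion value here are assumptions for illustration, not specifics given in these slides:

```python
from statistics import median

def classwide_problem(scores, criterion):
    """Flag a class-wide problem when the class median score falls below
    the screening criterion. Rule and criterion are illustrative assumptions."""
    return median(scores) < criterion

# Digits correct per 2 minutes for one class (made-up numbers):
fall_scores = [12, 15, 18, 20, 22, 25, 31, 34, 40]
flagged = classwide_problem(fall_scores, criterion=24)  # median is 22, below 24
```

When the median is below criterion, the efficient response is class-wide intervention rather than a queue of individual referrals.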
Consider
• The task
• Integrity of administration
• Reliability of scoring
• Use software to organize the data
Mult 0-9 4th Grade Fall Screening
Whole Grade by Teacher
Grade-wide Data by Student
Look at Other Grades
How Can MTSS Help?
• Organize small groups based on student
proficiency (acquisition, fluency,
generalization)
• Use Classwide intervention to build fluency
in pre-requisite skills (I’ll explain)
• Use intensive, individualized interventions
to conduct acquisition interventions
following functional academic assessment
(I’ll show you how)
• Use screening data to connect instructional
strategies
to student proficiency
(Flowchart: Grade-wide problem? If yes, intervene grade-wide; if no, ask: Class-wide problem? If yes, intervene class-wide; if no, proceed to individual intervention.)
School-Wide Problem?
• Examine core instruction materials and
procedures
– Instructional time
– Research-supported curriculum materials
– Calendar of instruction
– Understanding and measurement of mastery
of specific learning objectives
• Establish priorities for improvement and
determine timeline
• Add a supplemental instructional program
with weekly PM
• Examine and respond to implementation effects each month. Share with the feeder pattern and connect to long-term effects.
School-Wide Problem?
(Chart: percentage of students at risk by teacher, Teachers 1 through 12, at fall, winter, and spring screenings.)
• Demographics should become more
proportionate in failure or risk groups over
time.
• Percentage of students “on track” should
improve (look at percent enrolling in and
passing algebra, AP enrollments and
scores, Percent taking and meeting ACT
benchmarks).
Grade-wide Problem?
• Examine core instruction procedures
• Begin class-wide supplement and PM
weekly
• Conduct vertical teaming with preceding and subsequent grade levels to identify strategies to ensure children attain grade-level expected skills in the future.
Small Group Problem
• Use Tier 2 time to provide more explicit
instruction following standard protocol.
• Monitor weekly. Exit students based on
post-intervention performance not in the
risk range on lesson objectives and
screening criterion.
• When most children are responding well,
identify children for Tier 3.
• About 90% of children should respond
successfully to Tier 2 intervention
• Successful responders should surpass the screening criterion at higher rates on subsequent screenings.
• Successful responders should pass high-stakes tests at higher rates than before use of Tier 2 strategies.
Individual Problem?
• Conduct individual assessment to
establish targets, identify effective
intervention, and specify baseline.
• Prepare all materials
• Monitor weekly and troubleshoot to
accelerate growth
• Most children participating in Tier 3 should respond successfully. More than 5% of the screened population needing Tier 3 is a red flag.
• Focus on integrity of intervention.
• Growth should be detectable within two
weeks.
• Troubleshoot interventions that aren’t
working.
• Successful responders to Tier 3 should fall
into risk range on subsequent screenings
at lower rates.
• Successful responders should pass high-stakes tests at higher rates.
• Unsuccessful responders should qualify
for more intensive instruction at higher
rates.
• Responders and nonresponders should be proportionate by demographics.
Start with a Helicopter View
Second Grade Math
Third Grade Math
Where system problems are detected,
deploy system interventions and:
Verify Rapid Growth in all Classes
Look for Lagging Classes and Respond
Set System Goals, Track, and Respond
(Chart: first graders, N = 250. Words read correctly per minute over 8 weeks, with linear trend lines for students in poverty versus not in poverty.)
Let’s Talk about Two Pitfalls
• Loosely Defined Model
• Over-assessment
Your Model is “Too Loose” If
• Results are inconsistent across schools
and/or over time
• There are long delays between decisions
• There are cases without a final decision
Assess Smarter
• First, select the best measures and
understand what the “hit” rate is
• No measure is perfect and adding more
measures may not (most likely will not)
increase the “hit” rate
• What do I mean by a “hit” rate?
“Hit” Rates Summarize Accuracy of Decisions
Users Must Weigh
• The costs of false positive errors and false
negative errors for each decision.
– For screening decisions, a priority is typically placed on avoiding false-negative errors.
– As a result, many screening systems burden schools with high false-positive error rates.
– High error rates cause users to lose momentum and can attenuate intervention effects system-wide.
– Collecting “more data” does not necessarily improve the hit rate.
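The hit-rate idea can be made concrete with a 2x2 decision table crossing screener decision against high-stakes outcome. A minimal sketch; the counts are invented for illustration, not taken from these slides:

```python
def hit_rates(tp, fp, fn, tn):
    """Summarize decision accuracy from a 2x2 table:
    tp = flagged at risk and failed, fp = flagged but passed,
    fn = not flagged but failed, tn = not flagged and passed."""
    sensitivity = tp / (tp + fn)              # failing students the screener caught
    specificity = tn / (tn + fp)              # passing students correctly cleared
    overall_hit = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, overall_hit

# Illustrative screening of 200 students (made-up counts):
sens, spec, hit = hit_rates(tp=30, fp=40, fn=5, tn=125)
```

With these numbers, sensitivity is high but 40 of 200 students are false positives, illustrating the burden described above.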
Case Example: More
Assessment?
About 36 weeks in the school year
180 days of instruction × 6 hours per day = 1,080 hours of instruction
– 120 hours for report card preparation
– 10 hours screening (200 minutes per class × 3 = 600 minutes per year)
– 15 hours progress monitoring (900 minutes per year: 5 min per child for 10 children, 2 times per month)
– 6 hours per year for unit tests
GRAND TOTAL: 151 hours of assessment = 14% of instructional time
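The arithmetic above can be checked directly. One assumption in this sketch: a 9-month factor for progress monitoring, chosen to match the stated 900-minutes-per-year total:

```python
# Reproduce the assessment-time arithmetic from the case example.
hours_instruction = 180 * 6                  # 180 days x 6 hours = 1,080 hours
report_cards = 120                           # hours for report card preparation
screening = 200 * 3 / 60                     # 600 minutes per year -> 10 hours
progress_monitoring = 5 * 10 * 2 * 9 / 60    # 5 min x 10 children x 2/month x 9 months
unit_tests = 6                               # hours per year
total_hours = report_cards + screening + progress_monitoring + unit_tests
share = total_hours / hours_instruction      # roughly 0.14 of instructional time
```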
Schools are Drowning in Data
and the Same Children Still
Can’t Read (or Count)
• Are we making a difference?
• Are we changing the odds?
Take an Assessment Inventory
Verify Screening Adequacy
Exploit Existing Data and Respond: First, Verify Core
Decision “Hit Rates” Can Be Examined to Know If
• Use of an assessment or intervention
improves outcomes over time (increases
the odds of student success)
• You can compute the probability of
passing or failing the high-stakes test if a
student has passed or failed a screener
(called the post-test probability)
• e.g., VanDerHeyden, A. M. (2010). Determining early
mathematical risk: Ideas for extending the research. Invited
commentary in School Psychology Review, 39, 196-202.
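The post-test probability follows from Bayes' rule given a base rate of failure plus the screener's sensitivity and specificity. A sketch with invented input values (the formula, not the numbers, is what matters here):

```python
def post_test_probability(base_rate, sensitivity, specificity):
    """P(student fails the high-stakes test | student failed the screener),
    i.e., the positive predictive value, computed via Bayes' rule."""
    # Overall probability of failing the screener:
    p_pos = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
    return sensitivity * base_rate / p_pos

# E.g., 20% base rate of failure, screener sensitivity .90, specificity .80
# (illustrative values only):
ptp = post_test_probability(0.20, 0.90, 0.80)  # about 0.53
```

Even a fairly accurate screener can yield a post-test probability near a coin flip at low base rates, which is why hit rates deserve explicit attention.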
To Avoid Pitfalls
• Specify measures, decision rules, and
intervention management procedures
• Obtain the best data
• Obtain only the data necessary to make
accurate decisions at each stage
• Plan system interventions where system
problems are detected
• Actively manage intervention
implementation
Ask
• What are our system goals?
• What data are we collecting to reflect
progress?
• How are we responding to lack of progress
(how often, what resources)?
• How do data inform professional
development decisions,
text/material/resource adoptions,
allocation of instructional time?
• How do data tie into personnel evaluation?
Ask
• Are we changing the odds of success in
our schools?
• What are our special targets and priorities (e.g., numeracy, high-mobility students)?
• Are we operating as efficiently as
possible?
• Are teachers adequately supported (i.e.,
someone responds to data and goes in to
coach and support)?
• Do our instructional leaders follow data?
Avoid Common Mistakes
• Exploit existing data to know if efforts are
working
– % at risk fall, winter, spring by grade
– % of class-wide problems fall, winter, spring by
grade
– % of free/reduced-lunch students at risk should mirror % of free/reduced-lunch students overall; the same holds for ethnicity and special education
– Reduced risk across grades
– Decreased evaluations, proportionate and accurate
• Specify what you are going to do about it
• Implement solution well
• Follow-up and respond to implementation
failures
Bad Decisions are Not Benign
(Graphic: instructional time is finite; decisions about literacy, mathematics, language and writing, and social skills trade off against art, music, play, rest, field trips, special projects, parent/school bonding, and community support.)