Reading Assessment

Utah Coaching Network

"The ultimate goal of assessment is to identify problems with instruction and
to lead to instructional modifications. A good share of present-day assessment
activities consist of little more than meddling… We must use assessment data
to improve instruction… The only way to determine the effectiveness of
instruction is to collect data."

Ysseldyke and Algozzine (1995)
Curriculum-Based Measurement:
Introduction




Increase knowledge and skills in curriculum-based measurement (CBM)
Learn levels of assessment & associated
purposes
Learn how to administer CBM in reading &
math
Apply knowledge of CBM to diagnostic
assessment practices
◦ Learn can’t do/won’t do assessment
◦ Learn survey level assessment






Learn goal setting formula
Explore intervention options
Apply graphing techniques
Analyze data
Consider coaching practices
Enjoy the journey!




Formative assessment
Measure of student performance over time
An analysis of a specific skill for an individual
student
Tool for:
◦ Identifying struggling students
◦ Setting goals
◦ Aligning instruction with desired outcomes
◦ Providing diagnostic information
◦ Progress monitoring
◦ IEP development

Goal is two-fold:
1. Monitor student progress
2. Inform instruction / teacher practice






Measure of class-wide performance
An alternative to other assessment
procedures – often replaces costly, time-consuming,
disruptive practices
Quick & easy
Established reliability & validity
Direct low-inference measures
Can be easily summarized & presented
◦ Parents, students, colleagues
Benchmarking
Diagnostic
◦ Can’t do/won’t do
◦ Survey Level Assessment
◦ Error analysis
◦ Intervention development
Progress Monitoring
Instructional/criterion-referenced

When? How often? Who? How? What is the purpose?
(Three-tier pyramid graphic: Tier 1, Tier 2, Tier 3)
Adapted from Burns & Riley-Tillman (2010)



Tier III – Identify discrepancy for individual.
Identify causal variable. Implement individual
intervention.
Tier II – Identify discrepancy for individual.
Identify category of problem. Assign small
group solution.
Tier I – Identify discrepancy between
expectation and performance for class or
individual.
Adapted from Burns & Riley-Tillman (2010)
Step 1 – Universal Screening: all students at a grade level (Fall, Winter, Spring)
Step 2 – Additional Diagnostic Assessment
Step 3 – Instruction/Intervention
Step 4 – Results/Monitoring

Universal (80%): none → Continue with Core Instruction → Benchmarks, Grades,
Classroom Assessments, Utah CRT
Targeted (15%): Group Diagnostic → Small Group Differentiated by Skill →
monitor 2x month
Intensive (5%): Individual Diagnostic → Individual Instruction → monitor weekly
1. Focus on the curriculum – but also designed to
function in a problem-solving paradigm
 What does this look like?
2. Using alterable variables – changed through
instruction
 What does this look like?
3. Employing low-inference measures
 What does this look like?
4. Employing criterion-referenced measures – the
“what” & “how” to teach
 What does this look like?




One-minute probe (e.g., DIBELS, 6-min. Solution)
Administered individually
Provide intervention and progress monitor at
instructional level
Different measures
◦ Oral Reading Fluency (ORF)
◦ Maze (Comprehension)
◦ Early-reading (Initial Sound, Phoneme Segmentation,
Nonsense Word, Letter Naming Fluency)
(See Chapters 3 & 4 in ABCs of CBM)









Select appropriate material for probe
Place probe in front of and facing the student
Keep copy for the examiner (on clipboard)
Provide directions
Start timer
Have student perform task for allotted time
(1 minute for reading tasks)
Score probe
Display data on graph/chart
Video clips . . . examples






Triads work together
Administer reading fluency probe
Score probe – count number correct and
number of errors
Record the score
Switch roles & repeat
Questions & answers – feedback

Oral Reading Fluency:
◦ Mark as correct:
 # of words read correctly in one minute
◦ Mark as incorrect:
 Misread words
 Omissions
 Hesitations – words supplied by the assessor after 3
seconds
 Reversals – two or more words not read in order
(see page 146 in ABCs of CBM)
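The underlying arithmetic is simple; a minimal sketch with hypothetical counts (words correct per minute = words attempted minus errors — not the official DIBELS/AIMSweb scoring tool):

```python
# Minimal ORF scoring sketch (hypothetical example, not an official scoring tool).
# Errors = misreads, omissions, hesitations over 3 seconds, reversals.

def score_orf(words_attempted: int, errors: int) -> dict:
    """Return words correct per minute (WCPM) and accuracy for a 1-minute probe."""
    wcpm = words_attempted - errors
    accuracy = wcpm / words_attempted if words_attempted else 0.0
    return {"wcpm": wcpm, "errors": errors, "accuracy": round(accuracy, 2)}

# Example: a student attempts 92 words and makes 5 errors.
print(score_orf(92, 5))   # {'wcpm': 87, 'errors': 5, 'accuracy': 0.95}
```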



Benchmarks - Table 3.4 (p. 48)
Norms – Table 3.5 (p. 49)
Growth Rates – Table 3.2 (p. 47)
◦ greater progress is possible

If a student doesn’t make adequate progress, it
doesn’t mean she lacks the ability to learn
reading – it means instruction needs to be
changed!

Math CBM can be broken into three areas:
◦ Early numeracy
◦ Computation
◦ Concepts and Applications

Focus will be on administering, scoring, and
using CBM for Computation




Math CBM is conducted by having students answer
computational problems for two minutes
Count correct digits – NOT correct problems
Use standardized procedures
Have necessary materials:
◦ Different but equivalent math sheets
◦ Directions for administration
◦ Writing utensils
◦ Stopwatch
◦ Quiet environment
◦ Scoring rules & procedures







Select appropriate material for probe
Place probe in front of and facing the student
Can be administered individually or as an
entire class
Provide directions
Start timer
Have student perform task for allotted time
(2 minutes for math tasks)
Score probe

First-time administration
◦ Have three equivalent math sheets
◦ Recommend doing it in one session
◦ Median score of the three samples used for baseline
(first data point on graph)
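A quick sketch of that baseline step, with hypothetical scores:

```python
# Baseline sketch: the median of three equivalent probes becomes the first
# data point on the progress-monitoring graph. Scores here are hypothetical.
from statistics import median

first_administration = [24, 31, 27]        # correct digits on three equivalent sheets
baseline = median(first_administration)    # 27 -> plotted as the first data point
print(baseline)
```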

Every time
◦ Standardized procedures (see pg 156)
◦ Count correct digits
◦ Record on graph
◦ Use for decision making





Triads work together
Review math fluency probes
Score probe – count number correct and
number of errors
Record the score
Questions & answers – feedback

Correct answers . . .
◦ Correct answers get credit for longest method used
◦ Correct digits for unfinished problems
◦ Correct digits for reversed or rotated digits
◦ Correct digits for placeholder – any symbol!

Errors
◦ Mark with slash (/)
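A simplified sketch of basic correct-digit counting for a final answer; the special-credit rules above (longest method, placeholders, reversed digits) are not modeled here, and the examples are hypothetical:

```python
# Simplified correct-digit sketch: counts digits in the student's answer that
# match the answer key in the same place value (aligned from the right).
# Real CBM scoring also gives credit for work shown, placeholders, reversed or
# rotated digits, etc.; this only covers the basic final-answer case.

def correct_digits(student_answer: str, key_answer: str) -> int:
    student = student_answer.strip()[::-1]   # reverse so index 0 = ones place
    key = key_answer.strip()[::-1]
    return sum(1 for s, k in zip(student, key) if s == k)

# Example: key is 1008, student writes 108 -> ones and tens digits line up.
print(correct_digits("108", "1008"))   # 2
print(correct_digits("1008", "1008"))  # 4
```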




More sensitive measure of changes in
student learning
Considered a fairer metric – students are awarded
more points for correctly solving more
complex problems
A greater number of points is available for the
greater time commitment on more complex
problems
Let’s consider the options – see Figure 7.4
(page 105)



Benchmarks - Table 7.1 (p. 109)
Norms – Table 7.3 (p. 111)
Growth Rates – Table 7.1 (p. 109)
◦ greater progress is possible

If a student doesn’t make adequate progress, it
doesn’t mean she lacks the ability to learn
math – it means instruction needs to be
changed!




Math often has a specific scope & sequence
that can vary from state to state, school to
school, or curriculum to curriculum
Educators often create math sheets linked to
the state’s core – progress monitoring
aligned with outcome measure!
Math sheets should have different problems
but equivalent difficulty
Can be purchased or created
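As one illustration of “different problems but equivalent difficulty,” a sketch that builds single-skill addition sheets from the same operand range so the items differ while difficulty stays roughly constant; the ranges and sheet size are arbitrary choices, not a published specification:

```python
# Sketch of generating equivalent single-skill addition probes: each sheet
# draws different problems from the same operand range, so difficulty stays
# roughly constant while the items change. Ranges and counts are arbitrary.
import random

def make_addition_sheet(n_problems=25, lo=10, hi=99, seed=None):
    rng = random.Random(seed)
    return [(rng.randint(lo, hi), rng.randint(lo, hi)) for _ in range(n_problems)]

sheet_a = make_addition_sheet(seed=1)   # different items,
sheet_b = make_addition_sheet(seed=2)   # comparable difficulty
print(sheet_a[:3], sheet_b[:3])
```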




Items on sheet not in order presented in
curriculum
Similar item types are grouped diagonally –
assists when looking for patterns in student
responses (see page 101)
Consider mixed vs. single skill
Single-skill sheet helpful for short-term
planning – used to gain some diagnostic
information & assist in decision making



General Outcome Measures (GOMs) – used to
sample performance across several goals by
using “capstone” tasks that are complex (p.11)
Skills-based Measures (SBMs) – used to screen,
progress monitor, & do survey-level
assessment where “capstones” not available
(p. 12)
Mastery Measures (MM) – used on parts of
curriculum that contain discrete and easily
identified sets of items (p.13)

Purpose
◦ Determine motivation vs. skill deficit

Technique
◦ Administer same probe – add incentive
◦ Timing – soon after benchmark/screener

 Decision Rules
  >= 15% increase = motivation (“won’t do”) (Witt & Beck, 1999)
  < 15% increase = skill deficit (“can’t do”)
  Consider both
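A sketch of the ~15% decision rule with hypothetical benchmark and incentive scores:

```python
# Can't do / won't do sketch: compare the incentive probe to the original
# benchmark score and apply the ~15% rule (Witt & Beck, 1999).
# The scores used below are hypothetical.

def cant_do_wont_do(benchmark_score: float, incentive_score: float) -> str:
    if benchmark_score == 0:
        return "skill deficit (can't do)"
    gain = (incentive_score - benchmark_score) / benchmark_score
    if gain >= 0.15:
        return "motivation deficit (won't do)"
    return "skill deficit (can't do)"

print(cant_do_wont_do(40, 48))  # 20% gain -> won't do
print(cant_do_wont_do(40, 42))  # 5% gain  -> can't do
```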

Triad practice
◦ Score (p. 48 Benchmarks)
◦ Can’t/Won’t?
◦ Decision?

Trial 1 (reading):
◦ Annie:
 4th grade 65 cwpm (Fall)

Trial 2 (math):
◦ Ethan:
 2nd grade 11 cd (Winter)

Trial 3 (math):
◦ Trevor:
 2nd grade 4 cd (Fall)
Purposes
◦ To determine the appropriate instructional
placement level for the student.
 The highest level of material at which the student can
be expected to benefit from instruction.
◦ To provide baseline data, or a starting point, for
progress monitoring
 In order to monitor progress toward a future goal,
you need to know how the student is currently
performing.
1. Start with grade level passages/worksheets (probes)
2. Administer 3 separate probes (at same level of
difficulty) using standard CBM procedures.
3. Calculate the median score (i.e. the middle).
4. Is the student’s score within instructional range?
◦ Yes - This is the student’s instructional level.
◦ No - If above level (too easy), administer 3
probes at next level of difficulty.
◦ No - If below level (too hard), administer 3
probes at previous level of difficulty.
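A sketch of that survey-level search, assuming you already have the three probe scores at each level and a table of instructional ranges; the ranges, grades, and scores below are made up for illustration, not the published tables, and the same logic applies to the math version later in the deck:

```python
# Survey Level Assessment sketch: step down (or up) through grade-level probe
# sets until the median of three probes falls in the instructional range.
# The ranges and scores below are hypothetical, not the published tables.
from statistics import median

INSTRUCTIONAL_RANGE = {2: (40, 60), 3: (70, 100), 4: (70, 100)}  # wcpm, made up

def survey_level(probe_scores_by_grade, start_grade):
    grade = start_grade
    while grade in probe_scores_by_grade:
        med = median(probe_scores_by_grade[grade])
        lo, hi = INSTRUCTIONAL_RANGE[grade]
        if med < lo:
            grade -= 1          # too hard: drop a level and re-probe
        elif med > hi:
            grade += 1          # too easy: go up a level
        else:
            return grade, med   # instructional level found
    return None, None

scores = {4: [52, 60, 55], 3: [78, 74, 81]}   # three probes per level (hypothetical)
print(survey_level(scores, start_grade=4))    # (3, 78)
```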

Refer to Case Studies Provided . . .
◦ Completed Forms
 B. Blue
 Jack Horner




Sample One – Junie B. – Whole Class
Sample Two – Tom - Partner
Sample Three – Arthur - Independent
Consider instructional levels for sample
cases





Kindergarten – Letter sound fluency (LSF)
Grade 1 – Oral reading fluency (ORF) and/or
word identification fluency (WIF)
Grade 2 – Oral reading fluency (ORF)
Grade 3 – Oral reading fluency (ORF)
Grade 4+ - Oral reading fluency (ORF) &
mazes
See Chapters 3 & 4

Norms
◦ Compare student’s score to the performance of
others in her grade or at her instructional level
◦ Data collected on thousands of students – the
numbers are very similar across norming samples

Growth Rates
◦ Provide an indication of the average number of
words per week we would expect students to
improve
◦ Not necessarily new words - students reading same
words at a faster rate each week
1. Start with grade level math worksheets (probes)
2. Administer 3 separate probes (at same level of
difficulty) using standard CBM procedures.
3. Calculate the median score (i.e. the middle).
4. Is the student’s score within instructional range?
◦ Yes - This is the student’s instructional level.
◦ No - If above level (too easy), administer 3
probes at next level of difficulty.
◦ No - If below level (too hard), administer 3
probes at previous level of difficulty.
Refer to Case Studies Provided . . .
 Sample One - Junie B. – Whole Class
 Sample Two – Tom – Partner
 Sample Three – Arthur – Independent


Consider instructional levels for
sample cases






Grade 1 – Addition, subtraction
Grade 2 – Addition, subtraction
Grade 3 – Addition, subtraction,
multiplication, division
Grade 4+ - Multiplication, division
All Grades – Mixed-math skills
Secondary – consider common assessments
based on state or district core

It is our obligation to fix the problem!
◦ Build up prerequisite skills
◦ Increase length of daily lesson
◦ Alter the way we respond when an error is made

We do NOT lower expectations!
“Learning is a result of instruction, so when the
rate of learning is inadequate, it doesn’t always
mean there is something wrong with the student.
It does mean the instruction needs to be changed
to better meet the student’s needs.” (p. 47)

Three considerations:
◦ 1. Purpose – screening vs. progress monitoring
◦ 2. Importance of task – learning to read
vs. learning Roman numerals
◦ 3. Significance of problem – as the student’s
difficulty increases, the need for effective
instruction and more frequent monitoring increases
 1.
End of Year Benchmarks
 2. Norms - Levels of performance
 3. Rate of progress – goal setting
◦ (# of weeks x growth rate) + median
baseline = goal
 Students with greatest deficits need
steepest slopes – more intense &
effective interventions
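A worked sketch of the formula; the baseline, growth rate, and 36-week year below are illustrative assumptions, not values taken from the growth-rate tables:

```python
# Goal-setting sketch: goal = median baseline + (number of weeks x weekly growth rate).
# Baseline and growth rate are illustrative assumptions; in practice the growth
# rate comes from the growth-rate tables (or an ambitious rate above them).

def cbm_goal(median_baseline, weeks, weekly_growth_rate):
    return median_baseline + weeks * weekly_growth_rate

baseline = 65   # hypothetical median baseline, words correct per minute
growth = 1.0    # assumed growth rate, words per week

print(cbm_goal(baseline, 10, growth))   # 75.0  -> 10-week goal
print(cbm_goal(baseline, 36, growth))   # 101.0 -> annual goal (36 instructional weeks assumed)
```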

Case Study # 1
◦ Jack – 4th grader – reading data
◦ 3rd grade level 78/2, 4th grade level 71/3
◦ Compute the 10-week goal and the annual goal

Case Study # 2
◦ Suzie – 5th grader – math data
◦ Multiplication 31 CD, division 22 CD
◦ Compute the 10-week goal and the annual goal

One of the main benefits of CBM is that the
data are easily displayed in graphs &
charts
◦ Standard graph for CBM: a line graph
◦ Vertical axis = skill being measured
◦ Horizontal axis = time or sessions
Time axis + skill axis = a record of changes in
student learning over time
Data on student level of performance
 Data on student rate of progress
◦ Performance – how well the student can do the task
◦ Progress – how quickly the student is learning how to
perform the task
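A small sketch of how the two summaries could be computed from weekly scores; the median-of-recent-points level and least-squares slope are common choices, not prescribed by the deck, and the scores are hypothetical:

```python
# Level vs. progress sketch: level = median of the most recent scores,
# progress = least-squares slope (gain per week). Scores are hypothetical.
from statistics import median

def level_and_slope(weekly_scores):
    n = len(weekly_scores)
    weeks = range(n)
    mean_x = sum(weeks) / n
    mean_y = sum(weekly_scores) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, weekly_scores)) \
            / sum((x - mean_x) ** 2 for x in weeks)
    return median(weekly_scores[-3:]), slope

level, slope = level_and_slope([42, 45, 44, 48, 51, 53])
print(level, round(slope, 2))   # 51 (level), 2.2 words gained per week (progress)
```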











Components of a CBM graph:
◦ Baseline
◦ Y-axis label
◦ X-axis label
◦ Aim line (goal)
◦ Data points
◦ Intervention line
◦ Intervention line label
◦ Change line
◦ Change line label
◦ Level of material being used
◦ Student demographics
(Sample graph shows two intervention phases: Peer Reading, Sight word practice)
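One way those components could be put together, as a minimal matplotlib sketch; matplotlib, the scores, the goal, and the week-5 change line are all assumptions for illustration, and the deck’s by-hand or Excel options work just as well:

```python
# Sketch of a standard CBM line graph: data points, an aim line from the
# baseline to the goal, and a vertical line marking an intervention change.
# All data are made up.
import matplotlib.pyplot as plt

weeks = list(range(0, 11))
scores = [40, 41, 43, 42, 45, 44, 48, 50, 49, 53, 55]   # hypothetical wcpm
baseline, goal_week, goal = scores[0], 10, 58

plt.plot(weeks, scores, marker="o", label="Student data")
plt.plot([0, goal_week], [baseline, goal], "--", label="Aim line (goal)")
plt.axvline(x=5, color="gray", linestyle=":", label="Intervention change")
plt.xlabel("Week")                       # time axis
plt.ylabel("Words correct per minute")   # skill axis
plt.title("CBM Progress Monitoring – Grade 4 passages")
plt.legend()
plt.show()
```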

Data Point Analysis – “If-Then Rules”
◦ 3-4 successive data points above the aim
line – move on (add “weight”)
◦ 3-4 successive data points below the aim
line – change intervention to boost
learning
◦ 3-4 successive data points lie around the
aim line – make no changes
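A sketch applying the if-then rules above to the most recent consecutive points relative to the aim line; scores and aim-line values are hypothetical:

```python
# Data-point decision sketch: look at the last few points relative to the aim
# line and apply the 3-4 point if-then rules. Values are hypothetical.

def data_point_decision(recent_scores, recent_aimline, run=4):
    pairs = list(zip(recent_scores, recent_aimline))[-run:]
    if all(score > aim for score, aim in pairs):
        return "move on (add weight to the goal)"
    if all(score < aim for score, aim in pairs):
        return "change the intervention to boost learning"
    return "make no changes"

print(data_point_decision([48, 50, 52, 55], [47, 48, 49, 50]))  # all above the aim line
print(data_point_decision([41, 43, 44, 46], [47, 48, 49, 50]))  # all below the aim line
```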




By hand
5-Clicks in Excel
www.InterventionCentral.com
www.updc.org
◦ CBM Focus
◦ PM Focus

Graph individual data
◦ Include all components of a graph so it “stands
alone”
◦ Chart your data and progress monitoring data
 Remember labels
Adapted from Burns & Riley-Tillman (2010)
A Few “Go To” Interventions . . .
Inaccurate and Fluid:
Drill and overcorrection (pg 101-105)
Nuclear reading w/overcorrection
Accurate and Slow:
Peer tutoring (pg 145-149)
Repeated reading
Nuclear intervention
Inaccurate and Slow:
Response cards for increasing letter/letter sound
identification (pg 123-129)
Nuclear reading w/ overcorrection
DI on sight words, letter sounds, & blending
Won’t Do:
Mystery Motivator (pg 57-63)
Offer treasure chest each day the child beats his or her score


Always focus on student outcomes!
What will be the objective of the coaching
interaction?

What tools will you use?

How will you measure mastery?

What’s the smallest change you can make that
results in the biggest outcome???




CBM data are an excellent source for writing
goals & objectives
CBM data provide information on specific
student skills (i.e. ORF, math facts)
CBM data are sensitive to student
improvement
CBM data allow teachers to make instructional
decisions




RTI is tiered approach to instruction
CBM is core component of RTI
CBM used throughout all tiers – only
change is frequency of assessment
Decisions within RTI Approach using CBM
 Effectiveness of instruction
 Eligibility for remedial programs (such as
special education)
ACADEMIC SYSTEMS
Tier 3: Comprehensive & Intensive – students who need
individualized interventions.
Tier 2: Strategic Interventions – students who need more
support in addition to the core curriculum.
Tier 1: Core Curriculum – all students, including students
who require curricular enhancements for acceleration.

BEHAVIOR SYSTEMS
Tier 3: Intensive Interventions – students who need
individualized intervention.
Tier 2: Targeted Group Interventions – students who need
more support in addition to the school-wide positive
behavior program.
Tier 1: Universal Interventions – all students in all settings.
COACHING
10 Steps for using CBM
 1. Who will be using CBM?
 2. Which skills of CBM will be implemented?
 3. What materials will be used?
 4. When will implementation start?
 5. Who will train the staff?
 6. Who will manage the materials?
 7. Who will collect the data?
 8. Where will the data be collected?
 9. Who will manage the data once collected?
 10. How will data be shared?
(Checklist for Using CBM – pp. 160-161)




Increase knowledge and skills in curriculum-based measurement (CBM)
Learn levels of assessment & associated
purposes
Learn how to administer CBM in reading &
math
Apply knowledge of CBM to diagnostic
assessment practices
◦ Learn can’t do/won’t do assessment
◦ Learn survey level assessment






Learn goal setting formula
Explore intervention options
Apply graphing techniques
Analyze data
Consider coaching practices
Enjoy the journey!













updc.org
The ABCs of CBM
One-Minute Academic Functional Assessment and Interventions:
“Can’t” Do It…or “Won’t” Do It?
Functional Assessments: A Step-by-Step Guide to Solving
Academic and Behavior Problems
Implementing Response-to-Intervention in Elementary and
Secondary Schools (Burns & Gibbons, 2008)
I’ve DIBEL’d, Now What?
AIMSweb.com
Interventioncentral.org
Pre-Referral Intervention Manual
School Problem Solving Teams
What Works Clearinghouse: http://ies.ed.gov/ncee/wwc/
www.Studentprogress.org
www.CBMnow.org