PROGRESS MONITORING
with the WRAT4-PMV
Gale H. Roid, PhD, and Mark F. Ledbetter, PsyD
Outline of Workshop
• Why progress monitoring?
• Review of newest IDEA and RTI criteria
• CBM/DIBELS versus improved models
• WRAT4-PMV: Design, administration,
scoring, research, uses
• Case studies
• Recommended applications
Why Progress
Monitoring?
• Early failure in reading ripples through upper
grades and other curriculum areas
• New Individuals with Disabilities Education
Act (IDEA) and No Child Left Behind Act
(NCLB) guidelines suggest progress
monitoring within the response to intervention
(RTI) model
• National Assessment of Educational Progress
(NAEP) shows 37% of fourth graders are
below basic level in reading skills
Benefits of
Intervention with
Progress Monitoring
• Two types of problem readers:¹
1. Good oral language; poor phonic skills
2. Lower socioeconomic status (SES) with broad weaknesses
• Two third graders from the northwest
given intensive tutoring with frequent
brief tests
1. Daron—Primary to Grade 3 oral reading in 14 months
2. Mia—Grade 1 to Grade 3 in 13 months
¹ Torgesen, J. K. (2004, Fall). Preventing early reading failure—and its devastating downward spiral. American Educator, 28.
Progress Monitoring
in NCLB, RTI, and
IDEA
• Adequate yearly progress (AYP) in special
education
• Monitoring changes in classroom
instruction (Tier 2 of RTI)
• Intensive assessment in Tier 3 for
possible special education
History of the
RTI Model
According to Heller, Holtzman, and Messick² (1982), there are three criteria for judging the validity of special education placements:³
1. General education classroom OK?
2. Special education more effective?
3. Is assessment method accurate?
² Heller, K. A., Holtzman, W. H., & Messick, S. (Eds.) (1982). Placing children in special education: A strategy for equity. Washington, DC: National Academy Press.
³ Fuchs, L. S., & Vaughn, S. R. (2006, March). Response to intervention as a framework for the identification of learning disabilities. NASP Communiqué, 34, 1-6.
History of the
RTI Model (cont.)
Three-phase adaptation of Heller et al.’s plan:⁴
1. Student’s rate of growth in general education
2. Low-performing student’s response to better
instruction
3. Intensive assessment and further response to
evidence-based instruction
⁴ Fuchs, L. S., & Fuchs, D. (1998). Treatment validity: A unifying concept for reconceptualizing the identification of learning disabilities. Learning Disabilities Research and Practice, 13, 204-219.
History of the
RTI Model (cont.)
Three-tiered prevention model⁵,⁶,⁷
1. Tier 1: Screening in general education
2. Tier 2: Fixed duration remediation with
progress monitoring
3. Tier 3: Assessment for special education
using progress monitoring
⁵ Individuals with Disabilities Education Improvement Act of 2004 (IDEA) (2004). Public Law No. 108-446, §632, 118 Stat. 2744.
⁶ Vaughn, S., Linan-Thompson, S., & Hickman, P. (2003). Response to instruction as a means of identifying students with reading/learning disabilities. Exceptional Children, 69, 391-409.
⁷ Gresham, F. M. (2002). Responsiveness to intervention: An alternative approach to the identification of learning disabilities. In R. Bradley, L. Danielson, & D. P. Hallahan (Eds.), Identification of learning disabilities: Research to practice (pp. 467-519). Mahwah, NJ: Erlbaum.
CBM and DIBELS
• 1975: Stanley Deno (University of Minnesota) develops
easy-to-use basic skills assessments for teachers
• 1976 to 2005: Deno’s grad students Lynn Fuchs
(Vanderbilt), Gerald Tindal (University of Oregon),
Mark Shinn, and others continue development of
curriculum-based measurement (CBM); major federal
grant support
• 1998: Roland Good’s Dynamic Indicators of Basic Early
Literacy Skills (DIBELS)
• 2004: IDEA reauthorization recommends CBM
(see http://IDEA.ed.gov)
Attributes of
the “Best CBM”⁴
• Easy-to-use individual or small group
tests that teachers understand
• Measures improvement over time
• Brief tests given frequently
• Assesses program effectiveness
• No progress → changes in instruction
Attributes of the
“Best CBM” (cont.)
• Word reading performance is highly related to other CBM measures (e.g., fluency, comprehension), especially in Grades 1-3⁸
• Feedback to teachers and students is not enough. Guidance and follow-up on methods of reading instruction are necessary.⁹
⁸ Hosp, M. K., & Fuchs, L. S. (2005). Using CBM as an indicator of decoding, word reading, and comprehension: Do the relations change with grade? School Psychology Review, 34, 9-26.
⁹ Graney, S. B., & Shinn, M. R. (2005). Effects of reading curriculum-based measurement (R-CBM) teacher feedback in general education classrooms. School Psychology Review, 34, 184-201.
Limitations of Some
CBM Applications
• Criterion-referenced CBM may not have grade-based expectations (norms)
• CBM test forms not always “equivalent” statistically (variation in difficulty)
• Scores not always good for program effectiveness or across-grade comparisons
• Available CBM tests not in upper grades
WRAT4-PMV
Features and Benefits
• Simple and easy to use
• Long tradition in special education
• Four subtests: Word Reading, Sentence
Comprehension, Spelling, and Math
Computation
• Allows dual comparisons
1. Rate of growth of the student
2. National norms for grade-level expectations
WRAT4-PMV
Features and Benefits
(cont.)
• Four equivalent test forms containing
15 items at each level (six levels)
• Covers Grades K-12 and college
• Across-grade Level Equivalent (LE)
scores are available
• Computer scoring program is
available
Design of
WRAT4-PMV
• Four forms for each level
• Four subtests: Word Reading, Sentence Comprehension, Spelling, and Math Computation
• Six levels
- Level 1: Grades K-1
- Level 2: Grades 2-3
- Level 3: Grades 4-5
- Level 4: Grades 6-8
- Level 5: Grades 9-12
- Level 6: Grades 13-16 (i.e., college)
Test Administration:
Word Reading
• Start at the grade level, then adjust
(out-of-level testing is OK)
• Present card with letters and words
• Say, “Look… read across.”
• If not clear, say “Please say the word
again.”
Sample Test Form:
Word Reading Level 3
(Grades 4-5)
Test Administration:
Sentence
Comprehension
• “Find the missing word.”
• Present the sample card and see if
the student finds the missing word
• Read the other sample sentences
• Student silently reads the
remaining sentences in the subtest
Test Administration:
Sentence
Comprehension (cont.)
Mark and score responses
Test Administration:
Spelling
• Spell the word “in context”
• Write (or print) letters or words
• You read the word by itself, then
read the word in a sentence
• Student uses Response Booklet to
write responses
Sample Response
Booklet: Spelling Level 2
(Grades 2-3)
Test Administration:
Math Computation
• Oral math for Grades K-5 (Levels 1-3):
“Show me 3 fingers.”
• Math calculation problems
Level 1: 7 or 8 items
Level 2: 10 or 11 items
Level 3: 13 items
Levels 4-6: 15 items
• Student uses Response Booklet
• No calculators
Sample Oral Math
Card: Levels 1-3
(Grades K-5)
Sample Examiner
Instructions: Math
Computation Card,
Level 2 (Grades 2-3)
Scoring: Plot Raw
Scores on the Profile
to Monitor Progress
Score Difference
Tables
Technical Aspects:
Reliability
• High level of reliability in Grades K-12
• Test-retest 30-day practice effect = less than .5 point

Subtest                   Median alpha
Word Reading              .81
Sentence Comprehension    .83
Spelling                  .79
Math Computation          .74
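The median alphas above are internal-consistency estimates. As a point of reference, coefficient alpha can be computed from an examinees-by-items score matrix; a minimal Python sketch with made-up 0/1 item scores, not WRAT4-PMV data:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Coefficient alpha for an (examinees x items) score matrix."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical right/wrong (1/0) scores: 6 examinees x 4 items
scores = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 0, 1],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Higher alpha means the items hang together as a scale; toy data like this will give a much lower value than the operational subtests.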
Technical
Aspects: Test
Form Equivalence
• Nearly perfect equivalence among the four test forms at all levels

Criterion                                      Result
Item percent correct equal                     Within .02
Average means equal across forms
  (Gulliksen¹⁰ method with Wilks’ Lambda¹¹)    Yes
Equal standard deviations                      Yes
Equality of intercorrelations                  Yes

¹⁰ Gulliksen, H. (1950). Theory of mental tests. New York: Wiley.
¹¹ Wilks, S. S. (1932). Certain generalizations in the analysis of variance. Biometrika, 24, 471-494.
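The published analysis uses the Gulliksen method with Wilks’ Lambda. As a simpler illustration of the same question (do the four forms produce equal means?), here is a one-way ANOVA across simulated form scores; a sketch only, not the published equivalence procedure or data:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
# Simulated total scores for comparable groups taking four parallel forms
# (illustrative data only, not WRAT4-PMV results)
forms = [rng.normal(loc=10.0, scale=2.5, size=60) for _ in range(4)]

f_stat, p_value = f_oneway(*forms)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # a large p is consistent with equal form means
print("SDs:", [f"{f.std(ddof=1):.2f}" for f in forms])
```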
Technical
Aspects: Validity

Other published test w/ WRAT4-PMV subtest    Correlation
WIAT-II Word Reading w/ WR                   .69
WIAT-II Word Reading w/ SP                   .54
WIAT-II Numerical Operations w/ MC           .48
KTEA-II Reading w/ WR                        .68
KTEA-II Writing w/ SP                        .65
KTEA-II Math w/ MC                           .48
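Each entry above is a correlation between a WRAT4-PMV subtest and a score from another published test. A minimal sketch of how such a coefficient is computed (Pearson r), with hypothetical paired scores rather than study data:

```python
import numpy as np

# Hypothetical paired scores: WRAT4-PMV Word Reading vs. an external reading measure
pmv_wr = np.array([ 8, 10, 12,  9, 14, 11,  7, 13])
other  = np.array([45, 52, 60, 50, 66, 55, 41, 63])

r = np.corrcoef(pmv_wr, other)[0, 1]  # Pearson correlation coefficient
print(f"r = {r:.2f}")
```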
Technical
Aspects: Word
Reading and LD
• Study of 30 students with reading learning disability (LD)
• SD difference in scores of LD versus controls = .5-1.00 (usually 2 raw score points)

Level    Effect size
1        .54 to .98 SD units
2        .47 to .77
3        .53 to .85
4        .42 to .82
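The effect sizes above are standardized mean differences in SD units. A minimal sketch of the computation with hypothetical LD and control raw scores; the pooled-SD formula below is Cohen’s d, which may differ in detail from the study’s exact method:

```python
import numpy as np

def cohens_d(group1, group2):
    """Standardized mean difference using the pooled standard deviation."""
    g1, g2 = np.asarray(group1, float), np.asarray(group2, float)
    n1, n2 = len(g1), len(g2)
    pooled_var = ((n1 - 1) * g1.var(ddof=1) +
                  (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2)
    return (g2.mean() - g1.mean()) / np.sqrt(pooled_var)

# Hypothetical Word Reading raw scores (not the study's data)
ld_scores      = [7, 8, 6, 9, 7, 8]
control_scores = [8, 9, 7, 10, 8, 9]
print(f"d = {cohens_d(ld_scores, control_scores):.2f}")  # about one SD unit here
```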
Developmental
Trends in Level
Equivalent Scores
[Figure: Word Reading Level Equivalent scores (y-axis, roughly 300-600) plotted against age in months (AGEMO, 0-300), showing observed scores with a quadratic trend line.]
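The quadratic trend line in a plot like this can be reproduced with an ordinary polynomial fit. A sketch using simulated ages and Level Equivalent scores, not the standardization data:

```python
import numpy as np

rng = np.random.default_rng(1)
age_months = np.linspace(60, 260, 40)   # roughly ages 5 to 21, in months
# Simulated LE scores: fast early growth that flattens with age
le_scores = (300 + 2.2 * age_months - 0.004 * age_months**2
             + rng.normal(0, 8, size=40))

coeffs = np.polyfit(age_months, le_scores, deg=2)   # quadratic fit
fitted = np.polyval(coeffs, age_months)             # trend-line values
print("quadratic coefficients:", np.round(coeffs, 4))
```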
Case Example #1:
Ananta, Grade 2—
Catching Up
Dual Criteria for LDs
Look for two trends:⁴
1. Shows no improvement—a “flat
profile” based on “slope” of the
graph line
2. Performs below grade level despite
classroom interventions—the graph
line stays below the grade norms
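Both trends can be checked directly from a series of repeated scores: fit a line to get the slope, and compare the level to the grade norm. A minimal sketch with hypothetical weekly scores and a hypothetical grade-norm value (the 0.2-point slope threshold is illustrative, not a WRAT4-PMV rule):

```python
import numpy as np

# Hypothetical weekly Word Reading raw scores for one student
weeks  = np.arange(1, 9)
scores = np.array([5, 5, 6, 5, 6, 5, 6, 6])
grade_norm = 10          # hypothetical grade-level expectation

slope = np.polyfit(weeks, scores, deg=1)[0]   # growth per week ("slope" of the graph line)
flat_profile = slope < 0.2                    # trend 1: little or no improvement
below_norm = scores.mean() < grade_norm       # trend 2: performing below grade level

print(f"slope = {slope:.2f} raw-score points per week")
if flat_profile and below_norm:
    print("Dual discrepancy: both criteria met; consider changing the intervention")
```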
Case Example #2:
Grade 3—Flat Profile
Dual Discrepancy
Case Example #3:
Julio, Grade 4—Progress
Across Grades
Applications of
the WRAT4-PMV
• Monitoring students identified by NCLB
• Measuring RTI in Tier 2 (fixed duration remediation)
• Verification of qualification for special education (Tier 3)
• Long-term progress monitoring in special education (AYP)
Applications of the
WRAT4-PMV (cont.)
• See reference list handout for examples of empirically based instructional interventions
• Five methods of reading intervention:¹²
- Repeated reading: Read passage twice
- Listening passage preview: You read it, have student follow with finger
- Phrase drill: Read error words, student repeats three times
- Syllable segmentation: Read each syllable
- Reward contingency: If score is improved

¹² Daly, E. J., Persampieri, M., McCurdy, M., & Gortmaker, V. (2005). Generating reading interventions through experimental analysis of academic skills: Demonstration and empirical evaluation. School Psychology Review, 34, 395-414.
Sample
Report
From the
WRAT4-PMV
Scoring
Program
For More
Information…
See sample materials
after workshop.
Visit www.parinc.com
and click on Assessment
Consultants to contact
a sales representative or
to arrange a workshop
in your school district.