Reading CBM Practice Materials.............................................................................................. 1
Scoring Guidelines........................................................................................................... 1
PRF Teacher Copy........................................................................................................... 2
PRF Student Copy........................................................................................................... 3
CBM Graphs and Instructional Decision-Making..................................................................... 4
Graph 1............................................................................................................................. 5
Graph 2............................................................................................................................. 6
Graph 3............................................................................................................................. 7
RTI Case Studies........................................................................................................................ 8
Case Study 1...................................................................................................................... 9
Case Study 2...................................................................................................................... 10
Sample Syllabi........................................................................................................................... 11
Formal and Informal Clinical Procedures, SPED 645..................................................... 12
Assessment and Instruction of Reading Difficulties, SPED 6631................................... 19
Reading Instruction for Students with Mild/Moderate Disabilities, Special Education 5122........ 28
Literacy in Special Education, 4370................................................................................. 37
Assessment and Evaluation in Special Education, 4320.................................................. 46
Course Activity – Steps for Slope Using the Tukey Method.................................................. 58
Course Activity – Graphing Data Using Excel................................................................... 60
Course Activity – Instructional Plan.................................................................................. 61
Final Project Presentation Key Elements................................................................. 62
Final Project: Summarizing the Outcomes of Data-Based Interventions................ 63
Advanced Issues & Procedures in the Assessment of Students with Mild/Moderate Disabilities, Special Education 3820................................................................................ 65
SPM, CBM, and RTI Resources................................................................................................ 72
Resources from the National Center on Student Progress Monitoring...................................... 74
What Is Scientifically-Based Research on Progress Monitoring?.................................... 75
How Progress Monitoring Assists Decision Making in a Response-to-Instruction Framework........................................................................................................................ 85
Student Progress Monitoring: What This Means for Your Child.................................... 89
IDEA Regulations: Early Intervening Services......................................................................... 91
Questions and Answers on Response to Intervention (RTI) & Early Intervening Services (EIS)........................................................................................................................... 96
Score as CORRECT:
Repetitions
Self-corrections
Insertions
Dialectical differences
Score as INCORRECT:
Mispronunciations
Word substitutions
Omitted words
Hesitations (word not said within 3 seconds)
Reversals
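The correct/incorrect rules above can be sketched as a small scoring helper. The function name and miscue labels below are illustrative assumptions, not part of the CBM materials:

```python
# A minimal sketch of applying the scoring rules above to a one-minute
# timed read. Errors: mispronunciations, word substitutions, omitted
# words, hesitations (word not said within 3 seconds), and reversals.
# Repetitions, self-corrections, insertions, and dialectical
# differences are NOT counted as errors.

ERROR_TYPES = {"mispronunciation", "substitution", "omission",
               "hesitation", "reversal"}

def words_correct_per_minute(words_attempted, miscues, minutes=1.0):
    """Score a timed reading: words attempted minus errors, per minute."""
    errors = sum(1 for m in miscues if m in ERROR_TYPES)
    return (words_attempted - errors) / minutes

# Example: 112 words attempted with 2 substitutions, 1 hesitation, and
# 1 self-correction (the self-correction is scored as correct).
score = words_correct_per_minute(
    112, ["substitution", "substitution", "hesitation", "self-correction"])
# score == 109.0
```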
It was raining outside, and there was nothing for Norman to do.
“I have the most boring life,” he moaned, as he plopped down on the couch. Just as he switched on the television, the power went out. Watching a blank television was not something Norman wanted to do. He looked around at the four dismal walls that kept him out of the rain.
“Now what am I going to do?”
“You could tidy up your room,” his mom suggested, “or organize your closet.
Your closet is a disaster, Norman. I’m actually frightened of what you might find in there. You haven’t cleaned it in a decade.”
There was nothing Norman could say after his mom had made up her mind. He was going to have to clean out his closet.
The only problem was that Norman couldn’t even open his closet door. He had it held closed with a large wooden block. There was so much junk in there that it wouldn’t stay shut on its own. To push aside the wooden block and open the door would mean doom for Norman. He’d be crushed by falling trash as soon as he turned the knob. He decided that he would only pretend to clean his closet, but his mother came into his bedroom.
“Well,” she said, placing her hands on her hips, “let’s see you get to work.”
Norman put both hands on the doorknob and tugged. The entire doorframe gave a mighty CREAK. There was a loud rumble as Norman was pushed back by the wave of forgotten junk he’d jammed into his closet. When the loud noise faded,
Norman was lying on his back under a mountain of broken toys, dirty socks, and books. With a groan, he lifted himself to his feet.
There was an awful smell wafting from somewhere inside. Norman looked into the depths of his closet. It was dark, dreary, and mysterious. Anything — absolutely anything — could be hiding in there. Maybe trolls, ghouls, or gnomes, Norman thought. This job could be an adventure! Pushing up his sleeves, Norman got to work.
Licensed to University of Missouri
For the 2003-2004 School Year
DN 290748
It was raining
Grade 5, Passage 10
Copyright 2001 Edformation, Inc.
All Rights Reserved
[Teacher copy only: cumulative word counts are printed at the end of each line of the passage: 12, 28, 42, 57, 64, 71, 84, 99, 107, 122, 131, 146, 162, 178, 193, 209, 214, 229, 242, 257, 271, 286, 296, 308, 321, 333, 347, 348.]
It was raining outside, and there was nothing for Norman to do.
“I have the most boring life,” he moaned, as he plopped down on the couch. Just as he switched on the television, the power went out. Watching a blank television was not something Norman wanted to do. He looked around at the four dismal walls that kept him out of the rain.
“Now what am I going to do?”
“You could tidy up your room,” his mom suggested, “or organize your closet.
Your closet is a disaster, Norman. I’m actually frightened of what you might find in there. You haven’t cleaned it in a decade.”
There was nothing Norman could say after his mom had made up her mind. He was going to have to clean out his closet.
The only problem was that Norman couldn’t even open his closet door. He had it held closed with a large wooden block. There was so much junk in there that it wouldn’t stay shut on its own. To push aside the wooden block and open the door would mean doom for Norman. He’d be crushed by falling trash as soon as he turned the knob. He decided that he would only pretend to clean his closet, but his mother came into his bedroom.
“Well,” she said, placing her hands on her hips, “let’s see you get to work.”
Norman put both hands on the doorknob and tugged. The entire doorframe gave a mighty CREAK. There was a loud rumble as Norman was pushed back by the wave of forgotten junk he’d jammed into his closet. When the loud noise faded,
Norman was lying on his back under a mountain of broken toys, dirty socks, and books. With a groan, he lifted himself to his feet.
There was an awful smell wafting from somewhere inside. Norman looked into the depths of his closet. It was dark, dreary, and mysterious. Anything — absolutely anything — could be hiding in there. Maybe trolls, ghouls, or gnomes, Norman thought. This job could be an adventure! Pushing up his sleeves, Norman got to work.
[Graph 1: student data points (X) plotted on a 0-100 vertical scale across weeks of instruction 1-14, with a trend-line and a goal-line marked.]
On this graph, the trend-line is steeper than the goal-line. Therefore, the student's end-of-year performance goal needs to be adjusted. The teacher increases the desired rate (or goal) to boost the actual rate of student progress. The new goal-line can be an extension of the trend-line.
The point of the goal increase is notated on the graph as a dotted vertical line. This allows teachers to visually note when the student's goal was changed. The teacher re-evaluates the student graph after another 7-8 data points to determine whether the student's new goal is appropriate or whether a teaching change is needed.
[Graph 2: student data points (X) plotted on a 0-100 vertical scale across weeks of instruction 1-14, with trend-lines and a goal-line marked.]
On this graph, the trend-line is flatter than the performance goal-line. The teacher needs to change the student’s instructional program. The end-of-year performance goal and goal-line are never decreased!
A trend-line below the goal-line indicates that student progress is inadequate to reach the end-of-year performance goal. The instructional program should be tailored to bring a student’s scores up so they match or surpass the goal-line.
The point of the instructional change is represented on the graph as a solid vertical line. This allows teachers to visually note when the student's instructional program was changed. The teacher re-evaluates the student graph after another 7-8 data points to determine whether the change was effective.
[Graph 3: student data points (X) plotted on a 0-100 vertical scale across weeks of instruction 1-14, with the trend-line matching the goal-line.]
If the trend-line matches the goal-line, then no change is currently needed for the student.
The teacher re-evaluates the student graph after another 7-8 data points to determine whether an end-of-year performance goal change or an instructional change needs to take place.
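Taken together, the three graphs describe a simple decision rule. It can be sketched as follows; the function name and the comparison tolerance are assumptions, not part of the source materials:

```python
# Sketch of the trend-line vs. goal-line decision rules described in
# Graphs 1-3. Slopes are in words correct per minute gained per week.

def cbm_decision(trend_slope: float, goal_slope: float,
                 tol: float = 0.05) -> str:
    """Return the instructional decision for a student's CBM graph."""
    if trend_slope > goal_slope + tol:
        # Trend steeper than goal: raise the end-of-year goal
        # (the goal is never decreased).
        return "raise goal"
    if trend_slope < goal_slope - tol:
        # Trend flatter than goal: change the instructional program.
        return "change instruction"
    # Trend matches goal: no change; collect another 7-8 data points.
    return "no change"
```

In every branch the teacher re-evaluates the graph after another 7-8 data points, so the rule is applied repeatedly over the year.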
This is Renee's Passage Reading Fluency Graph for the beginning of the school year. Renee is a student in a 2nd grade general education classroom, and her teacher is concerned that she is not progressing as quickly as her peers, most of whom are on track to be reading 75 words correct per minute by the end of the school year (the benchmark for proficiency). Renee's teacher knows that by week 18, midway through the school year, Renee should be reading about 45 words correct per minute to be on track for meeting the benchmark. Renee's teacher concludes that she needs Tier 2 intervention, and a reading tutoring program is chosen.
Is the Tier 2 tutoring effective for Renee?
Is Renee on track to meet the end-of-the-year benchmark of 75 words per minute?
Does Renee require further intervention?
[Renee's graph: four data points (X) plotted over weeks of instruction 1-18 on a 0-80 vertical scale.]
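Renee's on-track trajectory can be sketched as a straight aimline. The 36-week school year and the 15-wcpm fall baseline below are assumptions, chosen only so the line passes through the stated week-18 target of 45 wcpm and the 75-wcpm end-of-year benchmark:

```python
# Sketch of Renee's aimline, assuming a 36-week year and a linear path
# from an assumed fall starting score to the 75-wcpm benchmark. The
# case study states she should be at about 45 wcpm by week 18 (midway),
# which is consistent with a line from roughly 15 wcpm at week 0 to
# 75 wcpm at week 36.

START_WCPM = 15   # assumed fall baseline (not given in the source)
END_WCPM = 75     # end-of-year proficiency benchmark
WEEKS = 36        # assumed length of the school year

def expected_wcpm(week: int) -> float:
    """On-track words correct per minute at a given week of instruction."""
    return START_WCPM + (END_WCPM - START_WCPM) * week / WEEKS

# expected_wcpm(18) -> 45.0, matching the midyear target in the text.
```

Comparing Renee's plotted scores against this aimline is one way to answer the three questions above.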
Here is Justin's Passage Reading Fluency Graph. Justin is a third-grade student in a general education class whose teacher, concerned about his response to universal instruction (Tier 1), offered a Tier 2 tutoring intervention. At the beginning of the Tier 2 tutoring, Justin's teacher set a goal for him to make gains in Passage Reading Fluency at the rate of an additional 0.75 words correct per minute for the next 16 weeks of instruction. She is now evaluating Justin's progress nine weeks into Tier 2.
Is Tier 2 meeting Justin's reading instruction needs?
Is Justin on track to meet the goal set by his teacher?
Should Justin be placed in Tier 3, or should another Tier 2 intervention be tried? What would Tier 3 look like, vs. an alternative Tier 2 intervention?
[Justin's graph: data points (X) plotted over weeks of instruction 1-25 on a 0-70 vertical scale, with the Tier 1 and Tier 2 phases marked.]
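Justin's goal line follows directly from the stated growth rate. A sketch, where the baseline score is an assumed value (the case study gives only the rate):

```python
# Sketch of Justin's Tier 2 goal line: an assumed baseline score plus
# gains of 0.75 words correct per minute per week of Tier 2 tutoring.
# The growth rate comes from the case study; the baseline is assumed.

GROWTH_PER_WEEK = 0.75  # words correct per minute, from the case study
TIER2_WEEKS = 16

def goal_line(baseline_wcpm: float, weeks_into_tier2: int) -> float:
    """Goal score after a given number of Tier 2 weeks."""
    return baseline_wcpm + GROWTH_PER_WEEK * weeks_into_tier2

# With an assumed baseline of 30 wcpm, the 16-week goal is:
# goal_line(30, 16) -> 42.0
```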
*Please do not reproduce or distribute any of the following without permission.
Contact or 1-866-770-6111.
Department of Educational Studies: Special Populations
Dr. Todd Busch
Required Texts:
Overton. (2006). Assessing Learners with Special Needs: An Applied Approach (5th ed.). New York: Pearson Merrill Prentice Hall.
Course Objectives
Through completion of course assignments and activities, students will:
1) describe assessment procedures and techniques of: observational/informal/teacher-made assessment, curriculum-based measurement, criterion-referenced assessment, and norm-referenced assessment;
2) calculate and interpret the following descriptive statistics and reliability indicators: measures of central tendency, measures of dispersion, standard deviation, percentile rank, standard error of measurement, and confidence intervals;
3) interpret and synthesize evaluation data into a usable report that facilitates teaching and meeting students' individual needs;
4) administer a minimum of one standardized assessment tool designed to measure a student's academic achievement;
5) evaluate instruments in the areas of reliability, validity, and representativeness of the norming sample;
6) specify the national and state classification guidelines for EBD and LD;
7) develop and administer curriculum-based measures in one academic area and chart student progress, making instructional changes when necessary.
Minnesota Board of Teaching Standards of Effective Practices

MN SEP Standards (with Level and Assessment)

Standard 2, student learning.
B. Understand that a student's physical, social, emotional, moral, and cognitive development influence learning and know how to address these factors when making instructional decisions. (Knowledge)
E. Assess both individual and group performance and design developmentally appropriate instruction that meets the student's current needs in the cognitive, social, emotional, moral, and physical domains. (Knowledge)
Assessments: Test topic (informal & formal testing; eligibility assessment); CBM Project; Assessment Summary; Lecture & Project.
Standard 8, assessment.
A. Be able to assess student performance toward achievement of the Minnesota graduation standards under chapter 3501. (Knowledge)
B. Understand the characteristics, uses, advantages, and limitations of different types of assessments, including criterion-referenced and norm-referenced instruments, traditional standardized and performance-based tests, observation systems, and assessments of student work. (Knowledge, Practice)
C. Understand the purpose of and differences between assessment and evaluation. (Knowledge)
D. Understand measurement theory and assessment-related issues, including validity, reliability, bias, and scoring concerns. (Knowledge)
Assessments for A-D: Assessment Report; CBM Project; Test Manual Review; test topics (norm-referenced tests; informal assessment; basic concepts; reliability & validity; basic statistics); lectures and text chapters.

E. Select, construct, and use assessment strategies, instruments, and technology appropriate to the learning outcomes being evaluated and to other diagnostic purposes. (Knowledge, Practice)
F. Use assessment to identify student strengths and promote student growth and to maximize student access to learning opportunities. (Knowledge, Practice)
G. Use varied and appropriate formal and informal assessment techniques, including observation, portfolios of student work, teacher-made tests, performance tasks, projects, student self-assessments, peer assessment, and standardized tests. (Knowledge, Practice)
H. Use assessment data and information about student experiences, learning behaviors, needs, and progress to increase knowledge of students, evaluate student progress and performance, and modify teaching and learning strategies. (Knowledge, Practice)
I. Implement student self-assessment activities to help students identify their own strengths and needs and to encourage them to set personal learning goals. (Practice)
L. Establish and maintain student records of work and performance. (Practice)
M. Responsibly communicate student progress based on appropriate indicators to students, parents or guardians, and other colleagues. (Knowledge, Practice)

Standard 10, collaboration, ethics, and relationships.
F. Understand data practices. (Knowledge, Practice)

Assessments for E-M and Standard 10: test topics (diagnostic tests; informal assessment; basic statistics; reliability & validity; norm-referenced scores); Compuscore interpretation; Assessment Report; oral interpretation of Assessment Report; CBM Project; class portfolio assignment; behavioral interview.
CEC Standards – Core, EBD, LD

CEC National Standards (with Level and Assessment)

Standard 1 - Foundations
S1K5 Issues in definition & identification of individuals with exceptional learning needs, including those from culturally & linguistically diverse backgrounds. (Knowledge)
S1K5 Typical and atypical human growth and development. (Knowledge)
S1K6 Issues, assurances & due process rights related to assessment, eligibility, & placement within a continuum of services. (Knowledge)
LDS1K4 Laws and policies regarding pre-referral, referral, and placement procedures for individuals who may have learning disabilities. (Knowledge)
LDS1K5 Current definitions and issues related to the identification of individuals with learning disabilities. (Knowledge)

Standard 8 - Assessment
S8K1 Basic terminology used in assessment. (Knowledge)
S8K2 Legal provisions and ethical principles regarding assessment of individuals. (Knowledge)
S8K3 Screening, prereferral, referral, and classification procedures. (Knowledge)
S8K4 Use and limitations of assessment instruments. (Knowledge)
S8S4 Develop or modify individualized assessment strategies. (Practice)
S8S7 Gather relevant background information. (Practice)
S8S8 Administer nonbiased formal and informal assessments. (Practice)
S8S9 Use technology to conduct assessments. (Practice)
S8S11 Interpret information from formal and informal assessments. (Knowledge, Practice)
S8S12 Use assessment information in making eligibility, program, and placement decisions for individuals with exceptional learning needs, including those from culturally and/or linguistically diverse backgrounds. (Knowledge)
S8S13 Report assessment results to all stakeholders using effective communication skills. (Knowledge, Practice)
ES8K1 Characteristics of behavioral rating scales. (Knowledge)
ES8K2 Policies and procedures involved in the screening, diagnosis, and placement of individuals with emotional/behavioral disorders, including academic and social behaviors. (Knowledge)
ES8K3 Types and importance of information concerning individuals with emotional/behavioral disorders available from families and public agencies. (Knowledge)
ES8K5 Prepare assessment reports on individuals with emotional/behavioral disorders based on behavioral-ecological information. (Practice)
ES8S2 Assess appropriate and problematic social behaviors of individuals with emotional/behavioral disorders. (Practice)
LDS8K1 Terminology and procedures used in the assessment of individuals with learning disabilities. (Knowledge)
LDS8K2 Factors that could lead to misidentification of individuals as having learning disabilities. (Knowledge)
LDS8K3 Procedures to identify young children who may be at risk for learning disabilities. (Knowledge)
LDS8S1 Choose and administer assessment instruments appropriate to the individual with learning disabilities. (Practice)

Assessments addressing these standards: Assessment Report; oral interpretation of Assessment Report; Standardized Test Administration; CBM Project; Manual Evaluation; test topics (ethics; special considerations; cultural diversity; norm-referenced tests; achievement tests; intelligence tests; diagnostic tests; descriptive statistics; basic statistics; reliability and validity of tests; teacher-made tests; rubrics; rating scales; assessment of behavior; informal and formal testing; assessment for eligibility; interpreting assessment for education intervention; special considerations-EC).
Course Requirements
Simulated Standardized Test Administration** 25 points
Students will administer one standardized academic achievement battery with another adult. Possible tests are: KTEA-R, KTEA-II, WJ-R, WJ-III, WIAT, WRAT3, PIAT-R-NU. One adult will be the student taking the test while the other administers the instrument. Performance will be rated in terms of administration, development of rapport with the "student", scoring, and interpretation. Students can either hand-score the record form or use scoring software. Students must write up the results of the administration as they relate to the student's strengths and weaknesses in the tested domains.
** For this assignment the instructor will charge $2 for protocols and/or student workbooks. These monies will be used to buy new protocols to keep the assessment library stocked. Students who use their own protocols will not be assessed this fee.
Standardized Test Manual Evaluation 25 points
Students will evaluate a test manual and complete one summary of the test's characteristics. Much of the work for this evaluation will be done in the classroom; however, each student must summarize the findings of the evaluation and turn in the evaluation.
Curriculum-Based Measurement Project 100 points
Students will complete a ten-week CBM project on a student in one of the following academic areas: reading, written expression, or math. The CBM project will contain: (a) baseline data, (b) a long-range goal line, (c) a short-term objective, (d) at least two intervention phases, (e) a graph of the data, and (f) intervention plans.
Assessment Report 100 points
Students will generate an Assessment Summary based on the format and criteria given in class. Students will be given assessment information about a child and determine whether the student qualifies for special education services based on the information provided.
Midterm Examination 50 points
The midterm will consist of multiple choice, short answer, and/or matching. A calculator is needed and any formulas required will be supplied.
Final Examination 50 points
The final examination will consist of multiple choice, and short answer. The final is cumulative and will consist of approximately 80% new content and 20% content from the midterm.
Participation 50 points
Students are expected to participate in class discussions and group work. Students who are chronically late to class, leave early, or do not attend class will have 5 points deducted for each infraction.
Course Grading:
Grading will be based on a "sliding percentage": the highest grade in the class will represent the total points possible for the course. Using this point total, the rest of the grades will be calculated using the following percentages:

Percentage    Letter Grade
90-100        A
80-89         B
70-79         C
60-69         D
below 60      F
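The sliding-percentage scheme described above can be sketched as a short function; the names and the example point totals are illustrative, not from the syllabus:

```python
# Sketch of the "sliding percentage" grading scheme: the highest score
# in the class is treated as the total points possible, and each
# student's percentage is computed against that maximum.

def letter_grade(score: float, class_high: float) -> str:
    """Letter grade under a sliding-percentage scale."""
    pct = 100.0 * score / class_high
    if pct >= 90:
        return "A"
    if pct >= 80:
        return "B"
    if pct >= 70:
        return "C"
    if pct >= 60:
        return "D"
    return "F"

# If the top student earns 380 points, a score of 310 is about 81.6%,
# which falls in the 80-89 band: a B.
```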
Class Policies:
Content and lecture notes will be provided online via the Desire2Learn (D2L) site. Students will be given access to this site. Any problems accessing the site should be addressed to the Technology Help Desk; the instructor does not have the ability to add people into the D2L system.
Changes in assignments or other important class information will be made using email. The instructor will be using the MNSU email account assigned to the students. It is the responsibility of the student to check this account periodically for any announcements. Alternate emails (ones not provided by MNSU) will not be contacted.
Other than in the case of medical emergencies, incompletes will not be given for this class. To qualify for an incomplete, the student will be required to provide medical documentation showing that the condition prevented them from completing the assignments for this course.
If a student requires an accommodation for an assignment, the student must discuss this accommodation before the assignment is due. Accommodations will not be discussed the night an assignment is due. If the assignment is turned in late, it is subject to the late assignment policy stated below.
Late assignments will be penalized one quarter of the total points for the assignment for each week it is late.
Assignments are considered late if they are not turned in on the night they are due. Therefore, even if an assignment is only one day late, the student will incur a 25% penalty.
Weather conditions may cause a class to be cancelled. If a class is to be cancelled, the instructor will email all students in the class by 2 p.m. the day of class. If no email is sent, students should assume class is meeting as usual.

Tentative Topic Schedule
August 31: Syllabus; Course Overview; Processes and Concerns in Assessment; Assessment vs. Evaluation; Types of Assessment (Chapter 1)
September 7: Progress Monitoring: The Problem-Solving Model (Deno Article)
September 14: Developing, Administering, and Scoring CBM Measures (Chapter 5)
September 21: Data Utilization and Writing Instructional Plans
September 28: Legal and Ethical Issues in Assessment
October 5: Descriptive Statistics (Chapters 2, 3)
October 12: Reliability and Validity (Chapter 4)
October 19: Norm-Referenced Assessment and Test Development (Chapter 7); Midterm Exam
October 26: Tests of Academic Achievement (Chapter 8)
November 2: Behavioral Observations; Assessing Behavior (Chapter 6)
November 9: Qualifying Criteria for LD, EBD; Teacher Decision Making (Chapter 11)
November 16: Writing Assessment Summary Reports
November 23: Response to Intervention for SLD Qualification
November 30: Teacher-made tests, portfolios, rubrics
December 7: Measures of Intelligence (Chapter 9)
December 14: Final
SPED 6631 & TL 6131, Spring 2006
Dr. Michelle Hosp
Course Description:
The purpose of this course is to help practicing teachers become proficient with a variety of formal and informal assessment and instructional procedures, and to strengthen the connections between them. Teachers will learn to screen students for reading problems or potential reading problems, diagnose students' reading strengths and needs, and monitor students' progress to ensure that students make optimal progress in reading. Teachers will also learn procedures for managing and analyzing assessment data. Teachers will receive hands-on practice by tutoring a student throughout the semester. Because you will be working with students directly, you will have the opportunity to apply the content you will be learning under the supervision of clinical faculty. The U of U Reading Clinic faculty will provide ongoing feedback and guidance during your tutoring.
The course will also focus on the assessment/instruction cycle, and how to use assessment data to design and implement instructional interventions to increase students’ reading achievement.
Instructional procedures will be based on scientifically-based reading research (NRP, 2000; Snow, Burns, & Griffin, 1998) and will focus on building students' oral language and background knowledge, teaching alphabet knowledge and phonemic awareness, teaching students to recognize and use common phonics spelling patterns, building students' vocabulary, increasing fluency, teaching students to apply comprehension strategies, and fostering students' reading engagement.
1. Understand various physical, emotional, developmental, social, linguistic, and instructional contributors to reading problems. (IRA Standard 4.1)
2. Understand and be able to use a wide variety of assessment tools and practices that range from individual and standardized group tests to informal, individual, and group classroom assessment strategies appropriate to the primary, intermediate, and secondary levels. (IRA Standard 10.1)
3. Use assessment procedures for a) screening students for reading problems, b) diagnosing students’ reading strengths and needs, and c) monitoring students’ progress in reading. (IRA Standard 4.2, 10.2)
4. Use assessment information to plan and implement effective individual, small- and whole- group reading intervention strategies appropriate for students at the primary, intermediate, and secondary levels. (IRA Standard 10.2)
5. Use reading assessment data to design and implement instruction and interventions in the following areas (IRA Standard 4.5):
• oral language and background knowledge instruction
• vocabulary instruction
• alphabet knowledge instruction
• phonemic awareness instruction
• phonics instruction
• fluency instruction
• comprehension instruction
• reading engagement
6. Evaluate their own instruction. (IRA Standard 16.2)
7. Effectively communicate the results of these assessments to students, parents, colleagues, and administrators. (IRA Standard 10.2)
McCormick, S. (2003). Instructing Students who have Literacy Problems (4th ed.). NJ: Prentice Hall.
Morris, D. (1999). The Howard Street tutoring manual: Teaching at-risk readers in the primary grades. New York: Guilford.
Additional course readings are on reserve electronically or at the Special Education Office.
Tutoring binder available from the UU Reading Clinic (cost: $25.00).
1) Tutoring: Each student will have an assigned responsibility to tutor a child who is experiencing reading difficulties. Tutoring performance will be graded based on three criteria.
A. A minimum of 21 tutoring sessions (out of 23 possible) must be completed by April 26, 2006. You are allowed to miss 2 sessions but MUST provide at least 6 hours' notice to the reading clinic and the parents of the student. If you have more than 2 misses, they must be made up on your own time and arranged with the parents. More than 2 misses that have not been made up will result in a failing grade for the course.
B. Lesson plans for each tutoring session must be satisfactorily completed. Your lesson plans will be checked five times throughout the semester.
C. Demonstration of satisfactory implementation of tutoring materials. This will be based on two formal observations; a third observation may be used if needed.
2) Reading Assessment and Intervention Case Study: Throughout the semester you will compile information and write sections of a larger paper that will provide a complete case study of your student. Your case study will summarize the student's strengths and weaknesses, describe the interventions used, describe the assessments used, and report on the student's progress as well as what you have gained from the experience. Please see the additional handout for further instructions.
3) Summarizing Readings: Several times over the semester you will provide a one-page typed summary of the additional required readings and be asked to present your paper orally. Please see the additional handout for further instructions.
4) Midterm: Given during week 8, on material related to assessment. The midterm will contain a "closed book" portion and an "open book" portion. The closed book portion will consist of multiple choice and short answer questions; the open book portion will consist mostly of short answer questions.
There will not be a final.
NOTE FOR ALL COURSE REQUIREMENTS : All assignments are to be turned in at the start of class on the date they are due or otherwise specified (e.g., tutoring observations). Late assignments will result in a loss of 10% PER DAY except for extreme emergencies. I reserve the right to deem what is an emergency.
If you wish to dispute the grade assigned to a paper or a question on the midterm, you must do so in writing within 24 hours after the exam or paper has been returned. You must include a specific rationale for why your answer is correct or why the paper deserves a higher grade. "I think I deserve a better grade" does not constitute a rationale.
Attendance, Participation, and Conduct:
Student attendance and participation are essential as this class involves a tutoring component and is the culmination of the reading endorsement level one courses.
In order for you to participate and learn in this class you will be expected to arrive at the beginning of each class time and stay throughout each class period.
Students are expected to participate in class discussions. As the instructor, I will try to foster an environment that is free from criticism and respectful of everyone's ideas.
Due to the amount of information we will be addressing in this course, it is essential that you complete the readings prior to class. This will allow me to move through the course content without re-explaining concepts addressed in the readings, and will allow you to more fully participate in class discussion.
During class, please turn off cell phones and pagers (or place on vibrate) and refrain from talking when either the instructor or other students are talking.
If you miss class it is your responsibility to get the notes from another student. After you have gone through the notes, you are welcome to ask specific questions related to the content, but there will be no re-teaching of the missed content.
Academic integrity is essential to a positive teaching and learning environment. All students enrolled in University courses are expected to complete coursework responsibilities with fairness and honesty. Failure to do so, whether by seeking unfair advantage over others or by misrepresenting someone else's work as your own, can result in disciplinary action.
For a more detailed description of the student code of conduct, refer to the University of Utah Student Code (see below); also see the Department of Special Education Student Handbook at www.ed.utah.edu/sped/enrolled/enrolled.htm.
All students are expected to conduct themselves in accordance with the College of
Education Policies and Procedures Governing Academic Performance and Professional
Conduct as well as the University of Utah Code of Student Rights and Responsibilities
(available at www.admin.utah.edu/ppmanual/8/8-10.html)
21
It is expected that you will protect the rights of confidentiality afforded to students and their families inside and outside of this class. Often in class we will discuss individual students, assessment situations, and outcomes. Your experience with students and their families is helpful in adding to the content and understanding of issues in applied settings. However, when discussing an individual student you are working with or have worked with in the past, please remember to keep this information confidential and do not talk about these students outside of the class setting.
Evaluation Procedures

Midterm: 75 points
Case Study (25 points each section): 75 points
Number of tutoring sessions (22 minimum): 50 points
Lesson Plans (5 checks): 25 points
Observations of Intervention (2 formal): 50 points
Summarizing Readings: 25 points
Total: 300 points

Grading Scale:
A = 95-100%
A- = 90-94%
B+ = 87-89%
B = 83-86%
B- = 80-82%
C+ = 77-79%
C = 73-76%
C- = 70-72%
D+ = 67-69%
D = 63-66%
D- = 60-62%
F = 59% and below

Grades:
300 – 285 (A)
284 – 270 (A-)
269 – 261 (B+)
260 – 249 (B)
248 – 240 (B-)
239 – 231 (C+)
230 – 219 (C)
218 – 210 (C-)
209 – 201 (D+)
200 – 189 (D)
188 – 180 (D-)
179 or below (F)
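The point cutoffs follow directly from applying the percentage bands to the 300 total points (for example, the B+ band starts at 87% of 300 = 261). A quick sketch of the conversion:

```python
# Convert the percentage grade bands into minimum point cutoffs for a
# 300-point course. Each cutoff is the band's lower bound applied to
# the point total.
TOTAL_POINTS = 300
band_floors = {"A": 95, "A-": 90, "B+": 87, "B": 83, "B-": 80,
               "C+": 77, "C": 73, "C-": 70, "D+": 67, "D": 63, "D-": 60}

cutoffs = {grade: pct * TOTAL_POINTS // 100 for grade, pct in band_floors.items()}
print(cutoffs)  # e.g., an A requires at least 285 points, a B+ at least 261
```

The same arithmetic reproduces every row of the points-based table.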
Tentative Course Schedule
This syllabus and schedule are subject to change in response to student learning and/or in the event of extenuating circumstances.
Note: M = McCormick; TS = Tutoring Session; HS = Howard Street Tutoring Manual; S & Y = Salvia & Ysseldyke, Assessment (9th ed.); READINGS = Additional Readings (see attached list)

Week 1
Jan 9 (Mon), 5-6: Hand out Syllabus, Review Materials, Model Tutoring
Jan 11 (Wed), 4:55-7:55: Course Overview; Tutoring Programs and English Language Learners
Readings/Assignments Due: M 2 & 3; READINGS

Week 2
Jan 16 (Mon): MLK Holiday
Jan 18 (Wed), 4:55-7:55: Review Tutoring and Practice with Each Other; Struggling Readers, Issues Related to Reading & Assessment
Readings/Assignments Due: M 4; READINGS; Word Study Kit

Week 3
Jan 23 (Mon), 5-6: TS #1
Jan 25 (Wed), 4:55-7:55: TS #2; Review Phonological Awareness and Purposes of Assessment
Readings/Assignments Due: HS 4; READINGS

Week 4
Jan 30 (Mon), 5-6: TS #3
Feb 1 (Wed), 4:55-7:55: No TS; Descriptive Statistics and Quantification of Test Performance, CBM
Readings/Assignments Due: HS 5; S & Y 4 & 5; READINGS

Week 5
Feb 6 (Mon), 5-6: TS #4
Feb 8 (Wed), 4:55-7:55: TS #5; Review Decoding and Correlation and Norms
Readings/Assignments Due: S & Y 4 & 6; READINGS; Case Study Part I

Week 6
Feb 13 (Mon), 5-6: TS #6
Feb 15 (Wed), 4:55-7:55: TS #7; Review Fluency, Reliability, SEM, True Scores, and Confidence Intervals
Readings/Assignments Due: S & Y 7; READINGS

Week 7
Feb 20 (Mon): President's Day Holiday
Feb 22 (Wed), 4:55-7:55: TS #8; Review Vocabulary and Comprehension, Validity, and Review for Midterm
Readings/Assignments Due: S & Y 8; READINGS

Week 8
Feb 27 (Mon), 5-6: TS #9
Mar 1 (Wed), 4:55-7:55: TS #10; Midterm

Week 9
Mar 6 (Mon), 5-6: TS #11
Mar 8 (Wed), 4:55-7:55: No TS; Assessing reading skills using DIBELS
Readings/Assignments Due: M 5; READINGS

Week 10
Mar 20 (Mon), 5-6: TS #12
Mar 22 (Wed), 4:55-7:55: TS #13; Review DIBELS, Assessing general reading levels
Readings/Assignments Due: M 6; READINGS; Case Study Part II

Week 11
Mar 27 (Mon), 5-6: TS #14
Mar 29 (Wed), 4:55-7:55: TS #15; Assessing reading skills using CTOPP & TPRI
Readings/Assignments Due: M 7; READINGS

Week 12
Apr 3 (Mon), 5-6: TS #16
Apr 5 (Wed), 4:55-7:55: TS #17; Other Assessments and Reading Strategies
Readings/Assignments Due: M 9 & 10; READINGS

Week 13
Apr 10 (Mon), 5-6: TS #18
Apr 12 (Wed), 4:55-7:55: TS #19; Assessment Training NISSI, Peer Assisted Learning Strategies (PALS)
Readings/Assignments Due: M 11; READINGS

Week 14
Apr 17 (Mon), 5-6: TS #20
Apr 19 (Wed), 4:55-7:55: TS #21; Additional Reading Strategies
Readings/Assignments Due: M 12 & 13; READINGS

Week 15
Apr 24 (Mon), 5-6: TS #22
Apr 26 (Wed), 5-6: TS #23; Last Tutoring Session
Readings/Assignments Due: Case Study Part III
Additional Required Readings
Week 1 Tutoring Programs and English Language Learners
Elbaum, B., Vaughn, S., Tejero Hughes, M., & Watson Moody, S. (2000). How effective are one-to-one tutoring programs in reading for elementary students at risk for reading failure? A meta-analysis of the intervention research. Journal of Educational Psychology, 92, 605-619.
Linan-Thompson, S., Vaughn, S., Hickman-Davis, P., & Kouzekanani, K. (2003). Effectiveness of supplemental reading instruction for second-grade English language learners with reading difficulties. Elementary School Journal, 103, 221-235.
Week 2 Struggling Readers and Issues Related to Reading & Assessment
Snow, C., Burns, M.S., & Griffin, P. (1998). Who has reading difficulties? In Preventing Reading Difficulties in Young Children (pp. 87-99). Washington, DC: National Academy Press.
Snow, C., Burns, M.S., & Griffin, P. (1998). Predictors of success and failure in reading. In Preventing Reading Difficulties in Young Children (pp. 100-133). Washington, DC: National Academy Press.
Stanovich, K.E. (2000). Twenty-five years of research on the reading process: The grand synthesis and what it means for our field. In K.E. Stanovich (Ed.), Progress in understanding reading: Scientific foundations and new frontiers (pp. 405-417). New York: Guilford.
Week 3 Review Phonological Awareness and Purposes of Assessment
Snider, V. E. (1995). A primer on phonemic awareness: What it is, why it's important, and how to teach it. School Psychology Review, 24, 443-455.
Blachman, B. A. (2000). Phonological awareness. In M. L. Kamil, P. B. Mosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook of reading research: Vol. III (pp. 483-502). Mahwah, NJ: Erlbaum.
Week 4 Descriptive Statistics and Quantification of Test Performance, CBM
Salvia, J., & Ysseldyke, J. E. (2003). Descriptive statistics. In Assessment (9th ed., pp. 68-89). Boston: Houghton Mifflin.
Salvia, J., & Ysseldyke, J. E. (2003). Quantification of test performance. In Assessment (9th ed., pp. 90-105). Boston: Houghton Mifflin.
Hosp, M. K., & Hosp, J. (2003). Curriculum-based measurement for reading, math, and spelling: How to do it and why. Preventing School Failure, 48(1), 10-17.
Week 5 Review Decoding and Correlation and Norms
Ehri, L. C. (1997). Sight word learning in normal readers and dyslexics. In B. Blachman (Ed.), Foundations of reading acquisition and dyslexia: Implications for early intervention (pp. 163-190). Mahwah, NJ: Erlbaum.
Ehri, L. C., & McCormick, S. (1998). Phases of word learning: Implications for instruction with delayed and disabled readers. Reading & Writing Quarterly: Overcoming Learning Difficulties, 14, 135-163.
Moats, L. C. (1998). Teaching Decoding. American Educator, 22, 42-49.
Week 6 Review Fluency, Reliability, SEM, True Scores, and Confidence Intervals
Fuchs, L. S., Fuchs, D., Hosp, M. K., & Jenkins, J. R. (2001). Oral reading fluency as an indicator of reading competence: A theoretical, empirical, and historical analysis. Scientific Studies of Reading, 5, 239-256.
Mastropieri, M. A., Leinart, A., & Scruggs, T. E. (1999). Strategies to increase reading fluency.
Intervention in School and Clinic, 34, 278-283.
Strecker, S. K., Roser, N. L., & Martinez, M. G. (1998). Toward understanding oral reading fluency.
National Reading Conference Yearbook, 47, 295-310.
Week 7 Review Vocabulary and Comprehension and Validity
Blachowicz, C. L., & Fisher, P. (2000). Vocabulary instruction. In M. L. Kamil, P. B. Mosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook of reading research: Vol. III (pp. 503-524). Mahwah, NJ: Erlbaum.
Pressley, M., Symons, S., McGoldrick, J. A., & Snyder, B. L. (1995). Reading comprehension strategies. In M. Pressley & V. Woloshyn (Eds.), Cognitive strategy instruction that really improves children's academic performance (pp. 57-100). Cambridge, MA: Brookline Books.
Stanovich, K. E., West, R. F., Cunningham, A. E., & Cipielewski, J. (1996). The role of inadequate print exposure as a determinant of reading comprehension problems. In C. Cornoldi & J. Oakhill (Eds.), Reading comprehension difficulties: Processes and interventions (pp. 15-32). Mahwah, NJ: Erlbaum.
Week 9 Assessing reading skills using DIBELS
Good, R. H., Gruba, J., & Kaminski, R. A. (2002). Best practices in using dynamic indicators of basic early literacy skills (DIBELS) in an outcomes-driven model. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology IV (pp. 679-700). Bethesda, MD: National Association of School Psychologists.
Goswami, U. (1998). The role of analogies in the development of word recognition. In J. L. Metsala & L. C. Ehri (Eds.), Word recognition in beginning literacy (pp. 41-64). Mahwah, NJ: Erlbaum.
Week 10 Review DIBELS, Assessing general reading levels
Brown, K.J. (1999-2000). What kind of text–for whom and when? Textual scaffolding for beginning readers. The Reading Teacher, 53, 292-307.
Ives-Wiley, H., & Deno, S. L. (2005). Oral reading and maze measures as predictors of success for English learners on a state standards assessment. Remedial and Special Education, 26, 207-214.
Week 11 Assessing reading skills using CTOPP & TPRI
Cunningham, A. E., & Stanovich, K. E. (1998). What reading does for the mind. American
Educator, 22, 8-15.
Perfetti, C. A. (2003). The universal grammar of reading. Scientific Studies of Reading, 7, 3-24.
Week 12 Other Assessments and Reading Strategies
Beck, I., & Juel, C. (1995). The role of decoding in learning to read. American Educator, 19(2), 8-42.
Brown, K.J. (in press). What should I say when they get stuck on a word? Aligning teachers’ prompts with students’ development. The Reading Teacher.
Week 13 Peer Assisted Learning Strategies (PALS)
Fuchs, D., Fuchs, L. S., & Burish, P. (2000). Peer-assisted learning strategies: An evidence-based practice to promote reading achievement. Learning Disabilities Research & Practice, 15, 85-91.
Mathes, P. G., Howard, J. K., Allen, S., & Fuchs, D. (1998). Peer-assisted learning strategies for first-grade readers: Making early reading instruction more responsive to the needs of diverse learners. Reading Research Quarterly, 33, 62-95.
Week 14 Additional Reading Strategies
Lovett, M. W., Lacerenza, L., Borden, S., Frijters, J. C., Steinbach, K. A., & De Palma, M. (2000). Components of effective remediation for developmental reading disability: Combining phonological and strategy-based instruction to improve outcomes. Journal of Educational Psychology, 92, 263-283.
Reading Instruction for Students with Mild/Moderate Disabilities
Special Education 5122/6122, Fall 2006
Michelle Hosp, Ph.D.
COURSE DESCRIPTION
This course is designed for prospective special education teachers in the mild/moderate program.
Empirically validated instructional procedures are presented to address reading for students with disabilities. The focus will be on assessing students' skills, planning and implementing appropriate instructional procedures, and monitoring students' progress. Students will apply their skills in the classroom setting.
Course Reading Materials
Required:
Carnine, D. W., Silbert, J., Kameenui, E. J., & Tarver, S. G. (2004). Direct Instruction Reading (4th ed.). Upper Saddle River, NJ: Merrill.
Fry, E. B., Kress, J. E., & Fountoukidis, D. L. (1993). The Reading Teacher's Book of Lists (5th ed.). Paramus, NJ: Prentice Hall.
Additional Readings located on WebCT and on CD in the Special Education Office.
Subscribe to Kevin Feldman's listserv by sending an email to: literacy-on@lists.scoe.org
Course Objectives
1. Identify students' instructional needs through assessments
2. Demonstrate knowledge of determining appropriate instructional strategies
3. Demonstrate mastery of using Direct Instruction
4. Implement appropriate instructional strategies based on students' skill levels
5. Monitor student performance using curriculum-based measurement (CBM)
6. Demonstrate using CBM to modify instructional material
7. Write goals and objectives for Individualized Education Programs (IEPs)
Course Requirements
Attendance, Participation, and Conduct
Student attendance and participation are essential as this class is the foundation for reading instruction in your program. In order for you to participate and learn in this class you will be expected to arrive at the beginning of each class time and stay throughout each class period.
Attendance will be taken at the beginning of each class. Students who arrive late will not be counted as present.
Students are expected to participate in class discussions. As the instructor, I will try to foster an environment that is free from criticism and respectful of everyone's ideas. During class, please turn off cell phones and pagers (or place on vibrate) and refrain from talking when either the instructor or other students are talking.
All students are expected to conduct themselves in accordance with the College of Education Policies and Procedures Governing Academic Performance and Professional Conduct as well as the University of Utah Code of Student Rights and Responsibilities (available at www.admin.utah.edu/ppmanual/8/8-10.html).
Requirements/Assignments:
Correct Pronunciation of Letter Sounds (20 pts)
You will produce the correct sounds of letters and letter combinations. (Additional handout)
Goals & Objectives (50 points total @ 10 points each)
You will write IEP goals and objectives for 5 separate skills covered in class. (Additional handout)
Decoding Assessment & DIBELS Nonsense Word Fluency (NWF) (30 pts)
You will administer a decoding assessment to one student you are working with. This assessment will be used to determine the student's areas of strength and weakness in decoding. (Additional handout)
Reading CBM (50 pts total)
You will administer CBM reading passages for survey level assessment and graph the data for one student over 12 weeks. (Additional handout)
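A CBM progress graph of the kind required here typically plots each week's words-correct-per-minute (WCPM) score against an aim line drawn from the baseline score to the goal. As a rough sketch of the bookkeeping involved (the baseline, goal, and weekly scores below are invented for illustration, not course data):

```python
# Sketch of CBM progress-monitoring bookkeeping: an aim (goal) line from
# baseline to goal, plus the actual growth rate from weekly scores.
# All numbers are hypothetical examples.

def aim_line(baseline, goal, points):
    """Expected score at each data point on a straight line to the goal."""
    step = (goal - baseline) / (points - 1)
    return [baseline + step * i for i in range(points)]

def weekly_growth(scores):
    """Ordinary least-squares slope: average words correct gained per week."""
    n = len(scores)
    mean_x = (n - 1) / 2
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Hypothetical: baseline of 42 WCPM, goal of 66 WCPM over 12 weeks
# (13 data points), with 7 weeks of scores collected so far.
expected = aim_line(baseline=42, goal=66, points=13)
observed = [42, 45, 44, 48, 51, 50, 54]

slope = weekly_growth(observed)
needed = expected[1] - expected[0]  # required growth per week
print(f"actual growth {slope:.2f} vs. needed {needed:.1f} WCPM/week")
```

Charting the observed scores against the expected values then shows at a glance whether the student is on track toward the goal.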
D.I. Teaching Demonstration (25 pts)
You will pair with another student and demonstrate how to teach a reading lesson using D.I. techniques. (Additional handout)
Reading Strategy Demonstration (25 pts)
You will demonstrate for the class a reading strategy that you like (e.g., from the Kevin Feldman listserv, your practicum placement, the additional readings, or other reading courses). The demonstration will use D.I. techniques. (Additional handout)
Oral and Written Summary of Additional Readings (100 Points)
All students are expected to read the additional readings before each class. Graduate students will also turn in a one-page typed summary of each additional reading assigned to their group, for a total of 15 readings to be turned in. Graduate students will also orally summarize the additional readings several times over the semester. (Additional handout)
* NOTE FOR ALL REQUIREMENTS/ASSIGNMENTS: ALL ASSIGNMENTS ARE TO
BE TURNED IN AT THE START OF CLASS ON THE DATE THEY ARE DUE. LATE
ASSIGNMENTS WILL RESULT IN A LOSS OF 10% PER DAY EXCEPT FOR
EXTREME EMERGENCIES. I RESERVE THE RIGHT TO DEEM WHAT IS AN
EMERGENCY.
Confidentiality
It is expected that you will protect the rights of confidentiality afforded to students with disabilities and their families inside and outside of this class. Often in class we will discuss individual students, assessment situations, and outcomes. Your experience with students with disabilities and their families is helpful in adding to the content and understanding of issues in applied settings. However, when discussing an individual student you are working with or have worked with in the past, please do not reveal any information that would allow other people to identify the student (e.g., name, school, age, parent's name, etc.).
Accommodations
Students requiring special accommodations to meet the course expectations should bring this to the attention of the instructor during the first week of the semester or immediately after the identification of a new disability. Written documentation from the Center for Disability Services (162 Union Building) concerning the disability must be provided before accommodations can be made. Phone: (801) 581-5020. Web: http://disability.utah.edu
Evaluation Procedures

Undergraduates:
Correct Pronunciation of Letter Sounds: 20 pts.
Goals & Objectives: 50 pts.
Decoding Assessment & NWF: 30 pts.
Reading CBM: 50 pts.
D.I. Teaching Demonstration: 25 pts.
Reading Strategy Demonstration: 25 pts.
Total: 200 points

Graduates:
Correct Pronunciation of Letter Sounds: 20 pts.
Goals & Objectives: 50 pts.
Decoding Assessment & NWF: 30 pts.
Reading CBM: 50 pts.
D.I. Teaching Demonstration: 25 pts.
Reading Strategy Demonstration: 25 pts.
Oral & Written Summary of Additional Readings: 100 pts.
Total: 300 points
Grading Scale:
A = 95-100%
A- = 90-94%
B+ = 87-89%
B = 83-86%
B- = 80-82%
C+ = 77-79%
C = 73-76%
C- = 70-72%
D+ = 67-69%
D = 63-66%
D- = 60-62%
F = 59% and below

Undergraduates:
200 – 190 (A)
189 – 180 (A-)
179 – 174 (B+)
173 – 166 (B)
165 – 160 (B-)
159 – 154 (C+)
153 – 146 (C)
145 – 140 (C-)
139 – 134 (D+)
133 – 126 (D)
125 – 120 (D-)
119 or below (F)

Graduates:
300 – 285 (A)
284 – 270 (A-)
269 – 261 (B+)
260 – 249 (B)
248 – 240 (B-)
239 – 231 (C+)
230 – 219 (C)
218 – 210 (C-)
209 – 201 (D+)
200 – 189 (D)
188 – 180 (D-)
179 or below (F)
Tentative Course Schedule
This syllabus and schedule are subject to change in response to student learning and/or in the event of extenuating circumstances.
Note: DI-R = Direct Instruction Reading; H & N = Howell & Nolet; READINGS = Additional Readings (see attached list)

Week 1 (Aug. 29): Course Overview; Early Reading Development; Reading Skills
Reading/Assignment Due: DI-R 1, 2, 3, 4; READINGS [additional readings handout]

Week 2 (Sept. 5): Dynamic Indicators of Basic Early Literacy Skills (DIBELS)
Reading/Assignment Due: READINGS (Group A & B)

Week 3 (Sept. 12): Curriculum-Based Measurement (CBM) & Decoding Assessment
Reading/Assignment Due: DI-R 5; READINGS (Group A) [reading CBM & decoding assessment, NWF handouts]

Week 4 (Sept. 19): Phonemic Awareness & Letter Sounds: Overview & Theory; Start Reading Tutoring
Reading/Assignment Due: DI-R 6; READINGS (Group B) [goals & objectives handout; correct letter sound handout]

Week 5 (Sept. 26): Phonemic Awareness & Letter Sounds: Techniques & Instructional Strategies
Reading/Assignment Due: DI-R 7; READINGS (Group A); CBM Survey Level Assessment & Graph with Goal Line, Decoding Assessment & NWF [Reading Strategy handout]

Week 6 (Oct. 3): Decoding: Overview & Theory
Reading/Assignment Due: DI-R 8, 14, 15; READINGS (Group B); Goal/Objective #1 [D.I. demo handout]

Week 7 (Oct. 10): Decoding: Techniques and Instructional Strategies
Reading/Assignment Due: DI-R 16; H & N 9; READINGS (Group A)

Week 8 (Oct. 17): Word Reading: Overview, Theory, Techniques, and Instructional Strategies
Reading/Assignment Due: DI-R 9, 10, 17; READINGS (Group B); Goal/Objective #2

Week 9 (Oct. 24): Fluency: Overview & Theory
Reading/Assignment Due: DI-R 18; READINGS (Group A); CBM Graph

Week 10 (Oct. 31): Fluency: Techniques and Instructional Strategies
Reading/Assignment Due: DI-R 27; READINGS (Group B)

Week 11 (Nov. 7): Vocabulary: Overview & Theory; Go to Curriculum Lab
Reading/Assignment Due: DI-R 11, 19, 20; READINGS (Group A); Goal/Objective #3

Week 12 (Nov. 14): Vocabulary: Techniques and Instructional Strategies
Reading/Assignment Due: READINGS (Group B)

Week 13 (Nov. 21): NO CLASS
Assignment Due: Goal/Objective #4 (Email to Instructor)

Week 14 (Nov. 28): Comprehension: Overview & Theory; Last week for reading tutoring and CBM
Reading/Assignment Due: DI-R 21, 22, 23, 24; H & N 8; READINGS (Group A)

Week 15 (Dec. 5): Comprehension: Techniques and Instructional Strategies
Reading/Assignment Due: READINGS (Group B); CBM Graph; Goal/Objective #5
Additional Required Readings
Week 1 Course Overview; Early Reading Development; Reading Skills
Snow, C., Burns, M.S., & Griffin, P. (1998). Predictors of success and failure in reading. In Preventing Reading Difficulties in Young Children (pp. 100-133). Washington, DC: National Academy Press.
Wren, S. (2002). Ten myths of reading instruction. Southwest Educational Development Laboratory, 14(3), 3-8.
Week 2 Dynamic Indicators of Basic Early Literacy Skills (DIBELS) (Group A & B)
Good, R. H., Gruba, J., & Kaminski, R. A. (2002). Best practices in using dynamic indicators of basic early literacy skills (DIBELS) in an outcomes-driven model. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology IV (pp. 679-700). Bethesda, MD: National Association of School Psychologists.
Week 3 Curriculum-Based Measurement (CBM) & Decoding Assessment (Group A)
Hasbrouck, J. E., & Tindal, G. (1992). Curriculum-based oral reading fluency norms for students in grades 2 through 5. Teaching Exceptional Children, 23, 41-44.
Hosp, M. K. & Hosp, J. (2003). Curriculum-based measurement for reading, math, and spelling: How to do it and why. Preventing School Failure, 48 (1), 10-17.
Week 4 Phonemic Awareness & Letter Sounds: Overview & Theory (Group B)
Adams, M. J., Foorman, B. R., Lundberg, I., & Beeler, T. (1998). The elusive phoneme: Why phonemic awareness is so important and how to help children develop it. American Educator, 22, 18-29.
Scarborough, H. S., & Brady, S. A. (2002). Toward a common terminology for talking about speech and reading: A glossary of the "phon" words and some related terms. Journal of Literacy Research, 34, 299-336.
National Reading Panel (2000). Part I: Phonemic awareness instruction: Executive Summary .
Bethesda, MD: NICHD. http://www.nichd.nih.gov/publications/nrp/report.htm
Week 5 Phonemic Awareness & Letter Sounds: Techniques & Instructional Strategies (Group A)
Snider, V. E. (1995). A primer on phonemic awareness: What it is, why it's important, and how to teach it. School Psychology Review, 24, 443-455.
Troia, G. A., Roth, F. P., & Graham, S. (1998). An educator's guide to phonological awareness: Assessment measures and intervention activities for children. Focus on Exceptional Children, 31(3), 2-12.
Week 6 Decoding: Overview & Theory (Group B: 2 of the 3)
Moats, L. C. (1998). Teaching Decoding. American Educator, 22, 42-49.
Stahl, S.A., Duffy-Hester, A.M., & Stahl, K.A. (1998). Everything you wanted to know about phonics
(but were afraid to ask). Reading Research Quarterly, 33, 338-355.
National Reading Panel (2000). Part II: Phonics instruction: Executive summary . Bethesda, MD:
NICHD. http://www.nichd.nih.gov/publications/nrp/report.htm
Week 7 Decoding: Techniques and Instructional Strategies (Group A)
33
Lovett, M. W., Lacerenza, L., Borden, S., Frijters, J. C., Steinbach, K. A., & De Palma, M. (2000). Components of effective remediation for developmental reading disability: Combining phonological and strategy-based instruction to improve outcomes. Journal of Educational Psychology, 92, 263-283.
Mathes, P. G., Howard, J. K., Allen, S., & Fuchs, D. (1998). Peer-assisted learning strategies for first-grade readers: Making early reading instruction more responsive to the needs of diverse learners. Reading Research Quarterly, 33, 62-95.
Week 8 Word Reading: Overview, Theory, Techniques, and Instructional Strategies (Group B)
Ehri, L. C., & McCormick, S. (1998). Phases of word learning: Implications for instruction with delayed and disabled readers. Reading & Writing Quarterly: Overcoming Learning Difficulties, 14, 135-163.
Goswami, U. (1998). The role of analogies in the development of word recognition. In J. L. Metsala & L. C. Ehri (Eds.), Word recognition in beginning literacy (pp. 41-64). Mahwah, NJ: Erlbaum.
Week 9 Fluency: Overview & Theory (Group A)
Fuchs, L.S., Fuchs, D., Hosp, M. K., & Jenkins, J. R. (2001). Oral reading fluency as an indicator of reading competence: A theoretical, empirical, and historical analysis. Scientific Studies of Reading, 5, 239-256.
Strecker, S. K., Roser, N. L., & Martinez, M. G. (1998). Toward understanding oral reading fluency.
National Reading Conference Yearbook, 47, 295-310.
National Reading Panel (2000). Part III: Fluency: Executive Summary . Bethesda, MD: NICHD. http://www.nichd.nih.gov/publications/nrp/report.htm
Week 10 Fluency: Techniques and Instructional Strategies (Group B)
Mastropieri, M. A., Leinart, A., & Scruggs, T. E. (1999). Strategies to increase reading fluency.
Intervention in School and Clinic, 34, 278-283.
Honig, B., Diamond, L., & Gutlohn, L. (2000). Reading fluency. In CORE Teaching Reading Sourcebook for kindergarten through eighth grade (pp. 11.2-11.30). Novato, CA: Arena.
Week 11 Vocabulary: Overview & Theory (Group A)
Baker, S. K., Simmons, D. C., & Kameenui, E. J. (1998). Vocabulary acquisition: Research bases. In D. C. Simmons & E. J. Kameenui (Eds.), What reading research tells us about children with diverse learning needs (pp. 183-217). Mahwah, NJ: Erlbaum.
Blachowicz, C. L., & Fisher, P. (2000). Vocabulary instruction. In M. L. Kamil, P. B. Mosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook of reading research: Vol. III (pp. 503-524). Mahwah, NJ: Erlbaum.
Week 12 Vocabulary: Techniques and Instructional Strategies (Group B)
Pressley, M., & Lysynchuk, L. (1995). Vocabulary. In M. Pressley & V. Woloshyn (Eds.), Cognitive strategy instruction that really improves children's academic performance (pp. 101-115). Cambridge, MA: Brookline.
McKeown, M. G., & Beck, I. L. (2004). Direct and rich vocabulary instruction. In J. F. Baumann & E. J. Kame'enui (Eds.), Vocabulary Instruction: Research to Practice (pp. 13-27). New York: Guilford Press.
National Reading Panel (2000). Part I: Vocabulary Instruction: Executive Summary . Bethesda, MD:
NICHD. http://www.nichd.nih.gov/publications/nrp/report.htm
Week 14 Comprehension: Overview & Theory (Group A)
34
Perfetti, C. A., Marron, M. A., & Foltz, P. W. (1996). Sources of comprehension failure: Theoretical perspectives and case studies. In C. Cornoldi & J. Oakhill (Eds.), Reading comprehension difficulties: Processes and interventions (pp. 137-165). Mahwah, NJ: Erlbaum.
Pressley, M. (2000). What should comprehension instruction be the instruction of? In M. L. Kamil, P. B.
Mosenthal, P. D. Pearson, & R. Barr, (Eds.), Handbook of reading research: Vol. III (pp. 545-
562). Mahwah, NJ: Erlbaum.
National Reading Panel (2000). Part II: Text Comprehension Instruction: Executive summary .
Bethesda, MD: NICHD. http://www.nichd.nih.gov/publications/nrppubskey.cfm
Week 15 Comprehension: Techniques and Instructional Strategies (Group B)
Fuchs, D., Fuchs, L. S., & Burish, P. (2000). Peer-assisted learning strategies: An evidence-based practice to promote reading achievement. Learning Disabilities Research & Practice, 15(2), 85-91.
Pressley, M., Symons, S., McGoldrick, J. A., & Snyder, B. L. (1995). Reading comprehension strategies. In M. Pressley & V. Woloshyn (Eds.), Cognitive strategy instruction that really improves children's academic performance (pp. 57-100). Cambridge, MA: Brookline Books.
RECOMMENDED READINGS (Not Required)
Foundational Issues: Historical Overview and Definitions
Lyon, G. R. (1995). Toward a definition of Dyslexia. Annals of Dyslexia, 45, 3-27.
Spear-Swerling, L., & Sternberg, R. (1996). Off track: When poor readers become "learning disabled" (pp. 29-76). Boulder, CO: Westview Press.
Stanovich, K. E. (1988). The right and wrong places to look for the cognitive locus of reading disability. Annals of Dyslexia, 38, 154-177.
Phonemic Awareness & Letter Sounds
Blachman, B. A. (2000). Phonological awareness. In M. L. Kamil, P. B. Mosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook of reading research: Vol. III (pp. 483-502). Mahwah, NJ: Erlbaum.
Decoding
Beck, I., & Juel, C. (1995). The role of decoding in learning to read. American Educator, 19(2), 8-42.
Word Reading
Ehri, L. C. (1997). Sight word learning in normal readers and dyslexics. In B. Blachman (Ed.), Foundations of reading acquisition and dyslexia: Implications for early intervention (pp. 163-190). Mahwah, NJ: Erlbaum.
Fluency
Nathan, R. G., & Stanovich, K. E. (1991). The causes and consequences of differences in reading fluency. Theory into Practice, 30, 176-184.
Wolf, M., Miller, L., & Donnelly, K. (2000). Retrieval, automaticity, vocabulary elaboration, orthography (RAVE-O): A comprehensive, fluency-based reading intervention program.
Journal of Learning Disabilities, 33, 375-386.
35
Vocabulary
Graves, M. F., & Watts-Taffe, S. M. (2002). The place of word consciousness in a research-based vocabulary program. In A. E. Farstrup & S. J. Samuels (Eds.), What research has to say about reading instruction (pp. 140-165). Newark, DE: International Reading Association.
Comprehension
Stanovich, K. E., West, R. F., Cunningham, A. E., Cipielewski, J., & Siddiqui, S. (1996). The role of inadequate print exposure as a determinant of reading comprehension problems. In C. Cornoldi & J. Oakhill (Eds.), Reading comprehension difficulties: Processes and interventions (pp. 15-32). Mahwah, NJ: Erlbaum.
Motivation
Anderson, R. C., Wilson, P. T., & Fielding, L. G. (1988). Growth in reading and how children spend their time outside of school. Reading Research Quarterly, 23, 285-303.
Cunningham, A. E., & Stanovich, K. E. (1998). What reading does for the mind. American
Educator, 22, 8-15.
Guthrie, J. T., & Wigfield, A. (2000). Engagement and motivation in reading. In M. L. Kamil, P. B. Mosenthal, P. D. Pearson, & R. Barr (Eds.), Handbook of reading research: Vol. III (pp. 403-424). Mahwah, NJ: Erlbaum.
Literacy in Special Education 4370/7370/8470
Dr. Erica Lembke

Required Texts
Carnine, D.W., Silbert, J., Kame'enui, E.J., Tarver, S.G., & Jungjohann, K. (2006). Teaching struggling and at-risk readers: A direct instruction approach. Upper Saddle River, NJ: Pearson Prentice Hall.
Honig, B., Diamond, L., & Gutlohn, L. (2000). Teaching reading sourcebook for kindergarten through eighth grade. Novato and Emeryville, CA: Consortium on Reading Excellence (CORE) and Arena Press.

Class Notes: available each week prior to class on the course Blackboard site at https://courses.missouri.edu/
Course Description
The purpose of this course is to provide teachers with knowledge regarding current research and issues specific to educating students with disabilities in the area of reading. Topics will include historical and contemporary perspectives on reading instruction and assessment, and implementation of evidence-based practice to improve phonological awareness, decoding, word recognition, fluency, comprehension, and vocabulary. Written language will also be addressed as it pertains to reading instruction. At the conclusion of this course, students will be able to:
1. Describe, through oral and written discussion, historical and contemporary perspectives on reading instruction for students with special needs (MOSTEP 1.2.1, 1.2.2, 1.2.3).
2. Describe, through oral and written discussion, research-based knowledge surrounding instructional methodologies for students with specific deficits relating to reading (MOSTEP 1.2.1, 1.2.2, 1.2.3, 1.2.4).
3. Identify characteristics of children at-risk for or with deficits in reading (MOSTEP 1.2.3).
4. Identify and administer appropriate assessment procedures to make instructional programming decisions for students with deficits in reading (MOSTEP 1.2.8).
5. Identify and implement instructional strategies to improve the phonological awareness, decoding, word recognition, and vocabulary skills of students who are at-risk for or have reading disabilities (MOSTEP 1.2.3, 1.2.5).
**Descriptions of the MOSTEP (Missouri Standards for Teacher Education Programs) standards are given in the back of the syllabus.
Course Requirements (Undergraduate and Graduate)

In-Class Quizzes (90 points)
Quizzes will be 15 points each and will consist of multiple choice, true/false, matching, fill-in-the-blank, and short answer items. Your lowest quiz score for the semester will be dropped.
Curriculum-Based Measurement Project (75 points)
1. Demonstrate the use of a data-based approach (Curriculum-Based Measurement) to teach reading.
   a. Select a student that you can teach on a regular basis (at least 2 to 3 days per week) for at least 15 minutes per day.
   b. The basic skill area that you will be teaching to your student is reading.
   c. Specify a measurable goal that is potentially attainable within 8 weeks.
   d. Design procedures to measure performance on the skill that can be administered each instructional session.
   e. Use the measurement procedures to contrast the student's performance with that of his or her age/grade peers on that skill.
   f. Write an Instructional Plan (IP) for teaching the student the skill you have identified.
   g. Implement your IP with the student (i.e., teach the student) for a minimum of 7 weeks.
   h. Measure the student's performance on the skill every session (2 to 3 times per week) and graph the performance data.
   i. After 3 to 5 data points, modify the IP in some clear and substantial respect that you hypothesize could be more effective.
   j. Plot the trend of the data before and after the intervention.
   k. Implement the revised plan and continue to measure and graph performance.
   l. Continue the cycle in steps i and j.
2. Evaluation
   a. Projects will be due on May 8th and will be scored on the basis of their quality and completeness with respect to the guidelines distributed in class.
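The trend comparison in steps h through j can be sketched in code. This is an illustrative sketch only, not course-provided material: the session numbers and words-read-correctly scores are hypothetical, and the trend is estimated with an ordinary least-squares slope, which is one common way to summarize CBM progress data.

```python
# Sketch of comparing CBM data trends before and after an instructional
# change. All data below are hypothetical; slope = average score gain per
# session, estimated by ordinary least squares.

def trend_slope(sessions, scores):
    """Least-squares slope of scores over session numbers."""
    n = len(sessions)
    mean_x = sum(sessions) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(sessions, scores))
    den = sum((x - mean_x) ** 2 for x in sessions)
    return num / den

# Hypothetical words-read-correctly scores; the IP was changed at session 6.
sessions = list(range(1, 11))
scores = [22, 24, 23, 26, 25, 27, 30, 32, 33, 36]

before = trend_slope(sessions[:5], scores[:5])
after = trend_slope(sessions[5:], scores[5:])
print(f"Trend before change: {before:.2f} words/session")
print(f"Trend after change:  {after:.2f} words/session")
```

A steeper post-change slope would suggest the revised plan is working; a flat or falling one would prompt another modification in the next cycle.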
Project Overview (5 points)
You will provide a brief overview of your project and your progress so far, midway through the semester. Details about what needs to be included in this overview will be distributed in class. This overview is due March 6th.
Project Presentation (10 points)
You will present an overview of your project to the class on May 1st. Details about what should be included in this presentation will be given out in class.
Reading Lesson Plan (25 points), due March 20
One reading lesson will be created and administered to a student (or students) with reading difficulties.
Lesson plan guidelines will be provided in class. The lesson plan will be graded on a mastery basis (may be corrected and turned in for re-grading one time).
Class Attendance and Participation
Participation in this course should strengthen your abilities to collaborate with your peers and become a contributing member of a dynamic learning community. Although not assigned a particular number of points, your attendance and participation in class are essential. Collaboration with your peers outside of class is strongly encouraged. All students are expected to read the assigned material before each class period. Students will frequently be asked to discuss textbook material in small groups; therefore adequate preparation is in the best interest of both the individual and his or her peers. It is recognized that there are legitimate reasons for being absent; however, it is the responsibility of the student to discuss the reason for any absence with the instructor. Except in extreme emergency, students should contact the instructor prior to an anticipated absence. Excessive absences can result in loss of grade points.
Additional Course Requirement (Graduate Students Only)

Research Article Summary and Discussion (20 points)
Each student will: (a) select a research-based course reading that is relevant to the scheduled topic being discussed on a particular day, (b) provide a one-page summary of the reading to classmates on the date the reading is due, and (c) lead a 10-15 minute class discussion on that reading, including a brief activity or questions for discussion. Students will sign up for articles during the first few weeks of class.
Extra Credit Opportunity: Participate in Student CEC Activities (4 points)
You have the opportunity to earn up to 4 extra credit points by participating in two Student Council for Exceptional Children (SCEC) activities (2 points each). These might include attending SCEC meetings, attending the state special education conference (March 11-13), or volunteering to help with SCEC activities. You must have Dr. Lembke sign a paper indicating that you attended, and turn it in to her to receive credit.
Evaluation Criteria—Undergraduates

Requirement        Individual Points    Total Possible Points
Quizzes            6 X 15               90
Project            1 X 75               75
Overview           1 X 10               10
Presentation       1 X 10               10
Reading Lesson     1 X 25               25

Total Course Points --------------------------------------------------------------------- 210

Evaluation Criteria—Graduates

Requirement        Individual Points    Total Possible Points
Quizzes            6 X 15               90
Project            1 X 75               75
Overview           1 X 10               10
Presentation       1 X 10               10
Reading Lesson     1 X 25               25
Article Summary    1 X 20               20

Total Course Points --------------------------------------------------------------------- 230
Conversion to University Letter Grades:

Letter Grade    Percentage of Total Points
A               93% to 100%
A-              90% to 92%
B+              87% to 89%
B               83% to 86%
B-              80% to 82%
C+              77% to 79%
C               73% to 76%
C-              70% to 72%
D+              67% to 69%
D               63% to 66%
D-              60% to 62%
According to University policy, plus and minus grades are only given to undergraduate students. Grades will be rounded up if the decimal is between .5 and .9 and rounded down if the decimal is between .1 and .4.
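As a worked illustration, this conversion and rounding rule could be expressed as follows. This is a sketch, not official grading code; the point totals in the example come from the undergraduate table above.

```python
import math

# Sketch of the syllabus's percentage-to-letter conversion (undergraduate
# plus/minus scale). Rounding follows the stated rule: decimals of .5-.9
# round up, .1-.4 round down.

SCALE = [  # (minimum rounded percentage, letter grade)
    (93, "A"), (90, "A-"), (87, "B+"), (83, "B"), (80, "B-"),
    (77, "C+"), (73, "C"), (70, "C-"), (67, "D+"), (63, "D"), (60, "D-"),
]

def letter_grade(points_earned, points_possible):
    percent = 100.0 * points_earned / points_possible
    rounded = math.floor(percent + 0.5)  # .5-.9 rounds up, .1-.4 down
    for minimum, letter in SCALE:
        if rounded >= minimum:
            return letter
    return "F"

# Example: 195 of 210 points is about 92.86%, which rounds to 93% -> "A".
print(letter_grade(195, 210))
```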
General Requirements
1. Attendance is required for all scheduled class meetings and students are responsible for information covered in assigned readings, handouts, discussions, and activities. Attendance is stressed because students will have opportunities to (a) improve their knowledge base through discussions of critical topics and issues, (b) practice skills needed to engage in professional dialogue/exchange with colleagues, (c) practice skills required to present information to others, (d) acquire information from lectures and presentations, (e) participate in activities, and (f) submit required assignments. Except in emergencies, please contact the instructor prior to class if you will be absent.
For potential absences related to weather, use your best judgment. Class at MU is rarely cancelled because of weather, so it is your responsibility to get the notes and assignments that you miss, should you choose to not attend class due to weather.
2. Like the instructor, students are expected to come to class meetings THOROUGHLY
PREPARED. "Thoroughly prepared" is defined as having read the readings sufficiently to verbally and in writing (a) discuss definitions, concepts, issues, and procedures and (b) relate this information to content presented in previous classes or readings. It also implies that students have reviewed information from previous readings and class meetings. It will be the students' responsibility to prepare questions when information from readings or class meetings is unclear.
3. All assignments must be submitted AT OR BEFORE THE ASSIGNED DUE DATE. Unexcused assignments submitted after the due date may result in no points or a reduction in points.
4. ALL WRITTEN ASSIGNMENTS must be prepared in a PROFESSIONAL manner. Products which, in the judgment of the instructor, are unreadable or unprofessionally prepared will be returned ungraded or assigned a lower evaluation.
5. DO YOUR OWN WORK. To plagiarize is "to steal and pass off as one's own the ideas or words of another" (Webster, 1967, p. 646), or to not acknowledge the author of an idea. If plagiarism is evident, the student will receive a "0" or "NP" on that activity AND may be given a "NP" grade for the course AND may be suspended or expelled from the university. Professors are required to report any suspected cases of academic dishonesty to the Provost’s office.
6. Academic honesty is fundamental to the activities and principles of any university. All members of the academic community must be confident that each person's work has been responsibly and honorably acquired, developed, and presented. Any effort to gain an advantage not given to all students is dishonest whether or not the effort is successful. The academic community regards academic dishonesty as an extremely serious matter, with serious consequences that range from probation to expulsion. When in doubt about plagiarism, paraphrasing, quoting, or collaboration, consult the course instructor.
7. If you need accommodations because of a disability, if you have emergency medical information to share with me, or if you need special arrangements in case the building must be evacuated, please inform me immediately. Please see me privately after class, or at my office.
To request academic accommodations (for example, a notetaker), students must also register with
Disability Services, AO38 Brady Commons, 882-4696. It is the campus office responsible for reviewing documentation provided by students requesting academic accommodations, and for accommodations planning in cooperation with students and instructors, as needed and consistent with course requirements.
TOPICAL OUTLINE AND WEEKLY ASSIGNMENTS
Week 1 (1/16)
Topics: Structure of the course and introduction to course content; Characteristics of struggling readers; Scientifically-based reading research—national reports
Reading: DI—Chapters 1-4; Teaching Reading Sourcebook—Chapters 1-2

Week 2 (1/23)
Topics: Historical perspectives; Models of reading instruction; Overview of beginning reading instruction; Research on beginning reading instruction
Reading: CBM tutorial; Sourcebook—Chapters 3 and 4
Quiz 1
Week 3 (1/30)
Topics: Curriculum-based Measurement as Progress Monitoring in reading and writing (procedures; computer-based applications; decision-making); Early reading assessment—DIBELS

Week 4 (2/6)
Topics: Classroom reading instruction; Phonemic awareness and alphabetic understanding; Delivery of instruction (Basic Instructional Plan; Lesson Plan Components)
Reading: DI—Chapters 5-6; Put Reading First: Phonemic Awareness; Sourcebook—Chapters 5, 6, and 7
Quiz 2

Week 5 (2/13)
Topics: Phonemic awareness and alphabetic understanding, cont.

Week 6 (2/20)
Topics: Letter-sound correspondence; Phonics
Reading: DI—Chapters 7-8; Sourcebook—Chapter 8; Put Reading First: Phonics
Begin CBM Project—Collect Baseline Data
Week 7 (2/27)
Topics: Word reading—beginning and primary stages; High frequency and multisyllabic words
Reading: DI—Chapters 9-10; Sourcebook—Chapters 9-10
Quiz 3

Week 8 (3/6)
Topics: Fluency; Repeated Reading; Peer-assisted Learning Strategies
Reading: DI—Chapters 11-13; Sourcebook—Chapter 11; PALS article
DUE: Project Overview

Week 9 (3/13)
Topics: Spelling and Writing
Reading: Sourcebook—Chapters 12-13
Quiz 4

Week 10 (3/20)
Topics: Vocabulary Instruction
Reading: DI—Chapters 14-16; Sourcebook—Chapters 14-15
DUE: Reading Lesson
Week 11 (3/27)
No Class—Spring Break (University Holiday)

Week 12 (4/3)
Topics: Beginning comprehension; Strategic reading; Narrative text
Reading: DI—Chapters 17-18; Sourcebook—Chapters 16-17
Quiz 5

Week 13 (4/10)
Topics: Primary level comprehension; Expository text; Content Area Reading
Reading: DI—Chapter 19; Sourcebook—Chapter 18

Week 14 (4/17)
Topics: Response to Intervention in Reading; Classroom instruction
Reading: Selected articles
Quiz 6

Week 15 (4/24)
Topics: Overview, Reflections, Course Evaluations, Catch-up
Reading: DI—Chapters 21-23; Sourcebook—Chapters 19-20
Quiz 7

Week 16 (5/1)
Project presentations
DUE: Presentations (10 points)

Week 17 (5/8)
Finals Week
DUE: CBM Projects

MOSTEP Standards
Quality Indicator 1.2.1: The pre-service teacher understands the central concepts, tools of inquiry and structures of the discipline(s) within the context of a global society and creates learning experiences that make these aspects of subject matter meaningful for students.
1.2.1.1 knows the discipline applicable to the certification area(s);
1.2.1.2 presents the subject matter in multiple ways;
1.2.1.3 uses students' prior knowledge;
1.2.1.4 engages students in the methods of inquiry used in the discipline; creates interdisciplinary learning.
Quality Indicator 1.2.2: The pre-service teacher understands how students learn and develop, and provides learning opportunities that support the intellectual, social, and personal development of all students.
1.2.2.1 knows and identifies child/adolescent development;
1.2.2.2 strengthens prior knowledge with new ideas;
1.2.2.3 encourages student responsibility;
1.2.2.4 knows theories of learning.
Quality Indicator 1.2.3: The pre-service teacher understands how students differ in their approaches to learning and creates instructional opportunities that are adapted to diverse learners.
1.2.3.1 identifies prior experience, learning styles, strengths, and needs;
1.2.3.2 designs and implements individualized instruction based on prior experience, learning styles, strengths, and needs;
1.2.3.3 knows when and how to access specialized services to meet students' needs;
1.2.3.4 connects instruction to students' prior experiences and family, culture, and community.
Quality Indicator 1.2.4: The pre-service teacher recognizes the importance of long-range planning and curriculum development and develops, implements, and evaluates curriculum based upon student, district, and state performance standards.
1.2.4.1 selects and creates learning experiences that are appropriate for curriculum goals, relevant to learners, and based upon principles of effective instruction (e.g., encourages exploration and problem solving, building new skills from those previously acquired);
1.2.4.2 creates lessons and activities that recognize individual needs of diverse learners and variations in learning styles and performance;
1.2.4.3 evaluates plans relative to long and short-term goals and adjusts them to meet student needs and to enhance learning.
Quality Indicator 1.2.5: The pre-service teacher uses a variety of instructional strategies to encourage students' development and critical thinking, problem solving, and performance skills.
1.2.5.1 selects alternative teaching strategies, materials, and technology to achieve multiple instructional purposes and to meet student needs;
1.2.5.2 engages students in active learning that promotes the development of critical thinking, problem solving, and performance capabilities.
Quality Indicator 1.2.6: The pre-service teacher uses an understanding of individual and group motivation and behavior to create a learning environment that encourages positive social interaction, active engagement in learning, and self-motivation.
1.2.6.1 knows motivation theories and behavior management strategies and techniques;
1.2.6.2 manages time, space, transitions, and activities effectively;
1.2.6.3 engages students in decision making.
Quality Indicator 1.2.7: The pre-service teacher models effective verbal, nonverbal, and media communication techniques to foster active inquiry, collaboration, and supportive interaction in the classroom.
1.2.7.1 models effective verbal/non-verbal communication skills;
1.2.7.2 demonstrates sensitivity to cultural, gender, intellectual, and physical ability differences in classroom communication and in responses to students' communications;
1.2.7.3 supports and expands learner expression in speaking, writing, listening, and other media;
1.2.7.4 uses a variety of media communication tools.
Quality Indicator 1.2.8: The pre-service teacher understands and uses formal and informal assessment strategies to evaluate and ensure the continuous intellectual, social, and physical development of the learner.
1.2.8.1 employs a variety of formal and informal assessment techniques (e.g., observation, portfolios of student work, teacher-made tests, performance tasks, projects, student self-assessments, authentic assessments, and standardized tests) to enhance and monitor her or his knowledge of learning, to evaluate student progress and performances, and to modify instructional approaches and learning strategies;
1.2.8.2 uses assessment strategies to involve learners in self-assessment activities, to help them become aware of their learning behaviors, strengths, needs and progress, and to encourage them to set personal goals for learning;
1.2.8.3 evaluates the effect of class activities on both individual and the class as a whole, collecting information through observation of classroom interactions, questioning, and analysis of student work;
1.2.8.4 maintains useful records of student work and performances and can communicate student progress knowledgeably and responsibly, based on appropriate indicators, to students, parents, and other colleagues.
Quality Indicator 1.2.9: The pre-service teacher is a reflective practitioner who continually assesses the effects of choices and actions on others. This reflective practitioner actively seeks out opportunities to grow professionally and utilizes the assessment and professional growth to generate more learning for more students.
1.2.9.1 applies a variety of self-assessment and problem-solving strategies for reflecting on practice, their influences on students' growth and learning, and the complex interactions between them;
1.2.9.2 uses resources available for professional development;
1.2.9.3 practices professional ethical standards.
Quality Indicator 1.2.10: The pre-service teacher fosters relationships with school colleagues, parents, and educational partners in the larger community to support student learning and well-being.
1.2.10.1 participates in collegial activities designed to make the entire school a productive learning environment;
1.2.10.2 talks with and listens to students, is sensitive and responsive to signs of distress, and seeks appropriate help as needed to solve students' problems;
1.2.10.3 seeks opportunities to develop relationships with the parents and guardians of students, and seeks to develop cooperative partnerships in support of student learning and well-being;
1.2.10.4 identifies and uses the appropriate school personnel and community resources to help students reach their full potential.
4320/7320
Fall, 2005
Erica Lembke, Ph.D.
Required Text
McLoughlin, J. A., & Lewis, R. B. (2005). Assessing students with special needs (6th ed.). Upper Saddle River, NJ: Pearson-Merrill Prentice Hall.
Text Student Website: www.prenhall.com/mcloughlin
Class Notes https://courses.missouri.edu/
Course Description
This course will focus on mastery of basic assessment knowledge, including psychometric principles as well as diagnostic and prescriptive teaching from curriculum-based measurement (CBM) data. Students will learn principles central to the administration, scoring, and interpretation of norm-referenced and curriculum-based assessments. Further, students will learn the process and appropriate sources for conducting reviews of standardized, norm-referenced instruments and will make data-based programming decisions. The course will also include a review of legislation and litigation pertaining to the evaluation of exceptional students, with a focus on implications for the school setting. Federal and state process guidelines and criteria for conducting multidisciplinary evaluations and diagnosing students with disabilities will be stressed, along with the professional ethics of conducting comprehensive educational evaluations. Students will become familiar with assessment tools across domains and will demonstrate the ability to design evaluation plans appropriate to diverse individuals.
Course Requirements
Class Attendance and Participation (60 points)
Participation in this course should strengthen your abilities to collaborate with your peers and become a contributing member of a dynamic learning community. Your attendance and participation in class are essential. Collaboration with your peers outside of class is strongly encouraged. All students are expected to read the assigned material before each class period.
Students will frequently be asked to discuss textbook material in small groups; therefore, adequate preparation is in the best interest of both the individual and his or her peers. It is recognized that there are legitimate reasons for being absent; however, it is the responsibility of the student to discuss the reason for any absence with the instructor. Except in extreme emergency, students should contact the instructor prior to an anticipated absence. Students who are chronically late to class, leave early, or do not attend class will be deducted 5 points for every infraction.
In-Class Examinations (150 points)
DUE: 10/17 and 12/5
Two exams (a mid-term and a final) will be administered to assess knowledge of content of readings and class discussion. Exams will consist of short answer, multiple-choice, and essay items. Each exam is worth 75 points. The final exam will not be comprehensive but will cover material since the prior exam.
Standardized Test Administration (100 points)
DUE: 10/3 and 12/12 (by 5pm)
Students will administer two standardized, norm-referenced tests. One will be administered to a classmate, with the process observed by the instructor. The second will be administered to a child.
The student will submit the scored test protocol and a brief summary of results.
Progress Monitoring Project (100 points)
DUE: 11/14
Students will complete a six-week Curriculum-Based Measurement (CBM) project in either reading or math. The CBM project will contain: a) baseline data, b) long-range goal line, c) short-term objective, d) at least one change in instruction, e) a graph of the data, and f) an instructional plan.
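One way to picture parts (a) and (b), the baseline and the long-range goal line, is sketched below. This is illustrative only: the baseline scores and the growth rate of 1.5 words per week are hypothetical values chosen for the example, not figures specified by the course.

```python
# Hypothetical sketch of building a CBM long-range goal line from baseline
# data: start at the baseline median and rise by an assumed weekly growth
# rate until the end of the project.

def goal_line(baseline_scores, weeks, growth_per_week):
    """Return (baseline median, end-of-project goal, weekly aim points)."""
    ordered = sorted(baseline_scores)
    start = ordered[len(ordered) // 2]  # median of an odd-length baseline
    goal = start + growth_per_week * weeks
    aims = [start + growth_per_week * w for w in range(weeks + 1)]
    return start, goal, aims

# Three baseline probes, a six-week project, and an assumed growth rate.
start, goal, aims = goal_line([38, 41, 40], weeks=6, growth_per_week=1.5)
print(f"Baseline median: {start}; 6-week goal: {goal}")
```

Weekly scores falling below the aim points would signal the instructional change required in part (d).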
Case Studies, In-Class Assignments, Etc. (80 points total; 10 points each)
DUE (completed in class): 8/29, 9/12, 9/19, 9/26, 10/10, 10/24, 10/31, 11/14
There will be a total of 8 case studies and in-class assignments related to course topics that you will complete in class with a small group. Groups will be assigned by the instructor.
Please come prepared to work on the case study or assignment by reading any material in advance.
When completing a case study, each group will discuss the case and answer given questions.
Your group will work towards reaching a consensus on each of the answers to the case study questions. If, after complete discussion, the group is unable to reach consensus, the dissenting member may write up a separate answer. It may seem easy enough to reach consensus because this is a simulation and you may not be fully invested in the process; however, practicing how to resolve professional differences of opinion will be very beneficial, because such differences will inevitably occur in your future professional experiences.
You will be graded on the completeness of your answers on the given assignment, including discussion of all key points within case studies and citation of information from the texts and lecture. You can receive up to 10 points for the case study content.
**Case Studies and In-Class Assignments cannot be made up in the event of an absence. Too much discussion and feedback occurs within the group to replicate this independently.
Additional Requirement for Students Enrolled in 7320

Standardized Test Critique (50 points)
DUE: 10/31
Students will review and critique one norm-referenced assessment instrument. Students must use professional sources for this critique and current APA style in writing and format of the paper.
Your text may be used as one source; however, at least two additional sources are required.
Suggested resources:
Diagnostique or Assessment for Effective Intervention, Council for Exceptional Children (refereed journal for CEDS)
Buros’ Mental Measurement Yearbook
Tests in Print
Test Manuals – (See test assigned to you.)
Other refereed, professional sources may be appropriate if approved by the instructor. Critiques should be referenced using the most current APA guide for style and submitted using word processing. Use third person and past tense. Papers will be evaluated for adherence to the format that follows. Total pages not to exceed 4 with cover page.
Cover Page, according to current APA style
I. Cite test using current APA style (author, date, name of test, publisher).
II. Give contents and price of instrument.
III. Discuss group for whom the instrument was intended.
IV. Provide summary of administration (procedures, time, who can administer, training for examiners, general format and responses required, accommodations allowed).
V. Discuss type of scores available and how scores are interpreted.
VI. Discuss standardization (sampling procedures, sample characteristics).
VII. Discuss reliability (form(s), summary of correlation coefficients, standard error of measurement).
VIII. Discuss validity (form(s) and summary of correlation coefficients).
IX. Provide summary and conclusions:
   1. Major Questions and Problems
   2. Distinguishing Characteristics
   3. Undesirable Features
   4. Overall Evaluation. Include appropriateness for individuals with disabilities.
X. References, in current APA format. (Does not need to be on a separate page.)
Evaluation Criteria

Requirement                          Individual Points    Total Possible Points
Attendance/Participation             12 X 5               60
Exams                                2 X 75               150
Standardized Test Administration     2 X 50               100
Progress Monitoring Project          1 X 100              100
Case Studies/In-class assignments    8 X 10               80

Total Course Points (4320) ------------------------------------------------------------- 490
Total Course Points (7320) ------------------------------------------------------------- 540
Course Points (490) + Test Critique Points (50)

Conversion to University letter grades:

Letter Grade    Percentage of Total Points
A               93% to 100%
A-              90% to 92%
B+              87% to 89%
B               83% to 86%
B-              80% to 82%
C+              77% to 79%
C               73% to 76%
C-              70% to 72%
D+              67% to 69%
D               63% to 66%
D-              60% to 62%

**Only undergraduate students receive plus/minus grades
General Requirements
1. Attendance is required for all scheduled class meetings and students are responsible for information covered in assigned readings, handouts, discussions, and activities. Attendance is stressed because students will have opportunities to (a) improve their knowledge base through discussions of critical topics and issues, (b) practice skills needed to engage in professional dialogue/exchange with colleagues, (c) practice skills required to present information to others, (d) acquire information from lectures and presentations, (e) participate in activities, and (f) submit required assignments. Except in emergencies, please contact the instructor prior to class if you will be absent.
2. Like the instructor, students are expected to come to class meetings THOROUGHLY
PREPARED. “Thoroughly prepared” is defined as having read the readings sufficiently to verbally and in writing (a) discuss definitions, concepts, issues, and procedures and (b) relate this information to content presented in previous classes or readings. It also implies that students have reviewed information from previous readings and class meetings. It will be the students’ responsibility to prepare questions when information from readings or class meetings is unclear.
3. All assignments must be submitted AT OR BEFORE THE ASSIGNED DUE DATE. Unexcused assignments submitted after the due date may be returned ungraded or may be assigned a lower evaluation.
4. ALL WRITTEN ASSIGNMENTS must be prepared in a PROFESSIONAL manner. Products which, in the judgment of the instructor, are unreadable or unprofessionally prepared will be returned ungraded or assigned a lower evaluation.
5. DO YOUR OWN WORK. To plagiarize is “to steal and pass off as one’s own the ideas or words of another” (Webster, 1967, p. 646), or to not acknowledge the author of an idea. If plagiarism is evident, the student will receive a “0” or “NP” on that activity AND may be given a “NP” grade for the course AND may be suspended or expelled from the university.
6. All evaluation of products will be done as objectively as possible. In the case of qualitative assessment, evaluation will be based on instructor judgment. Final course grades will be based on the number of points accrued on submission of required products: 100-90% = A, 89-80% = B, 79-70% = C,
69-60% = D, and <60 = F.
7. The assignment of INCOMPLETE OR “I” GRADES is discouraged and will be used only in cases of extreme emergencies where satisfactory progress has been demonstrated and a passing grade may be earned. However, should an “I” grade be required, students should notify the instructor at the time such circumstances exist. Upon notification, a course completion contract between the student and instructor will be developed before the last week of the term.
8. Academic honesty is fundamental to the activities and principles of any university. All members of the academic community must be confident that each person’s work has been responsibly and honorably acquired, developed, and presented. Any effort to gain an advantage not given to all students is dishonest whether or not the effort is successful. The academic community regards academic dishonesty as an extremely serious matter, with serious consequences that range from probation to expulsion. When in doubt about plagiarism, paraphrasing, quoting, or collaboration, consult the course instructor.
9.
If you need accommodations because of a disability, if you have emergency medical information to share with me, or if you need special arrangements in case the building must be evacuated, please inform me immediately. Please see me privately after class, or at my office.
To request academic accommodations (for example, a notetaker), students must also register with
Disability Services, AO38 Brady Commons, 882-4696. It is the campus office responsible for reviewing documentation provided by students requesting academic accommodations, and for accommodations planning in cooperation with students and instructors, as needed and consistent with course requirements.
For other MU resources for students with disabilities, click on "Disability Resources" on the MU homepage.
TOPICAL OUTLINE AND WEEKLY ASSIGNMENTS

Week 1 (8/22)
Topics: Structure of the Course; Discuss course content; Overview of Syllabus; Overview of Special Education Assessment (History, Purposes, Legal Issues)
Reading: Chapter 1

Week 2 (8/29)
Topics: Measurement Concepts; Guest Speaker: Dr. Craig Frisby, Associate Professor in School Psychology
Reading: Chapter 3
Case Study/In-class assignment 1: Mr. Chang

Week 3 (9/5)
No Class—Labor Day (University Holiday)

Week 4 (9/12)
Topics: The Assessment Process, including Involving Parents and Families; Teacher/Student Assistance Teams; Bias in Assessment
Reading: Chapters 2 and 15
Case Study/In-class assignment 2: Juan

Week 5 (9/19)
Topics: Informal Assessments and Curriculum-Based Assessment Techniques; Curriculum-Based Measurement
Reading: Chapters 5-6
Case Study/In-class assignment 3: Rudy Bega

Week 6 (9/26)
Topics: Standardized Assessment; Using a Response to Intervention Model for LD Identification
Reading: Chapter 4; Fuchs (2003); Mellard (2004); Vaughn (2003)
Case Study/In-class assignment 4: Gina
Begin CBM Project: Collect baseline data
Week 7 (10/3)
Topics: IQ (Learning Aptitude) and Achievement (School Performance) Tests; Administer standardized test to a peer in class
Reading: Chapters 7-8
DUE: Administer standardized test in class

Week 8 (10/10)
Topics: Classroom Behavior; Study Guide for Midterm—distributed and discussed
Reading: Chapter 10
Case Study/In-class assignment 5: Jack

Week 9 (10/17)
Midterm Exam (Chapters 1-8 and 15, class notes, articles)
CBM Project: Implement intervention

Week 10 (10/24)
Topics: Assessing Reading
Reading: Chapter 11
Case Study/In-class assignment 6: Teague

Week 11 (10/31)
Topics: Assessing Math; Presentations on Standardized Test Critiques (7320 students only)
Reading: Chapter 12
Case Study/In-class assignment 7: David
DUE: Standardized test critique

Week 12 (11/7)
Topics: Assessing Written and Oral Language
**Change class time??
Reading: Chapters 13 and 14

Week 13 (11/14)
Topics: Missouri State Eligibility Criteria; Writing and Sharing Assessment Reports; Connecting Assessment to Instruction; Trends and Issues
Case Study/In-class assignment 8
DUE: CBM Project

Week 14 (11/21)
No class—Thanksgiving Break (University Holiday)

Week 15 (11/28)
Topics: Assessment for Transition Education and Planning; Adaptive Behavior; Connecting Assessment and ITP; Study Guide for Final—distributed and discussed
Reading: Chapter 17

Week 16 (12/5)
Final Exam (Chapters 10-14, 17)

Week 17 (12/12)
NO CLASS
DUE: Standardized Test (by 5pm)
Professional Standards, Knowledge and Skill Objectives, Instructional Methods, and Methods of Evaluation
MISSOURI STANDARDS FOR TEACHER EDUCATION PROGRAMS (MOSTEP)
In addition to the Conceptual Framework of the PEU , professional education candidates will be expected to demonstrate performances consistent with the Missouri Department of Elementary and
Secondary Education standards for professional educators or MoSTEP . Selected standards are referenced as follows along with corresponding performance assessments. Note that numerous other MoSTEP competencies are reflected in the discipline specific standards that follow.
1.2.2 The preservice teacher understands how students learn and develop, and provides learning opportunities that support the intellectual, social, and personal development of all students.
1.2.3 The preservice teacher understands how students differ in their approaches to learning and creates instructional opportunities that are adapted to diverse learners.
Exams; In-class activities; Single-test Assessment
Reports; Evaluation Report.
Recommendation section of Single-Test
Assessment Reports and Evaluation Report.
1.2.7 The preservice teacher models effective verbal, nonverbal, and media communication techniques to foster active inquiry, collaboration, and supportive interaction in the classroom.
1.2.8 The preservice teacher understands and uses formal and informal assessment strategies to evaluate and ensure the continuous intellectual, social, and physical development of the learner.
Administration of assessments; preparation of written reports; & in-class practice assessment.
Assessment administration & scoring; written reports from norm-referenced assessments and
CBAs; Written reports; Evaluation report
DISCIPLINE SPECIFIC PROFESSIONAL STANDARDS
The standards and objectives that follow represent the knowledge and skills needed by professionals in the area of educational assessment and are based on the policy statement from the Common
Core of Knowledge and Skills Essential for All Beginning Special Education Teachers (CEC, 1998), the
Professional Education Unit Conceptual Framework (e.g., objectives marked CF 1 in part address competency #1 listed in the PEU Conceptual Framework ), and MoSTEP Standards (e.g., objectives marked MO 1.2.2 in part address this MoSTEP competency). The MoSTEP discipline-specific standards for preservice teachers of students with mild-moderate disabilities are the same as CEC knowledge and skill standards (indicated below respectively as K and S).
Course Standards, Objectives, and Methods of Evaluation
Upon successful completion of this course, the student will have demonstrated the following knowledge and skills:
I. Philosophical, Historical, and Legal Foundations of Special Education—Core

Knowledge:

I CC:K2: Variations in beliefs, traditions, and values across cultures within society and the effect of the relationship among child, family, and schooling. (CF 9; MO 1.2.3) Instructional Methods: Readings, discussion, collaborative in-class activities. Evaluation: Exams; Evaluation Report that reflects cultural diversity and role of family in process.
I CC:K3: Issues in definition and identification procedures for individuals with exceptional learning needs, including individuals from culturally and/or linguistically diverse backgrounds. (CF 2, 3, & 9; MO 1.2.2 & 1.2.3) Instructional Methods: Readings, discussion, collaborative in-class activities. Evaluation: Exams; Diagnostic interpretations within individual and composite reports.

I CC:K4: Assurances and due process rights related to assessment, eligibility, and placement. (CF 7 & 8; MO 1.2.4 & 1.2.8) Instructional Methods: Readings and discussion. Evaluation: Exams.

I CC:K5: Rights and responsibilities of parents, students, teachers, and other professionals, and schools as they relate to individual learning needs. (CF 8 & 10; MO 1.1.2.7) Instructional Methods: Readings and discussion. Evaluation: Exams.

II. Characteristics of Learners—none included.

III. Assessment, Diagnosis, and Evaluation—Core

Knowledge:

III CC:K1: Basic terminology used in assessment. (CF 2 & 7; MO 1.2.8) Instructional Methods: Readings and discussion. Evaluation: Exams.

III CC:K2: Ethical concerns related to assessment. (CF 7 & 8; MO 1.2.7 & 1.2.8) Instructional Methods: Readings and discussion. Evaluation: Exams.

III CC:K3: Legal provisions, regulations, and guidelines regarding assessment of individuals. (CF 7 & 8; MO 1.2.7 & 1.2.8) Instructional Methods: Readings and discussion. Evaluation: Exams.

III CC:K5: Appropriate application and interpretation of scores, including grade score versus standard score, percentile ranks, age/grade equivalents, and standings. (CF 7; MO 1.2.8) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Exams; Test Summaries; Assessment Portfolio.

III CC:K6: State appropriate use and limitations of each type of assessment instrument. (CF 7; MO 1.2.8) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Test Critique; Exams.

III CC:K7: Incorporate strategies that consider the influences of diversity on assessment, eligibility, programming, and placement of individuals with exceptional learning needs. (CF 7 & 8; MO 1.2.3 & 1.2.8) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Evaluation Reports; Exams.

III CC:K8: The relationship between assessment and placement decisions. (CF 3, 6, & 7; MO 1.2.4 & 1.2.8) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Evaluation Reports; Exams.

III CC:K9: Methods for monitoring progress of individuals with exceptional learning needs. (CF 6 & 7; MO 1.2.8) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Process File; Exams; CBM Development & Administration.
Skills:

III CC:S5: Interpret information from formal and informal assessment instruments and procedures. (CF 7; MO 1.2.8) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Exams; Testing Summaries; Assessment Reports.

III CC:S7: Use performance data and information from teachers, other professionals, individuals with exceptionalities, and parents to make or suggest appropriate modifications in learning environments. (CF 6, 10; MO 1.2.7) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Exams; Testing Summaries; CBA Interpretation; Assessment Reports.

III CC:S8: Develop individualized assessment strategies for instruction. (CF 7 & 9; MO 1.2.3 & 1.2.8) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Exams; CBM Development & Administration.

IV. Instructional Content and Practice – none included.

V. Planning and Managing the Teaching and Learning Environment

Knowledge:

V CC:K3: Ways in which technology can assist with planning and managing the teaching and learning environment. (CF 5; MO 1.2.7 & 1.2.8) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Exams; Evaluation Plans; Assessment Reports.

VI. Managing Student Behavior and Social Interaction Skills

Knowledge:

V CC:K10: Identify assessment instruments appropriate to assessment of student behavior, including social and adaptive skills. (CF 7; MO 1.2.8) Instructional Methods: Discussion and readings; in-class/library review of instruments. Evaluation: Exams; Evaluation Reports.

V CC:K11: Demonstrate knowledge of appropriate procedures for conducting a functional behavioral analysis. (CF 7; MO 1.2.8) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Exams.

V CC:K12: Demonstrate knowledge of various behavioral recording techniques. (CF 7; MO 1.2.8) Instructional Methods: Discussion and readings. Evaluation: Exams; Assessment Reports.

Skills:

V CC:S9: Design comprehensive evaluations appropriate to development of present level of performance in social, behavioral, and adaptive domains. (CF 7; MO 1.2.8) Instructional Methods: Readings and discussion; in-class/library review of test instruments. Evaluation: Assessment Reports.

VII. Communication and Collaborative Partnerships – Core

Knowledge:
VI CC:K1: Identify factors that promote effective communication and collaboration between instructor and student, and with parents, educators, and community members, in a culturally responsive program. (CF 10; MO 1.2.7 & 1.2.10) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: Exams.

VI CC:K5: Identify ethical practices for confidential communication to others about individuals with exceptional learning needs. (CF 8 & 10; MO 1.2.7 & 1.2.10) Instructional Methods: Discussion and readings. Evaluation: Exams; Assessment Reports.

VIII. Professionalism and Ethical Practices – Core

Knowledge:

VII CC:K4: Identify consumer and professional organizations, publications, and journals relevant to educational evaluation. (CF 8; MO 1.2.9) Instructional Methods: Readings and discussion. Evaluation: Test Critique.

VII CC:K5: Recognize the importance of the teacher serving as a model for individuals with exceptional learning needs. (CF 6, 8, & 10; MO 1.2.7) Instructional Methods: Readings and discussion; collaborative in-class activities. Evaluation: All Assignments.

Skills:

VII CC:S5: Demonstrate proficiency in oral and written communication. (CF 2 & 6; MO 1.2.7) Instructional Methods: Feedback on assignments; models of appropriate format and style. Evaluation: Exams; Test Critique; Assessment Reports; Conclusion Statements.

VII CC:S7: Comply with local, state, provincial, and federal monitoring and evaluation requirements. (CF 6; MO 1.2.1) Instructional Methods: Readings and discussion; in-class activities. Evaluation: Test Critique.

VII CC:S8: Use copyrighted educational materials in an ethical manner. (CF 8; MO 1.2.4) Instructional Methods: Instructor provision of models and class discussion. Evaluation: Referencing in assignments and exams.

VII CC:S9: Practice within the CEC Code of Ethics and other standards and policies of the profession. (CF 8; MO 1.2.4) Instructional Methods: Discussion and readings, including the CEC Code. Evaluation: All Assignments.
Technology Objectives
1.
Explain the transdisciplinary nature of assistive technology applications and discuss the value of including a variety of disciplines in the service delivery process.
2.
Demonstrate proficiency in using a computer system to aid or enhance personal productivity.
3.
Demonstrate skill in using productivity tools for professional and personal use, including word processing, database, spreadsheet, graphic utilities, and drawing programs.
4.
Demonstrate knowledge of uses of computers for problem solving, data collection, information management, decision making, communications, and development of presentations.
Lembke Course Activity- Steps for Slope Using the Tukey Method
1) Count the data points.
2) Draw two vertical lines to divide the data points approximately in thirds. There should be approximately an equal number of data points on each side of each vertical line.
3) Look at the left side. FIRST, find the median week. So if you have 5 points, your median would be 3. If you have 6 points, your median would be between 3 and 4. Draw a vertical dashed line down through this point that represents the median week.
4) NEXT, number the points from lowest to highest, bottom to top. So the lowest point would be 1, the next lowest would be 2, and so on. You can also write in the values of the points at this time, if this is helpful. Find the median value, either based on how you numbered them or by looking at the values. Draw a horizontal dashed line through this point, extending it over to touch the vertical dashed line you drew earlier. Where these two lines meet, draw an X.
5) Repeat steps 3 and 4 for the right side of your data.
6) Connect your two X's with a line. This is your line of slope. Compare this line to your goal line to make decisions.
To calculate the numerical value of the slope, use this formula:

slope = (median point of the 3rd section - median point of the 1st section) / (number of data points - 1)

For the graphed data below, the median point in the third section of the data is 68 and the median point in the first section is 67, so the numerator is 68 - 67 = 1. With 8 data points, the denominator is 8 - 1 = 7, giving 1/7 = .14 words per week of growth.
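The simplified slope formula above can also be sketched in a few lines of Python. This is a minimal sketch, not part of the original handout: it assumes sections of roughly one third of the data points, and the score values are illustrative.

```python
from statistics import median

def tukey_slope(scores):
    """Estimate weekly growth from a list of weekly CBM scores using the
    simplified formula above: (median of the 3rd section - median of the
    1st section) / (number of data points - 1)."""
    n = len(scores)
    size = max(1, round(n / 3))            # points in the first and third sections
    first_median = median(scores[:size])   # median value of the first section
    third_median = median(scores[-size:])  # median value of the third section
    return (third_median - first_median) / (n - 1)

# Example matching the worked numbers in the text:
# 8 points whose section medians are 67 and 68, so slope = 1/7, about .14.
scores = [66, 67, 70, 65, 72, 66, 68, 69]
print(round(tukey_slope(scores), 2))  # 0.14
```

As in the hand-drawn version, the resulting slope would then be compared against the goal line to decide whether to change instruction.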
[Figure: line graph titled "weekly data for Jim." The x-axis (Date) runs from 1/1/2007 through 1/14/2007; the y-axis runs from 0 to 80. The graph shows the weekly scores with a trend line and a goal line, and numbered labels (1, 2, 3) marking the sections used in the Tukey slope calculation.]
Lembke Course Activity- Graphing Data Using Excel
1. Enter your dates in one row and the student's corresponding scores in the row below.
2. Highlight all data by clicking and dragging your mouse across the data.
3. Click on Insert, Chart, and Line.
4. Click Next.
5. Click on Series, and then in the Name box, put "weekly data for Jim" or something else similar.
6. Click Next.
7. In the Chart title box, name your chart.
8. In the x-axis box, put "date."
9. In the y-axis box, put "number of words read correctly in one minute" OR "number of correct choices in 3 minutes," or whatever your measure was.
10. Click Next.
11. Click Finish.
12. You should see your graph pasted within your worksheet.
13. Insert a line for your goal.
14. Draw a vertical dashed line following baseline by inserting a line using AutoShapes. Right-click on the line and click Format AutoShape to make it dashed. Do the same thing between phases 1 and 2 of your data.
15. Click on the line that connects your baseline data to your phase 1 data so that the two points that start and finish that line segment are highlighted. Right-click, click Format Data Point, and then under Line, click None. Click OK. You shouldn't be able to see that line segment now. Do the same thing for the line segment that connects phases 1 and 2 of your data.
16. Finally, put your trend line on the data. Use the steps listed in the document "Steps to Calculating a Trend Line Using the Tukey Method."
17. Label the phases of your instruction (Baseline, Initial Instruction, Instructional Change).
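A comparable progress graph can also be produced programmatically instead of in Excel. The sketch below uses Python's matplotlib library (assumed installed; it is not part of the original activity), with illustrative dates, scores, goal values, and output file name.

```python
import matplotlib
matplotlib.use("Agg")  # render to a file; no display needed
import matplotlib.pyplot as plt

# Illustrative weekly scores (words read correctly in one minute)
dates  = ["1/1", "1/8", "1/15", "1/22", "1/29", "2/5", "2/12", "2/19"]
scores = [66, 67, 70, 65, 72, 66, 68, 69]
goal   = [66 + 0.5 * i for i in range(len(scores))]  # goal line from baseline

fig, ax = plt.subplots()
ax.plot(dates, scores, marker="o", label="weekly data for Jim")
ax.plot(dates, goal, linestyle=":", label="goal line")
ax.axvline(x=2, linestyle="--", color="gray")  # dashed phase line after baseline
ax.set_xlabel("date")
ax.set_ylabel("number of words read correctly in one minute")
ax.set_title("weekly data for Jim")
ax.legend()
fig.savefig("jim_progress.png")
```

The dashed vertical line plays the same role as the AutoShapes line in step 14: it separates the baseline phase from the instructional phases.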
Lembke Course Activity- Instructional Plan
Sch:    Academic Area:    Gender: M or F    Allocated Time:

Plan columns: Instructional Objectives | Instructional Activities | Ratio | Time | Materials | Motivational Strategies
Lembke Course- Final Project Presentation Key Elements
Name of Presenter:
In 10 minutes…
1) Briefly describe the student you worked with, including grade and level of service.
2) State the student's baseline performance. State the peer's level of performance, and indicate the discrepancy between the student's performance and the peer's performance (if known).
3) State the short term objective. Why did you choose this?
4) State the long range goal.
5) Present your reasons for selecting the basic skill you are teaching (i.e., why did you choose oral reading, or maze, or letter sounds).
6) Describe the materials you used (which reading skill and what level) to measure student progress and how you measured the student's progress (how often, when).
7) Pass out a copy of your student's graph to each class member. Describe your initial instruction and how the student was performing during this initial period following baseline.
8) Describe your instructional change and how it affected your student's performance. What area was your change in (materials, instruction, S/T ratio, time, etc.)?
9) Was your student's long range goal achieved?
10) Make recommendations for future programming for this student. Instructional interventions?
Total Score:
Lembke Course Activity - Final Project: Summarizing the Outcomes of Data-Based
Interventions
I. Student Background Information (8 points):
Give student name (FIRST NAME ONLY), grade, age, level of service, and standardized test information available.
II. Individualized Education Plan (15 points):
A. Present Student Level of Performance and Peer Level of Performance (5 points):
State the student’s baseline performance. State the peer’s level of performance, and indicate the discrepancy between the student’s performance and the peer’s performance.
B. Long Range Goal (5 points): State a long range goal. (CBM format)
C. Short Term Objective (5 points): State the short term objective (CBM format)
III. Rationale for Selection of Goal (5 points):
Present your reasons for selecting the long range goal that you selected.
IV. Detailed Description of Measurement Procedures (6 points):
Describe in detail the materials you used to measure student progress, and how you measured the student’s progress.
V. Completed Instructional Plan (17 points):
a. Develop and attach the Instructional Plan. Use appropriate interventions from the Direct Instruction book, the CORE sourcebook, class, or any other empirically validated source. Clearly indicate on the Instructional Plan one change in instruction with a horizontal line. (8 points)
b. Description of Instructional Plan: In narrative form, briefly describe the activities on the Instructional Plan and your rationale for choosing these activities (based on the information about the student). Describe the change that you made and give a rationale for this change. (9 points)
VI. Performance Graph (8 points):
Clearly graph the student's progress. Indicate the peer's level of performance on the graph. If you are using the computer program, print out your graph and indicate your initial instruction and instructional change in writing. Separate baseline, initial instruction, and instructional change with vertical lines and labels. Include the student's long range goal and trend lines for both initial instruction and the instructional change.
Make sure to label each phase with a short description and label the LRG and trend lines.
VII. Evaluation of the Program (16 points):
A. Effectiveness of Interventions (8 points): Evaluate the effectiveness of the instructional plan and your instructional change relative to the student’s goal line. Discuss whether the student was or was not on track to meet their goal. Comment first on the effectiveness of your initial instruction, and then comment on the effectiveness of the instructional change. Discuss your trend line for each phase as it compares to your goal line.
B. Recommendations (8 points): Make recommendations for future programming for this student. Include level of placement, type of instructional interventions, etc. Give a rationale for your recommendations.
The report will be graded on the basis of the accuracy of the graphing, the interpretation of the data, the appropriateness of the Instructional Plan, and the completeness and quality of the text.
Attend to clarity and conciseness; the report should be readily understandable to all interested and concerned parties, including parents.
Stephen N. Elliott, PhD
Spring Semester 2007
Dunn Family Professor of Assessment
Course Description
This course provides an examination of measurement concepts, technical issues, and common practices associated with the use of tests to assess the academic and social functioning of K-12 students. To this end, the following topics will be the focus of readings, lectures, and course assignments:
Key measurement concepts required to interpret and use test results wisely;
Purposes and common procedures for conducting a comprehensive assessment of students;
Accurate administration and interpretation of a set of tests and assessments used to measure academic and social/behavior constructs essential to classification of students with disabilities;
Current technical and methodological issues in the assessment of students’ progress and achievement.
Course Goals
Competence in using and interpreting a set of core assessment tools commonly used in special education eligibility determination, intervention design, progress monitoring, and evaluation of intervention outcomes. Course content will be applied through the administration of assessments with children and adolescents. Students are expected to recruit and get written consent for children to participate in the assessment activities.
Key objectives for successful participants in Sp Ed 3820 include:
1. Recognize the factors that influence the overrepresentation of culturally/linguistically diverse students in programs for individuals with mild and moderate disabilities.
2. Understand that cultural, ethnic, gender, and linguistic differences may be confused with or misinterpreted as manifestations of a disability and take actions to guard against inappropriate assessment and over and under identification of students for special education services.
3. Use ongoing assessment and student progress monitoring to write IEPs and account for student outcomes.
4. Use a variety of assessment procedures to document students’ learning, behavior, and growth within multiple environments. Plan and conduct assessments to develop individual learning plans.
5. Ensure that students with disabilities participate in school system and statewide assessments and document on the IEP the use of accommodations, special considerations or alternate assessments when appropriate.
6. Be aware of and guard against over- and under-identification of disabilities based on cultural, ethnic, gender, and linguistic diversity. Use assessment strategies that guard against misinterpreting these differences as disabilities.
7. Know how to administer, score, interpret, and report on formal and informal assessments, including standardized, functional, criterion-referenced, and curriculum-based tests.
8. Plan and conduct informal and formal assessments typically used to make eligibility and placement decisions.
9. Know how to use ongoing assessment and student progress monitoring to make instructional decisions and adaptations and modifications in instruction.
VANDERBILT’S HONOR CODE GOVERNS ALL WORK IN THIS COURSE
Basically, the Honor Code calls for students to conduct themselves ethically: do their own work and credit others appropriately when ideas or materials are used. For each assignment in this course, I will explain how the Honor Code applies. If you have any doubts, please ask me - not another student - for clarification. Uncertainty about the application of the Honor Code does not excuse a violation.
Students with Disabilities
I am committed to making educational opportunities available to all students. If you have a learning or physical disability, please approach me as soon as the semester starts (preferably on the first day of class). I will maintain the confidentiality of any discussion of your learning needs. It also will be helpful if you provide me with a letter from the Opportunity Development
Center (2-4705) explaining your specific needs so that I am aware of them early on and can make the appropriate accommodations.
Required Texts & Readings
Salvia, J., & Ysseldyke, J. Assessment in special and inclusive education (10th ed.). Boston, MA: Houghton Mifflin. (SY&B: Required)

American Educational Research Association et al. (1999). Standards for educational and psychological testing. Washington, DC: AERA, APA, & NCME. (TS: Required)

Articles and chapters from other sources available on OAK.
Course Assignments
Quizzes
o Quiz 1 will cover basic measurement and test score interpretation methods and issues.
o Quiz 2 will cover testing practices and issues relevant to the core set of measures covered.
o Quiz 3 will cover the Testing Standards.
Administration & Interpretation of Assessments
o Administer the Woodcock to an undergraduate classmate and provide a videotape and completed Examiner's Checklist of the administration.
o Administer the Woodcock to a student, score it, and provide a detailed write-up concerning each subscale and the overall conclusions from the test.
o Administer at least 5 CBM progress monitoring probes in reading or math and prepare graphs reflecting at least one student's progress. Completed probes and the graph of results will be turned in to be evaluated for accuracy in scoring and recording results on the graph.
o Select individuals to complete one set (either teacher-student or teacher-parent) of rating scales (either the SSRS, ACES, or Vineland). Turn in the completed protocols and a short description (1 page) of your conclusions from the set of rating scales.
Review of an Assessment and Independent Validity Research
o Select and review a test or assessment (other than those in the core set used in this class). The written review should: (a) provide a thorough description of the instrument, (b) review at least 3 research articles that provide evidence for the validity of the test or assessment results, and (c) include a concluding section concerning the use of the assessment in your future research or practice.
Class Participation
o Come to class prepared to engage in lecture discussions and to encourage the learning of other students.
Student Evaluation

Assignment | Possible Points | Due Date
Quiz 1: Measurement & Test Interpretation Basics | 100 | Feb. 13
Quiz 2: Assessment Practices and Issues | 100 | April 3
Quiz 3: Testing Standards | 100 | May 1
Woodcock Achievement Test Practice Administration | 50 | Feb. 27
Woodcock Achievement Test administration and interpretation write-up | 100 | March 13
CBM administration and write-up with graph | 150 | April 12
Rating Scale multi-source administration and interpretation write-up | 150 | April 12
Test review with analysis of supporting validity research | 100 | April 24
Class attendance and participation | 150 | weekly
Total points | 1000 |
A+ = 98-100%; A = 93-97%; A- = 90-92%
B+ = 87-89%; B = 83-86%; B- = 80-82%
C+ = 77-79%; C = 73-76%; C- = 70-72%
January 16: Course Introduction: Purpose, Goals, and Key Concepts
Chapter 1 in SY&B
Deno, S.L. (2005). Problem-Solving Assessment. In R. Brown-Chidsey (Ed.),
Assessment for intervention: A problem-solving approach (pp. 10-40). New York:
Guilford Press.
January 23: Assessment of Children: Fundamental Practices, Guiding Theories, &
Professional Standards
Chapters 2 & 3 in SY&B
Chapters Introduction, 8, & 11 in TS
Fundamental Measurement & Statistical Concepts
January 30: Essential Statistical Concepts and Score Scales
Chapters 4, 5, & 6 in SY&B
Chapter 4 in TS
February 6: Reliability and Validity of Test Scores
Chapters 7 & 8 in SY&B
Chapters 1 & 2 in TS
February 13: Evaluating Tests & Assessment Systems
Chapter 15 in SY&B
Chapters 6 & 7 in TS
Methods for
Assessing Children’s Academic and Social Behavior
February 16-17: Assessment of Achievement / Woodcock Achievement Test Workshop
Chapter 21 and pages 308-313 in SY&B
Woodcock Manual materials disseminated by Professor
February 20: Assessment of Language
Chapter 24 in SY&B
Peabody Picture Vocabulary Test Manual materials disseminated by Professor
February 27: Assessment of Reading
Chapter 22 in SY&B
Fuchs, L.S., & Fuchs, D. (2004). Using curriculum-based measurement for progress monitoring.
March 6: Spring Break / No Class!

March 13: Assessment of Mathematics

Chapter 23 in SY&B

Fuchs, L.S., & Fuchs, D. (2005). Using curriculum-based measurement for progress monitoring in math.
March 20: Assessment of Behavior: Social, Emotional, and Adaptive
Chapters 11, 26, 27 in SY&B
SSRS Manual to be handed out in class
March 27: No Formal Class Meeting / Administer Assessments
April 2: Testing Standards Discussion Session (1:00 to 3:00 optional)
April 3: Using Classroom Performances and Work Samples to Assess Students
Chapter 12 & 14 in SY&B
Challenging Assessment Issues
April 12: Large-Scale Achievement Testing for Students with Disabilities
Chapters 9 & 31 in SY&B
Bolt, S.E., & Thurlow, M. (2004). Five of the most frequently allowed testing accommodations in state policy: Synthesis of the research. Remedial and Special Education, 25, 141-152.

Elliott, S.N., & Roach, R.T. (in press). Alternate assessments of students with significant disabilities: Alternative approaches, common technical challenges. Applied Measurement in Education.
April 17: Assessing Response to Intervention
Chapter 30 in SY&B
Fuchs, D. & Fuchs, L. S. (2005) Responsiveness-to-Intervention: A Blueprint for
Practitioners, Policymakers, and Parents. Teaching Exceptional Children, 38, 57-61.
April 24: Assessing Students with Limited English Proficiency
Chapter 10 in SY&B
Abedi, J., Hofstetter, C.H., & Lord, C. (2004). Assessment accommodations for English language learners: Implications for policy-based empirical research. Review of Educational Research, 74, 1-28.
22 Key Concepts
Intelligent and ethical use of tests and assessments for children requires substantial knowledge of measurement, statistics, normative development, and federal and state education and disability regulations, as well as skill in test administration, scoring practices, and communication. This course is only one of perhaps 3 or 4 courses needed to become competent at evaluating children's learning and behavior. In addition to acquiring this fundamental knowledge, changes in the education and test development fields require individuals choosing an assessment-oriented career to commit to the pursuit of professional development opportunities and life-long learning. This learning will be enabled and enhanced if you have command of 22 fundamental concepts that will be emphasized in this course. The concepts are:
Assessment
Testing
Standardization
NRT
CRT
Scaled Score
Variability
Dispersion
Error
Variance
Standard Deviation
Standard Error of Measurement
Confidence Interval
Bias
Reliability
Validity
Correlation coefficient
Baseline data
Stability
Ability
Achievement
Level of Inference
IDEA website http://idea.ed.gov/
Topical paper on Early Intervening Services (see p. 81 below)
Q & A on RTI and Early Intervening Services (see p. 86 below)
National Center on Student Progress Monitoring www.studentprogress.org
Summer Institute materials
o Reading, advanced reading, math, written expression, RTI, AYP, data-based decision-making, statewide implementation
Library: Articles and Research, Center presentations, FAQ's, Newsletters, Links
o Examples of articles from the website are included in this packet.
Families: Information on SPM and CBM for families.
o An example of a family-oriented article is included in this packet.
Tools chart: The Center’s Technical Review Committee has conducted several annual reviews of tools. Those that met the TRC’s rigorous criteria for technical adequacy are included in the chart along with detailed information about each tool. See: http://www.studentprogress.org/chart/chart.asp
Research Institute on Student Progress Monitoring http://www.progressmonitoring.net/
CBM probes
Study Group and Leadership Team online modules for student progress monitoring
Literature database
Regional Resource and Federal Center Network (RRFC) http://www.rrfcnetwork.org/component/option,com_frontpage/Itemid,1/
Search for topics (e.g., the search terms below) and find resources from a variety of centers
o "student progress monitoring"
o "curriculum-based measurement"
o "response to intervention"
New England Comprehensive Center http://www.necomprehensivecenter.org/
Archived webinar on RTI ( http://www.necomprehensivecenter.org/events/RTIWebinar1 )
Links to resources for RTI and assessment
National Research Center on Learning Disabilities www.nrcld.org
School-based RTI practices
RTI in learning disabilities determination
  o Resource kit, manuals, conference and symposium materials
Model site research
IRIS Center for Training Enhancement http://iris.peabody.vanderbilt.edu/
Online learning modules (click on Star Legacy Modules, go to Differentiated Instruction)
  o Classroom Assessments (parts 1-2) address SPM and CBM
  o RTI (parts 1-4)
Case studies, activities, and information briefs for use in higher education classrooms
The Access Center http://www.k8accesscenter.org/index.php
Strategies for enhancing access to the general education curriculum for students with disabilities
Briefs and professional development modules
State profiles and a database of resources
Center on Instruction http://www.centeroninstruction.org/
Resources for scientifically-based instruction in Reading, Math, Science, Special Education, and for ELL students.
Intervention Central http://www.interventioncentral.org/
RTI-Wire, a list of online RTI resources
CBM resources, including online sources for probes
IDEA Partnership Grant http://www.ideapartnership.org/
RTI
  o Dialogue Guides ( http://www.ideapartnership.org/page.cfm?pageid=28 )
  o Web-based resources ( http://www.ideapartnership.org/report.cfm?reportid=237 )
  o Workgroup ( http://ideapartnership.org/page.cfm?pageid=8 )
  o Library ( http://www.ideapartnership.org/report.cfm?reportid=247 )
CBM
  o Web-based resources ( http://www.ideapartnership.org/rkr2.cfm?rkrpageid=12 )
  o Library ( http://www.ideapartnership.org/report.cfm?reportid=73 )
Lynn S. Fuchs and Douglas Fuchs
Available on www.studentprogress.org. Click on Library, then Articles and Research.

Abstract.
When teachers use systematic progress monitoring to track their students' progress in reading, mathematics, or spelling, they are better able to identify students in need of additional or different forms of instruction, they design stronger instructional programs, and their students achieve better. This document first describes progress monitoring procedures for which experimental evidence demonstrates these effects. Then, an overview of the research is presented.
Introduction.
Progress monitoring is a practice in which teachers assess students' academic performance on a regular basis (weekly or monthly) for two purposes: to determine whether children are profiting appropriately from the typical instructional program and to build more effective programs for the children who benefit inadequately from typical instruction.
This document describes research on progress monitoring in the areas of reading, spelling, and mathematics at grades 1-6. Experimental research, which documents how teachers can use progress monitoring to enhance student progress, is available for one form of progress monitoring: Curriculum-Based Measurement (CBM). More than 200 empirical studies published in peer-reviewed journals (a) provide evidence of CBM's reliability and validity for assessing the development of competence in reading, spelling, and mathematics and (b) document CBM's capacity to help teachers improve student outcomes at the elementary grades.
Most classroom assessment relies on mastery measurement. With mastery measurement, teachers test for mastery of a single skill and, after mastery is demonstrated, they assess mastery of the next skill in a sequence. So, at different times of the school year, different skills are assessed.
Because the nature and difficulty of the tests keep changing with successive mastery, test scores from different times of the school year cannot be compared (e.g., scores earned in September cannot be compared to scores earned in November or February or May). This makes it impossible to quantify or describe rates of progress. Furthermore, mastery measurement has unknown reliability and validity, and it fails to provide information about whether students are maintaining the previously mastered skills.
CBM avoids these problems because, instead of measuring mastery of a series of single short-term objectives, each CBM test assesses all the different skills covered in the annual curriculum.
CBM samples the many skills in the annual curriculum in such a way that each weekly test is an alternate form (with different test items, but of equivalent difficulty). So, in September, a CBM mathematics test assesses all of the computation, money, graphs/charts, and problem-solving skills to be covered during the entire year. In November or February or May, the CBM test samples the annual curriculum in exactly the same way (but with different items). Therefore, scores earned at different times during the school year can be compared to determine whether a student's competence is increasing.
CBM also differs from mastery measurement because it is standardized; that is, the progress monitoring procedures for creating tests, for administering and scoring those tests, and for summarizing and interpreting the resulting database are prescribed. By relying on standardized methods and by sampling the annual curriculum on every test, CBM produces a broad range of scores across individuals of the same age. The rank ordering of students on CBM corresponds with rank orderings on other important criteria of student competence (1). For example, students who score high (or low) on CBM are the same students who score high (or low) on the annual state tests.
For these reasons, CBM demonstrates strong reliability and validity (2). At the same time, because each CBM test assesses the many skills embedded in the annual curriculum, CBM yields descriptions of students’ strengths and weaknesses on each of the many skills contained in the curriculum. These skills profiles also demonstrate reliability and validity (3). The measurement tasks within CBM are as follows:
Pre-reading
Phoneme segmentation fluency: For 1 minute, the examiner says words; in response to each word, the child says the sounds that constitute the word.
Letter sound fluency: The examiner presents the student with a sheet of paper showing the 26 lower case letters displayed in random order; the student has 1 minute to say the sound identified with each letter.
Reading
Word identification fluency: The examiner presents the student with a list of words, randomly sampled (with replacement) from a list of high-frequency words; the student reads words aloud for
1 minute; the score is the number of words read correctly. (Word identification fluency is appropriate for first graders until the score reaches 40 words read correctly per minute.)
Passage reading fluency: The examiner presents the student with a passage of the difficulty expected for year-end competence; the student reads aloud for 1 minute; the score is the number of words read correctly. (Passage reading fluency is appropriate through the fourth-grade instructional level.)
Maze fluency: The examiner presents the student with a passage of the difficulty expected for year-end competence; every seventh word has been deleted and replaced with three possible choices; the student reads the passage for 2.5 minutes, selecting the meaningful choice for every seventh word; the score is the number of correct replacements.
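The maze construction just described (every seventh word replaced with three choices) can be sketched in code. This is a minimal illustration, not an actual probe generator; the passage, the fixed seed, and the naive distractor selection (sampling other words from the same passage) are all invented for the example:

```python
import random

def build_maze(passage: str, seed: int = 0) -> list:
    """Replace every seventh word with a three-way choice: the correct
    word plus two distractors drawn naively from the passage itself."""
    rng = random.Random(seed)
    words = passage.split()
    items = []
    for i, word in enumerate(words, start=1):
        if i % 7 == 0:
            distractors = rng.sample([w for w in words if w != word], 2)
            choices = [word] + distractors
            rng.shuffle(choices)
            items.append(choices)  # the student selects the meaningful choice
        else:
            items.append(word)
    return items

passage = ("The small dog ran across the yard because it heard a loud "
           "noise near the gate and wanted to see what made the sound")
maze = build_maze(passage)  # items 7, 14, 21 become three-way choices
```

A real maze generator would draw distractors from word lists matched for length and part of speech; the point here is only the every-seventh-word structure.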
Mathematics
Computation: The examiner presents the student with items systematically sampling the problems covered in the annual curriculum (adding, subtracting, multiplying, dividing whole numbers, fractions, and decimals, depending on grade); the student has a fixed time (depending on grade) to write answers; the score is the number of correct digits written in answers.
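The correct-digit scoring for computation can be illustrated with a short sketch. Aligning digits from the ones place is a common convention in CBM scoring guides, but exact alignment rules vary, so treat this as a simplified illustration rather than the official procedure:

```python
def correct_digits(correct_answer: int, student_answer: int) -> int:
    """Count digits in the student's answer that match the correct
    answer, aligning both numbers from the rightmost (ones) place."""
    c = str(correct_answer)[::-1]
    s = str(student_answer)[::-1]
    return sum(1 for a, b in zip(c, s) if a == b)

# For 47 + 77 the correct answer is 124; a student who writes 134
# earns credit for the ones and hundreds digits but not the tens digit.
score = correct_digits(124, 134)  # 2 correct digits
```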
Concepts and applications: The examiner presents the student with items systematically sampling the problems covered in the annual curriculum (measurement, money, charts/graphs, problem solving, numeration, number concepts); the student has a fixed time (depending on grade) to write answers; the score is the number of correct answers written.
Spelling
Each test comprises 20 words randomly sampled from the pool of words expected for mastery during the year; the examiner dictates a word while the student spells on paper; the next item is presented after the student completes his/her spelling or after 10 seconds, whichever occurs sooner; the test lasts 2 minutes; the score is the number of correct letter sequences (adjacent pairs of letters spelled correctly).
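The letter-sequence metric can be sketched as follows. The boundary-padding convention (a fully correct n-letter word earns n + 1 sequences) is a common variant of this score, and the position-by-position comparison below is a simplification of published scoring rules, which align sequences more flexibly:

```python
def correct_letter_sequences(target: str, response: str) -> int:
    """Count adjacent letter pairs in the response that match the target
    at the same position. Each word is padded with boundary markers, so
    a fully correct n-letter word earns n + 1 correct sequences."""
    t = f"^{target}$"
    r = f"^{response}$"
    target_pairs = list(zip(t, t[1:]))
    response_pairs = list(zip(r, r[1:]))
    return sum(1 for i, pair in enumerate(response_pairs)
               if i < len(target_pairs) and pair == target_pairs[i])

# "maek" for "make" keeps the ^m and ma sequences but breaks the rest.
score = correct_letter_sequences("make", "maek")  # 2 of a possible 5
```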
Written Expression
In response to a story starter (i.e., a short topic sentence or phrase to begin the written piece), the student writes for a fixed amount of time (3-10 minutes). The score is the number of correct word sequences.
CBM produces two kinds of information. The overall CBM score (i.e., total score on the test) is an overall indicator of competence. The CBM skills profile describes strengths and weaknesses on the various skills assessed on each CBM test.
Teachers use the overall CBM score in three ways.
First, overall CBM scores are used in universal screening to identify students in need of additional or different forms of instruction. For example, CBM can be administered to all students in a class, school, or district at one point in time (e.g., October or January). Then, children in need of additional attention are identified using (a) normative standards (i.e., identifying students who score low compared to other students in the class, school, or nation) or
(b) CBM benchmarks (i.e., identifying students whose scores fall below a specific cut-point that predicts future success on state tests).
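A minimal sketch of this screening logic follows. The student names, the October scores, and the 40-words-per-minute cut-point are all hypothetical; real benchmarks come from published norms:

```python
def flag_for_screening(scores: dict, benchmark: int) -> dict:
    """Flag students below a benchmark cut-point and students in the
    lowest quartile of the group's scores."""
    ordered = sorted(scores.values())
    q1 = ordered[len(ordered) // 4]  # rough lowest-quartile boundary
    return {
        "below_benchmark": sorted(s for s, v in scores.items() if v < benchmark),
        "lowest_quartile": sorted(s for s, v in scores.items() if v <= q1),
    }

october_wcpm = {"Ana": 62, "Ben": 35, "Cal": 48, "Dee": 71,
                "Eli": 30, "Fay": 55, "Gus": 44, "Hal": 66}
flags = flag_for_screening(october_wcpm, benchmark=40)
# flags["below_benchmark"] -> ["Ben", "Eli"]
# flags["lowest_quartile"] -> ["Ben", "Eli", "Gus"]
```

The two lists correspond to the two criteria in the text: an absolute cut-point predictive of later success, and a norm-referenced comparison within the group.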
The second way teachers use overall CBM scores is to monitor students’ development of academic competence. That is, students are measured weekly or monthly, with each student’s
CBM scores graphed against time. This graph shows the student’s progress toward achieving competence on the annual curriculum. If the graphed scores are going up, then the student is developing competence on the annual curriculum; if the scores are flat, then the student is failing to benefit from the instructional program. The rate of weekly improvement is quantified as slope.
Research provides estimates of the amount of CBM progress (or slope) students typically make.
So, a teacher can compare the slope of her/his own class to the slope of large numbers of typically developing students to determine whether his/her instructional program is generally successful or requires adjustment. Teachers can also examine the slopes of individual students to determine which children are failing to make the amount of progress other children in the class
(or nation) are demonstrating and therefore require additional help.
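The slope described above can be computed as an ordinary least-squares fit of score against week number; the weekly scores below are invented for illustration:

```python
def weekly_slope(scores: list) -> float:
    """Ordinary least-squares slope of score against week number:
    the estimated rate of weekly improvement."""
    n = len(scores)
    weeks = range(1, n + 1)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    den = sum((w - mean_w) ** 2 for w in weeks)
    return num / den

scores = [20, 22, 21, 25, 27, 28, 31, 33]  # words correct per minute, weeks 1-8
slope = weekly_slope(scores)  # about 1.9 words per week
```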
The third way teachers use overall CBM scores is to improve instructional programs. For students who are failing to profit from the standard instructional program (as demonstrated via universal CBM screening or via inadequate CBM progress-monitoring slopes), teachers use
CBM to “experiment” with different instructional components. As teachers adjust instructional programs, in an attempt to enhance academic progress for these children, the teachers continue to collect CBM data. They then compare CBM slopes for different instructional components to identify which components optimize academic growth. In this way, teachers use CBM to build effective programs for otherwise difficult-to-teach children.
Teachers use the CBM skills profiles to identify which skills in the annual curriculum require additional instruction and which students are experiencing problems with maintaining skills after initial mastery was demonstrated. This kind of information can be accessed via CBM because every test assesses every skill covered in the annual curriculum. So, mastery status on every skill can be described directly from each CBM test.
Overview of research.
Studies included in this overview met the following criteria. First, they relied on experimental design; that is, teachers volunteered to participate in any of the study conditions and then were randomly assigned to conditions. Second, all studies included a control group (where teachers did not use systematic progress monitoring), against which the effects of progress monitoring procedures were assessed. Third, progress monitoring procedures were implemented for at least
15 school weeks, or 4 school months. Fourth, teachers’ instructional plans were analyzed to determine how planning changed as a function of progress monitoring. Fifth, students’ academic achievement was measured at the beginning and end of the study on global tests to determine whether students achieved differently in the various progress monitoring conditions.
This overview is organized in three sections: (a) evidence on CBM’s utility to identify students in need of additional or different forms of instruction, (b) evidence on the usefulness of CBM’s graphed analysis of the overall score to help teachers improve their instructional programs and effect better student achievement, and (c) evidence on the added value of CBM’s skills profiles for designing superior instructional programs that produce greater learning.
Results of these studies are described in terms of statistical significance and effect sizes.
Statistical significance means that one treatment group performed so much better than another group that it is highly unlikely that the results could be attributed to chance. This speaks to the reliability of the findings: If a similar study were conducted again, we would expect to find similar results, and if a teacher were to implement the treatment, we would expect similar effects for her/his students. It is possible, however, to have a statistically significant effect, which is accurate and reliable, but is small.
To address the question about whether a treatment effect is big or small, we look at effect sizes.
Effect sizes tell us how many standard deviations one treatment group performed better than another. If the mean of a test is 100 and its standard deviation is 15 (like an IQ test), then an effect size of 1 standard deviation would mean, for example, that the treatment group ended the study with a score of 100 while the control group ended with a score of 85. Generally, in educational research, an effect size of .30 is considered small, .50 is considered moderate, and .70 is considered large.
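The effect-size arithmetic in the IQ example works out as follows; this sketch simply divides the difference in group means by the standard deviation:

```python
def effect_size(mean_treatment: float, mean_control: float, sd: float) -> float:
    """Standardized mean difference: how many standard deviations apart
    the two group means ended the study."""
    return (mean_treatment - mean_control) / sd

# The IQ-style example from the text: means 100 vs. 85, SD 15.
es = effect_size(100, 85, 15)  # 1.0 standard deviation
```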
Identifying students in need of additional or different forms of instruction. Research shows that
CBM can be used to prompt teacher concern about student progress and to signal the need for additional or different forms of instruction. For example, in a recent study (4), 24 second-grade teachers were randomly assigned to control or CBM progress monitoring groups. Progress monitoring teachers, with the assistance of computers, collected CBM oral reading fluency data with every student in their classes. The computer organized the CBM information into individual student graphs as well as class reports. These reports showed CBM class graphs; noted students who fell in the lowest quartile of the class; and identified students in need of comprehension instruction, fluency development, or decoding work. In addition, the report provided a rank ordering of the students in the class, sorting them into those who already had met the year-end CBM benchmark, those who were on track to meet the year-end benchmark, and those who were at risk of failing to achieve the year-end benchmark. Teachers collected CBM data for 15 weeks, with individual graphs shown at the end of every data-collection session and with class reports printed every 3 weeks. Every 3 weeks, teachers answered the questions, “Do you have children whose progress seems problematic? Which children are you concerned about?” Progress monitoring teachers expressed concern about statistically significantly more students, with effect sizes exceeding 1 standard deviation. Moreover, when asked, “Why are you concerned about __________ ?,”
progress monitoring teachers described features of student performance to explain their concern; by contrast, control teachers cited reasons beyond their control (such as English Language Learner status, special education status, attention or motivation problems, or inadequate parental involvement). This pattern of results was statistically significant. Therefore, systematic progress monitoring can be used to raise teacher concern about students' reading progress and to signal the need for additional or different forms of instruction.
Usefulness of graphed analysis of the overall CBM scores. Evidence strongly supports the utility of graphed analysis of overall CBM scores in helping teachers plan more effective programs. Studies (5) conducted over the past decade provide corroborating evidence of strong effects on students' reading, spelling, and mathematics achievement when teachers rely on CBM progress monitoring to help them plan their instruction. A study conducted in the New York City
Public Schools (6) illustrates this research. Teachers participated for 18 weeks in a control group
(i.e., no systematic progress monitoring) or a CBM progress monitoring group. In the progress monitoring group, teachers measured students' reading performance with CBM oral reading fluency twice weekly, scored and graphed CBM performances, and applied CBM decision rules
(described in the next three paragraphs) to those graphs to plan their students' reading programs.
Children whose teachers employed CBM progress monitoring to develop reading programs achieved statistically significantly better than students in the control group on measures tapping a variety of reading skills, including a fluency test as well as the decoding and comprehension subtests of the Stanford Diagnostic Reading Test. Effect sizes were large, ranging between .94 and
1.18 standard deviations. So, teachers used CBM’s graphed analysis to effect greater reading achievement in terms of fluency, decoding, and comprehension.
CBM progress monitoring, using the graphed analysis, relies on decision rules that help teachers set ambitious student goals and help them determine when instructional adjustments are needed to prompt better student growth. The student’s initial CBM scores are graphed. The teacher uses normative information about expected rates of CBM growth to set a goal for the end of the school year. A diagonal line is drawn from the initial scores to the goal level/date. This diagonal line represents the desired rate of improvement for that student. As the instructional program is implemented, weekly CBM data are collected and graphed. A line of best fit is drawn through the student’s graphed scores to estimate the child’s actual weekly rate of improvement, or CBM slope.
The steepness of the goal line is compared to the steepness of the student’s actual rate of improvement. If the steepness of the student’s actual rate of improvement is greater, then the CBM decision is to raise the goal. If the steepness of the goal line is greater, then the CBM decision is to adjust the instructional program to stimulate greater learning.
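The decision rules just described reduce to a comparison of two slopes; the scores and time frame below are hypothetical:

```python
def cbm_decision(initial_score: float, goal_score: float,
                 weeks_to_goal: int, actual_slope: float) -> str:
    """Compare the student's actual rate of improvement (trend-line
    slope) to the goal line's slope and return the indicated decision."""
    goal_slope = (goal_score - initial_score) / weeks_to_goal
    if actual_slope > goal_slope:
        return "raise the goal"
    return "adjust the instructional program"

# An initial score of 20 WCPM and a year-end goal of 60 WCPM over 30
# weeks give a goal-line slope of about 1.33 words per week.
decision = cbm_decision(20, 60, 30, actual_slope=1.9)  # "raise the goal"
```

In practice the trend line would be fit through the graphed weekly scores (e.g., by least squares) rather than supplied directly.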
Fuchs, Fuchs, and Hamlett (7) explored the contribution of the goal-raising CBM decision rule.
Teachers were assigned randomly to and participated in one of three treatments for 15 weeks in mathematics: no CBM, CBM without a goal-raising rule, and CBM with a goal-raising rule. The goal-raising rule required teachers to increase goals whenever the student's actual rate of growth
(represented by the slope through the actual, graphed scores) was greater than the growth rate anticipated by the teacher (reflected in the goal line). Teachers in the CBM goal-raising condition raised goals statistically significantly more frequently (for 15 of 30 students) than teachers in the nongoal-raising conditions (for 1 of 30 students). Moreover, concurrent with teachers' goal raising was statistically significantly differential student achievement on pre/post standardized achievement tests: The effect size comparing the pre/post change of the two CBM conditions (i.e., with and without the goal-raising rule) was .50 standard deviation. Consequently, using CBM to monitor the appropriateness of instructional goals and to adjust goals upward whenever possible is one means by which CBM can be used to assist teachers in their instructional planning.
A second way in which CBM can be used to enhance instructional decisions is to assess the adequacy of student progress and determine whether, and if so when, instructional adjustments are necessary. When actual growth rate is less than expected growth rate, the teacher modifies the instructional program to promote stronger learning. Fuchs, Fuchs, and Hamlett (8) estimated the contribution of this CBM decision-making strategy with 29 teachers who implemented CBM for
15 school weeks with 53 students. Teachers in a "CBM-measurement only" group measured students' reading growth as required but did not use the assessment information to structure students' reading programs. Teachers in the CBM-"change the program" group measured student performance and used CBM to determine when to introduce program adjustments to enhance student learning. Results indicated that, although teachers in both groups monitored student progress, important differences were associated with the use of the "change the program" decision rule. As shown on the Stanford Achievement Test-Reading Comprehension subtest, students in the
"change the program" group achieved statistically significantly better than a no-CBM control group (effect size=.72), whereas the "measurement only" CBM group did not (effect size=.36).
Moreover, the slopes of the two CBM treatment groups were significantly different, favoring the achievement of the "change the program" group (effect size=.86). As suggested by these findings and results of other studies (9), collecting CBM data, in and of itself, exerts only a small effect on student learning. To enhance student outcomes in substantial ways, teachers need to use the CBM data to build effective programs for difficult-to-teach students.
Added value of skills profiles. To obtain rich descriptions of student performance, alternative ways of summarizing and describing student performance are necessary. Because CBM assesses performance on the year's curriculum at each testing, rich descriptions of strengths and weaknesses in the curriculum can be generated, and studies show how these skills profiles enhance teacher planning and student learning. In a series of investigations in reading (10), math (11), and spelling
(12), teachers were assigned randomly to one of three conditions: no CBM, CBM with goal-raising and change-the-program decision rules, and CBM with goal-raising and change-the-program decision rules plus CBM skills profiles. In all three studies, teachers in the skills profile group generated instructional plans that were statistically significantly more varied and more responsive to individuals' learning needs. Moreover, they effected statistically significantly better student learning as measured on change between pre- and posttest performance on global measures of achievement. Effect sizes associated with the CBM diagnostic profile groups ranged from .65 to
1.23 standard deviations. This series of studies demonstrates how structured, well-organized CBM information about students' strengths and difficulties in the curriculum can help teachers build better programs and effect greater learning.
Summary.
As demonstrated via the randomized field trials described above, teachers can use systematic progress monitoring in reading, mathematics, and spelling to identify students in need of additional or different forms of instruction, to design stronger instructional programs, and to effect better achievement outcomes for their students.
References.
1. Good, R.H., Simmons, D.C., & Kame’enui, E.J. (2001). The importance and decision-making utility of a continuum of fluency-based indicators of foundational reading skills for third-grade high-stakes outcomes. Scientific Studies of Reading, 5, 257-288.
2. Marston, D. (1989). Curriculum-based measurement: What is it and why do it? In M.R. Shinn (Ed.), Curriculum-based measurement: Assessing special children (pp. 18-78). New York: Guilford.
3. Fuchs, L.S., Fuchs, D., Hamlett, C.L., & Allinder, R.M. (1989). The reliability and validity of skills analysis within curriculum-based measurement. Diagnostique, 14, 203-221.
Fuchs, L.S., Fuchs, D., Hamlett, C.L., Thompson, A., Roberts, P.H., Kubec, P., & Stecker, P.M. (1994). Technical features of a mathematics concepts and applications curriculum-based measurement system. Diagnostique, 19(4), 23-49.
4. Fuchs, L.S., & Fuchs, D. (in press). Can diagnostic assessment information enhance general educators' instructional planning and student achievement? In B. Foorman (Ed.), Prevention and intervention for reading disabilities. New York: York Press.
5. Fuchs, L.S., Fuchs, D., Hamlett, C.L., & Allinder, R.M. (1991). Effects of expert system advice within curriculum-based measurement on teacher planning and student achievement in spelling. School Psychology Review, 20, 49-66.
Fuchs, L.S., Fuchs, D., Hamlett, C.L., & Ferguson, C. (1992). Effects of expert system consultation within curriculum-based measurement using a reading maze task. Exceptional Children, 58, 436-450.
Jones, E.D., & Krouse, J.P. (1988). The effectiveness of data-based instruction by student teachers in classrooms for pupils with mild learning handicaps. Teacher Education and Special Education, 11, 9-19.
Stecker, P.M., & Fuchs, L.S. (2000). Effecting superior achievement using curriculum-based measurement: The importance of individual progress monitoring. Learning Disability Research and Practice, 15, 128-134.
Wesson, C.L. (1991). Curriculum-based measurement and two models of follow-up consultation. Exceptional Children, 57, 246-257.
Wesson, C.L., Skiba, R., Sevcik, B., King, R., & Deno, S. (1984). The effects of technically adequate instructional data on achievement. Remedial and Special Education, 5, 17-22.
6. Fuchs, L.S., Deno, S.L., & Mirkin, P.K. (1984). The effects of frequent curriculum-based measurement and evaluation on student achievement, pedagogy, and student awareness of learning. American Educational Research Journal, 21, 449-460.
7. Fuchs, L.S., Fuchs, D., & Hamlett, C.L. (1989a). Effects of alternative goal structures within curriculum-based measurement. Exceptional Children, 55, 429-438.
8. Fuchs, L.S., Fuchs, D., & Hamlett, C.L. (1989b). Effects of instrumental use of curriculum-based measurement to enhance instructional programs. Remedial and Special Education, 10(2), 43-52.
9. Stecker, P.M., & Fuchs, L.S. (2000). Effecting superior achievement using curriculum-based measurement: The importance of individual progress monitoring. Learning Disability Research and Practice, 15, 128-134.
Wesson, C.L., Skiba, R., Sevcik, B., King, R., & Deno, S. (1984). The effects of technically adequate instructional data on achievement. Remedial and Special Education, 5, 17-22.
10. Fuchs, L.S., Fuchs, D., & Hamlett, C.L. (1989). Monitoring reading growth using student recalls: Effects of two teacher feedback systems. Journal of Educational Research, 83, 103-111.
11. Fuchs, L.S., Fuchs, D., Hamlett, C.L., & Stecker, P.M. (1990). The role of skills analysis in curriculum-based measurement in math. School Psychology Review, 19, 6-22.
12. Fuchs, L.S., Fuchs, D., Hamlett, C.L., & Allinder, R.M. (1991). The contribution of skills analysis to curriculum-based measurement in spelling. Exceptional Children, 57, 443-452.
Implications for Practice.
Teachers should monitor student progress in reading, spelling, and mathematics using standardized progress monitoring systems, such as curriculum-based measurement (CBM).
Teachers should use progress monitoring systems to identify students in need of additional or different forms of instruction.
For students who do not respond adequately to the standard instructional program, teachers should use graphed analyses of CBM scores to ensure ambitious goals and to identify instructional components that result in improved learning for otherwise difficult-to-teach students.
Teachers should use skills profiles, derived from progress monitoring systems, to formulate strong instructional programs and to effect better student outcomes.
Additional Readings.
Deno, S.L. (1985). Curriculum-based measurement: The emerging alternative. Exceptional Children, 52, 219-232.
Deno, S.L., & Fuchs, L.S. (1987). Developing curriculum-based measurement systems for data-based special education problem solving. Focus on Exceptional Children, 19(8), 1-16.
Fuchs, L.S., & Deno, S.L. (1991). Paradigmatic distinctions between instructionally relevant measurement models. Exceptional Children, 57, 488-501.
Fuchs, L.S., Fuchs, D., Hamlett, C.L., Walz, L., & Germann, G. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22, 27-48.
Good, R.H., Simmons, D.C., & Kame’enui, E.J. (2001). The importance and decision-making utility of a continuum of fluency-based indicators of foundational reading skills for third-grade high-stakes outcomes. Scientific Studies of Reading, 5 , 257-288.
National Center on Student Progress Monitoring
1000 Thomas Jefferson, Washington, DC 20007
202-342-5000
E-Mail: studentprogress@air.org
Web: www.studentprogress.org
This document was developed through Cooperative Agreement (#H324U010004) between
Vanderbilt University and the U.S. Department of Education, Office of Special Education
Programs for the National Research Center on Learning Disabilities. The contents of this document do not necessarily reflect the views or policies of the Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S.
Government.
Deborah Speece
Department of Special Education
University of Maryland
Available on www.studentprogress.org. Click on Library, then Articles and Research.
My colleagues and I studied a response-to-instruction model as a method of identifying children for special education services. To judge responsiveness, we used curriculum-based measures
(CBM) of oral reading fluency to monitor progress. In one of the schools we worked in, children were administered these one-minute measures every week. About every 8 weeks we met with the children’s teachers to share graphs of children’s progress, identify children who were falling behind their peers, and design reading interventions that the general educator thought were feasible to implement in the classroom. Children who caught up with their peers were considered responsive and continued with weekly measurement; those who did not make adequate progress continued to receive specially-designed intervention from the general education teacher as well as weekly measurement. This process generated a number of examples of how weekly progress monitoring, which includes systematic data interpretation and teacher action, is central to good decision-making in an RTI framework. Two children are discussed whose profiles illustrate different aspects of the progress monitoring-RTI interface.
Kyle: Don’t Forget About Academics
Kyle was in second grade when he entered our study.
When we met with his classroom teacher to discuss his lack of reading progress, the discussion was dominated by a focus on his problem maintaining attention and by the excellent involvement of his parents with the school and classroom. Kyle's father volunteered in the classroom one day per week, and both parents were aware of Kyle's impulsivity and his difficulties completing assignments and working independently. They declined any involvement with special education assessment or suggestions to evaluate for attention deficit disorder but did work closely with his pediatrician.
Although attention was clearly an issue, it was equally apparent that Kyle was not making progress in reading. Our weekly CBM measures in the fall showed that he was reading about 20 words correctly in one minute. The average number of words read by second graders at that time of year is about 65. The graph below depicts Kyle's performance on the CBM measure across the year. As indicated, he was reading only 20 words per minute in November and December.
Equally problematic was that he was showing no growth. His performance was somewhat better in January but he was still behind his peers.
The teacher may have realized the extent of his reading problems, but our sense was that her emphasis was on how to keep Kyle focused (a reasonable goal); reading instruction seemed a secondary concern. In the second half of the year, Kyle’s reading performance showed good improvement following the teacher’s implementation of a reading intervention developed collaboratively. The dotted line on the graph indicates when the teacher began the intervention, the “G” represents the goal we set for him, and the “T” represents the trend line that summarizes his rate of growth. The interpretation is that Kyle exceeded his goal. Because we did not
establish experimental control, we cannot say his improvement was due to the teacher’s additional instruction.
There are two important points relevant to progress monitoring and RTI. First, it is conceivable that Kyle’s alarmingly poor reading did not receive the attention it deserved, given the preoccupation with his attention problems. Thus, frequent monitoring and interpretation of performance is essential for keeping track of children’s academic progress. Second, comparisons to both individual goals and group performance are necessary in an RTI framework. Kyle exceeded his goal at the end of second grade, but compared with his second-grade peers he still lagged behind in both the number of words he could read and his rate of growth. In planning for the next year, instructional arrangements and practices should be considered that might help Kyle close the gap with his peers.
Janis: When More is Needed
Janis’ profile was quite a bit different from Kyle’s. She was identified by our project in first grade because she was making very little growth in oral reading fluency. Janis was viewed by her teachers as cooperative, hardworking, and very quiet. She was consistently described as slow, being the last child to join her group or finish work. Spanish was her first language, and teachers believed she had made a lot of progress with her English skills since kindergarten. Janis was in our study for two years and was identified three times by us as being below her classmates on her growth and the number of words she could read.
Janis received several rounds of intervention that we developed with her teachers, including individual instruction from a general education teacher in second grade. As far as we could tell, her teachers were devoted to improving her reading skill, but despite their efforts, Janis remained behind her classmates. The graphs below show Janis’ first and second grade oral reading fluency scores. Although she made growth, it was minimal and certainly not enough to catch up with her peers. She was growing at a rate of approximately 0.5 words per week, while her peers were growing at more than double that rate.
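The growth rates discussed here are the slopes of trend lines fit to the weekly scores. As a minimal sketch, a least-squares slope over equally spaced weeks can be computed as follows; the score lists are hypothetical illustrations, not Janis’ actual data.

```python
# Sketch: estimating a CBM growth rate (words per week) as the
# ordinary least-squares slope of weekly scores. Hypothetical data.

def growth_rate(scores):
    """Return the least-squares slope of scores across equally spaced weeks."""
    n = len(scores)
    weeks = range(n)
    mean_week = sum(weeks) / n
    mean_score = sum(scores) / n
    num = sum((w - mean_week) * (s - mean_score) for w, s in zip(weeks, scores))
    den = sum((w - mean_week) ** 2 for w in weeks)
    return num / den

# A child gaining roughly half a word per week...
struggling = [20, 20, 21, 21, 22, 22, 23, 23]
# ...versus peers gaining more than a word per week.
typical = [40, 41, 43, 44, 45, 47, 48, 49]

print(round(growth_rate(struggling), 2))  # 0.48
print(round(growth_rate(typical), 2))     # 1.32
```

Because each weekly probe is brief and noisy, the slope of the trend line, rather than any single score, is what supports comparisons like “0.5 words per week versus more than double that rate.”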
Janis’ progress monitoring data show that she was not responsive to general education efforts and more intensive intervention was needed. Interestingly, the teachers never raised the possibility of special education services. Possibly the fact that she was an English language learner clouded the issue of her reading progress. That is, disability was implicitly ruled out because of her language status. In any event, the progress monitoring graphs are telling and
difficult to argue with. Regardless of perceived cause, Janis required something different to make gains in reading.
Summary
Progress monitoring is a method of keeping track of children’s academic development. Progress monitoring requires frequent data collection (e.g., weekly) with technically adequate measures, interpretation of the data at regular intervals, and changes to instruction based on the interpretation of child progress. The two cases presented were meant to illustrate how progress monitoring data can be used to make reasonable decisions about children’s responsiveness. In one example the data shone a light on reading problems that may have been overshadowed by behavioral issues and, in the other, the data indicated that the child needed more than what could be delivered in general education. The approach requires a different way of thinking about children’s learning but is a powerful method of judging responsiveness.
Resources
Case, L. P., Speece, D. L., & Molloy, D. E. (2003). The validity of a response-to-instruction paradigm to identify reading disabilities: A longitudinal analysis of individual differences and contextual factors. School Psychology Review, 32, 557-582.
* Has further analysis of Kyle, Janis, and other students
Deno, S. L. (1997). Whether thou goest…Perspectives on progress monitoring. In J. W. Lloyd, E. J. Kameenui, & D. Chard (Eds.), Issues in educating students with disabilities (pp. 77-99). Mahwah, NJ: Lawrence Erlbaum.
* Compares two methods of progress monitoring: mastery measurement and general outcome measurement
Fuchs, L. S., Hamlett, C., & Fuchs, D. Monitoring basic skills progress. Austin, TX: Pro-Ed.
* Software for CBM administration, scoring, graphing on Mac; reading, math concepts and applications, spelling
Speece, D. L., Case, L. P., & Molloy, D. E. (2003). Responsiveness to general education instruction as the first gate to learning disabilities identification. Learning Disabilities Research & Practice, 18, 147-156.
* Entire issue devoted to Response to Instruction

www.glue.umd.edu/~dlspeece/cbm reading/
* Has oral reading fluency passages grades 1-4, administration, scoring directions; local norms for a group of elementary school children
Prepared by:
Kathleen McLane
Available on www.studentprogress.org. Click on Families.
Our children’s progress is being monitored constantly at school, through the steady stream of homework assignments, quizzes, tests, projects, and standardized tests. On first hearing the term
“student progress monitoring,” our initial reaction may be “they’re doing this already!” or “more tests?” But do you really know how much your child is learning or progressing? Standardized tests compare your child’s performance with other children’s or with state standards. However, these tests are given at the end of the year; the teacher who has been working with your child during the year will not be able to use the test results to decide how to help your child learn better. Progress monitoring can give you and your child’s teacher information that can help your child learn more and learn faster, and help your child’s teachers teach more effectively and make better decisions about the type of instruction that will work best with your child. In other words, student progress monitoring is not another way of assigning a number to your child; it is a way of helping the child learn and the teacher teach.
What Is Student Progress Monitoring?
Student progress monitoring helps teachers evaluate how effective their instruction is, either for individual students or for the entire class. You are probably already familiar with the goals and objectives that must be included in the Individualized Education Program (IEP) for each child who receives special education services. A teacher who uses progress monitoring works with the goals in the IEP, and the state standards for the child’s grade level, to develop goals that can be measured and tracked, and that divide what the child is expected to learn by the end of the year into shorter, measurable steps. For example, the child may have a reading goal that is stated in terms of the number of words per minute expected by the end of the year. Or, the child may have a math goal that is stated as the number of problems scored correctly on tests covering the math content for the year. Once the teacher sets the goals and begins instruction, he or she measures the child’s progress toward meeting the goals each week. All the tests have the same level of difficulty, so the weekly tests can reflect the child’s rate of progress accurately. With each test, the teacher compares how much the child is expected to have learned to the child’s actual rate of learning.
If the child is meeting or exceeding the expectation, the teacher continues to teach the child in the same way. If the child’s performance on the measurement does not meet the expectation, then the teacher changes the teaching. The teacher might change the method being used, the amount of instructional time, the grouping arrangement (for example, individual instruction versus small-group instruction), or some other aspect of teaching. In this process, the teacher is looking for the type and amount of instruction that will enable the child to make enough progress toward meeting the goal. The measurements take from 1 to 5 minutes, so the child should not have the feeling of constantly being tested. In addition, since the teacher measures progress
frequently – usually once a week – he or she can revise the instructional plan as soon as the child needs it, rather than waiting until a test or the state assessment shows that the child’s instructional needs are not being met.
After each weekly measurement, the teacher notes your child’s performance level and compares it to previous measurements and to expected rates of learning. The teacher tracks the measurements on a graph as a way of showing the success of both the teacher and the student.
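The weekly comparison described above can be sketched in code. A common convention is to draw an aimline from the baseline score to the year-end goal and to change instruction when several consecutive points fall below it; the four-point rule and all numbers below are illustrative assumptions, not a prescribed standard.

```python
# Sketch: compare weekly scores to the aimline (straight line from the
# baseline score to the year-end goal) and flag the need for an
# instructional change when several consecutive points fall below it.
# The four-point rule and all numbers are illustrative assumptions.

def aimline(baseline, goal, total_weeks, week):
    """Expected score at `week` on a straight line from baseline to goal."""
    return baseline + (goal - baseline) * week / total_weeks

def needs_change(scores, baseline, goal, total_weeks, run=4):
    """True if the last `run` weekly scores all fall below the aimline."""
    recent = list(enumerate(scores))[-run:]
    return all(s < aimline(baseline, goal, total_weeks, w) for w, s in recent)

weekly_wcpm = [20, 21, 20, 22, 21, 22, 23, 23]  # hypothetical words correct per minute
print(needs_change(weekly_wcpm, baseline=20, goal=60, total_weeks=30))  # True
```

The point of the rule is timing: the flag is raised after a few weeks of flat data, not at the end of the year, so the teacher can revise instruction as soon as the child needs it.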
What Information Should I Receive From the School?
If a teacher, or a school, decides to implement student progress monitoring, you may receive a letter describing the program and how the teacher will be working with your child, or it may be discussed at your child’s IEP meeting. After that, you should receive regular feedback from the teacher on how well your child is doing, perhaps with a copy of the graph itself and information on instructional changes. If you do not receive the graph and instructional information, ask for it. For more information, visit www.studentprogress.org.
National Center on Student Progress Monitoring
1000 Thomas Jefferson, Washington, DC 20007
202-403-5000
E-Mail: studentprogress@air.org
Web: www.studentprogress.org
This document was developed through Cooperative Agreement (#H326W0003) between the American Institutes for Research and the U.S. Department of Education, Office of Special Education Programs. The contents of this document do not necessarily reflect the views or policies of the Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government. This publication is copyright free. Readers are encouraged to copy and share it, but please credit the National Center on Student Progress Monitoring.
Available at:
IDEA Regulations
The reauthorized Individuals with Disabilities Education Act (IDEA) was signed into law on Dec. 3, 2004, by President George W. Bush. The provisions of the act became effective on July 1, 2005, with the exception of some of the elements pertaining to the definition of a “highly qualified teacher” that took effect upon the signing of the act. The final regulations were published on Aug. 14, 2006. This is one in a series of documents, prepared by the Office of Special Education and Rehabilitative Services (OSERS) in the U.S. Department of Education, that covers a variety of high-interest topics and brings together the regulatory requirements related to those topics to support constituents in preparing to implement the new regulations.¹
¹ Topics in this series include: Alignment With the No Child Left Behind (NCLB) Act; Changes in Initial Evaluation and Reevaluation; Children Enrolled by Their Parents in Private Schools; Discipline; Disproportionality and Overidentification; Early Intervening Services; Highly Qualified Teachers; Identification of Specific Learning Disabilities; Individualized Education Program (IEP) Team Meetings and Changes to the IEP; Individualized Education Program (IEP); Local Funding; Monitoring, Technical Assistance and Enforcement; National Instructional Materials Accessibility Standard (NIMAS); Part C Amendments in IDEA 2004; Part C Option: Age 3 to Kindergarten Age; Procedural Safeguards: Surrogates, Notice and Consent; Procedural Safeguards: Mediation; Procedural Safeguards: Resolution Meetings and Due Process Hearings; Secondary Transition; State Complaint Procedures; State Funding; and Statewide and Districtwide Assessments. Documents are available on the IDEA Web site at: http://IDEA.ed.gov.

This document addresses the final regulatory requirements regarding early intervening services.

IDEA Regulations

1. Add “early intervening services” to the regulations under local educational agency (LEA) eligibility.

An LEA may not use more than 15 percent of the amount the LEA receives under Part B of the Act for any fiscal year, less any amount reduced by the LEA pursuant to 34 CFR 300.205, if any, in combination with other amounts (which may include amounts other than education funds), to develop and implement coordinated, early intervening services, which may include interagency financing structures, for students in kindergarten through grade 12 (with a particular emphasis on students in kindergarten through grade three) who are not currently identified as needing special education or related services, but who need additional academic and behavioral support to succeed in a general education environment. [34 CFR 300.226(a)] [20 U.S.C. 1413(f)(1)]

2. Allow activities for LEAs implementing coordinated, early intervening services.

In implementing coordinated, early intervening services under 34 CFR 300.226, an LEA may carry out activities that include:

• Professional development (which may be provided by entities other than LEAs) for teachers and other school staff to enable such personnel to deliver scientifically based¹ academic and behavioral interventions, including scientifically based literacy instruction, and, where appropriate, instruction on the use of adaptive and instructional software; and
• Providing educational and behavioral evaluations, services, and supports, including scientifically based literacy instruction. [34 CFR 300.226(b)] [20 U.S.C. 1413(f)(2)]
3. Clarify the relationship between free appropriate public education (FAPE) and early intervening services.
Nothing in this section shall be construed to either limit or create a right to FAPE under Part B of the Act or to delay appropriate evaluation of a child suspected of having a disability. [34 CFR 300.226(c)] [20 U.S.C. 1413(f)(3)]
4. Establish reporting requirements.
Each LEA that develops and maintains coordinated, early intervening services under 34 CFR 300.226 must annually report to the State educational agency (SEA) on:

• The number of children served under 34 CFR 300.226 who received early intervening services; and

• The number of children served under 34 CFR 300.226 who received early intervening services and subsequently receive special education and related services under Part B of the Act during the preceding two-year period. [34 CFR 300.226(d)] [20 U.S.C. 1413(f)(4)]
5. Establish coordination with the ESEA.²

Funds made available to carry out 34 CFR 300.226 may be used to carry out coordinated, early intervening services aligned with activities funded by, and carried out under, the ESEA if those funds are used to supplement, and not supplant, funds made available under the ESEA for the activities and services assisted under 34 CFR 300.226. [34 CFR 300.226(e)] [20 U.S.C. 1413(f)(5)]

6. Permit the use of funds for early intervening services.
The amount of funds expended by an LEA for early intervening services under 34 CFR 300.226 shall count toward the maximum amount of expenditures that the LEA may reduce under 34 CFR 300.205(a). [34 CFR 300.205(d)] [20 U.S.C. 1413(a)(2)(C)(i)]

¹ Scientifically based research has the meaning given the term in section 9101(37) of the Elementary and Secondary Education Act (ESEA) of 1965. Section 9101(37) of ESEA, as amended by the NCLB, defines scientifically based research as “research that involves the application of rigorous, systematic, and objective procedures to obtain reliable and valid knowledge relevant to education activities and programs.” The statute then explains that this kind of research: (1) Employs systematic, empirical methods that draw on observation or experiment; (2) Involves rigorous data analyses that are adequate to test the stated hypotheses and justify the general conclusions drawn; (3) Relies on measurements or observational methods that provide reliable and valid data across evaluators and observers, across multiple measurements and observations, and across studies by the same or different investigators; (4) Is evaluated using experimental or quasi-experimental designs in which individuals, entities, programs, or activities are assigned to different conditions and with appropriate controls to evaluate the effects of the condition of interest, with a preference for random-assignment experiments, or other designs to the extent that those designs contain within-condition or across-condition controls; (5) Ensures that experimental studies are presented in sufficient detail and clarity to allow for replication or, at a minimum, offer the opportunity to build systematically on their findings; and (6) Has been accepted by a peer-reviewed journal or approved by a panel of independent experts through a comparably rigorous, objective, and scientific review. (Note: practitioner journals or education magazines are not the same as peer-reviewed academic journals.)

² For purposes of this document, the NCLB is referred to as the ESEA of 1965, as amended.
7. Require early intervening services in the case of significant disproportionality.
In the case of a determination of significant disproportionality with respect to the identification of children as children with disabilities, or the placement in particular educational settings of these children, in accordance with 34 CFR 300.646(a), [Note that 34 CFR §300.646(a) addresses identification, placement and disciplinary actions.] the State or the Secretary of the Interior must…require any LEA identified under 34 CFR 300.646(a) to reserve the maximum amount of funds under section 613(f) of the Act [34 CFR 300.226] to provide comprehensive coordinated early intervening services to serve children in the LEA, particularly, but not exclusively, children in those groups that were significantly overidentified under 34 CFR 300.646(a). [34 CFR 300.646(b)(2)] [20 U.S.C. 1418(d)(2)(B)]
8. Establish a relationship between maintenance of effort and early intervening services.
LEAs that seek to reduce their local maintenance of effort in accordance with 34 CFR 300.205(d) and use some of their Part B funds for early intervening services under 34 CFR 300.226 must do so with caution because the local maintenance of effort reduction provision and the authority to use Part B funds for early intervening services are interconnected. The decisions that an LEA makes about the amount of funds that it uses for one purpose affect the amount that it may use for the other. Below are examples that illustrate how 34 CFR 300.205(d) and 300.226(a) affect one another.
Example 1: In this example, the amount that is 15 percent of the LEA's total grant (see 34 CFR 300.226(a)), which is the maximum amount that the LEA may use for early intervening services (EIS), is greater than the amount that may be used for local maintenance of effort (MOE) reduction (50 percent of the increase in the LEA's grant from the prior year's grant) (see 34 CFR 300.205(a)).

Prior Year's Allocation: $900,000
Current Year's Allocation: $1,000,000
Increase: $100,000
Maximum Available for MOE Reduction: $50,000
Maximum Available for EIS: $150,000
If the LEA chooses to set aside $150,000 for EIS, it may not reduce its MOE (MOE maximum $50,000 less $150,000 for EIS means $0 can be used for MOE).
If the LEA chooses to set aside $100,000 for EIS, it may not reduce its MOE (MOE maximum $50,000 less $100,000 for EIS means $0 can be used for MOE).
If the LEA chooses to set aside $50,000 for EIS, it may not reduce its MOE (MOE maximum $50,000 less $50,000 for EIS means $0 can be used for MOE).
If the LEA chooses to set aside $30,000 for EIS, it may reduce its MOE by $20,000 (MOE maximum $50,000 less $30,000 for EIS means $20,000 can be used for MOE).
If the LEA chooses to set aside $0 for EIS, it may reduce its MOE by $50,000 (MOE maximum $50,000 less $0 for EIS means $50,000 can be used for MOE).
Example 2: In this example, the amount that is 15 percent of the LEA's total grant (see 34 CFR 300.226(a)), which is the maximum amount that the LEA may use for EIS, is less than the amount that may be used for MOE reduction (50 percent of the increase in the LEA's grant from the prior year's grant) (see 34 CFR 300.205(a)).

Prior Year's Allocation: $1,000,000
Current Year's Allocation: $2,000,000
Increase: $1,000,000
Maximum Available for MOE Reduction: $500,000
Maximum Available for EIS: $300,000
If the LEA chooses to use no funds for MOE, it may set aside $300,000 for EIS (EIS maximum $300,000 less $0 means $300,000 for EIS).
If the LEA chooses to use $100,000 for MOE, it may set aside $200,000 for EIS (EIS maximum $300,000 less $100,000 means $200,000 for EIS).
If the LEA chooses to use $150,000 for MOE, it may set aside $150,000 for EIS (EIS maximum $300,000 less $150,000 means $150,000 for EIS).
If the LEA chooses to use $300,000 for MOE, it may not set aside anything for EIS (EIS maximum $300,000 less $300,000 means $0 for EIS).
If the LEA chooses to use $500,000 for MOE, it may not set aside anything for EIS (EIS maximum $300,000 less $500,000 means $0 for EIS).
[Appendix D to 34 CFR 300]
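The arithmetic in the two examples above can be sketched in a few lines: the EIS ceiling is 15 percent of the current grant, the MOE-reduction ceiling is 50 percent of the year-over-year increase, and a dollar committed against one ceiling reduces what remains under the other. This is a simplified illustration of the worked examples, not legal guidance.

```python
# Sketch of the Appendix D interplay between EIS set-asides and MOE
# reduction. Simplified illustration of the worked examples above.

def ceilings(prior_grant, current_grant):
    """Return (max EIS, max MOE reduction) for the current year."""
    max_eis = 0.15 * current_grant                       # 34 CFR 300.226(a)
    max_moe = 0.50 * max(0, current_grant - prior_grant)  # 34 CFR 300.205(a)
    return max_eis, max_moe

def remaining_moe(prior_grant, current_grant, eis_set_aside):
    """MOE reduction still available after an EIS set-aside."""
    _, max_moe = ceilings(prior_grant, current_grant)
    return max(0, max_moe - eis_set_aside)

def remaining_eis(prior_grant, current_grant, moe_reduction):
    """EIS set-aside still available after an MOE reduction."""
    max_eis, _ = ceilings(prior_grant, current_grant)
    return max(0, max_eis - moe_reduction)

# Example 1: $900,000 -> $1,000,000 (EIS ceiling $150,000, MOE ceiling $50,000)
print(remaining_moe(900_000, 1_000_000, 30_000))     # 20000.0
# Example 2: $1,000,000 -> $2,000,000 (EIS ceiling $300,000, MOE ceiling $500,000)
print(remaining_eis(1_000_000, 2_000_000, 150_000))  # 150000.0
```

Running the Example 1 figures through `remaining_moe` reproduces the $20,000 MOE reduction shown above for a $30,000 EIS set-aside, and the Example 2 figures reproduce the $150,000 EIS set-aside left after a $150,000 MOE reduction.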
Available at:
Questions and Answers
On Response to Intervention (RTI) and
Early Intervening Services (EIS)
January 2007
The final regulations for the reauthorized Individuals with Disabilities Education Act (IDEA) were published in the Federal Register on August 14, 2006, and became effective on October 13, 2006. Since publication of the final regulations, the Office of Special Education and Rehabilitative Services (OSERS) in the U.S. Department of Education has received requests for clarification of some of these regulations. This is one in a series of question and answer documents prepared by OSERS to address some of the most important issues raised by requests for clarification on a variety of high-interest topics. Generally, the questions, and corresponding answers, presented in this Q&A document required interpretation of IDEA and the regulations, and the answers are not simply a restatement of the statutory or regulatory requirements. The responses presented in this document generally are informal guidance representing the Department's interpretation of the applicable statutory or regulatory requirements in the context of the specific facts presented and are not legally binding. The Q&As are not intended to be a replacement for careful study of IDEA and the regulations. The statute, regulations, and other important documents related to IDEA and the regulations are found at http://idea.ed.gov.
The final regulations incorporate new requirements regarding identifying children with specific learning disabilities (SLD) and early intervening services (EIS). With regard to identifying children with SLD, the regulations: (1) allow a local educational agency (LEA) to consider a child’s response to scientific, research-based intervention as part of the SLD determination process; (2) allow States to use other alternative research-based procedures for determining whether a child has a SLD; (3) provide that States may not require the use of a severe discrepancy between intellectual ability and achievement to determine whether a child has a SLD; and (4) require a public agency to use the State criteria in determining whether a child has a SLD and discuss the role that response to scientific research-based interventions plays in a comprehensive evaluation process.
The regulations regarding EIS permit an LEA to use not more than 15% of its IDEA Part B funds to develop and implement EIS. The regulations also indicate how EIS funds can be expended; on whom the EIS funds can be spent; the reporting requirements for EIS; special provisions regarding disproportionality based on race and ethnicity and how that affects an LEA’s use of EIS funds; and the relationship of EIS to maintenance of effort. The purpose of the questions and answers that follow is to provide additional guidance to States and LEAs in complying with the requirements regarding EIS and response to scientific research-based interventions to identify students with a SLD.
Authority: The requirements for using a process based on a child’s response to scientific, research-based intervention when determining that the child is a child with a specific learning disability are found in the regulations at 34 CFR §§300.307, 300.309 and 300.311. The requirements for early intervening services are found in the regulations at 34 CFR §§300.205(d), 300.208(a)(2), 300.226 and 300.646(b)(2).
Question A-1: Please clarify how a child with a disability who is already receiving special education and related services also would be eligible to receive services using response to intervention (RTI) strategies.
Answer: Response to intervention (RTI) strategies are tools that enable educators to target instructional interventions to children’s areas of specific need as soon as those needs become apparent. There is nothing in IDEA that prohibits children with disabilities who are receiving special education and related services under IDEA from receiving instruction using RTI strategies unless the use of such strategies is inconsistent with their individualized education programs (IEPs). Additionally, under IDEA, a public agency may use data gathered through RTI strategies in its evaluations and reevaluations of children with SLD. However, children with disabilities who are currently identified as needing special education and related services may not receive RTI services that are funded with IDEA funds used for EIS pursuant to 34 CFR §300.226. This is because EIS is “… for students in kindergarten through grade 12 (with a particular emphasis on students in kindergarten through grade three) who are not currently identified as needing special education or related services, but who need additional academic and behavioral support to succeed in a general education environment.”
Question A-2: Why was RTI included in IDEA?
Answer: The reports of both the House and Senate Committees accompanying the IDEA reauthorization bills reflect the Committees’ concerns with models of identification of SLD that use IQ tests, and their recognition that a growing body of scientific research supports methods, such as RTI, that more accurately distinguish children who truly have SLD from those whose learning difficulties could be resolved with more specific, scientifically based, general education interventions. Similarly, the President’s Commission on Excellence in Special Education recommended that the identification process for SLD incorporate an RTI approach.
Question B-1: Is the use of funds for EIS required or permitted?
Answer: Generally, the use of funds an LEA receives under Part B of the Act for EIS is discretionary on the part of the LEA, except when an LEA has significant disproportionality based on race and ethnicity. Under 34 CFR §300.226, an LEA may not use more than 15% of the amount the LEA receives under Part B of the Act for any fiscal year, less any amount reduced by the LEA pursuant to 34 CFR §300.205, if any, in combination with other amounts (which may include amounts other than education funds), to develop and implement coordinated EIS. If a State identifies an LEA as having significant disproportionality based on race and ethnicity with respect to the identification of children with disabilities, the placement of children with disabilities in particular educational settings, or the incidence, duration, and type of disciplinary actions taken against children with disabilities, including suspensions and expulsions, the SEA must require the LEA to reserve the maximum amount of funds available to the LEA to provide EIS to children in the LEA, particularly, but not exclusively, to children in those groups that were significantly overidentified.
Question B-2: What does it mean to “reserve” funds for EIS?
Answer: The Department interprets “reserve” to mean that these funds can only be spent on EIS. The statute does not authorize LEAs to use the funds they must “reserve” for EIS for any other purpose.
Question B-3: Must the maximum amount of special education funds allowed for EIS be reserved only if significant disproportionality is the result of inappropriate identification?
Answer: No. The reservation of funds must occur whether or not the significant disproportionality was the result of inappropriate identification. In addition to identification, funds also would have to be reserved if significant disproportionality was found with respect to discipline or placement in particular educational settings.
Question B-4: If a State has identified significant disproportionality in an LEA, can the IDEA funds the LEA must use to address the issue be used to provide services to students who have already been found eligible for special education and related services?

Answer: No. Section 300.226(a) states that EIS is “… for students in kindergarten through grade 12 (with a particular emphasis on students in kindergarten through grade three) who are not currently identified as needing special education or related services, but who need additional academic and behavioral support to succeed in a general education environment.”
Question B-5: What is the relationship between EIS funds and maintenance of effort (MOE) funds?
Answer: LEAs that seek to reduce their local maintenance of effort in accordance with 34 CFR §300.205(d) and use some of their Part B funds for early intervening services under 34 CFR §300.226 must do so with caution because the local maintenance of effort reduction provision and the authority to use Part B funds for early intervening services are interconnected. The decisions that an LEA makes about the amount of funds it uses for one purpose affect the amount that it may use for the other. Appendix D of the Part B regulations [71 FR 46817] provides examples of how 34 CFR §300.205(d), regarding local maintenance of effort, and 34 CFR §300.226(a), regarding EIS funds, affect one another.
Question C-1: Must an LEA evaluate a child upon the request of the parent at any time during the RTI process? May a parent request an initial special education evaluation at any time during the RTI process?
Answer: If the LEA agrees with the parent that the child may be a child who is eligible for special education services, the LEA must evaluate the child. The Federal regulations at 34 CFR §300.301(b) allow a parent to request an evaluation at any time. If an LEA declines the parent’s request for an evaluation, the LEA must issue a prior written notice as required under 34 CFR §300.503(a)(2), which states, “written notice that meets the requirements of paragraph (b) of this section must be given to the parents of a child with a disability a reasonable time before the public agency refuses to initiate or change the identification, evaluation, or educational placement of the child or the provision of FAPE to the child.” The parent can challenge this decision by requesting a due process hearing to resolve the dispute regarding the child’s need for an evaluation.
Question C-2: May an LEA require that all children suspected of having a SLD first be assessed using an RTI process before an eligibility determination may be made?
Answer: If an LEA is using RTI for all its students, it may require the group established under 34 CFR §300.306(a)(1) and 34 CFR §300.308 for the purpose of determining the eligibility (eligibility group) of students suspected of having a SLD to review data from an RTI process in making an eligibility determination. Models based on RTI typically evaluate the child’s response to instruction prior to the beginning of the evaluation time period described in 34 CFR §300.301(c)(1), and generally do not require as long a time to complete an evaluation because of the amount of information already collected on the child’s achievement, including observation data. If the eligibility group determines that additional data are needed and cannot be obtained within the evaluation time period described in 34 CFR §300.301(c)(1), the parent and eligibility group can agree to an extension of the timeframe. However, as explained in Question C-1, parents can request an evaluation at any time, and the public agency must either obtain consent to evaluate and begin the evaluation, or, if the public agency declines the parent’s request, issue a prior written notice as required by 34 CFR §300.503(a)(2).
Question C-3: Section 300.309(a)(2)(i) states that the eligibility group may determine that a child has a specific learning disability if “the child does not make sufficient progress to meet age or State-approved grade-level standards in one or more” identified areas. Section 300.309(a)(2)(ii) states that the group may determine that a child has a specific learning disability if “the child exhibits a pattern of strengths and weaknesses in performance, achievement, or both, relative to age, State-approved grade level standards, or intellectual development” that the group determines is relevant to making an eligibility determination. Please explain how these two criteria differ from one another.
Answer: Section 300.309(a)(2)(i) reflects the use of the criterion that the child has not made sufficient progress in at least one of the following areas when using response to intervention as an aspect of the SLD identification process: oral expression, listening comprehension, written expression, basic reading skills, reading comprehension, mathematics calculation, and mathematics problem solving. Alternatively, based on 34 CFR
§300.309(a)(2)(ii), the group could consider variation in a child's performance, achievement, or both relative to age, State-approved grade-level standards, or intellectual development that is determined by the eligibility group to be relevant to identification of an SLD using appropriate assessments. Under this criterion, a pattern of strengths and weaknesses in performance, achievement, or both relative to age, State-approved grade-level standards, or intellectual development would be part of the evidence that a child has a learning disability.
Question C-4: The regulations require an SEA to adopt criteria for determining if a child has a specific learning disability (34 CFR §300.307(a)). Does this preclude the SEA from mandating RTI as the sole criterion used to determine if a child has a specific learning disability? Must an LEA follow the State-developed criteria for determining if a child has a specific learning disability?
Answer: An SEA must include a variety of assessment tools and may not use any single measure or assessment as the sole criterion for determining whether a child is a child with a disability, as required under 34 CFR §300.304(b).
However, an SEA could require that data from an RTI process be used in the identification of all children with SLD.
An LEA must comply with the criteria adopted by its SEA. The requirements at 34 CFR §300.307(a) require that a
State adopt criteria for determining whether a child has a specific learning disability. The Analysis of Comments and Changes accompanying the final Part B regulations, page 46649, clarifies, “… the Department believes that eligibility criteria must be consistent across a State to avoid confusion among parents and school district personnel. The Department also believes that requiring LEAs to use State criteria for identifying children with disabilities is consistent with the State's responsibility under section 612(a)(3) of the Act to locate, identify, and evaluate all eligible children with disabilities in the State.”
Question C-5: When implementing an evaluation process based on a child’s response to scientific, research-based intervention, the regulations require that a “public agency must promptly request parental consent to evaluate a child (34 CFR §300.309(c))” if the “child has not made adequate progress after an appropriate period of time (34 CFR §300.309(c)(1)).” Please define “promptly” and “adequate” in this context.
Answer: The Federal regulations under 34 CFR §300.309(c) require that if a child has not made adequate progress after an appropriate period of time, a referral for an evaluation must be made. However, the regulations do not specify a timeline for using RTI or define “adequate progress.” As required in 34 CFR §300.301(c), an initial evaluation must be conducted within 60 days of receiving consent for an evaluation (or if the State establishes a timeframe within which the evaluation must be completed, within that timeframe). Models based on RTI typically evaluate a child's response to instruction prior to the onset of the 60-day period, and
generally do not require as long a time to complete an evaluation because of the amount of data already collected on the child's achievement, including observation data. A State may choose to establish a specific timeline that would require an LEA to seek parental consent for an evaluation if a student has not made progress that the district deemed adequate.
We do not believe it is necessary to define the phrase “promptly” because the meaning will vary depending on the specific circumstances in each case. There may be legitimate reasons for varying timeframes for seeking parental consent to conduct an evaluation. However, the child find requirements in 34 CFR §300.111 and section 612(a)(3)(A) of the Act require that all children with disabilities in the State who are in need of special education and related services be identified, located, and evaluated.
Therefore, it generally would not be acceptable for an LEA to wait several months to conduct an evaluation or to seek parental consent for an initial evaluation if the public agency suspects the child to be a child with a disability. If it is determined through the monitoring efforts of the
Department or a State that there is a pattern or practice within a particular
State or LEA of not conducting evaluations and making eligibility determinations in a timely manner, this could raise questions as to whether the State or LEA is in compliance with the Act.
Question C-6: May an eligibility determination be made using only information that was collected through an RTI process?
Answer: Section 300.304(b) states that in conducting an evaluation, a public agency must use a variety of assessment tools and strategies to gather relevant functional, developmental, and academic information about the child, including information provided by the parent, that may assist in determining eligibility, and not use any single measure or assessment as the sole criterion for determining whether a child is a child with a disability and for determining an appropriate educational program for the child.
The Department provided additional clarification regarding this issue in the Analysis of Comments and Changes section of the regulations, page
46648. This section states, “an RTI process does not replace the need for a comprehensive evaluation. A public agency must use a variety of data gathering tools and strategies even if an RTI process is used. The results of an RTI process may be one component of the information reviewed as part of the evaluation procedures required under 34 CFR §§300.304 and
300.305. As required in 34 CFR §300.304(b), consistent with section
614(b)(2) of the Act, an evaluation must include a variety of assessment
tools and strategies and cannot rely on any single procedure as the sole criterion for determining eligibility for special education and related services.”
Question D-1: Why don’t early intervening services apply to 3- to 5-year-olds?
Answer: Section 300.226(a) tracks the statutory language in section 613(f)(1) of the
Act, which states that early intervening services are for children in kindergarten through grade 12, with a particular emphasis on children in kindergarten through grade 3. Thus, LEAs may not use Part B funds to provide EIS to non-disabled preschool children.
Question E-1: Is the use of RTI required or just permitted?
Answer: Section 300.307(a)(2)-(3) requires that a State’s criteria for identification of specific learning disabilities:
Must permit the use of a process based on the child's response to scientific, research-based intervention; and
May permit the use of other alternative research-based procedures for determining whether a child has a specific learning disability.
Section 300.307(b) states that a public agency must use the State’s criteria in identifying children with specific learning disabilities. Thus, the State’s criteria must permit the use of RTI and may require its use, in addition to other assessment tools and strategies, for determining whether the child has a specific learning disability.
Question E-2: Does each LEA have to select either RTI or a discrepancy model to determine if a child is a child with a specific learning disability?
Answer: No. The State agency must adopt criteria regarding the determination of
SLD eligibility.
An SEA must include a variety of assessment tools and may not use any single measure or assessment as the sole criterion for determining whether a child is a child with a disability, as required under 34 CFR §300.304(b).
An LEA must comply with the criteria adopted by its SEA. Section
300.307(a) requires a State to adopt criteria for determining whether a child has a specific learning disability.
The Analysis of Comments and Changes section accompanying the
Federal regulations, page 46649, clarifies, “… the Department believes that eligibility criteria must be consistent across a State to avoid confusion among parents and school district personnel. The Department also believes that requiring LEAs to use State criteria for identifying children with disabilities is consistent with the State's responsibility under section
612(a)(3) of the Act to locate, identify, and evaluate all eligible children with disabilities in the State. We believe this provides the Department with the authority to require a public agency to use its State’s criteria in determining whether a child has an SLD, consistent with §§300.307 through 300.311.”
Question E-3: What services can be defined as early intervening services? For example, are physical therapy, occupational therapy, and assistive technology considered early intervening services?
Answer: State and local officials are in the best position to make decisions regarding the provision of early intervening services, including the specific personnel to provide the services and the instructional materials and approaches to be used. Nothing in the Act or regulations prevents
States and LEAs from including related services personnel in the development and delivery of educational and behavioral evaluations, services, and supports for teachers and other school staff to enable them to deliver coordinated, early intervening services.
Question F-1: Please define “significant disproportionality” in the context of EIS.
Answer: Each State has the discretion to define the term “significant disproportionality,” in the context of EIS, for the LEAs and for the State in general. In identifying significant disproportionality, a State may determine how much disproportionality is significant. However, the
State’s definition of “significant” must be based only on a numerical analysis, and may not consider factors such as the extent to which an
LEA’s policies and procedures comply with the IDEA or the compliance history of an LEA. Establishing a national standard for significant disproportionality is not appropriate because there are multiple factors at the State level to consider in making such determinations. For example,
States need to consider the population size, the size of individual LEAs, and composition of the State’s population. States are in the best position to evaluate those factors. The Department has provided guidance to States
on methods for assessing disproportionality. This guidance is found at: http://www.ideadata.org/docs/Disproportionality%20Technical%20Assistance%20Guide.pdf.
Question F-2: Will early intervening services data be reported in State Performance
Plans (SPP) or Annual Performance Reports (APRs)?
Answer: No. Section 300.226 directs LEAs to report EIS data to their SEA. It is not a part of the information that an SEA must report to the Department in its SPP or APRs.
Question F-3: For discipline purposes, would a student’s participation in an RTI process be considered a “basis of knowledge” under 34 CFR §300.534(b)?
Answer: Generally, no. Participation in an RTI process, in and of itself, would not appear to meet the “basis of knowledge” standards laid out in the Federal regulations at 34 CFR §300.534.
Question F-4: When an RTI model is implemented, can an incremental process be used to train individual schools so that over time the entire LEA is implementing the model, or must all the schools in the entire LEA be trained simultaneously?
Answer: If the State or LEA requires the use of a process based on the child's response to scientific, research-based intervention in identifying children with SLD, then all children suspected of having an SLD, in all schools in the LEA, would be required to be involved in the process. However, research indicates that implementation of any process, across any system, is most effective when accomplished systematically in an incremental manner over time. If the LEA chose to “scale up” the implementation of the RTI model gradually over time, as would be reasonable, the LEA could not use RTI for purposes of identifying children with SLD until RTI was fully implemented in the LEA. Therefore, it is unwise for a State to require the use of a process based on the child's response to scientific, research-based intervention before it has successfully scaled up implementation.
Question F-5: How might EIS funds be used to support a process determining whether a child has a specific learning disability and to address the needs of students who need additional academic and behavioral support to succeed in a general education environment?
Answer: If EIS funds are used to support a process to determine whether a child has a specific learning disability, there are three interacting identification/instructional dynamics that need to be considered: (1) identification of learning disabilities; (2) early intervening services; and
(3) response to intervention (RTI). While the Department does not subscribe to a particular RTI model, the core characteristics that underpin all RTI models are: (1) students receive high-quality, research-based instruction in their general education setting; (2) student performance is continuously monitored; (3) all students are screened for academic and behavioral problems; and (4) students receive multiple levels (tiers) of instruction that are progressively more intense, based on the student’s response to instruction.
For example, an RTI model with a three-tier continuum of school-wide support might include the following tiers and levels of support: (1) Tier one (Primary Intervention): all students receive high-quality, scientific, research-based instruction in their general education setting. It would not be appropriate to use EIS funds for these activities, since these students do not need additional academic and behavioral support to succeed in a general education environment. (2) Tier two (Secondary Intervention): specialized small-group instruction for students determined to be at risk for academic and behavioral problems. It would be appropriate to use EIS funds to support these activities. (3) Tier three (Tertiary Intervention): specialized, individualized instructional/behavioral support for students with intensive needs. EIS funds could not be used if these students were currently receiving special education or related services.
Question F-6: Should services supported with EIS funds be scientifically based?
Answer: The No Child Left Behind Act and IDEA call on educational practitioners to use scientifically based research to guide their decisions about which interventions to implement. IDEA states that in implementing coordinated early intervening services an LEA may carry out activities that include--
(1) Professional development (which may be provided by entities other than LEAs) for teachers and other school staff to enable such personnel to deliver scientifically based academic and behavioral interventions, including scientifically based literacy instruction, and, where appropriate, instruction on the use of adaptive and instructional software; and
(2) Educational and behavioral evaluations, services, and supports, including scientifically based literacy instruction.