Massachusetts Part B
Annual Performance Report for
FFY 2005
Submitted to the
Office of Special Education Programs:
February 1, 2007
Massachusetts Department of Education
350 Main Street, Malden, MA 02148
Phone 781-338-3000 TTY: N.E.T. Relay 800-439-2370
www.doe.mass.edu
Massachusetts Part B Annual Performance Report (MA APR) for FFY 2005
Submitted February 1, 2007
Part B Massachusetts Annual Performance Report (MA APR)
for FFY 2005
Table of Contents

Cover Letter / Overview of MA APR Development

Monitoring Priority: FAPE in the LRE
  Indicator #1: Graduation Rates
  Indicator #2: Drop-Out Rates
  Indicator #3: Assessment
  Indicator #4: Suspension/Expulsion
  Indicator #5: School Age LRE
  Indicator #6: Preschool LRE

Effective General Supervision / Effective Transition
  Indicator #12: Early Childhood Transition

Effective General Supervision / General Supervision
  Indicator #15: Identification and Correction of Noncompliance
  Indicator #16: Complaint Timelines
  Indicator #17: Due Process Timelines
  Indicator #19: Mediation Agreements
  Indicator #20: State Reported Data

Attachment
  Attachment 1: Report of Dispute Resolution for FFY2005
Note: Further information on the Indicators listed above and complete information for Indicators 7, 8, 9,
10, 11, 13, 14, and 18 can be found in the Massachusetts State Performance Plan (MA SPP) found at
http://www.doe.mass.edu/sped/spp/.
Submitted February 1, 2007
U.S. Department of Education
ATTN: Janet Scire / Mail Stop 2600
7100 Old Landover Road
Landover, MD 20785-1506
Dear Ms. Scire,
Enclosed is the Massachusetts Annual Performance Report for FFY2005 (MA APR). The MA APR
responds directly to the indicators identified by the Office of Special Education Programs (OSEP) in
Information Collection 1820-0624, Part B State Performance Plan (SPP) and Annual Performance
Report (APR). The MA APR provides information on Indicators 1, 2, 3, 4a, 5, 6, 12, 15, 16, 17, 19, and
20.
The Massachusetts Department of Education (MASSDE) has engaged in a variety of activities to obtain
broad input from stakeholders on the development of the MA APR. MASSDE convened the
Massachusetts Statewide Special Education Steering Committee – which consists of state special
education advisory council members, key MASSDE personnel, local education officials, parents,
advocates, and representatives from higher education, charter schools, approved private special
education schools, and adult service agencies – to review data, measure progress against the targets,
examine methodologies, and identify key activities as appropriate for each of these indicators.
Additionally, MASSDE has formed targeted workgroups focused on each indicator. These workgroups
incorporate a wide variety of stakeholders who communicate through the year to help guide
Massachusetts’ work in each area.
Regarding public dissemination, the completed MA APR will be made widely available for public
discussion. This will be accomplished by broad discussion in workgroups (as previously mentioned) and
at the statewide advisory council meeting and other conference and group discussion opportunities.
Additionally, MASSDE will post the MA APR on the MASSDE website, and distribute hard copies of the
report to key constituents and the media.
Additionally, MASSDE is currently working with our stakeholder groups to determine the best way to
publicly report this data at an LEA level. The data will be made available through the School and District
Profiles section of the MASSDE website (http://profiles.doe.mass.edu/). Data will be presented in table
format and/or through an interactive mapping software program that will allow users to visually compare
LEA data more readily. The data for Indicators 1, 2, 3, 4a, 5, and 12 will be available on the MASSDE
website by June 1, 2007. Data for Indicator 6 will be available for public reporting at an LEA level during
the 2007-08 school year.
If questions arise or additional clarification is needed regarding the MA APR, please contact me at
781.338.3388 or mmmittnacht@doe.mass.edu.
Sincerely,
Marcia Mittnacht
State Director of Special Education
Special Education Planning and Policy Development Office
Massachusetts Department of Education
Monitoring Priority: FAPE in the LRE
Indicator 1: Percent of youth with IEPs graduating from high school with a regular diploma compared to
percent of all youth in the State graduating with a regular diploma.
(20 U.S.C. 1416 (a)(3)(A))
Measurement: The statewide Graduation Rate is the number of students in a cohort who graduate
in four years or less, divided by the number of first-time entering 9th graders in that cohort. The
denominator is adjusted so that students who transfer into Massachusetts’ public schools are added
to the original cohort and students who transfer out, or who are now deceased, are subtracted from
the original cohort. The quotient is multiplied by 100 to express the Graduation Rate as a
percentage.
FFY 2005 (2005-2006)
Measurable and Rigorous Target: Baseline Year. Students with IEPs Graduation Rate: 61.6%
Actual Target Data for 2005-2006:
2005-06 is the first year for which Massachusetts has calculated a Graduation Rate for students with
IEPs, non-IEP students, and all students. Baseline data are presented in the following table:
Group          # of Students in     # of Students in 2005-06 cohort       2005-06
               2005-06 cohort       who graduated in four years or less   Graduation Rate
IEP            13,462               8,291                                 61.6%
Non-IEP        61,035               51,154                                83.8%
All Students   74,497               59,445                                79.8%
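The adjusted-cohort arithmetic defined in the Measurement section can be sketched as follows. This is a minimal illustration, not MASSDE's actual computation: the `transfers_in`, `transfers_out`, and `deceased` parameters are hypothetical inputs to the stated adjustment, and the reported 2005-06 cohort counts are already adjusted, so they are passed through with no further adjustment.

```python
# Sketch of the adjusted-cohort Graduation Rate defined above.
# The cohort denominator adds students who transfer in and subtracts
# students who transfer out or are deceased; the quotient is times 100.

def graduation_rate(graduates, first_time_9th,
                    transfers_in=0, transfers_out=0, deceased=0):
    """Graduates divided by the adjusted first-time 9th-grade cohort, as a percent."""
    adjusted_cohort = first_time_9th + transfers_in - transfers_out - deceased
    return 100.0 * graduates / adjusted_cohort

# Reported 2005-06 figures (cohorts already adjusted):
print(round(graduation_rate(8291, 13462), 1))   # students with IEPs -> 61.6
print(round(graduation_rate(51154, 61035), 1))  # non-IEP students -> 83.8
print(round(graduation_rate(59445, 74497), 1))  # all students -> 79.8
```

Running the reported counts through the formula reproduces the rates in the table above.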
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
This is the Baseline Year.
MASSDE is currently working with our stakeholder groups to determine the best way to publicly report this
data at an LEA level. The data will be made available through the School and District Profiles section of
the MASSDE website (http://profiles.doe.mass.edu/). Data will be presented in table format and/or
through an interactive mapping software program that will allow users to visually compare LEA data more
readily. The data will be available on the MASSDE website by June 1, 2007.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
Massachusetts is calculating and reporting a statewide Graduation Rate for the first time this year and
consequently, the SPP has been completely revised. MASSDE previously used the competency
determination (CD) attainment rate as an interim measure and best approximation of a graduation rate.
The Measurable and Rigorous Targets have been revised in the State Performance Plan to reflect the
change from CD Rate to Graduation Rate. However, the new Targets are still based on initial
recommendations from the Massachusetts Steering Committee. Modest Targets reflect the need to
consider significant variation in district-level Graduation Rates. Targets also reflect input from various
interest groups concerning the challenges of effectively closing the gap in this performance indicator.
Several state-level improvement activities support districts in serving students with disabilities and in
helping all students reach graduation. The following are some of the improvement activities that are
expected to help increase the Graduation Rate for students with disabilities:
• Non-competitive grants for school districts to fund Special Education-related professional
  development, including induction and mentoring programs for new special educators;
• Special Education Summer Institutes with topics such as "Assistive Technology and Universal
  Design in the Classroom", "Response to Intervention", and "Assessing English Language Learners
  with Disabilities", and Special Education Leadership Academies for Administrators;
• Project FOCUS Academy, a pilot distance-learning program with courses for educators in
  Universal Design for Learning, Transition/Post-School Outcomes, and Positive Behavioral
  Interventions and Supports, which supports participants in making school-wide changes that
  benefit students with disabilities; and
• Alternative Education and Dropout Prevention programs that support "at-risk" students.
Monitoring Priority: FAPE in the LRE
Indicator 2: Percent of youth with IEPs dropping out of high school compared to the percent of all youth
in the State dropping out of high school.
(20 U.S.C. 1416 (a)(3)(A))
Measurement: The dropout rate is the number of students who drop out over a one-year period,
from July 1 to June 30, who do not return to school by October 1st of the next school year, divided
by the total enrollment of students, times 100. The measurement for youth with an IEP is the same
as the measurement for youth without an IEP.
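The one-year dropout-rate definition above can be sketched in a few lines. The counts below are purely illustrative, and the `returned_by_oct1` parameter is a hypothetical input representing dropouts who re-enrolled by October 1 and therefore do not count against the rate.

```python
# Sketch of the annual dropout rate defined above: students who drop out
# during the July 1 - June 30 reporting year and do not return by October 1
# of the next school year, divided by total enrollment, times 100.

def dropout_rate(dropouts, returned_by_oct1, total_enrollment):
    still_out = dropouts - returned_by_oct1  # returnees are not counted
    return 100.0 * still_out / total_enrollment

# Illustrative: 5,600 dropouts, 400 of whom returned, out of 100,000 enrolled.
print(round(dropout_rate(5600, 400, 100000), 1))  # -> 5.2
```

The same formula applies to students with and without IEPs; only the population counted changes.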
FFY 2005 (2005-2006)
Measurable and Rigorous Target: Maintain current dropout level of 5.6% for students with IEPs
Actual Target Data for 2005-2006: The baseline year (2004-05) dropout rate for students with IEPs was
5.6%. In 2003-04, the dropout rate for students with IEPs was 5.4%. An analysis of the 2004-05 dropout
data and the change from 2003-04 follows in the next section.
Actual target data for 2005-06 was not available at the time of this reporting. MASSDE expects to have
this information in Spring 2007.
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
In 2004-05, the dropout rate for students without an IEP was 3.5% and for all students it was 3.8%, an
increase of 0.1 percentage points from 2003-04 for each group. The dropout rate for students with IEPs
increased 0.2 percentage points in 2004-05, to 5.6%.
[Figure: Change in Dropout Rate. Students with IEPs: 5.4% (2003-04) to 5.6% (2004-05); students
without IEPs: 3.4% (2003-04) to 3.5% (2004-05).]
In both the 2003-04 and 2004-05 school years, students with an IEP comprised 14% of the high school
population, but about 21% of dropouts.
[Figure: Dropout Rate vs. Rate of Students with IEPs in High School. Students with an IEP who dropped
out, as a share of all dropouts: 20.9% (2003-04), 21.3% (2004-05); students with an IEP as a share of all
high school students: 14.3% (2003-04), 14.5% (2004-05).]
In 2004-05 the dropout rate for all students was higher at each successive grade level, from a low of 3.0%
in 9th grade to a high of 4.7% in 12th grade. The dropout rate for all students held constant in grade 10
and decreased in grade 12 (from 4.8% in 2003-04). Twenty-eight percent (28%) of all grade 11
students, and 34% of all grade 12 students, who had not achieved a competency determination dropped
out in 2004-05.
In 2004-05, males dropped out at a higher rate than females (4.4% versus 3.2%). For each
gender, this was an increase of 0.1 percentage points from 2003-04. Hispanics had the highest dropout
rate (9.1%) among racial/ethnic groups, and the greatest increase (+0.8 percentage points) from 2003-04.
Whites had the lowest dropout rate in 2004-05 (2.8%), unchanged from the previous school year. The
dropout rate for African-Americans also held constant, at 6.2%, and it decreased among Asians and
Native Americans.
In response to the increase in the dropout rate from 2002-03 (3.3%) to 2003-04 (3.7%), MASSDE initiated
an open-ended survey of school district administrators to better understand and address the issue of
students dropping out. One hundred five school districts responded to questions about: 1)
the perceived reasons why students drop out, 2) the steps the district has taken to improve high
school retention and graduation, 3) the perceived biggest challenge to improving the state dropout rate,
and 4) suggestions for how MASSDE could curb the dropout problem statewide. As a result of the survey,
several improvement activities have been added to the SPP, including focus groups with students who
have dropped out or are at risk of dropping out, and a statewide graduation summit.
Massachusetts is currently working with Achieve, Inc. to reduce the number of students who are dropping
out, improve students' readiness for college or employment, and increase the graduation rate. This
partnership is part of the Staying the Course: High Standards and Improved Graduation Rates project
supported by the Carnegie Corporation of New York, and will create a more coherent and aligned use of
high school reform strategies and policies. Lessons learned from this project will be shared with other
interested states and lead districts. In addition, MASSDE is working with its strategic partners to seek
additional funds for improving high school graduation rates and reducing barriers to learning.
MASSDE initiated meetings with key stakeholders in March 2006 to obtain input on the SPP targets and
proposed improvement activities. Workgroup participants have considered the relationship of dropping
out to emotional impairment disability status, involvement with juvenile corrections, and attendance
policies as they apply to course credit and graduation requirements. They have also recommended that
future dropout initiatives include the middle school level. Some workgroup participants are also members
of the Massachusetts Special Education Steering Committee, which authorized the allocation of
discretionary funds for the Positive Behavioral Interventions and Supports grant program, and provided
feedback on the proposed public reporting of SPP data. Workgroup members were also notified of
NDPC-SD technical assistance offerings and consulted on the development of the district dropout
prevention resources survey.
Effective the 2006-07 school year, MASSDE is requiring school districts to designate a person at the
district level to serve as dropout prevention liaison. This person will be responsible for communicating
with MASSDE on issues regarding dropout prevention and recovery. All public school districts,
regardless of the district dropout rate and grade levels served, must identify a person in their district
administration for this role. The Dropout Prevention Liaison is familiar with district-wide efforts related to
dropout prevention and recovery. The responsibility defaults to the Superintendent if the district does not
identify an individual for the role. The Dropout Prevention Liaison will be the focus of the Spring 2007
survey of school district dropout prevention efforts, improvements, resources, and programs. The 2005-06
dropout rate data will be used to target districts with demonstrated progress reducing high rates, as
well as those with low dropout rates.
Additionally, MASSDE is currently working with our stakeholder groups to determine the best way to
publicly report this data at an LEA level. The data will be made available through the School and District
Profiles section of the MASSDE website (http://profiles.doe.mass.edu/). Data will be presented in table
format and/or through an interactive mapping software program that will allow users to visually compare
LEA data more readily. The data will be available on the MASSDE website by June 1, 2007.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
The targets for this indicator in the SPP submitted December 1, 2005 were based on 2003-04 data, as it
was the most current available at the time. The February 1, 2007 SPP was updated with the baseline
data from 2004-05. (The dropout rate for students with an IEP in 2003-04 was 5.4%; in 2004-05 it was
5.6%.) The targets for 2005-06 and 2006-07 were adjusted to 5.6% each (from 5.4% each); the targets
for subsequent years (FFY 2007 – 2010) remain unchanged. The Massachusetts Special Education
Steering Committee unanimously recommended these targets, and the extensive additions to the
schedule of improvement activities are designed to help meet the goal of reducing the 2007-08 dropout
rate to 5.3% (0.3 percentage points below the baseline).
Supplemental Improvement Activities not originally proposed in the SPP submitted December 1, 2005
have occurred in Years One and Two, and additional ones have been added to the schedule in Years
Three through Six. The aforementioned qualitative survey of districts (and associated report) was added
to the schedule of improvement activities as it originated after the SPP was submitted. The results of this
survey yielded additional follow-up activities such as focus groups and a graduation rate summit.
MASSDE took advantage of several other activities that presented themselves during the course of year
one (e.g., NDPC-SD technical assistance) and the SPP has been updated to reflect this. Also added to
the schedule of improvement activities were: data verification efforts, the development of an online
exchange community for soliciting stakeholder input, and a proposed Positive Behavioral Interventions
and Supports (PBIS) grant initiative to begin in 2007.
All originally scheduled improvement activities in 2005-06 and 2006-07 (to date) were completed.
Monitoring Priority: FAPE in the LRE
Indicator 3: Participation and performance of children with disabilities on statewide assessments:
A. Percent of districts that have a disability subgroup that meets the State’s minimum “n” size
meeting the State’s AYP objectives for progress for disability subgroup.
B. Participation rate for children with IEPs in a regular assessment with no accommodations; regular
assessment with accommodations; alternate assessment against grade level standards; alternate
assessment against alternate achievement standards.
C. Proficiency rate for children with IEPs against grade level standards and alternate achievement
standards.
(20 U.S.C. 1416 (a)(3)(A))
Measurement:
A. Percent = [(# of districts meeting the State’s AYP objectives for progress for the disability subgroup
(children with IEPs)) divided by the (total # of districts that have a disability subgroup that meets
the State’s minimum “n” size in the State)] times 100.
B. Participation rate =
a. # of children with IEPs in assessed grades;
b. # of children with IEPs in regular assessment with no accommodations (percent = [(b)
divided by (a)] times 100);
c. # of children with IEPs in regular assessment with accommodations (percent = [(c) divided
by (a)] times 100);
d. # of children with IEPs in alternate assessment against grade level achievement
standards (percent = [(d) divided by (a)] times 100); and
e. # of children with IEPs in alternate assessment against alternate achievement standards
(percent = [(e) divided by (a)] times 100).
Account for any children included in a but not included in b, c, d, or e above.
Overall Percent = [(b + c + d + e) divided by (a)].
C. Proficiency rate =
a. # of children with IEPs in assessed grades;
b. # of children with IEPs in assessed grades who are proficient or above as measured by
the regular assessment with no accommodations (percent = [(b) divided by (a)] times
100);
c. # of children with IEPs in assessed grades who are proficient or above as measured by
the regular assessment with accommodations (percent = [(c) divided by (a)] times 100);
d. # of children with IEPs in assessed grades who are proficient or above as measured by
the alternate assessment against grade level achievement standards (percent = [(d)
divided by (a)] times 100); and
e. # of children with IEPs in assessed grades who are proficient or above as measured
against alternate achievement standards (percent = [(e) divided by (a)] times 100).
Account for any children included in a but not included in b, c, d, or e above.
Overall Percent = [(b + c + d + e) divided by (a)].
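The Indicator 3B participation computation defined above can be sketched as follows. The counts are hypothetical, the dictionary keys are illustrative labels of my choosing, and the overall figure is expressed as a percent, consistent with the per-category formulas and the actual data reported below.

```python
# Sketch of the Indicator 3B participation rate: categories mirror items
# (a)-(e) in the Measurement section, and any children counted in (a) but
# not in (b)-(e) are accounted for separately.

def participation(a, b, c, d, e):
    """Per-category percents, overall participation percent, and the unaccounted count."""
    pct = lambda n: 100.0 * n / a
    return {
        "no_accomm": pct(b),         # regular assessment, no accommodations
        "with_accomm": pct(c),       # regular assessment, with accommodations
        "alt_grade_level": pct(d),   # alternate assessment, grade-level standards
        "alt_achievement": pct(e),   # alternate assessment, alternate standards
        "overall": pct(b + c + d + e),
        "not_assessed": a - (b + c + d + e),  # in (a) but not (b)-(e)
    }

result = participation(a=1000, b=600, c=300, d=20, e=56)
print(round(result["overall"], 1))  # -> 97.6
print(result["not_assessed"])       # -> 24
```

The Indicator 3C proficiency rate follows the same structure, counting only children scoring proficient or above in each category.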
Measurable and Rigorous Targets and Actual Target Data for FFY 2005 (2005-2006):

             % Districts Meeting AYP    Participation Rate for      Proficiency Rate for
             for Disability Subgroup    Students with IEPs (3B)     Students with IEPs (3C)
             (3A)
             ELA       MATH             ELA       MATH              ELA       MATH
Targets      45%       37%              99%       99%               23.9%     14.3%
Actual       25%       19%              97.6%     97.7%             26.1%     15.5%
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
In order to meet the goal of students with disabilities achieving academic success, MASSDE worked with
districts and schools to analyze student assessment data and implement effective improvement plans
outlined in the State Performance Plan submitted on December 1, 2005.
During the 2005-06 school year several improvement activities were completed:
• School districts were provided with their AYP results detailing the outcomes for each subgroup.
• Detailed MCAS files were provided to the districts so that schools can create item-analysis charts
  to assist educators in identifying weaknesses and relevant relationships across student subgroups
  and subject areas, and to inform staff professional development.
• The Massachusetts Statewide Special Education Steering Committee met in December 2006 to
  review current data and activities and give input on methods of data analysis.
• The targeted workgroup focused on this indicator met to review the improvement activities and
  results, to revise technical assistance, and to consider dissemination of activities.
• Superintendents and principals previewed preliminary 2006 Adequate Yearly Progress (AYP)
  data for schools and reported discrepancies.
• Districts and schools making substantial gains in students' achievement were identified
  (Compass awards) and best practices were disseminated.
• Analysis of the assessment data for students with disabilities, including testing accommodations
  and the impact of key factors (e.g., disability type, educational environment, level of need) on
  student performance, was completed.
• School Panel reviews were conducted to determine districts that could be identified as
  under-performing and in need of assistance.
• Training and technical assistance, including analysis of student assessment data and
  development of school improvement plans, were provided to districts and schools identified as
  under-performing or in need of improvement.
Analysis for Indicator 3A:
The targets set for 2005-06 maintained the baseline level of performance obtained for 2004-05. The data
for 2005-06 shows a slippage in the percentage of districts making AYP for special education subgroups
in English/Language Arts and Mathematics:
• In ELA, the percentage decreased from 45% to 25%; and
• In Mathematics, the percentage decreased from 37% to 19%.
The decrease in the number of districts making AYP for the special education subgroup in both English
Language Arts and Mathematics is partly due to the fact that the determination of AYP for districts
changed from the baseline year. In 2004-05 a district was determined to make AYP for a
subgroup if AYP was made at any grade level. In 2005-06, in order for a district to make AYP for a
subgroup, it must make AYP at all three grade levels (i.e., elementary, middle, and high school).
Additionally, much of the slippage is likely due to the AYP target increasing as required under NCLB. The
number of districts not making AYP for other subgroups also increased in 2005-06 and is indicative of the
largest national challenge to the NCLB accountability provisions. Although MASSDE is disappointed at
this slippage, we believe that students with disabilities in Massachusetts continue to perform at generally
high levels and we are proud of Massachusetts’ strong performance as compared to other states in this
regard. We set high standards and have continued to strive for them. MASSDE will continue to work with
the focused indicator workgroup to identify the performance gaps in our schools and districts that require
attention. The next step is to work with educators to take the information from the accountability reporting
and use it to improve the performance of the students unable to meet their targets.
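The rule change described above can be sketched in two lines: the 2004-05 determination required AYP at any one grade level, while the 2005-06 determination requires it at all three. The per-level results below are hypothetical booleans standing in for the actual determinations.

```python
# Sketch of the AYP subgroup determination change described above.
# Hypothetical district: made AYP at elementary and high school, not middle.
district_levels = {"elementary": True, "middle": False, "high": True}

# 2004-05 rule: AYP for the subgroup if ANY grade level made AYP.
made_ayp_2004_05 = any(district_levels.values())

# 2005-06 rule: AYP for the subgroup only if ALL three levels made AYP.
made_ayp_2005_06 = all(district_levels.values())

print(made_ayp_2004_05)  # True  (counted as making AYP under the old rule)
print(made_ayp_2005_06)  # False (same results, stricter determination)
```

The same district results thus flip from making to not making subgroup AYP purely because of the stricter determination, which is part of why the 2005-06 percentages dropped against the 2004-05 baseline.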
Analysis for Indicator 3B:
The baseline data (2004-05) indicated the participation rate for students with IEPs on statewide
assessments was better than 99%, which MASSDE characterized as full participation. The baseline
participation rate on the ELA assessment was 99.3%, and 99.4% on the Mathematics assessment.
However, the 2005-06 data reflect slippage in the participation rate of students with IEPs on the statewide
assessment: 97.6% on the ELA assessment and 97.7% on the Mathematics assessment. In spring 2006,
students were tested in Reading, English Language Arts (grades 4-8 and 10), Mathematics (grades 3-8 and
10), and Science and Technology/Engineering (grades 5 and 8). Six of these sixteen tests (English
Language Arts at grades 5, 6, and 8, and Mathematics at grades 3, 5, and 7) were newly introduced in
the 2005-06 school year in response to the requirements of the federal No Child Left Behind Act (NCLB). It is
possible that this increase in the number of tests increased the chances of students missing a test.
However, although MASSDE acknowledges that numerically this does represent slippage in comparison
to the targets set, MASSDE is satisfied that the participation rate continues to be excellent for students
with disabilities in state assessment programs.
Analysis for Indicator 3C:
The baseline data (obtained for 2004-05) indicated that 23.9% of students with IEPs scored “proficient” or
above in ELA, and 14.3% scored “proficient” or above in Mathematics. These data include all tested
grades combined. Targets for 2005-06 sought to maintain these levels. The FFY 2005 data indicates an
increase in the percent of students with IEPs scoring proficient and above in both ELA and Mathematics.
For ELA, 26.1% of students with IEPs scored "proficient" or above, and 15.5% scored "proficient" or above
in Mathematics. These data include all tested grades combined (grades 4-8 and 10 for ELA; grades 3-8
and 10 for Mathematics). This exceeded the target set for 2005-06 in both ELA and Mathematics.
MASSDE is particularly pleased with this performance, as MASSDE believes its “proficiency” standard is
a high standard indeed.
Public Reporting:
MASSDE is currently working with our stakeholder groups to determine the best way to publicly report
data for this indicator at an LEA level. The data will be made available through the School and District
Profiles section of the MASSDE website (http://profiles.doe.mass.edu/). Data will be presented in table
format and/or through an interactive mapping software program that will allow users to visually compare
LEA data more readily. The data will be available on the MASSDE website by June 1, 2007.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
The targets, improvement activities, timelines and resources for 2005-06 remain appropriate.
Monitoring Priority: FAPE in the LRE
Indicator 4: Rates of suspension and expulsion:
A. Percent of districts identified by the State as having a significant discrepancy in the rates of
suspensions and expulsions of children with disabilities for greater than 10 days in a school year;
and
B. Percent of districts identified by the State as having a significant discrepancy in the rates of
suspensions and expulsions of greater than 10 days in a school year of children with disabilities
by race and ethnicity.
(20 U.S.C. 1416(a)(3)(A); 1412(a)(22))
Measurement:
A. Percent = [(# of districts identified by the State as having significant discrepancies in the rates of
suspensions and expulsions of children with disabilities for greater than 10 days in a school year)
divided by the (# of districts in the State)] times 100.
B. Percent = [(# of districts identified by the State as having significant discrepancies in the rates of
suspensions and expulsions for greater than 10 days in a school year of children with disabilities
by race/ethnicity) divided by the (# of districts in the State)] times 100.
Include State’s definition of “significant discrepancy.”
INDICATOR 4A:
FFY 2005 (2005-2006)
Measurable and Rigorous Target: 1.8% of all districts are flagged for a review of their policies and procedures
Actual Target Data for 2005-2006:
INDICATOR 4A:

2005-06 Special Education Enrollment:                          160,752
2005-06 Suspension Rate:                                       0.916%
% of districts with 5 times the State Rate*:                   1.2% (4 districts)
% of districts with a finding of "significant discrepancy":    0%

*Districts with fewer than 30 students in special education were removed from this part of the analysis.
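The flagging step behind the figures above can be sketched as follows. This is an illustrative reading of the table, not MASSDE's actual procedure: the district counts are hypothetical, and the rule shown (flag a district whose rate reaches five times the 2005-06 state rate of 0.916%, excluding districts with fewer than 30 special education students) is inferred from the table's column headings and footnote.

```python
# Illustrative sketch of the Indicator 4A flagging rule suggested by the
# table above. A flag triggers a review of policies and procedures; it is
# not itself a finding of "significant discrepancy".

STATE_RATE = 0.916  # percent, 2005-06 (from the table above)

def flagged(suspended_over_10_days, sped_enrollment,
            state_rate=STATE_RATE, min_n=30):
    if sped_enrollment < min_n:
        return False  # excluded: fewer than 30 special education students
    district_rate = 100.0 * suspended_over_10_days / sped_enrollment
    return district_rate >= 5 * state_rate

print(flagged(6, 120))   # 5.0% vs 4.58% threshold -> True
print(flagged(2, 120))   # ~1.67% vs 4.58% threshold -> False
print(flagged(10, 25))   # excluded from the analysis -> False
```

In 2005-06 this screen flagged 4 districts (1.2%), none of which yielded a finding of "significant discrepancy" on review.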
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
INDICATOR 4A:
In order to meet the goal of students with disabilities achieving academic success, MASSDE worked with
districts and schools to analyze SSDR data and implement effective improvement plans outlined in the
State Performance Plan submitted on December 1, 2005.
During the 2005-06 school year several improvement activities were completed:
• The Massachusetts Statewide Special Education Steering Committee met in December 2005 to
  discuss possible definitions for "significant discrepancy".
• Met with the targeted workgroup focused on this indicator. Discussion included how to best
  ensure that districts are using the same definition of "suspension" when reporting their data.
• Implemented the current MASSDE State Improvement Grant, Project FOCUS Academy. This
  program focuses on creating professional development programs to help students with disabilities
  build sound career goals and learn skills to ensure successful post-secondary outcomes.
  Participants in nine (9) school districts attended the first semester of the Positive Behavioral
  Interventions and Supports course offered in the spring of 2006. This could have a long-term
  impact of reducing the expulsion and >10 days suspension rates for students with an IEP.
The baseline data (2004-05) indicated that 1.8%, or 6 districts, were flagged for a review of policies and
procedures; that baseline rate was also the target set for 2005-06. The data for the reporting year,
2005-06, shows a decrease in the number of districts flagged for a review of policies and procedures.
MASSDE is pleased with the decrease, while at the same time we continue to upgrade our activities with
regard to ensuring that the data is appropriately reported. No findings of "significant discrepancy" were
made for the four districts flagged for this reporting year when policies and procedures were reviewed.
MASSDE continues to consider the main effort in this area to rest with appropriate procedures to ensure
good reporting and an effective review of policies and procedures.
Additionally, MASSDE is currently working with our stakeholder groups to determine the best way to
publicly report this data at an LEA level. The data will be made available through the School and District
Profiles section of the MASSDE website (http://profiles.doe.mass.edu/). Data will be presented in table
format and/or through an interactive mapping software program that will allow users to visually compare
LEA data more readily. The data will be available on the MASSDE website by June 1, 2007.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
INDICATOR 4A:
The 2004-05 SSDR data was not available in time to be submitted in the MA SPP in December 2005; the data submitted at that time was 2003-04 data. The 2004-05 SSDR data is now available, and the SPP for Indicator 4a, with proposed targets, improvement activities, timelines, and resources, has been revised and rewritten.
Monitoring Priority: FAPE in the LRE
Indicator 5: Percent of children with IEPs aged 6 through 21:
A. Removed from regular class less than 21% of the day; 1
B. Removed from regular class greater than 60% of the day; or
C. Served in public or private separate schools, residential placements, or homebound or hospital
placements.
(20 U.S.C. 1416(a)(3)(A))
Measurement:
A. Percent = [(# of children with IEPs removed from regular class less than 21% of the day) divided
by the (total # of students aged 6 through 21 with IEPs)] times 100.
B. Percent = [(# of children with IEPs removed from regular class greater than 60% of the day)
divided by the (total # of students aged 6 through 21 with IEPs)] times 100.
C. Percent = [(# of children with IEPs served in public or private separate schools, residential
placements, or homebound or hospital placements) divided by the (total # of students aged 6
through 21 with IEPs)] times 100.
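The three formulas above share a single denominator (the total count of students aged 6 through 21 with IEPs). As a quick illustration, the calculation can be sketched as follows; note that all child counts below are hypothetical figures invented for the example, not actual Massachusetts 618 data:

```python
# Sketch of the Indicator 5 measurement. All counts are hypothetical,
# chosen only to illustrate the shared-denominator calculation.

total_iep_6_21 = 160_000     # total # of students aged 6-21 with IEPs
removed_lt_21pct = 78_560    # removed from regular class < 21% of the day (5A)
removed_gt_60pct = 25_120    # removed from regular class > 60% of the day (5B)
separate_settings = 10_720   # separate schools / residential / home-hospital (5C)

def pct(count: int, total: int = total_iep_6_21) -> float:
    """[(count) divided by (total # of students with IEPs)] times 100."""
    return round(100 * count / total, 1)

print(pct(removed_lt_21pct))   # Indicator 5A
print(pct(removed_gt_60pct))   # Indicator 5B
print(pct(separate_settings))  # Indicator 5C
```

Because each indicator divides by the same total, the three percentages can be compared directly across reporting years even as total enrollment changes.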
Measurable and Rigorous Target
FFY 2005 (2005-2006):
• % of children with IEPs aged 6 through 21 in full inclusion (Indicator 5A): 43.4%
• % of children with IEPs aged 6 through 21 in substantially separate placements (Indicator 5B): 16.2%
• % of children with IEPs aged 6 through 21 in out-of-district placements (Indicator 5C): 6.8%
Actual Target Data for 2005-2006:
• Indicator 5A (full inclusion): 49.1%
• Indicator 5B (substantially separate placements): 15.7%
• Indicator 5C (out-of-district placements): 6.7%
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
The percent of students in full inclusion increased in 2005-06 (from 43.4% in 2004-05), while the percent of students in substantially separate placements (16.2% in 2004-05) and out-of-district placements (6.8% in 2004-05) decreased. Although the percent of students in substantially separate programs and out-of-district programs was expected to be stable, MASSDE considers these numbers to be a good indication that districts are improving in their efforts to include students with disabilities in the least restrictive environment. Despite the target expectation of increasing percentages for students in substantially separate placements, which is expected as a result of the decrease over time in out-of-district placements, MASSDE considers that it has met its 2005-06 targets in all three placement areas.
1 At the time of the release of this package, revised forms for collection of 618 State reported data had not yet been approved. Indicators will be revised as needed to align with language in the 2005-2006 State reported data collections.
In order to explain our progress, it is informative to look at our identified improvement activities, which
require us to analyze our data in a number of ways.
Rates of full inclusion increased across all disability categories. The top five disability categories that have increased the most in full inclusion over the past year are:
• Sensory Vision (10.7 percentage points; 44.4% to 55.1%)
• Physical Disabilities (8.3 percentage points; 60.1% to 68.4%)
• Specific Learning Disabilities (6.2 percentage points; 49.4% to 55.6%)
• Health (5.5 percentage points; 58.2% to 63.7%)
• Neurological (5.3 percentage points; 51.8% to 57.1%)
The chart below shows the change in full inclusion over the past year in percentage points for all disability
categories.
[Chart omitted: Change in Full Inclusion, in percentage points, for all disability categories.]
The top five disability categories in the substantially separate setting that have changed the most over the past year are:
• Physical (decreased by 2.8 percentage points; 13.0% to 10.2%)
• Sensory Vision (decreased by 1.8 percentage points; 12.3% to 10.4%)
• Multiple Disabilities (increased by 1.7 percentage points; 30.9% to 32.6%)
• Developmental (decreased by 1.4 percentage points; 18.9% to 17.5%)
• Communication (decreased by 0.7 percentage points; 9.3% to 8.6%)
The chart below shows the change in substantially separate settings over the past year in percentage
points for all disability categories.
[Chart omitted: Change in Substantially Separate Settings, in percentage points, for all disability categories.]
The percent of students in out-of-district placements did not change significantly. However, it is noteworthy that over the past year:
• Sensory Hard of Hearing increased by 5.0 percentage points (21.8% to 26.7%);
• Sensory Vision decreased by 3.2 percentage points (14.9% to 11.7%); and
• Multiple Disabilities increased by 1.5 percentage points (22.8% to 24.4%).
The chart below shows the change in out of district placements over the past year in percentage points
for all disability categories.
[Chart omitted: Change in Out of District Placements, in percentage points, for all disability categories.]
According to these trends, we are most significantly increasing LRE placements for students with sensory vision disabilities, as their placements in substantially separate settings and out-of-district placements are decreasing while full inclusion is increasing. These trends also suggest that we should take the next step of examining student-level trends and, based on those findings, determine the need for technical assistance in the disability categories that are most affected.
In March 2006 the LRE workgroup met to discuss this indicator. The priority that emerged was to identify ways of assuring that appropriate placement of students takes priority over the number of students placed in the least restrictive environment. This is a concern given the high prevalence of out-of-district placements, as previously indicated, especially for students who are sensory hard of hearing.
Massachusetts has a long, positive history with Special Education Approved Private Schools (APS) across the state; the workgroup recognized this and focused on the services students receive rather than where they receive them. The workgroup has provided recommendations to the state, including a review of the eligibility process and reviewing and providing technical assistance regarding the concept of LRE as a placement. The workgroup continues to be apprised of the state's work, and its input is sought on a regular basis.
During 2006, MASSDE worked with the North East Regional Resource Center (NERRC) to participate in
a survey to study the differences in LRE placements in states in the northeast region. The survey was
developed for six of the eight states in the northeast region (two states opted not to participate) and while
much of the survey was consistent across all states, each state was also allowed to add unique questions
relevant to their state. MA was allowed to include additional scenarios, state-specific language, and
technical assistance/training activities, to make the survey as meaningful as possible to our state. Eight
respondents within the state (special education local district personnel) independently assigned an LRE
classification to the fictitious students in the scenarios based on the MASSDE LRE placement
descriptions. In addition, questions were asked about availability and perceived value of state-delivered
training and technical assistance relative to LRE knowledge transfer.
The report we have received to date does not include data on the state-specific scenarios but does
provide some customized feedback for our state to consider. A compelling finding of this study shows
that not only did states across the region rate the same case studies differently, but even within states (all
except for one) the case studies were rated differently. This finding makes clear that even with
individual student data, there may be significant variation that is attributable to different understanding of
data elements, even among trained and knowledgeable people.
An additional finding showed great variation in MA (and other states) in knowledge about technical
assistance or training opportunities related to these data elements.
Based on this study three main points arise for MA.

A review of the language of our LRE definitions may prove beneficial;

Districts knowledge of the technical assistance opportunities available to them needs to be
heightened; and

More technical assistance needs to occur to ensure districts across the state are categorizing
similarly situated students in the same ways in our individual student data system.
Findings from this study will be shared with the state’s LRE workgroup and will begin to focus our
technical assistance and outreach to districts.
In addition, MASSDE has offered ongoing technical assistance to districts throughout the year on a
variety of topics. Specifically addressing the increase in LRE full inclusion, the 2006 Special Education
Summer Institutes focused on many strategies of evaluation and educational implementation that would
increase teachers’ abilities to include students with disabilities in the general education curriculum. This
included strategies for teaching students with physical disabilities and sensory vision impairments. The state also makes training materials available to districts on our website, along with an e-mail address for questions and grant opportunities to fund technical assistance.
Regarding public reporting, MASSDE is currently working with our stakeholder groups to determine the
best way to publicly report this data at an LEA level. The data will be made available through the School
and District Profiles section of the MASSDE website (http://profiles.doe.mass.edu/). Data will be
presented in table format and/or through an interactive mapping software program that will allow users to
visually compare LEA data more readily. The data will be available on the MASSDE website by June 1,
2007.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
Given the efforts and progress made over the past year, proposed targets for this indicator remain
appropriate. However, based on the information gleaned from analysis we have conducted regarding
our placement trends, the findings reported in the LRE study, and continuous technical assistance
provided by the state, some additional improvement activities added to the MA State Performance Plan
are warranted.
In 2006-07, MASSDE staff will analyze LRE data trends at the student level. The purpose of this analysis will be to determine whether students are moving from one placement to another and, if so, where they are going. Based on these findings, we will determine the need for technical assistance to enhance educators' abilities to educate all students in the general education curriculum. In addition, technical assistance will be provided regarding data collection to ensure that students who are in comparable placements are categorized similarly. Additional professional development opportunities will be provided, including Special Education Summer Institute trainings on strategies to include all students in the general education curriculum, effective evaluation procedures, and data collection. Finally, MASSDE will work to enhance districts' knowledge of the technical assistance opportunities available around least restrictive environments.
Monitoring Priority: FAPE in the LRE
Indicator 6: Percent of preschool children with IEPs who received special education and related services
in settings with typically developing peers (i.e., early childhood settings, home, and part-time early
childhood/part-time early childhood special education settings).
(20 U.S.C. 1416(a)(3)(A))
Measurement: Percent = [(# of preschool children with IEPs who received special education
services in settings with typically developing peers) divided by the (total # of preschool children with
IEPs)] times 100.
Measurable and Rigorous Target
FFY 2005 (2005-2006): 78.4%
Actual Target Data for 2005-2006: 78.3%
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
Over the past year the state has been working to implement the improvement activities identified in the
SPP submitted in December 2005. Year 1 activities included:
• Meet with the Massachusetts Statewide Special Education Steering Committee to review current data and identify potential improvement activities.
• Review and revise SIMS data collection to more accurately collect data on preschool placements; this will include creating standard definitions for preschool settings.
• Meet with targeted workgroups focused on this indicator. Discussion will include identifying the best method for reporting these data publicly at a district level.
Over the course of the year we were able to begin implementing all of these activities, and we will continue to build on our efforts in future years to create a more substantial positive impact.
As MASSDE has worked to improve in all indicator areas, we came upon a discovery regarding this
indicator. Despite the fact that districts had been given training on data collection (training is provided
every year), some districts have not assigned SASID numbers (SASID = State Assigned Student
Identification) to preschool students who are only receiving services in an itinerant setting. In addition,
some districts were incorrectly assigning educational environment codes. Therefore, our state data set
may be incomplete. We will be addressing these issues in our spring training described below.
MASSDE considers the target data for 2005-06 to essentially meet the target. The data collection method will change for the 2006-07 collection in this area; consequently, MASSDE has been devoting more attention to informing districts of this expected change, as well as considering the impact the revised data definitions will have on our work and results in this Indicator area.
MASSDE is currently working with our stakeholder groups to determine the best way to publicly report this data at an LEA level. Due to the change in data collection, LEA data will be available for public reporting for the 2007-08 school year for this Indicator.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
Revisions have been made to the improvement activities identified for this Indicator as a result of changes
in the required data collection. In September 2006 OSEP changed the definition for calculating
educational environment for preschool students with disabilities. Instead of determining the extent to
which special education services are provided in inclusive settings, districts will be required to determine
the extent to which the child participates in an inclusive early childhood setting overall regardless of
whether these settings serve as placement for special education services. This will require a number of
changes at the district level:
• IEP Teams will now have to ask parents whether the child participates in an early childhood program outside of early childhood special education and, if the answer is yes, the number of hours the child spends in these settings over the course of a week.
• The state's educational environment codes will now be determined based on the amount of time the child spends receiving special education services in inclusive settings plus the time spent in other inclusive early childhood settings outside of special education, in order to calculate the appropriate percentage of time the child participates in inclusive early childhood programs overall.
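Under the new definition, the time-based percentage could be computed along the lines of the sketch below. This is illustrative only: the function name, its inputs, and the notion of "total weekly program hours" are assumptions for demonstration, not MASSDE's or OSEP's actual coding rules.

```python
# Illustrative sketch only: names and inputs are assumptions, not MASSDE's
# actual educational-environment coding logic.

def inclusive_time_percent(sped_inclusive_hours: float,
                           other_inclusive_hours: float,
                           total_program_hours: float) -> float:
    """Percent of a preschooler's weekly program time spent in inclusive
    early childhood settings, counting both special education services
    delivered in inclusive settings and time in other inclusive settings."""
    if total_program_hours <= 0:
        raise ValueError("total_program_hours must be positive")
    inclusive = sped_inclusive_hours + other_inclusive_hours
    return 100.0 * inclusive / total_program_hours

# Hypothetical example: 6 h/week of special education in an inclusive
# preschool class, plus 9 h/week in a community early childhood program,
# out of 20 total weekly program hours.
pct = inclusive_time_percent(6.0, 9.0, 20.0)
print(round(pct, 1))
```

The key change the sketch captures is that hours in community early childhood programs outside of special education now count toward the numerator, which is why IEP Teams must ask parents about participation in those settings.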
Therefore, an immediate activity we are in the process of conducting is an effort to bring our Student
Information Management System (SIMS) up to speed with the new requirements in order to facilitate
accounting of placements for preschool students with disabilities according to the required data set.
SASID numbers will be assigned to all preschool students and new preschool codes will be developed
that align with the new definitions from OSEP. By updating the system in this way, districts will be better
prepared to collect and report the new information required.
In addition, training on the new formula and how to collect the needed information for calculating time in
regular early childhood programs during IEP meetings will occur for districts in the Spring. This will be a
collaborative effort across departments at MASSDE and across indicators. This training will be a high
priority, as we know that it is necessary to revise our data collection techniques in order to obtain
accurate, reliable data as required by OSEP.
An important aspect of the spring training will be to encourage districts to be in regular communication
with families and the various community services offered to preschool students in order to be up-to-date
on the extent to which preschool students with disabilities attend early childhood programs outside of their
special education services. The new formula for calculating time in regular early childhood programs calls
for a more transparent service delivery model for preschool students and therefore the state will work with
districts to expand lines of communication and find ways to enhance collaboration between special
education services, families, and community based programs.
In the fall of 2007, after we have collected data on preschool students using the new formula as laid out
by OSEP, we will review and analyze these data to determine if the new formula creates a further need to
revise our SPP.
Monitoring Priority: Effective General Supervision Part B / Effective Transition
Indicator 12: Percent of children referred by Part C prior to age 3, who are found eligible for Part B, and
who have an IEP developed and implemented by their third birthdays.
(20 U.S.C. 1416(a)(3)(B))
Measurement:
a. # of children who have been served in Part C and referred to Part B for eligibility determination.
b. # of those referred determined to be NOT eligible and whose eligibilities were determined prior
to their third birthdays.
c. # of those found eligible who have an IEP developed and implemented by their third birthdays.
d. # of children for whom parent refusal to provide consent caused delays in evaluation or initial
services.
Account for children included in a but not included in b, c or d. Indicate the range of days beyond
the third birthday when eligibility was determined and the IEP developed and the reasons for the
delays.
Percent = [(c) divided by (a – b – d)] times 100.
Measurable and Rigorous Target
FFY 2005 (2005-2006): 100%
Actual Target Data for 2005-2006: 77%, representing no increase over our baseline data.
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
The Massachusetts Department of Education (MASSDE) and Massachusetts Department of Early Education and Care (MASSEEC) worked together to improve the questions asked in the 619 grants in order to receive more accurate data. Upon receipt of the data, however, MASSDE still feels it is not appropriately collecting this information, since the Indicator speaks only to whether children referred prior to age 3 are receiving services at age three if they are found eligible. As we review more data, in addition to pinning down “when” services are received, we find we must pin down “how much prior to age three” the referral from Part C was received.
• Massachusetts has state requirements that require districts to respond within five school working days to referrals for special education evaluation. Generally, districts respond with a proposal to evaluate and seek the consent of the parent to the assessments the district proposes.
• We have to assume that the parent will require at least a calendar week to respond.
• Massachusetts has a state requirement that once consent for evaluation is received, the district has 45 school working days to conduct the eligibility evaluation and to convene a Team to determine eligibility, develop an IEP, and determine placement.
• The parent has 30 calendar days to respond to the proposed IEP and placement.
• Therefore, in order to appropriately determine whether a district has provided services in a timely way and by age 3, we need to capture referrals that occurred, at a minimum, approximately three months in advance of the child's third birthday (5 school working days + 1 calendar week from referral to consent + 45 school working days from consent to IEP proposal = 11 calendar weeks; + 30 calendar days from proposal to parental consent to services = roughly 15 calendar weeks = approximately 3 months).
• Massachusetts also has a regulation requiring that if the child is referred by the time he/she is two and a half, the district must complete all activities so that the student may receive services promptly when he/she is 3 years of age. We do not collect information at this time on how early the referral was received. Anecdotally, districts report that referrals often occur a month or two prior to the 3rd birthday. Although we can encourage districts to act promptly, we cannot require them to complete all activities in a shorter timeline than is available for all other students.
• Finally, in order for these required timelines to apply during the summer when schools are not in session, the consent to evaluate must be received within 30 to 45 school working days before the end of the school year.
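The timeline arithmetic above can be checked with a quick calculation. This sketch assumes 5 school working days per calendar week and ignores holidays and vacations, so it is a best-case (minimum) elapsed time, not the regulatory calculation itself:

```python
# Quick check of the referral-to-services timeline. Assumes 5 school working
# days per calendar week; holidays and vacations are ignored, so this is a
# lower-bound estimate.

SCHOOL_DAYS_PER_WEEK = 5

def weeks(school_days: int = 0, calendar_weeks: float = 0, calendar_days: int = 0) -> float:
    """Elapsed time expressed in calendar weeks."""
    return school_days / SCHOOL_DAYS_PER_WEEK + calendar_weeks + calendar_days / 7

total = (weeks(school_days=5)        # district response to the referral
         + weeks(calendar_weeks=1)   # parent considers proposed assessments
         + weeks(school_days=45)     # evaluation, Team meeting, proposed IEP
         + weeks(calendar_days=30))  # parent response to the proposed IEP

print(round(total, 1))  # roughly 15 calendar weeks, i.e. about 3 months
```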
After MASSDE had initiated its first year data collection activity for this Indicator, OSEP subsequently
added to the measurement “d. # of children for whom parent refusal to provide consent caused delays in
evaluation or initial services” and we did not collect that information this year. We will add that element
into next year’s grant application and rephrase questions and define criteria more clearly including
collecting information only on students whose referral prior to age 3 was made at least 3 months in
advance of the child’s 3rd birthday.
In 2005-06, a significantly greater proportion of children referred from Part C and found eligible for special education were served prior to turning three than in the previous year (up from 5% [n=216] in 2004-05 to 29% [n=1,096]).
In 2005-06:
a. 5,358 children have been served in Part C and referred to Part B for eligibility determination.
b. 919 of those referred were determined to be NOT eligible, and this was determined prior to their third birthdays. An additional 278 were determined to be NOT eligible after their third birthdays, for a total of 1,197 children found NOT eligible.
c. 2,934 of those found eligible are reported to have an IEP developed and implemented by their third birthdays.
d. We do not know the number of children for whom parent refusal to provide consent caused delays in evaluation or initial services. We did not ask for this new element and, therefore, do not use it in the final analysis.
Of the children referred from Early Intervention (n=5,358), we have information on 5,005 as to whether they were found eligible or ineligible for services. While we did not specifically ask districts to explain any difference between the number referred and the number that completed eligibility determination, there is a statewide difference of 353 children. We expect that these are families who removed their child from the eligibility determination process at some point, and we have, therefore, used 5,005 for the remainder of the analyses.
Based on these 5,005 children referred, and subtracting all of those found NOT eligible, we are considering a group of 3,808 children, of whom 2,934 received services by age 3, or 77% of the group referred and found eligible. As we did not properly collect data on when referrals were received, including the impact of summer on evaluation timelines, we believe that this figure significantly underrepresents the responsiveness and timeliness of Massachusetts districts.
We do note that of the 686 children served in Part C, found eligible for special education, and served after their third birthdays:
• 62 were served 2 school days after their birthdays;
• 139 were served 3-5 school days after their birthdays;
• 104 were served 6-10 school days after their third birthdays;
• 131 were served 2-4 weeks after their third birthdays; and
• 260 were served a month or longer after their third birthdays.
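The 77% figure reported above follows from the counts in items a through d; a quick check, with parent-refusal delays treated as zero since that element was not collected this year:

```python
# Check of the Indicator 12 calculation reported above. Variable names mirror
# the OSEP measurement (a, b, c, d).

a = 5005          # referrals from Part C with a known eligibility outcome
b = 919 + 278     # found NOT eligible (before and after the third birthday)
c = 2934          # IEP developed and implemented by the third birthday
d = 0             # parent-refusal delays: element not collected in 2005-06

eligible_group = a - b - d
percent = 100 * c / eligible_group    # OSEP formula: [(c) / (a - b - d)] x 100

print(eligible_group)   # 3808
print(round(percent))   # 77
```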
Reflecting on our identified improvement activities: MASSDE, MASSEEC, and the Massachusetts Department of Public Health (DPH) collaborated on a GSEG grant application that would have allowed for the development of a shared data system assigning State Assigned Student Identification (SASID) numbers to children in Part C, Early Intervention. The hope was that transitions would improve because all students would be entered into the state system at an early age, enabling referrals to be made in a timely manner so that districts could respond accordingly. While funding was not approved, the three agencies have not disbanded the effort and are continuing conversations to implement such a system. A Memorandum of Understanding has been drafted and distributed to the three agencies for review.
The updated Interagency Agreement between Head Start, DOE, EEC, and DPH is complete and is under review at the senior management level for endorsement. Once disseminated, this too will provide guidance to districts on timely transitions, eligibility determination, and services in place by age three.
We looked to another data source to augment the data we received through the Early Childhood Special Education grants. In examining half of the districts that completed a Coordinated Program Review (CPR) last year, we found that the majority had “partially met” criterion SE17: Initiation of services at age three and Early Intervention transition procedures. We will continue to work with Program Quality Assurance and provide technical assistance to districts.
As this is a compliance indicator, districts are expected to have children transitioning from Early Intervention found eligible and receiving services by age 3. DPH and EEC have begun regional meetings with EI and LEA staffs to discuss timelines, referring early, responding within 5 days, and the other practices needed to assess young children and determine eligibility. We will continue to meet with districts and EI programs.
We will complete and disseminate a guide to transition, co-authored by DPH and EEC, written for
parents, early childhood educators, LEAs and EI programs.
Additionally, MASSDE is currently working with our stakeholder groups to determine the best way to
publicly report this data at an LEA level. The data will be made available through the School and District
Profiles section of the MASSDE website (http://profiles.doe.mass.edu/). Data will be presented in table
format and/or through an interactive mapping software program that will allow users to visually compare
LEA data more readily. The data will be available on the MASSDE website by June 1, 2007.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
The targets, improvement activities, timelines and resources for 2005-06 remain appropriate.
Monitoring Priority: Effective General Supervision Part B / General Supervision
Indicator 15: General supervision system (including monitoring, complaints, hearings, etc.) identifies and
corrects noncompliance as soon as possible but in no case later than one year from identification.
(20 U.S.C. 1416 (a)(3)(B))
Measurement:
Percent of noncompliance corrected within one year of identification:
a. # of findings of noncompliance.
b. # of corrections completed as soon as possible but in no case later than one year from
identification.
Percent = [(b) divided by (a)] times 100.
For any noncompliance not corrected within one year of identification, describe what actions,
including technical assistance and enforcement actions that the State has taken.
Measurable and Rigorous Target
FFY 2005 (2005-2006): 100%
Actual Target Data for FFY 2005 (2005-2006):
• 238 findings of special education noncompliance were made through the Problem Resolution System (PRS) for complaints received between July 1, 2004 and June 30, 2005. 100% of these findings were corrected within one year of identification.
• 921 findings of special education noncompliance were made through the Coordinated Program Review (CPR) reports published between July 1, 2004 and June 30, 2005. 62% (567) of these findings have been corrected to date.
Overall:
a. 1,159 instances of noncompliance were identified through complaints or monitoring during 2004-2005.
b. 805 findings of noncompliance have been corrected, for an overall correction rate of 69%.
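The overall figures in (a) and (b) are the sums across the two identification routes, and the 69% rate follows from the OSEP measurement; a quick check:

```python
# Check of the Indicator 15 totals: findings and completed corrections summed
# across the two identification routes (PRS complaints and CPR reports).

prs_findings, prs_corrected = 238, 238   # 100% corrected within one year
cpr_findings, cpr_corrected = 921, 567   # 62% corrected to date

a = prs_findings + cpr_findings          # total findings of noncompliance
b = prs_corrected + cpr_corrected        # total corrections completed
percent = 100 * b / a                    # OSEP measurement: [(b) / (a)] x 100

print(a, b, round(percent))  # 1159 805 69
```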
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
A. Discussion of Improvement Activities Completed:
Year 1 (FFY 2005: 2005-06)
1. MASSDE staff from Program Quality Assurance Services (PQA) and Special Education Policy &
Planning (SEPP) met with members of the Massachusetts Statewide Special Education Steering
Committee on November 1, 2005. The meeting was centered on a discussion of the SPP and its
various indicators. Workgroups were set up for all of the indicators.
2. PQA administrators made clear to PQA staff, including supervisors and liaisons to schools and
districts, the ramifications of the new one-year requirement. Existing materials for the Coordinated
Program Review (CPR) system were revised to feature prominently the requirement that all
corrective action be completed as soon as possible, but in no case later than one year from
identification. These materials include the CPR General Information booklet, the CPR
Chairperson’s script for district orientation to CPR procedures, the cover letter to the CPR Final
Report, and the CPR Corrective Action Plan (CAP) form and instructions. (Because the timelines
for corrective action after a Mid-cycle Review (MCR) are dictated by PQA staff, there was no
need to notify school or district staff about the one-year requirement with respect to MCRs;
because corrective action after complaints is already consistently carried out within one year,
there was no need to revise the Problem Resolution System (PRS) materials.)
3. On March 21, 2006, the Indicator 15 Workgroup set up at the November 1, 2005 Steering
Committee meeting (see #1 above) met. Some of the issues the workgroup raised relative to the
timeline issue were: the problems for districts when the CPR Final Report is received during the
summer and staff are on vacation; the need for MASSDE to work collaboratively with the school
or district, supplying technical assistance before the CAP is submitted; and the need to streamline
MASSDE’s system of progress reporting on the corrective action undertaken. Consideration of all
of these issues is informing MASSDE’s work on reforming the CPR/CAP system to reach the
target of 100% for Indicator 15. (See # 5, below, and #1 - 3 under “Explanation of Progress or
Slippage,” below.)
4. MASSDE staff have continued to use PQA’s software system, Remedy’s Action Request System,
to track the date of closure of a complaint; in cases where corrective action was required as a
result of the complaint, this date serves as a marker for the successful completion of that
corrective action.
5. After exploring several means for tracking the completion of corrective action required through the
CPR/MCR system, PQA has decided to develop, through a contractor, a web-based monitoring
system for the transmission between PQA and schools/districts of self-assessments, compliance
reports, corrective action plans, progress reports, etc. Such a system will assist not only with
tracking the completion of corrective action, but also with streamlining all of PQA’s monitoring.
MASSDE has issued a Request for Response (RFR) for proposals to develop this web-based
monitoring system and has received multiple bids. On February 1, 2007, a bidder will be selected
for the contract. In the meantime, MASSDE has contracted with a project manager to act as
liaison for PQA with whoever is selected as the contractor; the project manager is now
familiarizing himself with PQA procedures. The plan is to pilot the web-based system in FFY 2008
(2008-09), with possible preparatory activities in FFY 2007 (2007-08). PQA has begun organizing
a stakeholder group of representatives from districts and charter schools that are interested in
being a part of that pilot to consult on the design and implementation of the web-based system.
Year 2 (FFY 2006: 2006-07)
MASSDE staff from Program Quality Assurance Services (PQA) and Special Education Policy & Planning
(SEPP) met with members of the Massachusetts Statewide Special Education Steering Committee on
December 5, 2006 for the purpose of obtaining feedback on the SPP Indicators. The Steering Committee
members had no comments on the data they reviewed for Indicator 15, and no questions or concerns.
They suggested the possibility of using incentives for meeting compliance requirements. MASSDE is
taking under consideration this possibility as something that might be incorporated into the web-based
model about to be developed.
B. Explanation of Progress or Slippage:
100% of findings of noncompliance made through the Problem Resolution System for complaints were
corrected within one year. This is the second reporting period for which MASSDE has maintained 100%
compliance in this area (as reported in the MA SPP for 2004-05, 100% of noncompliance from complaints
was corrected within one year).
Of the 921 findings of special education noncompliance that were made through the Coordinated
Program Review (CPR) reports published between July 1, 2004 and June 30, 2005, 62% (567) of these
findings have been corrected to-date. MASSDE reports the data in this manner because the standard of
correction within one year was not the standard originally applied to non-compliance found in 2004-05.
Therefore, the standard could not be fully applied to this time period.
In the SPP submitted December 1, 2005, MASSDE submitted as baseline data for CPR findings the
percentage of findings of noncompliance made in the CPRs conducted in 2001-02 that were corrected
(and sustained) by the time of the relevant 2004-05 Mid-cycle Review (MCR). That percentage was 68%.
The 62% reported for CPR findings corrected from 2004-05 does not constitute slippage from the 68%
figure submitted under Indicators 15A and 15B in the SPP, since 68% was a figure for the percentage of
findings corrected within roughly three years of the onsite visit. A direct comparison of the SPP baseline for this Indicator with the APR results is therefore not possible; the two figures measure different things. MASSDE anticipates that the APR due February 2008 will, for the first time, provide all data as
envisioned by the required measurement factors for this indicator. In the meantime, MASSDE believes
that good progress has been made in changing our compliance expectations to correction within one
year.
A review of the findings of non-compliance does not reveal any specific area of systemic non-compliance that would indicate the need for statewide action. MASSDE is struggling with its data sets, because the special education indicators encompass areas of non-compliance that may be large or small, with high or low impact, and the statewide data do not yet capture progress toward compliance effectively. As a consequence, the impact of any non-compliance can be reliably considered only at the district report level, and information on progress is available only through the progress reports received as part of the corrective action plan. MASSDE is considering how it might distinguish non-compliance that is directly relevant to student outcomes from "minor" procedural non-compliance, and how it might capture progress information, so that local data can be used more effectively in developing corrective action plans and progress reports, and statewide data can be used more effectively both for annual performance reporting and for state improvement activities.
C. Further Improvement Steps:
MASSDE can, however, describe steps it is taking, in addition to the improvement activities described
above, in order to approach the target of 100% correction of noncompliance within one year.
1. A PQA committee began meeting in 2005 to explore the possibility of converting to an
electronic system of corrective action plans (CAPs) and progress reports on the
implementation of corrective action—the expectation being that an electronic system would
lead to speedier completion and verification of corrective action. Plans were finalized in
December, 2006, for this new system, to be piloted in the spring of 2007 and, if successful,
used in the interim until the web-based system referred to above is instituted.
2. The new electronic CAP/progress report system includes technical assistance in the form of
pop-up guidance and examples for each part of the corrective action plan (the pop-up
guidance for “Expected Date of Completion for Each Corrective Action Activity” reminds local
personnel that the proposed date of completion should be well before the one-year
anniversary of the Final Report). The instructions also include the expectation that
school/district personnel will consult with PQA staff during the development of the CAP, the
idea being that the more consultation occurs during the CAP's development, the more readily approvable, and the more effective, the CAP will be. The new electronic system also provides for a period of 40 business days from the Final Report to the approval of the CAP, whereas the old CAP/progress report system provides for 45 business days.
3. For the schools and districts where the new electronic CAP/progress report system will not be
piloted, some of the changes being planned for the new electronic system are also being
made in the old CAP/progress report system materials—for instance, the same technical
assistance in the form of guidance and examples is being included, just not in pop-up form.
D. Actions taken when CPR findings of noncompliance were not corrected within one year of
identification (as noted above, all findings of noncompliance stemming from complaints were
corrected within one year):
As mentioned under Explanation of Progress or Slippage above, almost all of the corrective action for the findings of noncompliance made in CPR reports between July 1, 2004, and June 30, 2005, was proposed before the one-year requirement was instituted, under PQA's previous, non-time-based tracking system. Here is a partial description of that system, as included in the December 1, 2005 SPP:
When progress reports are received from the district, MASSDE issues a Review of Progress Report in
which it indicates, with the basis for its decision, whether the progress report has adequately shown that
the corrective action for a particular criterion has been implemented. If the progress report has not
adequately shown the implementation of the corrective action, MASSDE requires a further progress
report and indicates what that progress report must show.
Three years after the CPR, unless unique circumstances dictate either acceleration of the timetable or
postponement, PQA conducts the special education Mid-Cycle Review mentioned above to review again
areas that were present on the charter school or district’s previous CAP. PQA publishes a Coordinated
Program Review MCR Monitoring Report, and, if the school or district is again found non-compliant, in
whole or in part, with any of the areas previously corrected following the CPR, then PQA issues its own
Corrective Action Plan, which must be implemented by the school or district without delay. Failure to
implement MASSDE’s Corrective Action Plan within the required time may result in the loss of funds to
the school or district and/or other enforcement action by MASSDE.
For the future, the web-based monitoring system that PQA is about to develop (described under A., Year
1, #5 above), along with the electronic CAP/progress reporting system and the revisions to the previous
CAP/progress reporting system (described under B. 1- 3 above), will help MASSDE track, give technical
assistance, and use enforcement action where necessary in the context of the one-year compliance
timeframe under which it is now operating.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
The targets, improvement activities, timelines and resources for 2005-06 remain appropriate.
Monitoring Priority: Effective General Supervision Part B / General Supervision
Indicator 16: Percent of signed written complaints with reports issued that were resolved within 60-day
timeline or a timeline extended for exceptional circumstances with respect to a particular complaint.
(20 U.S.C. 1416(a)(3)(B))
Measurement: Percent = [(1.1(b) + 1.1(c)) divided by 1.1] times 100.
FFY
Measurable and Rigorous Target
2005
(2005-2006)
100%
Actual Target Data for 2005-2006: (226 reports within timeline + 36 reports within extended timelines) /
324 complaints with reports issued x 100 = 81%
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
Discussion of Improvement Activities Completed:
Year 1 (2005-06)
1. The system of more intensive monitoring was developed and implemented. Darlene Lynch, formerly
Assistant Administrator of Program Quality Assurance Services (PQA), now Director, has periodically
printed out logs for every staff person showing complaints assigned and timelines met or missed, then
disseminated these logs to the appropriate supervisor with comments on compliance or on needed
improvement.
2. Remedy’s Action Request System software was modified so that PQA staff now receive more frequent
reminders from the system that action with respect to a complaint is required.
3. A committee of PQA staff was formed that examined the possibility of reorganizing into two teams, one
of which would manage the Problem Resolution System (complaint resolution system). The committee
ultimately determined that it was not clear that such a reorganization would help with meeting timelines.
4. Discussion of recalcitrant schools and districts resulted in the revision of procedures for PQA. The
updated Problem Resolution System (PRS) Implementation Guide for PQA Staff, published in July 2006,
provides for an independent DOE investigation of the complaint where the local agency indicates that it
does not intend to provide a local report in response to the complaint. The updated guide also provides
that where the local agency indicates that it will not respond to the complaint within the required timelines,
PQA staff may, with the approval of a supervisor, issue a letter of finding based on the complainant’s
documentation.
Year 2 (2006-07)
1. PQA has prepared a statistical report on complaint resolution in connection with preparing this APR,
and has analyzed the reasons for noncompliance and barriers to timely compliance: see Explanation of
Progress or Slippage below.
2. PQA is moving ahead on implementing needed modifications to the Problem Resolution System. See
last paragraph of Explanation of Progress or Slippage below.
Explanation of Progress or Slippage:
Although MASSDE did not meet its target for 2005-06 of 100%, it made progress on the target of 100%
compliance. It increased the percent of signed written special education complaints with reports issued
that were resolved within 60 days, or within a timeline extended for exceptional circumstances with
respect to a particular complaint, from 69% in 2004-05 to 81% in 2005-06. This progress was due in large
part to the new system of more intensive monitoring of the management of complaints that has been
developed and implemented by PQA. (See paragraph 1 under Year 1 above.)
Where the timeline was not extended, the percent of signed written complaints that were resolved within
60 days was 89% (226 out of 255); where the timeline was extended, the percent of signed written
complaints that were resolved within the extension was 52% (36 out of 69).
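The Indicator 16 arithmetic above can be sketched as follows, using only the counts already reported (226 and 36 timely reports, 324 complaints with reports issued, 69 complaints with extended timelines):

```python
# Indicator 16: timeliness of complaint reports, 2005-06.
within_timeline = 226   # reports issued within the 60-day timeline
within_extension = 36   # reports issued within a properly extended timeline
reports_issued = 324    # complaints with reports issued
extended = 69           # complaints where the timeline was extended

# Overall: Percent = [(1.1(b) + 1.1(c)) / 1.1] x 100
overall = round(100 * (within_timeline + within_extension) / reports_issued)

# Breakdown by whether the timeline was extended
not_extended_rate = round(100 * within_timeline / (reports_issued - extended))
extended_rate = round(100 * within_extension / extended)
```

This reproduces the 81% overall figure, 89% for complaints without an extension (226 of 255), and 52% for complaints with an extension (36 of 69).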
It is clear from these numbers and percentages that most of the problem with belated resolution of
complaints in 2005-06 arose in connection with complaints where the timeline was extended. Upon
analysis, the reason was that the extensions PQA granted were too short for the remaining work to be completed.
In order to continue making progress on the target for the percentage of complaints resolved in a timely
manner, PQA is moving ahead on improving its policies and practices. Those improvements include:
• allowing PQA staff, where a district resists investigating a complaint, to institute their own investigation or (with the supervisor's approval) to issue a letter of finding based on the complainant's documentation (see Year 1, paragraph 4, above);
• making clear to PQA staff through the supervisors that where timelines are extended for exceptional circumstances with respect to a particular complaint, the extension should allow not only enough time for the school or district to submit additional information but also enough time for the PQA staff person to review that information, obtain feedback on it from the complainant, consult with the PQA supervisor, and write and issue a letter of finding or letter of closure (taking into account which days are holidays or weekends and when the staff person will be out of the office on Coordinated Program Reviews or Mid-cycle Reviews) (see previous paragraph);
• revising the Problem Resolution System Implementation Guide for PQA Staff to include guidance on the duration of extensions; and
• instituting more intensive monitoring, in particular of the management of complaints where the timelines have been extended.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
Given the progress made over the past year, the targets, improvement activities, timelines and resources
for 2005-06 remain appropriate.
Monitoring Priority: Effective General Supervision Part B / General Supervision
Indicator 17: Percent of fully adjudicated due process hearing requests that were fully adjudicated within
the 45-day timeline or a timeline that is properly extended by the hearing officer at the request of either
party.
(20 U.S.C. 1416(a)(3)(B))
Measurement: Percent = [(3.2(a) + 3.2(b)) divided by 3.2] times 100.
FFY
Measurable and Rigorous Target
2005
(2005-2006)
100%
Actual Target Data for 2005-2006:
# of Hearings (fully adjudicated): 18
Decisions issued within the 45-day timeline or a timeline that is properly extended by the hearing officer at the request of either party: 16
% of fully adjudicated due process hearing requests that were fully adjudicated within the 45-day timeline or a properly extended timeline: 88.8%
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
In an effort to ensure uniformity in scheduling parent and LEA requests for hearing, initial hearing dates for both were set 35 calendar days after receipt of the hearing request. This was done to accommodate the 30-day resolution period mandated for all parental hearing requests (even though no such period is required for LEA requests). As a result, Hearing Officers had 40 calendar days to issue decisions on parent requests, since scheduling the initial hearing on day 35 used only 5 of the 45 days that begin to run after the resolution period. However, because for LEA hearing requests the 45-day timeline begins to run on the date the hearing request is filed, the Hearing Officer in such instances was left with only 10 calendar days to issue the decision.
One of the two BSEA decisions issued outside the timelines this fiscal year was in fact the result of this
situation. (It should be noted that, without factoring in said decision, the percent of fully adjudicated due
process hearing requests that were fully adjudicated within the 45-day timeline or a timeline properly
extended by the hearing officer at the request of either party, was 94%.)
The BSEA is rectifying this scheduling anomaly by changing BSEA hearing rules to set an initial hearing date for LEA requests 20 calendar days from the date the hearing request is filed, thus allowing the Hearing Officer 25 calendar days to issue the decision.
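The day-counting described above can be sketched as a small calculation (an illustrative sketch only; the 35-day, 30-day, and 45-day figures come from the text, and the helper function is hypothetical):

```python
DECISION_TIMELINE = 45   # calendar days allowed for the decision
RESOLUTION_PERIOD = 30   # mandated resolution period for parental requests only

def days_left_to_decide(initial_hearing_day, parent_request):
    """Days remaining for the Hearing Officer to issue a decision
    after the initial hearing date."""
    # For parent requests the 45-day clock starts after the resolution period;
    # for LEA requests it starts on the date of filing.
    clock_start = RESOLUTION_PERIOD if parent_request else 0
    days_used = initial_hearing_day - clock_start
    return DECISION_TIMELINE - days_used

# Old practice: initial hearing set 35 days after receipt for both request types.
parent_old = days_left_to_decide(35, parent_request=True)   # parent requests
lea_old = days_left_to_decide(35, parent_request=False)     # LEA requests
# Revised BSEA rule: LEA requests get an initial hearing date 20 days from filing.
lea_new = days_left_to_decide(20, parent_request=False)
```

This reproduces the figures in the text: 40 days remaining for parent requests, only 10 for LEA requests under the old practice, and 25 under the revised rule.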
There was one other decision issued outside of the timelines (five days late). This matter is being
addressed through monitoring and supervision.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
The targets, improvement activities, timelines and resources for 2005-06 remain appropriate.
Monitoring Priority: Effective General Supervision Part B / General Supervision
Indicator 19: Percent of mediations held that resulted in mediation agreements.
(20 U.S.C. 1416(a)(3)(B))
Measurement:
Percent = [(2.1(a)(i) + 2.1(b)(i)) divided by 2.1] times 100.
FFY
Measurable and Rigorous Target
2005
(2005-2006)
86%
Actual Target Data for 2005-2006:
# of Mediations: 772
# of Mediation Agreements: 644
% of Mediations Held that Resulted in Mediation Agreements: 83.4%
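The agreement rate is consistent with the counts in Attachment 1 (Table 7, Section B), as a quick cross-check shows:

```python
# Indicator 19 cross-check against Table 7, Section B (Attachment 1).
agreements_due_process = 31   # agreements in mediations related to due process
agreements_other = 613        # agreements in mediations not related to due process
mediations_held = 44 + 728    # mediations related + not related to due process

agreements = agreements_due_process + agreements_other
rate = round(100 * agreements / mediations_held, 1)   # percent of mediations with agreements
```

This yields 644 agreements out of 772 mediations, an 83.4% agreement rate, matching the table above.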
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
Improvement Activities Completed:
All improvement activities identified in the SPP for 2005-06 have been completed, as have the 2006-07 improvement activities identified to date in the SPP.
Explanation of Progress or Slippage:
The MASSDE mediation program is managed by the Bureau of Special Education Appeals, Mediation
Office (BSEA-Mediation) and is nationally recognized as providing highly effective mediation services.
Our SPP set a maintenance target for this Indicator, since it was strongly felt that although tracking mediation agreements is important, it would be inappropriate to suggest that we seek to "compel" parties in mediation to reach agreement. Therefore, although the percentage of
mediations that resulted in mediation agreements is slightly lower than our baseline year and our target
for 2005-06 (85.9%), we believe that 83.4% appropriately meets our essential goal of maintaining a high
level of mediation agreements. We do note that 2005-06 data show over 100 more mediations than the
previous year.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
The targets, improvement activities, timelines and resources for 2005-06 remain appropriate.
Monitoring Priority: Effective General Supervision Part B / General Supervision
Indicator 20: State reported data (618 and State Performance Plan and Annual Performance Report) are
timely and accurate.
(20 U.S.C. 1416(a)(3)(B))
Measurement:
State reported data, including 618 data and annual performance reports, are:
a. Submitted on or before due dates (February 1 for child count, including race and ethnicity;
placement; November 1 for exiting, discipline, personnel; and February 1 for Annual
Performance Reports); and
b. Accurate (describe mechanisms for ensuring error free, consistent, valid and reliable data and
evidence that these standards are met).
FFY
Measurable and Rigorous Target
2005
(2005-2006)
100%
Actual Target Data for 2005-2006: 85.7% on-time submissions, 100% of data submitted
The target data for 2005-06 are based on the percentage of the seven required data submissions for that
year (Tables 1-6 and the MA SPP/APR submission) submitted on or before the required due dates. The
percent compliance indicates the percentage of data submissions we successfully submitted by the
deadline:
Data Submission Type               Due Date      2005-06 Data Submitted
Table 1: Federal Child Count       February 1    On time (EDEN-only submission)
Table 2: Personnel Data            November 1    On time with formal extension
Table 3: Educational Environment   February 1    On time (EDEN-only submission)
Table 4: Exit Data                 November 1    On time with formal extension
Table 5: Discipline Data           November 1    One week after extension
Table 6: Assessment Data           February 1    On time
Annual Performance Report          April 1       On time

Total # of Data Reports Submitted On-Time: 6
% of Data Reports Submitted On-Time: 85.7%
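The 85.7% figure follows directly from the seven required submissions listed above:

```python
# One entry per required 2005-06 submission: True if submitted on time
# (including on time with a formal extension), False otherwise.
submissions = {
    "Table 1: Federal Child Count": True,
    "Table 2: Personnel Data": True,
    "Table 3: Educational Environment": True,
    "Table 4: Exit Data": True,
    "Table 5: Discipline Data": False,   # submitted one week after the extension
    "Table 6: Assessment Data": True,
    "Annual Performance Report": True,
}

on_time = sum(submissions.values())               # number of timely submissions
pct = round(100 * on_time / len(submissions), 1)  # percent submitted on time
```

Six of the seven submissions were timely, giving the 85.7% reported above.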
Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that
occurred for 2005-2006:
In 2004-05, MASSDE completed 16.6% of its data submissions on-time. For 2005-06, MASSDE showed
improvement with this Indicator by completing 85.7% of its data submissions on-time, or on-time with
formal extensions granted. However, due to the timing of our data collection schedules and the time
needed for cleaning and checking of the data, delays continue to occur in some of the OSEP collections.
With our move from the December 1 to the October 1 collection, our compliance with Table 1 and Table 3
has dramatically increased. Additionally, MASSDE has been granted permission to submit its data for
tables 1 and 3 through EDEN-only, instead of requiring a dual data submission for EDEN and OSEP. The
reporting of Table 6 data was a new requirement for 2005-06; these data were submitted on time.
Through the ongoing development of our teacher database we hope to continue providing timely and
accurate data for Table 2 and plan on scheduling the data collection for this database to meet the needs
of OSEP. Table 5 data is collected on a cycle where the data submission window is closed in October of
the following academic year. This allows districts the time to submit accurate and complete data but
makes it difficult for us to submit the data by the November 1 deadline. For 2005-06, we were able to
submit Table 5 data on time with a formal extension, but will continue to work with the data submission
timelines to meet the compliance deadline by 2010.
The Annual Performance Report has been submitted by the deadline each year with as accurate data as
we have available at the time of reporting. For example, we have submitted our discipline data in the
previous APRs but know there has been room for improvement in the reporting of this data. We are
working to improve the reporting of the data by districts each year through adjustments made to our
collection tools.
In December 2005, MASSDE met with the Massachusetts Statewide Special Education Steering
Committee to review current data and to identify potential improvement activities. It is our plan to
continue to meet with targeted workgroups, including MASSDE staff, focused on specific areas of data
collection.
Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines /
Resources for 2005-2006:
Given the progress made over the past year, the targets, improvement activities, timelines and resources
for 2005-06 remain appropriate.
U.S. DEPARTMENT OF EDUCATION
OFFICE OF SPECIAL EDUCATION
AND REHABILITATIVE SERVICES
OFFICE OF SPECIAL EDUCATION
PROGRAMS
TABLE 7
PAGE 1 OF 1
REPORT OF DISPUTE RESOLUTION UNDER PART B, OF THE
INDIVIDUALS WITH DISABILITIES EDUCATION ACT
2005-06
OMB NO.: 1820-0677
FORM EXPIRES: 08/31/2009
STATE: Massachusetts
SECTION A: Written, signed complaints
(1) Written, signed complaints total: 459
  (1.1) Complaints with reports issued: 324
    (a) Reports with findings: 256
    (b) Reports within timeline: 226
    (c) Reports within extended timelines: 36
  (1.2) Complaints withdrawn or dismissed: 83
  (1.3) Complaints pending: 72
    (a) Complaints pending a due process hearing: 21

SECTION B: Mediation requests
(2) Mediation requests total: 772
  (2.1) Mediations: 772
    (a) Mediations related to due process: 44
      (i) Mediation agreements: 31
    (b) Mediations not related to due process: 728
      (i) Mediation agreements: 613
  (2.2) Mediations not held (including pending): 0

SECTION C: Hearing requests
(3) Hearing requests total: 568
  (3.1) Resolution sessions: 442*
    (a) Settlement agreements: 212**
  (3.2) Hearings (fully adjudicated): 18
    (a) Decisions within timeline: 3
    (b) Decisions within extended timeline: 13
  (3.3) Resolved without a hearing: 323

SECTION D: Expedited hearing requests (related to disciplinary decision)
(4) Expedited hearing requests total: 37
  (4.1) Resolution sessions: 34*
    (a) Settlement agreements: 1*
  (4.2) Expedited hearings (fully adjudicated): 2
    (a) Change of placement ordered: 1
*This number was calculated by taking the total number of hearing requests, minus the number of hearings requested by LEAs (for which a resolution session is not required), minus the number of mediations related to due process (the notion being that parties may opt for mediation in lieu of a resolution session). This number is not deemed reliable at this time for the following reasons:
1) there may be cases in which both a resolution session and a mediation were held;
2) there are likely cases in which both parties waived the resolution session and did not opt for
mediation; and
3) there are likely cases in which the LEA failed to convene a timely resolution meeting within the 15 days and the session was therefore constructively waived.
None of these situations is accounted for in the above-noted number.
The Bureau of Special Education Appeals (BSEA) is developing revisions to follow-up and reporting
procedures that would increase the reliability of this number for the future.
**This number (212) represents 50% of all cases (424) involving parental requests for hearing which were
not resolved through mediation or substantive hearing officer decision on the merits. Only thirteen cases
(of the 424 cases reported above) were cases in which parents filed a written form with the BSEA, per
BSEA procedure, withdrawing a hearing request and citing settlement at resolution session as basis for
the withdrawal.
We used 50% of the 424 figure to reflect our limited confidence in that number, which is likely highly inflated (and hence not deemed reliable) because it includes: situations in which private settlements may have been reached outside the resolution session process; cases withdrawn without a settlement having occurred; cases in which a settlement conference conducted by the BSEA resulted in withdrawal of the hearing request; cases in which a pre-hearing conference resulted in a settlement and/or withdrawal of the hearing request; and cases in which a dispositive ruling was issued by a hearing officer.