NEW JERSEY EDUCATION ASSOCIATION
180 West State Street
Post Office Box 1211
Trenton, NJ 08607-1211
Phone 609-599-4561
Facsimile 609-392-6321
www.njea.org
Testimony by Francine Pfeffer, NJEA
Before The NJ State Board of Education
May 19, 2010
As you know, the passage rate for the first administration of the Alternate High School
Assessment, or AHSA, administered to those students who were not proficient on the High
School Proficiency Assessment (HSPA), was very low: just 10 percent passed in language arts
and 34 percent in math.
The AHSA replaced the Special Review Assessment, the SRA, following a March 2008
resolution by the State Board of Education to reinstate and reform the process.
NJEA had long supported the use of the SRA to enable students to demonstrate their
proficiencies through alternate means when they, for various legitimate reasons, could not show
their abilities on the HSPA.
NJEA strongly opposed the elimination of the SRA; however, we supported a revision of
the process. We were encouraged by the department's creation of the SRA Advisory Committee,
which was composed of a variety of stakeholders representing the education community in New
Jersey.
The committee was charged to "propose protocols for implementing reforms to the current
SRA process." Their recommendations focused on the administration and scoring aspects of the
SRA and reflected the features included in the State Board's endorsement of an SRA reform
proposal.
Unfortunately, some key recommendations of that committee were not enacted. I believe
this is what led to the extreme discrepancy between the percentage of students who failed
the initial administration of the AHSA and the percentage who failed the SRA. That discrepancy
calls into question the scoring criteria and methods used by the state's vendor, Measurement, Inc.
For example, in her Dec. 17 memorandum to chief school administrators, Deputy
Commissioner Willa Spicer spelled out the scoring procedures for the AHSA. Those procedures
included the need for approximately 150 scorers for mathematics and 110 scorers for language
arts who held secondary school certification in those subject areas. In addition, teachers with
previous experience administering or scoring the SRA were preferred.
In practice, it is my understanding that for the January administration of the test, the staff
of Measurement, Inc. scored approximately 70 percent of the student responses, a substantially
larger share than was scored by teachers.
In fact, having the test scored by an outside vendor ignored one of the key
recommendations of the SRA Advisory Committee. The majority of the committee recommended
that the scoring remain at the local districts rather than be conducted by an outside vendor.
Local scoring would have allowed for a sequence of instruction followed by
administration of the Performance Assessment Tasks, or PATs, that make up the AHSA. This
would have been followed by local scoring that would allow for timely feedback and further
instruction before the administration of additional portions of the test.
Having the PATs scored by a vendor would slow down the feedback process, eliminating
the ability of teachers to use the results to determine student weaknesses in order to best prepare
students for their next bite at the apple.
The committee took seriously questions concerning whether local scoring of the tests
would result in valid outcomes. To that end, the committee recommended a tiered or graduated
audit system that would require periodic external verification and, where necessary, corrective
action and possibly the removal of scoring from local schools to external groups of trained New
Jersey educators.
But scoring was not the only problem with the AHSA; so too was its administration.
The advisory committee had recommended four three-week windows in which to administer the
PATs: in November, February, May, and August. Each administration window would be preceded
by an eight-week instructional period. Such a scenario would enable completion of assigned
PATs with timely results to be followed by remediation in areas of weakness.
Instead, the department chose three testing windows: January, April, and possibly July.
The reduced number of windows, compounded by the lack of timely feedback, is a recipe for a
high failure rate. In fact, the combination of the January assessment window and the vendor
scoring resulted in districts receiving AHSA scores on Thursday, April 1, which for some was one
day prior to a weeklong spring break. Many of these students returned to school on April 12,
which was the beginning of the second administration window.
In addition to the scoring and administration windows, the advisory committee
recommended that department content specialists select and assign a limited number of PATs for
each administrative testing window that would be sufficient to cover the clusters within each
content area (for example, three to four PATs per cluster). This would prevent an unlimited number
of PATs being administered to students. Also, a different set of SRA PATs would be issued for
each administrative testing window. It was recommended that each set would contain 16
mathematics PATs and 12 language arts PATs available for administration.
However, the PATs were selected by the vendor and the number of PATs available for
administration was severely limited. Math consisted of only eight PATs, five of which could be
administered, with four necessary for passing. Language arts consisted of only six PATs, four of
which could be administered, with three necessary for passing. Students could not be
administered more than the number of PATs allowed in the testing window.
Instead of being "an alternative assessment that provides students with the opportunity to
exhibit their understanding and mastery of the HSPA skills in contexts that are familiar and
related to their experiences" (the department's own words), the AHSA process has become a "mini"
version of the high-stakes HSPA. (www.nj.gov/education/assessment/hs/index.shtml#sra)
In a letter to Commissioner Schundler, NJEA recommended that the Department of
Education reconvene the SRA Advisory Committee to review the problems with this year's
process and make recommendations for 2010-11.
But that alone would be too little too late for this year's students.
So, we also asked the commissioner to:
1. Permit the scores of students who passed the AHSA to count favorably toward the
students' eligibility to graduate, but otherwise set aside the results until a thorough
review of the administration and scoring of the test is conducted.
2. Reinstate the option for local scoring with department audits for the April and
summer administrations of the AHSA.
3. Notify all districts that the DOE will provide fiscal and administrative support for
AHSA instructional programs to prepare for another test administration this
summer for all seniors who have completed all other requirements for
graduation.
I hope that you will likewise encourage the commissioner to take these actions. Thank you.
GGS:FLP:emr
5/19/10
DEPARTMENT OF EDUCATION
PO Box 500
TRENTON, NJ 08625-0500
JON S. CORZINE
Governor

LUCILLE E. DAVY
Commissioner

To: Jacqueline Jones, Assistant Commissioner, Division of Early Childhood Education
    Jay Doolan, Assistant Commissioner, Division of Educational Standards & Programs

From: Timothy Peters, Director, Office of State Assessments

Date: March 6, 2009

Subject: Special Review Assessment (SRA) Advisory Committee Recommendations
The Special Review Assessment (SRA) Advisory Committee was convened for the first time
on August 29, 2008, and met on four subsequent occasions, for four to five hours each time:
October 14, 2008, November 24, 2008, December 15, 2008, and January 26, 2009. The first
two sessions were held here at the department; the latter three meetings were held at the
Mercer County Office of Education.
The Committee's membership represented a broad range of professional organizations and
educational constituencies from throughout the state, including SRA practitioners, district
administrators, advocacy groups, and the Department of Education (DOE) itself. Since the
county offices play a central role in the SRA process, several county education specialists
were included in the committee. A full listing of members is appended. The Committee's
meetings were organized and coordinated by SRA program coordinator Dr. Faye Ball, and I
attended each session as well.
The essential charge to the SRA Advisory Committee was to propose protocols for
implementing reforms to the current SRA process. These reforms, primarily focused on the
reliability and transparency of the SRA administrative and scoring processes, were embodied
in presentations made by the department to the State Board in 2007 and in early 2008.
The State Board had in August 2005 agreed to eliminate the SRA, provided that the state
establish an alternate "second-chance" mechanism. A satisfactory alternate mechanism never
materialized, and so in March 2008 the State Board endorsed an SRA reform proposal with
these basic features:
• The establishment of specific SRA administration windows, with limitations on
teacher access to the SRA performance assessment tasks (PATs);
• The standardization of the SRA administration process by having the state test
vendor assign and distribute the performance tasks to districts;
• Improvements to the validity and transparency of SRA scoring, either by having the
state test vendor organize regional scoring centers staffed by trained New Jersey
teachers, or by some other audit mechanism;
• Collecting more data on student performance on the SRA and on student outcomes;
• Continued monitoring of districts that rely heavily on the SRA to graduate seniors;
• Continued instructional support and retesting for those who "fail" the SRA, and
ongoing support for English Language Learners (ELL) among the SRA population.
One fundamental issue emerged immediately from the Committee's deliberations: while the
Committee pursued its mission diligently and collegially, the majority of the Committee,
including all the non-DOE members, was uncomfortable with the wholesale removal of SRA
scoring from the local districts. This is reflected in their recommendations to the department,
which are described below.
• The establishment of specific SRA administration windows / Limitations on
teacher access to SRA materials
Recommendations
The SRA Advisory Committee recommends four three-week administrative windows for the
administration of the SRA PATs. The four administrative testing windows would be
scheduled every eight weeks, so that each window is preceded by an eight-week instructional
period. The four windows would occur within the months of November, February, May, and
August of each year. To avoid the practice, or the appearance, of improper use of PATs for
pre-assessment instructional purposes, teachers would not have access to the PATs during this
time. CDs containing a limited, assigned set of SRA PATs would be available for pickup at
the county offices by the district no earlier than five business days before the start of each
testing window, and the CDs would have to be returned within five business days after the
close of each administration window.
• The standardization of the SRA administration process by having the test
vendor assign and distribute the PATs to districts
Recommendations
The SRA Advisory Committee recommends that NJDOE content coordinators select and
assign a limited set of PATs for each administrative testing window, sufficient to cover the
clusters within each content area (e.g., 3-4 per cluster). Thus, students would not have
unlimited opportunities within the administration window to successfully complete PATs.
Unsuccessful completion of the available PATs in a given window would be followed by
additional instruction leading up to the next administration period. However, successful
completion of PATs during any administrative window would continue to count toward
meeting required graduation standards.
In language arts literacy (LAL), students would still have to complete two writing PATs, one
persuasive reading PAT, and one narrative reading PAT; for mathematics, two PATs for each
cluster, for a total of eight mathematics PATs.
A different set of SRA PATs would be issued for each administrative testing window. Each
set (depending on the robustness of the item pool) would contain 16 mathematics PATs (4 for
each standard), 4 writing prompts, 2 narrative reading passages with 4 related PATs (2 PATs
per passage), and 2 persuasive reading passages with 4 related PATs (2 PATs per passage).
The Committee did express interest in the possibility of using HSPA cluster data and related
metrics (e.g., statewide cluster mean score) to limit a student's SRA obligations to those
content clusters in which the student had demonstrably failed to attain proficiency (e.g.,
geometry, data analysis, patterns and algebra, but not numerical operations); however, no
consensus was reached on the methodology for achieving this goal.
• Improvements to the validity and transparency of SRA scoring, either by having the
state test vendor organize regional scoring centers staffed by trained New Jersey
teachers, or by some other audit mechanism
Recommendations
As noted above, for both educational and logistical reasons, the majority of the Committee
remains uncomfortable with the notion of completely removing SRA scoring from the local
districts. The district representatives emphasized both the "immediacy" and directness of
the professional development benefit (that is, the direct engagement with student work and
the scoring of student work) and the immediacy of information about student performance as
important benefits of retaining the local scoring. The Committee also expressed concern as to
whether regional scoring centers operating for limited periods of time could handle the
volume of SRA student responses within the necessary timelines, or accommodate later
transfers into the district.
The Committee did take seriously the obligation to assure the validity of SRA scoring and the
credibility of high school diplomas earned through the SRA. The Committee recommended a tiered
or graduated audit system that subjects all schools and districts to periodic external
verification, provides heightened monitoring and, where necessary, corrective action in high
use schools and districts, and where warranted, removes scoring from local schools to
external groups of trained NJ educators.
Such an audit process would monitor local district adherence to the SRA scoring rubrics and
identify schools or districts that depart from those rubrics to a significant or pervasive degree.
Schools or districts found to abuse or misuse the SRA scoring rubrics would lose the right to
score their students' SRA task responses, and these would be scored by the state testing
vendor or other trained scorers assigned by the department. In the Committee's recommended
scenario, all high schools would be subject to an ongoing audit process, and districts with a
distinctly high SRA usage rate might be automatically subject to such audits.
Additional validity steps could include retraining of the district scorers by DOE or vendor
specialists and/or recruitment of other teachers to be trained as scorers. Furthermore, required
scorer training sessions could be conducted, with a tracking system to verify participation by
district staff.
The Committee recommends that the department focus on systemic improvement and
verifiability of the alternative assessment processes and avoid any oversight process that
would result in an individual student's passing SRA score being overruled or retracted.
• Collect more data about student performance on the SRA
Recommendations
The Committee agrees that the department must collect more data on SRA performance, to
include information about course taking patterns, access to highly qualified teachers,
supplemental instruction and other supports for SRA students, and to include information
about student performance on the PATs themselves, much as item statistics are collected on
HSPA test items. The SRA Advisory Committee recommends that NJSMART be developed
to capture those data elements deemed important to measure the reliability, validity and
ultimate outcomes of the SRA process. As with the HSPA and other statewide assessment
programs, data should also be disaggregated by DFG, ethnicity, and other major subgroups.
• Continued monitoring of districts that rely heavily on the SRA to graduate
seniors
Recommendations
As noted above, the Committee recommends ongoing monitoring of heavy-use SRA
districts, to assure the validity of their SRA administration and scoring protocols, and to help
inform analysis of the SRA student population and its instructional needs.
• Continued instructional support and retesting for those who "fail" the SRA.
Recommendations
Instructional support for the SRA student population remains a priority for the Committee,
but the Committee's obligation to focus on the SRA administration protocols prevented fuller
discussion of this important topic. Similarly, the Committee acknowledged the importance of
vertical articulation for at-risk freshmen in order to achieve the longer-term goal of reducing
the SRA student population by providing instructional support for them much earlier. It may
be desirable to reconvene the Committee at some future date for the purpose of pursuing
these instructional support and articulation issues.
Membership
The members of the 2008-2009 SRA Advisory Committee are as follows:
• Dr. James P. Doran, Principal, Adult & Alternative Education, Hudson County Schools
of Technology, and representing NJ Council of County Vocational-Technical Schools
• Dr. Amy C. Fratz, Associate Director, Professional Development, New Jersey
Education Association (NJEA)
• Dr. Deborah L. Ives, K-8 Mathematics Supervisor, Livingston Public Schools, and
Immediate Past President of the Association of Mathematics Teachers of New Jersey
• Ms. Christine Kane, Director, NJ Performance Assessment Alliance, New Jersey
Principals and Supervisors Association (NJPSA)
• Mr. Dan Kortvelesy, Curriculum Supervisor, Mathematics, Technology, Business
Education and Media, Mainland Regional High School
• Mr. Stan Karp, Director, Secondary Reform Project, Education Law Center
• Mr. James Leutz, Supervisor of Testing, The East Orange School District
• Mr. David M. Matonis, Supervisor, Special Programs, Curriculum and Instruction,
Bridgewater-Raritan High School
• Ms. Karen Moore, Retired Teacher, Trenton Central High School, and Former
Member, SRA LAL Test Development Committee
• Ms. Carla Norris, District SRA Coordinator, School Leadership Team 11, Newark
Public Schools
The department was represented by:
• Dr. Rob Akins, Measurement Specialist, Office of State Assessments
• Dr. Faye Ball, SRA Coordinator, Office of State Assessments
• Ms. Eileen Gavin, County Education Specialist, Union County Office of Education
• Mr. Timothy Giordano, Mathematics Coordinator, Office of Mathematics and
Science Education
• Ms. Carol Mizrahi, County Education Specialist & Former Member, SRA LAL Test
Development Committee, Salem County Office of Education
• Dr. Susanne Miskiewicz, County Education Specialist, Middlesex County Office of
Education
• Dr. Linda Morse, Director, Office of District and School Improvement Services
• Dr. Timothy Peters, Director, Office of State Assessments
• Ms. Lori Ramella, Education Specialist, Office of Student Achievement and
Accountability, for Raquel Sinai, Coordinator, Bilingual/ESL Education
• Ms. Jackee Reuther, County Education Specialist, Mercer County Office of
Education
TP/lr
c: Faye Ball