(D)(2) Part B Narrative, North Carolina, August 2013
Race to the Top Progress Update
Sub-criterion (D)(2)
Part B: In preparation for monthly calls, States must submit written responses to the following
questions for two application sub-criteria (e.g. (A)(2) and (D)(4)).1 All responses in this section
should be tailored to the goals and projects associated with this sub-criterion.
Application sub-criterion:2
(D)(2): Improving teacher and principal effectiveness based on performance
(i) Establish clear approaches to measuring student growth and measure it for each individual student;
(ii) Design and implement rigorous, transparent, and fair evaluation systems for teachers and principals that (a) differentiate effectiveness using multiple rating categories that take into account data on student growth as a significant factor and (b) are designed and developed with teacher and principal involvement;
(iii) Conduct annual evaluations of teachers and principals that include timely and constructive feedback; as part of such evaluations, provide teachers and principals with data on student growth for their students, classes, and schools;
(iv) Use these evaluations, at a minimum, to inform decisions regarding –
(a) Developing teachers and principals, including by providing relevant coaching, induction support, and/or professional development;
(b) Compensating, promoting, and retaining teachers and principals, including by providing opportunities for highly effective teachers and principals to obtain additional compensation and be given additional responsibilities;
(c) Whether to grant tenure and/or full certification to teachers and principals using rigorous standards and streamlined, transparent, and fair procedures; and
(d) Removing ineffective tenured and untenured teachers and principals after they have had ample opportunities to improve, and ensuring that such decisions are made using rigorous standards and streamlined, transparent, and fair procedures.
North Carolina’s goals for this sub-criterion:
• By 2010-11, all participating LEAs will measure student growth.
• By 2011-12, all participating LEAs will have qualifying evaluation systems for teachers and principals.
1 Note that States will only be required to submit documentation for the on-site program review, not for monthly calls. States should work with their Program Officers to determine relevant state-specific documentation.
2 All highlighted fields will be pre-populated by the Department Program Officer prior to State completion.

• By 2011-12, all participating LEAs will use qualifying evaluation systems to develop teachers and principals, promote teachers, retain effective teachers and principals, grant tenure and/or full certification to teachers and principals, and remove ineffective tenured and untenured teachers and principals.
• By 2011-12, some participating LEAs will use qualifying evaluation systems to compensate teachers and principals.
Relevant Projects:
• Complete transition to use of online North Carolina Educator Evaluation System.
• Add a student growth component to the North Carolina Educator Evaluation System.
• Add educator statuses that identify effective and highly effective teachers and administrators.
• Publicly report on educator effectiveness.
• Collaborate with external vendor to select and implement student academic growth indicator.
• Pilot student surveys and team value-added score for inclusion in the teacher evaluation process.
• Create aligned evaluation instruments and processes for school personnel not currently covered by the North Carolina Educator Evaluation System.
• Continue training and professional development on using the North Carolina Educator Evaluation System with integrity and fidelity.
• Require annual evaluation for all teachers.
• Convene Educator Effectiveness Work Group.
• Design Measures of Student Learning for all currently non-tested grades and subjects.
• Continue to develop high-quality student-teacher data links.
• Implement incentive bonuses to staff in low-achieving schools.
Questions:
1. Is the State on-track to implement the activities and meet the goals and performance
measures that are included in its approved scope of work for this sub-criterion? If
so, explain why. If not, explain why not.
The State has made strong progress toward meeting the goals and performance measures listed above and in its Race to the Top application. Given the diversity of projects within the (D)(2) sub-criterion, an update on each project appears below.
Complete transition to use of online North Carolina Educator Evaluation System
North Carolina has transitioned to a new online platform for the North Carolina Educator
Evaluation System (NCEES). Public Consulting Group and Truenorthlogic are collaborating
to host NCEES in a more user-friendly environment that integrates with the host of
technology tools available in Home Base. While the evaluation standards and process have
not changed, the transition to a new technology platform provided an opportunity to engage
stakeholders in the design of screens, workflow process, automatic signature features, and
reporting options. Throughout the past year, LEA representatives have assisted with the
design of the new online platform and tested it prior to launch.
In addition to being more user-friendly, the new technology platform will result in increased
data quality, as it utilizes automatic data flows from authoritative sources rather than the
manual entry process used in the platform previously employed (known as “McREL”). The
State’s Student Information System, payroll data, Unique ID system, and Human Resource
Management System provide authoritative data into the new Truenorthlogic platform.
During the summer of 2013, representatives from Truenorthlogic and the Public Consulting
Group collaborated with Department staff members to train over 800 members of LEA
training teams in how to use the updated NCEES platform. The Home Base Support Center
(see section three) is assisting users with questions as they access and begin to use the
platform. As of mid-August 2013, over 1,000 teachers per day are logging into the new
platform to complete self-assessments and write professional development plans.
Add a student growth component to the North Carolina Educator Evaluation System
In spring 2013, the State Board of Education approved changes to the sixth standard of the
North Carolina Teacher Evaluation Process and the eighth standard of the North Carolina
School Administrator Evaluation Process. The sixth standard is now based solely (100
percent) on a teacher’s individual impact on student learning as measured through the End of
Grade assessments, End of Course assessments, Common Exams, Career and Technical
Educator assessments, and other methods of measuring growth that are still in development.
Teachers who do not have individual growth data will continue to receive a sixth standard
rating based on school-wide growth data until the assessment process for their grade and
subject area is complete. The State Board of Education also revised policy around the
assessments used to measure growth in a school administrator’s eighth standard rating. The
rating will be based on growth on the End of Grade assessments, End of Course assessments,
Common Exams, and Career and Technical Education assessments.
Add educator statuses that identify effective and highly effective teachers and
administrators
In spring 2012, the State Board of Education approved the definitions of educator status, a
summative indicator of a teacher’s overall effectiveness, based on the NCEES ratings. While
no teacher or administrator has an overall status yet, the evaluation dashboards do present the
methodology for determining status and will allow educators to track their progress toward
status.
Publicly report on educator effectiveness
In May 2013, the Department of Public Instruction released public reports on educator
effectiveness data from the 2011-12 school year. The reports are now housed in an online
database that allows users to search for data from certain districts and schools, as well as
access both the 2010-11 and 2011-12 data. The web address for the new database is
http://www.ncpublicschools.org/effectiveness-model/data/.
Collaborate with external vendor to select and implement student academic growth
indicator
After the selection of EVAAS as the statewide growth model, the Department moved quickly
to implement several enhancements to the system, including online teacher web reporting,
online teacher access, online learning modules, and professional development. Additionally,
the Department expanded the use of EVAAS through:

Value-Added Reporting for Additional Grades/Subjects and Courses: The SAS
Institute is currently exploring value-added analysis with the results of the Common
Exams administered during the 2012-13 school year. The EVAAS team is also
running simulations with the data from the Spring 2013 pilot of a K-2 assessment
focused on literacy to identify various ways that growth can be measured through a
pre- and post-assessment of students’ reading comprehension levels. The SAS
Institute continues to add value-added reporting for additional Career and Technical
Education courses as standards are revised.

Roster Verification: During Spring 2013, teachers of courses and subjects/grades with
End of Grade assessments, End of Course assessments, Common Exams, and Career
and Technical Education assessments logged into EVAAS to verify their class rosters
and adjust instructional responsibility and instructional availability levels for students.
The use of the roster verification tool will result in more accurate student-teacher data
linkages used in value-added analysis.

Online Learning Modules: NCDPI and the SAS Institute continue to develop new
learning modules and revise additional modules to ensure that they remain aligned
with policy (for example, the State Board of Education’s changes to the sixth and
eighth standards described above). Learning modules scheduled for release between
now and the end of September 2013 include: district and school academic
preparedness reports, student search and custom student reports, student history and
projections, teacher value-added and diagnostic reports (gain model), and teacher
value-added and diagnostic reports (prediction model). These modules are scheduled
for release prior to the release of 2012-13 value-added data.

Professional Development: The SAS Institute continues to offer between five and
eight virtual trainings on EVAAS each week, and the SAS trainers have also
completed in-person trainings at several of the Regional Education Service Alliances
(RESAs), as well as regional conferences. The SAS staff and other NCDPI staff have
trained the NCDPI Professional Development Leads in the Educator Effectiveness
Division so that they can complete in-person training on EVAAS. In school year
2012-13, NCDPI trainers held 25 sessions that reached over 4,500 educators. The
team will continue training during school year 2013-14.
Pilot student surveys and team value-added score for inclusion in the teacher evaluation
process
Because the State will be using the EVAAS roster verification tool, the concept of a team
value-added score is no longer under consideration. Teachers who provide instruction in
team environments will indicate their teaming in the roster verification tool, and the growth
of shared students will be included in the teachers’ individual value-added scores.
NCDPI has met with representatives from Pearson and Truenorthlogic to investigate the
possibility of integrating a student survey platform into Home Base. Discussions continue as
the vendors draft proposals for such a tool.
Create aligned evaluation instruments and processes for school personnel not currently
covered by the North Carolina Educator Evaluation System
NCDPI has completed the design of the optional evaluation instrument and process for
school speech language pathologists, school-based physical therapists, school-based
occupational therapists, and school nurses. Additionally, design of required evaluation
instruments for instructional technology facilitators, library media coordinators, social
workers, school psychologists, and school counselors is complete, and the instruments and
processes are in use in school districts for the 2013-14 school year. While the final
validation studies are still pending, NCDPI and Research and Evaluation Associates have
already made the revisions recommended in the validation work completed to date.
Various NCDPI consultants and divisions are delivering training on the evaluation processes
through in-person and virtual sessions. For example, the Division of Digital Teaching and
Learning is delivering training on the evaluation processes for library media coordinators and
instructional technology facilitators.
NCDPI and Research and Evaluation Associates continue to work with stakeholder groups
on the design of professional standards, evaluation rubrics, and evaluation processes for
career development counselors and teacher-leaders.
Continue training and professional development on using the North Carolina Educator
Evaluation System with integrity and fidelity
NCDPI’s new NCEES consultant has been on-board for almost a year. Working with a team
of Professional Development Leads, she has designed standardized training documents and
resources for the teacher evaluation instrument and process, particularly around the topic of
inter-rater reliability. During the 2013-14 school year, the NCEES team will deliver regional
trainings on “Fine-Tuning Evaluation Ratings.”
The NCEES team is also working on “master rating” of classroom videos shared by the Bill
& Melinda Gates Foundation, and has incorporated some videos into the training on
inter-rater reliability. Pending approval by the Office of State Budget and Management and
USED, the NCDPI will release a proposal for an online observation calibration tool that will
allow school administrators to watch classroom videos and compare their ratings on the
observation rubric with those of “master raters.”
Require annual evaluation for all teachers
The State Board of Education approved a policy requiring an annual evaluation of all
teachers, effective with the 2011-12 school year.
Convene Educator Effectiveness Work Group
The Educator Effectiveness Work Group includes NCDPI staff members and representatives
from numerous stakeholder groups, including teachers, principals, central office staff,
superintendents, parents, higher education, research scholars, not-for-profit organizations,
teacher organizations, principal organizations, and the State Board of Education. The Work
Group has considered the wording of the sixth and eighth standards, the combination of
measures that inform the sixth standard rating, the ratings for the sixth and eighth standards,
the abbreviated evaluation option, and student surveys as a source of data on teacher quality.
The group will continue to be actively involved in the formation of policy recommendations
sent to the State Board of Education for action. Most recently, the Work Group met to
provide feedback on the administration of the Common Exams and to provide
recommendations for policy around the exams.
Design Measures of Student Learning for all currently non-tested grades and subjects
During the 2012-13 school year, school districts and charter schools administered 1,237,795
Common Exams for 35 courses or grades/subjects not assessed with End of Grade or End of
Course assessments. NCDPI’s Test Development staff have analyzed the results, set scales
for the assessments, and passed along the data to the SAS Institute so that they can explore
value-added modeling with the assessment results. As the State prepares for administration
in school year 2013-14, there will be changes to the administration process as well as to some
of the exams. NCDPI will print and ship copies of the exams to all districts and participating
charter schools to prevent any printing problems. NCDPI will ask the State Board of
Education to approve policies that require districts and charter schools to use the Common
Exams in place of teacher-made final exams and to use the Common Exam grades in
students’ final course grades. NCDPI is exploring centralized scoring of the constructed
response items on the exams, as well as more detailed scoring training for teachers (if the
scoring process remains at the local-level).
For those content areas not covered with state assessments, Career and Technical Education
Assessments, or Common Exams, NCDPI is piloting two additional processes for producing
Measures of Student Learning. In spring 2013, the State piloted four processes for using
handheld devices and standard reading passages to measure K-2 students’ reading
comprehension. The pilot tested the security required to collect unbiased and reliable data.
The SAS Institute is running simulations with the data from the pilot to explore which
pre/post test method for measuring growth is the best fit for the data obtained through the
process. They will also explore the distribution of growth scores to determine which process
produced the most unbiased results. Their analysis, along with qualitative survey data from
just under 1,000 teachers, administrators, and central office staff from across the state, will
inform a policy recommendation to the State Board of Education on how to measure growth
for K-2 teachers.
NCDPI will measure growth for Grade 3 teachers with a pre/post test model (similar to the
process described above) that will use results of a third grade Beginning of Grade English
Language Arts assessment and the third grade End of Grade English Language Arts
assessment. Third grade students across North Carolina are taking the Beginning of Grade
assessment as they begin school.
Teachers in performance-based and service-delivery content areas will use the Analysis of
Student Work (ASW) Process to determine their Standard 6 rating. The ASW Process
involves the collection and evaluation of student work to document student growth. An initial
ASW pilot with 100 educators in the areas of Arts Education, Healthful Living, and World
Languages was conducted in spring 2013. Based on feedback from the initial pilot and
sessions conducted by NCDPI staff during the 2013 Summer Institutes, the Department plans
to develop a new online platform for the ASW process and to conduct an expanded pilot
during the 2013-2014 school year. The expanded ASW pilot would include the original three
pilot areas as well as Academically or Intellectually Gifted (AIG), Advanced Placement
(AP)/International Baccalaureate (IB), English as a Second Language (ESL), and Exceptional
Children (EC).
Continue to develop high-quality student-teacher data links
NCDPI partnered with the SAS Institute for the use of their roster verification tool. One
roster verification window in the spring allowed all teachers who administered state
assessments, Career and Technical Education Post-Assessments, and Common Exams to
verify their first and second semester rosters. NCDPI established data flows to the SAS
Institute to facilitate the process, and has updated those data flows as the State has updated its
statewide student information system (now PowerSchool). During the 2013-14 school year,
there will be two roster verification windows (one in the fall and one in the spring) to
simplify the process for educators teaching on a block/semester schedule.
NCDPI is using webinars and user testing sessions to gather feedback on both changes to the
actual roster verification tool as well as the guidance provided to school districts. NCDPI
received feedback from central office staff members on August 6 and will hear from school
administrators on September 12 and teachers on September 17. NCDPI and SAS Institute
will use feedback to make changes to the tool, and NCDPI will update guidance and training
materials before the fall 2013 roster verification window opens.
Implement incentive bonuses to staff in low-achieving schools
In April 2013, the State Board of Education approved a policy to transition the incentive
bonuses from school-level awards to classroom-level awards. Under the new policy, in
2012-13 and 2013-14, all eligible certified staff members in a school that receives a value-added school composite in the “Exceeds Expected Growth” range will still receive a Race to
the Top bonus payment of no more than $1,500. In these schools, eligible teachers who
receive an individual value-added teacher composite in the “Exceeds Expected Growth”
range will receive an additional bonus payment of no more than $500 above the $1,500
payment made on the basis of the school value-added composite, for a total payment of no
more than $2,000. In schools where eligible staff members do not receive a bonus payment,
as a result of a value-added school composite that does not reach the "Exceeds Expected
Growth" range, eligible teachers who receive an individual value-added teacher composite in
the "Exceeds Expected Growth" range will receive a bonus payment of no more than $2,000.
The State Board of Education also approved the use of its approved growth model (the
Education Value-Added Assessment System, or EVAAS) to measure student growth.
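The bonus rules above reduce to a simple decision rule on two growth outcomes. The sketch below is illustrative only (the function name and boolean inputs are hypothetical, actual eligibility involves additional certification criteria, and the dollar figures are the policy's "no more than" caps rather than guaranteed payments):

```python
def rttt_bonus_cap(school_exceeds_growth: bool, teacher_exceeds_growth: bool) -> int:
    """Maximum Race to the Top bonus (in dollars) for an eligible teacher,
    per the 2012-13/2013-14 policy described above. Hypothetical helper
    for illustration; inputs reflect EVAAS value-added composites reaching
    the "Exceeds Expected Growth" range."""
    if school_exceeds_growth:
        cap = 1500  # school-level award to all eligible certified staff
        if teacher_exceeds_growth:
            cap += 500  # individual top-up, bringing the total cap to $2,000
        return cap
    # School composite below "Exceeds Expected Growth": only teachers whose
    # individual composite exceeds expected growth receive a bonus.
    return 2000 if teacher_exceeds_growth else 0
```

Under either path, a teacher whose individual composite reaches the "Exceeds Expected Growth" range is capped at the same $2,000 total.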
2. Does the State have evidence indicating the quality of implementation for this sub-criterion? What is the State doing, or what has it done, as a result of this information?
The State is using a number of processes to track progress. Initially, much of the work
focused on meeting deadlines (for example, administering the Common Exams and releasing
teacher-level value-added data).
The online NCEES provides data that can be used for analysis on the fidelity of
implementation of the tool and process. Under the auspices of a grant from the National
Governors Association, the Department completed analysis of the relationship between
student growth and teacher ratings on the first five standards of the evaluation instrument,
and has used the information to drive the development of inter-rater reliability training
sessions and other supporting documents for the NCEES.
NCDPI strives to use both qualitative and quantitative feedback on its educator effectiveness
initiatives to improve implementation. With the Analysis of Student Work pilot, the
Common Exams, and the K-2 literacy pilot, the State is completing analysis of student data,
but has also administered surveys to capture qualitative information from educators. Many
of the adjustments being made to the administration of the Common Exams are a result of
this feedback, and feedback from the Analysis of Student Work pilot has led to the State’s
request for an additional pilot year to develop further an online platform and training.
3. What obstacles and/or risks could impact the State’s ability to meet its goals and
performance measures related to this sub-criterion?
While there are still aspects of the (D)(2) Scope of Work in the design/pilot phase, North
Carolina is entering a critical point in its work on educator effectiveness: the transition from planning to
implementation. As the State anxiously awaits the release of data from the 2012-13 school
year, review of both evaluation data and student growth data will be critical to further
refinement of the system and targeted support for districts struggling with implementation.
Additionally, several proposed delays to the implementation of the sub-criterion are still
under discussion.
Communication regarding the enhanced NCEES process, how to use the information
produced by the process, and how to explain the process to parents and the community at
large is an ongoing challenge. Continuing to make progress in meeting this challenge is
critical to the short- and long-term viability of the enhanced NCEES.
Evaluation: Based on the responses to the previous question, evaluate the State’s
performance and progress to date for this sub-criterion (choose one)
Red (1)   Orange (2)   Yellow (3)   Green (4)3

3 Red – substantially off-track and/or has significant quality concerns; urgent and decisive action is required; Orange – off-track and/or there are quality concerns; many aspects require significant attention; Yellow – generally on-track and of high or good quality; only a few aspects require additional attention; Green – on-track with high quality.
Paperwork Reduction Act Statement
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a
collection of information unless such collection displays a valid OMB control number. Public
reporting burden for this collection of information is estimated to average 74 hours (annually)
per response, including time for reviewing instructions, searching existing data sources,
gathering and maintaining the data needed, and completing and reviewing the collection of
information. The obligation to respond to this collection is required to obtain or retain benefit
(34 CFR 75.720, 75.730-732; 34 CFR 80.40 and 80.41). Send comments regarding the burden
estimate or any other aspect of this collection of information, including suggestions for reducing
this burden, to the U.S. Department of Education, 400 Maryland Ave., SW, Washington, DC
20210-4537, or email [email protected] and reference OMB Control Number 1894-0011.