Peter Shulman Presentation

Update on Educator Evaluation
and the Transition to PARCC
January 2015
Objectives
• Provide an update on implementation of AchieveNJ,
including key takeaways and areas of improvement
identified by educators
• Discuss implications for evaluation in the transition from
NJ ASK to PARCC
Agenda
AchieveNJ Implementation Update
Key Findings
Implications of Transition to PARCC
AchieveNJ: A Careful, Deliberate Path
• 2010: Educator Effectiveness Task Force formed
• 2011: Task Force releases recommendations; State Advisory Committee and Pilot 1 launched; $38 million Race to the Top award for NJ
• 2012: TEACHNJ Act passed; Pilot 2 launched
• 2013: 1st round of evaluation regulations proposed; State Advisory Committee and external Rutgers reports issued; all districts launch improved evaluations
• 2014: 2nd round of evaluation regulations proposed; interim implementation report released
• 2015: 3rd round of evaluation regulations proposed; input and continuous improvement continue
Essential Elements of AchieveNJ
Support
• Required training
• Targeted feedback
• School Improvement Panel
• Corrective Action Plans for Ineffective/Partially Effective ratings
Evaluation
• Four levels of summative ratings
• Multiple observations
• Multiple objective measures of student learning
Tenure
• Tenure earned after 4 years based on effectiveness
• Effective ratings required to maintain tenure
• Dismissal decisions decided by arbitrators
Current Status
• Year 1 (2013-14) Interim AchieveNJ Implementation Report published in November 2014
• Educators now almost halfway through Year 2 of implementation
• Office of Evaluation continues to support districts and leaders with
resources and direct coaching, as needed
• Statewide advisory committee, composed mainly of NJ educators, meets monthly
• 2013-14 median Student Growth Percentile (mSGP) scores recently
released
• Year 1 Final Implementation Report (including analysis of statewide
findings) to be released in spring 2015
Release of 2013-14 mSGP Scores
All districts received secure access to their 2013-14 teacher and principal/AP/VP median Student Growth Percentile (mSGP) data on January 8, 2015.
• NJDOE has worked with NJ educators to take a long, thoughtful approach to implementing both evaluations and mSGP.
• mSGP data is an important part, but only one part, of an educator's evaluation. These scores will be used to finalize 2013-14 evaluations and to inform educators' ongoing professional development.
• About 15% of teachers and 60% of principals/APs/VPs received 2013-14 mSGP scores.
By statute, mSGPs (like all aspects of an individual's evaluation) are confidential and should not be shared publicly.
Timeline of SGP Development in New Jersey
A thoughtful, multi-year approach to ensure data is accurate and usable
• 2010: Federal mandate for stimulus funds requires states to calculate "student growth" and link teachers to students; NJ adopts SGP methodology for calculating student growth
• 2011: Evaluation pilot advisory committee provides feedback on usefulness of SGP data; SGP training begins for districts; SGP video released
• 2012: District SGP profile reports deployed via NJ SMART; TEACHNJ Act passed, requiring growth measures for evaluation
• 2013: 2011-12 teacher median SGP (mSGP) reports provided to pilot districts for learning purposes; student SGPs provided to all districts in NJ SMART
• 2014: 2012-13 teacher mSGP reports provided to all districts for learning purposes and data preview; school SGPs used in School Performance Reports per NJ's federal ESEA waiver
• 2015: 2013-14 teacher and principal mSGP reports provided to all districts for use in evaluations; score verification process announced; 2013-14 mSGP score verification and certification process completed by districts
2013-14 mSGP Data
• The 2013-14 mSGP data counts for 30% of qualifying teachers’ and 20
or 30% of qualifying principals’ 2013-14 evaluations.
• Evaluation data of a particular employee shall be confidential in
accordance with the TEACHNJ Act and N.J.S.A. 18A:6-120.d and 121.d.
– Educator mSGP data should be handled in the secure manner one
would treat, handle, and store any part of a confidential personnel
record and should not be released to the public.
• A dry run of teacher mSGP data was conducted last year to improve roster verification processes; if educators identify a problem with a 2013-14 mSGP score, the Department is offering options for addressing the issue.
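The weighting described above can be sketched in a few lines of Python. Only the 30% mSGP weight for qualifying teachers comes from this slide; the split of the remaining 70% between teacher practice and SGO scores is an illustrative assumption, not an official figure.

```python
# Minimal sketch of combining multiple evaluation measures into one
# summative score. The 30% mSGP weight is from the slide; the 55%/15%
# practice/SGO split is an illustrative assumption only.

def summative_rating(practice, sgo, msgp, weights=(0.55, 0.15, 0.30)):
    """Weighted combination of component scores, each on a 1.0-4.0 scale."""
    w_practice, w_sgo, w_msgp = weights
    return practice * w_practice + sgo * w_sgo + msgp * w_msgp

# A teacher rated 3.2 on practice, 3.0 on SGOs, and 2.8 on converted mSGP:
score = summative_rating(3.2, 3.0, 2.8)
print(f"{score:.2f}")  # 3.05
```

Because the weights sum to 1.0, the summative score stays on the same 1.0-4.0 scale as its components.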
Evaluation Score Certification Tool
• Districts will have an opportunity to certify that all 2013-14 evaluation
data is correct or to make changes where necessary.
– The Department will release the 2013-14 Evaluation Score
Certification Tool, a new electronic application for districts to use in
certifying final 2013-14 summative ratings for all educators, in late
January.
– This interface will allow districts to review data, correct any errors
that occurred in the original NJ SMART submission, and certify the
accuracy of each staff member’s final score.
– Districts will have approximately one month to complete this process
after release of the tool.
Agenda
AchieveNJ Implementation Update
Key Findings
Implications of Transition to PARCC
2011-Present: Successes and Challenges
Successes
Substantive shifts in conversations about effective instruction and instructional leadership
Better, more frequent observations and feedback for teachers from administrators
Increased alignment in instruction, assessments, professional development and PLCs
Transformation of DOE practice from monitoring and compliance to support and accountability
Challenges
Simplifying and streamlining communication while maintaining depth to support implementation
Providing guidance and support to myriad educator specializations and unique circumstances
Timeline for availability of SGP data to districts
Shifting administrator time given importance and demands of observations and feedback
Year 1 Interim Implementation Report
Methodology: Evidence from work with about 300 LEAs, inclusive of deeper analysis of 17 Partner Districts
• Survey Data: statewide rubric reporting and compliance survey data; deeper qualitative feedback through a questionnaire
• Student Growth Objective (SGO) Data: 350 SGOs were evaluated using the SGO Quality Rating Rubric
• Teacher Practice Data: a sample provided data on 8,350 teachers who were collectively observed approximately 25,000 times
Overall Findings
• Districts have done a good job of implementing required elements.
• We want to move from meeting requirements to high quality execution.
Progression: Compliance (2013-14) → Quality (2014-15) → Ownership (2015-16)
Key Finding 1: More Observations and Feedback
The majority of school districts are completing the required number of observations.
• Before AchieveNJ: 1 observation per tenured teacher on average; approx. 90,000 observations of tenured teachers across NJ
• Year 1 of AchieveNJ: 3 observations per tenured teacher on average; approx. 270,000 observations of tenured teachers across NJ* (an increase of 180,000 observations of tenured teachers in 2013-14)
*Numbers based on estimates.
Key Finding 2: Observers Differentiate Between Lessons
Districts are differentiating between the best and worst teaching in their schools, but distributions vary between districts.
[Chart: individual observation scores by district. District A: 10th percentile 2.57, 90th percentile 3.63; District B: 10th percentile 3.00, 90th percentile 3.30.]
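The percentile comparison behind this finding can be sketched as follows. The observation-score lists here are invented sample data, not the actual district distributions from the chart.

```python
# Sketch of the 10th/90th-percentile comparison behind Key Finding 2:
# a wider spread means the district differentiates more between lessons.
# The observation scores below are invented sample data.
import statistics

def score_spread(scores):
    """Return the (10th, 90th) percentiles of a list of observation scores."""
    deciles = statistics.quantiles(scores, n=10, method="inclusive")
    return deciles[0], deciles[-1]

district_a = [1.8, 2.4, 2.6, 2.9, 3.1, 3.3, 3.5, 3.6, 3.7, 3.9]  # wide spread
district_b = [2.9, 3.0, 3.0, 3.1, 3.1, 3.2, 3.2, 3.3, 3.3, 3.4]  # narrow spread

for name, scores in (("District A", district_a), ("District B", district_b)):
    p10, p90 = score_spread(scores)
    print(f"{name}: 10th={p10:.2f}, 90th={p90:.2f}")
```

`method="inclusive"` treats the score list as the full population, which fits a district comparing all of its own recorded observations.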
Key Finding 3: Observers Differentiate within Lessons
Many observers are identifying the strengths and weaknesses of individual lessons and sharing that information with teachers.
[Table: component-level scores (1-4) from seven observations across Domain 1: Planning and Preparation (1a-1f), Domain 2: Classroom Environment (2a-2d), and Domain 3: Instruction (3a-3d), showing scores varying across components within individual lessons.]
Key Finding 4: Teachers Set Student Learning Goals (SGOs)
Districts are setting the required number of measurable and specific goals.
• Before AchieveNJ: 0 goals required per teacher; number of goals tracked and scored across NJ unknown
• Year 1 of AchieveNJ: 1-2 goals required per teacher; approx. 200,000 goals by teachers tracked and scored across NJ* (200,000 learning goals set, tracked, and scored in 2013-14)
*Numbers based on estimates.
Key Finding 5: Use of Data to Set SGOs
Nearly all (98.5%) sample SGOs included some form of baseline data to inform the goals teachers set for their students.

Student ID | Prior Year Final Grade | Current Year Test Scores (Math Average) | Participates in Class | Completes Homework | Markers of Future Success | Preparedness Group Number
1 | 86 | 98.5 | Yes | No  | 1 | 1
2 | 73 | 92.5 | Yes | Yes | 2 | 1
3 | 96 | 95   | Yes | Yes | 2 | 1
4 | 92 | 85.5 | Yes | No  | 1 | 1
5 | 67 | 54   | No  | No  | 0 | 3
6 | 69 | 58   | No  | No  | 0 | 3
7 | 78 | 72.5 | Yes | No  | 1 | 2
8 | 94 | 80.5 | No  | No  | 0 | 2
Key Finding 6: SGO Alignment and Quality Vary
The alignment of SGOs to New Jersey content standards was inconsistent across districts, as was the quality of assessments used.
• K-8 District 1, assessments used: MAP Assessment; DRA 2 Assessment; Common Writing Rubric
• K-8 District 2, assessments used: teacher-created tests; Everyday Math; Model Curriculum Unit tests
Key Finding 7: Compliance with DEACs and ScIPs
• 99% of districts across the state report having DEACs and ScIPs in place.
• DEACs: 60% of partner districts report that they used the group to "analyze implementation successes and challenges to recommend improvements."
• ScIPs: 20% of partner districts said the ScIP was highly functioning and leading implementation.
Focus for Districts and State's Response
Focus for Districts
• Observations: Ensure that all teachers are getting the required number of observations; continue to improve accuracy and quality of feedback
• SGOs: Align goals and scoring plans; use data to set better targets; improve assessment quality
• DEACs/ScIPs: Provide data and information to members to help the group inform district and school policies; develop targeted support for school staff
State's Response
• Observations: Streamlined processes (reg. changes/waivers); "Syllabus for Success"; 40+ teacher practice workshops; Achievement Coaches Program
• SGOs: SGO 2.0 workshop and Guidebook; streamlined forms; more examples; reg. changes
• DEACs/ScIPs: New website pages; ScIP 1.0 guidance; ScIP training workshops
Agenda
AchieveNJ Implementation Update
Key Findings
Implications of Transition to PARCC
Increasing Student Achievement: An Aligned Approach
With fewer, clearer, and more rigorous standards (Common Core), aligned assessments providing timely, accurate data (PARCC), and an evaluation system that emphasizes feedback and support (AchieveNJ), we impact teachers and leaders to increase student achievement.
[Graphic: Effective Teaching and Instructional Leadership driving Student Achievement.]
Implementation Timeline: Common Core, State Assessments, and Student Growth Data
• 2010-11: CCSS curriculum alignment begins (K-2 math)
• 2011-12: CCSS curriculum alignment continues (K-12 ELA, additional math); CCSS-aligned questions piloted in NJ ASK
• 2012-13: All curriculum aligned to CCSS; NJ ASK aligned to CCSS in ELA (3-8) and Math (3-5); 2011-12 median Student Growth Percentiles (mSGPs) released to pilot districts
• 2013-14: NJ ASK completely aligned to CCSS; PARCC piloted in classrooms across 1,276 schools; 2012-13 mSGPs released to all districts as practice exercise
• 2014-15: Full PARCC implementation; 2013-14 mSGP data released
SGP and the PARCC Transition
Multi-Year Preparation
• CCSS adopted 4 years ago
• NJ ASK incrementally aligned to CCSS in content and rigor
• 2014 NJ ASK scores show little change compared to prior years despite increased rigor
• 81% of schools that will use PARCC engaged in pilot testing technology
Growth, Not Proficiency
• To calculate SGP, student growth is compared with the growth of academic peers taking the same assessments statewide
• SGP does not depend on tests having a consistent scale and is not a criterion-referenced metric
Multiple Measures, Lower Stakes
• mSGP is one of multiple measures for educators in tested grades and subjects; others include educator practice, SGOs, and, for principals, additional goals and evaluation leadership
• mSGP weight reduced to 10% for all educators for 2014-15 to recognize adjustment to the new assessment
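As a concrete illustration of the statistic at the center of this transition: an educator's mSGP is the median of his or her students' Student Growth Percentiles (each on a 1-99 scale). The SGP values below are invented for illustration; real SGPs come from New Jersey's statewide growth model.

```python
# Sketch of the mSGP computation: the median of a teacher's students'
# Student Growth Percentiles (1-99). The SGP values below are invented.
from statistics import median

student_sgps = [34, 51, 48, 62, 70, 45, 55, 58, 41, 66]
msgp = median(student_sgps)
print(msgp)  # 53.0 -- the middle of the class's growth percentiles
```

Using the median rather than the mean keeps a single outlying student score from dominating the result, which fits the "multiple measures, lower stakes" framing above.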
Questions and Follow Up
Peter Shulman
Assistant Commissioner/Chief Talent Officer,
Division of Teacher and Leader Effectiveness
Carl Blanchard
Interim Director, Office of Evaluation
www.nj.gov/education/AchieveNJ
educatorevaluation@doe.state.nj.us
609-777-3788