The New MA Educator Evaluation Framework:
District-Determined Measures and Student and Staff Feedback
ASE June Statewide Conference
June 10, 2013
Ron Noble
Educator Evaluation Project Co-Lead
Agenda
Setting the Stage
2012-2013 Implementation
Lessons Learned
On the Horizon
District-Determined Measures
Student and Staff Feedback
Q&A
Setting the Stage
When policy and practice must move faster than
research and development, where do you begin?
ESE Philosophy:
 Don’t let perfection be the enemy of the good: the work is too important to delay.
 Understand this is just the beginning: we will be able to do this work with increasing sophistication each year.
 Phased-in implementation: take advantage of emerging research, resources, and feedback from the field.
Questions for Policy Makers

• Attribution: “When crediting teachers for student learning, how should the individual contributions of teachers acting in a co-teaching or consultant role be determined?” (Are variations in contributions measurable?)
• Assessments: “How can the contributions to student achievement be accurately measured for teachers instructing special populations for which alternative standards and/or assessments are used?” (How should we use the MCAS Alternate Assessment?)
• Educator differentiation: “Are the key features of teacher effectiveness for specialized personnel, such as special education teachers, different… and should those unique features lead to additional or different content on observation protocols, student growth assessments, or alternative instruments?” (How do we differentiate without creating “two systems”?)
• Evaluator training: “When rating special education teachers…using an observation protocol or alternative instrument, what special training, if any, do evaluators need?” (Who IS the evaluator?)

Holdheide, L.R., Goe, L., & Reschly, D.J. (2010). Challenges in Evaluating Special Education Teachers and English Language Learner Specialists. National Comprehensive Center for Teacher Quality.
Implementation Timeline
June 2011
MA Board of Education passed new educator evaluation regulations
September 2011
Implementation began in 34 “Level 4” schools, 11 “Early Adopter”
districts and 4 Special Education Collaboratives
January 2012
MA Department of Elementary and Secondary Education (ESE)
published the MA Model System for Educator Evaluation
September 2012
RTTT districts began implementation with at least 50% of educators.
September 2013
•RTTT districts begin implementation with remaining educators.
•Non-RTTT districts begin implementation with at least 50% of
educators.
2013-2014
school year
•All districts pilot District-Determined Measures.
•Selected districts pilot student and staff surveys
2014-2015
school year
•All districts implement District-Determined Measures.
•All districts implement student and staff surveys
June 2016
•Districts determine Student Impact Ratings for all educators.
2012-2013 Implementation
234 Race to the Top Districts
At least 50% of educators
Summative Performance Rating only
5-Step Evaluation Cycle
June data reporting (EPIMS)
6 data elements:
1. Rating on Standard I
2. Rating on Standard II
3. Rating on Standard III
4. Rating on Standard IV
5. Overall Summative Performance Rating
6. Professional Teacher Status (Y/N)
5-Step Evaluation Cycle
 Every educator is an
active participant in
an evaluation
 Process promotes
collaboration and
continuous learning
 Process applies to all
educators
Summative Rating
Summative Performance Rating → Plan:
 Exemplary: Self-Directed Growth Plan
 Proficient: Self-Directed Growth Plan
 Needs Improvement: Directed Growth Plan
 Unsatisfactory: Improvement Plan
Educator Evaluation Spring Convening:
Connecting Policy, Practice, and Practitioners
May 29, 2013
Over 700 participants from district teams
(RTTT and non-RTTT) and educator
preparation programs
Key messages:
Integrate with other key district initiatives
Opportunity to strengthen labor-management
relations
Although difficult, it’s the right work
On the Horizon
District-Determined Measures
Student and Staff Feedback
District-Determined Measures:
Key Terms
Student Impact Rating – a rating of high,
moderate, or low for an educator’s impact on
student learning
District-Determined Measures – measures of
student learning, growth, and achievement
that will inform an educator’s Student Impact
Rating
Student Impact Rating
Regulations
 Evaluators must assign a rating based on trends (at least 2 years) and patterns (at least 2 measures); an illustrative sketch of this rule appears after this list.
 Options – 603 CMR 35.07(1)(a)(3-5)
 Statewide growth measure(s)*
 District-Determined Measure(s) of student learning comparable across grade or subject district-wide.
 For educators whose primary role is not as a classroom
teacher, the appropriate measures of the educator's
contribution to student learning, growth, and
achievement set by the district.
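To make the trends-and-patterns rule concrete, here is a minimal sketch of one way a district might combine results, assuming each measure already yields a low/moderate/high result for each year. The function name, the majority-vote combination, and the tie-break rule are illustrative assumptions, not ESE’s prescribed method.

```python
# Illustrative sketch only; ESE does not prescribe this combination logic.
# Assumes each measure already yields a "low" / "moderate" / "high"
# result for each year (how that happens is district-determined).
from collections import Counter

def student_impact_rating(results_by_measure):
    """results_by_measure: dict mapping measure name -> list of yearly
    results, each "low", "moderate", or "high".

    Enforces the regulatory minimums (at least 2 measures, at least
    2 years per measure), then combines results by simple majority,
    a hypothetical rule chosen only for illustration."""
    if len(results_by_measure) < 2:
        raise ValueError("Need a pattern: at least 2 measures")
    for measure, yearly in results_by_measure.items():
        if len(yearly) < 2:
            raise ValueError(f"Need a trend: at least 2 years for {measure}")

    counts = Counter(r for yearly in results_by_measure.values() for r in yearly)
    # Break ties toward the middle rating (an arbitrary illustrative choice).
    for rating in ("moderate", "high", "low"):
        if counts[rating] == max(counts.values()):
            return rating

# Example: two measures, two years each -> "moderate"
print(student_impact_rating({
    "district writing DDM": ["moderate", "high"],
    "MCAS student growth percentile": ["low", "moderate"],
}))
```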
Summative Rating
Two Ratings: Summative Performance Rating (rows) by Rating of Impact on Student Learning (columns: Low, Moderate, High)
 Exemplary or Proficient: 1-yr Self-Directed Growth Plan (Low impact); 2-yr Self-Directed Growth Plan (Moderate or High impact)
 Needs Improvement: Directed Growth Plan (any impact rating)
 Unsatisfactory: Improvement Plan (any impact rating)
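Read as a lookup, the matrix maps the two ratings to a plan type. The sketch below is only an illustrative rendering of the table above; the function name and strings are invented for the example and this is not an ESE tool.

```python
# Illustrative lookup mirroring the matrix above; names are invented.
def educator_plan(performance_rating: str, impact_rating: str) -> str:
    """Map a Summative Performance Rating and a Rating of Impact on
    Student Learning to the plan type shown in the matrix."""
    if performance_rating in ("Exemplary", "Proficient"):
        # Low impact -> 1-year plan; Moderate or High impact -> 2-year plan.
        return ("1-yr Self-Directed Growth Plan" if impact_rating == "Low"
                else "2-yr Self-Directed Growth Plan")
    if performance_rating == "Needs Improvement":
        return "Directed Growth Plan"
    if performance_rating == "Unsatisfactory":
        return "Improvement Plan"
    raise ValueError(f"Unknown performance rating: {performance_rating}")

print(educator_plan("Proficient", "High"))  # 2-yr Self-Directed Growth Plan
print(educator_plan("Exemplary", "Low"))    # 1-yr Self-Directed Growth Plan
```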
Student Impact Rating
Regulations
Why focus on growth?
Level playing field
Fairness
Achievement measures may be acceptable
when the district judges them to be the most
appropriate/feasible measure for certain
educators
Revised Implementation
Timeline
 Commissioner’s Memo - 4/12/13
 2013-2014 – districts pilot and identify DDMs
 2014-2015 – districts implement DDMs and collect the
first year of trend data
 2015-2016 – districts collect the second year of trend
data and issue Student Impact Ratings for all educators
 Districts positioned to accelerate the timeline should
proceed as planned.
 Guidance and resources to support districts with the
identification of DDMs are available here:
http://www.doe.mass.edu/edeval/ddm/
Revised Implementation
Timeline
 Minimum Piloting Requirements
 Early grade (K-3) literacy
 Early grade (K-3) math
 Middle grade (5-8) math
 High school writing to text
 Traditionally non-tested grades and subjects (e.g., fine arts,
music, physical education)
 If a district is unable to identify a DDM in the grades
and subjects listed above, the district must pilot one of
ESE’s exemplar DDMs to be released in summer 2013.
Recommended Steps for Districts
Identify a team of administrators, teachers and
specialists to focus and plan the district’s work on
District-Determined Measures.
Complete an inventory of existing assessments used in
the district’s schools.
Identify and coordinate with partners that have capacity to assist in the work of identifying and evaluating assessments that may serve as District-Determined Measures.
Quick Reference Guide: District-Determined
Measures
ESE Supports
WestEd is supporting ESE with next steps in
implementing the Commonwealth’s Model System
for Educator Evaluation
Two broad categories of work
Support development of anchor standards in almost 100
separate grades/subjects or courses
Identification and evaluation of promising measures,
tools, tests, rubrics
Work to be completed by mid-August
ESE Supports
 Supplemental guidance on the selection of DDMs
and the process of determining an Impact Rating
 DDM and Assessment Literacy Webinar Series (March –
December)
 Technical Guide A (released in May 2013) focuses on
selecting high quality assessments
 Includes Assessment Quality Checklist and Tracking Tool
 Technical Guide B (expected in August 2013) will focus
on measuring growth.
ESE Supports
Assessment Quality Checklist Tool (screenshot; structure summarized below)

General Information: grade and subject or course; potential DDM name; potential DDM source; type of assessment; item types

Step #1: Evaluate Content Alignment
 Alignment: alignment to curriculum
 Rigor: alignment to intended rigor
 Total score and % of possible score; describe the process used to determine ratings

Step #2: Evaluate Remaining Evidence of Assessment Quality
 Utility & Feasibility: utility; feasibility
 Assessment Components: table of test specifications; administration protocol; instrument; scoring method; technical documentation
 Reliability: reliability evidence collection approach; reliability evidence quality
 Validity: validity evidence collection approach; validity evidence quality
 Non-Bias: gathered evidence of non-bias
 Item Quality: range of item difficulties; positively discriminating items; no floor/ceiling effects
 Total score and % of possible score; describe the process used to determine ratings
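The checklist rolls individual criterion ratings up into a total score and a percent of the possible score. A minimal sketch of that arithmetic follows, assuming a 0–2 rating per criterion; the scale, the criterion names shown, and the function are illustrative assumptions, not the tool’s actual specification.

```python
# Illustrative roll-up of checklist ratings; the 0-2 scale is an assumption,
# and the real tool defines its own rating scale and criteria.
def checklist_summary(ratings, max_per_item=2):
    """ratings: dict of criterion -> numeric rating.
    Returns (total score, percent of possible score)."""
    total = sum(ratings.values())
    possible = max_per_item * len(ratings)
    return total, round(100 * total / possible) if possible else 0

step1 = {
    "Alignment to Curriculum": 2,
    "Alignment to Intended Rigor": 1,
}
total, pct = checklist_summary(step1)
print(f"Step #1 total: {total}, {pct}% of possible score")  # total 3, 75%
```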
DDMs: Request for Feedback
 Attribution: How can ESE best support districts in developing attribution policies
related to the determination of Student Impact Ratings, particularly for
co-teachers, consulting teachers, and other scenarios where more than one teacher contributes to student learning, growth, and achievement?
 Movement of Students: Due to highly specialized and often changing needs,
the population of children identified as needing special education services
fluctuates annually, sometimes significantly, and mostly in the
elementary grades. This fluctuation means students move in and out of special
education classes and may not receive special education instruction for an entire
year. How should ESE recommend districts take student movement into account
when determining special educators’ Student Impact Ratings?
 Selecting Assessments: What are some considerations ESE should be aware of when providing guidance on the selection of measures of student growth to be used in the determination of special educators’ Student Impact Ratings? Please include specific examples of measures that would or would not be appropriate and why.
Student and Staff Feedback
 Revised Implementation Timeline: Beginning
in the 2014-2015 school year, districts will
include student feedback in the evaluation of
all educators and staff feedback in the
evaluation of all administrators.
During the 2013-2014 school year, ESE will
work with districts to pilot/field test model
survey instruments.
Multiple sources of evidence
inform the summative rating
National Overview
A growing number of states are currently
using or preparing to use student surveys in
educator evaluations:
Alaska, Arizona, Colorado, Delaware, Georgia, Hawaii, Idaho, Kentucky, Maine, Massachusetts, Michigan, Mississippi, Missouri, New Jersey, New York, North Carolina, Rhode Island, Washington
Why Use Student Surveys in
Educator Evaluations?
 Perception surveys round out a multiple measure evaluation
system
 Research also finds student surveys are correlated with student
achievement
 The Measures of Effective Teaching Project found students’ perceptions are
reliable, stable, valid, and predictive
 Surveys may be the best gauge of student engagement
 When asked which measures are good or excellent at assessing teacher effectiveness, teachers reported:
 District standardized tests (56 percent)
 Principal feedback (71 percent)
 Students’ level of engagement (92 percent)
What Students Say…
 MA’s State Student Advisory Council and six regional student
advisory councils provide a unique feedback loop for students
 MA Student Advisory Council focus groups were overwhelmingly positive about soliciting student input through surveys
 MA students want to help teachers improve
 MA students are excited about the prospect of being surveyed for this
purpose
 MA students offered thoughtful precautions about survey use:
 Use surveys for teacher goal-setting
 Consider making survey feedback visible only to teachers
 Provide third-party screening of any open-ended questions
Surveys as a Form of Feedback
 Benefits of Surveys of
Classroom/School
Experiences
 Offers valuable insight from
those with first-hand
experience
 Empowers and engages
survey recipients, sending a
signal that their input is
valued
 Comparatively inexpensive
 Considerations When
Using Surveys of
Classroom/School
Experiences
 Students may lack cognitive
ability or maturity
 Could become a popularity contest or “rate-your-teacher.com”
 Survey results could be
misused by evaluators
National Perspective –
Lessons Learned
 The more immediate the feedback, the better
 The more flexibility teachers have to administer surveys when they wish, the better
 Surveys for early grades and special populations require special
attention
 If surveys are used for high-stakes decisions at all, that should happen only after they have been used effectively and reliably in a low-stakes setting and educators have grown comfortable with them
 When used for formative purposes, surveys are generally seen as
a good thing
Perspectives & Considerations
Key areas for state or district consideration:
 1. Determining survey samples
 2. Timing of survey administration
 3. Reporting of survey results
 4. Using survey results in evaluations
 5. Considerations for pre-readers, special education, and
English Learners
Student and Staff Feedback:
Request for Feedback
 Source of Evidence: In what way or ways should ESE recommend student and
staff feedback be used as a source of additional evidence relevant to one or more
Performance Standards?
 Accommodations: What types of arrangements are most appropriate for the
special populations, i.e., pre-readers, students with limited English proficiency, and
students with disabilities, so that their feedback can be taken into account as
well?
 Data Collection Tools: In addition to perception surveys, what other types of
data collection tools for capturing student feedback should ESE recommend and
for what populations would these tools be most useful?
Additional Questions?
Ron Noble –
rnoble@doe.mass.edu or 781.338.3243