A Model For Change
William A. Bussey
Superintendent
Mid-East Career and Technology Centers

Implementing OTES through the Ohio
Appalachian Collaborative (OAC) and the
Teacher Incentive Fund (TIF)

All teachers will be evaluated using the OTES
Model's Teacher Performance Standards

All administrators are credentialed evaluators

All teachers will develop and monitor one Student
Learning Objective



Review the processes, results, and challenges with both the
performance rating and student growth measures
Reflect on the good, the bad, and the ugly
Review any changes as we transition to the "authentic"
evaluation tool for teachers this coming school year
I have brought with me the experts!
◦ Barb Funk
◦ Dan Coffman
◦ Scott Sabino
◦ Michelle Patrick

OTES Team created in the Spring of 2012
◦ Executive Director
◦ Building Directors
◦ New Asst. Director (role: evaluation)
◦ Asst. Director (role: Curriculum and OAC/TIF)
◦ Teachers

Introduced the OTES framework to all
leadership teams
◦ District Transformation Team
◦ Strategic Compensation Team
◦ OTES Team
◦ Formative Instructional Practices Team
◦ Assessment/Literacy Tech Team
◦ HSTW Site Coordinators

Train key people within the district

Success starts with a solid foundation

Teacher-led PD would increase staff buy-in
◦ Ensure teachers understood:
 Framework of the new evaluation system
 New components
 New tools
 New process
◦ Process would increase teacher/admin collaboration time

Initial OTES Staff training
◦ Teacher Performance – OTES Model
 7 Teaching Standards
 Placed in the performance rubric
 Teacher Evidence
 Calibration of a standard to a teaching video
 Completed self-assessment
◦ Student Growth – Student Learning Objectives
o November PD
• Mini Sessions on Formative Instructional Practices; Literacy w/ CC Anchor Standards in Reading and Writing; BFK and Value Added; Teacher Evidence
o February PD
• State training manual on Student Growth Measures
o Quarterly Meetings used as check points
o Conversations with evaluator

Pre-Pre Observation Conference

Self-Assessment Tool (Optional in the process, but discovered to be
a MUST DO!)

Professional Growth Plan (See example)
◦ Met with teachers (last two weeks of October) to review the process timeline
and elements (paperwork and documentation) and answer questions.
◦ Follow-up to In-service Day.
◦ Teacher Performance Goal (Goal 2)- Influenced by Self-Assessment Tool
and Ohio Teaching Standards
◦ Self-directed
◦ Statement of the goal and how it will be measured
◦ Student Achievement Goal (Goal 1)- Influenced by SLO (Use standard from
pre-assessment used to develop SLO)
◦ Specific standard and how it will be measured



Teacher Performance Evaluation Rubric (TPER)- (See
example)
◦ Evaluation begins with Proficient (“Rock solid teaching”)
◦ Video examples- see NIET Training Modules
◦ Examine key words and phrases embedded in rubric at each
level (Proficient, Accomplished and Developing)
Pre-Conference
◦ Teachers complete/submit responses using ohiotpes.com
◦ Face-to-face
Observation
◦ 30 minutes
◦ Script the lesson!

Post-Conference
◦ Follows each observation
◦ Specific questions relating to the lesson and the instructor
◦ Relate to TPER
◦ Area for Reinforcement/Refinement
◦ Shared with the teacher
◦ Opportunity for feedback
◦ Used paper form and ohiotpes.com

 Classroom Walkthroughs (2-5 per teacher)
◦ The more often, the better!
◦ Set schedule early and put it on the teachers. It's the
teachers' evaluation and their responsibility to provide
evidence/documentation relating to the TPER.



 Round 1
◦ Pre-Conference
◦ Observation
◦ Walkthrough(s)
◦ Post-Conference
 Round 2
◦ Pre-Conference
◦ Observation
◦ Walkthrough(s)
◦ Post-Conference
 Summative Performance Rating Conference

Use measures of student growth effectively
in a high-quality evaluation system




 Make informed decisions on the right measures
 Make informed decisions about the appropriate
weight of measures
 Increase reliability and validity of selected
measures

Teachers completed the development, implementation and scoring
process

SLO timeline with specific due dates, calendar with expectations

Teachers created their SLO and chose their growth target

Implemented the SLO

Calculated the results

Three main types of targets used
◦ Whole group
◦ Tiered/grouped targets
◦ Individual Targets

Whole group target-one target for all students in SLO
◦ All students will score a 75% or better on the post assessment

Tiered/grouped target-range of targets for groups of
students
◦ Students with pre-assessment scores between 0-25 would be
expected to score between 25-50 on the post-assessment

Individual target-each student in the SLO receives a
target score
◦ Using a formula such as (100 – pretest)/2 + the pretest = growth
target
Teacher Name: Formula Method
School: Mid East CTC - Zanesville Campus
SLO:
Assessment Name:

Student | Baseline | Growth Target | Final Score | Met Target
   1    |    28    |      64       |     80      |    Yes
   2    |    20    |      60       |     48      |    No
   3    |    44    |      72       |     76      |    Yes
   4    |    28    |      64       |     76      |    Yes
   5    |    12    |      56       |             |
   6    |    48    |      74       |     84      |    Yes
   7    |    20    |      60       |     44      |    No
   8    |    28    |      64       |     52      |    No
   9    |    40    |      70       |     88      |    Yes
  10    |    32    |      66       |     84      |    Yes
  11    |    28    |      64       |     60      |    No

6 of the 10 scored students met/exceeded their growth target (60%)
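The formula method above can be sketched in a few lines (Python; the function name is illustrative, not part of any OTES tooling). Each student's growth target is the baseline plus half the remaining gap to 100, and a student meets the target when the final score is at or above it.

```python
# Growth target per the slide's formula: (100 - pretest)/2 + pretest,
# i.e. the student is expected to close half the gap to 100.
def growth_target(pretest):
    return (100 - pretest) / 2 + pretest

# (baseline, final) pairs for the ten scored students in the example table
scores = [(28, 80), (20, 48), (44, 76), (28, 76), (48, 84),
          (20, 44), (28, 52), (40, 88), (32, 84), (28, 60)]

met = sum(final >= growth_target(base) for base, final in scores)
print(f"{met} of {len(scores)} students met/exceeded their growth target")
# -> 6 of 10 students met/exceeded their growth target
```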
Descriptive Rating  | Percentage Exceed/Met | Numerical Rating | Tiered | Formula | Whole Group
Most Effective      | 90-100%      | 5 | 10% | 20% | 10%
Above Average       | 80-89%       | 4 | 30% | 20% | 36%
Average             | 70-79%       | 3 | 20% |  0  |  9%
Approaching Average | 60-69%       | 2 | 20% |  6% |  9%
Least Effective     | 59% or below | 1 | 20% | 54% | 36%
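Under those percentage bands, the 60% met/exceeded result from the example SLO maps to a numerical rating of 2. A minimal sketch (Python; the function name is hypothetical):

```python
# Map the percent of students meeting/exceeding targets to the 1-5 bands above.
def slo_rating(pct_met):
    if pct_met >= 90:
        return 5  # Most Effective
    if pct_met >= 80:
        return 4  # Above Average
    if pct_met >= 70:
        return 3  # Average
    if pct_met >= 60:
        return 2  # Approaching Average
    return 1      # Least Effective

print(slo_rating(60))  # -> 2
```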
Summative rating matrix (Teacher Performance x Student Growth):

Student Growth     | Accomplished Rating | Proficient Rating | Developing Rating | Ineffective Rating
Above (5)          | Accomplished        | Accomplished      | Proficient        | Developing
Expected (2, 3, 4) | Proficient          | Proficient        | Developing        | Developing
Below (1)          | Developing          | Developing        | Ineffective       | Ineffective
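The summative lookup can be sketched as a nested table (Python; a simplified illustration of the matrix above, not eTPES code):

```python
# Final summative rating = MATRIX[student growth tier][teacher performance rating]
MATRIX = {
    "Above (5)":      {"Accomplished": "Accomplished", "Proficient": "Accomplished",
                       "Developing": "Proficient",     "Ineffective": "Developing"},
    "Expected (2-4)": {"Accomplished": "Proficient",   "Proficient": "Proficient",
                       "Developing": "Developing",     "Ineffective": "Developing"},
    "Below (1)":      {"Accomplished": "Developing",   "Proficient": "Developing",
                       "Developing": "Ineffective",    "Ineffective": "Ineffective"},
}

print(MATRIX["Above (5)"]["Proficient"])  # -> Accomplished
```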
Fall 2011
 469 students were tested
 81% (379) earned a Bronze or better
Intervention provided through KeyTrain Online
Spring 2012
 90 students were tested
 71% (64) earned a Bronze or better

2011-2012 (Level I) results: Bronze 40%, Silver 50%, Gold 5%, Not yet 5%
2012-2013 (Level II) results: Bronze 27%, Silver 64%, Gold 7%, Not yet 2%

 14 Programs Bronze, 11 Programs Silver, 1 Program Gold
 70% Met or Exceeded Occupational Profile!

Fall 2012
 467 students were tested
 86% of students earned a Bronze or better
Intervention provided through KeyTrain Online
Spring 2013
 60 students were tested
 55% of students earned a Bronze or better

2012-2013 (Level I) results: Bronze 37%, Silver 54%, Gold 4%, Not yet 5%
2013-2014 (Level II)
Content Area | 2012-2013 | 2011-2012 | 2010-2011
English      |   151     |   150     |   146
Math         |   143     |   143.25  |   141.75
Science      |   147.5   |   145     |   143.50
Zanesville Campus

Subject     | College Readiness Benchmark | 2013 | 2012  | 2011
Biology     |            156              | 149  | 145.2 | 142.5
Chemistry   |            157              | 147  | 144.9 | 144
Algebra I   |            152              | 142  | 142.1 | 142.9
Geometry    |            152              | 144  | 142.9 | 140.3
Algebra II  |            149              | 143  | 144.5 | 142.4
PreCalculus |            145              | 143  | 144.5 | 141.2
English 10  |            147              | 151  | 147.8 | 144.8
English 11  |            152              | 149  | 150.2 | 147.3
Physics     |            150              |      |       |
US History  |            150              |      |       |
English 9   |            154              |      |       |
English 12  |            153              |      |       |
Buffalo Campus

Subject     | College Readiness Benchmark | 2013 | 2012  | 2011
Biology     |            156              | 147  | 145.4 | 144.2
Chemistry   |            157              |      | 145.4 | 143.5
Algebra I   |            152              | 141  | 142.6 | 142.4
Geometry    |            152              | 144  | 143.2 | 141.6
Algebra II  |            149              | 144  | 143.9 | 142.8
PreCalculus |            145              |      |       |
English 10  |            147              | 154  | 150.9 | 144.4
English 11  |            152              | 151  | 151.9 | 146.3
Physics     |            150              |      |       |
US History  |            150              |      |       |
English 9   |            154              |      |       |
English 12  |            153              |      |       |
Teacher Category       | Value Added | Vendor Assessments | LEA Measures: SLO/Other | LEA Measures: Shared Attribution | Total
A (Value Added)        |     30%     |                    |           10%           |               10%                |  50%
B (Vendor Assessments) |             |        30%         |           10%           |               10%                |  50%
C (LEA Measures)       |             |                    |           40%           |               10%                |  50%
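As a sketch of how a Category A teacher's growth measures combine (Python; the rating values are made-up illustrative numbers on the 1-5 scale used for SLO ratings, and the weights come from the category table):

```python
# Category A teacher: value-added 30%, SLO/other 10%, shared attribution 10%
# (student growth is 50% of the overall evaluation).
weights = {"value_added": 0.30, "slo_other": 0.10, "shared_attribution": 0.10}
ratings = {"value_added": 4, "slo_other": 3, "shared_attribution": 5}  # illustrative

total_weight = sum(weights.values())  # 0.50 of the overall evaluation
growth_score = sum(weights[m] * ratings[m] for m in weights) / total_weight
print(round(growth_score, 2))  # -> 4.0
```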

 Data Analysis and Assessment Literacy
 Data Analysis and Setting Growth Targets
◦ Data-driven decisions – what data?
◦ Engage in transparent conversations around
student growth
 Outliers, class size, averages, ranges
 Identify trends, patterns, expectations for mastery
 Gather other available data
 Zoom in and identify similarities and differences in
students
Build staff Assessment Literacy

Priority Standards

Appropriate assessments

Quality Assessment Design

Assessment Blueprint reflects instruction

Instructional Tasks move students to meet
standards



Conversations and Collaboration
Greater focus on Student Learning
Deeper reflection on the:
◦ teaching and learning process
◦ accountability of growth for all students
◦ role student growth plays in determining Educator
Effectiveness

Policy developed utilizing the template
provided by ODE
◦ Simple to allow for flexibility
◦ Change as we negotiate our contract

Further development by the district SLO
Evaluation Committee for SLO guidelines
• What made us breathe a sigh of relief when it was
over
• What went well/positive elements
• Yes, some things were quite positive!
• Suggestions/ways to use our internal feedback
and insight to feed forward

Roller Coaster of Emotions

WHEW that took some time!
◦ 5 hours per teacher per observation? *gulp*
◦ What do I give up?
◦ Walkthroughs?

Technology
◦ eTPES downfalls
 Other evaluations?
 Walkthrough data?
◦ Support for all levels of learners

Roller Coaster of Emotions

Process was overall positive
◦ From Self-Assessments → Reflection
◦ Consensus: “It’s not so bad”!!

Technology based

Trial year! WOOHOOO

Focused purpose and common dialogue

Collaboration
◦ Holistic
◦ Rubric
◦ Criteria not a checklist
◦ Administrators with associate school principals
◦ Administrators with administrators
◦ Administrators with teachers
◦ Teachers with teachers
◦ Utopia!


 Self-Assessment – everyone
Walkthrough data – form that collects data we need?
◦ Experimenting with Google Forms?
◦ Use to see trends

Non-instructional staff evaluations
◦ OSCES, ASCA, and OTES
◦ Input from staff

Opportunities for more alignment
◦ for professionals to align all goals: IPDP, OTES/OPES, Resident
Educator
◦ to look for trends and align with PD
◦ to group professionals with aligned goals as they work together to
improve their practice
◦ to align ourselves as evaluators – do we truly calibrate? Can we better
align with each other by discussing our ratings and why?