Measuring Educator Effectiveness
Tom Corbett, Governor ▪ Carolyn C. Dumaresq, Acting Secretary of Education
www.education.state.pa.us
Educator Effectiveness: September 2014
June 26, 2014

Teacher Effectiveness System in Act 82 of 2012
• Teacher Observation/Practice – 50%
  – Planning and Preparation
  – Classroom Environment
  – Instruction
  – Professional Responsibilities
• Building Level Data/School Performance Profile – 15%
  – Indicators of Academic Achievement
  – Indicators of Closing the Achievement Gap, All Students
  – Indicators of Closing the Achievement Gap, Historically Underperforming Students
  – Indicators of Academic Growth/PVAAS
  – Extra Credit for Advanced Achievement
• Teacher Specific Data – 15%
  – Student Performance on Assessments
  – PVAAS 3-Year Rolling Average
  – IEP Goals Progress*
  – LEA Developed Rubrics*
• Elective Data* – 20%
  – District Designed Measures and Examinations
  – Nationally Recognized Standardized Tests
  – Industry Certification Examinations
  – Student Projects Pursuant to Local Requirements
  – Student Portfolios Pursuant to Local Requirements
*Student Learning Objective Process

Classroom Teachers
• Provide direct instruction to students – Plan, Instruct, and Assess
• Use Rating Form 82-1: http://www.portal.state.pa.us/portal/server.pt/community/educator_effectiveness_project/20903/p/1173845
• At the local level, districts must categorize professionals under Teaching Professionals, Non-Teaching Professionals, or Principal Effectiveness
• Example: Guidance Counselors

Transfers from One Building to Another
• If a teacher transfers from one building to another within an LEA, the teacher has the option of using teacher specific data in place of the Building Level Data (SPP) for two years, starting on the date the teacher begins the new assignment.
• A teacher who elects this option shall sign a statement of agreement giving the LEA permission to calculate the final rating in this manner.

Temporary Professional Employee
• There must be 4 months between observations.
• The Final Rating Form (82-1) must be completed twice a year; both final rating forms must include Teacher Specific Data and Elective Data.

Teacher Specific Data
• Since the release of the Educator Effectiveness FAQ, many LEAs have asked how to meet the legislative intent of the data defined as teacher specific data. To support districts, PDE worked with a diverse group of educators representing special education and general education to develop policy recommendations and guidance documents/resources on how LEAs can meet the legislative intent of Act 82 while keeping the process feasible for districts.
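To make the arithmetic behind the Act 82 teacher effectiveness pie concrete before the Teacher Specific Data detail that follows, here is a minimal sketch of how the four slices shown above could combine into a single rating. It is not PDE's official worksheet: it assumes each component has already been converted to the 0-3 scale used on Rating Form 82-1, and the example component scores are invented for illustration; actual conversions come from the tables in the rules and regulations.

```python
# Minimal sketch (not PDE's official calculation): combining the four Act 82
# components for a classroom teacher using the statutory weights shown above.
# All component scores are assumed to already be on the 0-3 scale.

WEIGHTS = {
    "observation_practice": 0.50,   # Teacher Observation/Practice
    "building_level_data": 0.15,    # School Performance Profile (SPP)
    "teacher_specific_data": 0.15,  # PVAAS, achievement, IEP progress, LDR
    "elective_data": 0.20,          # SLO-based elective measures
}

def final_rating(scores):
    """Weighted combination of 0-3 component scores into one 0-3 score."""
    assert set(scores) == set(WEIGHTS), "every component needs a score"
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Example with made-up component scores (each on the 0-3 scale):
example = {
    "observation_practice": 2.6,
    "building_level_data": 2.0,
    "teacher_specific_data": 2.4,
    "elective_data": 3.0,
}
print(round(final_rating(example), 2))  # 2.56
```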
Teacher Specific Data
• Implementation of Teacher Specific Data should be decided through a conversation with your district's solicitor, bargaining unit, teachers, and administrators.
• PDE legal will not discuss specifics beyond what is stated in Act 82 for this section of the pie, because implementation of this piece is to be determined at the local level.

Teacher Specific Data
• 15% total – data must be included if it is available and applicable
1. Student Performance (Advanced and Proficient) on state assessments – not more than 5%
   • Conversion in Table H of the rules and regulations
2. PVAAS – at least 10%
3. Progress on meeting IEP goals – not more than 5%
4. Locally developed rubrics (LDR) – not more than 15% when none of the other measures are available, using the SLO process; for teachers with PVAAS data available, not more than 5%

SLO Process for IEP Progress
• The SLO process for IEP progress is a simple, streamlined SLO process to account for IEP progress. Per the Act, any teacher with available and applicable IEP progress data must have that data attributed (general education and special education teachers). This template allows you to address the provisions of Act 82 by filling out a one-page summary of the aggregated caseload data for students.

LDR – Locally Developed Rubrics
• Defined in the rules and regulations as an LDR designed by the LEA or drawn from the elective data, which includes the following:
  – District designed measures and assessments
  – Nationally recognized standardized tests
  – Industry certification examinations
  – Student projects pursuant to local requirements
  – Student portfolios pursuant to local requirements
• The SLO process must be used to develop your LDR.

Scenarios
• Scenario One – If a teacher has PVAAS data available, at least 10% must come from PVAAS, and up to 5% can come from a combination of IEP progress, student achievement (Proficient or Advanced), or LDR. (Teachers make this decision.)
• Example 1:
  – PVAAS 10%
  – IEP Goals 1%
  – Student Achievement 3%
  – LDR 1%
• Example 2:
  – PVAAS 12%
  – Student Achievement 2%
  – LDR 1%

Scenarios without PVAAS
• If a teacher does not have PVAAS or any of the other data (IEP progress, or number Advanced or Proficient on a state assessment) available, then the full 15% comes from LDR.
  – In the past we said that teachers without PVAAS would have 35% from SLOs; that is really still the same.
• Example 1: This could mean 1 SLO for all 35% (15% Teacher Specific Data and 20% Elective Data). Each part of the SLO is weighted based on the section to which the data is applied, and the SLO specifically states which section of the pie chart the data falls under (e.g., IEP targeted group).
• Example 2: This could mean 2 SLOs: one SLO for the 15% under Teacher Specific Data and a different SLO for the 20% under Elective Data.
• Example 3: This could also mean 3 or 4 SLOs, as long as the total is 15% under Teacher Specific Data and 20% under Elective Data.
• Teachers decide how many SLOs they want to write to achieve their Teacher Specific Data score. (A quick consistency check on these weight combinations is sketched below.)
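As a rough illustration only, and not an official PDE tool, the following sketch checks a proposed Teacher Specific Data split (in percentage points of the overall rating) against the limits listed above: PVAAS at least 10% when available, achievement and IEP progress capped at 5% each, LDR capped at 5% when PVAAS is available and 15% otherwise, with everything summing to 15%. The function name and interface are invented for illustration.

```python
# Minimal sketch, not an official PDE tool: check a proposed Teacher Specific
# Data split (percentage points out of the 15% slice) against the Act 82
# limits described in the scenarios above.

def check_teacher_specific_split(pvaas=0.0, achievement=0.0, iep=0.0, ldr=0.0,
                                 pvaas_available=True):
    total = pvaas + achievement + iep + ldr
    problems = []
    if abs(total - 15.0) > 1e-9:
        problems.append(f"components sum to {total}%, not 15%")
    if pvaas_available:
        if pvaas < 10.0:
            problems.append("PVAAS must be at least 10% when PVAAS data is available")
        if ldr > 5.0:
            problems.append("LDR may not exceed 5% when PVAAS data is available")
    elif ldr > 15.0:
        problems.append("LDR may not exceed 15%")
    if achievement > 5.0:
        problems.append("state assessment achievement may not exceed 5%")
    if iep > 5.0:
        problems.append("IEP goal progress may not exceed 5%")
    return problems or ["OK"]

# Scenario One, Example 1 from the slide above:
print(check_teacher_specific_split(pvaas=10, achievement=3, iep=1, ldr=1))  # ['OK']
# A teacher with no PVAAS or other applicable data: all 15% from LDR.
print(check_teacher_specific_split(ldr=15, pvaas_available=False))          # ['OK']
```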
How do I arrive at a score for the 15% Teacher Specific Data?
• Districts will decide locally how to arrive at this final 0-3 score for Teacher Specific Data, based on the conversion tables in the rules and regulations for PVAAS and achievement data, in addition to their own local decisions around LDR and IEP goal conversions.
• A Teacher Decision Making Guide prompts LEAs to address all data defined as teacher specific data per Act 82 and provides logical guidance on how to address each component.

SLO Policy Decisions: The "n" Count
• LEAs are encouraged to use an "n" count of 11 across teacher specific and elective data. This is consistent with the "n" count PDE uses for other data sources such as the SPP and PVAAS.
• In the absence of teacher specific and elective data, the observation and practice components of the evaluation system could be substituted.
• It is a local decision whether an LEA chooses to use a lower "n" count for teacher specific and elective data. Hence, an LEA could choose to develop an SLO for fewer than eleven students if it believes it can attribute student achievement to the teacher.
• An LEA should discuss any decision to use the "n" count of 11, or a lower "n" count, with its solicitor.

Non-Teaching Professional Employee Effectiveness System in Act 82 of 2012
• Observation and Practice – 80%
  – Planning and Preparation
  – Educational Environment
  – Delivery of Service
  – Professional Development
• Student Performance/School Performance Profile (SPP) – 20%

Non-Teaching Professional Employee
• 3 groups of professionals are included:
  – Instructionally Certified
  – Educational Specialist
  – Non-Teaching Professional Supervisors
• Use Rating Tool 82-3
• Rubrics and guiding questions are posted on the PDE website: http://www.portal.state.pa.us/portal/server.pt/community/educator_effectiveness_project/20903/p/1173848

Non-Teaching Professional Employee – Group 1: Instructionally Certified
• Is the employee working under an instructional certification?
• Does the employee provide direct instruction? – Plan, Instruct, and Assess
• Crosswalk with the Danielson Framework
• All Domains 25%
• Example: Full-time instructional coach
Non-Teaching Professional Employee – Group 2: Educational Specialist
• Dental Hygienist
• Elementary and Secondary School Counselor
• Home and School Visitor
• Instructional Technology Specialist
• School Nurse
• School Psychologist

Non-Teaching Professional Employee – Group 3: Non-Teaching Professional Supervisors
• Crosswalk with the FFL (Framework for Leadership)
• Supervisor of Curriculum and Instruction
• Supervisor of Pupil Services
• Supervisor of Special Education
• Supervisor of Single Area (subject)
• Supervisor of Vocational Education

Principal Effectiveness System in Act 82 of 2012
• Observation/Practice – 50%: Framework for Leadership Domains
  – Strategic/Cultural Leadership
  – Systems Leadership
  – Leadership for Learning
  – Professional and Community Leadership
• Building Level Data/School Performance Profile – 15%
  – Indicators of Academic Achievement
  – Indicators of Closing the Achievement Gap, All Students
  – Indicators of Closing the Achievement Gap, Historically Underperforming Students
  – Academic Growth/PVAAS
  – Other Academic Indicators
  – Extra Credit for Advanced Achievement
• Correlation Data Based on Teacher-Level Measures – 15%
• Elective Data/Student Learning Objectives – 20%
  – District Designed Measures and Examinations
  – Nationally Recognized Standardized Tests
  – Industry Certification Examinations
  – Student Projects Pursuant to Local Requirements
  – Student Portfolios Pursuant to Local Requirements

Principal Effectiveness
• Applies to:
  – Principal
  – Assistant Principal
  – Vice Principal
  – Director of CTC
• Use Rating Tool 82-2

Policy Decisions to Consider
• Has your district categorized all staff under contract as either teaching or non-teaching employees?
• Has your district discussed the "n" count and SLOs? How will this look in your district?
• Has your district discussed how teacher specific data will be calculated for each teacher in your district?
• Has your district discussed how the SLO process will be used to measure IEP progress?

Questions?

Contact Information for Educator Effectiveness
Intermediate Unit Contacts
• Jenny Lent – jenny.lent@iu1.org – 724-938-3241 ext. 268
• JoBeth McKee – jobeth.mckee@iu1.org – 724-938-3241 ext. 267
PDE Contacts
• PDE email for questions related to Educator Effectiveness: ra-edeff@pa.gov