Guidelines for the Candidate Assessment of Performance
Assessment of Teacher Candidates
June 2015
Massachusetts Department of Elementary and Secondary Education
75 Pleasant Street, Malden, MA 02148-4906
Phone 781-338-3000 TTY: N.E.T. Relay 800-439-2370
www.doe.mass.edu
Table of Contents
PURPOSE ............................................................................................................................................................... 3
CONTEXT................................................................................................................................................................ 3
BACKGROUND ...............................................................................................................................................................3
DEVELOPING CAP ..........................................................................................................................................................4
OVERVIEW OF CAP ................................................................................................................................................. 5
PREPARATION CONTEXT CONSIDERED IN CAP .....................................................................................................................5
SHIFTS IN ASSESSMENT: FROM PPA TO CAP.......................................................................................................................6
CANDIDATE ASSESSMENT OF PERFORMANCE (CAP) .............................................................................................. 7
CAP CONTENT ........................................................................................................................................................ 7
CAP’S SIX ESSENTIAL ELEMENTS .......................................................................................................................................8
CAP RUBRIC OVERVIEW ...............................................................................................................................................10
Quality, Scope & Consistency .............................................................................................................................11
Readiness Thresholds .........................................................................................................................................12
CAP PROCESS ....................................................................................................................................................... 12
CAP 5-STEP CYCLE OVERVIEW .......................................................................................................................................12
CAP 5-STEP CYCLE: STEP-BY-STEP ..................................................................................................................................14
CAP 5-STEP CYCLE: SCHEDULING ...................................................................................................................................17
CATEGORIES OF EVIDENCE .............................................................................................................................................18
Observations ......................................................................................................................................................18
Measure of Student Learning .............................................................................................................................18
Student Feedback ...............................................................................................................................................18
Candidate Artifacts ............................................................................................................................................19
EVIDENCE REQUIREMENTS FOR EACH ESSENTIAL ELEMENT...................................................................................................19
THE SUMMATIVE ASSESSMENT .......................................................................................................................................20
The Role of Professional Judgment ....................................................................................................................20
ROLES & RESPONSIBILITIES ............................................................................................................................................21
Sponsoring Organizations: .................................................................................................................................21
PK-12 Schools and District Partners ...................................................................................................................22
Program Supervisor ............................................................................................................................................22
Supervising Practitioner .....................................................................................................................................23
Candidate ...........................................................................................................................................................23
APPENDICES......................................................................................................................................................... 24
Audio Deep Dive
Throughout the CAP Guidelines, be on the lookout for the Audio Deep Dive icon. Click to hear an ESE team member provide more detailed information about the topic.
Candidate Assessment of Performance
2
Purpose
The Massachusetts Candidate Assessment of Performance (CAP) is designed to assess the overall
readiness of teacher candidates. By demonstrating readiness through CAP, Massachusetts will
be able to ensure that teacher candidates enter classrooms prepared to be impactful with
students on day one. CAP is the culminating assessment required for program completion in the
Commonwealth and in this way creates an intentional bridge from training to practice by aligning
expectations with the Massachusetts Educator Evaluation Framework. Through CAP, Sponsoring
Organizations are able to ensure that teacher candidates have the skills and knowledge
necessary to be effective teachers in Massachusetts.
The goals of CAP are:
• To ensure teacher candidates are ready to make an impact with students on day one;
• To measure teacher candidates’ practice on key indicators as outlined in the Guidelines for the Professional Standards for Teachers (PSTs); and
• To support teachers in improving their practice based on targeted feedback and performance evaluations.
In support of this, these Guidelines are designed to: 1) outline expectations for the Candidate
Assessment of Performance (CAP) and 2) support Sponsoring Organizations (SO), Program
Supervisors, Supervising Practitioners, and candidates as they work together to ensure
successful demonstration of the Professional Standards for Teachers (PSTs)1.
Context
Background
Over the last five years, Massachusetts has engaged in a significant effort to increase the rigor
and quality of educator preparation in the Commonwealth. Most significantly, in June 2012 the
Massachusetts Board of Elementary and Secondary Education (BESE) approved changes to the
Regulations for Educator Licensure and Preparation Program Approval (603 CMR 7.03) which
raised expectations for preparation across the board through a variety of strategies, including:
• Increasing practicum hour requirements
• Requiring Supervising Practitioners to be rated proficient or higher on their most recent summative evaluation in order to serve in the role
• Updating the Program Approval Standards to emphasize a provider’s ability to meet the needs of schools/districts and engage in data-driven continuous improvement
• Collecting and publicly reporting additional outcome measures associated with the quality of preparation
1 These guidelines apply to all initial licensure teacher candidates except the following specialist licenses: Academically Advanced; Reading; and Speech, Language, and Hearing Disorders. It is the expectation of ESE that programs endorsing in license areas not covered by CAP develop and implement a performance assessment appropriate for the license.
As an extension of these regulatory changes, in January of 2014 BESE passed updated
Professional Standards for Teachers (PSTs) which align expectations for teacher preparation
candidates with those for in-service teachers as outlined in the Massachusetts Educator
Evaluation Framework. The PSTs define the pedagogical and other professional knowledge and
skills required of all teachers. These PSTs and indicators are used by Sponsoring Organizations
in designing their teacher preparation programs and in preparing their teacher candidates.
Developing CAP
Following the adoption of the 2014 PSTs, ESE convened a Teacher Preparation Performance
Assessment Task Force made up of stakeholders working in PK-12 schools and preparation
programs across the state to provide a recommendation on the adoption or development of a
new assessment instrument. Both ESE and the Task Force recognized that the Pre-Service Performance Assessment (PPA) that had been in use since 2003 needed to be replaced: the Commonwealth needed a more robust instrument to assess candidates’ performance, and needed one that was consistent with current evaluation practices in the state. The Task Force investigated all commercially available products (e.g., edTPA), reviewed all available research, discussed best practices, and ultimately crafted a set of considerations that led ESE to the decision to build a state-specific instrument. Members of
this Task Force continued to consult on the development of CAP during the 2014-2015
academic year through an advisory committee. Below is a timeline of activities associated with
the development of CAP.
February 2014: Teacher Preparation Assessment Task Force convened and met six times between February and May.
June 2014: Task Force recommendations were presented to the Commissioner.
September 2014: ESE issued a contract to bring on a vendor to support the development of the new assessment instrument.
December 2014: Development work began.
December 2014 – February 2015: ESE, together with the vendor, conducted interviews and focus groups with external stakeholders.
February 2015: Advisory Committee convened. The Committee included members of the original Task Force as well as several additional stakeholders. The Advisory Committee met three times between January and June.
June 2015: CAP Guidelines released.
Adopting PSTs that are the same as the Standards of Effective Teaching Practice (those used in
the MA Educator Evaluation Framework) was a first step in aligning preparation with the
expectations for teachers upon employment; using CAP to assess candidate readiness solidifies
the notion of a career continuum that systematically supports educators in continuously
improving their practice.
Overview of CAP
CAP has been built to mirror the experience of educators engaged in the MA Educator
Evaluation Framework. Components of the evaluation experience have been modified so that
they are appropriate for the context of preparation and focused on essential elements of
practice for novice teachers. Aspects of the MA Educator Evaluation Framework that are
evident in CAP include:
• A 5-step cycle that includes self-assessment, goal setting, plan implementation, formative assessment, and a summative evaluation
• The use of Elements and performance descriptors from the Model Teacher Rubric
• Assessing performance using multiple measures, including:
  o Evidence of growth in student learning
  o Feedback from students
  o Announced and unannounced observations
This intentional alignment exemplifies the cohesion Massachusetts is building across the
educator career continuum through our complementary educator effectiveness policies. For a
detailed explanation of the parallels between CAP and the MA Educator Evaluation Framework
see Appendix H.
Preparation Context Considered in CAP
It was crucial in building CAP that the system of assessment be appropriately tailored to the context in which the evaluation of the candidate will occur: in this case, a practicum or practicum equivalent. There are several considerations that ESE accounted for in modifying the Educator Evaluation Framework for use in preparation:
• Time: In most cases, teachers engage in the 5-step cycle over the course of an entire school year or two; in preparation, CAP will need to be executed in as few as 300 hours. Practicums/practicum equivalents range significantly in the number of hours, days, and weeks included in the experience. Some candidates participate in a year-long residency while others spend as few as eight weeks in a practicum. As is the case with the practicum experience itself, CAP is intended to be intensive. The level of focused and concentrated examination of practice required by CAP is necessary for an effective assessment of candidate readiness.

Audio Deep Dive: CAP in Split Practicums
• Ownership/Responsibility: Unless already employed as a teacher-of-record,2 candidates will be assessed on their skills while working in classrooms that are not their own. It can be challenging in these situations to determine the readiness of a candidate independent from the context in which they are completing the practicum. That being said, it is the expectation that candidates be provided opportunities to demonstrate their own skills and abilities within this context. This will require concerted effort from the Supervising Practitioner and the Program Supervisor to coordinate authentic experiences for the candidate during engagement in CAP.

2 Candidates that are employed as teachers-of-record are still required to undergo the CAP assessment for program completion. Alignment between the Educator Evaluation Framework and CAP may be leveraged by the candidate and the Sponsoring Organization to reduce duplication of efforts. Ratings on the Educator Evaluation rubric provided by a school/district evaluator do not replace or substitute for CAP ratings. Ratings may differ; proficiency on one does not necessitate proficiency on the other. In these cases, ESE encourages Sponsoring Organizations and schools/districts to communicate around expectations for performance.
• Role of Evaluator: Many evaluations under the Educator Evaluation Framework are conducted and concluded by a single evaluator, whereas ratings provided in CAP are the calibrated and summative judgments of both a Supervising Practitioner and a Program Supervisor.
• Developmental Progression of Practice: ESE acknowledges that teaching is a profession in which individuals will grow in their expertise and skill; indeed, our collective educator effectiveness policies are dedicated to supporting continuous improvement through the career continuum. We are clear, however, that in order to make an impact with students, teachers must meet a standard for readiness on the first day in the classroom. To that end, with the help of advisory groups and external experts, CAP focuses on assessing elements that are deemed essential to novice teachers being immediately impactful in Massachusetts classrooms. While districts or individual schools may choose to emphasize different elements within their evaluation processes, in the case of CAP, ESE has prescribed the key skills on which readiness will be based.
Shifts in Assessment: From PPA to CAP
With the 2003 Pre-Service Performance Assessment (PPA), Massachusetts was one of the first
states in the nation to implement a standard assessment of candidate performance and require
it for program completion across all Sponsoring Organizations. This history is important as Massachusetts embarks on the implementation of CAP, which is similarly a first-of-its-kind endeavor for a state. While the practice of assessment in preparation may not be new for Massachusetts Sponsoring Organizations, there are significant differences between the approach to measuring performance in CAP and what occurred under the PPA. Namely, CAP:
• Includes a rubric with descriptors for various levels of performance
• Focuses assessment on six essential elements
• Uses multiple measures, which have been outlined by ESE, to determine performance
• Embeds expectations regarding subject matter (previously gauged through license-specific questions) within the application of the PSTs; CAP does not include content-based questions
• Places the responsibility for collecting the evidence used in judgments on Program Supervisors and Supervising Practitioners
• Emphasizes targeted feedback and creates structures (e.g., observation protocol forms) to document improvement in practice
• Produces ratings that have a direct correlation to performance once employed (as measured by Educator Evaluation ratings), serving as a source of validation and a check and balance on the system
• Generates data that will be collected by ESE and considered, along with other outcome measures, in program approval
These changes continue to emphasize ESE’s desire to support Sponsoring Organizations in
strengthening the field-based experiences of teacher preparation candidates and to support
their continuous improvement.
Candidate Assessment of Performance (CAP)
The Candidate Assessment of Performance assesses a candidate’s readiness to positively
impact students’ learning. There are two facets of the assessment system: the content and
the process. Evidence is collected throughout the 5-Step Cycle (process) in service of
measuring whether a candidate has demonstrated skills at a certain level (content). In each
section that follows we describe the details of how the content and process individually and
collectively contribute to measuring candidate readiness.
CAP Content:
• Six Essential Elements
• Rubric Overview
  o Quality, Scope, Consistency
  o Readiness Thresholds

CAP Process:
• 5-Step Cycle Overview
• 5-Step Cycle: Step-by-Step
• Categories of Evidence
• The Summative Assessment
• Roles & Responsibilities
CAP Content
CAP has been built to measure performance relative to the Professional Standards for Teachers.
While these are the same as the Standards of Effective Teaching Practice used in the Educator
Evaluation Framework, the construct of the assessment needed to be altered based on the
context of preparation. To that end, the first set of decisions considered in the development of
CAP was to determine exactly what the assessment needed to measure. The sections that
follow detail the decisions made to include or exclude content in a way that maintained the
core language and expectations of practice articulated in the Educator Evaluation Framework
while adapting it for use in CAP.
CAP’s Six Essential Elements
CAP assesses candidate performance on six elements. In order to understand the decision to
narrow the focus of the assessment to these six elements, you must first understand the
structure of the MA Model Teacher Rubric in relation to the PSTs:
• Standards: Standards are the broad categories of knowledge, skills, and performance of effective practice detailed in the Regulations. There are four Professional Standards for Teachers:
  1) Curriculum, Planning, and Assessment
  2) Teaching All Students
  3) Family and Community Engagement
  4) Professional Culture

Audio Deep Dive: Professional Standards for Teachers

• Indicators: Indicators are detailed in the PST Guidelines and describe specific knowledge, skills, and performance for each Standard. For each Indicator, there is a corresponding level of practice expectation (introduction, practice, demonstrate). As a result, the expectations for individual Indicators are differentiated.
• Elements: The elements are more specific descriptions of actions and behaviors. The elements further break down the Indicators into more specific aspects of educator practice.

Overall, there are 23 PST Indicators, 13 of which were designated at the “demonstrate” level of practice. The graphic below shows how these Indicators were narrowed into the essential elements.
In order to reduce the scope of the assessment, ESE worked with external consultants, internal
experts, and the advisory committee to select elements that were considered essential.
Elements were deemed essential if:
• The absence of a teacher’s competency in the skill was likely to put students at risk
• The element could serve as an umbrella for skills outlined in other elements; in most cases, other elements were prerequisite skills to those outlined in the essential element. See Appendix F for the crosswalk of coverage.
ESE is committed to monitoring the efficacy of the choices made in selecting the essential
elements and will revise/update CAP as appropriate in the coming years.
The following were selected as Essential Elements:
Standard 1: Curriculum, Planning and Assessment

• 1.A.4: Well-Structured Lessons
  Proficient Descriptor*: Develops well-structured lessons with challenging, measurable objectives and appropriate student engagement strategies, pacing, sequence, activities, materials, resources, technologies, and grouping.
• 1.B.2: Adjustment to Practice
  Proficient Descriptor*: Organizes and analyzes results from a variety of assessments to determine progress toward intended outcomes and uses these findings to adjust practice and identify and/or implement appropriate differentiated interventions and enhancements for students.

Standard 2: Teaching All Students

• 2.A.3: Meeting Diverse Needs
  Proficient Descriptor*: Uses appropriate practices, including tiered instruction and scaffolds, to accommodate differences in learning styles, needs, interests, and levels of readiness, including those of students with disabilities and English language learners.
• 2.B.1: Safe Learning Environment
  Proficient Descriptor*: Uses rituals, routines, and appropriate responses that create and maintain a safe physical and intellectual environment where students take academic risks and most behaviors that interfere with learning are prevented.
• 2.D.2: High Expectations
  Proficient Descriptor*: Effectively models and reinforces ways that students can master challenging material through effective effort, rather than having to depend on innate ability.

Standard 4: Professional Culture

• 4.A.1: Reflective Practice
  Proficient Descriptor*: Regularly reflects on the effectiveness of lessons, units, and interactions with students, both individually and with colleagues, and uses insights gained to improve practice and student learning.

*The Proficient Descriptor is included here to provide a sense of the expectation outlined in the element. Specifics about the expectations of demonstrated competency for preparation candidates are outlined further in the Rubric Overview section below.
Overall, we believe that the combined performance on these elements will be representative of
candidates’ readiness to be impactful on day one. Again, ESE plans to collect data to assess the
extent to which these elements are predictive of performance once employed.
While ESE has identified essential elements for purposes of CAP, Sponsoring Organizations and
candidates should keep the following in mind:
• CAP is a program completion requirement, not a licensure requirement. It is embedded as a program requirement because there are other indicators and expectations of readiness, both outlined by ESE and at an individual Sponsoring Organization’s discretion.
• Sponsoring Organizations may choose to include additional elements and have the authority as an approved Sponsoring Organization to consider other factors in determinations about readiness and, ultimately, endorsement for licensure.
• It is important to note that the levels of practice established in the PST Guidelines do not suggest a hierarchy or priority of skills. Rather, candidates should experience thoughtful exposure to each Indicator and have appropriate opportunities to practice or demonstrate the Indicators at the intended level.
A key purpose of CAP is to provide teacher candidates with targeted feedback to inform their
growth and ensure that candidates meet a threshold of performance expectations. Targeted,
detailed feedback requires fine-grained descriptions of educator practice. The detailed
descriptors of each element allow teacher candidates and assessors to prioritize specific areas
for evidence-gathering, feedback, and assessment. The result is a more transparent and
manageable process.
CAP Rubric Overview
As is the case with the MA Model Teacher Rubric, the CAP Rubric is designed to help candidates
and assessors:
1) develop a consistent, shared understanding of what performance looks like in practice,
2) develop a common terminology and structure to organize evidence, and
3) make informed professional judgments about performance ratings.
In support of these goals, the CAP Rubric serves as the content anchor throughout the process
and as a result is used in each step of the 5-Step Cycle.
The CAP Rubric has several features that, relative to the MA Model Teacher Rubric, are unique. It is important to note that while the CAP Rubric looks different from the MA Model Rubric for Teachers, it varies in form only. The CAP Rubric uses the exact language of the performance descriptors outlined in the MA Model Teacher Rubric; this helps maintain alignment and consistency. The major difference between the two rubrics is that the CAP Rubric unpacks the performance descriptors and sets varying thresholds for performance within an element. Candidate performance in an element is assessed across three dimensions: Quality, Scope, and Consistency.
Candidate Assessment of Performance
10
Below is a graphic illustrating the main features of the CAP Rubric.
Quality, Scope & Consistency
The performance descriptors presented in the MA Model Teacher Rubric are constructed in relation to these three dimensions: Quality, Scope, and Consistency. The extent to which a teacher’s performance varies within an element (from Unsatisfactory to Exemplary) is directly related to the Quality, Scope, and Consistency demonstrated relative to that skill or behavior. Therefore, in pulling out these aspects of performance we are deconstructing the Rubric, not changing it. By considering performance across these dimensions we are also able to further differentiate appropriate expectations for novice teachers. Assessors and candidates should consider the following explanations in rating performance in each dimension:
• Quality: ability to perform the skill, action, or behavior as described in the proficient performance descriptor
• Consistency: the frequency (e.g., all the time, sometimes, once) with which the skill, action, or behavior is demonstrated with quality
• Scope: the scale of impact (e.g., one student, a subset of children, all students) with which the skill, action, or behavior is demonstrated with quality
Assessors rate a teacher candidate on the Quality, Scope, and Consistency of their practice on the six essential elements.

Audio Deep Dive: Quality, Scope & Consistency
The rubric sets an expectation for teacher candidates to meet a minimum threshold in
exhibiting Quality, Scope and Consistency on a particular element. A teacher candidate must
meet the minimum threshold established for each element of the CAP.
Readiness Thresholds
ESE expects candidates to be ready to make a positive impact on day one, which does not
necessarily equate to full proficiency. ESE has established minimum thresholds in each
dimension of readiness. These are highlighted and noted with a star in the rubric.
Candidates are expected to demonstrate proficiency for all elements in the dimension of Quality. As you can see from both the bold line in the CAP Rubric itself and the dimension explanations above, achieving the threshold in Quality is a precursor to being rated for Consistency and Scope. In this way, the Quality rating serves as a gatekeeper. Throughout the Rubric, ESE has established consistent thresholds for all elements:

Quality → Proficient
Consistency → Needs Improvement
Scope → Needs Improvement

These minimum thresholds may change as additional research on CAP is completed. Sponsoring Organizations may establish higher thresholds if they choose. Candidates must meet all readiness thresholds.
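For readers who find it helpful, the threshold rules above amount to a simple decision procedure. The sketch below is a hypothetical illustration only, not an ESE tool: CAP ratings are made by assessors using professional judgment, and the function and variable names here are invented for this example.

```python
# Illustrative sketch (not an official ESE tool) of the readiness-threshold
# logic described above. Ratings are ordered from lowest to highest.
LEVELS = ["Unsatisfactory", "Needs Improvement", "Proficient", "Exemplary"]

# Minimum rating per dimension, per the CAP Rubric thresholds above.
THRESHOLDS = {
    "Quality": "Proficient",
    "Consistency": "Needs Improvement",
    "Scope": "Needs Improvement",
}

def meets_threshold(dimension: str, rating: str) -> bool:
    """True if the rating is at or above the minimum for that dimension."""
    return LEVELS.index(rating) >= LEVELS.index(THRESHOLDS[dimension])

def element_is_ready(quality: str, consistency: str, scope: str) -> bool:
    # Quality acts as a gatekeeper: if it is below Proficient, the
    # candidate is not rated for Consistency and Scope on this element.
    if not meets_threshold("Quality", quality):
        return False
    return (meets_threshold("Consistency", consistency)
            and meets_threshold("Scope", scope))

def candidate_is_ready(elements: dict) -> bool:
    """A candidate must meet the thresholds on every essential element."""
    return all(element_is_ready(*dims) for dims in elements.values())
```

For example, a candidate rated Proficient in Quality and Needs Improvement in both Consistency and Scope on an element meets that element's thresholds, while any Quality rating below Proficient does not.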
CAP Process
Like the translation of content from the Educator Evaluation Framework to CAP, the CAP process parallels the Framework’s process. There are, however, key differences. The Educator Evaluation Framework focuses on driving the professional growth and continuous improvement of an educator, whereas in CAP the primary purpose is to assess baseline readiness for entry into the profession. This difference in purpose necessitated several structural adjustments to the Educator Evaluation Framework for application in CAP.
CAP 5-Step Cycle Overview
In the MA Educator Evaluation Framework, the goal of the 5-Step Cycle is to provide educators
with a continuous opportunity for professional growth and development through self-directed
analysis and reflection, planning, action steps, and collaboration. While these goals remain
present in CAP, they are secondary to the primary goal of assessing candidate readiness.
To this end, the 5-Step Cycle used in CAP has been modified to meet the needs of candidates,
Program Supervisors, and Supervising Practitioners, but retains the same core architecture of
the cycle included in the Educator Evaluation Framework:
Step 1: Self-Assessment
Step 2: Goal-Setting and Plan Development
Step 3: Plan Implementation
Step 4: Formative Assessment
Step 5: Summative Evaluation
The following graphic represents the CAP 5-Step Cycle, with pre- and post-cycle events included.
CAP 5-Step Cycle: Step-by-Step
The descriptions that follow outline the CAP 5-Step Cycle step-by-step. In each step we have provided
details of the events that are expected to occur as well as an anticipated timeframe. For each item we
have provided a link to the required form and/or a supporting resource.
Pre-Cycle

Audio Deep Dive: Pre-Cycle

WHAT:
• Announced Obs. #1 Pre-Conference: The teacher candidate and the Program Supervisor meet in a Pre-Conference for Announced Observation #1. Resource: Model Observation Protocol: Pre-Conference Planning Form (Appendix D)
• Announced Obs. #1: The Program Supervisor and the Supervising Practitioner conduct the first observation. Required: Observation Forms (Appendix D)
• Establish Student Impact Rating Scale: The Supervising Practitioner, with support from the Program Supervisor as needed, determines which assessment(s) will be used for measuring teacher candidate impact on student learning. Resource: Measuring Candidate Impact on Student Learning (Appendix G)
• Announced Obs. #1 Calibration: The Program Supervisor and the Supervising Practitioner discuss the evidence collected during the observation and calibrate on the feedback to be provided to the teacher candidate, including initial ratings on the CAP Rubric. Required: Observation Forms (Appendix D); Resource: Model Observation Protocol: After the Observation (Appendix D)

WHEN: Within the first three weeks of the beginning of the practicum.
Step 1: Self-Assessment
WHAT:
• Complete Self-Assessment: The teacher candidate completes a Self-Assessment. Candidates should complete the Self-Assessment based on prior experiences in pre-practicum and coursework to assess their current skill set. Required: Self-Assessment Summary Form (Appendix B)
• Draft Preliminary Goals: The teacher candidate uses information from the Self-Assessment to draft a preliminary professional practice goal. Required: Preliminary Goal-Setting & Plan Development Form (Appendix B)
• Share Self-Assessment & Preliminary Goals: The teacher candidate shares the Self-Assessment form and the draft goals with the Supervising Practitioner and the Program Supervisor prior to the first Three-Way Meeting.

WHEN: Within the first three weeks of the beginning of the practicum, after Announced Observation #1 and before the post-conference and first Three-Way Meeting.
Step 2: Goal Setting and Plan Development

WHAT
 First Three-Way Meeting: During the first Three-Way Meeting the Program Supervisor and the Supervising Practitioner:
   Conduct a post-conference for Announced Observation #1. Required: Observation Forms (Appendix D); Resource: Model Observation Protocol: Post-Conference Planning Form (Appendix D)
   Share baseline ratings on the CAP Rubric and discuss them in light of the candidate's Self-Assessment. Resource: CAP Rubric and Form (Appendix A)
   Work with the candidate to finalize the goals and outline a plan for implementing those goals. Resource: Finalized Goal(s) & Implementation Plan Form (Appendix C)
Required: CAP Form and Rubric, Sections 1 & 3 (Appendix A); Resource: Three-Way Meeting Checklist (Appendix E)
WHEN
Within 2-5 days after Announced Observation #1.
Step 3: Plan Implementation

WHAT
 Unannounced Obs. #1: The Supervising Practitioner conducts Unannounced Observation #1. Required: Observation Forms (Appendix D)
 Unannounced Obs. #1 Post-Conference: The Supervising Practitioner and the teacher candidate meet for a post-conference after Unannounced Observation #1. Required: Observation Forms (Appendix D); Resource: Model Observation Protocol: Post-Conference Planning Form (Appendix D)
 Collect Student Feedback: The teacher candidate administers student feedback surveys, using the Model Student Feedback Surveys.
 Announced Obs. #2 Pre-Conference: The teacher candidate and the Program Supervisor hold a pre-conference for Announced Observation #2. Resource: Model Observation Protocol: Pre-Conference Planning Form (Appendix D)
 Announced Obs. #2: The Program Supervisor conducts Announced Observation #2. Required: Observation Forms (Appendix D)
 Announced Obs. #2 Post-Conference: Conduct a post-conference for Announced Observation #2. Required: Observation Forms (Appendix D); Resource: Model Observation Protocol: Post-Conference Planning Form (Appendix D)
WHEN
Unannounced Observation #1: About one-third of the way through the practicum, at least one week after the first Three-Way Meeting.
Administer Student Feedback & Announced Observation #2: Halfway through the practicum.
More Observations as Needed
Step 4: Formative Assessment

WHAT
 Formative Assessment Calibration: The Program Supervisor and the Supervising Practitioner discuss the evidence collected thus far (including observations and results from student feedback) and calibrate on the feedback to be provided to the teacher candidate, including formative ratings on the CAP Rubric. Required: CAP Form & Rubric (Appendix A)
 2nd Three-Way Meeting: During the second Three-Way Meeting the Program Supervisor and the Supervising Practitioner:
   Share formative ratings on the CAP Rubric and discuss. Required: CAP Form & Rubric (Appendix A)
   Revisit the candidate's goal(s) and plan; adjust accordingly (including potentially modifying the goal, increasing supports, adding observations, etc.). Resource: Finalized Goal(s) & Implementation Plan Form (Appendix C)
Required: CAP Form and Rubric, Sections 2 & 3 (Appendix A); Resource: Three-Way Meeting Checklist (Appendix E)
WHEN
Halfway through the practicum, after Announced Observation #2.
More Observations as Needed
Step 5: Summative Assessment

WHAT
 Unannounced Obs. #2: The Supervising Practitioner conducts Unannounced Observation #2. Required: Observation Forms (Appendix D)
 Unannounced Obs. #2 Post-Conference: The Supervising Practitioner and the teacher candidate meet for a post-conference after Unannounced Observation #2. The Program Supervisor's participation in this observation and post-conference is optional. Required: Observation Forms (Appendix D); Resource: Model Observation Protocol: Post-Conference Planning Form (Appendix D)
 Summative Assessment Calibration: The Program Supervisor and the Supervising Practitioner discuss the entire body of evidence collected (including observations, results from student feedback, candidate artifacts, and measures of student impact) and calibrate on the assessment to be provided to the teacher candidate, including summative ratings on the CAP Rubric. Required: CAP Form & Rubric (Appendix A)
 Final Three-Way Meeting: During the final Three-Way Meeting the Program Supervisor and the Supervising Practitioner:
   Share summative ratings on the CAP Rubric and discuss. Required: CAP Form & Rubric (Appendix A)
   Complete the CAP Form with the candidate. Required: CAP Form & Rubric (Appendix A)
Resource: Three-Way Meeting Checklist (Appendix E)
WHEN
Unannounced Observation #2: About two-thirds of the way through the practicum, at least one week after the 2nd Three-Way Meeting.
Final Three-Way Meeting: Within the final two weeks of the practicum.
Post-Cycle

WHAT
 Develop Professional Practice Goal: The teacher candidate will establish a preliminary professional practice goal based on the results of the summative assessment, which can be used to support the transition into his/her first year of employment. Resource: Preliminary Goal-Setting & Plan Development Form & MA Model Teacher Rubric
WHEN
After the Summative Assessment.
CAP 5-Step Cycle: Scheduling
ESE recognizes that the length of practicum experiences can vary widely across the educator preparation programs in which teacher candidates are enrolled. Because of these variations, a prescriptive timeline for completing CAP cannot be strictly identified. As the timeframe estimates above indicate, each step is presented in relation to the length of the practicum. SOs should modify this schedule accordingly. When planning and scheduling, it is recommended that:
• each of the four observations (two announced, two unannounced) be equally spaced apart, and
• post-conferences happen within two to five days following observations to ensure timely,
useful feedback to candidates.
Categories of Evidence
The assessment of candidate readiness through CAP is made using multiple measures. There are four
major categories of evidence required in CAP: observations, impact on student learning, student
feedback, and candidate artifacts. In addition to these required categories of evidence, SOs may
identify other sources of evidence or more narrowly specify the evidence required in each category.
Observations
Teacher candidates are observed in at least two announced and two unannounced observations
during their practicums. Program Supervisors and Supervising Practitioners actively collect evidence
during the observation and then synthesize the key evidence to provide focused feedback to
candidates. Each observation is documented using the Observation Forms available in Appendix D. It
is important to note that observations are part of an assessment. To this end, assessors should take care not to unduly influence or interfere with the candidate's performance.
Full guidance and support for collecting evidence through observations can be found in Observation
Forms and Model Protocol (Appendix D).
Measure of Student Learning
Working together with the Program Supervisor and Supervising Practitioner, candidates will be
expected to develop a student learning goal. Assessment(s) for measuring a teacher candidate’s impact
on student learning will be finalized during the first Three-Way Meeting.
The Supervising Practitioner should identify at least one measure of student learning, growth, or
achievement that assesses a meaningful sample of the content the teacher candidate is primarily
responsible for teaching. The Supervising Practitioner will set clear expectations for how and when the
measure will be administered and scored. In addition, the Supervising Practitioner will rely on his/her
professional experience with the identified measure(s) and understanding of the specific learning
context, to set parameters for a range of expected learning, growth, or achievement (see ESE's Implementation Brief on Scoring and Parameter Setting for more information about this process).
Audio Deep Dive: Measure of Student Learning
Full guidance and support for collecting evidence of candidate impact on student learning can be found
in Measuring Candidate Impact on Student Learning (Appendix G).
Student Feedback
Feedback from students plays a key role in teaching and learning in the Commonwealth and can be a
critical source of evidence in understanding candidate performance. Student feedback must be
collected using model ESE Student Feedback Surveys. These surveys have been carefully crafted for
alignment to the Standards for Effective Teaching practice and items have been validated for use in the
Educator Evaluation Framework.3
3 For more information about the development and statistical properties of the surveys, please visit: http://www.doe.mass.edu/edeval/feedback/
Teacher candidates (or their Supervising Practitioners) may administer the short form of the student feedback surveys, although ESE encourages administration of the standard form, as it provides more substantial information about teaching practice and students' perceptions of the teacher candidate.
Audio Deep Dive: Student Feedback Surveys
ESE Model Feedback Surveys have been validated for use in grades 3-12. For candidates in PreK-2 placements, the Supervising Practitioner may solicit feedback from students using the K-2 Discussion Prompts & Administration Protocol. Due to the time-consuming nature of interviewing PreK-2 students, Supervising Practitioners may solicit feedback from a representative sample of students.
Candidate Artifacts
Additional artifacts may be submitted as evidence to support the assessment of candidates. These
artifacts may include, but are not limited to:
• unit and/or lesson plans
• examples of students’ work
• behavior plans/behavior data
• audio/video recordings
• reflection logs
One required candidate artifact is the set of completed Self-Assessment and Goal-Setting Forms (Appendix B). Assessors should use evidence from these forms to assess the candidate's readiness on element 4.A.1: Reflective Practice.
Evidence Requirements for Each Essential Element
CAP has been designed to generate and collect evidence for each of the essential elements. Each
component of the process has been developed to strategically support the assessors and candidates
in providing sufficient evidence that the thresholds for each element are being met. The table below
outlines the types of evidence that are, at a minimum, required to be used in assessing each element.
As you can see, each element has at least two required sources of evidence. Evidence collection is not
limited to the minimum requirements.
The Summative Assessment
Program Supervisors and Supervising Practitioners are responsible for conducting a summative
assessment with each candidate at the conclusion of CAP. Program Supervisors and Supervising
Practitioners jointly determine the overall ratings based on their collective professional judgment and a
thorough examination of evidence demonstrating that the candidate has met all readiness thresholds and is prepared to have a positive impact on students.
The Role of Professional Judgment
How does an assessor know how to rate a specific element? How does this translate into an overall
determination of readiness?
There are no numbers or percentages that dictate ratings on elements or the summative readiness
determination for an individual candidate. This approach to assessment is modeled on the underlying
tenets of the Educator Evaluation Framework. Consider the following excerpt from educator evaluation
guidance on the Summative Performance Rating:
Rather than adopt a more mechanistic, one-size-fits all approach to supervision and evaluation,
the Massachusetts evaluation framework encourages evaluators to look for trends and patterns
in practice across multiple types of evidence and apply their professional judgment based on
this evidence when evaluating an educator.4 The role of evidence and professional judgment
in the determination of ratings on performance Standards and an overall Summative
Performance Rating is paramount in this process. Formulaic or numerical processes that
calculate outcome ratings and preclude the application of professional judgment are
inconsistent with the letter and the spirit of the evaluation framework.
The use of professional judgment based on multiple types of evidence promotes a more holistic
and comprehensive analysis of practice, rather than over-reliance on one individual data point
or rote calculation of practice based on predetermined formulas. Evaluators are also
encouraged to take into account how educators respond to or apply additional supports and
resources designed to promote student learning, as well as their own professional growth and
development. Finally, professional judgment gives evaluators the flexibility to account for a
wide variety of factors related to individual educator performance, such as: school-specific
priorities that may drive practice in one Standard; an educator’s number of goals; experience
level and/or leadership opportunities; and contextual variables that may impact the learning
environment, such as unanticipated outside events or traumas.
That said, professional judgment does not equate to a “black box” from which evaluators can
determine a performance rating. Evaluators must be well versed in and follow three regulatory
requirements in the application of professional judgment to an educator’s practice:
4 "…[T]he evaluator determines an overall rating of educator performance based on the evaluator's professional judgment
and an examination of evidence that demonstrates the educator's performance against Performance Standards and
evidence of the attainment of the Educator Plan goals” (603 CMR 35.06(6); see also ESE Model Teacher & Caseload
Educator Contract, Section 14(b)); “The professional judgment of the primary evaluator shall determine the overall
summative rating that the Educator receives” (ESE Model Teacher & Caseload Educator Contract, Section 14(c)).
 Inclusion of three Categories of Evidence (organized and analyzed using a performance rubric)
 Assessment of Goal Attainment
 Minimum Threshold Requirements
Audio Deep Dive: Summative Ratings
This same line of thinking is applied to decision-making in CAP. The figure below illustrates the entire
process by which an assessor determines a Summative Performance Rating. Based on evidence from
four distinct categories of evidence, the assessor applies his/her professional judgment to (1) an
evaluation of the candidate’s practice within each of the six essential elements, and (2) an assessment
of the degree to which the candidate met his/her professional practice goal(s).
Roles & Responsibilities
There are several stakeholders involved in the effective implementation of field-based experiences and
the assessment of candidate readiness through CAP. Below we have detailed the essential
responsibilities for: Sponsoring Organizations, PK-12 Schools and District Partners, Program
Supervisors, Supervising Practitioners and candidates. The functions that follow are critical to both
candidate preparedness and the effective assessment of practice. Ultimately, it will take the combined effort of all parties involved to ensure that candidates are indeed ready to make an impact on day one.
Sponsoring Organizations:
 Work to serve the needs of PK-12 partners; involve partners in the design and execution of
field-based experiences; and engage in partnerships that improve the experience for
preparation candidates and outcomes for PK-12 students.
 Design, implement and evaluate the quality of field-based experiences, ensuring that they begin early in preparation, cover a range of time periods within the school year, are in settings with diverse learners, and build to candidate readiness for the licensure role.
 Identify candidates throughout the program who may be at risk of not meeting standards and provide necessary supports and guidance to guide improvement or exit.
 Identify Supervising Practitioners that meet all regulatory requirements, including being rated as proficient or higher on their most recent summative evaluation, and monitor their efficacy in impacting candidate performance.
 Provide training, support, and development to Program Supervisors and Supervising Practitioners that impacts candidate effectiveness.
 Ensure that all candidates receive consistent guidance, support and high-quality feedback during field-based experiences with the Program Supervisor and Supervising Practitioner.
 Oversee CAP as the culminating assessment of performance and ensure that it serves to document the evidence of candidate readiness (or not) for the licensure role. Maintain all required CAP forms on file at the Sponsoring Organization.
 Develop programs of study that ensure candidates are prepared to demonstrate readiness in their practicum placements.
 Use formative and summative assessment data to target areas of candidate need.
 Use data from CAP to inform strategic decisions that have a positive impact on programs, candidates and employing PK-12 partners.
PK-12 Schools and District Partners
 Engage in the design, implementation and evaluation of field-based experiences.
 Provide Sponsoring Organizations with a list of potential Supervising Practitioners that meet
regulatory requirements, including being rated as proficient or higher on their most recent
summative evaluation.
 Support teachers serving in the role of Supervising Practitioner; monitor their efficacy in impacting candidate effectiveness; and recognize individuals' contributions to the profession.
 Coordinate with Sponsoring Organizations to implement field-based experiences that cover a
range of time periods and are in settings with diverse learners.
 When appropriate, calibrate observations and feedback with Program Supervisor and
Supervising Practitioner to ensure teachers are receiving consistent messages about their
practice.
Program Supervisor
 Provide candidates with consistent guidance, support and high-quality feedback during field-based experiences that improves their practice.
 Use CAP as outlined in these guidelines to assess and document evidence of candidate
readiness for the licensure role.
 Coordinate the CAP process in collaboration with the Supervising Practitioner and candidate: stay on top of timelines; facilitate meetings; calibrate with the Supervising Practitioner; submit all forms.
 Actively collect evidence during observations, synthesize and analyze the evidence to provide
focused feedback to the candidate about their performance.
 Conduct at least two observations of the candidate; review information from all observations; support the Supervising Practitioner in conducting observations.
 Submit data on candidate performance on CAP.
Supervising Practitioner
Audio Deep Dive: Supervising Practitioners
 Use CAP as outlined in these guidelines to assess and document evidence of candidate readiness for the licensure role.
 Conduct at least two observations of the candidate; review information from all observations;
support the Program Supervisor in conducting observations.
 Actively collect evidence during observations, synthesize and analyze the evidence to provide
focused feedback to the candidate about their performance.
 Identify and set the measures of student learning to be used in CAP prior to the first Three-Way Meeting; support the Program Supervisor in interpreting candidate performance relative to the parameters that were set.
 Administer, or support the candidate in administering, the student feedback surveys.
Candidate
 Participate in CAP as outlined in these guidelines, including attending Three-Way Meetings, being available for additional observations, and collecting candidate artifacts as evidence.
 Engage in early field-based experiences and activities in coursework that provide you with the
knowledge and skills necessary to demonstrate readiness for the licensure role.
 Demonstrate competency at all threshold levels; attain growth on professional practice goal;
have a moderate or high impact on student learning.
 Administer, or support the Supervising Practitioner in administering, the student feedback
surveys.
 Provide feedback to your Sponsoring Organization about your experience in your preparation
program.
Appendices
APPENDIX A: CANDIDATE ASSESSMENT OF PERFORMANCE (CAP) FORM AND RUBRIC ....................................... 25
APPENDIX B: CANDIDATE SELF-ASSESSMENT & GOAL-SETTING ........................................................................... 34
CANDIDATE SELF-ASSESSMENT & GOAL-SETTING: AN OVERVIEW ....................................................................................... 35
Self-Assessment ................................................................................................................................................. 35
Goal-Setting & Plan Development ..................................................................................................................... 36
CANDIDATE SELF-ASSESSMENT FORM ............................................................................................................................. 37
APPENDIX C: FINALIZED GOAL & IMPLEMENTATION PLAN FORM....................................................................... 42
APPENDIX D: OBSERVATION FORMS & MODEL PROTOCOL ................................................................................ 44
OBSERVATION OVERVIEW ............................................................................................................................................. 45
Announced Observation #1 ............................................................................................................................... 51
Unannounced Observation #1 ............................................................................................................................. 53
Announced Observation #2 ............................................................................................................................... 54
Unannounced Observation #2 ........................................................................................................................... 54
ESE MODEL OBSERVATION PROTOCOL ........................................................................................................................... 51
Before the Observation ..................................................................................................................................... 51
During the Observation ..................................................................................................................................... 53
After the Observation ........................................................................................................................................ 54
APPENDIX E: THREE-WAY MEETING CHECKLISTS ................................................................................................. 62
APPENDIX F: CROSSWALK FOR THE SIX ESSENTIAL ELEMENTS OF CAP AND PST GUIDELINES .............................. 65
APPENDIX G: MEASURING CANDIDATE IMPACT ON STUDENT LEARNING ........................................................... 68
APPENDIX H: CAP & THE EDUCATOR EVALUATION FRAMEWORK ........................................................................ 70
APPENDIX I: 603 CMR 7.00 REGULATIONS FOR EDUCATOR LICENSURE AND PROGRAM APPROVAL .................... 73
Appendix A: Candidate Assessment of Performance (CAP) Form and Rubric
The following appendix includes three sections to be completed:
 Section 1: General information should be completed by the teacher candidate and the
Program Supervisor
 Section 2: Rubrics should be completed by the Supervising Practitioner and the Program
Supervisor
 Section 3: Summary and Signatures will need to be completed by the Supervising
Practitioner, the Program Supervisor, and the teacher candidate.
All sections of the form must be retained on file at the Sponsoring Organization.
Candidate Assessment of Performance (CAP) Form & Rubric
Candidate Assessment of Performance Form and Rubric
Section 1: General Information (to be completed by the Candidate)
Candidate Information
First Name:
Last Name:
Street Address:
City/Town:
State:
Zip:
MEPID #:
Massachusetts license number (if applicable):
Program Information
Sponsoring Organization:
Program Area & Grade
Level:
Have any components of the approved program been waived? (603 CMR 7.03(1)(b)) Yes / No
Practicum Information
Practicum / Practicum Equivalent
Practicum/Equivalent Course Number:
Credit hours:
Practicum/Equivalent Seminar Course Title:
Practicum/Equivalent Site:
Grade Level(s) of Students:
Total Number of Practicum Hours:
Number of hours assumed full
responsibility in the role:
Supervising Practitioner Information (to be completed by the Program Supervisor)
Name:
School District:
Position:
License Field(s):
MEPID or License #:
# of years of experience under license:
License: Initial / Professional
Yes / No: To the best of my knowledge (per the Supervising Practitioner's Principal/Evaluator), the Supervising Practitioner has received a summative evaluation rating of proficient or higher in his/her most recent evaluation.
Section 2: CAP Rubric (to be completed by the Program Supervisor and Supervising Practitioner)
I.A.4: Well-Structured Lessons
I-A-4. Well-Structured Lessons
Unsatisfactory: Develops lessons with inappropriate student engagement strategies, pacing, sequence, activities, materials, resources, and/or grouping for the intended outcome or for the students in the class.
Needs Improvement: Develops lessons with only some elements of appropriate student engagement strategies, pacing, sequence, activities, materials, resources, and grouping.
Proficient: Develops well-structured lessons with challenging, measurable objectives and appropriate student engagement strategies, pacing, sequence, activities, materials, resources, technologies, and grouping.
Exemplary: Develops well-structured and highly engaging lessons with challenging, measurable objectives and appropriate student engagement strategies, pacing, sequence, activities, materials, resources, technologies, and grouping to attend to every student's needs. Is able to model this element.
Formative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
Summative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
I.B.2: Adjustment to Practice
I-B-2. Adjustment to Practice
Unsatisfactory: Makes few adjustments to practice based on formal and informal assessments.
Needs Improvement: May organize and analyze some assessment results but only occasionally adjusts practice or modifies future instruction based on the findings.
Proficient: Organizes and analyzes results from a variety of assessments to determine progress toward intended outcomes and uses these findings to adjust practice and identify and/or implement appropriate differentiated interventions and enhancements for students.
Exemplary: Organizes and analyzes results from a comprehensive system of assessments to determine progress toward intended outcomes and frequently uses these findings to adjust practice and identify and/or implement appropriate differentiated interventions and enhancements for individuals and groups of students and appropriate modifications of lessons and units. Is able to model this element.
Formative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
Summative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
II.A.3: Meeting Diverse Needs
II-A-3. Meeting Diverse Needs
Unsatisfactory: Uses limited and/or inappropriate practices to accommodate differences.
Needs Improvement: May use some appropriate practices to accommodate differences, but fails to address an adequate range of differences.
Proficient: Uses appropriate practices, including tiered instruction and scaffolds, to accommodate differences in learning styles, needs, interests, and levels of readiness, including those of students with disabilities and English learners.
Exemplary: Uses a varied repertoire of practices to create structured opportunities for each student to meet or exceed state standards/local curriculum and behavioral expectations. Is able to model this element.
Formative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
Summative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
II.B.1: Safe Learning Environment
II-B-1. Safe Learning Environment
Unsatisfactory: Maintains a physical environment that is unsafe or does not support student learning. Uses inappropriate or ineffective rituals, routines, and/or responses to reinforce positive behavior or respond to behaviors that interfere with students' learning.
Needs Improvement: May create and maintain a safe physical environment but inconsistently maintains rituals, routines, and responses needed to prevent and/or stop behaviors that interfere with all students' learning.
Proficient: Uses rituals, routines, and appropriate responses that create and maintain a safe physical and intellectual environment where students take academic risks and most behaviors that interfere with learning are prevented.
Exemplary: Uses rituals, routines, and proactive responses that create and maintain a safe physical and intellectual environment where students take academic risks and play an active role, individually and collectively, in preventing behaviors that interfere with learning. Is able to model this element.
Formative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
Summative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
II.D.2: High Expectations
II-D-2. High Expectations
Unsatisfactory: Gives up on some students or communicates that some cannot master challenging material.
Needs Improvement: May tell students that the subject or assignment is challenging and that they need to work hard but does little to counteract student misconceptions about innate ability.
Proficient: Effectively models and reinforces ways that students can master challenging material through effective effort, rather than having to depend on innate ability.
Exemplary: Effectively models and reinforces ways that students can consistently master challenging material through effective effort. Successfully challenges students' misconceptions about innate ability. Is able to model this element.
Formative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
Summative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
IV.A.1: Reflective Practice
IV-A-1. Reflective Practice
Unsatisfactory: Demonstrates limited reflection on practice and/or use of insights gained to improve practice.
Needs Improvement: May reflect on the effectiveness of lessons/units and interactions with students but not with colleagues and/or rarely uses insights to improve practice.
Proficient: Regularly reflects on the effectiveness of lessons, units, and interactions with students, both individually and with colleagues, and uses insights gained to improve practice and student learning.
Exemplary: Regularly reflects on the effectiveness of lessons, units, and interactions with students, both individually and with colleagues; and uses and shares with colleagues insights gained to improve practice and student learning. Is able to model this element.
Formative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
Summative Assessment
Quality: *   Scope: *   Consistency: *
Evidence:
Candidate Assessment of Performance Form and Rubric
Section 3: Summary and Signatures
Three-Way Meetings
1st Three-Way Meeting
Candidate
Supervising Practitioner
Program Supervisor
Date:
2nd Three-Way Meeting
Candidate
Supervising Practitioner
Program Supervisor
Date:
Final Three-Way Meeting
Candidate
Supervising Practitioner
Program Supervisor
Date:
Summary Ratings
Element (rated for Quality, Consistency, and Scope)
1.A.4: Well-Structured Lessons
1.B.2: Adjustment to Practice
2.A.3: Meeting Diverse Needs
2.B.1: Safe Learning Environment
2.D.2: High Expectations
4.A.1: Reflective Practice
Readiness Thresholds Met?
Based on the candidate’s performance as measured on the CAP Rubric, we have determined this candidate to be:
Ready to Teach
Not Yet Ready
Supervising Practitioner
Date:
Program Supervisor
Date:
Mediator (if necessary; see 603 CMR 7.04(4))
Date:
Appendix B: Candidate Self-Assessment & Goal-Setting
This section provides an overview of expectations for the candidate in conducting a Self-Assessment
and developing preliminary goals and a plan for improvement. Included are:
 Overview of the Self-Assessment and Goal-Setting processes
 Candidate Self-Assessment Form
 Preliminary Goal-Setting and Plan Development Form
As a companion to these documents, please see Appendix C, which includes the form to record the
finalized goals and implementation plan.
Candidate Assessment of Performance (CAP): Self-Assessment & Goal-Setting
Candidate Self-Assessment & Goal-Setting: An Overview
One of the most important characteristics of the MA Educator Evaluation Framework is that it is designed to
provide each educator with significant agency over his/her evaluation experience. That starts with the
Self-Assessment, during which educators reflect on their practice, review data, and identify areas of focus for
their goals. Likewise, the Candidate Assessment of Performance (CAP) positions candidates to play a lead role in
maximizing their practicum experiences through the inclusion of self-assessment and goal-setting activities. With
support from the Program Supervisor and Supervising Practitioner, the candidate evaluates his/her practice and
develops a professional practice goal that will help guide action steps, resources/supports, and feedback
throughout the 5-Step Cycle.
The forms that follow were developed to guide candidates through the self-assessment and goal-setting
processes. The Self-Assessment and draft goal are shared with the Program Supervisor and Supervising
Practitioner in advance of the first Three-Way Meeting and will be used as a source of evidence for the 4.A.1
Reflective Practice essential element.
Self-Assessment
In conducting the Self-Assessment, candidates are asked to consider their prior experiences and generate an
authentic assessment of where their strengths lie and where there are areas in need of improvement. Analysis
of prior and existing practice is grounded in the CAP Rubric. Candidates should reflect on the following in
completing the Self-Assessment:
 Skills acquired in coursework
 Experiences in pre-practicum
 Targeted feedback they have received about their practice
 Evidence of impact with students
 Reflection on performance in Announced Observation #1
In the Self-Assessment, as in the CAP Rubric, candidates are asked to consider aspects of their knowledge and
skill across three dimensions:
 Quality: ability to perform the skill, action or behavior as described in the proficient performance
descriptor
 Consistency: the frequency (e.g., all the time, sometimes, once) that the skill, action or behavior is
demonstrated with quality
 Scope: the scale of impact (e.g., one student, subset of children, all students) to which the skill, action
or behavior is demonstrated with quality
By considering performance across these dimensions, candidates are able to identify discrete and specific areas
of strength and areas for growth. The Self-Assessment Form does not, however, include the thresholds present in
the CAP Rubric, to ensure that the established expectations for performance do not unintentionally hinder
candidates’ ability to provide an authentic assessment of their performance.
Candidates should complete the Self-Assessment within the first three weeks of the practicum placement.
Candidates will be asked to share the completed Self-Assessment with the Program Supervisor and Supervising
Practitioner.
Goal-Setting & Plan Development
Candidates are required to draft a professional practice goal as part of the Candidate Assessment of
Performance (CAP). Professional practice goals are driven by the needs of the individual educator in relation to
the four Professional Standards for Teachers (PSTs). Professional practice goals should be closely aligned to the
CAP Rubric and support the learning and development of the candidate, with the intent of helping him/her
improve his/her practice. Unlike in the Educator Evaluation Framework, candidates are not required to draft a
student learning goal while engaging in CAP, as this measure is set for them by the Supervising Practitioner.
Through the goal-setting form, candidates are guided to craft a S.M.A.R.T goal, consistent with practices
expected of educators under the Educator Evaluation Framework. The S.M.A.R.T goal framework is useful in
helping individuals create effective goals and action plans. Key characteristics of S.M.A.R.T goals are:
S = Specific and Strategic – Goals should be specific so that at the end of the evaluation cycle educators
and evaluators can determine whether they have been achieved. Goals should also be strategic, i.e.,
serve an important purpose for students, the school, and/or the district.
M = Measurable – Goals should be measurable so that progress toward a goal can be evaluated and
managed.
A = Action Oriented – Goals have active, not passive verbs. The action steps attached to the goals
indicate who is doing what.
R = Rigorous, Realistic, and Results Focused (the 3 Rs) – Goals should make clear what will be different
as a result of achieving the goal. A goal needs to describe a realistic yet ambitious result. It needs to
stretch the educator, team, school, or district toward improvement, but it should not be out of reach.
T = Timed and Tracked – A goal needs to have a final deadline, as well as interim deadlines by which key
actions will be completed and benchmarks will be achieved. Tracking the progress on both action steps
and outcome benchmarks is important, as it helps educators know whether they are on track to
achieve the goal and gives them the information they need to make midcourse corrections.
Candidates’ professional practice goals should be derived from the Self-Assessment and target specific areas
they identified as opportunities for growth. Goals will be finalized during the first Three-Way Meeting.
Candidate Assessment of Performance (CAP): Self-Assessment Form
Candidate Self-Assessment Form
Directions: Independently, reflect on your performance in each dimension of an element. Use the performance
descriptors from the CAP Rubric to help ground your assessment. Authenticity is encouraged. Consider the
following in rating your current level of performance (as applicable):
 Skills acquired in coursework
 Experiences in pre-practicum
 Targeted feedback you have received about your practice
 Evidence of impact with students
 Reflection on performance in Announced Observation #1
I.A.4: Well-Structured Lessons
I-A-4. Well-Structured Lessons
Unsatisfactory: Develops lessons with inappropriate student engagement strategies, pacing, sequence, activities, materials, resources, and/or grouping for the intended outcome or for the students in the class.
Needs Improvement: Develops lessons with only some elements of appropriate student engagement strategies, pacing, sequence, activities, materials, resources, and grouping.
Proficient: Develops well-structured lessons with challenging, measurable objectives and appropriate student engagement strategies, pacing, sequence, activities, materials, resources, technologies, and grouping.
Exemplary: Develops well-structured and highly engaging lessons with challenging, measurable objectives and appropriate student engagement strategies, pacing, sequence, activities, materials, resources, technologies, and grouping to attend to every student’s needs. Is able to model this element.
Quality
Scope
Consistency
I.B.2: Adjustment to Practice
I-B-2. Adjustment to Practice
Unsatisfactory: Makes few adjustments to practice based on formal and informal assessments.
Needs Improvement: May organize and analyze some assessment results but only occasionally adjusts practice or modifies future instruction based on the findings.
Proficient: Organizes and analyzes results from a variety of assessments to determine progress toward intended outcomes and uses these findings to adjust practice and identify and/or implement appropriate differentiated interventions and enhancements for students.
Exemplary: Organizes and analyzes results from a comprehensive system of assessments to determine progress toward intended outcomes and frequently uses these findings to adjust practice and identify and/or implement appropriate differentiated interventions and enhancements for individuals and groups of students and appropriate modifications of lessons and units. Is able to model this element.
Quality
Scope
Consistency
Dimensions of Readiness: Quality: ability to perform the skill, action or behavior; Consistency: the frequency (e.g., all the
time, sometimes, once) that the skill, action or behavior is demonstrated with quality; Scope: the scale of impact (e.g., one
student, subset of children, all students) to which the skill, action or behavior is demonstrated with quality
II.A.3: Meeting Diverse Needs
II-A-3. Meeting Diverse Needs
Unsatisfactory: Uses limited and/or inappropriate practices to accommodate differences.
Needs Improvement: May use some appropriate practices to accommodate differences, but fails to address an adequate range of differences.
Proficient: Uses appropriate practices, including tiered instruction and scaffolds, to accommodate differences in learning styles, needs, interests, and levels of readiness, including those of students with disabilities and English learners.
Exemplary: Uses a varied repertoire of practices to create structured opportunities for each student to meet or exceed state standards/local curriculum and behavioral expectations. Is able to model this element.
Quality
Scope
Consistency
II.B.1: Safe Learning Environment
II-B-1. Safe Learning Environment
Unsatisfactory: Maintains a physical environment that is unsafe or does not support student learning. Uses inappropriate or ineffective rituals, routines, and/or responses to reinforce positive behavior or respond to behaviors that interfere with students’ learning.
Needs Improvement: May create and maintain a safe physical environment but inconsistently maintains rituals, routines, and responses needed to prevent and/or stop behaviors that interfere with all students’ learning.
Proficient: Uses rituals, routines, and appropriate responses that create and maintain a safe physical and intellectual environment where students take academic risks and most behaviors that interfere with learning are prevented.
Exemplary: Uses rituals, routines, and proactive responses that create and maintain a safe physical and intellectual environment where students take academic risks and play an active role—individually and collectively—in preventing behaviors that interfere with learning. Is able to model this element.
Quality
Scope
Consistency
II.D.2: High Expectations
II-D-2. High Expectations
Unsatisfactory: Gives up on some students or communicates that some cannot master challenging material.
Needs Improvement: May tell students that the subject or assignment is challenging and that they need to work hard but does little to counteract student misconceptions about innate ability.
Proficient: Effectively models and reinforces ways that students can master challenging material through effective effort, rather than having to depend on innate ability.
Exemplary: Effectively models and reinforces ways that students can consistently master challenging material through effective effort. Successfully challenges students’ misconceptions about innate ability. Is able to model this element.
Quality
Scope
Consistency
IV.A.1: Reflective Practice
IV-A-1. Reflective Practice
Unsatisfactory: Demonstrates limited reflection on practice and/or use of insights gained to improve practice.
Needs Improvement: May reflect on the effectiveness of lessons/units and interactions with students but not with colleagues and/or rarely uses insights to improve practice.
Proficient: Regularly reflects on the effectiveness of lessons, units, and interactions with students, both individually and with colleagues, and uses insights gained to improve practice and student learning.
Exemplary: Regularly reflects on the effectiveness of lessons, units, and interactions with students, both individually and with colleagues, and uses, and shares with colleagues, insights gained to improve practice and student learning. Is able to model this element.
Quality
Scope
Consistency
Candidate Self-Assessment: Summary Sheet
Name:
Date:
Directions: In the table below, please record the rating for each element. Use the following key: Exemplary (E),
Proficient (P), Needs Improvement (NI), Unsatisfactory (U)
Self-Assessment Summary
Element (rated for Quality, Consistency, and Scope)
1.A.4: Well-Structured Lessons
1.B.2: Adjustment to Practice
2.A.3: Meeting Diverse Needs
2.B.1: Safe Learning Environment
2.D.2: High Expectations
4.A.1: Reflective Practice
Based on your Self-Assessment, briefly summarize your areas of strength and high-priority areas for growth.
Area(s) of Strength
Element/Dimension
Evidence/Rationale
Area(s) for Growth
Element/Dimension
Evidence/Rationale
Please share your Self-Assessment Summary as well as the Goal Setting & Plan Development Forms with your
Program Supervisor and Supervising Practitioner at least three days in advance of the initial Three-Way Meeting,
or earlier upon request.
Candidate Assessment of Performance (CAP): Goal-Setting & Plan Development
Preliminary Goal-Setting & Plan Development
Name:
Date:
Prompt: Identify/Clarify a Focus or Goal Topic (Essential Element, See Self-Assessment Form)
Strategic Prompt: Why is this topic/focus area important?
Objective:
Specific, Rigorous, Results-Focused Prompt: What skills, knowledge, or practice will I acquire or develop through
achieving this goal?
Realistic, Timed Prompt: When will I achieve this goal?
Action-Oriented, Tracked Prompt: How will I demonstrate progress toward this goal?
Measured Prompt: How will I know the goal has been achieved?
Draft Professional Practice Goal:
What actions will you take to achieve the goal?

What actions/supports/resources will you need from
your Program Supervisor and Supervising Practitioner?

Appendix C: Finalized Goal & Implementation Plan Form
The form that follows may be used to document the candidate’s finalized professional practice goal
and the agreed-upon implementation plan. This form is optional; Sponsoring Organizations and
assessors may adopt or adapt it.
Candidate Assessment of Performance (CAP): Candidate Goal(s) & Implementation Plan
Candidate Professional Practice Goal(s) & Implementation Plan
Name:
Date:
Goal(s): Based on the candidate’s self-assessment and the baseline ratings determined by the Program
Supervisor and Supervising Practitioner, the candidate has set the following S.M.A.R.T goal(s):
Essential Element
CAP Professional
Practice Goal(s)
Implementation Plan: In support of attaining the goal(s), the candidate, Program Supervisor and
Supervising Practitioner agree on the following actions (add more rows as needed):
Action
Supports/Resources from
Timeline/Frequency
Measure of Student Learning: In addition to attaining the professional practice goal, the candidate will
also be assessed, in part, on their impact on student learning. The Supervising Practitioner, in
coordination with the Program Supervisor, has set the following measures of student learning:
Measure of Student Learning
Parameters
Impact Rating: High / Moderate / Low
Candidate Assessment of Performance (CAP): Observation Forms & Model Protocol
Appendix D: Observation Forms & Model Protocol
This section provides a model protocol that Sponsoring Organizations may choose to use or modify
according to their needs. The use of this protocol is optional but highly recommended. The
protocol and forms include:
 An overview of the observation expectations
Required Observation Forms:
 Observation forms for each of the four observations (including a blank form to be used as needed
for additional observations).
o Announced Observation #1 Form
o Unannounced Observation #1 Form
o Announced Observation #2 Form
o Unannounced Observation #2 Form
o Blank Observation Form
Optional Model Observation Protocol:
 A detailed, step-by-step outline of the Model Observation Protocol, describing what should occur before,
during, and after each observation, including:
o Pre-Conference Planning Form
o Post-Conference Planning Form
o Candidate Observation Self-Reflection form
Observation Overview
Observations are one of the most critical sources of evidence collected by assessors in the Candidate
Assessment of Performance (CAP). The protocol and forms that follow are designed to support candidates and
assessors in engaging in observations that:
1) Collect and document evidence of performance for the six Essential Elements
2) Provide focused, actionable feedback to candidates about their performance
Under CAP, assessors are required to conduct a minimum of four observations: two announced and two
unannounced. Program Supervisors and Supervising Practitioners are encouraged to conduct additional
observations; Sponsoring Organizations may set additional requirements for the context of their programs that
exceed the minimum number of observations required.
It is the expectation that each observation (announced and unannounced) include all of the following:
 Active evidence collection during the observation (see below for more information)
 Analysis and synthesis of the evidence by the Program Supervisor/Supervising Practitioner following the
observation, linking evidence to the six essential elements and identifying strengths and areas for
improvement
 Self-reflection by the candidate
 Targeted feedback to the candidate that will improve his/her practice
Announced observations must include all of the above and the following:
 Review of candidate’s lesson materials (e.g., plan, assessment goals, relevant student artifacts) by
Program Supervisor/ Supervising Practitioner in advance
 Conversation prior to the observation about goals for the lesson and areas of focus for evidence
collection and feedback (driven by candidate’s goals and Essential Elements)
The forms that follow are designed to document implementation of the observations according to the
expectations outlined above. These forms must be retained in candidate files. As is the case throughout CAP,
Sponsoring Organizations may add additional components or expectations for documentation.
Note on Active Evidence Collection: ESE expects that the assessor conducting the observation actively collect
evidence during the observation, including teacher moves/behaviors and student actions/behaviors. It is
important to note that assessors should avoid making judgments about performance DURING the observation.
Active evidence collection should serve solely to document what happens during the observation. After the
observation, assessors should refer to the evidence to support claims about candidate performance. There are
several methods that may be deployed to accomplish this, including time-stamped scripting,
videotaping, audio recording, etc. It is recommended that Sponsoring Organizations require and train Program
Supervisors/Supervising Practitioners in a particular method of evidence collection, as this will help to calibrate
the consistency of feedback provided to candidates. ESE is not collecting documentation of active evidence
collection; the forms that follow, however, will serve as an indication that it has in fact occurred.
In addition to the observation forms, ESE has also provided a model protocol. The Model Observation Protocol is
designed to be supportive of Program Supervisors and Supervising Practitioners as they facilitate all aspects of
the observation process including: preparing for the pre-conference; conducting the pre-conference; selecting
refinement and reinforcement objectives; and conducting a post-conference. The model is provided as a
resource only; Sponsoring Organizations and assessors may adopt or adapt the model protocol to meet their
needs, or use something else entirely.
Candidate Assessment of Performance (CAP): Observation Form
Observation Form
What:
How:
Who:
Focus Elements:
Pre-Observation Conference
Observation Details
Date:
Time (start/end):
Content Topic/Lesson Objective:
Whole Group
Small Group
One-on-One
Other
Active Evidence Collection occurred during the observation and is synthesized and categorized below.
Element
Evidence
1.A.4
1.B.2
2.A.3
2.B.1
2.D.2
4.A.1
Focused Feedback
Reinforcement Area/Action:
(strengths)
Refinement Area/Action:
(areas for improvement)
Candidate Assessment of Performance (CAP): Observation Forms
CAP Observation Form: Announced Observation #1
What: Observation #1
How: Announced
Who: Program Supervisor & Supervising Practitioner
Focus Elements: 1.A.4: Well-Structured Lessons; 2.D.2: High Expectations
Note: As this is the first observation, assessors should attempt to collect evidence for all
elements in order to provide a baseline for future observations.
Pre-Observation Conference
Observation Details
Date:
Time (start/end):
Content Topic/Lesson Objective:
Whole Group
Small Group
One-on-One
Other
Active Evidence Collection occurred during the observation and is synthesized and categorized below.
Element*
Evidence**
1.A.4*
1.B.2
2.A.3
2.B.1
2.D.2*
4.A.1
Focused Feedback
Reinforcement Area/Action:
(strengths)
Refinement Area/Action:
(areas for improvement)
* Observations must collect and document evidence for at least the focus elements. Focus elements are highlighted.
** Evidence included is indicative of performance relative to each element. It may include evidence that demonstrates one
or more of the dimensions (quality, consistency, scope) of an element are being met or that performance is not yet at the
expected threshold.
CAP Observation Form: Unannounced Observation #1
What: Observation #1
How: Unannounced
Who: Supervising Practitioner
Focus Elements: 1.A.4: Well-Structured Lessons; 2.B.1: Safe Learning Environment
Observation Details
Date:
Time (start/end):
Content Topic/Lesson Objective:
Whole Group
Small Group
One-on-One
Other
Active Evidence Collection occurred during the observation and is synthesized and categorized below.
Element*
Evidence**
1.A.4*
1.B.2
2.A.3
2.B.1*
2.D.2
4.A.1
Focused Feedback
Reinforcement Area/Action:
(strengths)
Refinement Area/Action:
(areas for improvement)
* Observations must collect and document evidence for at least the focus elements. Focus elements are highlighted.
** Evidence included is indicative of performance relative to each element. It may include evidence that demonstrates one
or more of the dimensions (quality, consistency, scope) of an element are being met or that performance is not yet at the
expected threshold.
CAP Observation Form: Announced Observation #2
What: Observation #2
How: Announced
Who: Program Supervisor
Focus Elements: 1.B.2: Adjustment to Practice; 2.A.3: Meeting Diverse Needs
Pre-Observation Conference
Observation Details
Date:
Time (start/end):
Content Topic/Lesson Objective:
Whole Group
Small Group
One-on-One
Other
Active Evidence Collection occurred during the observation and is synthesized and categorized below.
Element*
Evidence**
1.A.4
1.B.2*
2.A.3*
2.B.1
2.D.2
4.A.1
Focused Feedback
Reinforcement Area/Action:
(strengths)
Refinement Area/Action:
(areas for improvement)
* Observations must collect and document evidence for at least the focus elements. Focus elements are highlighted.
** Evidence included is indicative of performance relative to each element. It may include evidence that demonstrates one
or more of the dimensions (quality, consistency, scope) of an element are being met or that performance is not yet at the
expected threshold.
Observation Form: Unannounced Observation #2
What: Observation #2
How: Unannounced
Who: Supervising Practitioner & Program Supervisor
Focus Elements: 1.B.2: Adjustment to Practice, and others as identified during the Formative Assessment
Observation Details
Date:
Time (start/end):
Content Topic/Lesson Objective:
Whole Group
Small Group
One-on-One
Other
Active Evidence Collection occurred during the observation and is synthesized and categorized below.
Element*
Evidence**
1.A.4
1.B.2*
2.A.3
2.B.1
2.D.2
4.A.1
Focused Feedback
Reinforcement Area/Action:
(strengths)
Refinement Area/Action:
(areas for improvement)
* Observations must collect and document evidence for at least the focus elements. Focus elements are highlighted.
** Evidence included is indicative of performance relative to each element. It may include evidence that demonstrates one
or more of the dimensions (quality, consistency, scope) of an element are being met or that performance is not yet at the
expected threshold.
Candidate Assessment of Performance (CAP): Model Protocol
ESE Model Observation Protocol
The model protocol was developed with support and expertise from the National Institute for Excellence in
Teaching (NIET) and is based in great part on NIET’s extensive experience conducting evidence-based
observations and meaningful evaluation conversations that lead to improved practice. Future training and
support provided by ESE on CAP will focus on the approach to observation outlined in this model protocol. The
Model Observation Protocol is provided as a resource and suggested framework; Sponsoring Organizations may
choose to adopt or adapt the model protocol to meet their needs, or use their own protocol for observation that
best suits their organization.
The Model Observation Protocol guides assessors through several steps:
1) Before the Observation:
 Preparing for the pre-conference
 Conducting the pre-conference
2) During the Observation:
 Actively collecting evidence
3) After the Observation:
 Analyzing the evidence
 Preparing for the post-conference
 Identifying reinforcement/refinement objectives
 Conducting the post-conference
Implementation of each step will, of course, vary depending on the type of observation (announced vs.
unannounced) and whether one assessor or multiple assessors are participating.
The assessor will use a given protocol, the Observation Form, and the CAP Rubric (Appendix A) to successfully
complete each observation. At the end of the observation, the assessor will be able to rate the candidate on the
essential elements that are being observed for Quality, Scope, and Consistency, and will be able to provide
evidence to support the chosen ratings.
Before the Observation
Work done prior to the observation focuses on conducting a pre-conference. Therefore, these steps are
likely to apply only to an announced observation, as they require advance notification to the candidate
of the upcoming observation. The pre-conference can be an important opportunity to build rapport with the
teacher candidate, establish a coaching relationship, and begin to collect evidence for the upcoming
observation.
Ideally, the pre-conference occurs one to two days prior to the observation and lasts 15 to 20 minutes.
Preparing for the Pre-Conference
Begin by gathering/reviewing evidence, including:
 Lesson Plan
 Lesson materials (e.g., assessment, handouts, etc.)
 Prior observations and feedback provided to candidate
Review the lesson plan and associated materials for evidence of each of the six essential elements. The
Pre-Conference Planning Form can guide this evidence collection.
It is important to note that this observation is part of an assessment. To this end, assessors should be careful
not to dramatically influence or alter the candidate’s plan prior to the lesson. Assessors should refrain from
providing substantial feedback on the lesson plan prior to the observation. Reviewing the lesson plan and
materials in advance serves only to better position the assessors in collecting focused evidence and providing
quality feedback after the observation.
The planning form also prompts the assessor to draft a set of questions to aid in the conversation that occurs in
the pre-conference. The assessor should create questions that are likely to:
1) Generate additional evidence in support of the essential elements
2) Address any aspects of the lesson that may negatively impact the observation
The pre-conference is short; to keep the conversation focused, assessors should prepare 2-3 questions, keeping in
mind the goals mentioned above. The following are examples of questions the assessor may plan to ask in a
pre-conference:
General Questions
 Tell me about the lesson I will observe.
 What do you expect students to know and be able to do at the end of the lesson?
 What kind of background do the students need to have for this lesson?
 Tell me about any challenges or specific areas of the rubric that you are currently working to
strengthen.
 Is there anything else you would like me to know before the lesson?
Standards/Objectives
 How will you check for student mastery in the lesson?
 How will the learning objective be communicated to students?
 How do you plan to connect the lesson to previous learning?
 What type(s) of thinking will be evident in the lesson? How will students apply this thinking
during the lesson?
Lesson Structure and Pacing
 Talk about the lesson structure (beginning, middle, and end).
 Talk about classroom procedures.
 How is the lesson structured for students who may progress at different learning rates?
 Talk about any anticipated learning difficulties that may occur during the lesson.
Activities and Materials
 How do the activities relate to the objective?
 How will you make the lesson relevant to students?
 Talk about the grouping that will be used in the lesson to maximize student learning.
 Talk about the strategies that will be used during the lesson to maximize student understanding.
If the observation is being conducted jointly between the Program Supervisor and Supervising Practitioner, the
preparation should also be coordinated to ensure that the two observers have a unified focus and set of
expectations for the observation.
Conducting the Pre-conference
Ideally, the pre-conference occurs one to two days prior to the observation and lasts between 15-20 minutes.
While conducting the pre-conference face-to-face may be preferable, assessors may consider taking advantage
of other methods of engaging in pre-conference, such as Skype, Facetime, Google Hangout, etc.
A pre-conference should include the following: an introduction, a discussion based on the review of lesson
materials, and a summary of next steps.
Pre-conference Introduction (2 min)
The introduction helps to set the tone and purpose of the pre-conference. While it may appear overly formal, it
can be valuable in establishing routines that help to keep the conversation focused and brief. Below is an
example of one approach to the introduction of a pre-conference:
 Greeting: “Thanks for taking the time to meet with me. I’m really looking forward to coming into your
class on _________”
 Time: “This discussion should take us about 20 minutes”
 Set Purpose: “The purpose of our conversation is for you to help me to know what I can expect to see
happen during the observation and for you to know what things I am specifically looking for.”
Discussion of the Lesson (15 min)
Following the brief introduction, the assessor should transition quickly into probing further on the candidate’s
intentions and plans for the lesson being observed. It is most productive when the assessor has reviewed the
lesson plan prior to this conversation and can ask specific, probing questions about the lesson. The candidate
should be doing the majority of the talking during this portion of the pre-conference. The assessor should
capture notes on the conversation. Below is an example of one approach to the discussion about the lesson:
 Reference review of materials: “I reviewed the materials you sent me in advance and think I have a clear
sense of the lesson but was hoping you could elaborate on a few points to be sure I understand your
plan.”
 Ask questions: see pre-conference preparation section for examples
Pre-conference Closure (3 min)
Assessors should leave time at the end of the conference to summarize any takeaways from the conversation as
well as align expectations for the upcoming observation. Below is an example of one approach to pre-conference closure:
 Revisit prior feedback: “After our second observation, we agreed that you would work to [fill in] so I will
be looking for evidence of that in the upcoming observation.”
 Review the focus Essential Elements: “Also, because this is the third observation, I will also be collecting
evidence specifically for element 1.B.2.”
 Summarize takeaways from the conversation: “Based on what you shared with me during our
conversation, it sounds like you are also looking for feedback on your transitions so I will be sure to make
note of those as well.”
During the Observation
The primary goal of the assessor during the observation is to actively collect evidence. Active evidence collection
should capture both teacher and student behaviors/actions. Active evidence collection does not include making
judgments or inferences during the observation; these occur afterward, when the assessor is analyzing and
synthesizing the evidence. Evidence should reflect exactly what happens in the classroom. Evidence collected
should include a balance of summary statements and direct quotes.
There are various tools assessors may use to collect evidence during the lesson. This could include scripting,
videotaping, audio-recording, or using other commercially available applications that aid in observing specific
classroom interactions.
Evidence collected during the observation is solely to aid the assessor in identifying trends and selecting
illustrative examples of aspects of performance. It is not designed to be shared directly with the candidate nor is
it collected by ESE. Individual providers may, however, decide to collect this information from assessors for
training or documentation purposes.
After the Observation
After an observation, assessors review evidence collected and begin to analyze it as a measure of candidate
performance and then strategically plan for a post-conference in which candidates are provided with targeted
feedback.
Ideally, post-conferences occur one to two days following the observation and last between 20-30 minutes.
Post-conferences should not occur immediately after the lesson as this does not allow for sufficient time for the
assessors to synthesize evidence and feedback or for the candidate to adequately reflect.
Sponsoring Organizations may also consider having the candidate submit a written reflection to the assessor(s)
prior to the post-conference. See the Candidate Self-Reflection Form. If adding this step, assessors should plan
to complete their analysis prior to reviewing the candidate self-reflection.
Analyzing & Categorizing the Evidence
Following the observation, the assessors should review the evidence collected during the lesson and begin to fill
in the evidence chart located in the appropriate Observation Form. When categorizing evidence, assessors
should consider the following:
 Not every piece of evidence collected during the observation needs to be sorted into the evidence chart.
 Evidence may demonstrate that one or more of the dimensions (Quality, Consistency, Scope) of an
element are being met OR that performance is not yet at the expected threshold.
 It is recommended that you consult the CAP Rubric when categorizing evidence.
 Evidence statements should not simply reiterate or restate the performance descriptors present in the
CAP Rubric; the evidence should explain what happened in the observation that shows/does not show
that a skill has been demonstrated.
 Assessors might consider “tagging” evidence that gets included in the Observation Forms by dimension
(quality, scope, consistency) so that it can easily be referred to when making summative judgments.
Preparing for the Post-Conference
The primary purpose of the post-conference is to provide candidates feedback about their performance during
the observation. After categorizing evidence according to the elements, assessors should work to identify
specific areas of strength (reinforcement) and areas for improvement (refinement). The Post-Conference
Planning Form can guide the development of this feedback.
Begin by gathering/reviewing evidence, including:
 Lesson Plan & Pre-Conference Planning Form
 Notes from Pre-Conference
 Observation Form that contains categorized evidence
 Candidate Self-Reflection Form (if required)
For observations that are conducted jointly, the Program Supervisor and Supervising Practitioner should
calibrate on the evidence categorized as well as the identification of areas for reinforcement/refinement. This
must be done prior to meeting with the candidate to ensure that the candidate receives consistent, calibrated
feedback about their performance.
Identifying Reinforcement and Refinement Areas
Assessors are asked to identify for the candidate areas of strength and areas for improvement. This does not
preclude the candidate from self-identifying areas as well. Areas of reinforcement and refinement should be tied
directly to the six Essential Elements.
 Reinforcement: The area of reinforcement should identify the candidate’s instructional strength in a
way that encourages the continuation of effective practices in the future. The area of reinforcement
should be deep rooted in student-based evidence that demonstrates successful positive impact on
student learning.
 Refinement: The area of refinement should identify the areas in need of instructional improvement.
In reflecting on the analysis of the evidence, assessors should select one or two (but no more than three)
reinforcement and refinement areas. Assessors are encouraged to select the reinforcement and refinement
areas that are most likely to improve candidate practice and have a positive impact on student learning.
The refinement/reinforcement objectives can focus on the Quality, Consistency or Scope dimension of an
element. However, assessors should not set refinement or reinforcement goals around Consistency or Scope
until the candidate has successfully met the Quality threshold.
There are several guiding questions and considerations that can support the identification of the effective
reinforcement/refinement areas:
 Which areas on the CAP rubric received the highest ratings (reinforcements) and the lowest ratings
(refinements)?
 Which of these areas would have the greatest impact on student achievement?
 Which of these areas would have the greatest impact on other areas of the rubric?
 In which area does the candidate have the most potential for growth?
 Make sure that the reinforcement is not directly related to the refinement. It is important that
candidates see their area of strength as separate from their area needing improvement.
 Choose a refinement area for which you have sufficient and specific evidence from the lesson to support
why the teacher needs to work in this area.
 Select refinement topics around which you are prepared to provide specific support. There is nothing
worse than telling a teacher they need to alter their practice and then not being able to provide specific
examples for how this can be done.
 Understand the teacher’s capacity when identifying an area of refinement. In other words, where will
you get the biggest bang for your buck?
 Remember—reinforcements should serve only to strengthen the candidate’s performance. Do not hedge
this part of the post-conference with qualifying statements such as “it could have been even better if,”
or “next time you could also do…” Teachers need to hear what they are effective at, and have it be left
at that.
 When developing the post-conference plan, consider identifying the area of refinement first. This will
ensure that the reinforcement and refinement do not overlap.
Once you have identified the areas of refinement/reinforcement, fill them in at the bottom of the Observation
Form.
Conducting the Post-Conference
Ideally, the post-conference occurs one to two days after the observation and lasts between 20-30 minutes.
While conducting the post-conference face-to-face may be preferable, assessors may consider taking advantage
of other methods of engaging in the post-conference, such as Skype, Facetime, Google Hangout, etc.
Post-conferences should not occur immediately after the lesson, as this does not allow sufficient time for the
assessors to synthesize evidence and feedback or for the candidate to adequately reflect.
A post-conference should include the following: an introduction, a discussion of reinforcement/refinement
areas, and a summary of next steps.
Post-conference Introduction (5 min)
The introduction helps to set the tone and purpose of the post-conference. While it may appear overly formal, it
can be valuable in establishing routines that help to keep the conversation focused and brief. Below is an
example of one approach to the introduction of a post-conference:
 Greeting: “Thanks for taking the time to meet with me. I’m really looking forward to our discussion on
the lesson I was able to see in action.”
 Time: “This discussion should take us about 30 minutes”
 Set Purpose: “The purpose of our conversation is for us to identify both strengths and areas of
improvement in your practice”
 Probe for self-reflection: “What are your thoughts about how the students responded to the lesson?” OR
if the candidate already completed the self-reflection form, “I saw from your reflection that…”
Discussion of Reinforcement/Refinement Areas (20 min)
The discussion about strengths and areas for improvement should begin with outlining the areas of
reinforcement and then transition to the areas of refinement. The assessor should provide specific examples
from the observation as evidence of the area of refinement or reinforcement. Below is an example of one
approach to the discussion:
 Share areas of Reinforcement:
o Provide evidence from observation: “There were several instances throughout the lesson where
you asked a variety of questions to check for student understanding. For example, after showing
the pictograph you…”
o State impact on students: “In doing so, students were required to justify their thinking and it
allowed you to quickly identify misconceptions in students’ understanding.”
o Provide recommended action: “Continue to…”
 Share areas of Refinement:
o Ask self-reflection question: Ask a specific question to prompt the teacher to talk about what
you want him or her to improve. Utilize a question that includes specific language from the
rubric, which can lead the teacher to reflect on the indicator you have identified as his/her area
of refinement as it relates to the lesson. Example: “When developing lessons, how do you decide
on the pacing of the lesson so sufficient time is allocated for each segment?”
o Share evidence from observation: “You mentioned earlier that you wanted students to be able to
work in groups and then report their findings. However, there was not sufficient time for this to
occur during the lesson. According to the observation log, the first 6 minutes were spent
organizing materials and transitioning students; the next 23 minutes were spent with you modeling
the objective at the board with some question-and-answer time built in.”
o Provide concrete suggestions for how to improve: “As you modeled how to analyze a
pictograph, students could have worked with their group members to answer your questions
prior to your providing the answer, then they could have reported to the class their findings. This
would have still allowed you to model, but would have also allowed students to work together to
analyze the pictograph. Students who may not have required this review could have worked
independently in a group to analyze their own pictograph while the rest of the class participated
in your modeling. This would have also allowed you to differentiate the pacing of the lesson to
provide for students who progress at different learning rates. This lesson could also have been
segmented into two different lessons.”
o Provide recommended action: “Moving forward…”
o Share resource/support: “As you work to further refine this skill, I think it might be helpful if you
go and observe Mrs. Blank in 3rd grade, who is highly skilled in this area. I’ve already spoken with
her and she has agreed to an observation and debrief next week.”
Post-conference Closure (5 min)
Evaluators should leave time at the end of the conference to summarize any takeaways from the conversation.
Below is an example of one approach to post-conference closure:
 Share Observation Form: “I’ve categorized the evidence from observation as well as recorded the
reinforcement and refinement areas and actions here” (Alternatively, you may have the candidate fill in
the refinement/reinforcement goals based on the discussion. Evaluators should, however, be sure that it
is an accurate reflection of the intended objectives for refinement/reinforcement.)
 Leave time for questions: “Do you have any other questions?”
 Confirm next steps in process: “The next formal observation will be unannounced and conducted by your
Supervising Practitioner. Because it is the second observation she will be focusing evidence collection on
the refinement goals we discussed today as well as essential elements 1.A.4 and 2.B.1.”
CAP Model Protocol: Pre-Conference Planning Form
Model Observation Protocol: Pre-Conference Planning Form
Observation Details
Date:
Time (start/end):
Content Topic/Lesson Objective:
Grouping: Whole Group / Small Group / One-on-One / Other

Element | Evidence
1.A.4: Well-Structured Lessons
1.B.2: Adjustments to Practice
2.A.3: Meeting Diverse Needs
2.B.1: Safe Learning Environment
2.D.2: High Expectations
4.A.1: Reflective Practice

Refinement areas previously identified:
Questions to ask in pre-conference:
CAP Model Protocol: Candidate Self-Reflection Form
Model Observation Protocol: Candidate Self-Reflection Form
Directions: Following an announced or an unannounced observation, please use the form below to reflect on
the lesson. Submit the form to your Supervising Practitioner/Program Supervisor within 24 hours of the
observation.
Observation Details
Date:
Time (start/end):
Content Topic/Lesson Objective:
Type of Observation: Announced / Unannounced
Observed by: Supervising Practitioner / Program Supervisor
Reflection Prompt: What do you think went particularly well? How did this strength impact your students’
learning?
Reflection Prompt: If you could teach this lesson again, is there anything you would do differently? How would
this have impacted your students’ learning?
Essential Element | Evidence: Where possible, provide one piece of evidence that you believe
demonstrates your performance relative to the quality, consistency, or scope of each element.
1.A.4: Well-Structured Lessons
1.B.2: Adjustments to Practice
2.A.3: Meeting Diverse Needs
2.B.1: Safe Learning Environment
2.D.2: High Expectations
CAP Model Protocol: Post-Conference Planning Form
Model Observation Protocol: Post-Conference Planning Form
Observation Details
Date:
Time (start/end):
Content Topic/Lesson Objective:
Refinement Area #1 (select element):
1.A.4: Well-Structured Lessons
1.B.2: Adjustments to Practice
2.A.3: Meeting Diverse Needs
2.B.1: Safe Learning Environment
2.D.2: High Expectations
4.A.1: Reflective Practice
Self-Reflection Question(s) to prompt candidate:
Evidence from Observation:
Recommended Action:
Potential Resources/Guided Practice/Training to support:

Refinement Area #2 (select element):
1.A.4: Well-Structured Lessons
1.B.2: Adjustments to Practice
2.A.3: Meeting Diverse Needs
2.B.1: Safe Learning Environment
2.D.2: High Expectations
4.A.1: Reflective Practice
Self-Reflection Question(s) to prompt candidate:
Evidence from Observation:
Recommended Action:
Potential Resources/Guided Practice/Training to support:
Reinforcement Area #1 (select element):
1.A.4: Well-Structured Lessons
1.B.2: Adjustments to Practice
2.A.3: Meeting Diverse Needs
2.B.1: Safe Learning Environment
2.D.2: High Expectations
4.A.1: Reflective Practice
Evidence from Observation:
Recommended Action:

Reinforcement Area #2 (select element):
1.A.4: Well-Structured Lessons
1.B.2: Adjustments to Practice
2.A.3: Meeting Diverse Needs
2.B.1: Safe Learning Environment
2.D.2: High Expectations
4.A.1: Reflective Practice
Evidence from Observation:
Recommended Action:
Upcoming Steps in the CAP Process
 Type of Next Observation:
 Focus of Next Observation:
 Date/topic of next three-way meeting:
 Other:
Appendix E: Three-Way Meeting Checklists
The purpose of the following resource is to support Program Supervisors and Supervising Practitioners
in executing Three-Way Meetings. Please refer to the appropriate appendices for more details on each
topic.
Candidate Assessment of Performance (CAP): Three-Way Meeting Checklist
Announced Observation #1 (PS & SP)

First Three-Way Meeting (occurs within first three weeks)

Before
 TC: Complete Self-Assessment & Goal-Setting Forms; share with PS/SP
 SP and PS: Conduct a Post-Conference for Announced Obs. #1
 SP and PS: Calibrate feedback from Announced Obs. #1
 SP and PS: Review Candidate Self-Assessment & Goal-Setting Form
 SP and PS: Prepare to share baseline ratings on CAP Rubric

During (45-60 min)
 Share baseline ratings
 Finalize professional practice goal(s)
 Agree on implementation plan
 Sign-off at conclusion of meeting

After
 TC: Share goals and plan with practicum seminar instructor
 TC: Act on commitments made in implementation plan

Forms
 Required: Observation Form: Announced Observation #1; CAP Rubric & Form (Sections 1 & 3); Candidate Self-Assessment & Goal-Setting Form (used by SP/PS to prepare baseline)
 Optional: Model Observation Protocol: Post-Conference Planning Form; Finalized Goal(s) & Implementation Plan Form
Unannounced Observation #1 (SP) and Announced Observation #2 (PS)

Second Three-Way Meeting (occurs half-way through)

Before
 TC: Administer Student Feedback Surveys; share results with PS/SP
 SP and PS: Review all available evidence (including observations, student feedback, measures of student learning, self-reflection forms, etc.)
 SP and PS: Individually assess candidate performance using the CAP Rubric
 SP and PS: Calibrate formative assessment ratings

During (30-45 min)
 Share formative ratings and discuss
 Revisit candidate goals and implementation plan; adjust accordingly (including potentially modifying the goal, increasing supports, adding additional observations, etc.)
 Sign-off at conclusion of meeting

After
 Share formative assessment and updated goals and plan with practicum seminar instructor
 Schedule/conduct additional observations
 Act on commitments made in implementation plan

Forms
 Required: CAP Rubric & Form (Section 2: Formative Assessment, and Section 3)
 Optional: Finalized Goal(s) & Implementation Plan Form
Unannounced Observation #2 (SP; PS optional but encouraged)

Final Three-Way Meeting (in final two weeks)

Before
 TC: Share evidence of performance including, but not limited to: candidate artifacts, measures of student learning, student feedback
 SP and PS: Review all available evidence
 SP and PS: Individually assess candidate performance using the CAP Rubric
 SP and PS: Calibrate summative assessment ratings

During (30-45 min)
 Share summative ratings and discuss
 Sign-off at conclusion of meeting

After
 TC: Draft a professional practice goal to use during first (or next) year of employment
 SP and PS: Ensure all documents are retained in candidate files
 SP and PS: Submit summative assessment data

Forms
 Required: CAP Rubric & Form (Section 2: Summative Assessment, and Section 3)
Appendix F: Crosswalk for the Six Essential Elements of CAP and PST
Guidelines
The table below maps the indicators identified at the “demonstrate” level of practice in the
PST Guidelines to the essential elements assessed in CAP.
CAP Crosswalk: Coverage of PSTs through the Essential Elements
(1) Curriculum, Planning, and Assessment standard: Promotes the learning and growth of all students by
providing high quality and coherent instruction, designing and administering authentic and meaningful student
assessments, analyzing student performance and growth data, using this data to improve instruction, providing
students with constructive feedback on an on-going basis, and continuously refining learning objectives.
(a) Curriculum and Planning indicator: Knows the subject matter well, has a good grasp of child
development and how students learn, and designs effective and rigorous standards-based units of
instruction consisting of well-structured lessons with measurable outcomes.
Essential Element in CAP: 1.A.4 Well-Structured Lessons

(b) Assessment indicator: Uses a variety of informal and formal methods of assessment to measure student
learning, growth, and understanding, develop differentiated and enhanced learning experiences, and
improve future instruction.
Essential Element in CAP: 1.B.2 Adjustments to Practice

SEI (a): Uses instructional planning, materials, and student engagement approaches that support students
of diverse cultural and linguistic backgrounds, strengths, and challenges.
Essential Element in CAP: 1.A.4 Well-Structured Lessons

SEI (c): Demonstrates knowledge of the difference between social and academic language and the
importance of this difference in planning, differentiating and delivering effective instruction for
English language learners at various levels of English language proficiency and literacy.
Essential Elements in CAP: 2.B.1 Safe Learning Environment; 2.A.3 Meeting Diverse Needs; 2.D.2 High Expectations
(2) Teaching All Students standard: Promotes the learning and growth of all students through instructional
practices that establish high expectations, create a safe and effective classroom environment, and demonstrate
cultural proficiency.
(a) Instruction indicator: Uses instructional practices that reflect high expectations regarding content and
quality of effort and work, engage all students, and are personalized to accommodate diverse learning
styles, needs, interests, and levels of readiness.
Essential Elements in CAP: 2.A.3 Meeting Diverse Needs; 2.D.2 High Expectations; 1.A.4 Well-Structured Lessons

(b) Learning Environment indicator: Creates and maintains a safe and collaborative learning environment
that values diversity and motivates students to take academic risks, challenge themselves, and claim
ownership of their learning.
Essential Elements in CAP: 2.B.1 Safe Learning Environment; 2.A.3 Meeting Diverse Needs

(c) Cultural Proficiency indicator: Actively creates and maintains an environment in which students'
diverse backgrounds, identities, strengths, and challenges are respected.
Essential Element in CAP: 2.B.1 Safe Learning Environment

(d) Expectations indicator: Plans and implements lessons that set clear and high expectations and make
knowledge accessible for all students.
Essential Elements in CAP: 2.A.3 Meeting Diverse Needs; 2.D.2 High Expectations; 1.A.4 Well-Structured Lessons

(f) Classroom Management indicator: Employs a variety of classroom management strategies to monitor,
modify, and motivate positive student behavior and to establish and maintain consistent routines and
procedures.
Essential Elements in CAP: 2.A.3 Meeting Diverse Needs; 2.B.1 Safe Learning Environment

SEI (b): Uses effective strategies and techniques for making content accessible to English language learners.
Essential Element in CAP: 2.A.3 Meeting Diverse Needs

SEI (d): Creates and maintains a safe collaborative learning environment that values diversity and
motivates students to meet high standards of conduct, effort, and performance.
Essential Element in CAP: 2.A.3 Meeting Diverse Needs
(4) Professional Culture standard: Promotes the learning and growth of all students through ethical,
culturally proficient, skilled, and collaborative practice.
(a) Reflection indicator: Demonstrates the capacity to reflect on and improve the educator's own practice,
using informal means as well as meetings with teams and work groups to gather information, analyze
data, examine issues, set meaningful goals, and develop new approaches in order to improve teaching
and learning.
Essential Element in CAP: 4.A.1 Reflective Practice

(f) Professional Responsibilities indicator: Is ethical and reliable, and meets routine responsibilities
consistently.
Essential Elements in CAP: 1.A.4 Well-Structured Lessons; 1.B.2 Adjustments to Practice; 2.A.3 Meeting Diverse Needs; 2.B.1 Safe Learning Environment; 2.D.2 High Expectations; 4.A.1 Reflective Practice
Appendix G: Measuring Candidate Impact on Student Learning
The guidance that follows outlines how Supervising Practitioners should identify student learning
measures and set parameters for impact.
CAP: Guidance for Measuring Candidate Impact on Student Learning
The Supervising Practitioner should identify at least one measure of student learning, growth, or achievement
that assesses a meaningful sample of the content the teacher candidate is primarily responsible for teaching.
The Supervising Practitioner will set clear expectations for how and when the measure will be administered and
scored. In addition, relying on his/her professional experience with the identified measure(s) and his/her
understanding of the specific learning context, the Supervising Practitioner will set parameters for a range of
expected learning, growth, or achievement (see ESE’s Implementation Brief on Scoring and Parameter Setting
for more information about this process). Student outcomes below that range will be considered lower than
expected and outcomes above that range will be considered higher than expected.
For example, if the teacher candidate is responsible for teaching a
math unit, the Supervising Practitioner may choose the end of unit
assessment as the measure of student learning to include in the CAP. If
over the past four units the average end-of-unit assessment scores
were 84, 89, 81, and 83, the Supervising Practitioner may determine
that a class average between 80 and 90 represents expected
achievement, less than 80 represents lower than expected
achievement and more than 90 represents higher than expected
achievement.
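The arithmetic behind this example can be sketched in a few lines of code. The function below is purely illustrative: the function name is invented, and the thresholds are taken from the hypothetical math-unit example above, not prescribed by CAP; Supervising Practitioners set their own parameters for each measure.

```python
# Illustrative sketch of the parameter-setting example above.
# The thresholds (80 and 90) are the hypothetical expected range
# from the math-unit example; they are not fixed CAP values.

def classify_outcome(class_average, low=80, high=90):
    """Compare a class average to the expected range set in advance."""
    if class_average < low:
        return "lower than expected"
    if class_average > high:
        return "higher than expected"
    return "expected"

# The prior four end-of-unit averages from the example: 84, 89, 81, 83.
prior = [84, 89, 81, 83]
print(sum(prior) / len(prior))   # 84.25, supporting an 80-90 expected range

print(classify_outcome(86))      # expected
print(classify_outcome(78))      # lower than expected
print(classify_outcome(95))      # higher than expected
```

A candidate's subsequent reflection would then ask why a result landed in a given band, not simply report the band itself.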
The candidate will administer the identified measure(s) of student
learning, growth, or achievement. Administration does not need to
occur at the end of the practicum, but rather at the instructionally
appropriate time during the practicum. After the measure is scored,
the candidate should analyze the results and compare them to the
parameters set by the Supervising Practitioner. Did all students achieve
the expected outcomes? If not, were there patterns in performance
that might indicate why some students made higher or lower than
expected gains?
Wherever possible, measures of
student growth should be used. As
stated in Technical Guide B,
“Student growth scores provide
greater insight into student learning
than is possible through the sole use
of single-point-in-time student
achievement measures. This is
because students do not enter a
classroom with the same level of
knowledge, skills, and readiness.
Achievement scores provide
valuable feedback to educators
about student attainment against
standards, but taken by themselves
may not be a sufficient reflection of
student progress.” Growth
measures allow students of all
abilities an opportunity to
demonstrate how much they have
learned and in many ways provide a
fuller picture of the impact of
instruction.
The experience of administering, scoring, and analyzing a measure of
student learning, growth, or achievement is a crucial component of
CAP. It is an essential skill of every effective teacher to be able to draw
conclusions about his/her practice from student outcome data.
Therefore, it is important to gauge a candidate’s aptitude to develop this skill. It is important to note that a
measure of student learning, growth, or achievement is not a complete measure of a candidate’s impact on
student learning. In the educator evaluation framework, multiple measures over multiple years are used to
inform conclusions about educator impact. Given the abbreviated classroom experience associated with CAP, it
is impossible to generate enough data to draw a conclusion about the candidate’s impact on student learning. It
is possible, however, to assess the candidate’s ability to reflect on student outcomes and make connections to
his/her practice.
Candidate Assessment of Performance
69
Appendix H: CAP & the Educator Evaluation Framework
This section highlights the important parallels between CAP’s 5-step Cycle and the 5-step Cycle used as
part of the Educator Evaluation Framework.
CAP & the Educator Evaluation Framework
The CAP measures a candidate’s readiness to be effective on day one of his/her teaching career. It is aligned to
the Massachusetts Educator Evaluation Framework, which outlines a comprehensive process for continuous
educator growth and development used by all school districts statewide. This intentional alignment exemplifies
the cohesion Massachusetts is building across the educator career continuum through our complementary
educator effectiveness policies.
The CAP is designed to promote honest reflection about candidate performance and keep student learning at
the center of a candidate’s practicum experience. As in the evaluation framework, inquiry into practice and
impact is grounded in a 5-Step Cycle. The 5-Step Cycle used in CAP has been modified to meet the needs of
candidates, Program Supervisors, and Supervising Practitioners, but retains the same core architecture of the
cycle included in the evaluation framework:
Step 1: Self-Assessment
Step 2: Goal-Setting and Plan Development
Step 3: Plan Implementation
Step 4: Formative Assessment
Step 5: Summative Evaluation
One of the most important characteristics of the educator evaluation framework is its design, which gives each
educator significant agency over his/her evaluation experience. That starts with the Self-Assessment,
during which educators reflect on their practice, review data, and identify areas of focus for their goals.
Likewise, the CAP positions candidates to play a lead role in maximizing their practicum experiences through
the inclusion of self-assessment and goal setting activities. With support from the Program Supervisor and
Supervising Practitioner, the candidate evaluates his/her practice and develops a professional practice goal
that will form the backbone of his/her plan throughout CAP.
Another point of alignment is visible in Step 3: Plan Implementation. The CAP, like the educator evaluation
framework, requires the collection of multiple forms of evidence to evaluate educator practice and progress
toward goals. Announced and unannounced observations, artifacts of practice, student feedback, and
measures of student learning are all part of the evidentiary bases of both the CAP and the evaluation
framework. This deliberate congruity will help candidates successfully transition to the educator workforce.
The CAP’s inclusion of a formative assessment prior to the summative evaluation is also borrowed from the
educator evaluation framework. The evaluation framework is founded on educator growth and development,
so it is fitting that “no surprises” has emerged as a mantra in many districts. It is vitally important that
educators receive consistent, timely, and actionable feedback throughout the duration of their plans. Beyond
that ongoing feedback, the formative assessment provides an opportunity for a more thorough mid-point check.
Evaluators and
educators sit down to review evidence of the educator’s practice as it relates to their performance rubric and
the educator’s goals. If there are concerns, the evaluator may adjust the educator’s plan to provide more
targeted support. The formative assessment plays a similar role in the CAP. Here, Program Supervisors and
Supervising Practitioners meet with candidates to review the evidence collected so far and decide what
supports or interventions, such as additional observations, might be needed. Candidates in jeopardy of not
meeting CAP expectations should be put on notice during the formative assessment and be provided strategies
for improvement prior to the summative assessment.
Finally, following the CAP’s summative assessment, passing candidates use the feedback received to develop a
draft professional practice goal for their first year of teaching. Since this assessment is firmly aligned to the
educator evaluation framework, candidates placed in Massachusetts districts will enter with a high degree of
familiarity and comfort with the 5-Step Cycle and be prepared with a draft professional practice goal informed by
the authentic evaluation experience provided by the CAP.
In support of the alignment between these two systems, Sponsoring Organizations should make use of the
significant resource materials available through the educator evaluation website. In particular:
• Training Modules and Workshops for Evaluators and Teachers, including a few spotlighted below:
  o For Evaluators (to be adapted for use with Supervising Practitioners and Program Supervisors)
    - Module 5: Gathering Evidence
    - Module 6: Observations & Feedback
  o For Teachers (to be adapted for use with candidates)
    - Workshop 2: Self-Assessment
    - Workshop 4: Gathering Evidence
• Evidence Collection Toolkit: Designed to support districts to establish clear and consistent expectations
  for evidence collection and promote a meaningful process for the collection, analysis, and sharing of
  high-quality artifacts. Includes: brief guidance, examples of district strategies, a worksheet for district
  decision-making, and a handout of Evidence Collection Tips for Educators.
• Student Feedback Surveys, including:
  o Resources that highlight the value and importance of this measure in understanding and improving
    teaching practice.
  o A Technical Report outlining the process of developing and validating the surveys.
  o The Model Feedback Surveys and Administration Protocols.
• Guidance on measuring student impact, including example assessments and implementation briefs
  issued to support the work around developing District-Determined Measures (DDMs).
• Quick Reference Guides that provide helpful overviews, including a few spotlighted below:
  o Connection between Educator Evaluation & Professional Development
  o Connection between Educator Evaluation & the MA Curriculum Frameworks
Appendix I: 603 CMR 7.00 Regulations for Educator Licensure and Program
Approval
This section excerpts regulations from 603 CMR 7.00 outlining the requirements relevant to the
execution of these guidelines.
The Massachusetts Regulations for Educator Licensure and Preparation Program Approval (603 CMR 7.03)
require an assessment of a candidate’s performance in a practicum or practicum equivalent using guidelines
developed by the Massachusetts Department of Elementary and Secondary Education (ESE) for all programs
that are approved to endorse candidates for an Initial teacher license5 as well as the Performance Review
Program for Initial Licensure (603 CMR 7.05(4)(c) and 7.08(1)).
603 CMR 7.08(1) Professional Standards for Teachers
(1) Application. The Professional Standards for Teachers define the pedagogical and other professional
knowledge and skills required of all teachers. These standards and indicators referred to in 603 CMR 7.08 (2) and
(3) are used by Sponsoring Organizations in designing their teacher preparation programs and in preparing their
candidates. The standards and indicators are also used by the Department in reviewing programs seeking state
approval, and as the basis of performance assessments of candidates. Candidates shall demonstrate that they
meet the Professional Standards and Indicators referred to in 603 CMR 7.08 (2) and (3) by passing a
Performance Assessment for Initial License using Department guidelines.
5 These guidelines apply to all initial licensure teacher candidates except the following specialist licenses:
Academically Advanced, Reading, and Speech, Language, and Hearing Disorders teachers. It is the expectation of
ESE that programs endorsing in license areas not covered by CAP develop and implement a performance
assessment appropriate for the license.