
edTPA
Local Evaluation
University of Wisconsin-Whitewater
Dr. Kelly Jewell
edTPA Local Evaluation
Materials
• Local evaluation slides and script are authored by Tine Sloan, Nicole
Merino, and Tory Harvey. Other materials are authored by the Stanford
Center for Assessment, Learning and Equity (SCALE). All are available
for use by campuses participating in edTPA at the “exploratory” level.
• Copyright © 2013 Board of Trustees of the Leland Stanford Junior
University. All rights reserved.
• edTPA is a trademark of Stanford or its affiliates. Use, reproduction, copying, or redistribution of trademarks without the written permission of Stanford or its affiliates is prohibited.
Confidentiality & Secure Access - NDA
• Candidate Samples
• Non-Disclosure Agreement
What is Local Evaluation?
• A way to make sense of candidate work through the frame of edTPA rubrics (common language, shared understandings)
• An opportunity to understand how candidates are or are not meeting performance standards (feedback to programs, candidates)
• A time to investigate the degree to which official scores relate to local evaluations (agreement check)
What Local Evaluation is NOT
• Scoring
• A deficit view of candidates’ work
Today
1. Considerations for Local Evaluation
2. Structure of tasks and rubrics
3. Evaluation Process
– Planning
– Instruction
– Assessment
4. Feedback
Considerations for local evaluation
How to organize the actual events
Considerations for local evaluation: Who’s involved
Considerations for local evaluation: When you do it
• Before or after official score results
Considerations for local evaluation: Type of data you are working with
• Candidate artifacts & commentaries
• Rubric-level official scores & local evaluations
• Overall pass rates
Considerations for local evaluation: Tasks & Time
• Which tasks? All 3 tasks or a subset? All rubrics or a subset?
• How many candidates? All or a %? 2-3 portfolios per evaluator is reasonable.
• Which samples? For collaborative analysis: random samples, or samples that represent different levels of proficiency?
• How much meeting time? A full day allows for 3 tasks; 2 hours per sample for independent reviews; another ½ day to debrief results of the independent review.
Considerations: Recommendations for 1st evaluation event
• Who’s doing it: Everyone – instructors & supervisors together
• When you do it: Before official scores are in
• Level of data: Candidate documents, video
• Tasks & Time:
  – All or a % of candidate documents
  – All tasks & rubrics
  – 2-3 portfolios per individual to evaluate on their own
  – Schedule:
    • Full day of collaborative work with a common sample (protocol modeled today)
    • A few days for individual time to evaluate another shared portfolio (4-5 evaluators per portfolio)
    • 2 hours for groups to meet & calibrate above before evaluating remaining portfolios on their own
    • ½ - full day after completed evaluation to discuss results
For all local evaluation, the first step is:
• Looking at the evidence candidates create through edTPA
• Learning the rubrics
• Using the rubrics to make sense of the evidence & map it to the rubrics
Structure of tasks
and rubrics
Introductory information
Structure of the Portfolio
Planning
• Instructional and social context
• Lesson plans and instructional materials, student assessments
• Planning Commentary
Instruction
• Video Clips
• Instruction Commentary
Assessment
• Analysis of whole class assessment
• Analysis of learning and feedback to THREE students
• Assessment Commentary
Analysis of Teaching Effectiveness
Academic Language Development
Two types of evidence: Artifacts & Commentaries
• Artifacts: instructional and social context; lesson plans and instructional materials; student assessments; video clips; analysis of whole class assessment; analysis of learning and feedback to THREE students
• Commentaries: Planning, Instruction, and Assessment Commentaries
Analysis of Teaching Effectiveness
Academic Language Development
Key Content Understandings
A word on Academic Language
• To better understand the Academic Language demands within edTPA, candidates, faculty, and evaluators should access:
  – Academic Language Overview – TPAC Online
Rubric Blueprint
Task name: Rubric Title
Guiding Question
• Level 1: Struggling candidate, not ready to teach
• Level 2: Needs more practice
• Level 3: Acceptable level to begin teaching
• Level 4: Solid foundation of knowledge and skills
• Level 5: Highly accomplished beginner
Rubric progression
Expanding repertoire of skills & strategies
Deepening of rationale and reflection
• 1 (Not Ready): teacher focus; whole class; fragmented, indiscriminate
• 5 (Proficient Novice / Highly Accomplished Beginner): student focus; individuals/flexible groups; integrated, intentional & well executed
• Quality of writing
• Grade level teaching assignment
• Emotional reactions
• Quantity and technical quality of materials
• Leniency/Stringency
• Halo Effect
• Bias
• Fill in the gaps
Evaluation Process
Time to dig into the work
Materials
• Handbooks
• Candidate work sample
• Evaluation Rubrics
• One person per table will have documented evidence mapped to the evaluation rubric for the candidate sample (do not look at these yet)
Our process today:
• As a group: overview of each rubric; strategies for navigating the task & gathering evidence
• Individually: read; gather evidence; map evidence to the rubric
• As a table: discuss to come to agreement; compare to the provided evaluation
• As a group: share
Task-by-Task Evaluation
• Read the Context for Learning.
• Evaluate Task 1 (Planning) completely, then move on to Task 2.
• Evaluate Task 2 (Instruction) completely (consulting Task 1 evidence as needed).
• Evaluate Task 3 (Assessment) completely (consulting Tasks 1 & 2 evidence as needed).
Task 1
PLANNING
Overview of the Planning Task
Artifacts
• Instructional and learning context
• Lesson plans and instructional materials, student assessments
• Planning Commentary
Rubrics
1. Planning to Build Student Understanding
2. Planning to Support Varied Student Learning Needs
3. Using Knowledge of Students to Inform Teaching and Learning
4. Identifying and Supporting Academic Language Demands
5. Planning Assessments to Monitor and Support Student Learning
History/SS Rubric 1
History/SS Evaluation Rubric 1: Look Fors
Guiding Question: How does the candidate use evidence to evaluate and change teaching practice to meet students’ varied learning needs?
• Identify key characteristics of a performance category
• Concrete examples that demonstrate candidate performance
• Examples are not exhaustive
Navigation of the Planning task
1. Read the Context of Learning form.
2. Skim the lesson plans; note the progression of objectives and assessments.
3. Read the commentary; pay attention to which prompts link to which rubrics (see Local Eval Rubric Doc).
Considerations for gathering evidence
Goals
• Identify evidence that aligns with a performance category.
• Refer to the source of the evidence (artifact commentary reference).
Preferences…
• Hard copy
– Highlight
– Underline
– Written notes
• Electronic
– Copy and paste
– Highlight
Gathering & mapping evidence for Rubric 1
• Where to find the evidence:
  – Skim the context for learning form
  – Focus on prompt 1 in the commentary
  – Skim lesson plans & teaching materials – focus on the progression of daily objectives
• Materials to use:
  – Candidate sample (do not look at scores)
  – Evaluation Rubric 1
• What are you looking for?
  – See “look fors”
  – Highlight, note evidence linked to rubric language
  – Determine where most of the evidence falls: emerging, proficient, or advanced
Individually: read; highlight & gather evidence; map evidence to the rubric category
As a table: share which rubric category you placed it in; discuss discrepancies, come to agreement; compare to the provided evaluation
History/SS Evaluation Rubric 2
History/SS Evaluation Rubric 3
Gathering & mapping evidence for Rubrics 2 & 3
• Where to find the evidence:
  – Focus on prompts 2 & 3 in the commentary
  – Refer to the context form & lesson plans
• Materials to use:
  – Candidate sample (do not look at scores)
  – Evaluation Rubrics 2 & 3
• What are you looking for?
  – See “look fors”
  – Highlight, note evidence linked to rubric language
  – Determine where most of the evidence falls: emerging, proficient, or advanced
Individually: read; highlight & gather evidence; map evidence to the rubric category
As a table: share which rubric category you placed it in; discuss discrepancies, come to agreement; compare to the provided evaluation
History/SS Evaluation Rubric 4
Academic Language Demands
• There are language demands that teachers need to consider as they plan to support student learning of content. These include:
  – Language Functions
  – Vocabulary
  – Syntax
  – Discourse
Vocabulary and Language Functions
Language Functions are the content and language focus of learning tasks, often represented by the active verbs within the learning outcomes. Functions are the purposes for which language is used. For example:
• Summarizing information
• Evaluating performances
• Classifying based on attributes
Vocabulary includes words and phrases (and symbols) that are used within the disciplines, including:
1. words and phrases with subject-specific meanings that differ from meanings used in everyday life;
2. general academic vocabulary used across disciplines; and
3. subject-specific words defined for use in the discipline.
Syntax and Discourse
Syntax is the set of conventions for organizing symbols, words, and phrases together into structures (e.g., sentences, formulas, staffs in music).
Discourse includes the structures of written and oral language. It is how members of the discipline talk, write, and participate in knowledge construction. Discipline-specific discourse has distinctive ways of structuring oral or written language (text structures) that provide useful ways for the content to be communicated. For example:
– Narration
– Exposition
– Description
– Argument
History/SS Evaluation Rubric 5
Gathering & mapping evidence for Rubrics 4-5
• Where to find the evidence:
  – Rubric 4: Commentary prompt 4, lesson plans & materials
  – Rubric 5: Commentary prompt 5, lesson plans, assessments
• Materials to use:
  – Candidate sample (do not look at scores)
  – Evaluation Rubric document
• What are you looking for?
  – See “look fors”
  – Highlight, note evidence linked to rubric language
  – Determine where most of the evidence falls: emerging, proficient, or advanced
Individually: read; highlight & gather evidence; map evidence to the rubric category
As a table: share which rubric category you placed it in; discuss discrepancies, come to agreement; compare to the provided evaluation
Task 2
INSTRUCTION
Overview of the Instruction Task
Artifacts
• Video Clips
• Instruction Commentary
Rubrics
6. Learning Environment
7. Engaging Students in Learning
8. Deepening Student Learning
9. Subject Specific Pedagogy
10. Analyzing Teaching Effectiveness
History/SS Evaluation Rubric 6
History/SS Evaluation Rubric 7
History/SS Evaluation Rubric 8
Navigation of the Instruction task
1. Note which lesson is part of the video, then watch the video first.
2. Read the commentary; pay attention to which prompts link to which rubrics (see Local Eval Rubric Doc).
Gathering & mapping evidence for Rubrics 6-7-8 (Jigsaw Activity)
• Where to find the evidence:
  – Instruction commentary (see specific prompts for each rubric)
  – Video
• Materials to use:
  – Candidate sample (do not look at scores)
  – Evaluation Rubric document
• What are you looking for?
  – See “look fors”
  – Highlight, note evidence linked to rubric language
  – Determine where most of the evidence falls: emerging, proficient, or advanced
Individually: watch the video, read; highlight & gather evidence for Rubric 6, 7, or 8; map evidence to the rubric category
As a table: share your rubric category with your fellow rubric pal; each group presents findings on their rubric; compare to the provided evaluation
History/SS Evaluation Rubric 9
History/SS Evaluation Rubric 10
Gathering & mapping evidence for Rubric 10
• Where to find the evidence:
  – Instruction commentary prompt 5
  – Video
• Materials to use:
  – Candidate sample (do not look at scores)
  – Evaluation Rubric document
• What are you looking for?
  – See “look fors”
  – Highlight, note evidence linked to rubric language
  – Determine where most of the evidence falls: emerging, proficient, or advanced
Individually: read; highlight & gather evidence; map evidence to the rubric category
As a table: share which rubric category you placed it in; discuss discrepancies, come to agreement; compare to the provided evaluation
Task 3
ASSESSMENT
Overview of the Assessment Task
Artifacts
• Analysis of whole class assessment
• Analysis of learning and feedback to THREE students
• Assessment Commentary
Rubrics
11. Analysis of Student Learning
12. Using Feedback to Guide Further Learning
13. Student Use of Feedback
14. Analyzing Students’ Language Use
15. Using Assessment to Inform Instruction
History/SS Evaluation Rubric 11
History/SS Evaluation Rubric 12
History/SS Evaluation Rubric 13
Navigation of the Assessment task
1. Skim the 3 assessment samples; note the feedback.
2. Read the commentary; pay attention to which prompts link to which rubrics (see Local Eval Rubric Doc).
Gathering & mapping evidence for Rubrics 11-13
• Where to find the evidence:
  – Rubric 11: Commentary prompt 1, evaluative criteria, work samples
  – Rubric 12: Commentary prompt 2a, work samples
  – Rubric 13: Commentary prompt 2b
• Materials to use:
  – Candidate sample (do not look at scores)
  – Evaluation Rubric document
• What are you looking for?
  – See “look fors”
  – Highlight, note evidence linked to rubric language
  – Determine where most of the evidence falls: emerging, proficient, or advanced
Individually: read; highlight & gather evidence; map evidence to the rubric category
As a table: share which rubric category you placed it in; discuss discrepancies, come to agreement; compare to the provided evaluation
History/SS Evaluation Rubric 14
History/SS Evaluation Rubric 15
Gathering & mapping evidence for Rubrics 14-15
• Where to find the evidence:
  – Rubric 14: Commentary prompt 3, work samples, and/or video
  – Rubric 15: Commentary prompt 4
• Materials to use:
  – Candidate sample (do not look at scores)
  – Evaluation Rubric document
• What are you looking for?
  – See “look fors”
  – Highlight, note evidence linked to rubric language
  – Determine where most of the evidence falls: emerging, proficient, or advanced
Individually: read; highlight & gather evidence; map evidence to the rubric category
As a table: share which rubric category you placed it in; discuss discrepancies, come to agreement; compare to the provided evaluation
Feedback
To Programs and Candidates
Feedback to Programs
Opportunities for faculty learning & program renewal
Activities some programs have found useful:
Elements of the edTPA
• Holistic & Integrated: affords new types of collaborations for faculty across practices (which means bringing multiple people to the table)
• Rubric Language: using rubrics to make sense of data allowed for a shared language and deeper shared understandings
• New Forms of Evidence: faculty learning about candidates’ practice is critical to programmatic changes that support candidate learning
• Access to Data: electronic platforms may provide easy access to faculty (especially valuable to access candidate documents, not just scores)
Feedback to Candidates
Opportunities for candidate learning & future growth
• What’s the purpose of candidate feedback?
• What’s the difference between gathering evidence for evaluation and providing feedback to candidates?
• During this year: How might we translate local evaluation evidence into feedback to candidates?
Feedback T-Chart for Rubric #____
Rubric Title: ____________
edTPA Strengths          | edTPA Areas for Improvement
RUBRIC #:                | RUBRIC #:
Construct:               | Construct:
Next steps
• Talk with your department about piloting
edTPA
• Start discussion on local evaluation
responsibilities within department for
Continuous Review.
Thank You
If you have questions, please contact:
Kelly Jewell
jewellk@uww.edu