TPA National and State Update

The Teaching Performance Assessment Consortium (TPAC)
Andrea Whittaker, Ph.D.
Stanford University
September 2011
Agenda and Goals
• Update on National Project
• Ohio policies and timeline
Why Now?
• Blue Ribbon Panel – 10 Principles
• PARCC and Smarter Balanced assessments
Where TPAC fits in
TPAC is working to develop and implement at scale a way of
assessing teaching that…
• Provides evidence of teaching effectiveness
• Supports teacher preparation program improvement
• Informs policy makers about qualities of teaching associated with student learning
TPAC is ONE example of an assessment system that is
designed to leverage the alignment of policies and support
program renewal.
Accountability reframed
How can we gather and use evidence of the
qualities of teaching performance that
inspire, engage, and sustain students as
learners – to improve teaching and teacher
preparation?
National Leadership
AACTE
• overall project management, communication
with programs
Stanford University
• assessment development and technical support
Council of Chief State School Officers
• policy development and support,
communication with state education agencies
(prior to March 2011)
Highlights of Pearson’s Role
in the TPA
• Pearson has been selected as Stanford's operational partner.
• Pearson will support Stanford and AACTE with assessment development and technical review.
• Pearson will train and certify scorers, provide a scoring platform, and report results for the operational TPA.
Pearson’s Role
in the Field Test
• Development support for field testing
• Handbook and template publication
• Recruitment and training of scorers, scoring, and scorer compensation
• Benchmarking
• Reporting results
• Providing an electronic platform to manage TPA submissions
Pearson’s Role in
Operational Use
• Pearson will provide assessment services to deliver the TPA nationally and sustainably:
• Web-based services that allow candidate registration, assembly of artifacts, faculty/supervisor feedback, final submission for official TPA scoring, and a score report
• Scoring services, such as the recruitment, training, and certification of all scorers and scoring of all submitted TPA responses
• Reporting services, such as the generation of all official score reports to candidates and institutions of record
Partnering States
Standards and TPAC
• Common Core alignment
• InTASC alignment
• NCATE/CAEP endorsement
• SPA endorsement
TPAC Lineage
• National Board for Professional Teaching
Standards (NBPTS) portfolio assessments –
accomplished teachers
• Connecticut BEST assessment system – teachers
at end of induction
• Performance Assessment for California Teachers
(PACT) – pre-service teachers
Role of K-12 Partners
• NEA and AACTE affiliate state meetings
• Roles for cooperating teachers and school site
principals
• Call for collaboration with IHEs
State Policy Issues
Emerging recognition of state role and
responsibility for educator effectiveness – states
are revisiting policies and practices
TPAC approaches this through:
• Program improvement and accountability
• The psychometric challenge – is the instrument usable?
• Informing policy development at critical levels: program approval (a measure of effectiveness) and initial licensure (candidate readiness)
State Policy Issues
States are realizing that valid, reliable and
predictive measures are critical to the success
of any change, especially when student
performance is the end objective.
Early implementers:
Washington, Minnesota, Tennessee, Illinois,
Wisconsin, Ohio
House Bill 1
Transfers responsibility for approving teacher preparation programs from the State Board to the Chancellor of the Board of Regents.
Directs the Chancellor, jointly with the State Superintendent, to (1) establish metrics and programs for the preparation of educators and other school personnel, and (2) provide for inspection of the institutions.
Through HB1, Ohio is first in the nation to require a four-year induction program (Resident Educator).
Ohio Comprehensive System of Educator Accountability
Metrics and outcomes at each stage (performance rating → outcome):
Pre-Service
• Metrics: Content Knowledge (Praxis II); Performance Assessment (TPA)
• Effective → recommended for resident educator license
• Not effective → more coursework or enter a different area of study
Teacher Residency
• Metrics: formative assessment coupled with goal setting and coaching; annual summative assessment based on multiple measures of educator effectiveness, including student growth
• Effective → continue with residency; recommended for five-year professional license
• Not effective → PAR Program (effective → continue with residency; not effective → employment terminated)
Annual Teacher Evaluation
• Metrics: formative assessments that inform PD and coaching support; annual summative assessment based on multiple measures of educator effectiveness, including student growth
• Effective → continue as teacher
• Not effective → PAR Program (effective → continue as teacher; not effective → employment terminated)
• Informs decisions: retention, dismissal, tenure, promotion, compensation
Ohio Alignment
• The TPA has also been aligned to the Ohio Teacher Standards.
• Karen Herrington is working to align the TPA with the alignment instrument for state/national standards that Ohio IHEs compiled in 2005-06.
Ohio’s Lineage
• Praxis III assessment in entry-year teaching
• Focus on Planning, Environment, Teaching for Learning, and Professionalism
• Pathwise training for mentors assisting entry-year teachers
• Transition of Praxis III to the Resident Educator Program
TPA Architecture
Design Principles for
Educative Assessment
• Discipline-specific and embedded in curriculum
• Student-centered: examines teaching practice in relationship to student learning
• Analytic: provides feedback and support along targeted dimensions
• Integrative: maintains the complexity of teaching
• Affords a complex view of teaching based on multiple measures
TPA Architecture
• A summative assessment of teaching practice
• Collection of artifacts and commentaries
• “Learning Segment” of 3-5 days
TPAC Artifacts of Practice
Planning
• Instructional and social context
• Lesson plans
• Handouts, overheads, student work
• Planning Commentary
Instruction
• Video clips
• Instruction Commentary
Assessment
• Analysis of whole-class assessment
• Analysis of learning and feedback to two students
• Instructional next steps
• Assessment Commentary
Daily Reflection Notes
Analysis of Teaching Effectiveness Commentary
Evidence of Academic Language Development
Conceptual Framework of
Assessment
• What? – candidate describes plans or provides
descriptions or evidence of what candidate or
students did
• So what? – rationale for plans in terms of
knowledge of students & research/theory,
explanation of what happened in terms of student
learning or how teaching affected student learning
• Now what? – what candidate would do differently if
could do over, next instructional steps based on
assessment, feedback to students
Multiple Measures Assessment System
Embedded signature assessments:
• Child case studies
• Analyses of student learning
• Curriculum/teaching analyses
TPAC Capstone Assessment – integration of:
• Planning
• Instruction
• Assessment
• Analysis of teaching
with attention to academic language throughout
Observation/supervisory evaluation & feedback
Targeted Competencies
PLANNING
• Planning for content understandings
• Using knowledge of students to inform teaching
• Planning assessments to monitor and support student learning
INSTRUCTION
• Engaging students in learning
• Deepening student learning during instruction
ASSESSMENT
• Analyzing student work
• Using feedback to guide further learning
• Using assessment to inform instruction
REFLECTION
• Analyzing teaching effectiveness
ACADEMIC LANGUAGE
• Identifying language demands
• Supporting students’ academic language development
• Evidence of language use
Rubric progression
• Early novice → highly accomplished beginner
• Rubrics are additive and analytic
• Candidates demonstrate:
• Expanding repertoire of skills and strategies
• Deepening of rationale and reflection
• Teacher focus → student focus
• Whole class → generic groups → individuals
Rubric blueprint
Task name: Rubric Title
Guiding Question
• Level 1 – Struggling candidate, not ready to teach
• Level 2 – Some skill but needs more practice to be teacher-of-record
• Level 3 – Acceptable level to begin teaching
• Level 4 – Solid foundation of knowledge and skills
• Level 5 – Stellar candidate (top 5%)
Rubric Sample
Eliciting and Monitoring Students’ Mathematical Understandings
• Level 1 – Candidate talks throughout the clip(s) and students provide few responses. The candidate stays focused on facts or procedures with no attention to mathematical concepts and representations of content.
• Level 2 – Candidate primarily asks surface-level questions and evaluates student responses as correct or incorrect. Candidate makes vague or superficial use of representations to help students understand mathematical concepts.
• Level 3 – The candidate elicits student responses related to reasoning/problem solving. Candidate uses representations in ways that help students understand mathematical concepts.
• Level 4 – Candidate elicits and builds on students’ reasoning/problem solving to explicitly portray, extend, or clarify a mathematical concept. Candidate uses strategically chosen representations in ways that deepen student understanding of mathematical concepts.
• Level 5 – All components of Level 4, plus: candidate facilitates interactions among students to evaluate their own ideas.
Academic Language
• Academic language is different from everyday
language. Some students are not exposed to
this language outside of school.
• Much of academic language is discipline-specific.
• Unless we make academic language explicit for
learning, some students will be excluded from
classroom discourse and future opportunities
that depend on having acquired this language.
Academic Language
• Academic language is the oral and written
language used in school necessary for learning
content.
• This includes the “language of the discipline”
(vocabulary and forms/functions of language
associated with learning outcomes) and the
“instructional language” used to engage
students in learning content.
Academic Language
Competencies Measured
• Understanding students’ language development
and identifying language demands
• Supporting language demands (form and function)
to deepen content learning
• Identifying evidence that students understand and
use targeted academic language in ways that
support their language development and content
learning.
Development Timeline
• 2009-10: Small-scale tryout of tasks and feedback from users
• 2010-11: Development of six pilot prototypes based on feedback; piloted in 20 states; user feedback gathered to guide revisions
• 2011-12: National field test of 13 prototypes, producing a technical report with reliability and validity studies and a bias and sensitivity review; national standard setting
• 2012-13: Adoption of validated assessment
Pilot Data Analysis
• Scores (descriptive stats)
• Scoring process
• Inter-rater reliability and agreement rates (see the sketch below)
• Examinee and faculty feedback
• Benchmark identification
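The pilot analysis above includes inter-rater reliability and agreement rates. As a minimal sketch of what those two statistics measure (not the TPAC scoring analysis itself; the scores and scorer labels are hypothetical), the Python below computes the exact agreement rate and Cohen's kappa for ten double-scored portfolios.

```python
# Minimal sketch (hypothetical scores, not TPAC data): exact agreement and
# Cohen's kappa for two scorers who each rated the same ten portfolios 1-5.
from collections import Counter

def agreement_rate(a, b):
    """Fraction of portfolios on which both scorers gave the same level."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for the agreement expected by chance."""
    n = len(a)
    observed = agreement_rate(a, b)
    ca, cb = Counter(a), Counter(b)
    # Chance agreement if each scorer rated independently at their base rates.
    expected = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (observed - expected) / (1 - expected)

scorer_1 = [3, 2, 4, 3, 3, 5, 2, 4, 3, 1]  # hypothetical rubric levels
scorer_2 = [3, 3, 4, 3, 2, 5, 2, 4, 4, 1]
print(f"agreement = {agreement_rate(scorer_1, scorer_2):.2f}")  # 0.70
print(f"kappa = {cohens_kappa(scorer_1, scorer_2):.2f}")        # 0.61
```

Kappa is the more conservative of the two, since raw agreement can look high simply because scorers favor the middle of a five-level scale.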
Handbook Changes
• Deep focus on student learning
• Five-level rubric
• Clear organization, prompts, and alignment with rubrics
• Academic language reframing
• Analyzing teaching
• Subject-specific glossaries
• Professional look and interactive features
Ohio Spring Pilot 2011
• Three IHEs completed 150 portfolios in six content areas
• 91 were scored by 35 calibrated faculty (university/school) scorers representing 9 institutions
• Results were returned to the three IHEs from a commonly used server
• Feedback was sent to teacher candidates with scored portfolios in late summer
Program/Unit
Discussions
• Results shared with program representatives
• Discussions about strengths and challenges noted in the data
• Sharing of next steps, based on the results, for the coming academic year
Framing Reliability and
Validity Research
• Current policies in play
• Evidence needed to support TPA use for
accreditation and licensure decision-making
• Potential role for VAM and other predictive
validity measures
Field Test Design
• Design is driven by overall goals:
• Data to enhance validity evidence
• Reports to describe technical aspects and the set of validity and reliability studies
• Effectiveness and efficiency of scorer training materials and process
• Refinements to the assessment
• Reporting design and distribution
• Support systems:
• Portfolio management system
• Scoring management system
• Participation/sampling plan – location (state-based or national population) and discipline-specific (see the sampling sketch after this slide)
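The participation/sampling bullet above calls for a plan that covers both location and discipline. A minimal sketch of one standard approach, stratified sampling by state and subject area, follows; the candidate pool, states, subjects, and per-stratum quota are all hypothetical, not the TPAC field-test plan.

```python
# Minimal sketch (hypothetical data): a field-test sample stratified by
# state and subject area so every state/subject cell is represented.
import random
from collections import defaultdict

random.seed(2011)

# Hypothetical candidate pool: (candidate_id, state, subject)
pool = [(f"cand-{i}",
         random.choice(["OH", "WA", "MN", "TN"]),
         random.choice(["Elem Literacy", "Sec Math", "Science"]))
        for i in range(5000)]

# Group candidates by (state, subject) stratum.
strata = defaultdict(list)
for cand in pool:
    strata[(cand[1], cand[2])].append(cand)

# Draw up to a fixed quota per stratum rather than sampling the pool at large.
PER_STRATUM = 50
sample = [c for members in strata.values()
          for c in random.sample(members, min(PER_STRATUM, len(members)))]
print(f"sampled {len(sample)} of {len(pool)} candidates "
      f"across {len(strata)} strata")
```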
Field Test Analyses
• Field Test data analysis and research areas:
• Content Validity
• CV meetings held in July 2011
• Bias Review scheduled for November 2011
• Construct Validity
• defining the construct of the TPA, factor analysis
• Consequential Validity
• candidates & programs
• Predictive Validity
• relationship between performance on the TPA and other measures (e.g., TPA scores and state teacher certification test scores) – see the sketch below
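The predictive validity bullet above asks how TPA performance relates to other measures, such as state certification test scores. A first-pass look at such a relationship is a Pearson correlation over paired scores; the sketch below uses hypothetical values, not field-test data.

```python
# Minimal sketch (hypothetical values): Pearson correlation between TPA
# totals and another paired measure for the same ten candidates.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

tpa_totals = [38, 42, 29, 51, 44, 33, 47, 36, 40, 49]           # summed rubric scores
cert_test = [172, 181, 160, 195, 183, 169, 188, 171, 178, 192]  # state cert test
print(f"r = {pearson_r(tpa_totals, cert_test):.2f}")
```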
Field Test
Participation
• Subject areas to be field tested:
• Elementary Literacy, Elementary Mathematics, English/Language Arts, History/Social Science, Secondary Mathematics, Science
• Special Education, Early Childhood Development, Middle Grades (Science, ELA, Math, and History/Social Science), Art, Performing Arts (Music, Dance, Theater), Physical Education, and World Language
• Other low-incidence draft handbooks will be available for trying out
Field Test
Participation
• Pearson will support scorer training and scoring stipends for a national sample of 18,000 candidates
• Scorer training and certification online (some synchronous events)
• Scorers to include IHE faculty, field supervisors, cooperating teachers, principals, NBCTs, and others with pedagogical content knowledge and experience with beginning teacher development
• Local, state, and national scoring
Ohio’s Projections
for 2011-2012
• 72% of Ohio’s IHEs are current TPA participants
• Additional IHEs have MOUs pending completion
• Over 2,100 portfolios, in 13 of the 14 content areas, are projected for completion
Timeline of Activities
• Release of revised handbooks
• September 2011
• Commitment/registrations to participate in Field Test
• Summer/Fall 2011
• Pearson systems ready for registration, submissions, and
scoring
• Spring 2012 – scorer management system ready
• TBD 2012 – candidate registration and TPA submission
system ready
Next Steps
• Join TPAC Online (Ning)
• Field test commitments
• Technical assistance
• AACTE affiliate meetings
• Ongoing webinars and Ning discussions
• PACT/TPAC Implementation Conference –
October 20-21 in San Diego
• AACTE Annual Meeting – February 17-19, 2012
Ohio’s Next Steps
• NE Region – Host: University of Akron; Contact: Lynn Kline; Date: November 9
• SW Region – Host: University of Cincinnati; Contact: Chet Laine; Date: February 24
• SE Region – Host: Franciscan University; Contact: Mary Kathryn McVey; Date: November 16
• NW Region – Host: Bowling Green State University; Contact: Mary Murray; Date: March 14
• Central Region – Host: Ohio Dominican University; Contact: Bonnie Beach; Date: January 6
Other TPAC Presentations
• Breakouts today:
• Supporting students
• Engaging faculty
• Lessons learned from scoring
• Pilot year insights
• Academic language
• TPAC 101 on Thursday
• Thursday Keynote – Program renewal