Impact of Transformation Initiative on Standard 2

2f. Unit Assessment System revised in view of Transformation Initiative
University of Cincinnati Educator Preparation Programs
Unit Assessment System
Transforming Lives, Schools, and Communities
The University of Cincinnati Educator Preparation Programs are committed to transforming
lives, schools, and communities. We target continuous improvement in the lives of the p-12
students with whom we work, in our partner schools, in the performance of our candidates, and
in the quality of our programs, procedures, and operations. We are accountable
internally to our candidates, faculty, and clinical faculty and externally to our specialized
program associations, state department of education, the students with whom our candidates
work, our partner schools, and the community. As a Transformation Initiative institution we are
accountable to our field and to improving student outcomes through replicable efforts in
teacher preparation. As an institution on the first annual list of institutions accredited by NCATE
in 1954, we have a long tradition of self-study. We recognize the role of assessment and
evaluation for decision-making and increased effectiveness. We recognize the need for
multiple sources of data, and have identified the need for an “assessment mosaic” focused on
improving p-12 student outcomes and unit operations.
The culture of data-based decision making has long been established in the unit. The
Assessment and Evaluation Board in the College of Education, Criminal Justice, and Human
Services has been in place since 1996, evolving into an Assessment Advisory Board in 2010.
There have been efforts to evaluate educator programs prior to the NCATE 2000 standards, and
systematic data collection, management, and application to the continuous improvement of
programs and unit operations has been in place since 2002. The University Council for Educator
Preparation, composed of university-wide faculty members and administrators, public school
teachers, and community members, monitors the assessment system. Implementation of the
assessment is managed by the Office of Assessment and Continuous Improvement, with Dr.
James Vondrell as the director. Examples of initiatives he has directed include: the shift to a
paper-free system and more efficient field and clinical placements; candidate, mentor, and field
site evaluation; an annual student satisfaction survey; and advisory panels composed of
principals, associate superintendents, and superintendents of our district partners.
As a unit, we are committed to a transparent system that promotes discussion with various
stakeholders. We ground our efforts in research and evidence. As with any assessment cycle,
our system is constantly under review for the power of the data it generates. In addition, our
Transformation Initiative has prompted us to re-examine the focus of our system.
Development of the Assessment System
The Assessment System was initiated in 2002 through a collaborative effort of five groups
representing faculty members and the professional community. As a planning process, the work
groups met individually, presenting their plans to the Continuous Improvement Committee (now
the University Council on Educator Preparation - UCEP). The committee then endorsed the
plans for implementation at the program level.
The assessment system was based on several principles put forth by UCEP:
 Data are gathered from the professional community (cooperating teachers/mentors,
graduates, employers, district personnel, and other members of the professional
community)
 Data are gathered from candidates, faculty members, cooperating teachers/mentors,
graduates, employers, district personnel, and members of the professional community, as
well as from the students and clients with whom they work
 Because of the broad base of data collected, members of these groups participate
in the design, implementation, and evaluation of the assessment system and its
components
 Data are gathered related to the standards, proficiencies, and tenets of the Conceptual
Framework, as well as to national and state standards
Various measures of the Unit Assessment Plan were used during the 2001-2002 academic year,
with near-complete implementation during the 2002-2003 academic year. In our review of the
data generated by assessment efforts, a need was identified to develop more specific
performance assessments for advanced programs. This need emerged concurrent with changes
in graduate program policies reducing the number of credit hours required to earn master's
degrees. As a result, programs were revised, and new performance assessments were
developed for those programs and implemented during the 2004-2005 academic year. All aspects
of the assessment system are institutionalized, though individual assessments undergo annual
review to ensure the data remain useful.
Several opportunities to improve the system and programs have presented themselves. These
opportunities prompted even closer attention to assessment data on programs, the unit, program
operations, and p-12 student learning. These changes and opportunities include:
 The shift from quarters to semesters beginning Fall 2012, which provided us the
opportunity to use data from the assessment of programs and unit operations to
completely rethink programs in view of seven years of program, unit, and operations
data
 Collaborating with Stanford University as one of four institutions in Ohio piloting the
Teacher Performance Assessment (Ohio is a “fast track” state), forcing us to rethink our
assessments of performance in clinical experiences
 The award of the Woodrow Wilson Fellows program, providing us the opportunity to design
a program for candidates with strong content knowledge and degrees in mathematics
and science to become teachers in high-needs schools
 The introduction of a series of formative assessment tools consistent with the Ohio
Residency Program (evolved from work with the New Teacher Center)
 The redesign of all programs in response to themes described in our Transformation
Proposal
 Our recognition that the system must be clearly aligned with best practices in
assessment and evaluation
 Our recognition that any system involved in preparing professionals must be related to
the impact on the clients; in our case we must systematically collect, analyze, review,
and use data related to the impact of our candidates and graduates on the students
with whom they work
 Our commitment to establish an “assessment mosaic” in which a wide range of
assessment strategies and data sets, grounded in outcomes of p-12 students, are
designed, evaluated, and used continuously to inform program and procedural
improvements.
Relationship of Assessment System to Conceptual Framework
Our conceptual framework has evolved in view of our participation in the Transformation
Initiative. Our Unit standards for performance expectations have become: Candidates of the
University of Cincinnati are committed to transforming the lives of P-12 students, their schools,
and their communities, and
 Demonstrating foundation knowledge, including knowledge of how each individual
learns and develops within a unique developmental context
 Articulating the central concepts, tools of inquiry, and the structures of their discipline
 Collaborating, leading, and engaging in positive systems change
 Demonstrating the moral imperative to teach all students and address the responsibility
to teach all students with tenacity
 Addressing issues of diversity with equity and using skills unique to culturally and
individually responsive practice
 Using technology to support their practice
 Using assessment and research to inform their efforts and improve outcomes
 Demonstrating pedagogical content knowledge, grounded in evidence-based practices,
and committed to improving the academic and social outcomes of students
Our assessment system is organized around these institutional standards. In order to
demonstrate our commitment to national, professional standards (Ohio is a partnership state
and state standards and national standards are synonymous) all assessments in the system are
explicitly aligned. This alignment led us to discontinue our previous student teaching/internship
performance assessment, because Ohio has moved from Praxis III to “fast-track” adoption of
the Teacher Performance Assessment.
Identifying our unit dispositions was the first task of our Unit-wide Continuous Improvement
Committee. Our unit dispositions reflect our “Ways of Being.” Intrinsic to our dispositions is the
notion of community and belonging. We appreciate each individual’s fundamental need for
acceptance and belonging, and that a student’s fundamental need is to be successful and
competent. We appreciate that we are members of a community, and that “none of us can find
ourselves, know ourselves, or be ourselves, all by ourselves” (Binau, 2000). As educators
transforming lives, schools, and communities, we aspire to the following:
 initiative on behalf of all learners
 responsibility to promote effort and excellence in all learners
 rapport with students, peers, and others
 a commitment to reflection, assessment, and learning as an ongoing process grounded
in inquiry
 collaboration with other professionals to improve the overall learning environment for
students
 acknowledging multiple perspectives
 dedication to teaching the subject matter and to keeping informed and competent in
the discipline and its pedagogy
 appreciating both the content of the subject area and the diverse needs, assets, and
interests of the students, and valuing both short- and long-term planning
 commitment to the expression and use of democratic values in the classroom
 responsibility for making the classroom and the school a “safe harbor” for learning, in
other words, a place that is protected, predictable, and has a positive climate
 valuing opportunities to collaborate with parents
 recognition of the fundamental need of students to develop and maintain a sense of
self-worth, and that student misbehavior may be an attempt to protect self-esteem
 belief that all children can learn, and persistence in helping every student achieve
success
 valuing all students for their potential as people and helping them value each other
 high ethical and professional standards
Because of our intense commitment to these dispositions, we developed a unit-wide Candidate
Dispositions Progress Report for the formal documentation of candidate dispositions. In
addition, a Dispositions Brief Report was developed to both document exemplary dispositions
and to identify areas of development for specific candidates. These reports identify candidates
in terms of dispositions and general behavior, and are effective in documenting behavior that
requires intervention and action plans. However, as formative assessment tools for classroom
observation, these reports were less behavioral and measurable than we wished. As part of our
Transformation Initiative, clear, specific, and measurable descriptions of behaviors
demonstrating our dispositions were generated. We are currently piloting and calibrating
the Student-Teacher Performance Assessment Tool (Appendix A), with one set of pilot data
collected. In this assessment, we used research to generate specific items that would support
candidate development of appropriate interactions. A second issue that emerged was that of
campus behavior. In an effort to again provide candidates with more specific feedback, a
Classroom Disposition Assessment was developed. Both assessments are web-based.
All measures are aligned with institutional standards and candidate proficiencies. Our
dispositions are measured and documented across the unit. In this way the University of
Cincinnati Educator Preparation Programs, with the involvement of its professional community,
is implementing an assessment system that reflects the conceptual framework(s).
Relationship of Assessment System to Professional, State, and Institutional Standards:
Programs and Unit Operations
In addition to aligning our assessment system to our institutional standards, the system is
aligned with the Ohio Standards for the Teaching Profession and the Model Core Teaching
Standards (CCSSO, 2011) for initial programs, and with the National Board for Professional Teaching
Standards for advanced programs. All licensure programs employ the standards of the
appropriate specialized program associations. The use of the Student-Teacher Performance
Assessment Tool is being piloted to evaluate candidate performance in all professional field
experiences as required by our state. The assessment plan demonstrates the alignment in the
presentation of data. Through presenting our assessment efforts in this way, we are constantly
reminded of our professional, state, and institutional standards.
Relationship of Assessment System to National Models for Assessment Systems
As we evaluated our assessment system, we identified our efforts as “purpose oriented”
(Goodwin, Englert, & Cicchinelli, 2002). The overriding goal of a purpose-oriented system is
improving student outcomes. This is consistent with our Transformation Initiative Proposal,
which aims to improve outcomes for all students. In addition, this system is appropriate in that
it is based on clear standards (professional, state, and institutional) flowing directly into
assessments and multiple measures. Two aspects of this purpose-related accountability involve
(a) evaluating the effectiveness of our efforts and reforms to support programs in making
decisions and (b) monitoring learning and holding candidates and programs responsible for their
student outcomes.
The shift to a new web-based application for our assessments (from ReMark to Qualtrics) has
provided the impetus for us to examine the measurement aspects of our system. The National
Institute for Learning Outcomes Assessment (NILOA, 2011) contends that learning outcomes
must be useful and transparent. We want our system to be as useful as possible to programs
and to communicate clearly to candidates, faculty members, administrators, p-12 partners, and
the community. This alignment with the National Institute for Learning Outcomes Assessment
supports our efforts to be evidence-based. As we review our system, the six aspects of the
transparency framework, with examples of our activities in each area, are presented below.
For each aspect, we pair the NILOA activities with our system's current practices and with
examples of revisions made in response to review; some revisions are specific to the program
level.

Candidate Learning Outcomes Statements
 NILOA activities: prominently posted; available to students
 Our system: alignment with SPAs, NBPTS, INTASC, and Ohio standards; available on every
syllabus, in handbooks, and on assessments
 Revisions in response to review: align syllabi as well as assessments with standards;
review what the measures are, how they are used, and their frequency of use (field
coordinators meet to assess)

Assessment Plans: assessment processes, procedures, and activities
 NILOA activities: post or link assessments so they can be reviewed by all stakeholders;
downloaded or accessed in a timely fashion; receptive to feedback or comments;
explained, analyzed, and interpreted in a way easily understood
 Our system: handbooks and the assessment website are reviewed by the field
coordinators of each program each year; review of the Office of Assessment and
Continuous Improvement website; data downloaded and shared; University Council for
Educator Preparation, Field Coordinators Council, Licensure Council, and Partnership
Panels
 Revisions in response to review: review descriptions of the assessments to ensure that
they are clear and understandable; review all assessments for clarity, reading level, and
transparency; in response to the Student-Teacher Performance Assessment Tool pilot
study, coordinators modified requirements for previous assessments; the Educator
Impact Rubric was repeatedly revised and finally replaced because of its complexity; a
recommendation for a “button” on the home page for easier access; at program
coordinators' request, all disposition assessments are downloaded and shared weekly,
and evaluations of or by university supervisors are shared prior to hiring deadlines;
increased flexibility in scheduling meetings with members of the p-12 school community
and Partnership Panels

Evidence of Learning
 NILOA activities: when results are shared, a narrative is included; presented in text and
graphics; disseminated and summarized for different groups; downloadable or
accessible; receptive to feedback or comments
 Our system: program development plans; data posted for programs; website and emails
used
 Revisions in response to review: we have always relied on graphs and will review the
need for narrative; candidates are sent emails regarding rationales for changes in their
programs; candidate outcomes are posted on the website; explore changing
language/format for additional groups

Use of Evidence
 NILOA activities: examine the extent to which evidence is used to identify evidence for
change
 Our system: program development plans for each program
 Revisions in response to review: programs review evidence from each assessment and
design a response plan

Assessment Resources
 Our system: resources available on the Office of Assessment and Continuous
Improvement website; field coordinators' Blackboard group; evaluation surveys with
stakeholders
 Revisions in response to review: handbooks explaining the Student-Teacher Performance
Assessment are posted on Blackboard; Cooperating Teacher Assessment, University
Supervisor Assessment, and candidate assessments

Current Activities
 NILOA activities: clearly communicated; prominently posted
 Our system: ongoing review of websites and handbooks; posted on website
 Revisions in response to review: candidates provide feedback on materials; review ease
of access with candidates
The Teacher Performance Assessment Consortium
Ohio is a “fast track” state in the implementation of the Teacher Performance Assessment (TPA).
The goals of this assessment are to: (a) create a reliable, valid performance assessment
system to improve teacher quality; (b) create a database to track teacher performance,
provide information so that states can make accreditation decisions, and improve the
licensure process; and (c) develop an evidence-based method for making decisions
about requirements, professional development, and employment. The initiative is being
led by the American Association of Colleges for Teacher Education and the Council of
Chief State School Officers. Linda Darling-Hammond and Ray Pecheone of Stanford
University are the project's co-directors. The University of Cincinnati is one of four pilot
institutions in Ohio. Five of our faculty members have been prepared and calibrated as
scorers. Our timeline for full implementation of the TPA is:
 2011-12: Pilot full assessment; this pilot assessment may or may not include all licensure
areas; Stanford gathers pilot data to establish validity, reliability, and fairness.
 2012-13: UC School of Education moves to semesters; full implementation of TPA in all
licensure areas, including early childhood, special education, foreign language, middle
childhood; full scale data collection with all program completers.
 2013-14: Full implementation; possible high stakes Ohio licensure assessment; full scale
data collection with program completers.
Multiple Assessments/ Multiple Benchmarks
As we designed our assessment system, we planned for multiple assessments, ratings by several
stakeholders (e.g., faculty members, cooperating teachers), and benchmarks for remaining in
good standing in the programs. For initial programs, these benchmarks included (a) application
for the professional cohort, (b) application for the internship/student teaching/clinical practice,
and (c) program completion; a brief sketch of this gating follows the initial-program lists below.
The criteria for application to the professional cohort included:
 Appropriate GPA (2.8-3.0, depending on the program)
 Passing scores on Praxis I (or waiver through ACT/SAT equivalent)
 Completion of required coursework
 Successful completion of early field experiences
 Completion of an adequate number of credits to complete requirements
Application for internship/student teaching includes:
 Appropriate GPA
 Successful completion of field experiences
 Documentation of appropriate dispositions in class and in the field
 Successful completion of coursework (at least a C in each course)
Application for completion:
 Successful completion of critical performances and Teacher Performance Assessments
 Documentation of appropriate dispositions
 Successful completion of coursework
 Successful completion of field and clinical experiences
 Passing scores on Praxis II
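The gating at each benchmark is mechanical enough to sketch in code. The following is a
minimal illustration only, not the unit's actual system: the field names are hypothetical, and the
default 2.8 cutoff stands in for program-specific GPA requirements (2.8-3.0).

```python
# Minimal sketch of benchmark gating for initial programs. Field names
# and the 2.8 default cutoff are hypothetical placeholders; actual GPA
# requirements range from 2.8 to 3.0 depending on the program.

def eligible_for_internship(candidate, gpa_cutoff=2.8):
    """Check the internship/student teaching benchmark described above."""
    checks = {
        # Appropriate GPA
        "gpa": candidate["gpa"] >= gpa_cutoff,
        # Successful completion of field experiences
        "field_experiences": candidate["field_experiences_passed"],
        # Documentation of appropriate dispositions in class and in the field
        "dispositions": candidate["dispositions_documented"],
        # At least a C (2.0 grade points) in each course
        "coursework": all(g >= 2.0 for g in candidate["course_grades"]),
    }
    return all(checks.values()), checks

ok, detail = eligible_for_internship({
    "gpa": 3.1,
    "field_experiences_passed": True,
    "dispositions_documented": True,
    "course_grades": [4.0, 3.0, 2.0, 3.7],
})
print(ok)      # True
print(detail)  # per-criterion results for advising follow-up
```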
For advanced programs and programs for other school personnel, application includes:
 Appropriate GPA
 Documentation of dispositions
 Goal statements and/or resume
 Interviews (required by some programs)
Each of the advanced programs and programs for other school personnel completes an annual
review of candidate performance. This review includes:
 Candidate progress towards program completion
 Appropriate GPA
 Performance on program assessments completed
 Candidate dispositions
Requirements for program completion include:
 Appropriate GPA
 Satisfactory performance on program-identified work samples
 Completion of a culminating experience
 Satisfactory completion of required and elective coursework
In addition to these benchmarks, we implemented assessments that included multiple
stakeholders, allowing us to triangulate our data.
Stakeholders and their candidate assessment measures:
 Candidates: reflections on the Student-Teacher Performance Assessment; self-evaluation
items on program evaluations; Student-Teacher Performance Assessment; Student
Satisfaction Survey
 Faculty members: evaluation of performance in field experiences; scoring Student-Teacher
Performance Assessments; candidate performance in courses; classroom assessments of
dispositions; dispositions brief reports; additional specialized program association
assessments
 Cooperating teachers/mentors: evaluation of candidate performance in field experiences;
cooperating teacher/mentor evaluation of program; ratings of candidate use of
technology; candidate disposition assessments; dispositions brief reports
 Graduates: follow-up survey; employer surveys; hiring reports
Stakeholders also participate in assessment of the program as a whole and of the unit
operations supporting the program. The measures for these stakeholders are presented
below:
 Candidates. Program assessment: program evaluations; course evaluations; evaluation of
professional experiences; evaluation of university supervisor. Unit operations assessment:
items on program evaluations; items on evaluation of professional experiences
 Faculty members. Program assessment: Program Development Form (completed in review
of candidate performance data); review and follow-up of dispositional assessments. Unit
operations assessment: program evaluations; evaluation of placements; instructor/student
use of technology
 Cooperating teachers/mentors. Program assessment and unit operations assessment:
cooperating teacher/mentor evaluation of program
 Employers. Program assessment: employer questionnaire
 Graduates. Program assessment and unit operations assessment: follow-up survey
Because we have identified p-12 student outcomes as another aspect of our assessments, we
use these measures:
 Candidate performance on engaging learners on the Student-Teacher Performance
Assessment
 Value-added outcomes of candidates' students
Performance Assessments and Candidate Success
Several efforts were used to measure the relationship of performance assessments to candidate
success. When Ohio used Praxis III, we carefully monitored the success rate of our candidates,
which was consistently 98-100%. The University Council on Educator Preparation released a request
for proposals, funding studies for programs or groups of programs to study the relationship
between those measures we have implemented and success as rated by employment,
satisfaction, employer ratings, or other indicators. The results of these studies were presented
to programs for their consideration in the revision of their performance measures. We also
monitor employment rate as compared to local and statewide institutions. As we implement
the Student-Teacher Performance Assessment, we will have a carefully calibrated assessment of
our candidate performance.
Fairness, Accuracy, and Consistency
The fairness and accuracy of measures is addressed through (a) multiple measures of each
aspect, including multiple raters; (b) the consistent use of research-based rubrics rather than
rating scales; (c) training of raters; (d) annual faculty review of instruments and data; and (e)
review of validity of all items by experts and alignment with national standards. With the
emerging use of the Student-Teacher Performance Assessment (STPA), evaluators are trained by
a set of individuals trained by the consortium at Stanford University and calibrated on their
scoring. The calibration of scorers and the distribution of candidates' work across state
universities increase validity and consistency.
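To make the calibration checks concrete, the sketch below computes two agreement statistics
commonly used when checking scorers against master-scored calibration work: exact agreement
and within-one-point agreement on a four-point rubric. It is an illustration only; the score lists
are hypothetical, and this is not the consortium's scoring tooling.

```python
# Illustrative scorer-calibration check on a rubric scored 1-4.
# The score lists below are hypothetical examples.

def calibration_report(master_scores, rater_scores):
    """Exact and within-one-point agreement against master-scored work."""
    if len(master_scores) != len(rater_scores):
        raise ValueError("score lists must be the same length")
    n = len(master_scores)
    exact = sum(m == r for m, r in zip(master_scores, rater_scores)) / n
    adjacent = sum(abs(m - r) <= 1 for m, r in zip(master_scores, rater_scores)) / n
    return {"exact_agreement": exact, "adjacent_agreement": adjacent}

# A rater scoring ten calibration portfolios against master scores.
master = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater = [3, 2, 3, 3, 2, 2, 3, 4, 1, 3]
print(calibration_report(master, rater))
# {'exact_agreement': 0.7, 'adjacent_agreement': 1.0}
```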
Multiple measures of each aspect of assessment, including multiple raters. In our design we
ventured to have repeated, multiple measures of each factor, completed by multiple trained
raters. For example, in evaluation of performance in internships, candidates are rated by their
cooperating teachers/mentors, their university supervisors, and themselves. Instructors,
cooperating teachers/mentors, and university supervisors rate dispositions. A team of faculty
members, university supervisors, and cooperating teachers/mentors rates portfolios. Each
rating scale is supported by an explicit rubric, on which raters are trained.
Training of raters. Each cooperating teacher/mentor, university supervisor, and faculty member
receives training on the forms he or she is required to complete. This training includes in-person
materials and, for those who cannot easily come to campus, podcasts and electronic handbooks
posted on Blackboard.
Annual faculty review of instruments and data. Content validity is addressed each year
through faculty review of all instruments and data. Through this review, the validity of the
instruments is addressed, with the key question for discussion “Is this instrument truly
measuring what we designed it to measure?” This review is concurrent with a review of validity
by experts and alignment with national standards. Subgroups of faculty with expertise in
assessment review the instruments. Program faculty members ensure that the instruments are
aligned with the national standards, providing an additional level of content validity.
Consistency. In order to monitor the consistent completion and submission of assessments, a
candidate-specific checklist and folder system was developed. Each university supervisor “signs
off” each semester to document the collection of appropriate data.
Assessment of Program Quality, Unit Operations, and Candidate Performance
The Unit Assessment System involves regular and comprehensive data collection on program
quality, unit operations, and candidate performance at three points for each candidate: (a)
before the internship/student teaching professional experience, (b) after the internship/student
teaching professional experience, and (c) one to two years after completing the program. One
of the important aspects of the Assessment System is the use of marker items across all
programs. This allows programs to review their program strengths and weaknesses relative to
other programs. In addition, it allows us to calculate unit-wide means, as sketched below.
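As an illustration of how marker items support these comparisons, the sketch below computes
program-level and unit-wide means on one shared item. The program names, item label, and
column layout are hypothetical placeholders, not our actual data structures.

```python
# Hypothetical sketch of marker-item reporting; all names and values
# below are placeholders, not actual unit data.
import pandas as pd

ratings = pd.DataFrame({
    "program": ["Early Childhood", "Early Childhood", "Special Education",
                "Special Education", "Middle Childhood"],
    "marker_item": ["engages_learners"] * 5,
    "score": [3.0, 4.0, 3.5, 3.0, 4.0],  # four-point scale
})

# Program-level means on the shared item let each program gauge its
# relative strengths and weaknesses...
program_means = ratings.groupby(["marker_item", "program"])["score"].mean()

# ...while the unit-wide mean provides the common reference point.
unit_means = ratings.groupby("marker_item")["score"].mean()

print(program_means)
print(unit_means)
```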
The assessments used for candidate performance include:
 Candidate Dispositions Progress Report - administered at least once in each field
experience
 Candidate Dispositions Brief Report - completed to document exemplary dispositions or
problems with dispositions (includes the development of plans for remediating
dispositions)
 Candidate Classroom Dispositions Rating - completed in coursework
 Candidate Performance in Field Experience - aligned with the individual specialized
program association standards
 Candidate Evaluation of STPA Program - administered during classes at the end of spring
semester
 Employer Program Evaluation Form
 Candidate Use of Technology - completed by the cooperating teacher or mentor at least
once during the internship
 Program-specific portfolios and critical performances - described by each program; core
proficiencies or indicators from each program are rated on a four-point scale
The assessments used for program quality include:
 Candidate Evaluation of Program
 Program Development Form - completed after faculty review of data
 Cooperating Teacher/Mentor Program Evaluation
 Employer Program Evaluation
 Candidate Follow-Up Survey
 Course evaluations - every course, every time taught
 Candidate evaluation of field experience - completed by the candidate at the conclusion
of each field experience
The specific items and assessments for Unit Operations include:
 Candidate Evaluation of Program
 Cooperating Teacher/Mentor Program Evaluation items related to supervision and role
 University Supervisor Evaluations of Placement
 Program Completer Follow-Up
 Course evaluations
 Candidate evaluation of field experience
 Candidate evaluation of university supervisor
 Student Satisfaction Survey
 Golden Apples (most effective faculty members)
All assessments are implemented unit-wide; individual programs use additional measures, but
all programs administer these measures at a minimum of three points for each candidate.
Our commitment to student learning necessitated that we identify a way to track the
accountability of our candidates with the children, youth, and adults with whom they work. The
School Psychology Program had instituted the use of goal attainment scaling to account for their
candidates' acquisition, maintenance, and generalization of practice activities (Bonner &
Barnett, 2004). Goal attainment scaling was originally described as a means of evaluating
mental health services (Kiresuk & Sherman, 1968). It was proposed as a method for
determining the extent of a child's goal attainment and as a comparison of the relative
effectiveness of various strategies or actions. MacKay and associates (1993) suggested that
goal attainment scaling is responsive to measuring diverse functional goals across services,
making it a strong outcome measure for groups of children, youth, and adults where the rate of
progress varies. The use of goal attainment scaling methodology has been demonstrated to be
of significant value in the evaluation of teaching- or intervention-based change, and is “a more
accurate estimate than any other measure” (Sladeczek, Elliott, Kratochwill, Robertson-Mjaanes,
& Stoiber, 2001, p. 52). With our pilots of the Teacher Performance Assessment (TPA),
however, we determined that the analysis of student work, assessments, and differentiation of
further instruction was more consistent with the work of teachers and yielded data on p-12
student achievement. Rubrics 6, 7, 8, and 9 of the TPA and similar analysis of student
work/differentiation/reassessment strategies are now in place.
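For context, goal attainment scaling summarizes attainment across goals as a T-score using the
standard Kiresuk-Sherman formula. The sketch below is purely illustrative, assuming equal goal
weights by default and the conventional inter-goal correlation of 0.3; it is not drawn from the
School Psychology Program's materials.

```python
# Illustrative goal attainment scaling (GAS) T-score using the standard
# Kiresuk-Sherman formula. Attainment on each goal is scored -2 (much
# less than expected) through +2 (much more than expected); rho = 0.3
# is the conventional assumed inter-goal correlation.
import math

def gas_t_score(attainments, weights=None, rho=0.3):
    """T = 50 when every goal is attained exactly as expected."""
    weights = weights or [1.0] * len(attainments)
    weighted_sum = sum(w * x for w, x in zip(weights, attainments))
    denom = math.sqrt((1 - rho) * sum(w * w for w in weights)
                      + rho * sum(weights) ** 2)
    return 50 + 10 * weighted_sum / denom

# Three equally weighted goals: one exceeded expectations, two met them.
print(round(gas_t_score([1, 0, 0]), 1))  # 54.6
```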
Use of Multiple Assessments
Multiple measures are used for each indicator. These measures are completed both internal to
the University (candidates, faculty members) and external to the University (cooperating
teachers/mentors, employers, members of the professional community). In addition, measures
mandated by the Ohio Department of Education, such as Praxis II, are in place.
Improvement Cycles (Collection, Compilation, Summarization, and Analysis)
There are two improvement cycles: (a) the Program Improvement Cycle and (b) the Operations
Improvement Cycle. The Program Improvement Cycle is presented in Table 4, and the
Operations Improvement cycle is presented in Table 5.
Table 4: Program Improvement Cycle
For Fall Semester Program Meetings (Summer Work)
The following data are aggregated and summarized for decision-makers in program areas:
 Admissions data
 Praxis II data
 Cohort application data
 Intern/Student Teacher Evaluations
 Candidate Dispositions Progress Report data
 Data about program completers
 Follow-up data
 New SPA standards
 Performance on all SPA assessments
Handbook and candidate materials reviewed
Data posted to website
By December 1
 Data-based decisions made regarding the programs submitted to Accreditation Office
 Summary report by program area submitted to UCEP/Associate Deans
 Areas needing attention identified
 Curriculum proposals for curriculum adjustment/maintenance generated
 Candidates informed of program improvements in response to data
 Submission of new programs to ODE for fall approval cycle
During Fall Semester
 Finalize major curriculum changes and develop proposals to be considered by the School
leadership
 Submit curriculum proposals to the School leadership
 Monitor progress of proposals
During Spring Semester
 Record approved curriculum changes
 Update coursework inventories
 Update program plans and information sheets
 Update evaluation forms for spring semester distribution
 Submit updates to bulletins to Associate Dean
Late Spring Semester
 Dean’s office submits official updates to bulletins
 Dean’s office submits official updates to website
 Implement program improvements; continue to collect data on candidates and
programs
Table 5: Operations Improvement Cycle
For Fall Semester Program Meetings (Summer Work)
The following data are aggregated and summarized for decision-makers in program areas:
 Prior year budget results
 Budget projections for the current academic year
 Results of administrator evaluations
 Accreditation annual reports
 Faculty productivity
 Student Satisfaction Survey results
 Reports on grants and projects
 Candidate employment rates
Handbook and candidate materials reviewed
By December 1
 Identify areas needing attention and report to Accreditation Office via program
development plan
 Launch searches for faculty vacancies for the next fall
 Identify potential grant opportunities
During Fall Semester
 Submit curriculum proposals to School leadership
 Monitor progress of proposals
 Explore potential areas of outreach
During Spring Semester
 Complete course schedules for coming year
 Complete load reporting for academic year
 Finalize Graduate Assistant/University Graduate Scholarship allocation
By March 15
 Initiate Student Satisfaction Survey
Late Spring Semester
 Dean’s office submits official updates to bulletins
 Dean’s office submits official updates to website
 Implement program improvements; continue to collect data on candidates and
programs
In addition to the two areas of assessment required by NCATE (CAEP), we recognize that
a full assessment system would include an analysis of the outcomes of the students with
whom our candidates and graduates work.
P-12 Student Outcomes Improvement Cycle
For Fall Semester Program Meetings (Summer Work)
The following data are aggregated and summarized for decision-makers in program areas:
 Aggregation and content analysis of districts’ teacher evaluations of graduates
 Support the value-added project (Dr. Julie Morrison)
 Generate TPA impact on student learning reports
 Track employment of graduates
By December 1
 Identify additional data sources for impact on p-12 student outcomes
 Programs review data reports in view of program design and clinical experiences
 Programs review syllabi in view of evidence- and research-based practices
Spring Semester
 Aggregate and generate reports from data sources
By March 15
 Programs generate report on program and clinical experiences to improve potential for
positive impact on student learning
Late Spring Semester
 Dean’s office submits official updates to bulletins
 Dean’s office submits official updates to website
 Implement program improvements; continue to collect data on candidates, programs,
and outcomes
All forms related to field experiences are collected during the semester of the field experience.
Data are regularly and systematically collected, compiled, summarized, analyzed, and reported
publicly for the purpose of improving candidate performance, program quality, and unit
operations. Each program sends students a report on data analysis and efforts to improve the
program through the program listserv. Program data are provided on
the website in the format typically used by specialized program association reports.
The Assessment System Alignment
The assessment system is presented in view of each of our institutional standards. Before the
presentation of each institutional standard, we review the alignment of the standard with
national standards, as well as our own conceptual framework and candidate proficiencies.
Institutional Standards are presented in the order of content knowledge, skills, and dispositions
elements of NCATE Standard 1. In addition, the Assessment System includes two other areas of
concern: (a) ensuring the integrity of field experiences and supervision and (b) ensuring the
general quality of programs and unit operations.
Alignment of Assessments Addressing Knowledge
Institutional Standard: University of Cincinnati candidates demonstrate foundation knowledge, including
knowledge of how each individual learns and develops within a unique developmental context
Model Core Teaching Standards: Standard #1: Learner Development. The teacher understands
how children learn and develop, recognizing that patterns of learning and development vary
individually within and across the cognitive, linguistic, social, emotional, and physical areas, and
designs and implements developmentally appropriate and challenging learning experiences.
Ohio Standards for the Teaching Profession: Standard #1: Students. Teachers understand
student learning and development, and respect the diversity of the students they teach.
NBPTS Core Principles: 1: Committed to students and their learning
Assessments: Praxis II Principles of Learning and Teaching; Student-Teacher Performance
Assessment; Employer Evaluation of Program; Mentor/Cooperating Teacher Evaluation of
Program; performance in field and clinical experiences; grades in coursework
Institutional Standard: University of Cincinnati candidates demonstrate content knowledge and
are able to articulate the central concepts, tools of inquiry, and the structures of their discipline
Model Core Teaching Standards: Standard #4: Content Knowledge. The teacher understands the
central concepts, tools of inquiry, and structures of the discipline(s) he or she teaches and
creates learning experiences that make these aspects of the discipline accessible and
meaningful for learners.
Ohio Standards for the Teaching Profession: Standard #2: Content. Teachers know and
understand the content area for which they have instructional responsibility.
NBPTS Core Principles: 2: Know the subjects they teach and how to teach those subjects to
students
Assessments: Praxis II content knowledge tests; Student-Teacher Performance Assessment
event; description of content knowledge in planning assessments; Employer Evaluation of
Program; Mentor/Cooperating Teacher Evaluation of Program; performance in field and clinical
experiences; grades in coursework
Institutional Standard: University of Cincinnati candidates demonstrate pedagogical content
knowledge, grounded in evidence-based practices, maximizing the opportunity for learning, and
professionalism.
Model Core Teaching Standards: Standard #5: Innovative Applications of Content. The teacher
understands how to connect concepts and use differing perspectives to engage learners in
critical/creative thinking and collaborative problem solving related to authentic local and global
issues.
Ohio Standards for the Teaching Profession: Standard #2: Content. Teachers know and
understand the content area for which they have instructional responsibility.
NBPTS Core Principles: 2: Know the subjects they teach and how to teach those subjects to
students
Assessments: Praxis II content knowledge tests; Teacher Performance Assessment event;
description of pedagogical content knowledge in planning assessments; Employer Evaluation of
Program; Mentor/Cooperating Teacher Evaluation of Program; performance in field and clinical
experiences; grades in coursework; direct observation; collaborative assessment logs
Alignment of Assessments Addressing Skills
Institutional Standard: University of Cincinnati candidates demonstrate the ability to address issues of diversity
with equity and skills of culturally responsive interactions.
Model Core Teaching Standards: Standard #3: Learning Environments. The teacher works with
learners to create environments that support individual and collaborative learning, encouraging
positive social interaction, active engagement in learning, and self-motivation. Standard #2:
Learning Differences. The teacher uses understanding of individual differences and diverse
communities to ensure inclusive learning environments that allow each learner to reach his/her
full potential.
Ohio Standards for the Teaching Profession: Standard #5: Learning Environment. Teachers
create learning environments that promote high levels of learning and achievement for all
students.
NBPTS Core Principles: 3: Responsible for managing and monitoring student learning
Assessments: direct observation and collaborative assessment logs; Student-Teacher
Performance Assessment; dispositions assessment; unit and planning assessments; employer
survey; candidate program evaluation
Institutional Standard: University of Cincinnati candidates demonstrate the ability to use technology to support
their practice.
Model Core Teaching Standards: Standard #8: Instructional Strategies. The teacher understands
and uses a variety of instructional strategies to encourage learners to develop deep
understanding of content areas and their connections, and to build skills to access and
appropriately apply information.
Ohio Standards for the Teaching Profession: Standard #4: Instruction. Teachers plan and deliver
effective instruction that advances the learning of each individual student.
NBPTS Core Principles: 2: Know the subjects they teach and how to teach those subjects to
students
Assessments: direct observation and collaborative assessment logs; Student-Teacher
Performance Assessment; candidate use of technology; unit and planning assessments;
employer survey; candidate program evaluation
Institutional Standard: University of Cincinnati candidates demonstrate the ability to use
assessment and research to inform their efforts and improve student outcomes.
Model Core Teaching Standards: Standard #6: Assessment. The teacher understands and uses
multiple methods of assessment to engage learners in their own growth, to document learner
progress, and to inform teacher planning and instruction.
Ohio Standards for the Teaching Profession: Standard #3: Assessment. Teachers understand
and use varied assessments to inform instruction, evaluate and ensure student learning.
NBPTS Core Principles: 2: Know the subjects they teach and how to teach those subjects to
students
Assessments: program-specific assessments; direct observation and collaborative assessment
logs; Student-Teacher Performance Assessment; dispositions assessment; unit and planning
assessments; employer survey; candidate program evaluation; mentor program evaluation
Institutional Standard: University of Cincinnati candidates demonstrate pedagogical content
knowledge, grounded in evidence-based practices, maximizing the opportunity for learning, and
professionalism.
Model Core Teaching Standards: Standard #7: Planning for Instruction. The teacher draws upon
knowledge of content areas, cross-disciplinary skills, learners, the community, and pedagogy to
plan instruction that supports every student in meeting rigorous learning goals. Standard #8:
Instructional Strategies. The teacher understands and uses a variety of instructional strategies
to encourage learners to develop deep understanding of content areas and their connections,
and to build skills to access and appropriately apply information.
Ohio Standards for the Teaching Profession: Standard #4: Instruction. Teachers plan and deliver
effective instruction that advances the learning of each individual student.
NBPTS Core Principles: 2: Know the subjects they teach and how to teach those subjects to
students
Assessments: program-specific assessments; direct observation and collaborative assessment
logs; Student-Teacher Performance Assessment; dispositions assessment; unit and planning
assessments; employer survey; candidate program evaluation; mentor program evaluation;
Praxis II Principles of Learning and Teaching
Alignment of Assessments Addressing Dispositions
Institutional Standard: University of Cincinnati candidates demonstrate collaboration and
leadership, and engage in positive systems change.
Model Core Teaching Standards: Standard #10: Collaboration. The teacher collaborates with
students, families, colleagues, other professionals, and community members to share
responsibility for student growth and development, learning, and well-being.
Ohio Standards for the Teaching Profession: Standard #6: Teachers collaborate and
communicate with students, parents, other educators, administrators, and the community to
support student learning. Standard #7: Teachers assume responsibility for professional growth,
performance, and involvement as individuals and as members of a learning community.
NBPTS Core Principles: 5: Members of learning communities
Assessments: disposition assessments; disposition observations; Teacher Performance
Assessment; parent feedback
Institutional Standard: University of Cincinnati candidates demonstrate the moral imperative to teach all
students and address this responsibility with tenacity.
Model Core Teaching Standards: Standard #9: Reflection and Continuous Growth. The teacher
is a reflective practitioner who uses evidence to continually evaluate his/her practice,
particularly the effects of his/her choices and actions on others (students, families, and other
professionals in the learning community), and adapts practice to meet the needs of each
learner.
Ohio Standards for the Teaching Profession: Standard #5: Teachers create learning
environments that promote high levels of learning and achievement for all students.
NBPTS Core Principles: 4: Think systematically about their practice and learn from experience
Assessments: disposition assessments; disposition observations; Student-Teacher Performance
Assessment
Assessments Ensuring Integrity of Field Experiences and Supervision
 Candidate evaluation of field experience
 Supervisor evaluation of field experience
 Candidate evaluation of university supervisor
 Collaborative assessment logs
 Goal setting agreements
 Performance assessments
 Dispositions assessments
Ensuring General Quality of Programs and Unit Operations
 Supervisor evaluation of field experiences
 Candidate evaluation of field experiences
 Candidate evaluation of university supervisor
 Candidate evaluation of each course
 CECH Student Satisfaction Survey
 Candidate evaluation of program
 Results of the Ohio Department of Education teaching assignments search
 Results of the Ohio Office of Jobs and Family Services survey of graduates
 Employer surveys
 Follow-up surveys with successful graduates
Use of Technology
All faculty evaluations of programs and a growing number of candidate evaluations of programs
are completed through Qualtrics, an online program that assists in distributing surveys and
aggregating data. Excel is used to aggregate the data. All data are posted to the University of
Cincinnati NCATE website. Assessments are also distributed electronically and, for the most
part, submitted electronically. We are approaching the goal of operating as a paperless system.
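A minimal sketch of this aggregation step appears below. It assumes a hypothetical CSV export
with a program column and numbered item columns; it is illustrative only, not the Office of
Assessment and Continuous Improvement's actual pipeline.

```python
# Illustrative aggregation of a survey export; the file name and item
# columns are hypothetical placeholders.
import pandas as pd

# Qualtrics surveys can be exported as CSV; read one hypothetical export.
responses = pd.read_csv("program_evaluation_export.csv")

# Item-level means and response counts per program, rounded for posting.
summary = (
    responses.groupby("program")[["item_1", "item_2", "item_3"]]
    .agg(["mean", "count"])
    .round(2)
)

# Write an Excel workbook for program faculty (requires openpyxl).
summary.to_excel("program_evaluation_summary.xlsx")
```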
Evaluations and the Evaluation Review Process
Evaluation forms for the Assessment System were initially generated through the work of the
Assessment and Evaluation Work Group. These forms were piloted in 2001-2002 and used
again in 2002-2003. Each year, when programs are provided with their centrally managed data,
faculty members are given the opportunity to modify the program-specific items on the
evaluations. The forms for each of the programs of the unit are provided in the Appendix. Each
program is provided the data from its evaluations in Excel spreadsheets in order to explore
other analytic techniques. After the system has been implemented for three years, we will
review whether our current analytic techniques are providing the information we need.
Systematic Evaluation of Changes
The unit not only makes changes where evaluations indicate, but also systematically studies the
effects of any changes to assure that the intended program strengthening occurs and that there
are no adverse consequences. With our use of unit-wide measures and items, we are able to
track changes across programs and across academic years. We have begun this effort, and present
all data across the years for which they are available.
References
Bonner, M., & Barnett, D. W. (2004). Intervention-based school psychology services: Training
for child-level accountability; preparing for program-level accountability. Journal of
School Psychology, 42, 23-43.
Goodwin, B., Englert, K., & Cicchinelli, L. (2002). Comprehensive accountability systems: A
framework for evaluation. Aurora, CO: Mid-Continent Research for Education and
Learning (McREL). Retrieved June 17, 2011, from
http://www.mcrel.org/PDF/AssessmentAccountabilityDataUse/5021IR_Comprehensive
AccountabilitySystems.pdf
Kiresuk, T. J., & Sherman, R. E. (1968). Goal attainment scaling: A general method for
evaluating comprehensive community mental health programs. Community Mental
Health Journal, 4(6), 443-453.
MacKay, G., McCool, S., Cheseldine, S., & McCartney, E. (1993). Goal attainment scaling: A
technique for evaluating conductive education. British Journal of Special Education, 20,
143-147.
McLaren, C., & Rodger, S. (2003). Goal attainment scaling: Clinical implications for paediatric
occupational therapy practice. Australian Occupational Therapy Journal, 50, 216-224.
National Institute for Learning Outcomes Assessment (2011). Providing evidence of student
learning: A transparency framework. Retrieved June 15, 2011, from
www.learningoutcomeassessment.org/TFCComponentAP.htm
Ohio Board of Regents (2011). Ohio TPA. Retrieved June 17, 2011, from http://ohiotpa.org
Sladeczek, I. E., Elliott, S. N., Kratochwill, T. R., Robertson-Mjaanes, S., & Stoiber, K. C. (2001).
Application of goal attainment scaling to a conjoint behavioral consultation case.
Journal of Educational and Psychological Consultation, 12(1), 45-58.
Appendix A- Student Teacher Performance Assessment Tool
Pilot Assessment - Increasing Positive Outcomes for p-12 Students
Candidate:
Observer/Date:
Coding: O - observed; S - strength; D - point for discussion (each item below receives a code
and space for notes)
Rapport and Relationships
I-Thou Interaction - interacts with each student at a
person to person level
Calls students by name
Greets students at the door
Makes personal conversation with students with more
than superficial knowledge
Smiles
Makes eye contact
Active listening - reflects back the emotion in a clarifying
statement
Gives evidence of having heard the student by reflecting
the idea of feelings of the student
Jokes to relieve tension
Asks questions and makes comments that demonstrate
personal interest
Shows humor
Provides praise and reprimand without producing
student embarrassment
Shows respect
Give compliments
Encourages attendance and enthusiastically personally
attends extracurricular activities
Comments/concerns/examples:
Communication
Welcoming tone of voice
Reflects a calm visage
Clarifies understanding, recognizing that they may be
responsible for the lack of understanding
Paraphrases and expands on student ideas
Provides support (e.g. "I appreciate how difficult this
seems.")
Varies pitch, volume, and inflection
Nods and gestures to encourage and demonstrate
enthusiasm
Comments/concerns/examples:
Motivation
Encouraging Feedback, such as complimenting sincerely
Praises the accomplishment/achievement
Challenges students to think, problem solve, take up the
challenge
Asks questions that intrigue students
Relates to students' experiences in their community, as a
class, as members of a school
Provides a rationale for the lesson, concept, skill that is
accepted by students
Allows students to make some decisions
Involves students in discussion, activity, or teaching
Enforces classroom rules
Uses cooperative/collaborative learning structures
Comments/concerns/examples:
Learning environment
Written communication is legible, clear, and attractive
Books readily available in the room
Relevant posters, changed frequently
Pictures of the class/students are posted in classroom
Computers/software available and in use for reinforcing
instruction
Videos used as instructional media
Arranges the classroom to facilitate interaction
Comments/concerns/examples:
Management
Clarifies how the student might use feelings
constructively
Manages classrooms through clear procedures which are
verbalized and reviewed
Provides opportunities to make decisions about
procedures
Refrains from using negative judgments (e.g., “you should
never,” “everybody ought to,” “any fifth grader would
understand this”)
Uses explicit reprimands (e.g., “In this room people are
quiet while others are talking. Please keep quiet for our
speaker.”)
Makes statements regarding self-management and
personal responsibility rather than relying on teacher
presence and control
Circulates among students
Assumes role of learner, listener, supportive adult as
needed
Provides clear rules and procedures
Actively follows teacher’s rules and procedures
Consistent with rules/procedures
Reminds students of rules
Provides nonverbal signals that behaviors need to
change
Consistently and fairly provides natural consequences
Uses the least intense correction possible
Ignores minor issues when students continue to be
engaged; picks battles
Uses rational rather than power arguments
Responds positively to justified criticism
Provides redundant cues - visual and verbal; kinesthetic
and verbal; written and spoken
Appropriate flexibility in applying rules
Makes rules together with students
Comments/concerns/examples:
Instruction
Frequent and varied testing
Provides adequate wait time
Changes tack when lesson is lagging
Probes for students' background, beliefs, and interests
Explains the reason for activities
Uses content specific pedagogy
Comments/concerns/examples:
Assessment
Engages students in evaluating their own work
Engages students in reviewing their progress
Varies assessments using:
  learning logs
  performances; journals
  portfolios/work samples
  post-test/pre-test
  questioning
  students as teachers
  other
Comments/concerns/examples:
Initiative
Seeks or accepts new tasks
Acquires resources for teaching
Identifies a mentor or model teacher who is active,
positive, and engaged
Generates new ideas, relationships, applications,
products
Seeks out and uses data and strategies to address
classroom concerns
Consciously modifies behavior toward students to obtain
desirable results
Makes predictions about the effect of own behavior on
students and tests those predictions
Comments/concerns/examples:
Reflection
Separates own opinions from data
Verbalizes that conditions or events can improve
Uses data as opposed to acting on impulse
Analyzes own behavior
Believes students are capable of liking him or her
Comments/concerns/examples:
Differentiates instruction
Analyzes student work and reteaches
Implements IEP-identified accommodations and
adaptations
Adaptive Technology
Alternative activities
Inclusive instruction
Independent study
Learning contracts
One on one
Peer support
Small groups
Varied assignments and activities; no single
activity/assignment longer than 20 minutes without
movement or change
Varied texts
Comments/concerns/examples:
Characteristics of Effective Urban Teachers
Perseveres despite challenges that may arise
Demonstrates commitment to carrying out all objectives,
activities, and projects to promote high standards
Describes challenges through multiple lenses
Demonstrates unique paths to problem solving
Holds high expectations
Emphasizes strengths rather than deficits
Demonstrates self-reflection regarding relationships
Creates learning opportunities adapted to diverse
populations
Ardently interested
Persistence
Value of children's learning
Putting ideas into practice
Approach to at-risk students
Professional/personal orientation to students
Professional/personal orientation to bureaucracy
Professional/personal orientation to fallibility
Strong planning and organization
Respect for parents
Comments/concerns/examples:
Works Cited
American Association of School Personnel Administrators. (1997). Teacher of the future: A continuous cycle of improvement.
Bebeau, M. J., Rest, J. R., & Narvaez, D. (1999). Beyond the promise: A perspective on research in moral education.
Benfu, L. (2000). Ethics teaching in medical schools. The Hastings Center Report, 30(4).
Benninga, J. S., Berkowitz, M., Kuehn, P., & Smith, K. (2003). The relationships of character education and academic
achievement in elementary schools. Journal of Research in Character Education, 1(1), 17-30.
Darling-Hammond, L., & Sykes, G. (Eds.) (1999). Teaching as the learning profession. San Francisco: Jossey-Bass.
Darling-Hammond, L., Wise, A. E., & Pease, S. R. (1983). Teacher evaluation in the organizational context: A review of the
literature. Review of Educational Research, 53(3), 285-328.
Esquivel, G. B. (1995). Teacher behaviors that foster creativity. Educational Psychology Review, 7(2), 185-202.
Goodlad, J. (2002). Kudzu, rabbits, and school reform. Phi Delta Kappan, 84(1), 16-23.
Greenwood, C. R., & Maheady, L. (1997). Measurable change in student performance: Forgotten standard in teacher
preparation? Teacher Education and Special Education, 20(3), 265-275.
Haberman, M. (1996). Selecting and preparing culturally competent teachers for urban schools. In J. Sikula (Ed.), Handbook
of research on teacher education (pp. 747-760). New York: Macmillan.
Murray, H. G. (1985). Classroom teaching behaviors related to college teaching effectiveness. New Directions for Teaching and
Learning, 1985(23), 21-34.
National Commission on Teaching and America's Future. (1996). What matters most: Teaching for America's future.
Rabinowitz, W., & Travers, R. M. W. (1953). Problems of defining and assessing teacher effectiveness. Educational Theory, 3(3),
212-219.
Rey, R. B., Smith, A. L., Yoon, J., Somers, C., & Barnett, D. (2007). Relationships between teachers and urban African American
children. School Psychology International, 28(3), 346-364.
Rogers, D., & Webb, J. (1991). The ethic of caring in teacher education. Journal of Teacher Education, 42(3), 173-181.
Simon, A., & Boyer, E. G. (Eds.). (1974). Mirrors for behavior III: An anthology of observation instruments. Wyncote, PA:
Communication Materials Center.
Thompson, S., Rousseau, C., & Ransdell, M. (2005). Effective teachers in urban school settings: Linking teacher disposition and
student performance on standardized tests.
VanGyn, G. (1996). Reflective practice: The needs of professions and the promise of cooperative education. Journal of
Cooperative Education, 31(2-3), 103-131.
van Tartwijk, J., den Brok, P., Veldman, I., & Wubbels, T. (2009). Teachers' practical knowledge about classroom management in
multicultural classrooms. Teaching and Teacher Education, 25, 453-460.
Wayda, V., & Lund, J. (2005). Assessing dispositions: An unresolved challenge in teacher education. Journal of Physical
Education, Recreation, and Dance, 76(1), 34-76.