American University – School of Education, Teaching, and Health
NCATE: 2.2.a Standard on which the unit is moving to the target level
2.2.a Describe areas of the standard at which the unit is currently performing at the target level for
each element of the standard. Summarize activities and their impact on candidate performance and
program quality that have led to target level performance. Discuss plans and timelines for attaining
and/or sustaining target level performance as articulated in this standard.
To describe the work undertaken to move to the Target Level, excerpts from the "target" rubrics of
NCATE Standard 2 are cited in [brackets] below, along with specific examples of evidence showing that
the EPP is moving to the Target Level or a timeline for doing so.
2a. Assessment System
[The unit, with the involvement of its professional community, is regularly evaluating the capacity and
effectiveness of its assessment system, which reflects the conceptual framework and incorporates
candidate proficiencies outlined in professional and state standards.]
Members of the EPP's professional community, including full-time and part-time faculty, clinical
faculty supervisors, supervisor leads, methods leads, cooperating teachers, partner school principals,
and program alumni, each have a role in the assessment system, which reflects the conceptual framework
principles of Community, Diversity, Equity, and Excellence as well as the INTASC and SPA standards and
the standards designed for advanced programs in curriculum and instruction. Evaluation of the system
and its assessments by these stakeholders occurs at the three levels described in Exhibit 2.4.a.
In AY 2013-2014, the EPP systematized the participation of alumni and school leaders in the assessment
system by developing and implementing alumni and employment surveys. The EPP is also assisting the
state in its efforts to develop an educator database that tracks the performance of candidates licensed
in DC. The advanced program in curriculum and instruction is building its professional community, which
includes school leaders, alumni and leaders in policy organizations.
[The unit regularly examines the validity and utility of the data produced through assessments and
makes modifications to keep abreast of changes in assessment technology and in professional
standards.]
The EPP collects data that are meaningful, valid, and useful in measuring candidate performance and
uses these data for program improvement. Each key assessment includes a rubric that indicates the
acceptable level of candidate performance. For initial programs, the validity and utility of the data
produced have been demonstrated during the SPA or state review and rejoinder process. As SETH programs
continue to receive national recognition and are externally reviewed less frequently, the Director of
the Office of Teacher Education and the Director of Special Education will design and implement a
biannual internal program review system. This system is projected to be in place in AY 16-17 to allow
enough time for SPA and state review feedback, revised assessment implementation, and effective system
design. This process will ultimately include reviewing evidence of the relationships among candidate
performance assessments, candidate program success, and graduate success in the classroom
(see below). A similar process will be adopted by the advanced programs in AY 17-18.
[Decisions about candidate performance are based on multiple assessments....Data show a strong
relationship of performance assessments to candidate success throughout their programs and later in
classrooms or schools.]
The EPP has operated an assessment system with multiple assessments occurring at multiple decision
points since 2006 for initial programs, and began implementing this process in 2014 for advanced
programs. Decision point details are provided in Exhibit 2.4.a.
SPA and state review data show a strong relationship between performance assessment outcomes and
candidate program success. Faculty and program directors regularly use assessment data to follow
individual candidates and monitor areas for improvement. For example, if a candidate performs poorly
on the unit-plan key assessment in a methods class, the faculty member and director will alert the
clinical faculty supervisor to provide additional support in lesson and unit planning during field
experience. Confirmation of the expected growth occurs in the final evaluation of student teaching and
professional teaching portfolio assessments.
The employment and performance follow-up decision point was recently added and will be the focus of
the work of the directors of SETH's initial and advanced programs in AY 14-15. An employment and
performance follow-up survey for all programs is being designed and will be implemented in Fall 2014.
The Director of the Office of Teacher Education is also providing feedback to the state about its
Educator Preparation Program system, which is currently being designed for operation in 2015.
This system will provide performance data about program graduates. Assessments at each decision
point for advanced programs are being implemented and tested in AY 14-15.
[The unit conducts thorough studies to establish fairness, accuracy, and consistency of its assessment
procedures and unit operations. It also makes changes in its practices consistent with the results of these
studies.]
In the past two years the EPP has taken the following measures to ensure fairness, accuracy and
consistency in its assessment procedures and unit operations:
·Updated all assessments in initial programs to the new 2011 INTASC standards, most notably the
Professional Teaching Portfolio and Final Evaluation of Student Teaching. It is important to the EPP that
programs reflect the education field's new understanding about student learning, research, common
core standards, and teaching practice.
·Aligned key assessments in initial programs more closely to the individual SPA standards as suggested
by SPA reviewers. The EPP emphasizes the importance of candidate knowledge of their discipline; these
aligned assessments provide an opportunity to specifically measure knowledge, skills and dispositions
defined by specialty associations.
·Updated rubrics for key assessments, including those used to assess candidate performance at the
completion of student teaching. Rubrics were updated with guidance of supervisor leads, methods leads
and the Teacher Education Committee. Faculty and cooperating teachers who use assessments were retrained.
·Ensured that multiple reviewers complete the key assessments. For example, the Final Evaluation of
Student Teaching and the Lesson Plan Analysis are both completed by the candidate, cooperating
teacher and clinical supervisor who discuss assessment results and how results might differ among
evaluators. In addition, two clinical faculty members score the professional teaching portfolio. This
ensures a consensus regarding a candidate's mastery of the INTASC standards. Currently, a third
evaluator is employed if scores differ by more than 10 points.
Many of these new assessments and rubrics are being implemented for the first time, and data are
currently being collected. The program directors will review at least two implementations of these
assessments with the supervisor leads, methods leads, and the teacher education or special education
committee to determine whether the assessments are valid and provide useful data for assessing
candidate performance. During assessment reviews, these stakeholders will review the range of scores provided to
each candidate to ensure training for evaluators is resulting in accurate and fair assessments. These
studies will occur in June 2015. As assessments for the advanced programs are implemented, the
Director and faculty will employ many of these same techniques to ensure assessments are fair,
accurate and consistent.
2b. Data Collection, Analysis, and Evaluation
SETH's assessment system provides regular and comprehensive data on program quality, unit
operations, and candidate performance at key program stages, extending beyond the first year of
completers' practice. Assessment data from candidates, graduates, faculty, and other members of the
professional community are based on multiple assessments from both internal and external sources that
are systematically collected.
In 2010, the EPP invested in upgrading and expanding its web-based, customized assessment system.
The Dean approved a vendor change to the GoEd system, which provides for more robust portfolio,
assessment, and reporting capabilities. In 2012 GoEd expanded to include new modules to track
prospective candidates, monitor candidate advising, track faculty appointments, add a separate
portfolio for the Special Education program, and update the Teacher Education portfolio system to
include the new INTASC standards. Key assessments continue to be implemented and scored through
GoEd by faculty, clinical faculty supervisors, cooperating teachers, and candidates. The system allows
cooperating teachers and clinical faculty supervisors access to real-time assessment data about
candidates and provides candidates a place to track their performance at each decision point in the
program. In addition, GoEd has a robust reporting system that allows assessment data to be aggregated
and disaggregated in many ways: across semesters, by programs, by traditional and alternative route,
etc. Four important changes to the system have improved candidate quality and overall unit
operations.
• Prospective Candidates: The prospective student module allows unit staff to track
correspondence with candidates. This allows staff and faculty to review the transcript analysis of
candidates before program admission and to notify them as early as possible if they have any
content deficiencies. A candidate can then work with program advisors to create a plan to
eliminate content deficiencies. This process has increased the number of candidates admitted
without content deficiencies.
• Candidate Advising: Program advisors use the system to track candidate advising. During the
initial advising session, advisors are able to outline the entire program of study with candidates
online. Any staff member who communicates with a candidate enters advising notes, including
candidate complaints and resolutions. This system provides increased program transparency to
candidates and staff members. These changes have helped integrate GoEd into all EPP processes
and allow the EPP to better track candidates' progress through their programs.
• Key Assessments: As the EPP receives national recognition from the SPA and state review
process, key assessments are updated in the GoEd system. Entering all SPA assessments into
GoEd allows staff to hold faculty accountable for completing required assessments. Candidates
are also more aware of the importance of content and discipline-specific assessments.
• Faculty System: A new faculty module was created to collect syllabi, faculty CVs, adjunct faculty
rate information, and results of student evaluations of faculty teaching. Adjunct faculty
members also use the system for re-appointment evaluations.
In AY 14-15, the EPP will turn its attention to expanding GoEd to include information about candidates in
their first year of practice. A new alumni section is under development and will be operational by Fall
2014. This section will collect employment data about candidates. Candidate surveys will be collected
and these data, when appropriate, will be linked to a candidate's alumni account.
In addition, in AY 14-15 and 15-16, the EPP will add the advanced programs' key assessments to the
GoEd system, train candidates to use these assessments, train faculty to score them in the system, and
review the resulting data.
[These data are regularly and systematically compiled, aggregated, summarized, analyzed, and reported
publicly for the purpose of improving candidate performance, program quality, and unit operations.]
Standard 2 Section 2a above and Exhibit 2.4.a explain the three-level system for using data to improve
candidate performance, program quality, and EPP operations. In the past year, a related change to the
EPP has been the creation of the new Methods Lead and Supervisor Lead positions and their required use
of assessment data. These new leaders review data with the Director of the Office of Teacher Education
and the Director of the Special Education Program and propose program changes when the evidence
indicates a change is warranted.
As the advanced program assessments are added to the system, full-time faculty in those programs will
follow the same framework to review data and propose program changes as necessary.
[The unit has a system for effectively maintaining records of formal candidate complaints and their
resolution.]
The EPP will continue to follow the official AU Policy on Student Academic Grievances.
[The unit is developing and testing different information technologies to improve its assessment system.]
SETH budgets resources for the GoEd system to be maintained by an external developer and upgraded
when needed. The Dean approves all decisions regarding changes to the system based on overall EPP
benefit and specific program requirements. The Director of the Office of Teacher Education and program
development staff regularly attend national conferences (AACTE, CAEP, AERA, ISTE) to stay abreast of
current trends in teacher education, including new technologies that may enhance the unit's assessment
system. Currently the EPP is focusing on system enhancements related to the collection of employment
and employment performance data from alumni.
2c. Use of Data for Program Improvement
[The unit has fully developed evaluations and continuously searches for stronger relationships in the
evaluations, revising both the underlying data systems and analytic techniques as necessary. The
unit...systematically studies the effects of any changes to assure that programs are strengthened
without adverse consequences.]
Many of the changes made to assessments during the last three years have been to assessment
procedures or rubrics. These changes were based on SPA reviewer feedback, peer review feedback, or
feedback from the faculty and supervisors implementing the assessments. As programs receive
national recognition and assessments are implemented multiple times, the EPP will have additional data
available to study the effects of these assessment changes on the programs, and on candidates'
performance after graduation. Please see Exhibit 2.4.g. for changes made in the past three years.
In the next year the Director of the Office of Teacher Education will work with faculty leads and
special education faculty to design a monitoring system focused on changes to assessments and
assessment processes. This group's primary goal will be to monitor the effects of those changes on
candidate outcomes.
[Candidates and faculty review data on their performance regularly and develop plans for improvement
based on the data.]
Through the three levels of the Assessment Analysis Structure described earlier, program stakeholders
analyze data each semester, annually, and biannually. Plans for improvement are developed based on
the data reviewed.