
2015 Certification & Program Officials
Conference
Sessions E1-6: GaPSC/CAEP Approval Process
December 2, 2015
Enjolia Farrington and Nate Thomas
GaPSC Education Specialists
Collaborate with us!
Visit our Conference Padlet throughout the day to post your questions.
At 3:00 there will be a general session wrap-up by Kelly Henson, and then
a GaPSC panel will answer your questions. Be in the know!
http://padlet.com/julie_beck3/certification_conference
Password: Cert Conf
Presentations
http://www.gapsc.com/Commission/Media/DocsPresentations.aspx
Our Mission
To build the best prepared,
most qualified, and most
ethical education workforce in
the nation.
Topics
1. Warm-Up Activity: KWL
2. Why Program Approval?
3. Overview of GaPSC approval process
4. Overview of CAEP accreditation
5. Program Approval Standards & Evidence
6. GaPSC Updates
7. Annual Reporting
Warm-Up Activity
KWL Chart
1. At each table, discuss what you know about the approval standards and process.
2. Individually, add post-its of what you want to learn during the session.
3. As you learn something new, add to the chart throughout the session.
Overview
Program Approval
Establishes and enforces standards used
to prepare teachers, service personnel,
and school leaders, and approves
education program providers and
programs whose candidates receive state
certification.
Big Questions of Program
Approval
• Is this Educator Preparation Provider
preparing quality candidates who can be
effective educators?
• How do we know?
Peer Review System
Site Visit
(peer review of education program provider and preparation programs)
Evaluation Review Panel
(peer approval recommendation to the Ed Prep Standing Committee if not all
standards are met)
Educator Preparation Standing Committee
(final approval recommendation)
Professional Standards Commission
(final approval decision)
GaPSC Approval Process
Process - GaPSC
• ISA Form completed
• PRS-II created
• Site visit team formed
• Site visit team reviews PRS-II
• State Program Reports sent to Educator Preparation Provider (EPP)
• EPP creates Addendum in PRS-II
• Site visit team reviews new evidence
• Previsit with chair and Education Specialist to finalize site visit
• Site visit team comes on campus
• Report completed and recommendations made to ERP/EPSC/Commission
PRS-II
• Use of PRS-II
– Upload evidence
– Provide narrative about evidence
– No other evidence room
– When will PRS-II be completed? (typically 8 months before the onsite visit)
CAEP Accreditation Process
Process - CAEP
• ISA Form completed and confirmed by Education Specialist
• Select dates and notify CAEP (12-18 months prior to visit)
• AIMS template available
• Site visit team formed
• Site visit team reviews Self-Study Report
• Site visit team submits Formative Feedback Report
• EPP submits Addendum in AIMS
• Site visit team reviews new evidence
• Virtual previsit with chair and Education Specialist to finalize site visit
• Site visit team comes on campus
• Report completed within 30 days and EPP submits Rejoinder
Process – CAEP
Selected Improvement Pathway
• Format: structured report
• Addressing the standards: EPPs write directly to the standards with
evidence and supporting narrative
• Demonstrating quality assurance: The EPP develops and implements a
data-driven Selected Improvement Plan that focuses on improvement with
respect to a selected Standard, Standard component, or cross-cutting
theme
CAEP Self-Study
• EPP context & Conceptual Framework
• Capacity tables (regional accreditation, clinical educator qualification, parity)
• Evidence uploaded for each standard
• Questions/prompts specific to the standard about the source of evidence
– Questions or prompts are specific to the type of evidence and the standard
– Characterization of the quality of the evidence
– Discussion of results and their implications
– Demonstration of quality assurance
• Response to previous NCATE Area(s) for Improvement
• Submission of Selected Improvement Plan and Recruitment Plan
• Evidence of integration of cross-cutting themes of diversity and technology
CAEP Review of Assessments
• Improve the quality of assessments used by EPPs to evaluate and report
candidate/completer performance
• EPP-wide assessments/surveys reviewed by CAEP up to three years prior
to submitting the Self-Study Report
– Specific assessments created or modified by the EPP and used across all discipline
specific content areas in the EPP
– Student teaching observation instruments, exit surveys, teacher work samples,
portfolios, etc.
• The intent is to allow EPPs time to:
– improve their assessments/surveys and scoring guides,
– provide more precise feedback to candidates,
– improve the program’s ability to analyze data for evidence leading to continuous improvement, and
– ensure consistency among evaluators using the same assessment.
CAEP AFIs & Stipulations
• Area for Improvement: Identifies a weakness in the evidence for a
component or a standard. A single AFI is usually not of sufficient severity
that it leads to an unmet standard. Must be corrected within seven years.
• Stipulation: Deficiency related to one or more components or a CAEP
standard. A stipulation is of sufficient severity that a standard is
determined to be unmet. For EPPs seeking to continue their accreditation,
a stipulation must be corrected within two years to retain accreditation.
Standards &
Features of Evidence
Standards
• Use of Program Approval Standards (2016)
– 5 CAEP Standards
– 1 Georgia Special Requirements Standard
– Effective Fall 2016
• Use PRS-II
– Upload evidence
– Provide narrative about evidence
Standard 1
Content & Pedagogical Knowledge
Standard 1:
Content and Pedagogical Knowledge
Providers ensure that:
• Candidates apply content and pedagogical knowledge
• Candidates demonstrate skills and commitment that afford access to college- and career-ready standards
• Candidates use research and evidence
• Candidates demonstrate understanding of the InTASC Standards
• Candidates model and apply technology standards
Cross-cutting elements: Diversity (InTASC Standard 2); Provider Quality Assurance and Continuous Improvement
Standard 1:
Content and Pedagogical Knowledge
• How do candidates demonstrate an understanding of the 10
InTASC standards at the appropriate progression level(s) in the
following categories: the learner and learning; content;
instructional practice; and professional responsibility?
• How does the provider ensure that candidates use research and
evidence to develop an understanding of the teaching
profession?
• How do candidates use research and evidence to measure their
P-12 students' progress and their own professional practice?
Standard 1:
Content and Pedagogical Knowledge
• How do candidates apply content and pedagogical knowledge as
reflected in outcome assessments in response to standards of Specialized
Professional Associations (SPA), the National Board for Professional
Teaching Standards (NBPTS), states, or other accrediting bodies (e.g.,
National Association of Schools of Music - NASM)?
• How do candidates demonstrate skills and commitment that afford all
P-12 students access to rigorous college- and career-ready standards
(e.g., Next Generation Science Standards, National Career Readiness
Certificate, Common Core State Standards)?
• How do candidates model and apply technology standards as they
design, implement and assess learning experiences to engage students
and improve learning; and enrich professional practice?
Standard 1:
Features of Potential Evidence
• Data are disaggregated by licensure area
• Evidence directly informing candidate proficiency
for each of the four InTASC categories is provided
• At least two cycles of data are provided
• At least one comparison point is available
for analysis
Standard 1:
Features of Potential Evidence
• Specific documentation is provided that
candidates “use research and evidence”
– Items on observational instruments
– Required in unit or lesson plans
– Part of work sample
– edTPA
Standard 1:
Features of Potential Evidence
• Plan or submitted assessments that include:
– Assessment or observation proficiencies specific to
college- and career-ready teaching are identified
– Plan or assessments are specific to subject content area
– Candidates demonstrate deep content knowledge
– Candidates have required students to apply knowledge to
solve problems and think critically in subject area
– Candidates have demonstrated the ability to differentiate
instruction for students with at least two different needs
(e.g., ELL, urban/rural, disadvantaged, low- or high-performing)
Standard 1:
Features of Potential Evidence
• At least three of the four categories listed below
are addressed:
– Accessing databases, digital media, and tools to
improve P-12 learning
– Knowing why and how to help P-12 students to access
and assess quality digital content
– Ability to design and facilitate digital learning,
mentoring, and collaboration, including the use of
social networks
– Candidate use of technology to track, share, and
evaluate student learning
Standard 2
Clinical Partnerships & Practice
Standard 2:
Clinical Partnerships & Practice
Providers ensure that:
• Partners co-select, prepare, evaluate, support, and retain high-quality clinical educators
• Partners co-construct mutually beneficial P-12 collaborations (establishing mutually agreeable expectations for candidate entry, exit, theory and practice, coherence, and shared accountability for candidate outcomes)
• Providers work with partners to design clinical experiences of sufficient depth, breadth, diversity, coherence, and duration
Cross-cutting element: Provider Quality Assurance and Continuous Improvement
Standard 2:
Clinical Partnerships and Practice
• How do clinical partners co-construct mutually beneficial P-12 school and
community arrangements, including technology-based collaborations, for
clinical preparation and share responsibility for continuous improvement
of candidate preparation?
• What are the mutually agreeable expectations for candidate entry,
preparation, and exit to ensure that theory and practice are linked,
maintain coherence across clinical and academic components of
preparation, and share accountability for candidate outcomes?
• How do clinical partners co-select, prepare, evaluate, support, and retain
high-quality clinical educators, both provider- and school-based, who
demonstrate a positive impact on candidates' development and P-12
student learning and development?
Standard 2:
Clinical Partnerships and Practice
• What are the multiple indicators and appropriate technology-based applications used
to establish, maintain, and refine criteria for selection, professional development,
performance evaluation, continuous improvement, and retention of clinical educators
in all clinical placement settings?
• How does the provider work with partners to design clinical experiences of sufficient
depth, breadth, diversity, coherence, and duration to ensure that candidates
demonstrate their developing effectiveness and positive impact on all students'
learning and development?
• How are clinical experiences, including technology-enhanced learning opportunities,
structured to have multiple performance-based assessments at key points within the
program to demonstrate candidates' development of the knowledge, skills, and
professional dispositions (as delineated in Standard 1) that are associated with a
positive impact on the learning and development of all P-12 students?
Standard 2:
Features of Potential Evidence
• Evidence documents that P-12 schools and EPPs have both benefitted from the
partnership
• Evidence documents that a collaborative process is in place
– A shared responsibility model that includes such things as co-construction of instruments and
evaluations, curriculum revisions, and key assignments
– A system is in place to ensure P-12 educators are involved in ongoing decision-making (e.g.,
exit and entry requirements)
• Evidence documents that clinical experiences are sequential, progressive, and
linked to coursework
• Educators and/or administrators co-construct criteria for selection of clinical
educators
– Educators and/or administrators are involved in the selection and evaluation of clinical educators
– Candidates and clinical educators evaluate each other
– Results of evaluations are shared with clinical educators
Standard 2:
Features of Potential Evidence
• Data are collected (survey data specific to clinical
educators) and used by EPPs and P-12 educators to refine
criteria for selection of clinical educators
• Resources are available on-line to ensure access to all
clinical educators
• Clinical educators receive professional development on the
use of evaluation instruments, professional disposition
evaluation of candidates, specific goals/objectives of the
clinical experience, and providing feedback.
Standard 2:
Features of Potential Evidence
• Evidence documents that all candidates have active clinical
experiences in diverse settings
• Attributes (depth, breadth, diversity, coherence, and duration) are
linked to student outcomes and candidate/completer performance
– Impact is assessed by candidates in more than one clinical experience
– Candidates use both formative and summative assessments
– Candidates are assessed on the ability to use data to measure impact
on student learning and development and to guide instructional
decision-making
– Candidates have purposefully assessed impact on student learning
using two comparison points
Standard 2:
Features of Potential Evidence
• Evidence documents that candidates have used technology to
enhance instruction and assessment
– Use of technology is both by candidate and students
– Specific criteria for appropriate use of technology are identified
• Clinical experiences are assessed using performance-based criteria
– Candidates are assessed throughout the program with data supporting
increasing levels of candidate competency
– Evidence documents a sequence of clinical experiences that are
focused, purposeful, and varied, with specific goals for each experience
Standard 3
Candidate Quality, Recruitment, and Selectivity
Standard 3:
Candidate Quality, Recruitment, and Selectivity
Providers:
• Present plans and goals to recruit high-quality candidates (diverse backgrounds and populations, hard-to-staff schools, and shortage areas)
• Set admission requirements
• Establish and monitor dispositions beyond academic ability
• Create criteria for program progression and monitor candidate advancement
• Document a high standard for content knowledge and effective impact on P-12 learning
• Ensure candidates understand the Code of Ethics, Professional Standards of Practice, and relevant laws
Cross-cutting element: Provider Quality Assurance and Continuous Improvement
Standard 3:
Candidate Quality, Recruitment, and Selectivity
• How does the provider present plans and goals to recruit and support completion of
high-quality candidates from a broad range of backgrounds and
diverse populations to accomplish its mission?
• How does the provider ensure that the admitted pool of
candidates reflects the diversity of America's P-12 students?
• How does the provider address community, state, national,
regional, or local needs for hard-to-staff schools and shortage
fields (currently STEM, English-language learning, and students
with disabilities)?
• What are the admission requirements?
Standard 3:
Candidate Quality, Recruitment, and Selectivity
• How does the provider gather data to monitor applicants and the selected
pool of candidates?
• Provide an analysis of the evidence that ensures that the average grade point
average of its accepted cohort of candidates meets or exceeds the minimum
of 3.0, and that the group average performance on nationally normed
ability/achievement assessments is in the top 50 percent, from 2016-2018.
• How does the provider establish and monitor attributes and dispositions
beyond academic ability that candidates must demonstrate at admission and
during the program?
• How does the provider select criteria, describe the measures used and
evidence of the reliability and validity of those measures, and report data
that show how the academic and non-academic factors predict candidate
performance in the program and effective teaching?
Standard 3:
Candidate Quality, Recruitment, and Selectivity
•
What are the criteria for program progression and how does the provider monitor
candidates' advancement from admissions through completion?
•
Analyze the evidence to indicate candidates' development of content knowledge,
pedagogical content knowledge, pedagogical skills, and the integration of
technology in all of these domains.
•
How does the provider document that the candidate has reached a high standard
for content knowledge in the fields where certification is sought and can teach
effectively with positive impacts on P-12 student learning and development?
•
How does the provider document that the candidate understands the expectations
of the profession, including codes of ethics, professional standards of practice,
and relevant laws and policies, before recommending for licensure?
Standard 3:
Features of Potential Evidence
• Documentation of the existence of a recruitment plan, based on the EPP mission,
with targets for 5 to 7 years out
• Data on admitted and enrolled candidates are disaggregated by race/ethnicity and
gender
• Evidence that results are recorded and monitored, and that results are used in planning
preparation for shifting cohorts, including modifications to recruitment strategies
• Informed knowledge of employment opportunities in schools/districts/regions
where completers are likely to be placed is documented
• STEM and ELL opportunities are explicitly addressed in the EPP analysis of shortage-area
employment needs, along with employment needs in hard-to-staff schools
Standard 3:
Features of Potential Evidence
• EPP documents that the average score of each cohort of admitted
candidates meets a minimum GPA of 3.0 and performance on a nationally
normed test of academic achievement/ability in the top 50 percent
• OR similar average cohort performance using a state normed test of
academic achievement/ability in the top 50 percent [GACE scores]
• OR EPP has a “reliable, valid model” in which they use admissions criteria
different from those specified in 3.2 that result in positive correlation with
measures of P-12 student learning
Standard 3:
Features of Potential Evidence
• EPP establishes non-academic factors to use at admission or
during preparation that are research based
• EPP monitors progress and uses results from the evaluation of
the non-academic factors for individual candidate mentoring
and program improvement (curriculum and clinical
experiences)
• EPP reports data showing how academic and non-academic
factors predict candidate performance in the program
Standard 3:
Features of Potential Evidence
• Measures provide evidence of developing proficiencies of
candidates in critical areas such as:
– Ability to teach to college- and career-ready standards
– Content knowledge
– Dispositions
– Pedagogical content knowledge
– Pedagogical skills
– Integration of use of technology
– Impact on P-12 student learning
• EPP documents candidates’ understanding of codes of ethics,
professional standards and ethics, and relevant laws and
policies
Standard 3:
Features of Potential Evidence
• Measures provide evidence of developing proficiencies of
candidates in critical areas such as:
– Ability to teach to college- and career-ready standards
– Content knowledge
– Dispositions
– Pedagogical content knowledge
– Pedagogical skills
– Integration of use of technology
– Impact on P-12 student learning
• Evidence of actions taken, such as:
– Changes in curriculum or clinical experiences
– Changing admissions criteria
– Providing mentoring
– Counseling out
Standard 4
Program Impact
Standard 4:
Program Impact
Provider demonstrates or documents that:
• Completers effectively apply professional knowledge, skills, and dispositions
• Program completers contribute to an expected level of student learning growth
• Employers are satisfied with completers’ preparation
• Program completers perceive their preparation as relevant and effective
Cross-cutting element: Provider Quality Assurance and Continuous Improvement
Standard 4: Program Impact
• How does the provider document, using multiple measures, that program completers
contribute to an expected level of student-learning growth?
• How does the provider demonstrate, through structured and validated observation
instruments and/or student surveys, that completers effectively apply the
professional knowledge, skills, and dispositions that the preparation experiences were
designed to achieve?
• How does the provider demonstrate, using measures that result in valid and reliable
data and including employment milestones such as promotion and retention, that
employers are satisfied with the completers' preparation for their assigned
responsibilities in working with P-12 students?
• How does the provider demonstrate, using measures that result in valid and reliable
data, that program completers perceive their preparation as relevant to the
responsibilities they confront on the job, and that the preparation was effective?
Standard 4:
Features of Potential Evidence
• Preparation Program Effectiveness Measures (PPEM)
• EPP analyzes, evaluates, and interprets information
provided by the State
– Characteristics and patterns in data
– At least 20% of completers are represented
– Explanation of results based on results of teacher
placement
– EPP judges the implications of the data and analyses for
the preparation program and considers appropriate
modifications
Standard 4:
Features of Potential Evidence
• Observation instruments are structured and inclusive of the application of
professional knowledge, skills, and dispositions
• Interpretations of performance are made, especially in relation to
benchmarks, norms and cut scores
• EPP-administered surveys
– Survey return rates are at acceptable levels and inclusive of most licensure areas in
the EPP
– The representativeness of the sample, the characteristics of the respondents, and
the survey response rate are documented
– Disaggregated data specific to high need schools or licensure areas
– Data are analyzed, evaluated, and interpreted
– Conclusions are supported by the data and comparison points for data are
provided
Standard 5
Provider Quality Assurance & Continuous Improvement
Standard 5:
Provider Quality Assurance & Continuous Improvement
• Quality assurance system relies on relevant, verifiable, representative, cumulative, and actionable measures
• Quality assurance system monitors candidate progress, completer achievements, and provider operational effectiveness
• Provider regularly and systematically assesses performance against its goals, tracks results, tests innovations, and uses results for improvement
• Measures of completer impact are summarized, externally benchmarked, analyzed, shared widely, and acted upon
• Provider assures that appropriate stakeholders are involved in program evaluation and improvement
Standard 5:
Provider Quality Assurance and Continuous Improvement
• Describe how the quality assurance system is comprised of multiple
measures that can monitor candidate progress, completer achievements,
and provider operational effectiveness.
• Describe how the quality assurance system relies on relevant, verifiable,
representative, cumulative and actionable measures, and produces
empirical evidence that interpretations of data are valid and consistent.
• How does the provider regularly and systematically assess performance
against its goals and relevant standards, track results over time, test
innovations and the effects of selection criteria on subsequent progress
and completion, and use results to improve program elements?
Standard 5:
Provider Quality Assurance and Continuous Improvement
• How are measures of completer impact, including available
outcome data on P-12 student growth, summarized,
externally benchmarked, analyzed, shared widely, and acted
upon in decision-making related to programs, resource
allocation, and future direction?
• How does the provider assure that appropriate stakeholders,
including alumni, employers, practitioners, school and
community partners, and others defined by the provider, are
involved in program evaluation, improvement, and
identification of models of excellence?
Standard 5:
Features of Potential Evidence
• EPP’s quality assurance system monitors candidate
progress, completer achievements, and EPP
operational effectiveness
• Documentation that quality assurance system
supports targeted change (e.g., through capacity to
disaggregate data by program and/or candidate level,
and to respond to inquiries)
• Documentation that the system operations and data
are regularly reviewed
Standard 5:
Features of Potential Evidence
• Documentation that evidence is:
– Relevant (related to standard)
– Verifiable (accuracy of sample)
– Representative (typical and free of bias)
– Cumulative (generally 3 cycles or more)
– Actionable (in a form to guide program improvement)
• Documentation that interpretations of evidence
are consistent (across different sources of data)
and valid (e.g., inter-rater reliability)
Standard 5:
Features of Potential Evidence
• Documentation that EPP regularly and systematically:
– Reviews quality assurance system data,
– Poses questions,
– Identifies patterns across
• Documentation that EPP addresses apparent areas for
improvement and makes appropriate changes in preparation.
• EPP documents appropriate tests of effects of selection criteria
(under 3.2) and other program changes:
– Baseline(s)
– Intervention description
– Comparison(s) of results and next steps taken and/or planned
Standard 5:
Features of Potential Evidence
• Outcome and impact measures include:
– Analysis of trends
– Comparisons with benchmarks
– Indication of changes made in preparation
– Considerations for distribution of resources
– Future directions anticipated
• Evidence that outcome measures and their trends
are posted on the EPP website and otherwise widely shared
• Development of a data-driven Improvement Plan
Standard 5:
Features of Potential Evidence
• Specific evidence is provided that:
– Indicates which particular stakeholders are involved (e.g., alumni,
employers, practitioners, school and community partners, others
defined by the EPP)
– Illustrates ways stakeholders are involved (e.g., communications,
discussions of implications of data, program evaluation, selection and
implementation of changes for improvement, decision making)
– Regular and appropriate stakeholder groups are involved in decision-making,
evaluation, and continuous improvement (Note: not every stakeholder group
would necessarily be appropriate for every decision process)
• EPP identifies at least two examples of use of and input from
stakeholders
Standard 6
Georgia Requirements for Educator Preparation Programs
Georgia Standard 6:
Georgia Requirements
• Admission Requirements
• Reading Methods (applies only to ECE, MG, Special Ed General
Curriculum, Special Ed Adapted Curriculum, and Special Ed
General Curriculum/Early Childhood Education)
• Identification and Education of Children with Special Needs
• Georgia P-12 Curriculum, Instruction, and Educator Evaluation
• Professional Ethical Standards and Requirements for
Certification and Employment
• Field Experiences Appropriate to the Grade Level and Field of
Certification Sought and Clinical Practice
• Content Coursework Requirements for Service Programs in
Curriculum and Instruction, Instructional Technology, and
Teacher Leadership
Georgia Standard 6:
Features of Potential Evidence
• Admission Data
• Course syllabi and/or content matrices for reading
methods, special needs, and technology courses
• Content matrices from preparation program reports or
folios showing correlation with Georgia-mandated P-12
standards (i.e., Common Core Georgia Performance Standards
(CCGPS), Georgia Performance Standards (GPS), and GACE
objectives)
• Course syllabi and/or content matrices that include
knowledge about and application of professional ethics
• Field and Clinical experience tracking chart
• Preparation program Reports and/or Folios
Annual Reporting
Preparation Approval Annual Report (PAAR)
• EPPs will retrieve their PPEM data via PAAR
• Majority of data in PAAR will be pre-populated
• EPPs will respond to PPEM data and other data such as:
– Ethics violations data
– Data from the administration of the ethics assessment
– Survey data (completer survey, first-year teacher survey, employer survey)
• 2016 PAAR due after PPEM data is available
CAEP EPP Annual Report
• Due in April
• Only NCATE/TEAC/CAEP accredited EPPs
• Report outcome measures and progress on
correcting AFIs
• Report candidate & completer data
• Feedback will be provided after submission
Onsite Visit
Onsite Logistics to be
Confirmed during the Pre-visit
• Lodging, work room, meals at the hotel
• Work room, interview rooms on site
• Transportation from hotel to campus
• Wi-Fi at both locations; projector, screen, shredder, printer
• Interviews scheduled for Monday
Questions?
Thank you!
Take the survey:
https://www.surveymonkey.com/r/PSCDriveIn15