Self-Study Addendum Report
High Point University
The EPP wishes to thank the CAEP program reviewers for their formative feedback
related to High Point University’s Self Study Report (SSR) and Selected Improvement
Plan submitted in July 2015. After careful review, the EPP responds briefly below to
the feedback noted within each of the five standards, addressing the tasks identified as
showing “evidence that is inconsistent in meeting the standard” and those areas
needing “further clarification” and/or “verification”. Additional documents have been
uploaded by the EPP to support this addendum and will be referenced under the
appropriate standard below.
Standard 1: Content and Pedagogical Knowledge
Standard 1 Task 1a:
During the onsite visit the EPP will demonstrate how candidates are tracked using the
Foliotek Data Management System that the School of Education requires of all
candidates. Within this system the EPP monitors gateway assessment data for all
candidates and uses this management tool to generate EPP annual reports that provide
feedback about needed changes on a programmatic level (both disaggregated by licensure
area and aggregated to monitor trends in EPP overall quality).
Evidence that the EPP has used the TRIPOD pilot survey results and other partnership
feedback will be provided during the onsite reviews. Interviews with key partnership
stakeholders can be arranged at the CAEP team’s request.
Standard 1 Task 1b:
One question raised in the Formative Feedback Report is whether the EPP requires all
candidates to participate in university programs such as Service Learning, Undergraduate
Research, and the B.A. to M.Ed. programs (these were documented as strategies utilized
by the EPP to achieve progress on the Selected Improvement Plan’s goal of increasing
Content Knowledge). To date, the EPP does not require candidates in any of the
licensure programs to participate in these institutional programs. The data collected on
candidates who do choose to participate will be used to make such programmatic
changes if it is determined that participation does lead to significantly greater
acquisition of content knowledge (InTASC #4) and the candidate’s ability to apply and
integrate content knowledge across disciplines (InTASC). The EPP is also interested in
how participation in these types of programs enhances the “professional dispositions”
associated with InTASC Standards #4 and #5, specifically the candidate’s demonstration
that he/she “values knowledge” (InTASC #4(o), 4(p); InTASC #5(r)). Data collection
will therefore be ongoing, and candidates who do participate in these programs are being
tracked and compared to candidates who do not.
Standard 1 Task 1c:
The EPP has provided a comprehensive curriculum alignment to CAEP Standard 1 for
each of the 10 InTASC Standards (CAEP 1.1), and indicators 1.2, 1.3, 1.4 and 1.5 for
each licensure program offered (see uploaded Curriculum Matrices Alignment
document 1.c.2). The alignment table includes a listing of all required professional
education, specialty and supporting courses required for the major/licensure area, the
learning outcomes for each course, major assignments and assessments as they align to
the appropriate InTASC Standard or other standard 1 component. An additional
Curriculum Matrices table is provided (see Curriculum Matrices Table c1.1) that
summarizes this more comprehensive information to verify that all components of CAEP
Standard 1 have indeed been addressed in each major program of study offered by the
EPP. Course syllabi will be available for review during the onsite visit along with
samples of candidate work products by licensure/program area.
“Early Adopters” were asked to upload assessment instruments for CAEP Early Review
and feedback last year. The EPP complied with this request and submitted detailed
information, including the context, validity, and reliability for ten of its EPP-created
instruments, on February 17, 2015 (see AIMS site). To date, the EPP has not
received feedback from CAEP. In response, the EPP has provided an additional
document from CAEP verifying that Early Adopters would not be held responsible for
making adjustments to existing assessment tools since the training of CAEP reviewers
had not been completed in time for early adopters to receive this feedback (see document
titled “Early Adopter Assessment Review”). However, the EPP has responded to the
current Formative Feedback report by providing an overall summary table which details
how each component of CAEP Standard 1 (1.1, 1.2, 1.3, 1.4 and 1.5) and certain
components of CAEP Standards 2 (2.3) and Standard 4 (4.3, 4.4) are currently assessed
for all candidates (See document titled “Summary of HPU Assessment Alignment to
CAEP 1c.1”). This table includes the specific items within each assessment tool that
correspond to the CAEP standard component, the year(s) in which the assessment is
conducted and the corresponding Assessment Gateway that the data is used to support.
The table verifies that all InTASC Standards included in CAEP 1.1 are assessed at
various gateways in the program, at exit, and by employers during the first and second
years of teaching. Reviewers wishing to access the instruments and corresponding
data will note that the last column of the table also includes the Evidence # as it appears
in the SSR on the AIMS site. It is also worth noting that ten of these assessment
instruments, along with information on their reliability and validity, were provided in the
Assessment Review uploaded by the EPP in February of 2015 and can be accessed on
the AIMS Site as well. These ten instruments include the Candidate Disposition
Evaluation, Candidate Reflection Rubric, Content, Depth and Application Rubric,
Literacy-Infused Curriculum Project Rubric, Leadership and Collaboration Project
Rubric, Internship Formal Observation & Impact on P-12 Learning Rubric, Professional
Development Plan Rubric, Candidate Internship Evaluation Rubric, Using Data to Assess
Student Learning Rubric, and Principal and Graduate Survey. Individual EPP-created
assessment item alignments to CAEP Standard 1 have also been uploaded (see Candidate
Disposition Evaluation c1.1; Candidate Performance in the Field c1.1; Rubric for
Internship Formal Observation; LEA/IHE Certification of Capacity; Leadership
and Collaboration Rubric c1.1, Literacy-Infused Curriculum Project Rubric c1.1,
Using Data to Assess Student Learning Project Rubric c1.1, Professional
Development Plan Rubric c1.1, Content, Depth and Application Project Rubric
c1.1). These additional documents provide a more in-depth view of how each specific
assessment tool used by the EPP is aligned to the various InTASC Standards as well as
CAEP Standards 1.2, 1.3, 1.4 and 1.5.
Samples of completed assessment instruments and rubrics for candidates representative
of all program/licensure areas will be available for review during the on-site visit.
Standard 2: Clinical Partnership and Practice
Standard 2 Task 1a:
1. The EPP routinely involves clinical P-12 educators in the curriculum planning,
assessment and program improvement process. The Teacher Education Council (TEC)
includes members currently employed in local school districts as administrators,
principals and teachers. The council meets four times each academic year to provide
oversight and governance to the School of Education as it uses candidate and program
assessment data to consider improvements to existing programs, new programs of study,
and EPP initiatives. The council is also charged with oversight of EPP policies and the
approval of candidates for provisional status, admission or dismissal from the School of
Education. Due to the size (34 members) of the TEC, subcommittees are frequently
formed to address specific tasks relevant to program improvement. One example is the
Diversity Recruitment Plan Subcommittee, which was charged with developing the EPP
Diversity Recruitment Plan and intentionally included P-12 educators who could
meaningfully contribute to the development of goals and objectives in this area. A
second subcommittee, the Committee on Clinical Educator Quality, drew P-12 educators
and EPP faculty from the TEC to develop a plan for assessing and supporting those
serving as cooperating teachers and clinical educators for the EPP. As a result of this
committee’s work, the EPP has developed policies guiding the selection of cooperating
teachers, a new principal’s evaluation of cooperating teachers, and a series of
professional development opportunities for cooperating teachers. This is a working
committee whose most recent meeting was 1/4/16 (see documents titled “Principal
Survey of Cooperating Teacher 2a.1” and “Minutes of Cooperating Teacher
Subcommittee 2a.1”).
2. All candidates are required to demonstrate knowledge and use of technology in the
clinical field settings to which they are assigned beginning in sophomore year. The
Candidate Performance in the Field (Evidence #65, item #4), the Literacy-Infused
Curriculum Rubric (Evidence #3: Technology Items: 3a.1, 3c.1, 3d.1, 4d.1.), the
Candidate Internship Evaluation: Internship I/II (Evidence # 58: Technology Items:
3.d.1, 4c.1.), and the final exit evaluation, the LEA/IHE Certification of Capacity
(Evidence # 5: Items # 4d.1) are assessments which are completed on all candidates
throughout the program that specifically evaluate candidate proficiency with technology
“in the field”. Furthermore, the EPP includes specific items on the Exit Evaluation for
Program Completers (Evidence #45: Item #10) and the Survey to Elementary/MGE/Sec
Principals (Evidence #8: Items #9, 10, 11) to regularly monitor the impact of the EPP
technology courses on teaching effectiveness.
3. Verification of the sequence of diverse experiences outlined in the Field Experiences
Handbook, along with a more detailed examination of EPP MOUs, will be provided
during the onsite visit.
4. The subcommittee on Clinical Educator Quality does not currently evaluate and/or
select teachers for lower level clinical field experiences and practica. Due to the specific
nature of the courses linked to these practica, it is the EPP faculty who make these
determinations based on the objectives and goals of the field experience as it is tied to the
education course. The EPP typically uses its clinical faculty to make placement
recommendations, since these individuals have been in the local schools most recently
and are therefore more knowledgeable about the quality of various teachers. The EPP does
evaluate all clinical educators at the conclusion of each semester by asking candidates to
complete a survey about the teacher who hosted them for the field experience. These
surveys can be reviewed during the onsite visit.
Standard 2 Task 1b: (From the EPP Technology Coordinator)
The initial goals of the iPad Project were established before the School of Education was
asked to be involved. The iPad and MacBook computers were purchased for Montlieu
through a business partnership of which HPU was a part. All of the training for teachers
on the use of the iPads and MacBooks was provided by HPU’s IT Staff and through
Apple Education. Our IT Department provided the setup of the wireless network in the
school. The goals of the iPad Project were to increase student test scores in all areas,
improve student attendance, and decrease behavior problems.
The School of Education was asked to participate in the partnership and provide
professional development for teachers and direct services to students in grades K-5. The
partnership was tied to education coursework at both the undergraduate and graduate
level. Each year, the technology coordinator meets with the Principal to determine a
focus for the year.
Montlieu Academy of Technology has served primarily as the placement for candidates
in the 5th Year B.A. to M.Ed. Literacy Program who are enrolled in EDU 5010: Advanced
Instructional Technology for the 21st Century. The project for these candidates involves
using different iPad apps over the course of 8 weeks. The candidates choose their area
of focus and the apps, design a project around them that includes lessons and activities,
conduct the lessons, reflect on the success of each in a blog, and present a final
synopsis in the form of a research proposal.
Montlieu Academy of Technology is also currently used as the practicum placement for
B.A. to M.Ed. STEM candidates. This project involves assigning K-5 students to each
of our candidates (two students per candidate). The candidates design a series of
lessons on the theme selected by Montlieu (this year, vocabulary with LEGO Build
to Express). Each week, candidates meet with the students for an hour to conduct the
lessons; they then reflect on the lessons and on how the students responded. At the end,
there is an additional session that lasts for three hours. Each EPP candidate’s work –
lessons, reflections, and final products – is documented on a wiki site. These can be viewed
during the onsite visit.
Standard 2 Task 1c:
1. The MOUs developed through partnerships with P-12 schools are monitored annually
through a series of assessments including candidate practicum evaluations. Depending
on the nature and extent of the MOU, some partnerships are formalized through
institutional and school district IRB with extensive evaluation components. Partnerships
that are developed as part of the university’s Service Learning Program are evaluated
through that office as well. In its revised Tenure and Promotion Guidelines, the EPP has
chosen to include IHE/LEA School Partnerships with substantive and well-defined
program evaluation as an approved activity for “Faculty Scholarship”. Review of MOU
agreements and evaluation data, along with interviews with MOU stakeholders, will be
arranged during the onsite visit.
2. The EPP requests more information on the 89% figure cited. What is its source?
3. At the present time, the Subcommittee on Clinical Educator Quality assists only in the
selection and evaluation of clinical faculty who serve as cooperating teachers, using data
collected by university supervisors and teacher candidates (see Evidence # 37). Clinical
faculty who supervise candidates throughout the remainder of the program are evaluated
by the EPP using the Candidate Field Experience Feedback Form (see Evidence # 72).
4. All P-12 teachers who work with the EPP to provide supervision and mentoring of
candidates at all levels are referred to as “clinical faculty”. The EPP designates those
supervising student teachers (Internship II) as “cooperating teachers” because their
responsibilities are more complex and evaluations of candidates are more intense.
5. With regard to clinical field experiences, verification will be provided to support that
all candidates complete EDU 2110, 3110 and 4008/09/10 and complete technology
assignments which are intentionally sequenced in their progression to lead the candidate
from the development of knowledge in educational technology, to application of
technology in the role of the classroom teacher, to the integration of technology into
lesson planning and assessment. The EPP utilizes a model of instruction that integrates
technology courses into methods “blocks” for all candidates. One of the courses in the
“block” is the required Internship-I, and candidate wikis highlighting assignments such as
the Progress Monitoring Project will be available for review during the onsite visit.
Additionally, all candidates are required to teach one lesson in the clinical field setting to
which they have been assigned that “infuses” technology during the implementation of
the Literacy-Infused Curriculum Project. The university supervisor evaluates each
candidate’s technology usage which is part of the Literacy-Infused Curriculum Project
Rubric (see Technology Items #3a.1, 3c.1, 3d.1, 4d.1).
6. The EPP normally provides cooperating teachers with final evaluations when the
stipend ($100) from the university is sent to the cooperating teacher. The cooperating
teacher who hosted the intern receives the check, a thank-you note from the Associate
Dean, and the results of the candidate’s and university supervisor’s feedback concerning
the internship experience. The evaluations completed by the principal during the
semester that precedes Internship II have been kept confidential to date; notably, every
cooperating teacher in Internship I has so far been recommended by the principal to
continue with the candidate into Internship II.
7. Apart from the Student Teaching Orientation, additional resources for Cooperating
Teachers are available on the School of Education website via the Cooperating Teacher
Resources link (on the main page for the School of Education); these include documents
from orientation, assessment forms, and School of Education policies regarding student
teaching.
8. The data included in the Candidate Performance in the Field (see Evidence #65) is
gathered for all candidates enrolled in EDU 1200: Introduction to Teaching, EDU 2100:
Nature of the Learner, and EDU 3100: Collaboration in the General Education
Classroom (freshman, sophomore and junior/senior level courses). Data reported in
Evidence #65 is disaggregated by licensure area. The Candidate Performance in the
Field is not used during Internship II (Student Teaching) as the EPP has developed more
in-depth assessments to use at this advanced level that allow for greater item analysis and
intervention, when needed.
9. The EPP also provides Professional Development Opportunities for Cooperating
Teachers. Two sessions have been scheduled for the spring of 2016 including 1/28:
“Knowing Your Role as a Cooperating Teacher” and 2/18: “Coaching Strategies 101”
(see document titled “Professional Development 2c.9”). Cooperating Teachers also
attend a final seminar and dinner at the conclusion of the semester with their own student
teacher, faculty in the School of Education and all university supervisors. At this event
outstanding cooperating teachers are recognized and often serve as keynote speakers at
the event (verification of this will be provided during the onsite visit).
Standard 3: Candidate Quality, Recruitment and Selectivity
Standard 3 Task 1a:
1. Since the submission of the SSR in July 2015, the EPP has worked with the Diversity
Subcommittee to implement the activities that were outlined in the plan for fall 2015.
The EPP has provided the requested update on progress made thus far in the
implementation of the Diversity Recruitment Plan (see document titled “Recruitment
Plan Progress 3a.1”).
2. Candidate performance on the new Pearson tests that replaced the Praxis II tests was
provided in the SSR (see Evidence #14; the last table includes the data on candidates
taking the Pearson test through July 2015). Additional candidates and computed pass
rates for the Pearson tests and Praxis II appear on the EPP website (see Performance Data
under the Performance of Graduates link).
Standard 3 Task 1c:
1. At the recommendation of the CAEP review team, the EPP has reconvened the
Diversity Recruitment Subcommittee to address a five- to seven-year set of targets
rather than a three-year set (see Formative Feedback Report, pg. 11). A draft of this
addendum to the original report as it appeared in the SSR will be available for review
during the onsite visit.
2. Interviews with EPP candidates will be arranged during the onsite visit.
3. Please see the Selected Improvement Plan (Goal 5) for detailed Goals/Objectives
and EPP strategies to improve the Math content of Elementary Education majors (those
taking the math Pearson test). The new Pearson test was implemented in North Carolina
in July 2014 and therefore the EPP will use the Pearson test as a means of assessing the
impact of the strategies being implemented in the SIP. The newest data available from
the Math Skills Lab shows the significance of using “Gaming Strategies” to enhance the
math content of elementary education majors as demonstrated on the Pearson test (see
document titled “2015 Analysis of Gaming Strategies for Pearson GC Math Scores
3c.3”).
Standard 4: Program Impact
General Response (To Narrative and Questions Raised):
To demonstrate that the EPP has met Standard 4, the EPP provided several measures
offering evidence that our graduates are indeed successful when they exit the School of
Education, particularly since value-added data is not sufficient at this time.
The EPP included information on the “8-annual reporting measures” as these data have
been identified by CAEP as being related to the provider’s evidence of meeting Standard
4 (see CAEP Accreditation Handbook, January 2015, pg. 20). Loan Default Rate is
included as one of these 8 measures along with Employment Rate, Employer Satisfaction,
Retention, etc. The EPP assumed that Loan Default Rate percentages would indicate
the stability of program completers in getting and keeping a teaching position (one
would assume that an individual with a stable income would be less likely to default
on a loan); therefore, the EPP included this “evidence” as part of the Standard 4
submission.
To date, the EPP has used the data gathered from the 8-reporting measures, employer and
program completer survey data, the data provided by the N.C. State Department of Public
Instruction on “Teacher Effectiveness” (see Evidences #10, 14, 16, 85) and a listing of
graduate awards and distinctions as evidence of program impact. Strategies currently
being piloted include the TRIPOD Survey, the use of B.A. to M.Ed. programs to follow
candidates upon graduation into their P-12 classrooms, and the study of the predictive
impact on subsequent teaching of additional opportunities such as participation in
Undergraduate Research, Service Learning, the 5th Year B.A. to M.Ed., and the Mentor
Teacher Support Program. Once the EPP clarifies the extent to which these program “extras” meaningfully
impact the success of our graduates in P-12 classrooms, these data will be used to make
programmatic changes such as requiring Undergraduate Research participation during
each candidate’s time of enrollment in the EPP. Doctoral candidates who are currently
employed in key P-12 leadership roles within N.C. school districts have been utilized to
assist the EPP in “objectively” studying which variables have the highest predictive
validity in measuring the effectiveness of candidates during student teaching and
subsequent impact on P-12 student learning.
Standard 4 Task 1a:
1-3. The Mentor Teacher Program uses a variety of strategies to reach out to first- and
second-year graduates. Funding for the program comes from donations by education
alumni through the Office of Institutional Advancement. One faculty member serves as
the coordinator of the program.
The retired teachers who provide mentoring do so on a voluntary basis although the EPP
has worked to ensure that mentors come from a variety of P-12 teaching areas. All
candidates at exit from the program are invited to participate in the Mentor Teacher
Program. Participation has been greatest for those graduates who have chosen to remain
in the Piedmont Triad area to teach. Efforts are being made to study more effective ways
to provide “virtual” mentoring opportunities for graduates who leave North Carolina
(79% of High Point University’s student body is from outside the state). Retired teachers
convene with the program’s coordinator periodically to evaluate the range of services
being provided. Based on the feedback from mentors, seminars are offered in areas that
coincide with new teachers’ greatest needs. These sessions include a formalized seminar,
light dinner, fellowship with other graduates and a grab bag of educational “goodies” as a
token for attendance. For the spring of 2016, four seminars have been planned. The first
three (1/27, 2/25, 4/7) will target current HPU program completers and recent graduates,
with an invitation extended to principals in EPP partner schools to send any first- or
second-year teacher. The final seminar (4/26) will target EPP candidates who are exiting
the program and will include on-site mock interviews with ten principals, dinner, and a
seminar (see document titled “New Teacher Support Seminar Series 4a.1”).
Standard 4 Task 1b:
1. The EPP has documented the study of certain strategies (Service Learning,
Undergraduate Research and participation in B.A. to M.Ed. programs) to determine
impact on progress toward SIP goals of improving Content Knowledge. While it is
correct that the EPP currently does not require candidates in any of the licensure
programs to participate in these institutional programs, it is felt that the data collected on
candidates who do choose to participate will be helpful in recommending future revisions
8 in program requirements if it can be determined that such participation does lead to
greater acquisition of content knowledge (InTASC #4) and the candidate’s ability to
apply and integrate content knowledge across disciplines (InTASC). The EPP is also
interested in looking at how participation in these types of programs also enhances the
“professional dispositions” associated with InTASC Standards #4 and #5, specifically
the candidate’s demonstration that he/she “values knowledge” (InTASC #4(o), 4(p);
InTASC #5(r)). There is already some data to support that graduates who previously
participated in the university URCW program have made a more significant impact on
P-12 learners, as noted by the awards and distinctions these individuals have received.
Standard 4 Task 1c:
1. Data collection from mentors has been both anecdotal, compiled by the coordinator
of the program from discussions during meetings, and more formal, through data
collected on surveys. Frequency counts of the items mentors identify as the greatest
areas of need have been useful in planning services and developing seminar topics for
first- and second-year graduates.
Standard 4 Task 2a:
1. The Tripod Survey was piloted last spring with 10 candidates who were enrolled in
Internship II (student teaching). The EPP does not have adequate data at this time to use
the TRIPOD results for programmatic changes although some interesting findings were
noted (see Evidence # 38). It was noted that the category of classroom management
(Control), was consistently rated lower by P-12 students who completed surveys in all ten
classrooms (similar ratings were obtained for the overall CAEP normed group).
Additionally, it was noted that the ratings of the ten student teachers by the EPP (High-Average, Average, Marginal) seemed generally consistent with the ratings of P-12
students for these 10 candidates. However, with the limited sample data, no conclusions
can be drawn at this time.
2. The ten students selected for the pilot included candidates who were identified as
Marginal, Average and High-Average in their performance across all Gateway indicators.
An effort was also made to include candidates at the Elementary, Middle and Secondary
levels. They were not aware of how they were selected. Cooperating teachers and
principals had to agree to allow class time to be used for surveying P-12 students.
Standard 4 Task 2b:
1. The marginal candidate (Evidence # 47) was on an intervention plan during
Internship II and therefore all policies and procedures that apply were carried out with
this individual. She did complete the program successfully and has since become
employed. Interestingly, her parent emailed the president of HPU following graduation
to express appreciation for the mentoring the candidate received during her time at High
Point University (see document titled “E-Mail From Parent 4b.1”). The EPP has
included her in the follow-up TRIPOD study completed in fall 2015 in an effort to assess
how this candidate is performing in her first year of teaching. These results will be
informative, we believe. The EPP will provide more information about this particular
candidate during the onsite visit in March.
Standard 4 Task 2c:
1. The EPP is using the TRIPOD survey with all student teachers this spring 2016
semester (N=43). The EPP has also received the fall 2015 survey data for student
teachers and the first follow-up surveys on our recent graduates who completed the
program in May 2015. A comparison of the student teacher TRIPOD survey for one
candidate (Spring 2015) and her follow-up TRIPOD survey as a beginning teacher (Fall
2015) has been uploaded as a sample (see document titled TRIPOD Follow-up Survey
4c.1). It should be noted that this candidate was considered “High-Average” by EPP
faculty and also currently participates in the B.A. to M.Ed. program in Elementary
Literacy.
Standard 4 Task 3a:
1. The following components will be included in the draft of a Value-Added Data Plan
that will be available for review during the onsite visit:
• N.C. Growth Data by Institution (available by March 2016), compiled by the N.C.
State Department of Public Instruction (in progress; not yet approved by the State
Board of Education for release)
• TRIPOD Survey data (all student teacher candidates from the EPP will be
included in the spring of 2016)
• Employer/Program Completer Survey Data
• Mentor Teacher Support Program Data (Impact)
• School of Education Predictive Validity Study (expansion—see Evidence #76)
• Use of BA to M.Ed. Candidates to assess P-12 student learning
• Continued social networking to gather value-added data reports from program
completers
Standard 4 Task 3b:
1. Within the SSR the EPP has documented the use of certain strategies (Service
Learning, Undergraduate Research and participation in B.A. to M.Ed. programs) to
achieve progress on SIP goals of improving Content Knowledge. This is indeed correct:
the EPP currently does not require candidates in any of the licensure programs to
participate in these institutional programs. It is felt that the data collected on candidates
who do choose to participate in these programs will be used to make these programmatic
changes if it is determined that such participation does lead to significantly greater
acquisition of content knowledge (InTASC #4) and the candidate’s ability to apply and
integrate content knowledge across disciplines (InTASC). The EPP is also interested in
looking at how participation in these types of programs also enhances the “professional
dispositions” associated with InTASC Standards #4 and #5. Specifically, the candidate’s
demonstration that he/she “values knowledge” (InTASC #4(o), 4(p); InTASC #5(r)). Data
collection will be ongoing.
Standard 4 Task 3c:
1-3. See above. As per the email provided for documentation purposes (see document
titled “Email from NCDPI 4c.1”), North Carolina EPP programs were to have access to
“growth data” by December 2015. At the present time the request is in the process of
being approved by the N.C. State Board of Education and should be available for the
March onsite visit.
Standard 5: Provider Quality, Continuous Improvement and Capacity
Standard 5 Task 1a:
1. The current versions of the EPP Conceptual Framework for the initial and advanced
programs of study as well as the framework for the Masters and Doctoral degrees in
Educational Leadership appear on the School of Education website in all handbooks for
the various program areas.
2. Standards alignment matrices for all programs of study appear in the two documents
titled “Curriculum Matrices Table 1c.1” and “Curriculum Matrices 1c.1”.
Standard 5 Task 1b:
1. In August 2006, the N.C. State Board of Education adopted a new guiding mission
for North Carolina Public Schools which states that “every student graduates from high
school, globally competitive for work and postsecondary education and prepared for life
in the 21st century”. In response, the reforms which resulted included a significant shift
to a revised set of Professional Teaching Standards which was adopted by the North
Carolina Professional Teaching Standards Commission in 2007. A period of
“revisioning” of teacher preparation programs and subsequent “re-approval” by the
NCSDPI of all existing licensure programs of study occurred in 2010 and provides
authorization for all programs until 2017.
In response to these reforms, the EPP realigned its existing programs of study to the
newly adopted N.C. Professional Teaching Standards and the mission of N.C. public
schools. The School of Education, under the governance of its Teacher Education
Council, reviewed and revised the Conceptual Frameworks driving the initial and
advanced teacher preparation programs. The process used for revision was largely to
review the research relevant to best practice in today’s 21st century classroom and engage
faculty in discussions about how these principles could best be applied to the preparation
of candidates with regard to their content and pedagogical knowledge, professional
dispositions and acquisition of 21st century skills such as leadership and collaboration.
Two faculty with specific expertise in developmental psychology provided leadership in
outlining how each program could be sequenced to allow for critical progressions in
development. Further, faculty in the School of Education were involved in several
NCSDPI Ad Hoc Committees to assist in the state's overall revisioning of IHE
curriculum and "Electronic Evidences" (major hallmark projects used to validate
candidate licensure eligibility). EPP faculty also served as program reviewers in the
NCSDPI Blueprint and electronic evidence approval process for other North Carolina
Teacher Preparation programs in the summer of 2010. Therefore, the EPP engaged in
the revision process from 2009-2010 with the revised framework, grounded in research,
approved by the Teacher Education Council in March of 2011. This lengthy process was
also slowed because, at this same time, High Point University revised its curriculum,
moving from three-credit to four-credit courses. The EPP also had to restructure and
resubmit all individual education courses and degree programs to the university’s
Educational Policies Committee for approval in the fall of 2010.
Documentation and minutes verifying the above series of activities can be viewed during
the on-site visit.
Standard 5 Task 1c:
1. See the CAEP Accreditation site on the School of Education website.
2. Two samples of a Professional Organizations Program Alignment (Council for
Exceptional Children: CEC) for the Special Education: General Curriculum degree and
the NSTA, NCSS, NCTM, IRA Professional Standards Alignment for the Elementary
Education degree programs can be found by reviewing the documents titled “CEC
Standard Alignment Sped 5c.2” and “NCTM NCSS NSTA IRA Alignment Elem
5c.2”.
Standard 5 Task 2a:
1-4. All items will be available during the onsite visit. Interviews with TEC members
and EPP faculty, Provost, CFO, and University President will be scheduled for the March
visit.
Standard 5 Task 2c:
1. The Candidate Performance in the Field evaluation (see Evidence #65) has not
provided enough differentiation in scores to show growth for candidates progressing in
the program. The EPP has determined that this may be more a matter of how clinical
educators are assessing candidates rather than flaws with the instrument itself. Even
though some instruments have been designed by the EPP to measure growth in the
program, candidates in earlier coursework often receive higher ratings by teachers due to
what appears to be a reluctance to assign a “lower rating” that could impact the
candidate’s grade. This same tendency has been noted when EPP candidates are assessed
at midterm during student teaching (Internship II). Efforts have been made to address
this by computing reliability statistics between the cooperating teacher and the university
supervisor on the Candidate Internship Evaluation Form for each of the items assessed
(see document titled "Reliability: Candidate Internship Evaluation-Midterm 5c.1").
Results indicate consistent agreement above 80% (ranging from 82% to 97%) between
cooperating teachers and university faculty supervising the internship in all licensure
areas. These ratings have been used to engage clinical educators in a
discussion of where candidates are expected to be at the mid-point of the student teaching
experience (assuring them that the EPP does not expect candidates to be at
“accomplished” after only half of the internship), particularly in the areas that appear to
have less agreement such as Classroom Climate and Professionalism. The EPP has
included these types of discussions in orientation sessions with clinical educators and
revised the procedures for completing the evaluations in earlier courses to determine if
this has an impact on scores for first, second, third and fourth year candidates in the
program.
2. The CAEP Evidence Guide was published in January 2015 and, as an Early Adopter,
the EPP was asked to submit assessments and rubrics during this same time period. The
EPP has since reviewed the document and did address the reliability and validity of ten
instruments uploaded to AIMS in February 2015. In this Addendum, the document titled
“Summary of HPU Assessment Alignment to CAEP 1c.1” supports the “purpose” of
each assessment by indicating which “gateway” the data is used to inform. Additionally,
the table also indicates whether each assessment listed is a formative or summative tool
and supports that the EPP uses a balance of the two in making candidate and
programmatic interpretations of the data collected.
3. The timing of May Marathon permits discussion of the EPP candidate/programmatic
data (with the exception of TRIPOD and Principal Survey data, which are usually not
received by the EPP until the summer), the Junior Year Review, and updates on the progress
made toward EPP and departmental yearly goals. Faculty input regarding goal setting for
the upcoming academic year is also completed each May. A full School of Education
faculty meeting is held in August prior to the first Teacher Education Council meeting.
Faculty are then able to review the data gathered by the EPP during the summer, the
Dean’s End-of-Year Report and the IHE Performance Report (which is submitted to the
NCSDPI in June). The August Teacher Education Council meeting provides a more
formal presentation on the preceding information and allows for input from P-12
educators and candidates who represent the various EPP programs. New goals that
require curriculum, programmatic or policy revisions become the work of the Teacher
Education Council for that year and must be approved prior to implementation by the
EPP. Minutes of Teacher Education Council and May Marathon meetings will be
available for review during the onsite visit.
4. The LEA/IHE Certification of Capacity is a state-wide adopted exit evaluation
developed by McREL. As a verification of licensure eligibility it is adequate, but the
EPP has found that it does not provide enough information to be useful for other
purposes; the EPP therefore developed a more comprehensive assessment to be used
during the final evaluation of Internship I that could be repeated during Internship II at
the mid-point of the semester. The transition to the new Pearson tests in North Carolina
for Elementary and Special Education candidates has been a challenge.
5. The Doctoral Predictive Validity study will be expanded in 2015-2016 to include
TRIPOD survey results and surveys from employers in an effort to better predict which
assessment tools have the highest predictive validity with teaching effectiveness.
6. CAEP 8 data are housed on the SOE website under the link called Graduate
Performance Data. These data are also distributed to the Office of Admissions and used
for marketing purposes during Open House and Freshmen Orientation. The Dean is
required to submit an annual End-of-Year Report to the Provost each June which includes
trends in enrollment data, retention, faculty productivity, progress on yearly goals and
proposed goals for the upcoming academic year.
Standard 5 Task 3a:
1-3. Interviews with Program Completers and Employers will be scheduled for the
March visit. Verification of assessment instruments specifically measuring P-12 student
learning will be provided.
Standard 5 Task 3c:
1-2. See discussions above.
Standard 5 Task 4a:
1. Teacher Education Council members include faculty from the School of Education,
liaison faculty from the College of Arts and Sciences who offer licensure programs in the
School of Education, a Graduate School Representative, an undergraduate candidate, a
B.A. to M.Ed. student representative and a candidate representing the doctoral students
enrolled in the Ed.D. program. P-12 educators are selected who largely represent key
stakeholders in partner schools or those in executive leadership roles in surrounding
districts such as the Director of Human Resources for Thomasville City Schools, the
Initially Licensed Teacher Coordinator (ILT) for Davidson County Schools, and the
Title I Coordinator for Guilford County Schools, as well as school administrators at the
elementary, middle, and secondary levels from Guilford County Schools. Efforts are also
made to include participants
Documentation and minutes verifying the above series of activities can be viewed during
the on-site visit. Interview with current TEC members will be arranged for CAEP
program reviewers during the onsite visit.
Standard 5 Task 4c:
1. As previously noted (see Standard 2), the 34 members of the Teacher Education
Council are frequently organized into subcommittees by the Dean (chair of TEC) for
purposes of studying a particular curriculum issue, policy revision or initiative.
Examples of this include the subcommittee selected to draft the EPP Diversity
Recruitment Plan, the subcommittee charged with reviewing and revising the EPP
selection criteria for cooperating teachers and the subcommittee appointed by the Dean to
implement a plan for phasing in the new Pearson and Praxis II testing requirements for
program completers in North Carolina. This subcommittee included faculty in the School
of Education (Elementary Literacy/Math) and select faculty in the College of Arts and
Sciences (Department of Mathematics and History) to assist in reviewing the institution’s
general education requirements and proposing a plan for revising the timeframe
recommended by the School of Education for completing the required specialty and
multi-subject tests.
2. See above. Ed.D. Advisory Board Selection includes area superintendents and
educational leaders who hold key executive leadership roles in North Carolina school
districts. The board also includes the Provost, a representative from the Norcross
Graduate School and doctoral candidate representation. The graduate candidate who
serves on the Ed.D. Advisory Board also serves as the representative on the University
Graduate Council for continuity.
3. Evidence of collaborative initiatives with stakeholders in P-12 schools with regard to
program evaluation will be provided during the onsite visit. Interviews with these key
stakeholders can be arranged during the onsite visit.
Standard 5 Task 5a:
1-3. Interviews with regard to the development, implementation, and evaluation of the
SIP will be arranged during the onsite visit.
Standard 5 Task 5c:
The Selected Improvement Plan (SIP) is considered an ongoing project, as many of the
policy changes (raising the required GPA for admission to 3.0 and requiring candidates to
maintain grades of C or higher in all education and supporting courses) were phased in
with incoming freshmen in the academic year following approval of the policy revision
by the Teacher Education Council. The EPP therefore will not have data from a group of
candidates to evaluate the full impact of these new policies until the end of the 2015-2016
AY. Furthermore, to assess the impact of the SIP policy changes on performance of
graduates and P-12 students will require an additional two years of data collection
beyond 2016.
The Formative Feedback Report also questioned the EPP's documented use of certain
strategies (Service Learning, Undergraduate Research, and participation in B.A. to
M.Ed. programs) to achieve progress on SIP goals of improving Content Knowledge. It
is indeed correct that the EPP currently does not require candidates in any of the
licensure programs to participate in these institutional programs. Data collected on
candidates who do choose to participate in these programs will be used to inform
programmatic changes if it is determined that such participation leads to significantly
greater acquisition of content knowledge (InTASC #4) and a greater ability to apply and
integrate content knowledge across disciplines (InTASC #5). The EPP is also interested
in how participation in these types of programs enhances the "professional dispositions"
associated with InTASC Standards #4 and #5, specifically the candidate's demonstration
that he/she "values knowledge" (InTASC #4(o), 4(p), and #5(r)). Data collection will be
ongoing.
Standard 5 Task 7a-c:
See the table in Evidence #31 outlining how operations are evaluated, who is involved,
and how the results are used. Interviews with EPP personnel associated with operations,
the SOE operational budget, library holdings, etc., for three years will be arranged during
the onsite visit.
Diversity
1c.1
Please see the document titled "HPU Assessment Alignment 1c.2" for a listing of critical
assessments aligned to specific diversity indicators.
Technology
1c.1
Please see the document titled "HPU Assessment Alignment 1c.2" for a listing of critical
assessments aligned to specific technology indicators and Standard 1.5.