ABHES
PROGRAM EFFECTIVENESS PLAN
(PEP) MANUAL
Updated January 2012
TABLE OF CONTENTS

The Purpose of the Program Effectiveness Plan (PEP)
Developing, Implementing, and Monitoring the Program Effectiveness Plan
Format and Content Guidelines

Subsection 1 – Program Effectiveness Plan Content
   Title/Cover Page
   a. Student Population
   b. Program Objectives
   c. Program Retention Rates
   d. Program Job Placement Rates
   e. Credentialing Examination Participation Rates
   f. Credentialing Examination Pass Rates
   g. Program Assessment
   h. Student, Clinical Extern Affiliate, Graduate and Employer Satisfaction Surveys
   i. Faculty Professional Growth and In-Service Activities

Subsection 2 – Outcome Assessment
   i. Historical Outcomes
   ii. Types and Uses of Assessment Data
   iii. Initial Baseline Rates and Measurement of Results
   iv. Summary and Analysis of Data Collected
   v. How Data Is Used to Improve the Educational Process
   vi. Goal Adjustment
   vii. Activities Undertaken to Meet Future Goals

Format Examples
Conclusion
PURPOSE OF THE
PROGRAM EFFECTIVENESS PLAN
The Program Effectiveness Plan (PEP) is an internal quality assessment tool that
evaluates each program within an educational institution by
• establishing and documenting specific goals,
• gathering outcome data relevant to these goals,
• analyzing outcomes in relation to benchmarks and the program's short- and long-term objectives, and
• designing strategies to improve program performance.
The program effectiveness assessment is expected to result in the achievement and
maintenance of outcomes. For each of the outcomes identified by a program, the
program establishes the level of performance that serves as a benchmark for
acceptable program performance. These benchmarks meet or exceed requirements
established by any applicable state or federal authority and by ABHES policies and/or
standards.
Program success is based on student achievement in relation to its mission, including
but not limited to consideration of the following:
• Retention rates
• Participation in and results of licensing and certification examinations
• Graduation rates
• Job placement rates
• Program assessment
• Survey responses from students, clinical externship sites, graduates, and employers
• Faculty professional growth and in-service activities
Developing and using the Program Effectiveness Plan (PEP) should fulfill several
purposes, including:
1. Assisting the institution in achieving internal effectiveness through establishing goals
for short- and long-term successes.
Further, criteria for measuring the
accomplishment of these goals can be defined, allowing the institution to focus its
plans and activities on the critical processes needed for effectiveness. Once
defined, these goals and criteria should then be used to unify administrative and
educational activities, which can achieve a high degree of commitment and
common direction among all employees.
2. Assessing progress and the need for change and continuously reviewing the process
to help the institution make timely changes based upon valid information to achieve
even greater effectiveness.
3. Communicating key information about the institution’s goals, its degree of
effectiveness, and how it plans to enhance overall quality to external publics such
as graduates, employers, and community leaders. Information, which depicts the
most important elements of the institution’s operation, communicates clearly and
accurately to external publics how well the institution is meeting the needs of
students and providing quality-learning experiences.
4. Measuring how the PEP meets the expectations and requirements of approving or
accrediting organizations, including state boards and ABHES, to demonstrate
regulatory compliance.
A document which defines institutional goals and
educational processes is a primary focus of most accrediting agencies as they
measure overall effectiveness and the quality of programs and services provided.
All goals and activities are key indicators of program effectiveness and should relate to
the institution’s mission to demonstrate mission achievement and continuous
improvement, as the institution’s mission is the impetus and barometer of the program’s
effectiveness. The PEP requires an institution to look at its past, present, future, and
strategies and to continuously ask:
Where have we been?
This data becomes the baseline for gauging and demonstrating improvements.

Where are we now?
Current data demonstrates how you will measure change from the baseline data, using the comparison to identify changes needed.

Where do we want to go?
A look toward the future for goals to improve or enhance processes and/or programs.

How do we get there?
Processes used to achieve the new direction based upon the input of all relevant constituents.
DEVELOPING, IMPLEMENTING, AND MONITORING
THE PROGRAM EFFECTIVENESS PLAN
(Standards and Examples)
The standards addressing the Program Effectiveness Plan may be found in the ACCREDITATION MANUAL, 17th Edition, Chapter V, Section I, pages 72-76, as published by the ACCREDITING BUREAU OF HEALTH EDUCATION SCHOOLS (ABHES). The standards outline the ABHES requirements for the development, implementation, and maintenance of the PEP, including the outcomes assessment requirements (Section I, Subsection 2), and give a detailed description and explanation of the meaning and implications of the required components of the PEP. This manual provides suggestions and examples for addressing each PEP standard.
Developing a PEP requires that each program collect, maintain, and use information reflecting the areas outlined in Chapter V, Section I of the ABHES Accreditation Manual. The data should be analyzed for a specific 12-month period as defined by the institution and be used as the foundation for making comparisons across future time periods. Many institutions perform their analysis in conjunction with their fiscal/calendar year or with the ABHES annual reporting period (July 1 – June 30), since the majority of the required PEP information is also required on the ABHES Annual Report. Regardless of the selected timeframe, the data is to be updated at least annually.
The PEP is unique to the program and institution and the institution must evidence its efforts
made to ensure continuous improvement. The process requires that the institution: (1)
systematically collect data and information on each of the educational outcomes areas
and achievement of its occupational objectives at least annually; (2) complete an
analysis of the data and information including, but not limited to, performing a
comparison with previous findings; and (3) identify what changes in educational
operations or activities it will make based on the analysis.
Steps in preparing and managing the PEP are similar to those suggested for preparing
an institution’s self-study. Structured organization is essential. Although the exact
organizational procedures will vary from institution to institution, the following
suggestions may be helpful:
• The program faculty (full time and part time) assisted by the president/director,
director of education, and a representative from admissions and placement are
the key individuals acting as a team to initiate, guide, and direct the
development and implementation of the PEP. It is their commitment to the PEP
and empowerment of the team to oversee these activities that will ensure
continuous improvement and the ultimate success of the planning process.
• The process is a collective effort that should involve all faculty, administrators,
staff, and advisory board members. Consideration should also be given to
actively recruiting student, graduate, and employer representatives in the
process. It is important that all members of the administration, faculty, governing
board, and student body understand and appreciate the importance of the PEP
and its value to the institution.
• Establish subcommittees to prepare specific PEP sections. These subcommittees should be effectively utilized to complete the various tasks in all facets of the PEP, including development, implementation, and evaluation. The selection of subcommittee members should depend on each member's responsibilities. Include the names of those responsible for implementing and monitoring the PEP.
• Establish baseline rates developed through analyzing the results of past annual
retention and placement rates, which will be used in the analysis process. The
data collected each year on the ABHES Annual Report includes retention and
placement percentage; therefore, it is a valuable part of the PEP. Each program
should maintain these annual reports with supporting documentation, for at least
three years so as to provide historical data from which goals may be set. Be
specific in the data to be collected and collect data that will clearly evidence
the level of educational outcomes and satisfaction experienced by current
students, graduates, and employers.
• The PEP may include any other elements determined to be important measures
of program effectiveness such as a review of default rates in the student loan
programs under Title IV of the Higher Education Act, based on the most recent
data provided by the Secretary of Education. These findings may be coupled
with student retention and placement rates to determine what correlation, if any,
can be determined. Any correlation identified should be reviewed for correction.
• Because the PEP focuses on overall program improvement, it is a work in progress
as there are many potential elements of the institution’s daily operations, which
are relevant and important to improving effectiveness. Each program is
encouraged to collect a variety of statistical data, which will assist it in improving
the educational outcomes.
• The PEP team and subcommittees should adopt and implement a realistic and
enforceable periodic schedule throughout the year to review the PEP and
document progress through minutes of all meetings where the PEP is discussed.
The meeting minutes should show the progress to date, a short summary of the
data analyzed, changes anticipated, and continuation or new direction the
institution is taking to improve the educational processes. Minor revisions to goals
may be made during the monitoring of the PEP; however, substantial revisions
should only be made at the annual review unless there is a major change in the
institution’s leadership and/or mission. These periodic meetings will ensure that
the PEP is utilized and evaluated on a continuing basis.
PROGRAM EFFECTIVENESS PLAN
FORMAT AND CONTENT GUIDELINES
While each program must address each element required of the Program Effectiveness
Plan (PEP), the plan may be a comprehensive one which collectively represents all
programs within the institution, or may be individual plans for each distinct program. The
following section is to serve as a guide as it contains elements that should be
incorporated into the PEP. Each standard is given, followed by examples of how an
institution might demonstrate established goals and compliance.
Title/Cover Page to include the following:
ABHES I.D. CODE
Name of Institution
Address
City
Name of Program
Program Director
Credential awarded
Portion of program offered via distance learning
Length of program (clock hours, semester/quarter credits, weeks, etc.)
12 Month period covered by the plan (e.g., July 1, 20?? through June 30, 20??)
V.I.1. A program has an established documented plan for assessing its effectiveness as
defined by specific outcomes.
The Program Effectiveness Plan includes all of the following standards, clearly stated:
STANDARD
a. student population
A description of the characteristics of the student population is included in the Plan.
Student population demographics such as gender ratios, median age,
race/ethnicity, marital status, and socioeconomic descriptions should also be
included and identified by program if they differ from the overall institutional
demographics.
EXAMPLES:
There are many ways an institution may identify such information. The following
examples include narrative, listing and chart formats:
Narrative Format
The institution’s student population has doubled over the last three years and is
represented by a diversity of demographic characteristics. Approximately 80% of
the population is independent with an average annual income below $22,000, and
20% are dependent with an average annual family/household income of $40,000.
Male to female ratio is 39% to 61% respectively, and student ages range from 18 to
63. Recent business closings have resulted in an increase in the student population
of dislocated workers seeking retraining. The majority of students require some form
of financial assistance. The race/ethnicity composition is African American/Black
13%, American Indian/Alaskan Native 0.3%, Asian/Pacific Islander 0.7%, Hispanic
1.5%, Mexican American 0.5%, Caucasian/White 78.8%, and Undisclosed race 5.1%.
Listing Format
In the 2010-2011 Annual Report year, the student body consisted of approximately:
61% female, 39% male
60% attend day classes: 40% attend evening classes
83% earned an average or above grade in high school English
75% earned an average or above grade in high school math
9% English as a second language
58% held a high school diploma
12% held a GED
30% had prior postsecondary education
71% were first in family to receive postsecondary education
61% were employed
80% were independent with a household income of $22,000 or less
91% attended full-time classes and 9% part-time
36% were married
29% under age 25, 34% age 25-34, 26% age 35-44, 7% age 45-54, 4% age 55+
African American/Black 13%, American Indian/Alaskan Native 0.3%, Asian/Pacific Islander 0.7%, Hispanic 1.5%, Mexican American/Chicano 0.7%, Caucasian/White 78.8%, Undisclosed race 5%
Chart Format

PROGRAM                               Gender ratios    Median    Race/Ethnicity    Marital    Socio-Economics
                                      M     F          age       W    NW    U      status     Independent & <$22,000
Nursing Assistant
Dental Hygienist
Massage Therapist
Medical Assistant
Medical Billing & Coding Specialist
Patient Care Specialist
Pharmacy Technician
Phlebotomy Technician
INSTITUTION TOTALS                    39    69                    13   82    5
STANDARD
b. program objectives
Program objectives are consistent with the field of study and the credential offered
and include as an objective the comprehensive preparation of program graduates for
work in the career field.
Program characteristics of each currently offered program should include:
• Degree level,
• Program description,
• Program objectives, and
• Description of student outcomes, specifying the competencies students should possess upon conclusion of the program.
If an institution offers Medical Assistant, Nursing Assistant, and Surgical Technology
programs, then its PEP might present the following overview:
EXAMPLE:
The Medical Assistant academic associate’s degree program prepares the
student to become a multi-skilled allied health professional with diverse
duties in medical offices, clinics and health centers. The program includes a
balance of classroom, laboratory, and clinical experiences.
Objectives of the program are to:
• Prepare a knowledgeable entry-level employee with the technical skills and work habits necessary to perform effectively in various health-care related fields, including medical transcriptionist, medical billing specialist, medical office manager, and medical assistant.
• Provide clinical activities that include assisting the physician in patient care responsibilities by recording medical histories, taking vital signs, preparing the patient for examination, assisting the physician during patient examinations and surgical procedures, collecting and performing various laboratory tests, administering medications, performing diagnostic procedures such as EKGs and dressings, and providing patient education.
• Teach courses in anatomy, physiology, pharmacology, computer applications, clinical procedures, interpersonal skills, confidentiality, medical ethics, professional behavior, and patient interface, as well as basic office procedures, to ensure competency.

At the completion of the program, the student will be able to:
• Assume a wide range of responsibilities in a medical office or ambulatory care center.
• Communicate with patients to schedule appointments and receive and process payments.
• Sit for the credentialing examination.
The Nursing Assistant diploma program prepares the student to function under the
supervision of a physician and/or a registered nurse and to participate as a member of
a healthcare team in providing nursing care. The program includes classroom,
laboratory, and clinical patient care experiences.
Objectives of the program are to:
• Prepare a competent nurse assistant to function effectively in acute, long-term care, and ambulatory settings.
• Provide a collaborative learning environment in which the student will develop and apply principles of systematic reasoning through critical thinking.
• Guide the learner in the continuing process of personal and professional growth.

At the completion of the program, the student will be able to:
• Function in the delivery of care to clients.
• Communicate with clients, client families, and members of the healthcare team.
• Perform nursing skills applying critical thinking.
• Integrate ethical, professional, and legal responsibility and accountability into actions and decisions.
• Assume responsibility for personal and professional growth.
• Sit for the State certification board exam.
The Surgical Technology certificate program prepares the graduate to function as an
intraoperative team member under the direct supervision of a surgeon or registered
nurse. The graduate is prepared for this role through didactic, laboratory, and external
clinical experiences.
Objectives of the program are to:
• Prepare the graduate for a professional career.
• Prepare a competent surgical technologist to perform intraoperative first scrub duties.
• Guide the learner in the processes for certification and professional development.

At the completion of the program, the graduate will be able to:
• Effectively perform pre-, intra-, and post-operative duties
• Practice aseptic and sterile technique
• Practice all patient safety measures and act in an ethical manner
• Assume responsibility for personal and professional growth
• Sit for a national certification exam
Continue the same format for all programs offered at the institution.
STANDARD
c. program retention rate
At a minimum, an institution maintains the names of all enrollees by program, start date,
and graduation date. The method of calculation, using the reporting period July 1
through June 30, is as follows:
(EE + G) / (BE + NS + RE) = R%
EE = Ending Enrollment (as of June 30)
G = Graduates
BE = Beginning Enrollment (as of July 1)
NS = New Starts
RE = Re-Entries
R% = Retention Percentage
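As a rough illustration only, the calculation can be scripted; the enrollment figures below are hypothetical, and any spreadsheet applying the same formula works equally well.

    def retention_rate(ending_enrollment, graduates, beginning_enrollment, new_starts, re_entries):
        # Retention % = (EE + G) / (BE + NS + RE) x 100, per the formula above.
        return 100.0 * (ending_enrollment + graduates) / (beginning_enrollment + new_starts + re_entries)

    # Hypothetical program: 60 enrolled on June 30, 25 graduates,
    # 55 enrolled on July 1, 40 new starts, 5 re-entries.
    print(round(retention_rate(60, 25, 55, 40, 5), 1))  # 85.0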
Include the retention results for the last three annual reporting years as the baseline, if available, along with goals for the upcoming year. If an institution has developed long-term goals for retention, this information should also be included with status updates.
EXAMPLE: Retention rates for the past three years, taken from the institution's Annual Report:

                        2008-2009   2009-2010   2010-2011
Medical Assistant       67%         69%         70%
Nursing Assistant       64%         65%         67%
Surgical Technology     80%         81%         85%
To establish the goals for the next reporting period 2011-2012, an institution may choose
to average the three previous years for each program. However, in this example the
goal would be below the 70% benchmark in the medical assistant and nursing assistant
programs; therefore, this would not be a practical way to determine the next year’s
program goal.
A program may elect to establish its goal by an increase of a given percentage each year, such as five percent, or by determining the percentage increase from year to year over the three previous years. Note in the example that the Surgical Technology program increased retention 1% between 2008-2009 and 2009-2010 and then 4% between 2009-2010 and 2010-2011, so the average annual increase over those years is 2.5%. Using the averaged-increase method, a realistic 2012 goal might be 87.5%.
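A minimal sketch of this averaged-increase method, using the Surgical Technology rates from the table above:

    def projected_goal(rates):
        # Add the average year-over-year increase to the most recent rate.
        increases = [later - earlier for earlier, later in zip(rates, rates[1:])]
        return rates[-1] + sum(increases) / len(increases)

    print(projected_goal([80, 81, 85]))  # 87.5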
The program may also establish intermittent goals of a percentage increase from
month to month or an increase in relation to the same month or other predetermined
time periods in the previous year—e.g., a 1% increase from month to month or a 2%
increase in April 2012 over the April 2011 rate. Intermittent goals are advantageous as
they keep everyone on target throughout the year.
The chart below shows a comparison of the medical assisting, nurse assisting, and
surgical tech programs to overall retention. The next step is to develop improvement plans/strategies that will assist each of the three programs in reaching its projected 2012 retention rate. Surgical Technology is doing well, so
its goal may be to either increase by 2.5% as stated previously or maintain the 85%
retention rate. The Medical Assisting program is just at 70%, but did show a 1% increase
between 2009-2010 and 2010-2011; therefore, it would be realistic to challenge it with
perhaps another 1-2% increase for 2012. For the Nursing Assistant program to reach 70%
would require an increase of 3% in one year. By establishing program goals, the
institution is assured that all programs are working toward overall retention
improvement.
[Chart: retention rates for the Medical Assistant, Nursing Assistant, and Surgical Technology programs compared to overall retention, 2008-09 through 2010-11]
Other areas that might be considered to address retention include setting:
• an average daily attendance goal, for example 90%;
• a maximum withdrawal rate per quarter, for example 10%;
• a quarterly retention goal; and
• quarterly grade distribution goals for the percentage of As and Bs in selected courses/classes, for example anatomy and physiology:
Quarter    As %   Bs %
Dec 09     45     22
Mar 10     62     35
Jun 10     62     35
Sept 10    61     32
Dec 10     50     26
Mar 11     65     14
Jun 11     49     34
Sept 11    27     8
Dec 11     54     31
Mar 12     72     25
Jun 12     83     11
Sept 12    51     32
Total      681    305
Mean       57%    25%
Total Average As & Bs: 82%
Based on this distribution, the institution might elect to develop strategies to maintain
the 82% rate or raise the goal to 85%. Each quarter an intervention plan might be
developed for those struggling students not making As and Bs. Such an intervention
plan might enhance retention.
• Similarly, quarterly grade distribution goals could be set for overall enrollment performance.
Average Quarterly Grade Distribution for March 2010 – June 2011

Quarter   EOQ Ending Students   FTEs    As%   Bs%   Cs%   Ds%   Fs%   Ws%
Mar 10    571                   1606    35    33    16    11    5     0
June 10   354                   1118    32    29    17    9     3     10
Sept 10   391                   1180    32    28    16    13    2     9
Dec 10    417                   1295    36    34    15    12    2     1
Etc.
Total                           15134   376   378   209   103   38    95
Mean                            1261    31%   32%   17%   9%    3%    8%
In the analysis, trends in grade distribution are noted. Goals could then be set to raise the percentage of As and Bs while reducing the percentage of Ds, Fs, and Ws, accompanied by departmental strategies.
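A brief sketch of how such a distribution might be tallied from raw grade counts; the quarter and counts below are hypothetical.

    from collections import Counter

    def grade_distribution(grades):
        # Each letter grade's share of all grades awarded, as a percentage.
        counts = Counter(grades)
        return {letter: round(100 * n / len(grades), 1) for letter, n in sorted(counts.items())}

    # Hypothetical quarter: 140 grades awarded.
    quarter = ["A"] * 49 + ["B"] * 45 + ["C"] * 22 + ["D"] * 15 + ["F"] * 4 + ["W"] * 5
    print(grade_distribution(quarter))
    # {'A': 35.0, 'B': 32.1, 'C': 15.7, 'D': 10.7, 'F': 2.9, 'W': 3.6}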
STANDARD
d. job placement rate in the field
An institution has a system in place to assist with the successful initial employment of its
graduates and is required to verify employment post-initial employment date. At a
minimum, an institution maintains the names of graduates, place of employment, job title,
employer telephone numbers, employment date and verification dates. For any
graduates identified as self-employed, an institution maintains evidence of
employment. Documentation in the form of employer or graduate verification forms or
other evidence of employment is retained.
The method of calculation, using the reporting period July 1 through June 30, is as follows:

(F + R) / (G - U) = P%
F = Graduates placed in their field of training
R* = Graduates placed in a related field of training
G = Total graduates
U** = Graduates unavailable for placement
P% = Placement percentage
*Related field refers to a position wherein the graduate’s job functions are related to the
skills and knowledge acquired through successful completion of the training program.
**Unavailable is defined only as documented: health-related issues, military obligations,
incarceration, continuing education status, or death.
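A minimal sketch of the placement calculation, with hypothetical counts:

    def placement_rate(in_field, related_field, total_graduates, unavailable):
        # Placement % = (F + R) / (G - U) x 100, per the formula above.
        return 100.0 * (in_field + related_field) / (total_graduates - unavailable)

    # Hypothetical program: 18 placed in field, 3 in a related field,
    # 26 total graduates, 2 documented as unavailable for placement.
    print(round(placement_rate(18, 3, 26, 2), 1))  # 87.5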
Important Note: Graduates pending the credentialing/licensure required to work in a regulated profession, and thus not yet employed or not working in a related field as defined above, should be reported through the back-up information required in the Annual Report. This fact will then be taken into consideration if the program placement rate falls below expectations and an Action Plan is required by ABHES.
EXAMPLES:
Placement results for the same annual reporting years identified above for retention are
used as your baseline data, if available, along with goals for the upcoming year. In
addition, if an institution has developed long-term goals for placement, this information
should also be included with status updates.
Placement rates for the past three years, beginning with 2008-2009, taken from the
institution’s ABHES Annual Reports are:
                        2008-2009   2009-2010   2010-2011
Medical Assistant       88%         90%         94%
Nursing Assistant       85%         88%         91%
Surgical Technology     86.5%       88%         91.3%
These rates indicate a steady annual increase and all rates exceed the 70 percent
ABHES benchmark. The chart shows a comparison of the three programs to the overall
placement.
[Chart: placement rates for the Medical Assistant, Nursing Assistant, and Surgical Technology programs compared to overall placement, 2008-09 through 2010-11]
Since these are good placement rates, the institution may elect to hold at these rates for 2012 and develop strategies to maintain them, or it may elect to increase by a given percentage for 2012, such as one percent, or to use an average of the increases.
STANDARD
e. credentialing examination participation rate
Participation of program graduates in credentialing or licensure examinations required
for employment in the field in the geographic area(s) where graduates are likely to
seek employment.
The method of calculation, using ABHES' reporting period July 1 through June 30, is as follows:

Examination participation rate = G / T
G = Total graduates taking the examination
T = Total graduates eligible to sit for the examination
EXAMPLE:
While credentialing may not be required for employment, concerted efforts should be
made by the institution to promote and encourage participation in licensure exams by
setting participation and pass rate goals and establishing strategies for achieving those
goals. Include results of periodic reviews conducted throughout the reporting year of
certification exam results by program along with goals for the upcoming year. If results
are not easily accessible without student consent, the institution should consider
incentive procedures or devise alternate methods to obtain results, which can be
documented to assess program effectiveness. Again include the three most recent
years of data collection. Data may be analyzed by class or just by program. Data
collected and analyzed by class provides more detail.
Example by program

PROGRAM                GRADS             NUMBER TOOK EXAM    % GRADS TOOK EXAM
                       '09   '10   '11   '09   '10   '11     '09   '10   '11
Nursing Assistant      20    26    24    15    20    21      75    77    88
Medical Assistant      24    26    30    20    21    25      83    81    83
Surgical Technology    23    19    17    15    11    13      65    58    76
EXAMPLE:
Averaging the percentage of nursing assistant graduates who took the exam over the last three years would produce a goal lower than the 2011 participation rate. Therefore, it would be more advantageous to calculate the percentage-point increase between '09/'10 and '10/'11 and average the two to establish the increase for '11/'12 [2 (increase between 2009 and 2010) + 11 (increase between 2010 and 2011) ÷ 2 = 6.5]. The goal for the percentage taking the nursing assistant exam in 2012 would then be 94.5%.
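A short sketch tying together the participation-rate formula and the averaged-increase goal, using the nursing assistant figures from the table above:

    # Nursing assistant data from the table above: graduates and exam takers per year.
    grads = [20, 26, 24]
    took = [15, 20, 21]

    rates = [round(100 * t / g) for t, g in zip(took, grads)]      # [75, 77, 88]
    increases = [later - earlier for earlier, later in zip(rates, rates[1:])]
    goal_2012 = rates[-1] + sum(increases) / len(increases)
    print(rates, goal_2012)  # [75, 77, 88] 94.5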
If students are admitted and graduate on a quarterly basis, the institution might find
data collected quarterly to be more beneficial such as this example:
Example by program by class (useful tracking if the program employs adjunct faculty who change each term)
PROGRAM              GRADS             NUMBER TOOK EXAM    PERCENT GRADS TOOK EXAM
                     '09   '10   '11   '09   '10   '11     '09   '10   '11
Nursing Assistant    20    26    24    15    20    21      75    77    88
  Winter             2     4     7     2     3     5
  Spring             7     6     5     5     6     5
  Summer             6     7     6     3     3     3
  Fall               5     9     6     4     7     6
Medical Assistant    24    26    30
  Winter             7     4     9
  Spring             5     6     10
  Summer             6     7     5
  Fall               6     9     6
Other data to demonstrate student-learning outcomes may include entrance
assessments, pre- and post-tests, course grades, GPA, CGPA, standardized tests, and
portfolios.
STANDARD
f. credentialing examination pass rate
An ongoing review of graduate success on credentialing and/or licensing
examinations required for employment in the field in the geographic area(s) where
graduates are likely to seek employment is performed to identify curricular areas in
need of improvement. A program maintains documentation of such review and any
pertinent curricular changes made as a result.
The method of calculation, using ABHES’ reporting period July 1 through June 30th, is as
follows:
F / G = L%
F = Graduates passing examination (any attempt)
G = Total graduates taking examination
L% = Percentage of students passing examination
At a minimum, the names of all graduates by program, actual graduation date, and the
credentialing or licensure exam for which they are required to sit for employment are
maintained.
Example by program

PROGRAM                GRADS             NUMBER (G)          NUMBER (F)        PERCENT (L)
                                         TOOK EXAM           PASSED            PASSED
                       '09   '10   '11   '09   '10   '11     '09   '10   '11   '09   '10   '11
Nursing Assistant      20    26    24    15    20    21      12    15    16    80    75    76
Medical Assistant      24    26    30    20    21    25      15    17    19    75    81    76
Surgical Technology    23    19    17    15    11    13      14    9     10    93    82    77
From this data, establish goals for the percentage of graduates passing the exam using the same methods described above for graduates taking the exam. Since passing rates have not steadily climbed, a reasonably achievable passing goal could be established by merely averaging the three most recent passing rates ((80 + 75 + 76) ÷ 3 = 77), which would give a goal of 77% passing for the nursing assistant program.
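A compact sketch of the pass-rate formula and this three-year-average goal, using the nursing assistant figures above:

    # Nursing assistant data from the table above: exam takers (G) and passers (F) per year.
    took = [15, 20, 21]
    passed = [12, 15, 16]

    pass_rates = [round(100 * f / g) for f, g in zip(passed, took)]  # [80, 75, 76]
    goal = round(sum(pass_rates) / len(pass_rates))                  # three-year average
    print(pass_rates, goal)  # [80, 75, 76] 77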
Again, if students are admitted and graduate on a quarterly basis, the institution
might find data collected quarterly to be more beneficial such as this example:
Example by program by class (useful tracking if the program employs adjunct faculty who change each term)
PROGRAM              GRADS             NUMBER TOOK EXAM    NUMBER PASSED     PERCENT PASSED
                     '09   '10   '11   '09   '10   '11     '09   '10   '11   '09   '10   '11
Nursing Assistant    20    26    24    15    20    21      12    15    16    80    75    76
  Winter             2     4     7     2     3     5
  Spring             7     6     5     5     6     5
  Summer             6     7     6     3     3     3
  Fall               5     9     6     4     7     6
Medical Assistant    24    26    30
  Winter             7     4     9
  Spring             5     6     10
  Summer             6     7     5
  Fall               6     9     6
STANDARD
g. program assessment
The program assesses students prior to graduation as an indicator of the program’s
quality. The assessment tool is designed to assess curricular quality and to measure
overall achievement in the program, as a class, not as a measurement of an individual
student’s achievement or progress toward accomplishing the program’s objectives and
competencies (e.g., exit tool for graduation). Results of the assessment are not required
to be reported to ABHES, but are considered in curriculum revision by such parties as
the program supervisor, faculty, and the advisory board and are included in the
Program Effectiveness Plan.
EXAMPLES FOR USE AS PROGRAM ASSESSMENT:
• Comprehensive Final Exam
• Scenarios
• National Practice Exams
• Practical demonstrations using a comprehensive checklist before students go on externship

Comprehensive Final Exam
An important measure of program effectiveness is how well it prepares students with the entry-level competencies for the field of work. A comprehensive examination, designed to measure the individual student's preparation in the required competencies identified by the program objectives, is administered to every student prior to completion, and the collective results are used to assess the program's performance in preparing students, as a group, for employment in the field. The comprehensive examination may include written questions, practical demonstrations, or a combination of methods, so long as the necessary clinical competencies are validly and reliably assessed.

Such an exam should be designed to incorporate all major elements of the curriculum for assessment of quality. A well-designed exam will point directly to the segment of the curriculum that needs remedy.
For example, if scores are consistently low in the anatomy and physiology segment of the exam, as indicated by a three-year trend, then an action plan may include new textbooks, an instructor change, instructor professional development or in-service, or teaching the course as a prerequisite instead of a core element. These cohort exam scores are then closely monitored for upward trends that indicate the plan is working.

A program may find it beneficial to score the exam with ranges rather than pass/fail. This communicates to the student that the exam is being used as an overall quality-improvement tool rather than a personal test. Each student is provided instructions regarding the scenario, and the students then role-play as they complete the scenarios under the direction of the faculty.
• MA SCENARIO EXAMPLE (Thanks to Ross Medical Education Center for sharing these examples)
Program Assessment Evaluation Method: Medical Assistant Day in the Office Scenario

MA Program    Total number of seniors who      Total number of seniors scoring    Overall proficiency % for all
Years         completed day-in-the-office      "Proficient" or "Acceptable"       combined day-in-the-office
              scenarios                                                           scenarios
08-09         N/A                              N/A                                N/A
09-10         N/A                              N/A                                N/A
10-11         100%                             100%                               100%
Day in the Office Scenario (Tasks)
Prepare and maintain electronic medical records
Manual Filing with Alphabetic System
Take and Record Height and Weight
Take a complete set of Vital Signs
Perform Visual Acuity with Snellen Chart
Perform a physical and chemical U/A
Administer an ID injection
Administer a SQ injection
Administer an IM injection
Administer a Z-track injection
Perform a Standard 12-Lead EKG
Perform a Spirometry Test
Measure Infant Height/Weight, Head and Chest Circumference
Perform a Urine Pregnancy Test
Perform Multi-draw Venipuncture
Perform Venipuncture with Butterfly
Perform a Capillary Puncture and MicroHematocrit
Perform a Capillary Puncture and Hemoglobin
Perform a Capillary Puncture and a CLIA Waived Mono Test
Measure Blood Glucose using a Handheld Monitor
Wrapping Instruments/Operate an Autoclave
Apply Sterile Gloves/Set Up Sterile Tray/Remove Sutures
Code Assignment/Posting Patient Charges Electronically
2010-11 Class Proficiency Percentage: 100 for each of the tasks listed above.
Rationale for Data: The program assessment evaluation tool is used to assess curricular quality and measure overall achievement in the program by class. The purpose of the assessment tool is to determine if classes are performing clinical skills at a high level and progressing toward accomplishing the program's objectives through demonstrated proficiencies.

Collection Procedures: Program assessment rubrics are used to assess senior students' clinical skills through end-of-term "Day in the Office" scenarios. The program assessments are compiled twice a year: in July for January 1 to June 30, and in January for July 1 to December 31. To ensure data is significant, the data is compiled by class and by program and then reviewed/analyzed/assessed based on the results from the prior six-month period. Program assessment of students' clinical skills is based on four categories: Proficient, Acceptable, Limited, and No Opportunity. If the score for any competency falls below the 90% proficiency level (by totaling the Proficient and Acceptable categories), the concern is brought to the Director of Education in order to perform a comparative analysis of proficiency among all Ross campuses.

Goals for 2011-12: An 80% participation rate of all classes during the reporting period; 75% of proficiencies listed on each rubric completed for each evaluation period.

Responsible Party: Director, Instructors

Review Dates: January 2012 and July 2012
STANDARD
h. surveys of student (classroom and clinical experience), clinical extern affiliate, graduate, and employer satisfaction with the program
A program must survey each of the constituents identified above. The purpose of the
surveys is to collect data regarding student, clinical extern affiliate, graduate and
employer perceptions of a program’s strengths and weaknesses.
At a minimum, an annual review of results of the surveys is conducted, and results are
shared with administration, faculty and advisory boards. Decisions and action plans are
based upon review of the surveys, and any changes made are documented (e.g.,
meeting minutes, memoranda).
The institution establishes: (i) a goal for the percent of surveys returned and (ii)
benchmarks for the level of satisfaction desired. Accordingly, a program must
document that at a minimum the survey data included in its effectiveness assessment
include the following:
A representative sample must provide feedback to determine program effectiveness;
therefore, two goals should be established for all surveys
(1) a goal for the percent of surveys returned as well as
Survey participation rate: SP / NS = TP
SP = Survey Participation (those who actually filled out the survey)
NS = Number Surveyed (total number of surveys sent out)
TP = Total Participation by program, by group; meaning the number of students/clinical
extern affiliates/graduates/employers by program who were sent and completed
the survey during the ABHES reporting period (July 1–June 30).
(2) satisfaction benchmarks
Programs must assess satisfaction by surveys for the currently enrolled student, the
clinical extern affiliate, the recent graduate, and the graduate’s employer.
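As a minimal illustration of the return-rate goal in (1) above, with hypothetical counts for one program:

    def survey_participation(completed, sent):
        # TP = SP / NS, expressed here as a percentage of surveys returned.
        return 100.0 * completed / sent

    # Hypothetical: 33 graduate surveys returned of the 55 sent for one program.
    print(round(survey_participation(33, 55)))  # 60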
Student:
Student evaluations are used as a composite of student views relating to course
importance and satisfaction and overall class attitudes about the classroom and
clinical environments.
EXAMPLE:
Student Satisfaction:
The surveys conducted periodically throughout the reporting year should assess the
students’ satisfaction with the administration, faculty, and program training, including
externship. The institution would also want to establish a percentage return goal such as
90% of the students complete the survey.
Rationale for Data: Secure feedback from students relating to importance of and satisfaction with customer service and overall attitudes related to the institution's administration. Data used to reflect on what worked or didn't work.

Collection Procedures: Student Satisfaction Surveys collected semiannually.

Goals: 90% student participation. Using the Student Satisfaction Surveys (orientation through graduation), the baselines are: Tutoring 80%, Academic Advising 80%, Support From Admissions 75%, Financial Aid 75%, Career Services 75%, Library 80%, Spirited/Fun Environment 50%, Orientation Sessions 75%, Recognition 65%, Mission Statement 50%, Admin Accessibility 80%, Facility 70%, Social Activities 50%.

Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.

Improvement Strategies: The data is collected and benchmarks are set and analyzed for improvement strategies when measures fall below established baselines.

Rationale for Data: End-of-term student evaluations are used as a composite of student views relating to course importance and satisfaction and overall class attitudes about the classroom environment. Faculty use the data to determine effective/ineffective activities and compare this information with other classes.

Collection Procedures: Student Course Surveys collected each term.

Goals: Using the Student Course Surveys, the following baselines have been established: Presentation/Delivery Methods 80%, Course Pace 80%, Course Objectives 75%, Information Clarity 85%, Textbook 75%, Supplementary Materials 75%, Classroom Assignments 80%, Support From Faculty 75%, Encouragement/Motivation 80%.

Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.

Improvement Strategies: Failure to achieve a baseline goal will be addressed at faculty and in-service meetings.
STANDARD
Clinical extern affiliate:
Externship site evaluations include a critique of students’ knowledge and skills upon
completion of their in-school training and reflect how well the students are trained to
perform their required tasks. They include an assessment of the strengths and
weaknesses, and proposed changes, in the instructional activities for currently enrolled
students. The sites also evaluate the responsiveness and support provided by the
designated school representative, who visited the site and remained in contact with
the site throughout the duration of the students’ externship.
EXAMPLE:
Externship sites are off-campus labs enabling students to apply acquired knowledge and skills. Students on externship should be given an opportunity to evaluate this experience just as they did in the classroom.
Summarized results of the externship site evaluations of the students’ knowledge and
skills upon completion of their in-school training should reflect how well the students are
trained to perform their required tasks, and include an assessment of the strengths and
weaknesses, and proposed changes, if any, in the instructional activities for currently
enrolled students. The sites should also evaluate the responsiveness and support
provided by the designated school representative, who visited the site and remained in
contact with the site throughout the duration of the students’ externship.
Rationale for Data: To maintain interaction with off-site labs to identify student skill level and provide follow-up instruction when deficiencies are identified.

Collection Procedures: Externship site survey collected bi-weekly by the externship coordinator on Friday during the student externship assignment; student clinical experience evaluation.

Goals: 100% student participation. Baselines for the externship site survey: Attendance 100%, Initiative/Appearance 80%, Communication 80%, Critical Thinking 80%, Information Use 90%, Quality of Work 90%, Multi-Tasking 80%, Technical Procedural Proficiency 90%, Professional Attitude 80%, Interaction & responsiveness of institution 95%. For all students to rate their overall clinical experience at or above the "cut score" of 3 on a 5-point Likert scale. Rating of school representative: the representative's responsiveness 90%, the quality of the designated school representative's visit to the site 90%, the representative's contact with the site throughout the duration of the student's externship 90%. For all clinical affiliates to rate their contact with the school at or above the "cut score" of 4 on a 5-point Likert scale.

Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.

Improvement Strategies: Failure to achieve a baseline goal will be addressed at faculty and curriculum meetings to design improvement.
STANDARD
Graduate:
A program has a systematic plan for regularly surveying graduates. Graduate surveys
are provided no sooner than 10 days following graduation. At a minimum, an annual
review of the results is conducted and shared with administration, faculty and advisory
boards. Decisions and action plans are based upon the review of the surveys, and any
changes made are documented (e.g., meeting minutes, memoranda).
EXAMPLE:
Graduate Satisfaction:
Include information reflecting the level of satisfaction of graduates with the academic
program, how well the educational and clinical experiences prepared the student for
employment, and how the education relates to their current position. Such information
could include a measurement of the quality of instruction and the relevance and currency of curricula. The major distinction between graduate and student satisfaction is that graduate feedback should be sought once graduates have had an opportunity to seek and secure employment in their fields and have been employed long enough to be able to evaluate their training in relation to the tasks performed on the job.

Because graduate and employer satisfaction surveys provide valuable information for timely program revision and development, they must be conducted on an ongoing basis and summarized at least annually, as the information from these surveys is vital to the PEP in evaluating educational outcomes and setting short- and long-term goals. The institution should also establish a percentage return goal, such as 60% of the graduates return the survey.
Rationale for Data: To ensure the ability of graduates to secure employment related to their interests and training, both upon graduation and throughout their lifetime.

Collection Procedures: Data collected using the alumni survey a minimum of 30 days following student employment.

Goals: 60% of employed graduates during the reporting year will return the completed survey. The baseline is an 80% or higher evaluation in skill categories and 70% or higher in the leadership area, with survey items rated at 3 or above on a 5-point Likert scale. Prepared for career in ability to:
• Communicate with employer/coworkers
• Think critically
• Use information
• Multi-task
• Apply knowledge
• Use learned skills
• Perform to employer satisfaction
• Secure desired position
Satisfied with instruction quality. Provided effective job search skills.

Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.

Improvement Strategies: The data is collected and benchmarks set and analyzed for improvement strategies when measures fall below established baselines.
Employer:
A program has a systematic plan for regularly surveying employers. Employer surveys
are provided to employers no fewer than 30 days following employment. At a minimum,
an annual review of the results is conducted and shared with administration, faculty
and advisory boards. Decisions and action plans are based upon the review of the
surveys, and any changes made are documented (e.g., meeting minutes,
memoranda).
EXAMPLE:
Employer Satisfaction:
Information about the degree of employer satisfaction regarding the competencies of graduates who have completed a program of study is a major part of determining program effectiveness. This information reflects how well employees (graduates) are trained (skill level) to perform their required tasks and includes an assessment of the strengths and weaknesses, and proposed changes, if any, in the instructional activities for currently enrolled students. The example below is for the institution; however, to ensure that all program graduates are meeting this goal, data could be tallied by program. The institution should also establish a percentage return goal, such as 50% of the employers return the survey.
Rationale for Data: To maintain interaction with the employing community, identify current workplace needs, and anticipate future job requirements that will shape careers and graduate opportunities.

Collection Procedures: Employer Survey collected quarterly and tallied annually in November.

Goals: Using the Employer Survey, rate graduate job performance and use the data to update curriculum, program objectives, and program offerings. Employer response rate 50%. Data collected in the following areas: Technical Knowledge Proficiency 85%, Information Use 80%, Quality of Work 80%, Multi-Tasking 70%, Communication 85%, Critical Thinking 75%, Professional Attitude 85%. Other goal examples: for all employers to rate graduate overall knowledge base, clinical experience, and interpersonal communication skills at or above the "cut score" of 3 on a 5-point Likert scale.

Summary/Analysis: Feedback obtained from completed surveys should be tallied for each category.

Improvement Strategies: The data is collected and benchmarks are set and analyzed for improvement strategies when measures fall below established baselines.
STANDARD
i. faculty professional growth and in-service activities.
A program maintains data that evidences faculty participation in professional growth
activities and in-service sessions that promote continuous evaluations of the programs
of study, instructional procedures and training.
Include the schedule, attendance roster, and topics discussed at in-service training
sessions conducted during the reporting year. The data should evidence that the
sessions promote continuous evaluation of the program of study, training in instructional
procedures, and review of other aspects of the educational programs. Outline
procedures for monitoring all full-and part-time faculty participation in professional
growth activities in an effort to remain current in their fields. Include the past two years’
in-service training and professional activities outside the institution for each faculty
member.
EXAMPLE:

Rationale for Data: Invest in faculty development to ensure current expertise and ability.

Collection Procedures: Professional development plan tied to other assessments (student evaluations, faculty evaluations, etc.) to directly address identified areas of need. Professional development plans prepared annually based on instructor evaluation and reviewed quarterly.

Goals: Full-time and adjunct faculty professional development participation, as documented in professional development plans and professional development programs, will increase. Participation minimum:
• On-campus quarterly in-service: 3
• Instructor-initiated off-campus professional development directly related to teaching assignment: 2
• Current licensure where required in the field

Summary/Analysis: Feedback obtained from completed surveys tallied for each category; in-service evaluation; standard feedback form for professional development activities.

Improvement Strategies: Data collected and benchmarks set and analyzed for improvement strategies when personnel fail to fulfill the plan. Future topics based on survey feedback and in-service evaluations.
STANDARD
Subsection 2 – Outcomes Assessment
V.I.2. A program has a process for assessing effectiveness.
The Program Effectiveness Plan specifies a process and a timetable for the annual
assessment of program effectiveness in achieving the outcomes it has identified with its
objectives and criteria. The plan must:
i. Document historical outcomes and show evidence of how these historical data are
used to identify expected outcomes and to achieve expected goals (e.g.,
evaluations, advisory boards, credentialing).
Outcomes are the result of students' successful completion of a program. Outcomes are generally defined in terms of, though not limited to, retention, placement, student competencies, and student, clinical, graduate, and employer satisfaction. Use at least three years' historical outcomes for each element. The last three PEPs and Annual Reports provide the necessary historical data. Data from other prior years may be used if it will better define the picture of progress or set more realistic goals. Describe the measurable standards used to judge the effectiveness of your institution.
ii. Identify and describe types of data that are used for assessment, how data were
collected, rationale for use of each type of data, timetable for data collection, and
parties responsible for data collection.
Institutions are expected to collect data that clearly evidences the level of
educational outcomes of retention and placement and satisfaction experienced by
current students, graduates, clinical sites and employers of graduates. In addition,
institutions are to include information that is relevant to improving overall
effectiveness, such as in-service training programs and professional growth
opportunities for faculty.
The institution is encouraged to collect a variety of statistical data that will assist it in
improving the educational outcomes. A few examples of possible surveys and
studies include:
 New or entering student surveys  Program evaluations
 Faculty evaluation studies
 Alumni surveys
 Student demographic studies
 Labor market surveys
Studies of student performance might include:
• Admission assessments
• Grades by course
• Standardized tests
• Quarterly grade distribution
• Pre-test and post-test results
• Portfolios
• Graduate certification examination results
• Average daily attendance
Consider other studies such as a review of surveys of professional and trade
associations, Chamber of Commerce, U.S. Department of Labor, or economic
development board studies.
EXAMPLE:

Data Collection: Employer Survey collected quarterly and tallied annually in November by the career services department.

Rationale for Use: Using the Employer Survey, rate the job performance of graduates and use the data to update curriculum, program objectives, and program offerings. Rating goals: Employer response rate 75%. Goals for data collected in the following areas: Communication 84%, Critical Thinking 75%, Information Use 80%, Quality of Work 80%, Multi-Tasking 70%, Technical Knowledge Proficiency 85%, Professional Attitude 85%.
iii. Review initial baseline rates and measurements of results after planned activities
have occurred.
Data related to the PEP must be evaluated at least once per year and should take
place at a predetermined time. Many institutions evaluate data related to their PEP
on a monthly or quarterly basis then complete an annual comprehensive
evaluation. As previously noted, it is suggested that an institution establish a
schedule or range of evaluation dates for each year to ensure that timely
monitoring is taking place.
To maximize the integrity of the process and opportunities for improving the
educational programs, the individuals involved in the evaluation of the data must
have the responsibility and authority for the development of the educational
programs and educational processes.
An institution should develop an evaluation plan designed to meet its needs; no one
model is prescribed, as each institution is unique. For example, an institution may
evaluate the PEP by completing the following activities:
a. Measuring the degree to which institutional or educational goals have been
achieved.
b. Conducting a comprehensive evaluation of the elements outlined in the
Standards Manual.
c. Summarizing the institutional changes that have been developed and/or
implemented based upon information gained from the evaluation process.
d. Documenting changes in institutional, academic, or administrative processes
such as revised goals, planning documents, or program goals and activities.
At the end of the year, a review of the data collected will demonstrate how well the
predetermined goals were met in all categories and will identify changes needed.
One way to report the data is in a table format for easy analysis, as shown in the
example below.
Goal: Retention – 75%
Summary/Analysis: Retention program focused on motivating students to stay in
school; only 70% stayed.
Improvement Strategies:
 Daily, instructors contact absent students & document the discussion.
 Refer students to individuals or services to help overcome attendance obstacles.
 Department chairs distribute at-risk list to every employee each Tuesday.

Goal: Placement – 70% placed within 60 days of graduation
Improvement Strategies:
 Placement staff will be on campus two days/week during the hours between day and
evening classes to meet with students in their last term.
 Placement personnel will join and participate in local business and civic
organizations.
 Increase employer presence on campus with mock interviews, speaking, and a
twice-a-year career fair.
 Establish on-line career bank.

Goal: Graduate job performance improved to 80% or above
Summary/Analysis: Employers rated performance of graduates below 80% on the
following: Critical Thinking 75%; Multitasking 70%.
Improvement Strategies:
 Add at least three critical thinking scenarios to the last three modules.
 Add four multitasking practica to the last two modules.
iv. Provide a summary and analysis of data collected and state how continuous
improvement is made to enhance expected outcomes.
Provide an overview of the data collected. Summarize the findings for all elements
reviewed that indicate the institution’s strong and weak areas with plans for
improvements, where applicable, and use results to develop the basis for the next
annual review, presenting new ideas for changes to help the institution further
improve its effectiveness.
One of the most important indicators of the effectiveness of the PEP is the
improvement of the educational programs offered at the institution. Establish
specific goals as benchmarks to measure improvement of the institution as a whole
as well as each program. Goals can be set as an annual incremental increase or
set as a static goal, such as 50 percent for employer survey returns.
Summary/Analysis

Employer Response Rate                    45%

Overall Job Performance Rating:
Communication                             82%
Critical Thinking                         75%
Information Use                           88%
Quality of Work                           85%
Multi-Tasking                             67%
Technical Knowledge Proficiency           94%
Professional Attitude                     85%

Use of Data to Improve

Career Services maintain bi-weekly contact with employers and graduates. Phone calls
made to unresponsive employers encouraging them to complete the survey.

The recommendation is that all programs add more emphasis in communications,
critical thinking, and multi-tasking. More case studies, role-playing, and practical
applications in these areas will be included in all third and fourth quarter courses.

Employer response rate improved but continued effort is needed to generate a better
response. All baseline goals established were met or exceeded with the exception of
communication and multi-tasking.
v. Identify how data were used to improve the educational process.
At least annually, monitor the activities conducted, which include systematically collecting
data/information on each of the elements; analyzing the data/information and
comparing it with previous findings; and based on the findings, identifying changes
to be made in educational activities.
An institution may offer an exemplary educational program in terms of curriculum,
but for one reason or another, the educational processes are not allowing the
contents to be delivered effectively to the students. However, by analyzing the
data in the PEP, such as employer, graduate, and student surveys, and faculty
professional development, an institution is able to change the process to enhance
the program or change the program entirely.
Summary/Analysis
Professional development participation as documented in
professional development plans increased and included all
full-time and adjunct faculty. Professional development
was tied to annual evaluations, licensing requirements, and
student evaluations to directly address identified areas of
improvement. Five of the nine faculty attended two field-related workshops; three
renewed licenses; and all attended at least two of the four in-services.
Use of Data to Improve
Faculty participation in professional development activities
throughout the system has been widespread and generally
successful. Full-time faculty members have usually
followed through to complete their planned activities,
thereby benefiting the College and the students. Part-time
faculty members have attended a variety of training
sessions organized by the College. For instance, quarterly
Professional Development sessions on campuses have been
beneficial to full-time and part-time instructors, as indicated
by faculty evaluation results from the sessions. Individual
faculty plans will include at least two field-related seminars
with some financial support and campus-wide training will
continue to be implemented based on instructional needs, as
determined by faculty members, the Dean’s evaluation, and
mid-term and end of quarter student evaluations, as well as
other assessment data.
vi. Adjust goals as a result of the evaluation of a PEP, based on an assessment of
community and employer demand for graduates, which justifies the continued
need for a program.
At this juncture, it is advantageous to identify those responsible and establish periodic
review dates to ensure that progress toward the new goals is on track; if it is not,
determine why, develop new strategies, and/or adjust the goal.
Goals: Improve employer survey return rate by hand delivering the surveys to
respondents
Who Responsible: Clinical coordinator
Review Dates: April 30, 2012, & November 30, 2012
Summary/Analysis: As of 4-30-12, employer survey rates have been improved by 30%
Strategy Adjustment: Continue through 11-30-11; modify if a decline in percent
returns occurs
vii. Identify the activities that will be undertaken to meet the goals set for the next year.
Problem/Deficiency: Inconsistent externship recordkeeping
Specific Activities: Externship coordinator will contact externship supervisor two days
prior to report due date. Three days following due date, if externship report not
received, externship coordinator will contact externship supervisor.

Problem/Deficiency: ATB counseling not frequent enough and not meeting ABHES
standards
Specific Activities: Institute required mentoring and tutoring for 1st & 2nd term
students. Develop standardized reporting form and procedures for ATB counseling by
7/31; distribute to all faculty. DOE to monitor to ensure that counseling is provided
weekly and filed.

Problem/Deficiency: Low Employer Survey return
Specific Activities: Within 10 days of surveys failing to be returned, the career services
department will make a follow-up phone call to those delinquent.

Problem/Deficiency: Inadequate Graduate Survey return
Specific Activities: Within 10 days of surveys not returned, the career services
department will make a follow-up phone call to those delinquent.

Problem/Deficiency: Faculty Files Incomplete
Specific Activities: Dean of education will review faculty files quarterly; contact faculty
who have not submitted professional development documentation. Quarterly memo to
all personnel reminding that all credentials received, CEUs, etc. are to be submitted to
the DOE promptly after receipt.

Problem/Deficiency: Professional Development
Specific Activities: Increase to quarterly internal monitoring of professional
development for all instructors. Request credentialing information every 90 days.
Require instructors to obtain two CEUs per year.
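The 10-day follow-up rule for unreturned surveys listed above lends itself to simple
tracking. Below is a minimal Python sketch under assumed conditions: the record
layout (contact, date sent, returned flag) and the function name are hypothetical
illustrations, not an ABHES requirement or format.

# Minimal sketch: flag unreturned surveys that are due for a follow-up phone call.
# The record layout (contact, sent_date, returned) is hypothetical.
from datetime import date, timedelta

FOLLOW_UP_AFTER = timedelta(days=10)

def surveys_needing_follow_up(surveys, today=None):
    """Return contacts whose survey is unreturned 10 or more days after it was sent."""
    today = today or date.today()
    return [
        contact for contact, sent_date, returned in surveys
        if not returned and (today - sent_date) >= FOLLOW_UP_AFTER
    ]

if __name__ == "__main__":
    # Hypothetical employer-survey log.
    log = [
        ("Employer A", date(2012, 3, 1), True),
        ("Employer B", date(2012, 3, 5), False),
        ("Employer C", date(2012, 3, 28), False),
    ]
    for contact in surveys_needing_follow_up(log, today=date(2012, 4, 2)):
        print(f"Follow-up call needed: {contact}")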
FORMAT EXAMPLES*
All PEPs must contain the following:
Title Page to include all of the following:
ABHES I.D. CODE
Name of Institution
Address
City, State, and Zip Code
Name of Program
Name of Program Director
Credential Awarded
Portion of the program, if any, offered via distance learning
Length of program (clock hours, semester/quarter credits, weeks, etc.)
Introduction
Schedule of Review
Institutional Mission and Objectives
Program Description and Objectives
Student Population
Overall Student Population
Program Student Population
FORMAT EXAMPLE I
Program Retention
Retention statistics are extracted from reports in CampusVue. The annual period used to
measure retention for the purposes of accreditation is July 1 through June 30. The
following program retention rates were determined using the ABHES formula as follows:
(EE + G) / (BE + NS + RE) = R%
EE = Ending enrollment (as of June 30 of the reporting period)
G = Graduates
BE = Beginning enrollment (as of July 1 of the new reporting period)
NS = New starts
RE = Re-entries
R% = Retention percentage
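As a worked illustration of the formula, the short Python sketch below computes R%
from the five counts. The function name and the sample counts are hypothetical
assumptions, chosen only so that the arithmetic reproduces the 75.0% FY 2010 rate
shown for the Medical Assistant program in the table that follows.

# Minimal sketch of the ABHES program retention formula:
#   R% = (EE + G) / (BE + NS + RE)
# The counts below are hypothetical and chosen only to illustrate the math; they are
# not actual enrollment figures from any program.

def retention_rate(ee, g, be, ns, re_entries):
    """Return the retention percentage: (ending enrollment + graduates) divided by
    (beginning enrollment + new starts + re-entries), expressed as a percent."""
    return 100 * (ee + g) / (be + ns + re_entries)

if __name__ == "__main__":
    # Hypothetical counts that happen to yield a 75.0% rate.
    rate = retention_rate(ee=42, g=12, be=40, ns=30, re_entries=2)
    print(f"Program retention rate: {rate:.1f}%")  # 75.0%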
Retention Percentage –
Program                              FY 2008      FY 2009      FY 2010
Medical Assistant, Diploma           57.46%       69.7%        75.0%
Summary/Analysis:
The Medical Assistant Diploma Program started December 2004. Retention rates have
increased in Fiscal Year 2010 due to the implementation of several factors: biweekly
meetings with the program chair, weekly attrition meetings, more accountability for
instructors, and campus-wide efforts to retain students.
Rationale: To have the most current picture of who is in the program in order to monitor
participation and progress, and build retention plans based on student needs.
Retention Goal: The retention goal for FY 2012 is 79%
Responsible Parties: Academic Dean, Program Chair, Faculty and other staff as applicable
Review Dates: Retention of the program will be monitored weekly and reviewed in
more depth in monthly meetings.
Types of Data/Methods of Assessment: Daily attendance reports, Weekly retention meetings,
Monthly retention reports, Faculty and student surveys, and Quarterly retention reports.
Reports reviewed with team members: Registrar, DOE, Program Chairs.
Continuous Improvement Strategies:
• Daily monitoring of student attendance by the Instructor, Program Director, Academic
Dean and the Career Services Representative to proactively identify students at risk of
withdrawing or being withdrawn. Regular meetings will be held to discuss at-risk students
and action plans developed both individually (advising) and collectively to address
challenges facing potential drop students.
• The Academic Dean will hold daily meetings with the Program Directors to identify each
student absent four or more days.
• All absent students will be called daily by their Instructors, Program Directors, and/or
Academic Dean, preferably before their regularly scheduled class is over.
• Faculty will be coached on holding students accountable for attendance and for
developing engaging classrooms. If specific classes are identified as having lower than
average attendance and/or retention, coaching and developmental activities will be
implemented for the faculty member(s). Outstanding attendance and recognition
awards will be given to instructors who have the highest retention and attendance each
quarter.
Job Placement – Repeat sections for each of the elements to be evaluated
Placement Percentage –
Program                              FY 2008      FY 2009      FY 2010
Summary/Analysis:
Rationale:
Goal:
Responsible Parties:
Review Dates:
Types of Data/Methods of Assessment:
Continuous Improvement Strategies:
Credentialing Examination Participation Rate –
Credentialing Examination Pass Rate –
Program Assessment Exam –
Surveys –
Student Surveys
Clinical Affiliate Surveys
Graduate Surveys
Employer Surveys
Faculty Professional Growth and In-Service Activities –
FORMAT EXAMPLE 2 (MA program started in FY 2011)

Program: MA

Where are the program objectives published?
The program objective is stated on page 15 of the 2011-2012 Michigan Campus Catalog.

How does the program determine that graduates have achieved the objectives (e.g.,
surveys, credentialing exam)?
Graduation requirements are stated on page 9 of the 2011-2012 Michigan Campus
Catalog. Student must have a grade average of 70% or higher with no less than 60% in
any individual course, and attend no less than 85% of scheduled classroom days. In
addition, student must successfully complete externship.

Who reviews the data? What process is utilized? (e.g., semiannually by advisory
committee)
Data is reviewed by campus and corporate leadership on a weekly basis by retention
and placement reports, and also reviewed by the campus advisory board twice annually.

What changes have resulted from data review?
Currently no changes are in place for the Canton campus as this is the first year of
operation.

Date of most recent program review
The most recent Advisory Board meeting was held January 2011.
PROGRAM RETENTION RATE:
MA               2008-2009: na%      2009-2010: na%      2010-2011*: 82%

Goals for 2011-12: Maintain a retention rate of 82%
Responsible Party: Director, Faculty and All Staff
Review Dates: Weekly
Summary/Analysis
The Canton campus achieved a successful 2010-2011 year, with the first class opened
October 4, 2010. All 11 students completed the program. All 2010-11 classes achieved a
satisfactory 82% retention rate. Although some withdrawals were unavoidable due to
serious medical conditions, it is to be noted that some students simply left the program
due to a change in their vocational desires. Ensuring prospective students understand
what the program fully entails, including academic and attendance requirements, will
result in a more committed student population, resulting in a higher retention rate.

Improvement Strategies
Admissions representatives are trained to uncover issues that may result in students
having to withdraw from the program, and assess preventive strategies with the
students so they are prepared to complete the course. Faculty members are
encouraged to communicate concerns directly to the Director when a student is
struggling with the program. When a student’s attendance falls below 85%, the
attendance card is brought to the front desk so the Director is aware of the student’s
attendance. The student is also counseled by the Director on attendance requirements.
Job Placement – Repeat the format for each of the elements to be evaluated
*Thanks to Everest College, McLean, Virginia, and Ross Medical Education Center,
Canton, Michigan, for agreeing to share their PEP formats for this Manual.
OTHER EXAMPLES:
Examples of changes to a process that might enhance a program:
 If a course requires outside lab or practice time and an analysis of the students’
actual lab or practice time demonstrates that the students are not completing the
required hours, formally scheduling those hours or adding additional laboratory
times may dramatically increase the effectiveness of that course.
 If an analysis of the data demonstrates that a large number of students are failing
a specific course or are withdrawing in excessive numbers from a particular
program, the institution may change the prerequisites for that course or offer extra
lab hours or tutoring to see if the failure or withdrawal rates are positively affected.
Examples of changes to a program that might enhance a program:
 If the analysis of the data indicates that large numbers of students are dropping or
failing a course when taught by a particular instructor, the instructor may need
additional training or a different instructor may need to be assigned to teach that
course.
 If surveys from employers and graduates indicate that a particular software
program should be taught to provide the students with up-to-date training
according to industry standards, the institution could add instruction in the use of
the particular software program, but only after the institution is assured that the
instructor has been properly trained in its use.
CONCLUSION
The PEP is a working document used as a resource to continually identify and assess a
program’s goals that have been established to meet its educational and occupational
objectives. An effective PEP is regularly reviewed by key personnel and used in
evaluating the effectiveness of each program and the overall operations of the
institution. It is important for each institution to establish a PEP that exhibits the
institution’s progress toward providing the highest quality education for its students
and that can be presented to ABHES evaluation teams, government regulatory groups,
and the general public.
“Of all our human resources, the most precious is the desire to improve.” –Unknown