REVISED 1/2013
Program Evaluation Plan: A Prescription for Healthcare Training in Tennessee (RX TN)
Evaluation Methodology: Comparison Group (Non-Experimental)
Date of Interim Report: 11/30/14
Date of Final Report: 9/30/16
PURPOSE. The purpose of this evaluation is to collect, analyze, and interpret data pertaining
to RX TN that will (a) lead to continuous program improvement and (b) determine the extent to
which the various program components are associated with positive outcomes and impacts in the
lives of program participants. Due to the multi-year nature of this project, the evaluation will
begin with an implementation evaluation to ensure that all program activities are being
conducted as planned. While the implementation evaluation will be ongoing, the evaluation
focus will also include participant outcomes and impacts later in the project cycle.
To promote engagement in evaluation activities throughout the grant period, the primary
stakeholders of this project have been identified prior to program implementation. The U.S.
Department of Labor (USDOL), the grantor of this project, is the primary stakeholder who seeks
to fund programs that lead to improvements in education and employment outcomes. Students
receiving student services provided by this grant, and/or those who enroll in one of the grant-funded career programs, also have a large stake in program
implementation and outcomes. Thus, these students are a crucial source of data during both the
implementation stage and the outcomes stage of evaluation. Each of the career training fields
will have a designated curriculum specialist who also has a vested interest in evaluation findings.
The program director, assistant director, and all other staff funded with this grant, as well as
community partners including employers, are other identified stakeholders of this project.
Members of stakeholder groups have already contributed to the evaluation planning process, and
they are expected to show continued involvement throughout the evaluation in an interactive
process with the third-party evaluator. This will ensure that the most pertinent evaluation
questions are being addressed; appropriate and feasible methods for data collection are
developed and used; and evaluation results are disseminated and utilized as appropriate.
EVALUATION QUESTIONS. The evaluation questions to be addressed by this evaluation
were created by working collaboratively with grant team leaders to ensure a shared
understanding of the “theory of change” underlying the program and to align evaluation
processes with the informational needs of these stakeholders. Engaging this subset of
stakeholders (i.e., the intended users of the evaluation findings) early on in the process helps to
promote ownership of the evaluation process, as well as eventual use of evaluation results.
Information regarding program implementation, as well as program outcomes, will enable
program stakeholders to improve program processes as well as evaluate whether program
activities are leading to intended outcomes.
The purpose of the implementation evaluation is to determine the extent to which the
various project inputs/activities/outputs are occurring as intended by the program creators. This
will allow project staff to (a) make adjustments when program activities are not occurring as
planned, and (b) determine whether program outcomes and impacts may be attributed to these
planned activities. Thus, a third-party evaluator will be involved early in the planning stage to
provide “real-time” feedback to program staff, and a significant amount of evaluation resources
will be expended to answer the planning and implementation evaluation questions presented in
the top portion of Table 1. By thoroughly addressing implementation, the evaluation can offer specific recommendations to the USDOL and other institutions as future programs are developed and funded.
The central purpose of the evaluation of program outcomes and impacts is to determine
the extent to which the program accomplished its goals. By utilizing matched comparison groups and statistically controlling for differences between the participant and comparison groups (e.g., with Analysis of Covariance), the evaluation can estimate the value
added by the program. In other words, the design used for this evaluation will enable
measurement of the impact of various program components on program participants by contrasting
participant outcomes and impacts with those of non-participants (see the bottom portion of Table
1 for outcomes/impacts evaluation questions).
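As a concrete illustration of the covariance-adjusted comparison described above, the following sketch (in Python, using the pandas and statsmodels libraries; the file name and column names are hypothetical, not part of the plan) fits a model in which the coefficient on the participant/comparison term serves as the covariate-adjusted estimate of program impact:

    # ANCOVA sketch: estimate the participant-vs.-comparison difference
    # in an outcome while controlling for measured covariates.
    # File name and column names are illustrative placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("rx_tn_outcomes.csv")  # one row per student

    # 'group' codes participant vs. comparison membership; age and
    # gender enter as covariates to adjust for group nonequivalence.
    model = smf.ols("outcome ~ C(group) + C(gender) + age", data=df).fit()

    # The coefficient on the group term estimates the value added by
    # the program after covariate adjustment.
    print(model.summary())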
METHODS. To address the evaluation questions presented in Table 1, a mixed-methods
approach to data collection will be used to collect information from multiple sources. Using both
qualitative and quantitative data sources concurrently will allow the third-party evaluator, with
assistance from the data manager, to gain a multi-dimensional perspective that supports a more
thorough analysis and promotes triangulation. Table 1 shows the methods and data sources for
each evaluation question. Importantly, this table shows that data on both the comparison groups
and the program participant groups (described below) will come from the same data sources. All
participant and comparison group members will be affiliated with a consortium institution;
therefore, it will be possible to collect the information needed by USDOL (e.g., name, date of
birth, and Social Security Number) to track comparison group members during the student
enrollment process.[1] Because evaluation is ongoing, and especially because this is a multi-year
program, it is likely that some of these components will be modified along with the changing
priorities and data needs of the primary stakeholders. As such, an examination of the utility of
this plan will occur on a routine basis.

[1] This consortium has agreed to provide these data to the Department of Labor on an annual basis during the grant period of performance.
DESIGN & DATA ANALYSIS. The various programs of study, along with the diverse nature
of the evaluation questions and corresponding methods of data collection, will necessitate a
comprehensive yet flexible design and data analysis plan. Random assignment into participant
and control groups is not practical for this project because educational training will be provided
to all students who apply to, are accepted into, and enroll in a grant-funded program of study. In
other words, program directors will select participants from a pool of self-selected candidates
based on their assessment of students’ aptitude in the programs.
When appropriate, the planning and implementation evaluation will address whether the
program is organized, managed, and delivered in a consistent manner across RX TN institutions.
Comparison groups (each consisting of the same number of students as the corresponding participant group) will be used to measure grant impact on participants in most of the training programs. Overall, program participants will differ from the comparison group because only the participant group will have received the various support prescriptions included in the RX TN grant. For instance,
participants will be provided with skills assessment and readiness services, career exploration
and academic planning services, academic preparation and supplementary instruction services,
and retention and completion coaching services. With the exception of the Surgical Technology,
Patient Care Technician, and Emergency Medical Dispatcher programs, for which no appropriate
comparison data are available, posttest comparisons of participant and comparison groups will be
used to measure grant impacts. See Table 2 for a description of all comparison groups. When it
is appropriate, comparison and participant groups will be matched on the basis of age, gender,
and institution to attenuate the effect of nonequivalence on validity. When group differences
exist, statistical analyses will control for these variables. The last column in Table 1 indicates the
data analysis procedures that will be used to address each evaluation question.
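To make the matching step concrete, here is a minimal sketch (Python with pandas; the data layout, column names, and the two-year age tolerance are assumptions for illustration, not specifications from the plan) that pairs each participant with one comparison student from the same institution and gender and of similar age:

    # Matching sketch: pair each participant with a comparison student
    # from the same institution, same gender, and similar age.
    # Column names and the age tolerance are illustrative assumptions.
    import pandas as pd

    students = pd.read_csv("students.csv")  # hypothetical extract
    participants = students[students["group"] == "participant"]
    pool = students[students["group"] == "comparison"].copy()

    matches = []
    for _, p in participants.iterrows():
        candidates = pool[
            (pool["institution"] == p["institution"])
            & (pool["gender"] == p["gender"])
            & (pool["age"].sub(p["age"]).abs() <= 2)  # within 2 years
        ]
        if not candidates.empty:
            match = candidates.iloc[0]
            matches.append(match.name)
            pool = pool.drop(match.name)  # match without replacement

    matched_comparison = students.loc[matches]

Matching without replacement keeps the two groups the same size, consistent with the plan's intent that each comparison group contain the same number of students as the corresponding participant group.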
INTERPRETATION & DISSEMINATION. So that evaluation results are meaningful and
conducive to program improvement, data will be interpreted and conclusions will be drawn by the
third-party evaluator in conjunction with the RX TN director and other stakeholders as
appropriate. Evaluation results will be shared regularly with RX TN management to ensure that
adjustments are made in a timely manner. The third-party evaluator will work with grant
management to develop a plan to disseminate findings,[2] some of which will be included in the
interim report. The evaluator will also assist grant management with decision-making that leads
to the modification/improvement of processes that will be evaluated and reflected in the final
report. This dissemination plan is tied to the utility standard of evaluation,[3] and will ensure that
results are used in purposeful ways that support continuous program improvement. Finally,
consistent with the grantor's goal to fund programs that supply TAA-eligible workers and other adults with the
skills and credentials needed to gain high-wage, high-skill employment, the evaluator will make
evidence-based suggestions for future replication of the program in the final report.
THIRD-PARTY EVALUATOR SELECTION. The third-party evaluator of RX TN must
practice in accordance with the national program evaluation standards and the Guiding
Principles of the American Evaluation Association (AEA, 2008).[4] Using AEA's Guiding
Principles ensures that program evaluators engage in rigorous evaluations that are based on
systematic data-based inquiry. Using the Guiding Principles in the selection process also helps
ensure that only competent, ethical, and respectful evaluators are chosen to facilitate the
evaluation of this program. The process of selecting an evaluator requires adherence to state and
federal purchasing guidelines to appropriately bid for these services. Roane State expects to
retain a third-party evaluator by April 1, 2013.
[2] A dissemination plan will include regularly scheduled meetings with stakeholders, and/or the development of multiple, tailored evaluation reports aimed at specific stakeholder groups.
[3] Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2011). The program evaluation standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Sage.
[4] American Evaluation Association. (2008). Guiding principles for evaluators. American Journal of Evaluation, 29, 397-398.
Table 1
Evaluation Questions, Methods, Data Sources, and Analyses

Planning/Implementation Evaluation

1. What process was used to plan the various program components, including student services?
Key Variables: (a) Curriculum selection, creation, and application; (b) program design or expansion using grant funds; (c) delivery methods of program curricula; (d) program administration structure; (e) support services to participants; (f) types of in-depth assessment of participants' abilities, skills, and interests used for selection into the program and for determining the appropriate program and/or course sequence, and who conducted the assessment; (g) how assessment results were used, and whether they were useful in determining the appropriate program and course sequence for participants; (h) methods used to provide career guidance.
Methods: Questionnaires; Review of Technical Proposal, including Work Plan.
Data Sources: Program Director, Assistant Director, Data Manager, Curriculum Specialists, Program Coordinators.
Analysis: Content analysis of qualitative data.

2. What can be done to improve the program during both planning and implementation?
Key Variables: (a) Satisfaction of key stakeholders (program participants, team leaders, employers); (b) quality of student support services (e.g., academic planning, Healthcare Career Workshop, academic "boot camps," digital literacy training, supplementary tutoring in "gateway" courses, completion coaches); (c) quality of technology components (e.g., shared online learning tools); (d) contribution of partners to program design, curriculum development, recruitment, training, placement, program management, leveraging of resources, and commitment to program sustainability; (e) outreach to TAA workers; (f) dissemination of curriculum to other schools.
Methods: Interviews and/or Focus Groups (Grant Director, Assistant Director, Curriculum Specialists, Program Coordinators, Program Participants); Surveys (Employers, Program Participants).
Analysis: Content analysis of qualitative data; descriptive data analysis of quantitative data.

3. What factors are contributing to partners' involvement (or lack thereof) in the program?
Method: Interviews and/or Focus Groups.
Data Sources: Grant Director, Assistant Director.
Analysis: Content analysis of qualitative data.

4. Which contributions from partners are most critical to the success of the grant? Which contributions are less critical?
Method: Interviews and/or Focus Groups.
Data Sources: Grant Director, Assistant Director, Program Coordinators.
Analysis: Content analysis of qualitative data, disaggregated by institution.

5. Are program activities and outputs consistent with what was planned? Is there consistency across institutions?
Methods: Survey (Program Participants); Document Analysis (Progress Reports).
Analysis: Content analysis of qualitative data, disaggregated by institution; descriptive data analysis of quantitative data, disaggregated by institution; comparison of progress reports to the project plan.
Outcomes/Impacts Evaluation

6. To what extent were the self-paced competency curricula (a component of student support services) associated with higher COMPASS test scores?
Method: Document Analysis.
Data Sources: COMPASS Test Scores.
Analysis: Descriptive data analysis of quantitative data; ANOVA (group[5] x campus) comparing the COMPASS test scores of students in healthcare programs at TTCs, TAA workers, and others considering training who received the self-paced competency curriculum with those who did not.

7. To what extent were student support services associated with graduation and retention rates at participating institutions?
Methods: Document Analysis (Student Data); Survey (Program Participants).
Analysis: Descriptive data analysis of quantitative data; ANCOVA (group x campus x gender x age) comparing Allied Health/Nursing "hold" students who received student support services with those hold students who did not (regardless of whether they enrolled in Allied Health/Nursing); ANCOVA (group x campus x age x gender) comparing Allied Health/Nursing students receiving services with those who did not receive services; content analysis of qualitative data.

8. How many program participants completed a TAACCCT-funded program of study?
Method: Document Analysis.
Data Sources: Student Data.
Analysis: Descriptive data analysis; ANCOVAs (group x campus x age x gender) for each program of study.

9. To what extent did program participants achieve mastery of key program outcomes?
Method: Surveys.
Data Sources: Employers, Program Graduates.
Analysis: Content analysis of qualitative responses; descriptive data analysis (disaggregated by program).

10. How many participants earned degrees and certificates in the various grant-funded programs of study?
11. How many participants who completed a grant-funded program of study entered employment in the quarter after the quarter of program exit?
12. How many participants who completed a grant-funded program of study and entered employment (in the quarter following the quarter of program exit) retained employment (into the second and third quarters after program exit)?
13. What are the average earnings of participants attaining employment?
Methods (Questions 10-13): Certification/Licensure Exam Scores or Pass Rates; Document Analysis.
Data Sources: National Healthcare Assoc. Certification Exam; EMD Certification by the State of TN Dept. of EMS; NCLEX-RN; National Board for Certification in OT; Student Data.
Analysis: Descriptive data analysis (e.g., means, frequencies) of quantitative data; ANOVAs (group x campus) for each program of study; Degrees/Certificates and Employment: ANCOVA (group x campus x age x gender) comparing participants versus non-participants; Salary: ANCOVA (group x campus x gender x age) comparing the salary of participants with the salary of non-participants.

[5] See Table 2 for a description of comparison groups for each program of study.
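Questions 11 and 12 turn on quarter-by-quarter arithmetic against wage records. The sketch below (Python with pandas; the file layout, column names, and the YYYYQ quarter encoding are illustrative assumptions, not the consortium's actual data format) flags entry into employment in the quarter after program exit and retention into the second and third quarters after exit:

    # Sketch for Questions 11-12: employment entry and retention by
    # quarter. File layout and column names are illustrative assumptions.
    import pandas as pd

    completers = pd.read_csv("completers.csv")  # columns: student_id, exit_quarter
    wages = pd.read_csv("wage_records.csv")     # columns: student_id, quarter
    # Quarters are coded as YYYYQ integers, e.g., 20134 = 2013 Q4.

    def add_quarters(q, n):
        """Advance a YYYYQ-coded quarter by n quarters."""
        year, quarter = divmod(q, 10)
        total = year * 4 + (quarter - 1) + n
        return (total // 4) * 10 + (total % 4) + 1

    # A student counts as employed in a quarter if any wage record exists.
    employed = set(zip(wages["student_id"], wages["quarter"]))

    def worked(student_id, quarter):
        return (student_id, quarter) in employed

    # Q11: employed in the quarter after the quarter of program exit.
    completers["entered_employment"] = completers.apply(
        lambda r: worked(r["student_id"], add_quarters(r["exit_quarter"], 1)),
        axis=1)

    # Q12: of those who entered employment, still employed in the second
    # and third quarters after program exit.
    completers["retained_employment"] = completers.apply(
        lambda r: bool(r["entered_employment"])
        and worked(r["student_id"], add_quarters(r["exit_quarter"], 2))
        and worked(r["student_id"], add_quarters(r["exit_quarter"], 3)),
        axis=1)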
Table 2
Comparison Groups by Program of Study

For each program component below, the entry lists the differences between the participant and comparison groups, the institutions whose students enrolled during the grant period form the Participant Group, and the institutions whose Spring 2013 cohort forms the Comparison Group.

Surgical Tech. (AAS)
Differences: N/A. This is a new program at all consortium schools; no comparison group is available.
Participant Group: Roane, Cleveland, Walters

Allied Health Sciences (AAS)
Differences: Enhancements to curriculum to make it available online to RX TN participants.
Participant Group: Roane, Volunteer, Cleveland, Chattanooga, Columbia, Southwest, Walters
Comparison Group: Roane, Volunteer, Cleveland, Columbia, Dyersburg, Nashville

Medical Informatics (AAS)
Differences: Enhancements to curriculum to make it available online to RX TN participants.
Participant Group: Roane, Volunteer
Comparison Group: Roane, Cleveland, Columbia, Dyersburg, Motlow, Northeast, Pellissippi, Southwest, Walters

LPN to RN Mobility
Differences: Participants will experience enhanced pathways from technology center LPN training to community college RN training. Participants exposed to more online and simulation components.
Participant Group: Roane, Cleveland, Walters, Motlow, Dyersburg
Comparison Group: Roane, Cleveland, Chattanooga

Occupational Therapy Asst.
Differences: Enhancements to curriculum to make it available online to RX TN participants.
Participant Group: Roane, Cleveland, Nashville
Comparison Group: Roane, Walters, Volunteer

Emergency Medical Dispatch-Public Safety
Differences: N/A. This is a new program at all consortium schools; no comparison group is available.
Participant Group: Roane, Volunteer, Dyersburg, Northeast, Walters; TTCs: Memphis, Nashville

E.C.G. Tech
Differences: Enhancements to curriculum to make it available online to RX TN participants.
Participant Group: Roane, Volunteer, Columbia, Dyersburg, Jackson, Northeast, Walters; TTCs: Memphis, Nashville, McMinnville, Murfreesboro
Comparison Group: Roane, Volunteer, Columbia, Dyersburg, Jackson, Walters; TTC-Nashville

Patient Care Tech
Differences: N/A. This is a new program at all consortium schools; no comparison group is available.
Participant Group: Roane, Volunteer, Columbia, Dyersburg, Jackson, Northeast; TTCs: Memphis, McMinnville, Murfreesboro

Phlebotomy Tech
Differences: Enhancements to curriculum to make it available online to RX TN participants.
Participant Group: Roane
Comparison Group: Roane, TTC-Nashville