Levels of Evaluation for Grant Proposals

Babbi J. Winegarden, Ph.D.
Assistant Dean for Educational Development and Evaluation
UCSD School of Medicine
bwinegarden@ucsd.edu
Link Goals to Process (think proximal) and Outcomes (think distal)
A. Primary Outcomes: Think Program Implementation (often “count” data –
other terms used include “process” data)
a. Participation (Count Who, Demographics, Time Involved, # of trainings, #
of people trained)
i. Faculty, Students, Practitioners
ii. Professions Represented
iii. Participant Contact Hours
iv. Institutions Represented
v. Geographic Regions
vi. Underserved Areas Addressed
b. Examples
i. Process Measures
1. Minutes, meetings – adherence to timelines
2. Participant Profiles – tracker data (collected at the beginning of a
session to provide information that goes back to HRSA (or
whatever agency your grant is from) – what kind of people
you are drawing into your training
a. Credentials
b. Emails
c. Addresses
d. Title (Dr., Ms., etc.)
e. Last four digits of SSN
f. Age group
g. Gender
h. Ethnicity
i. Advanced degree
j. Health care discipline (allied, primary care,
allopathic, osteopathic, nursing)
k. Role (administrator, faculty, practitioner)
l. Populations you serve
i. Ethnicity
ii. Age
ii. Program Evaluation
1. Numbers/Counts (participants, materials developed,
frequency, length and type of trainings offered)
2. Content Areas (What was taught?)
3. Formats (How was it taught?)
4. User Satisfaction (Surveys)
iii. Barriers to Implementation (What challenges did you face and how
did you address them?)
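As a rough illustration of the count-style process data above, participation tallies can be produced with a few lines of code. The field names and sample records here are hypothetical, not an actual agency reporting format:

```python
# Hypothetical sketch: tallying "count" (process) data from participant
# tracker records. Field names and values are illustrative only.
from collections import Counter

participants = [
    {"discipline": "nursing", "region": "urban", "contact_hours": 4},
    {"discipline": "primary care", "region": "rural", "contact_hours": 4},
    {"discipline": "nursing", "region": "rural", "contact_hours": 2},
]

by_discipline = Counter(p["discipline"] for p in participants)
total_contact_hours = sum(p["contact_hours"] for p in participants)

print(by_discipline)        # professions represented, with counts
print(total_contact_hours)  # total participant contact hours
```

The same pattern extends to any tracker field (region, role, age group) by changing the key passed to `Counter`.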
B. Secondary Outcomes: Think Change in State or Awareness (knowledge,
skills, attitudes, and/or abilities generated in the individual as a result of being
exposed to the training). Other terms used include “Intermediate Outcomes” –
think change in KSAs (Knowledge, Skills, Attitudes)
a. Learner Evaluation (Did they learn something? KSA’s)
i. Use pre-post testing
1. Assess learning both right after the experience and again
after a longer period of time…make sure you have a
baseline for everything you assess.
2. Or: how much do you know on the pre-test, and what do you
know after the training?
a. Mini survey when participants come in: what do they
know, years in the field
b. Post-test is the action plan (see below)
3. Evaluation of skills related to population of interest
4. Evaluation of Knowledge and Attitudes related to
population of interest
ii. Baseline – make sure you have a baseline for most things
iii. Short-term assessments – right after training
iv. Long-term assessments – after time has elapsed (often 6-12
months)…see if KSA change stayed
1. Do you need more booster sessions?
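A minimal sketch of the pre-post comparison described above, assuming simple numeric scores (the values are illustrative); a real evaluation would pair each learner's pre and post responses and repeat the measure at the long-term follow-up:

```python
# Hypothetical sketch: comparing baseline (pre) and short-term (post)
# scores to estimate change in knowledge. Scores are illustrative.
from statistics import mean

pre = [2.1, 3.0, 2.5, 2.8]    # baseline scores (e.g., 1-5 Likert self-ratings)
post = [3.8, 4.2, 3.9, 4.5]   # scores right after the training

# Per-learner gain: pre and post lists are assumed to be paired in order.
gains = [b - a for a, b in zip(pre, post)]
print(round(mean(gains), 2))  # average short-term gain per learner -> 1.5
```

Running the same comparison again at 6–12 months shows whether the KSA change stayed, and whether booster sessions are needed.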
b. Examples
i. Action plans – “I am going to go out and do X.”
ii. Curricular Projects – what was done/created…shows the learning
from the training
iii. Clinical Teaching Improvements – plan to change this or that
about how they teach or do research etc.
iv. Benchmarking…getting a baseline and figuring out the criterion you
want to shoot for…always has a criterion
1. How many patients are exposed to this practice? Right now
only doing it with 10%; the standard in the field is 50%, so in 6
months you want to reach 50% of patients with the change in
practice
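The benchmarking arithmetic in the example above (10% baseline, 50% field-standard criterion, 6 months) can be sketched as a set of interim targets. The linear pacing is an assumption for illustration; any monotone trajectory toward the criterion would serve:

```python
# Hypothetical sketch of the benchmarking example: 10% baseline,
# 50% field-standard criterion, reached over 6 months. Linear interim
# targets are an illustrative assumption, not part of the source example.
baseline, criterion, months = 0.10, 0.50, 6

monthly_targets = [
    round(baseline + (criterion - baseline) * m / months, 3)
    for m in range(1, months + 1)
]
print(monthly_targets)  # interim benchmarks to track progress against
```

Comparing actual monthly practice rates against these targets shows early whether the program is on pace to meet its criterion.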
v. Can use: Quantitative and Qualitative methods
1. Quantitative
a. Likert 1-5 survey on what they learned
2. Qualitative
a. Action plan
3. Focus Groups/Debriefings/Journaling/Interviews
a. What did you learn?
b. Tell us what resonated so we can do it again
vi. Competency/Skill Evaluation of Participants –can they do what
you taught them? Doesn’t mean a change in practice
though…that would be tertiary.
1. Direct Observation (checklists)
2. OSCEs
3. Simulated or Standardized Patients
vii. Case Study Tests
viii. Academic records/tracking
1. Competency
ix. Peer review of products
1. Peers rate each other on presentations, knowledge
x. Components of Personal Action Plan
1. Learner directed
2. Apply what you learned
3. Make a statement of intent
4. Outline objectives
5. Strategies to overcome obstacles
6. How will this improve your program
7. Follow-up
a. Did they accomplish the plan?
i. Add new content
ii. Start using new assessment tool (skills)
iii. Do lecture or preceptorship
b. Obstacles
C. Tertiary Outcomes: Think: Did I Accomplish My Goal(s)? How Do I Show
That? (Direct impact – e.g., the effect of program efforts on patient health status
outcomes. Other terms used include “Summative Outcome Data,” often meaning “impact.”)
Can also do tertiary outcomes for practitioners. Whatever your goal is…did it
happen?
a. Patient – tertiary
i. Change in Health Outcomes
ii. Monitoring of Health Status of Older Adults/Patients
1. Regional/State/National
2. Geographical area – Duke University has a portal on
geriatrics
iii. Longitudinal Trends
1. Change over time – e.g., 3 years of data
b. Examples
i. Measures of improvement in patient health
1. Change in practice for practitioner
a. Staff develop checklists on wound healing; decrease
in the amount of time wounds take to heal
ii. Partnerships/collaborations for data collection
1. Clinically relevant data collected at project sites
2. Quality assurance, Informatics
3. Patient Health Indicators (falls, medication errors, etc.)
iii. Chart Reviews
iv. Health Literacy
1. Access/Utilization
2. Patient-provider relationships - improvement
v. Patient Surveys
1. Quality of life, health status
2. Relevant Patient satisfaction reporting tool
3. Phone follow-up, interviews
vi. Case Studies
1. One individual
vii. Anecdotal Reporting
1. Narratives from the field
2. Most of the clinics that we work with report an
improvement – qualitative, not measured
c. Practitioner Examples-tertiary
i. Change in Trained Provider Pool
ii. Identification of Health Professionals in the Region
iii. Comparisons with Health Professionals Trained
1. Comparing people who were trained with your program and
those who were not – very hard to do
iv. Longitudinal Trends in Health Care Professions
1. Chart people trained over the last 3 years – how it has
affected change in practice
v. Placement Tracking of Trainees
1. Where did they go
2. What additional populations do they now serve?
vi. Change in Service Quality, Access, Availability
1. Adherence to Quality Standards
a. Change in practice
i. Chart review
ii. Policies of institution-compliance with federal
guidelines
vii. Longitudinal Analyses of Service Availability
d. Institutional Change-tertiary
i. If practitioners go back and change practice which changes policy
ii. Program Elements Incorporated into Ongoing Care
Special thanks to Julianne Manchester, Ph.D., for ongoing discussions regarding primary,
secondary, and tertiary outcome data.
References
Bachrach, P., McCreath, H., Murray, P., & Patrick, L. (2008, February). NTACC:
National Training and Coordination Collaborative for Geriatric Education Centers.
Paper presented at the Association for Gerontology in Higher Education Annual
Meeting, Baltimore, MD.
Frank, J. C., & Bachrach, P. S. (2008, February). Secondary outcomes. Paper
presented at the Association for Gerontology in Higher Education Annual Meeting,
Baltimore, MD.
McCreath, H., & Patrick, L. (2008, February). Primary outcomes for GECs. Paper
presented at the Association for Gerontology in Higher Education Annual Meeting,
Baltimore, MD.
Murray, P. (2008, February). Tertiary outcomes: Models, examples, and discussion.
Paper presented at the Association for Gerontology in Higher Education Annual
Meeting, Baltimore, MD.
Perweiler, E. (2008, February). GECs and the evaluation mandate. Paper presented
at the Association for Gerontology in Higher Education Annual Meeting,
Baltimore, MD.
03-27-08