HD FS 503: Conducting Outcomes Evaluations

Susan Hegland
April 15, 2002
Objectives
Participants will:
- Identify purposes for evaluation
- Differentiate between evaluation input, output, quality, and outcomes
- Identify output and outcome measures appropriate to program needs
- Plan appropriate output and outcome measures
Agenda
6:00  Why Evaluate
6:15  Identifying Types of Evaluation
6:30  Break
7:30  Developing a Logic Model
8:00  Linking Local Needs to Evaluation Outcomes
8:30  Planning Your Evaluation
8:40  Evaluation
8:50  Adjourn
If you do what you always did… you will get what you always got.
Kenneth W. Jenkins, President, Yonkers NY NAACP
Why evaluate?
- Record 2 reasons for conducting an outcomes evaluation
- Share
State Language Framework
What's the problem?
- Who lacks this knowledge, skill, or condition? (Indicator)
What did we do?
- How much did we put in? (Input)
- How much did we do? (Output)
- How well did we do it? (Output Quality, Efficiency, Satisfaction)
What difference did it make?
- Changes in program participants (Outcomes)
Key Outcomes Terms
The desired effect on participants: Increase, Improve, Expand, Decrease, Reduce, Maintain
For which participants? Parents, Children, Child Care Providers, Teachers, Neighborhoods
In what area? Goals, Attitudes, Skills, Knowledge, Condition, Behaviors
Exercise 1: Identifying Types of Evaluation Data
CCRR received 56 phone calls seeking infant-toddler care.
Is this statement an example of:
- Community Needs Assessment data (an indicator)?
- Input Data?
- Output Data?
- Quality Data?
- Short Term Outcome Data?
- Long Term Outcome Data?
Unpacking the Black Box
[Diagram] Program Need → Input → Output Quality → The Black Box → Outcomes
"What did we do?" covers Input and Output Quality; "What difference did it make?" covers Outcomes.
Exercise 2: The Logic Model
- State Result
- Local Indicator
- Local Need
- Input
- Output (Quantity)
- Output (Quality)
- Short Term Outcome
- Long Term Outcome
State Results
A. Healthy Children (Birth to 4)
B. Children Ready to Succeed in School
C. Safe & Supportive Communities
D. Safe & Nurturing Families
E. Secure & Nurturing Child Care Environments
Exercise 3: Linking Five Types of Evaluation to State Results
State Result: E
Local Indicator: % of 5's lacking motor, social, language skills
Local Need: Skills for preschool children
Input: $$$ for home visitors
Output: # home visits made to # families
Quality: % parents reporting satisfied
Outcome: % children at age level on ASK
Costs of Evaluation
- Low Cost Evaluations
- Moderate Cost Evaluations
- High Cost Evaluations
Low Cost Evaluations
- Utilize only program funds
- Goals:
  - Improve services
  - Document services provided
- Input and Output Quantities: program records
- Output Quality: participant satisfaction with services; supervisor evaluations of staff performance
- Outcomes: participant self-report on program benefits; staff ratings of participant progress
- CAUTION:
  - NOT appropriate for funding decisions
  - Pressure may affect validity of data collected
Moderate Cost Evaluations
Additional components:
- Multiple measures of:
  - Program Quality (Output)
  - Program Outcomes
- Some data not collected by the program provider
- Standardized outcome measures permit comparison with a norm-referenced group
- Outcome measures may be completed by only a random sample of participants
High Cost Evaluations
Additional components:
- Participants randomly assigned to treatment and control (comparison) groups
Selecting an outcome measure
- Collect from multiple sources
- Match content to goals logically expected from program activities:
  - Design your own measure
  - Use an already-created instrument
  - Adapt an existing measure
- Match reading content to participant reading skills
- Collect pre/post measures
  - Use specific wording to avoid decline in scores!
Caution: Don't Confuse Self-report Output & Outcome Measures from Participants
Output Quality (service satisfaction):
- "How would you rate each of the following services you received?"
  - Services on how to be more patient with my child
  - This service was: Helpful / Not helpful / Does not apply
Outcomes:
- "As a result of participating in XYZ, I read more books to my child."
  - Yes / No / Unsure
Protect Participant Confidentiality
- Train data collectors to keep all information about participants confidential
- Obtain informed consent from participants:
  - Purpose of evaluation
  - Assurance of confidentiality
  - Freedom to withdraw from the evaluation without jeopardizing services
- Replace names with ID numbers for pre-post measures
- Keep codes for ID numbers in a separate, locked file
Creating Your Evaluation Plan
- Source of evaluation information? (Program director, staff, program participants)
- What measures will be used? (Surveys, interviews, observations, questionnaires)
- Who will collect the evaluation information? (Program administrators, staff, outside consultant)
- When will the information be collected? (Ongoing, quarterly, annually)
- Who will summarize the information? (Program administrator, outside consultant)
- What will be the annual cost of this evaluation?
Exercise 4: Planning a Program Evaluation
For each of Input, Output, Quality, Short Term Outcomes, and Long Term Outcomes, plan:
- Source
- Measures
- Collector
- When?
- Who summarizes?
- Cost?
Additional Resources
- Theory-based evaluation:
  - Rossi, Freeman, & Lipsey: Evaluation: A Systematic Approach
  - Patton: Utilization-Focused Evaluation
  - Stufflebeam, Madaus, & Kellaghan: Evaluation Models: Viewpoints on Educational and Human Services Evaluation
- The Program Evaluation Standards (Sage):
  - Feasibility
  - Utility
  - Propriety
  - Accuracy
Next week:
- Qualitative II:
  - Interviewing
  - Data Analysis
- Evaluation II: Applying the FUPA standards (see Handbook, p. 13)