Evaluation 101
Laura Pejsa
Goff Pejsa & Associates
MESI 2014
Objectives
• Gain a greater understanding of evaluation and evaluative thinking
• Learn about some practical approaches & get familiar with some tools to use
• Have an opportunity to apply your learning directly to a real-world case
Session Outline
• Introductions / Intro to the day
• Grounding definitions & terms
• Understanding “programs” (purpose & logic)
• Evaluative thinking and the evaluation process
• Strategies for making evaluation desirable & usable
• Debrief, questions, & close
Metaphors: Your Ideas about Evaluation
• Think of one object that represents your ideas and/or feelings about evaluation
• Prepare to explain your choice
• Share yours with the person sitting next to you and notice common themes
• Prepare to share your common themes with the group
E-VALU-ation
"Value" is the root word of evaluation
Evaluation involves making value judgments, according to many in the field.
Traditional definition: Michael Scriven
(from Michael Scriven, 1967, and the Program Evaluation Standards)
“The systematic determination of the merit, worth (or value) of an object”
Important concepts in this definition
• SYSTEMATIC means that evaluators use explicit rules and procedures to make determinations
• MERIT is the absolute or intrinsic value of an object
• WORTH is the relative or extrinsic value of an object in a given context
An Alternative Definition: Michael Quinn Patton
Systematic collection of information about the activities, characteristics, and results of programs to (1) make judgments about the program, (2) improve or further develop program effectiveness, (3) inform decisions, and/or (4) increase understanding.
Done for and with specific intended primary users for specific, intended uses.
Commonalities among definitions
• Evaluation is a systematic process
• Evaluation involves collecting data
• Evaluation is a process for enhancing knowledge and decision making
• Evaluation use is implicit or explicit
Russ-Eft & Preskill (2009, p. 4)
Discussion: Why Do Evaluation?
• What are the things we might gain from engaging in evaluation/an evaluative process?
• Why is it in our interest to do it?
• Why is it in the interest of the people we serve to do it?
• What are the benefits?
From the textbooks… evaluation purposes
• Accreditation
• Accountability
• Goal attainment
• Consumer protection
• Needs assessment
• Object improvement
• Understanding or support
• Social change
• Decision making
One basic distinction… Internal vs. External
INTERNAL evaluation
• Conducted by program employees
• Plus side: Knowledge of the program
• Minus side: Potential bias and influence
EXTERNAL evaluation
• Conducted by outsiders, often for a fee
• Plus side: Less visible bias
• Minus side: Outsiders have to gain entrée and have less first-hand knowledge of the program
Scriven's classic terms
FORMATIVE evaluation
• Conducted during the development or delivery of a program
• Feedback for program improvement
Scriven's classic terms
SUMMATIVE evaluation
• Typically done at the end of a project or project period
• Often done for other users or for accountability purposes
A new(er) term from Patton
DEVELOPMENTAL evaluation
• Helps develop a program or intervention
• Evaluators are part of the program design team
• Uses systematically collected data
What is the evaluation process?
Every evaluation shares similar procedures
Patton’s Basics of Evaluation:
• What?
• So what?
• Now what?
General Phases of evaluation planning
Phase | Phase Name         | Question
I     | Object description | What are we evaluating?
II    | Context analysis   | 1. Why are we doing an evaluation? 2. What do we hope to learn?
III   | Evaluation plan    | How will we conduct the study?
What?
• Words?
• Pictures?
The key is understanding…
“We build the road, and the road builds us.”
- Sri Lankan saying
A word about logic models and theories of change…
One way to understand a program.
Simplest form of a logic model
INPUTS → OUTPUTS → OUTCOMES
Results-oriented planning
A bit more detail…
INPUTS: Program investments (what we invest)
→ OUTPUTS: Activities (what we do); Participation (who we reach)
→ OUTCOMES: Short-, medium-, and long-term (what results?)
SO WHAT? What is the VALUE?
A simplistic example…
[logic model diagram: Inputs → Outputs → Outcomes (short-term)]
What does a logic model look like?
Regardless of format, what do logic models and theories of change have in common?
• They show activities linked to outcomes
• They show relationships/connections that make sense (are logical)
• Arrows are used to show the connections (the “if-then” relationships)
• They are (hopefully) understandable
• They do not and cannot explain everything about a program!
The Case
The Case: Logic and/or Theory
Draw a Picture…
• Inputs (what goes in to the program to make it possible?)
• Outputs (Activities: what do they do? Participation: counts)
• Outcomes (what do they think will happen?)
• Short, medium, and long term
What can we evaluate?
• Context
• Input(s)
• Process(es)
• Product(s)
Daniel Stufflebeam (the CIPP model)
The basic inquiry tasks (BIT)
1. Framing questions
2. Determining an appropriate design
3. Identifying a sample
4. Collecting data
5. Analyzing data and presenting results
6. Interpreting results
7. “Reporting”
Back to the Case: What are our questions?
Evaluation Question
#1
#2
#3
Back to the Case: What do we need to know, and where can we find it?
Evaluation Question | Information Needed | Information Source
#1
#2
#3
Possible ways to collect data
Quantitative:
o Surveys
o Participant assessments
o Cost-benefit analysis
o Statistical analysis of existing program data
o Some kinds of record and document review
Qualitative:
o Focus groups
o Interviews
o Observations
o Appreciative inquiry
o Some kinds of record and document review
What are the best methods for your evaluation?
It all goes back to your question(s)…
• Some data collection methods are better than others at answering your questions
• Some tools are more appropriate for the audience you need to collect information from or report findings to
• Each method of collecting data has its advantages and disadvantages (e.g., cost, availability of information, expertise required)
Back to the Case: How will we find out?
Evaluation Question | Information Needed | Information Source | Methods
#1
#2
#3
Reminder: Importance of Context
Desire & Use
• How do we make this process palatable, even desirable?
• What can we do to make information USE more likely?
• Ways of sharing and reporting
Debrief & Questions
• What are the most important take-aways from today’s session?
• What can you apply in your own work?
• What questions remain for you?