Understanding Outcome Evaluation

What to Expect From This Series
1. Learn about steps required to implement outcome evaluation
2. Design a program logic model
3. Identify appropriate evaluation/measurement tools
4. Develop an implementation plan
Understanding Outcome Evaluation: Core Concepts and Building the Foundations
Session #1/3
Purpose of Today’s Session
• To gain a basic understanding of the core concepts and basic steps in outcome measurement in particular, and program evaluation in general.
• To share strategies for building buy-in to outcome measurement.
• To begin work on a program logic model for a program of your choice.
Today’s Agenda
• Outcome evaluation language
• Outcome evaluation steps
• Break
• Outcome evaluation steps
• Lunch
• Developing logic models
Our Approach to Outcome Measurement
Discussion
Imagine…
Our Approach to Outcome Measurement
• Evaluation is already going on, informally, in any healthy organization and can be harnessed.
• Evaluation is a natural and necessary part of organizational development, not an “add-on.”
• Evaluation strategies that help one organization may be confusing or even damaging to others: there is no “cookie cutter” approach.
Our Approach to Outcome Measurement
• Developing common language within and between organizations is key.
• Organizations get better at measuring outcomes when they have practice, and when they identify measurement priorities themselves.
• Evaluation is about organizing, interpreting and acting on information, not about gathering mountains of data.
(Diagram: a cycle linking Improved Outcome Measurement, Clear Evidence of Impact, Useful Insights and New Questions, and Improved Practice)
A Few Key Definitions
Community-based research refers to a variety of efforts to apply research tools and strategies to the task of building communities. It can take many forms, including needs assessments, environmental scans, and more.
Program evaluation is one form of community-based research. It focuses on the study of community interventions: their processes, outcomes, and context.
Outcome evaluation is one type of program evaluation, in which: 1. outcome objectives for activities are identified, 2. success in reaching objectives is measured, 3. best practices are identified, and 4. recommendations for improvement result.
Outcome measurement is a process in which: 1. outcome objectives are identified and 2. success in reaching objectives is measured.
The Dilemma of Terminology
When talking about… evaluators and funders use terms like…
• The process of measuring and learning from outcomes: Stakeholder Consultation, Participatory Approach, Utilization Plan, Lessons Learned, Implementation, Process, Evaluability
• The intent of an intervention: Outcome Objectives, Implementation Objectives, Goals, Outcomes, Targets, Intended Results
• The changes caused by an intervention: (Short- and Long-Term) Outcome Objectives, Goals, Impacts, Results
• The information used to measure change: Inputs, Outputs, (Performance) Indicators, Targets, Benchmarks, Data Sources
Outcome Measurement: The Basic Steps
The Seven Practical Steps in Outcome Measurement
(A cycle moving through four phases: Laying the Foundations, Evaluation Planning, Gathering & Analyzing Information, and Acting on Findings)
1. Get Buy-In
2. Clarify Theory
3. Develop Evaluation Purpose & Evaluation Questions
4. Design Methods and Measurement Plans
5. Analysis Plan
6. Communicate Results
7. Act on Results
Data Doesn’t Change the World… People Do.
Evaluation Requires an Ongoing Focus on Relationship Building
The Role of Collaboration and Teamwork in Community Based Research
(Diagram: the community-based research cycle)
• Situating Yourself: Clarifying Values & Beliefs, Reviewing Resources
• Deciding & Acting Together Using Knowledge
• Clarifying the Issues and Identifying Research Questions
• Building & Maintaining Relationships for Action
• Gathering, Sharing, and Organizing Knowledge
Step 1:
Building Stakeholder Involvement
from the Beginning
Fostering Buy-in/Stakeholder Participation: How?
• Have all those who have a stake in the project and its evaluation been identified? Is there support for the idea of evaluation?
• Are there other ways that stakeholders could be included in the early planning phase and throughout the evaluation?
• Has the political context of relationships among stakeholders been considered?
• Have the interests of outside audiences been identified and considered?
Creative Ways to
Involve Stakeholders
• Helping to build logic models
• “Teaching” the evaluators
• As surveyors or interviewers
• As pilot testers for data collection tools
• As note-takers at focus groups
• As co-presenters of findings
• As participants in a “data analysis forum”
• OTHERS?
Discussion
• The first meeting of your evaluation committee
• In the context of measuring outcomes, when do you run into stakeholder conflict?
BREAK
Step 2:
Clarifying Your
Theory of Change
What is a Theory of Change?
• A way of clearly explaining why you do what you do
• An explanation that focuses on exactly how your work will lead to positive change
Discussion
Which one of the following theory statements best describes program impact?
We have implemented a new type of women’s shelter…
– because a local report identified the need for more shelter beds locally.
– so that our organization can provide shelter for 20% more women.
– that is at a different location, more accessible to younger women.
– so that young women will be more likely to report abuse and seek support.
– in order to identify instances of abusive behaviour earlier in relationships and increase the chances of breaking the cycle of abuse.
Components of a Basic Program Logic Model
• Activities explain the concrete things you do within your program
• Short-Term Outcome Objectives identify the fairly immediate changes you expect to see as a direct result of your work
• Long-Term Outcome Objectives identify the more distant benefits or changes you hope to contribute to by achieving your short-term outcomes
• Goals are the longest-term, broad vision of your program
The Really Useful Idea
(articulating your theory of change)
Identify short-term outcome objectives that are within your control, measurable and achievable, and use a logic model to explain how these short-term achievements contribute to the achievement of long-term outcomes over time.
Activity: Refer clients to Job Leaders → Short-term Outcome: Improve clients’ knowledge of job opportunities → Long-term Outcome: Improve clients’ employment networks
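For those who like to keep their draft models in electronic form, the chain above can also be written down as plain data. The sketch below is a minimal illustration in Python, not part of the workshop materials; the dictionary structure is just one possible way to record a logic model, and the entries are the ones from the job-program example.

```python
# A minimal sketch of the example logic model as plain Python data.
# The names come from the job-program example above; the structure is illustrative only.
logic_model = {
    "activities": ["Refer clients to Job Leaders"],
    "short_term_outcomes": ["Improve clients' knowledge of job opportunities"],
    "long_term_outcomes": ["Improve clients' employment networks"],
    # Each link records part of the theory of change: "we expect X to contribute to Y".
    "links": [
        ("Refer clients to Job Leaders", "Improve clients' knowledge of job opportunities"),
        ("Improve clients' knowledge of job opportunities", "Improve clients' employment networks"),
    ],
}

# Print the chain as Activity -> Short-term outcome -> Long-term outcome.
for source, target in logic_model["links"]:
    print(f"{source} -> {target}")
```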
How Do Logic Models Help You to Measure Outcomes?
• They help “unpack” complex, important long-term outcomes into short-term outcomes that can more easily be measured
• They help you sort out exactly how and why you expect outcomes to occur, so you know where to look for evidence of success
• They help you distinguish actual “measurement problems” from the much more common “theory problems”
Components of a Basic Program Logic Model
(After Rush & Ogborn (1991), CJPE)
Activity Clusters
– The concrete things you do within your program
– The major categories or clusters which describe the activities that make up your work
– Clusters should be composed of activities that are similar at a behavioural level, not activities with similar purposes (e.g., cooking in the kitchen and cooking at the barbecue are similar activities, but cooking dinner and buying the groceries for dinner are not).
Components of a
Basic Program Logic Model
Outcome Objectives
– Measurable in principle
– Describe a meaningful change
– Need not specify how much change will occur or what the target level is
– Specify a particular population and situation
– Clearly explained and theoretically defensible
– Focused on active rather than neutral or passive change (i.e., “improved” or “reduced” vs. “supported”, “promoted”, or “encouraged”)
Example: Increased feelings of social support among participating parents
Components of a
Basic Program Logic Model
Short-Term Outcome Objectives
– The immediate benefits or changes that the target groups are anticipated to experience or display as a result of the program activities.
– It should be possible to identify a clear, direct intended causal link between at least one program activity and each identified short-term outcome objective.
Components of a
Basic Program Logic Model
Long-Term Outcome Objectives
– The more distant benefits or changes that the target groups are anticipated to experience or display as a result of the initiative.
– Generally, long-term outcome objectives are the second-order changes that result from successful achievement of short-term outcomes over time.
(Activity → short-term outcome → long-term outcome) Refer clients to Job Leaders → Improve clients’ knowledge of job opportunities → Improve clients’ employment networks
Example of a well-done logic model for a program with individualized outcomes
(Diagram with columns for activities, short-term outcome objectives, and long-term outcome objectives)
• Activities: develop and support clients’ individual rehab plans; provide clinical support and interventions; assess and develop life skills
• Short- and long-term outcome objectives: clients become aware of and/or are connected to community resources; clients have improved informal supports; clients are able to find and/or maintain accommodation of their choice; clients have longer community tenure; clients maintain or improve their level of function; clients have increased confidence in their ability to live independently; client becomes a more productive, independent member of their community; clients gain insight into healthier lifestyles/choices; reduced number of hospitalizations or shortened lengths of stay
Example of a well-done logic model for a crisis-oriented program
(Diagram with columns for activities, short-term outcome objectives, and long-term outcome objectives)
• Activity: weekly self-help group for women
• Short- and long-term outcome objectives: women develop friendships in the group; women support each other outside of the group; women feel supported in the decisions they make; women are able to evaluate the relationship with their partner and make decisions about the relationship; partner is involved as part of family decision making; women access resources; family members reconstruct a support system and maintain family unity; women feel less fragile and are not dealing with daily crises; women articulate expectations of their partners
Working on Logic Models
2-4 pm
Exercise 1: Describing your Activities
• Think about what your staff and/or volunteers do on a day-to-day basis, in as concrete terms as possible.
• Focus on activities that involve contact with clients or the community. Activities that should be in your model are those that you expect will impact individuals or communities in some positive way.
• Staff are most often “doing things”, like “providing”, “teaching”, “raising awareness”, “creating”, etc., so try to use these types of active verbs. List as many activities as you can, and try to be as honest as you can about what you actually do.
• Try to “cluster” your activities if they seem to involve doing similar things.
Exercise 1: Describing your Activities
• Avoid putting “outcome” language into your activities. For example, avoid terms like “prevent”, “increase”, or “improve”, because these are changes that are a consequence of the activities. Identify the things your program does that lead to prevention, increases, or improvements.
• Avoid “double-barreled” activities (e.g., “provide education and personal support”). This is especially important if they might lead to different outcomes. Separate them if possible.
• While administrative tasks are important (e.g., team meetings, hiring functions, etc.), you can usually omit them from logic models because they do not tend to directly impact outcome objectives. You can leave these out and add them later if it helps to clarify your program.
Exercise 2: Identifying the Outcome Objectives for your Program
• Develop a list of the outcome objectives your program is intended to achieve. Include as many outcomes as you can. You might also think about your mission, your marketing materials, grant proposals, and your own reasons for doing your job.
• Focus on how the program makes a difference or change in your clients or the community. For example, if an outcome objective of the program is to give people someone to talk to, ask yourself how this may help an individual.
• Try to write your outcome objectives in a way that refers to change that can be measured.
• Think about sequence. Logically, some outcome objectives will necessarily precede others. Try to arrange them temporally from top to bottom.
Exercise 3: Putting it all Together
• Using a pencil, begin to draw the connections between your activities and your short-term outcome objectives, and between your STOs and your long-term outcome objectives. Keep in mind the following (a sketch of these checks follows this exercise):
– All STOs should be linked to at least one activity; all activities should be linked to at least one STO.
– All LTOs should be linked to at least one STO; all STOs should be linked to at least one LTO.
– Pick an arrow that you think might have questionable logic or weak assumptions and make it “dashed”.
– Pick an arrow that you think has strong logic and well-supported assumptions and put a check mark beside it.
• Think about logical causes. If a group support intervention is supposed to lead to greater feelings of social support, think critically about why and how that happens. Does it make sense? In some cases, this analysis might lead you to add, delete, or modify activities and outcome objectives.
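If your draft model ends up in electronic form, the linkage rules above can also be checked mechanically. The sketch below is a hypothetical illustration only: the activities, outcomes, links, and helper functions are invented placeholders rather than part of any workshop tool. In this example, one short-term outcome has deliberately been left without a long-term link so the check has something to flag.

```python
# Sketch: checking the linkage rules from Exercise 3 on a draft logic model.
# All entries are hypothetical placeholders; replace them with your own model.
activities = {"Provide education sessions", "Provide personal support"}
short_term = {"Increased knowledge of healthy practices", "Increased feelings of social support"}
long_term = {"Improved family well-being"}

# Each link is (from, to): activity -> STO, or STO -> LTO.
links = [
    ("Provide education sessions", "Increased knowledge of healthy practices"),
    ("Provide personal support", "Increased feelings of social support"),
    ("Increased feelings of social support", "Improved family well-being"),
]

def linked_from(items, links):
    """Return the items (from the given set) that have at least one incoming link."""
    return {to for _, to in links if to in items}

def linked_to(items, links):
    """Return the items (from the given set) that have at least one outgoing link."""
    return {frm for frm, _ in links if frm in items}

# Rule: every STO is linked to at least one activity, and vice versa.
print("STOs with no activity:", short_term - linked_from(short_term, links))
print("Activities with no STO:", activities - linked_to(activities, links))
# Rule: every LTO is linked to at least one STO, and vice versa.
print("LTOs with no STO:", long_term - linked_from(long_term, links))
print("STOs with no LTO:", short_term - linked_to(short_term, links))
```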
Using the
On-Line
Logic Model Design Tool
What to Expect at Our Next Session
• Respond to questions regarding finishing up logic models
• Identify indicators
• Start thinking about measurement tools
Upcoming Workshops
• Workshop 2 focuses on refining logic models and identifying measurement implications
• Workshop 3 focuses on translating measurement plans into action
Thank You!
Step 3:
Developing Your
Evaluation Purpose &
Evaluation Questions
What do you really want to know?
What kinds of questions can evaluation really answer for you?
Breaking Down the Complexity
Did our program have an impact?
• What were we trying to change? (Long-Term Outcome Objectives)
• What particular contribution were we going to make to that change? (Short-Term Outcome Objectives)
• How were we planning to make that contribution? (Activities)
• What sorts of things would we see if the expected change was happening? (Indicators)
• How can we document those observations in a systematic way? (Methods)
The Evaluation Purpose Statement
• A purpose statement should specify exactly what the goals of the research project are, and what you intend to investigate in order to achieve these goals.
A Good Purpose Statement Explains:
• What we are evaluating (and what we are NOT evaluating)
• Why we are evaluating
• What we will do with the results
Defining Evaluation Questions
• The empirical questions to which your evaluation will generate answers
• Usually about 5 to 7 in number, except in very complex evaluations
• The things that will keep you focused as your evaluation project unfolds
Clarifying the Levels of Questions
Evaluation Purpose: To determine how effective our program is at helping people find meaningful jobs.
Evaluation Questions: How many of our clients find jobs? How satisfied are clients with their jobs? What parts of our program seem to help the most?
Actual Interview or Survey Items: For clients: “How much do you enjoy your job?” For staff: “How satisfied does your client seem?”
Ways to Clarify Evaluation Questions
• Focus on the action-oriented and solution-focused (What can you do with the answers?)
• Review what knowledge your group already has.
• Whose knowledge is valued by decision-makers? Whose language is being used?
• What are the gaps in knowledge?
• What are the empirical questions within this issue? What extra value can someone with research expertise bring to the table?
WRAPPING UP STEPS 1-3
You’ve got stakeholder buy-in, you’ve articulated your theory of change, and you’ve figured out your evaluation purpose and the evaluation questions you want to answer…
Are You Really Ready to Do
Outcome Measurement?
• This is the point at which you need to make a decision about whether you have stakeholder buy-in. Outcome measurement will not be successful without buy-in.
(Decision flowchart) Starting point: you want to do more outcome measurement.
• Is outcome measurement the next step? If NO: needs assessment, program design, planning. If YES, continue.
• Is there buy-in? If NO: awareness raising & team building; recruitment of additional people. If YES, continue:
1. Develop logic models & evaluability assessment
2. Develop purpose and questions
3. Identify indicators
4. Bring together stakeholders
5. Agree on purpose and questions, set roles, talk about use of findings
6. Develop workplan
Step 4:
Developing
Measurement Plans
What are Indicators?
• Bits of information that provide part of the answer to one of your questions
• Things you can see or touch or hear – things that are observable in the world and don’t involve ‘interpretation’
• Often numbers, but can also be (e.g.) stories, quotations, examples, pictures
Indicators are a Useful Idea Because…
• They help to break down the complex task of “program evaluation” or “outcome measurement” into manageable chunks
• They help others to understand what you mean in practical terms when you talk about a particular outcome
• They help you build up a strong evaluation plan by combining different kinds of information from several sources
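As one concrete, purely illustrative way of keeping track of this, each indicator can be recorded alongside the outcome it speaks to and the source it will come from. The outcome, indicators, and sources in the sketch below are invented for the employment example; the point is simply that several kinds of information can back a single outcome.

```python
# Sketch: recording indicators for one outcome objective, each with its data source.
# All entries are illustrative; the idea is to combine several kinds of evidence per outcome.
from collections import defaultdict

indicators = [
    {"outcome": "Improve clients' knowledge of job opportunities",
     "indicator": "Number of job leads clients can name in a follow-up interview",
     "source": "client interviews"},
    {"outcome": "Improve clients' knowledge of job opportunities",
     "indicator": "Clients' own stories about how they heard of the job they applied for",
     "source": "retrospective stories"},
    {"outcome": "Improve clients' knowledge of job opportunities",
     "indicator": "Staff ratings of clients' awareness of local openings",
     "source": "worker report"},
]

# Group indicators by outcome to see how many independent sources back each one.
by_outcome = defaultdict(list)
for row in indicators:
    by_outcome[row["outcome"]].append(row["source"])

for outcome, sources in by_outcome.items():
    print(f"{outcome}: {len(sources)} indicators from sources {sorted(set(sources))}")
```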
The Notion of Triangulation
Explaining the Analogy
• When you ask an evaluation question, you’re like a navigator on a ship asking “where am I?”
• Progress indicators are like the various compass bearings a navigator takes. The more “diverse” the readings, the better!
• If all the bearings suggest you’re in the same spot, you know you’ve done a good job of measuring.
Explaining the Analogy
• No single indicator, regardless of how accurate, can tell you where you are!
• Even if the indicators don’t converge or triangulate, you know more than you did, and you know something about how much “error” is in your measurement technique.
Discussion
Indicator Exercise
Something to Think About Over Lunch…
• What makes a good lunch “good”?
• How would you find out whether this lunch that we are about to have was successful?
• How would you prove to someone who wasn’t at our lunch that it was (or was not) successful?
LUNCH
Choosing Information Gathering Methods
Pros and Cons of Popular Measurement Techniques
(Ordered from more practical / less rigorous to less practical / more rigorous)
• Retrospective reflections or stories
• Self-report (interview or survey)
• Peer/family/worker report
• Direct observation
• File review
• Clinical assessments
Step 5:
Analyzing Your Data
Data Analysis
• Analysis is a process of bringing order to data, organizing what’s there in patterns and categories
• Interpretation involves attaching meaning and significance, explaining descriptive patterns, looking for relationships among descriptive dimensions
Suggested Steps in Data Analysis
1. Organize Data
2. Review Original Questions
3. Summarize and Code
4. Generate Themes
5. Begin Writing
6. Provide and Receive Feedback
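To make steps 3 and 4 a little more concrete, here is a small, purely illustrative sketch: each interview note carries the codes assigned to it, and tallying the codes shows which candidate themes recur. The notes and code labels are invented; real qualitative coding is of course a more interpretive process than a simple count.

```python
# Sketch: a tiny illustration of steps 3-4 (summarize and code, generate themes).
# The notes and codes are invented for illustration.
from collections import Counter

coded_notes = [
    {"note": "Participant said the group made her feel less alone.", "codes": ["social support"]},
    {"note": "Found two job leads through another group member.", "codes": ["job leads", "peer network"]},
    {"note": "Still anxious about interviews despite the workshop.", "codes": ["confidence", "job leads"]},
]

# Count how often each code appears across all notes; frequent codes suggest candidate themes.
theme_counts = Counter(code for entry in coded_notes for code in entry["codes"])
for code, count in theme_counts.most_common():
    print(f"{code}: {count}")
```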
WRAPPING UP STEPS 1-5
The Evaluation Plan
Evaluation Methods Planning Chart
Column headings (and what goes in each):
• Evaluation Questions: may refer to success in meeting objectives (from your logic model) or other questions stakeholders feel the evaluation should address.
• Outcome Objectives: match up with Column #2 of Worksheet #2.
• Relevant Indicators: things you can observe that will help you to answer this question.
• Where will we get the information? Who will you speak to? How would you access existing information?
• What, specifically, will we ask? How will the questions be worded?
• What data collection tools will we use? Given what is written in the columns to the left, what method is most efficient and effective?
• Data Analysis: how are you going to make sense of the data you collect?
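If the chart is kept electronically, each row can be stored as one record with a field per column. The row below is hypothetical (the question, indicator, and methods are invented for the employment example), and the field names simply mirror the column headings above; this is a sketch of one possible layout, not a prescribed format.

```python
# Sketch: one row of the Evaluation Methods Planning Chart as a plain record.
# Field names mirror the column headings; the values are invented for illustration.
planning_row = {
    "evaluation_question": "How many of our clients find jobs?",
    "outcome_objective": "Improve clients' knowledge of job opportunities",
    "relevant_indicators": ["Number of clients reporting a new job at three-month follow-up"],
    "information_source": "Follow-up phone calls to clients",
    "specific_questions": ["Are you currently working? If so, how did you find the job?"],
    "data_collection_tools": "Short structured phone interview",
    "data_analysis": "Count responses and compare against last year's intake records",
}

for column, value in planning_row.items():
    print(f"{column}: {value}")
```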
The Seven Practical Steps in Outcome Measurement
(A cycle moving through four phases: Laying the Foundations, Evaluation Planning, Gathering & Analyzing Information, and Acting on Findings)
1. Get Buy-In
2. Clarify Theory
3. Develop Evaluation Purpose & Evaluation Questions
4. Design Methods and Measurement Plans
5. Analysis Plan
6. Communicate Results
7. Act on Results
Steps 6 & 7:
Communicating and
Using Results
“Problems facing the poor and the powerless must be understood in the hearts and the guts, as well as in the heads. The people with the problems must talk to each other as whole persons, with feelings and commitment as well as facts. As a tool of research, dialogue produces not just factual knowledge, but also interpersonal and critical knowledge, which defines humans as autonomous social beings” (Park, 1993, as cited in Nelson, Ochocka, Griffin, & Lord, 1998).
Communication:
Questions to Consider
• How much interpretation do you want to do (how much do others do)?
• Who should act after learning about the findings?
• What do you want them to do with the findings?
• How do you package information in order to facilitate this?
Where to Start
(Decision flowchart) Starting point: you want to do more outcome measurement.
• Is outcome measurement the next step? If NO: needs assessment, program design, planning. If YES, continue.
• Is there buy-in? If NO: awareness raising & team building; recruitment of additional people. If YES, continue.
• Is there a fit with resources? If NO: revise plans, prioritize, discuss with funders. If YES, continue:
1. Develop logic models & evaluability assessment
2. Develop purpose and questions
3. Identify indicators
4. Bring together stakeholders
5. Agree on purpose and questions, set roles, talk about use of findings
6. Develop workplan
Discussion:
Navigating First Steps
• From where you sit right now in your organization, how do you feel about these first steps?
• Which step strikes you as being the most challenging?
Action Checklist
• Have you got key stakeholders involved?
• Have you clarified program logic and evaluation purpose?
• Have you identified indicators for key outcome questions?
• Have you maximized the use of the data you already have or can easily get?