Implementing Outcome Evaluation (Part One)

Outcome Evaluation:
Logic Models,
Evaluation Frameworks,
& Measuring Indicators
Session #2/3
What to Expect from this Series

1. Learn about steps required to implement
outcome evaluation
2. Design a program logic model
3. Identify appropriate
evaluation/measurement tools
4. Develop an implementation plan
A note about the
nature of evaluation…
Purpose of Today’s Session
• To complete an individualized, workable
program logic model for a program of
your choice
• To begin designing individualized
evaluation plans based on your logic
model
• To begin identifying indicators and
potential evaluation measurement tools
Today’s agenda
• Answering questions and refining logic
models
• Break
• Analyzing logic models and identifying
priority evaluation questions
• Lunch
• Explaining indicators and measurement
• Beginning to draft indicators for your
programs
• Trying to wrap up by 3 so that we can
save time for other questions or one-on-one
discussions at the end of the day.
Logic Models
Responding to the Online Exercise
• Great start!
• Varying stages
• Tips
What did YOU think of the online exercise?
Logic Models
Reactions to your efforts thus far
Challenges and Issues
• Balancing the logic of program
development and promotion with the logic
of the program itself
• Finding the right balance for short-term
and long-term outcomes
• Differentiating program process
objectives from outcome objectives
• Finding outcomes for individualized
programs or programs for people in crisis
Balancing the logic of program development and promotion with the logic of the program itself

Sample Activities
• Promote the program to partners
• Provide support and information about the Canadian system to clients
• Secure safe and accessible community-based locations
• Determine an appropriate community volunteer leader to oversee each program location.

Suggested Edit
• Divide the logic of program design and implementation from the logic of the program itself
Finding the right balance for short-term and long-term outcomes
• Try to follow the “logical chain” from top to bottom and look for big logical leaps. Try to identify the unstated assumptions that underlie these leaps.
• Look for outcomes that are largely within your control.
Finding the right balance for short-term and long-term outcomes

Activity: Home visits

Short-term outcomes:
• Families have increased knowledge of local recreation opportunities
• Families’ experiences are affirmed and normalized

(“?” marks links whose assumptions are unstated)

Long-term outcomes:
• increased recreation and leisure opportunities
• decreased feelings of depression
• increased opportunity for participation in community
• increased acceptance & understanding of accessibility needs of [vulnerable population]
Dealing with programs that have individualized activities and outcomes

Sample Activities
• Resident care
• Provide short-term counselling
• Provide settlement and adaptation support to participants

Suggested Edits
• Break down into more concrete elements
• Describe in a more behavioural way
How do we identify outcomes when
services are highly individualized?
• Look for more immediate short term
outcomes that may be common to more
clients
• Try to categorize the most typical
activities and outcomes into broad
groups …even the most individualized
services tend to have a cluster of
typical activities
• Choose to emphasize those outcomes
that will appeal to key audiences
Example of a well-done logic model for a program with individualized outcomes

Activities:
• Develop and support clients’ individual rehab plans
• Provide clinical support and interventions
• Assess and develop life skills

Short-term outcome objectives:
• Clients become aware of and/or are connected to community resources
• Clients have improved informal supports
• Clients are able to find and/or maintain accommodation of their choice
• Clients maintain or improve their level of function
• Clients gain insight into healthier lifestyles/choices

Long-term outcome objectives:
• Clients have longer community tenure
• Client becomes a more productive, independent member within their community
• Reduce the # of hospitalizations or shorten lengths of stay
This example works well because…
• The outcome objectives are diverse, so
they explain the full impact of the program.
• The short-term outcome objectives are
clear, concrete, achievable and
measurable.
• The links are differentiated well (i.e., every
activity doesn’t link to every outcome).
• Certain short-term outcomes emerge as
pivotal to the overall logic.
How do we identify outcomes when
services are crisis response-oriented?
• Think “Harm Reduction” or “Secondary
Prevention”: Identify the ways in which
your intervention reduces negative
consequences of crises, or reduces risk
of further crisis
• Think of the “next steps” your clients can
take once stabilized as outcomes
• Think of very, very short-term outcomes
like, “client feels a sense of dignity”
Example of a well-done logic model for a crisis-oriented program

Activities:
• Weekly self-help group for women

Short-term outcome objectives:
• Women develop friendships in the group
• Women support each other outside of the group
• Women feel supported in the decisions they make
• Women access resources
• Women articulate expectations of their partners
• Women feel less fragile and are not dealing with daily crises

Long-term outcome objectives:
• Women are able to evaluate the relationship with their partner and make decisions about the relationship
• Partner is involved as part of family decision making
• Family members reconstruct a support system and maintain family unity
This example works well because…
• The short-term outcomes don’t imply dramatic change in the lives of participants, which is appropriate for a program dealing with women in near-crisis situations.
• Even the long-term outcomes are fairly modest and achievable (which is good for the same reason).
• The model makes a strong case for having an impact on one aspect of a complex social issue.
Other Questions?
Time to work a bit more
on polishing logic
models
MORNING BREAK
Feeding Back
• What insights are starting to emerge
from your model?
Review of Logic Models:
Prioritizing Objectives
for Measurement
Discussion Exercise
The importance of the arrows in
a logic model
• The arrows in a model are like your
hypotheses. They express your
ideas about the cause-and-effect
linkages within your program.
• Most arrows are based on certain assumptions. Some are conscious; others are probably not so conscious.
Analyzing your Logic Model
• If you follow each logical pathway from activities through short-term outcomes to long-term outcomes, does the logic “hang together” or does it feel like there are unstated assumptions?
• Where might data gathering help you
to check these assumptions? (make
a note)
Analyzing your Logic Model
• Are there any activities that don’t seem to
link to any identified outcomes, or vice
versa?
• Which logical links do you feel sure are
actually occurring in your program?
Which ones are you less sure about?
• Make a note about those links that
you feel you could document better.
Analyzing your Logic Model
• Which outcomes seem “pivotal” in your
model? Which ones are absolutely key to
overall success?
• Which ones do outsiders least
understand or appreciate?
• What are the issues or questions that
your key stakeholders are going to see as
most important?
• Make a note
Analyzing your Logic Model
• Where does your logic model “link
up” with the overall priorities of your
board, your clients, your funders?
Translating key
issues or unknowns
or assumptions into
research questions
Clarifying the levels of questions

Evaluation Purpose:
To determine how effective our program is at helping people find meaningful jobs

Evaluation Questions:
• How many of our clients find jobs?
• How satisfied are clients with their jobs?
• What parts of our program seem to help the most?

Actual Interview or Survey Items:
• For clients: “How much do you enjoy your job?”
• For staff: “How satisfied does your client seem?”
Definitions: Outcome Measurement Questions
• the questions we need to ask to assess the attainment of each outcome objective
• the most central question is most often a rewording of the outcome objective
• other evaluation questions may ask about how the program/activity was implemented, problems encountered, lessons learned, changes made in program implementation, or costs associated with achieving the outcome objective
Types of Evaluation Questions
Economic Analysis Questions
(Are the outcomes worth it?)
Outcome Measurement Questions (Have we
made a measurable difference?)
Satisfaction Questions (Are stakeholders
pleased?)
Process Questions (Was program implemented as
planned? What’s working? What has been learned?)
Needs Assessment Questions (What are local needs? local
strengths? What are some good ideas for trying to help?)
Exercise: Comparing
Questions
How effective is our program in
fostering independence,
increasing consumer control,
and improving nutrition?
Does our program improve
quality of life?
What are our strengths and
weaknesses?
What would our stakeholders
like us to change, and what
should we keep the same?
Is our program community
based?
How successful have we been
in involving participants at all
levels of the project?
A Framework for Outcome Measurement
Evaluation Design Planning Chart

• Objective: the outcome objective from your logic model.
  Example: Increased feelings of social support among participating parents
• Research Questions: may refer to success in meeting objectives (from your logic model) or other questions stakeholders feel the evaluation should address.
  Example: Q1: Have parents’ perceptions of social support changed as a result of participation in programming? Q2: How isolated are the parents we serve? Do they need more social support?
• Indicators: things you can observe that will help you to answer this question.
• Where will we get the information? Who will you speak to? How would you access existing information?
• What data collection tools will we use? Given what is written in the columns to the left, what method or methods are most efficient and effective?
• Data Analysis: how are you going to make sense of the data you collect?
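For anyone who prefers to keep planning charts in a structured, machine-readable form, the chart above can be sketched as a simple data structure. The field names below simply mirror the chart’s columns, and the example values come from the running example; this is an illustrative sketch, not part of the original framework:

```python
# One row of the Evaluation Design Planning Chart as a plain dict.
# Field names follow the chart's columns; values are the running example.
planning_row = {
    "objective": "Increased feelings of social support among "
                 "participating parents",
    "research_questions": [
        "Q1: Have parents' perceptions of social support changed as a "
        "result of participation in programming?",
        "Q2: How isolated are the parents we serve? Do they need more "
        "social support?",
    ],
    "indicators": [],             # to be filled in during the exercise
    "information_sources": [],    # who will you speak to?
    "data_collection_tools": [],  # surveys, focus groups, file reviews...
    "data_analysis": None,        # how will you make sense of the data?
}

# A usable row needs at least an objective and one research question.
assert planning_row["objective"]
assert len(planning_row["research_questions"]) >= 1
```

Keeping rows in a uniform structure like this makes it easy to check that no column has been skipped before data collection begins.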
Drafting Evaluation Questions:
Exercise
• Considering the outcome objectives you
chose from your logic model, and;
• Considering the stakeholders involved,
and;
• Considering how you want to use the
evaluation results …
• What are the core questions that will
guide your evaluation?
Group Discussion
• What are some of the key
questions or issues you have
decided to focus on?
• How did you decide which
objectives to focus on? What were
some of your criteria?
Some critical comments
on logic models
If we have time ….
A logic model…
• can come in many different formats. There is no
“correct” way to format models although some
ways are better than others. As a rule, models are
more useful if they are clear about logical
linkages and proceed temporally from what you
do to what you wish to accomplish.
• is not an outcome but a process. You can expect
to go through many iterations of your model.
• is only useful insofar as it is used.
Different Approaches to Logic Models

Core chain: Inputs → Activities → Outputs/Benchmarks → Short-term Outcomes → Intermediate Outcomes → Long-Term Outcomes → Goals or Vision

Some models add columns with additional info on process (Principles; Process, implementation, or service objectives; Target Groups), while others add columns with additional info on measurement (Implementation Steps; Targets; Indicators).
Limitations of logic models
• Logic models do not do a good job of capturing
program context and program process
• They can sometimes take on a life of their own.
Because they purport to describe the program,
people may continue to assume that it is the best
rendering of what actually occurs and what the
objectives are.
• They may contain outcome objectives that are
logical but too difficult to evaluate for a variety of
reasons.
• Raises the question: Should an organization
include outcome objectives in their model if they
fully expect to be unable to evaluate them?
Should program logic models (PLMs) include outcome objectives that can’t or won’t be measured?
Activity → Attitude Change? → Behaviour Change? → Societal Change?
As we move down the logical chain of the
program it gets more and more difficult to
evaluate our objectives. This is due to
many factors including:
• time
• resources
• funding and timing of funding
• multiple causal factors outside control
of program
Do we still include our longer-term
objectives?
Group Discussion
• What was the hardest part about
drafting evaluation questions?
• Did thinking about your evaluation
questions lead to any insights
about your objectives?
LUNCH
Indicators
Objectives and evaluation
questions lead to a variety of
indicators
Definitions: Indicators
• empirical (observable) bits of information that are used to determine “how we will know” whether the outcome objectives are being achieved
• measurable approximations of the outcomes you are trying to achieve

Example:
Objective: Increased social support among participants
Indicator: client perceptions of social support
Review:
A Good Indicator …
• is “empirical” in nature
• is pragmatic
• is as “close” as one can get to the phenomena of interest
• helps to further “operationalize” an objective
Review:
A Good Indicator …
• leads to good measurement and
methodology (e.g., measures
degrees of change and not just
presence or absence)
• is not necessarily quantitative
Example: client perceptions of social support
Review:
Sources of Indicators
• Research literature
• Data already gathered for other
purposes
• Observations
• Perceptions of key informants
• Standardized measurement tools
Choosing Indicators:
An Exercise
The Main Message About
Indicators:
Good measurement is often an act of
creative compromise and balance
• no hard and fast rules about how to do
them well
• strategic combination of 2 or 3 imperfect
indicators often works best
Writing out indicators for
your own objectives:
Exercise
• Use the indicator planning sheet
• Critically reflect on indicators if you already have them: what are you relying on now?
• Refer to the handout of good example indicators
Indicator development exercise

Evaluation Question: Has the switch to a de-centralized model for Ontario Works intake made the system more accessible?

List relevant indicators from information you already gather:
• % of callers who hang up while on hold (old system vs. new)
• % of applicants screened out
• # of complaints received

Brainstorm some possible new quantitative indicators (numbers, percentages, rates of change, etc.):
• # of legal challenges to screening decisions

Brainstorm some possible new qualitative indicators (stories, photos, examples, case studies):
• Caseworkers’ opinions about the quality of client information sent to them by intake
• Client feedback on access
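Quantitative indicators like the hang-up rate are usually just simple ratios computed from records you already keep. As a minimal sketch, with entirely invented call-log counts (the real Ontario Works figures are not part of this exercise):

```python
# Hypothetical call-log counts for the intake example (invented numbers).
old_calls, old_hangups = 1200, 264
new_calls, new_hangups = 1100, 154

def hangup_rate(hangups: int, calls: int) -> float:
    """Percentage of callers who hang up while on hold."""
    return 100.0 * hangups / calls

old_rate = hangup_rate(old_hangups, old_calls)
new_rate = hangup_rate(new_hangups, new_calls)

# Comparing the two rates gives the "old system vs. new" indicator.
print(f"old system: {old_rate:.1f}%  new system: {new_rate:.1f}%")
```

With these made-up numbers the rate drops from 22.0% to 14.0%; on its own that says nothing about *why* accessibility changed, which is exactly where the qualitative indicators above come in.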
A Framework for Outcome Measurement
Evaluation Design Planning Chart (example row)

• Objective: Increased feelings of social support among participating parents
• Research Question: Q1: Have parents’ perceptions of social support changed as a result of participation in programming?
• Indicator: % of parents who feel they have social support (e.g., scores on the ASSQ)
• Where will we get the information? From parents
• Data collection tools: Survey (at intake, 3 months and 6 months)
• Data Analysis: pre-post statistical tests

(We will go through the elements of this table one by one throughout the day)
Group Discussion
• What are some examples of
indicators (or sets of indicators)
you think are good? What makes
them good?
• What are some examples of
indicators (or sets of indicators)
you’re not so sure about?
• What are you relying on now?
ONE MORE TRY: The outcome measurement decision-making tree

Start: You want to do more outcome measurement.

Is outcome measurement the next step?
• NO → Needs assessment, program design, planning
• YES → proceed:
1. Develop logic models & evaluability assessment
2. Develop purpose and questions
3. Identify indicators

Is there a fit with resources?
• NO → Revise plans, prioritize, discuss with funders
• YES → proceed:
4. Bring together stakeholders

Is there buy-in?
• NO → Awareness raising & team building; recruitment of additional people
• YES → proceed:
5. Agree on purpose and questions, set roles, talk about use of findings
6. Develop workplan
Design: Balancing
Rigour and Credibility
with Practical
Limitations
Design
• Once you have clear questions
and some good indicators for
each, the next step is to develop
a plan for gathering the
information
• Design involves balancing two
competing demands: practicality
and rigour
Easy ways to “beef up” methodological rigour

Strategy (Rationale):
• Incorporate more descriptive, behavioural questions (less vulnerable to social desirability bias, easier to understand)
• Incorporate a greater variety of data types (allows you to “back up” and strengthen findings)
• Narrow your focus with research questions (measure a few things well)
• Measure at regular intervals (create your own “comparison groups”)
• Add follow-up measures to time-of-intervention measures (allows for tracking of a greater range of outcomes, opens up possibility of failure)
• Find a comparison group (isolate the contribution of your program)
Easy ways to make your design more practical & efficient

• Narrow your focus with research questions
• Compare the “return” on different measurement choices
• Use indicator lists to minimize the length and intrusiveness of surveys
• Stop gathering data you don’t use
• Invest in buy-in, and save time on data collection: get input from stakeholders at each stage. Pilot test!
Running Example Revisited
Evaluation Design Planning Chart

• Objective: Increased feelings of social support among participating parents
• Research Questions:
  Q1: Have parents’ perceptions of social support changed as a result of participation in programming?
  Q2: How isolated are the parents we serve? Do they need more social support?
• Indicators:
  - % of parents who feel they have social support (e.g., scores on the ASSQ)
  - parent self-reports of level of isolation
  - family or professional key informant assessment of level of isolation of parents
• Where will we get the information? Parents; family member/professional service provider
• Data collection tools: Parent survey (at intake, 3 months and 6 months); parent focus group; key informant interviews
• Data Analysis: pre-post statistical tests
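The “pre-post statistical tests” in the chart could, in the simplest case, be a paired t-test on each parent’s intake vs. follow-up score. A minimal sketch in Python, using entirely invented scores (the specific numbers, and treating them as ASSQ-style totals, are assumptions for illustration only):

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical social-support scores for 8 parents (higher = more support).
intake_scores   = [12, 15, 11, 14, 13, 12, 16, 10]
followup_scores = [15, 16, 14, 15, 17, 13, 18, 12]

# Paired differences: follow-up minus intake, for the same parent.
diffs = [post - pre for pre, post in zip(intake_scores, followup_scores)]

mean_change = mean(diffs)                       # average improvement
sd_change = stdev(diffs)                        # spread of individual changes
n = len(diffs)
t_stat = mean_change / (sd_change / sqrt(n))    # paired t statistic, df = n - 1

print(f"mean change = {mean_change:.2f}, t = {t_stat:.2f}")
```

In practice you would hand the two score lists to a statistics package (for example, a paired-samples t-test routine) to get a p-value; the point here is only that the “pre-post” analysis operates on matched before/after pairs, not on two unrelated groups.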
Design Tips
Be flexible and responsive:
• design an evaluation that “fits” the needs of the
target populations and other stakeholders;
• gather data relevant to specific questions and
project needs
• be prepared to revise evaluation questions and plans
as project conditions change
• be sensitive to cultural issues in the community
• know what resources are available for evaluation
and request additional resources if necessary
• be realistic about the existing capacity of the project
• allow time to deal with unforeseen problems.
Design Tips
Collect and analyze information from
multiple perspectives
Always return to your evaluation questions
• The more closely you link your
evaluation design to your highest
priority questions, the more likely you
will effectively address your questions.
Basic Data Collection Methods

• Interview
• Focus group
• Satisfaction survey
• Analysis of client records/administrative data
Pros and cons of popular measurement techniques
(listed roughly from less rigour to more rigour)

• Retrospective reflections or stories
• Self-report (interview or survey)
• Peer/family/worker report
• Direct observation
• File review
• Clinical assessments