Logic Modeling: A Tool to Guide Program Design & Evaluation

Logic Modeling:
A Tool to Support the Development
& Evaluation of State Unit of Aging
Programs & Projects
John A. McLaughlin
MACGROUPX@AOL.COM
My Aim Today
 Orient you to a different way to think
about conceptualizing and telling the
performance story of your State Unit on
Aging (SUA) programs and projects
 Provide a simple tool for creating a
functional picture of how your SUA works
to achieve its aims
 Offer some helpful hints for framing a
useful performance measurement and
evaluation strategy for your SUA.
2
Beliefs
 Social Advocacy
• Client/customer focus
• The right to be part of a well-run program
 Program Staff Advocacy
• Managing for Results
• Nobody gets it right the first time out!
3
Themes You’ll Hear Today
 GOOD MANAGEMENT
• Relevance
• Quality
• Performance
 Connections
 Evidence
4
More Words
 Goals -- Impacts
 Objectives
• Outcome -- changes
Short-term (proximal)
Intermediate (distal)
• Supporting
Resources
Activities
Outputs: productivity and reach
5
PERFORMANCE
MANAGEMENT TOOLS
PERFORMANCE MANAGEMENT
Performance management includes activities to ensure that goals
are consistently being met in an effective and efficient manner.
Performance management tools include logic models, performance
measurement and program evaluation.

Logic Model: A tool/framework that helps identify the
program/project resources, activities, outputs, customers, and
outcomes.

Performance Measurement: Helps you understand what level of
performance is achieved by the program/project.

Program Evaluation: Helps you understand and explain why you’re
seeing the program/project results.
6
Logic Models as Recipes
 Recipes have 3
essential components!
 A good cook follows
the recipe – program
staff would do well to
create & follow their
recipe for success!
7
Logic Models as Maps
 If you were going on a trip, what
would be the first question you
need to answer?
 Then, what tool would you need?
8
Recipes & Maps are used
for:
 Planning
 Communicating
 Performance Measurement and
Evaluation
9
The Logic Model
10
Level I Logic Model
RESOURCES / INPUTS: The ingredients you need to implement your program!
YOUR PROGRAM: What you do to achieve your long-term aims!
RESULTS / IMPACT: Why you are in business!
11
Level II Logic Model
Contextual Influences
Resources (1) → Activities (2) → Outputs (3) → Customers (4) →
1st-Order Outcome (5) → 2nd-Order Outcome (6) → Impact (7)
Program’s Sphere of Influence
Reading left to right answers HOW; reading right to left answers WHY.
12
Understanding the Sphere of
Influence
 Ask your team to estimate their level
of confidence that their program will
lead to each outcome in the logic
model.
• The Strategic Impact
• The Intermediate Outcomes
• The Short-term Outcomes
 Identify Performance Partners!
13
Complex Effects Chain
 Partners
 Transparency
 Shared Common Outcomes
14
Elements of Logic Models
 Resources / Inputs: Programmatic investments available to
support the program.
 Objectives / Activities: Things you do – activities you plan to
conduct in your program.
 Outputs: Product or service delivery/implementation targets you
aim to produce.
 Customer: User of the products/services. Target audience the
program is designed to reach.
 Outcomes: Changes or benefits resulting from activities and
outputs.
 Outcome Structure
• Short-term (K, S, A) – Changes in learning, knowledge,
attitude, skills, understanding
• Intermediate (Behavior) – Changes in behavior, practice or
decisions
• Long-term (Condition) – Changes in condition
 External Influences: Factors that will influence change in the
affected community.
15
Outputs & Outcomes
An annual conference
disseminates the latest forage
research.
Low-income families are better
able to manage their resources.
Program staff teach financial
management skills to low-income families.
Community volunteers have
knowledge and skill to work
effectively with at-risk youth.
The camp experience provides
leadership development
opportunities for 4-H youth.
Forage producers in Pasture
County know current research
information and use it to make
informed decisions.
The program trains and
empowers community
volunteers.
Campers, aged 12-15, learn new
leadership and communication
skills while at camp.
16
Outputs & Outcomes
OUTPUT: An annual conference disseminates the latest forage
research.
OUTCOME: Forage producers in Pasture County know current research
information and use it to make informed decisions.

OUTPUT: Program staff teach financial management skills to
low-income families.
OUTCOME: Low-income families are better able to manage their
resources.

OUTPUT: The camp experience provides leadership development
opportunities for 4-H youth.
OUTCOME: Campers, aged 12-15, learn new leadership and
communication skills while at camp.

OUTPUT: The program trains and empowers community volunteers.
OUTCOME: Community volunteers have knowledge and skill to work
effectively with at-risk youth.
17
Volunteers
 If the program is addressing a situation of
low volunteer involvement in community
affairs and the purpose of the program is
to increase volunteering among
community residents as a part of a larger
community development initiative, then
increased numbers of residents
volunteering in community life would be
an outcome. The outcome is expressed as
a behavioral change.
18
Number or type of participants who
attend; number of clients served.
 If the purpose of the program is to
increase use of a service by an
underserved group, then numbers
using the service would be an
outcome. The outcome is not
numbers attending or served; the
outcome is expressed as use that
indicates behavioral change.
19
Participant Satisfaction.
 For our purposes in education and
outreach programming, client satisfaction
may be necessary but is not sufficient. A
participant may be satisfied with various
aspects of the program (professionalism
of staff, location, facility, timeliness,
responsiveness of service, etc.) but this
does not mean that the person learned,
benefited or his/her condition improved.
20
Training, Research,
Producing
 These are Outputs. They may be
essential aspects that are necessary
and make it possible for a group or
community to change. But, they do
not represent benefits or changes in
participants and so are not
outcomes. They lead to or result in
outcomes but, in and of themselves,
they are outputs.
21
Steps in the Logic Model
Process
1. Establish a stakeholder work group and collect documents.
2. Define the problem and context for the program or project.
3. Define the elements of the program in a table.
4. Develop a diagram and text describing logical relationships.
5. Verify the Logic Model with internal / external stakeholders.
6. Then use the Logic Model to identify and confirm performance
measures, and in planning, conducting and reporting performance
measurement and evaluation.
22
Step 1: Establish work group &
collect documents & information.
 Convene / consult a work group
• provides different perspectives and knowledge
• attempts agreement on program performance
expectations
 Review sources of program or project
documentation
• Strategic and operational plans
• Budget requests
• Current metrics
• Past evaluations
 Conduct interviews of appropriate staff
23
Step 2: Define problem program
addresses & context.
The Context: Drivers of Success; Constraints on Success
Factors leading to the Problem: 1, 2, 3 (*3 is your niche)
The Program addresses The Problem
24
Step 3: Define elements of
program or project in a table.
Resources/Inputs | Activities | Outputs | Customers Reached | Outcomes:
Short-term (Change in Attitude), Intermediate (Change in Behavior),
Long-term (Change in Condition)
External Influences:
HOW | WHO | WHAT & WHY
25
Step 4: Develop a diagram & text
describing logical relationships.
Draw arrows to indicate/link causal
relationships between logic model elements.

We use these resources (1.0) → for these activities (2.0) →
to produce these outputs (3.0) → so that the customers can
change in these ways (4.0) → which leads to these outcomes (5.0) →
leading to these results! (6.0)

Work from both directions (right-to-left and left-to-right)
26
Two Important Rules
to Follow
For every action identified in the Logic
Model, there must be an output that
connects to an outcome through a
specific customer.
OR
An action must produce an output that
becomes a key input to another activity.
THINK CONNECTIONS!
27
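The two rules above can be expressed as a simple dead-end check over the model's links. This is a hypothetical sketch, not part of the original exercise; all program, output, customer, and outcome names below are invented for illustration.

```python
# Hypothetical sketch of the two connection rules: every activity's output
# must either reach an outcome through a named customer (Rule 1) or feed
# another activity as a key input (Rule 2). All names are invented examples.

activities = {
    "develop_materials": {"output": "materials_ready"},
    "train_volunteers": {"output": "volunteers_trained"},
}

# Rule 1 links: output -> (customer, outcome)
outcome_links = {
    "volunteers_trained": ("community volunteers", "volunteers gain skills"),
}

# Rule 2 links: output -> downstream activity that consumes it as an input
input_links = {
    "materials_ready": "train_volunteers",
}

def check_connections():
    """Return activities whose output is a dead end in the logic model."""
    return [
        name
        for name, act in activities.items()
        if act["output"] not in outcome_links
        and act["output"] not in input_links
    ]

print(check_connections())  # an empty list: every action connects
```

Running the check before verifying the model with stakeholders (Step 5) is one way to "THINK CONNECTIONS" mechanically.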
Logic Modeling Exercise 1
Brief application of logic
modeling using a United Way
example
Logic Modeling Exercise
GOAL:
Provide an opportunity for participants to apply the principles and practices of
logic modeling in an interactive setting.
INSTRUCTIONS:
1. The group will be given a set of index cards that contain words or statements
that answer the list of questions on the next slide.
2. As a group, review the questions and use the index cards to map out the logic
of our case study program on your flip chart paper.
3. When the cards are placed/glued on the paper in the correct order, draw lines
connecting the cards to show the logic relationships.
4. When you have completed your logic model, the cards will be ordered so that
they describe the program logic and its underlying assumptions (boxes and
connecting arrows).
5. Check your logic using if, then and how, why statements. When you have
completed this exercise be prepared to report out to the larger group.
REMEMBER THE RULES!
29
Questions to Guide
Modeling
 What are the essential resources we need
to implement the program?
 What programs / activities do we have to implement with
these people to achieve our results?
 What are the outputs of our programs?
 Who / what do we need to reach to achieve these results?
 What are the short-term and intermediate changes that will
enable us to realize our strategic results?
 What are the strategic results / long-term environmental
outcomes we are aiming for?
 What external influences in the program context do we
have to be aware of?
30
Worksheet
Simple Logic Model Diagram
EXTERNAL INFLUENCES
32
“Z” Logic
Supplier-Customer Relationship:
Resources → Action A → A Outcomes
(A’s outcomes become resources for the next action)
Resources → Action B → B Outcomes
Resources → Action C → C Outcomes → Strategic Program Results!

Unpacking supports more focused Performance
Measurement & thus more useful evaluation, as well as
better understanding & communication about how the
“Program” is supposed to work!
34
“Unpacked Logic Models”
LEVEL I LOGIC MODEL
Resources (EPA, State, Local, Private) → Training Program →
Improving Water Quality → Clean, Safe, Swimmable, Fishable Water

LEVEL II LOGIC MODEL
Resources: EPA, State, Local, Private
Activities: Materials Development, Recruitment, Training,
Technical Assistance, Website
Customers: Developers / Builders
Outcomes:
• Short-term: Increased awareness of harmful effects; increased
awareness of new technologies and incentives
• Intermediate: Developers & builders acquire new technologies
& change practice
• Long-term: Reduction in NPS pollutants in waterways
Impact: More fish, healthier wetlands, clean beaches; clean,
safe, swimmable, fishable water
36
LEVEL III LOGIC MODEL
Resources: EPA, State, Local, Private
Activities → Outputs:
• Materials Development → Materials Ready
• Recruitment → Trainees Ready
• Training → Developers/Builders Trained
• Technical Assistance → Trainees Receive TA
• Website → Trainees/Others aware of/using new information
Outcomes:
• Short-term: Increased awareness of harmful effects; increased
awareness of new technologies and incentives
• Intermediate: Developers and builders acquire new technologies
and change practice
• Long-term: Reduction in NPS pollutants in waterways
Impact: More fish, healthier wetlands, clean beaches; clean,
safe, swimmable, fishable water
37
Testing The Logic of
Strategic Plans
39
Strategic Plan Check:
Goals
 Is the goal statement outcome oriented?
 Does it specify the expected strategic change / impact
for a specific target group (older persons & disabled)?
 What evidence is available that this impact / change is
important (relevance)? Are there existing needs data?
 What specific roles, if any, do partners (internal &
external) play in the success of this impact?
 Are there missing Goals to enable the mission / vision
to be realized?
 What concerns you most about this goal, right now?
40
Strategic Plan Check:
Objectives
 Is the objective outcome oriented?
 Does it clearly specify the anticipated change for a
specific target group & why that change is needed?
 Does the change relate to the goal? Will success with
this objective lead to success with the goal?
(QUALITY)
 What evidence is available that this change is
important? Are there existing needs data?
 What specific roles, if any, do partners (internal &
external) play in the success of this objective?
 Are there missing Objectives to enable the goal to be
realized?
 What concerns you most about this objective, right
now?
41
Strategic Plan Check:
Strategies
 Is there a reasonable degree of confidence that the
strategy will result in achievement of a specific
outcome for a specific group?
 What evidence is available that this strategy is the
right strategy – in comparison to others – to achieve
the outcome that is specified?
 What specific roles, if any, do partners (internal &
external) play in the success of this strategy?
 Considering the strategy you’ve adopted, do you have
sufficient resources on hand or available to actualize
the strategy?
 Are there missing strategies to enable the objectives
to be realized?
 What concerns you most about this strategy, right
now?
42
Scenario Checking
What if’s!
1. Select several external forces & imagine related changes
which might influence the SUA, e.g., change in regulations,
demographic changes, etc. Scanning the environment for
key characteristics often suggests potential changes that
might affect the alliance, as does sharing the plan with
stakeholders!
2. For each change in a force, discuss 3 different future SUA
scenarios (including best case, worst case, & reasonable
case) which might arise with the SUA as a result of each
change. Reviewing the worst-case scenario often provokes
strong motivation to change the SUA – forming
partnerships, changing strategy.
3. Conduct likelihood / Impact assessment on each external
influence.
43
Scenario Checking
What if’s!
 Select the external changes most likely to affect the SUA
over the next 3-5 years, and identify the most
reasonable strategies the SUA can undertake to
respond to them. Suggest what the SUA might do,
or potential strategies, in each of the 3 scenarios to
respond to each change.
 This process should be repeated for each element of
the Logic Model
• Program structure – Resources, Activities, Outputs
• Outcome structure – Short-term, Intermediate,
Strategic
 REMEMBER – “NOBODY GETS IT RIGHT THE FIRST
TIME OUT!”
44
Logic Modeling Exercise 2
Brief application of logic
modeling focusing on a typical
SUA program
Logic Modeling Exercise
GOAL:
Provide an opportunity for participants to apply the
principles & practices of logic modeling in an
interactive setting.
INSTRUCTIONS:
1. Participants will identify 1 SUA program (e.g.,
community awareness, home delivered or congregate
meals, education) to Logic Model as it operates
currently.
2. Group will construct a Level I & Level II Logic Model.
3. After constructing the Models and checking using if,
then and how, why questions, the participants should
discuss how they might tweak the Model to address
Choice.
4. Participants will be prepared to present their Models
to the whole group.
46
Benefits of Logic Modeling
 Communicates the performance story of
the program or project.
 Focuses attention on the most important
connections between actions and results.
 Builds a common understanding among
staff and with stakeholders.
 Helps staff “manage for results” and
informs program design.
 Finds “gaps” in the logic of a program and
works to resolve them.
47
Logic Modeling Benefits
Kellogg, 1998
48
The real value - Most of the value in a logic model is
in the process of creating, validating,
and modifying the model … The
clarity in thinking that occurs from
building the model and the depth and
breath of those involved are critical
to the overall success of the process
as well as the program.
Adapted from W.K. Kellogg Foundation Handbook, 1998
49
Social Mechanism
Logic Modeling,
Performance Measurement,
& Evaluation
Orientations for Performance
Measurement & Evaluation
PERFORMANCE MEASUREMENT
 Accountability, description
• What objectives/outcomes have been
accomplished at what levels?
PROGRAM EVALUATION
 Learning, Program Improvement, Defense
• What factors, internally and/or externally
influenced my performance? (Retrospective)
• What effect will this level of performance have
on future performance if I don’t do something?
(Prospective)
• What roles (+/-) did context play in
my performance?
52
Key Questions Grantees Need to
Answer About Their Programs
Performance Measurement:
 What am I doing, with whom, to whom/what? (effort)
 How well am I doing it? (quality)
• Customer Feedback
• Peer Review for Technical Quality
• User Review for Social Validity
 Is anybody (anything) better off? (effect)
• Short-term
• Long-term
Program Evaluation:
 What role, if any, did my program play in the results?
 What role, if any, did the context play?
 Were there any unintended outcomes?
 What will happen if I don’t do something?
53
Performance Measurement Hierarchy
Program Logic Elements → Matching Levels of Performance Information
7. End results → Measures of impact on overall problem, ultimate
goals, side effects, social and economic consequences
6. Practice and behavior change → Measures of adoption of new
practices and behavior over time
5. Knowledge, attitude, and skill changes → Measures of individual
and group changes in knowledge, attitude, and skills
4. Reactions → What participants and clients say about the program;
satisfaction; interest; strengths; weaknesses
3. Participation → The characteristics of program participants and
clients; numbers, nature of involvement; background
2. Activities → Implementation data on what the program actually
offers or does
1. Resources → Resources expended; number and types of staff
involved; time expended
54
Two Questions
 What is the right Outcome?
• Short-term
• Intermediate
• Strategic
 Am I getting at the right Outcome,
the right way?
• Efficiency
• Effectiveness
55
Types of Measures

Resources/Inputs: Resources consumed by the organization.
Examples: Amount of funds, # of FTE, materials, equipment,
supplies.

Activities: Work performed that directly produces core products
& services. Examples: # of training classes offered as designed;
hours of technical assistance training for staff.

Outputs: Products & services provided as a direct result of
program activities. Examples: # of technical assistance requests
responded to; # of compliance workbooks developed/delivered.

Customer Reached: Measure of target population receiving outputs.
Examples: % of target population trained; # of target population
receiving technical assistance.

Customer Satisfaction: Measure of satisfaction with outputs.
Examples: % of customers dissatisfied with training; % of
customers “very satisfied” with assistance received.

Outcomes: Accomplishment of program goals and objectives
(short-term and intermediate outcomes, long-term
outcomes--impacts). Examples: % increase in citizen understanding
of sound nutrition choices; # of communities adopting
research-based practices.
56
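The reach and satisfaction measures in the table reduce to simple counts and percentages. A minimal sketch; the figures below are made up for illustration, not from any real program data.

```python
# Illustrative calculation of two measure types from the table:
# "Customer Reached" and "Customer Satisfaction". Counts are hypothetical.

target_population = 400   # people the program is designed to reach
trained = 260             # people who actually received training
satisfaction_ratings = [
    "very satisfied", "satisfied", "dissatisfied", "very satisfied",
]

# Customer Reached: % of target population trained
pct_reached = 100 * trained / target_population

# Customer Satisfaction: % of customers "very satisfied"
pct_very_satisfied = (
    100 * satisfaction_ratings.count("very satisfied") / len(satisfaction_ratings)
)

print(f"{pct_reached:.0f}% of target population trained")          # 65%
print(f"{pct_very_satisfied:.0f}% of customers 'very satisfied'")  # 50%
```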
Work Quality Measures

Efficiency: The ratio of the amount of input to the amount of
output. Focus is on operating efficiency; relating output to some
specific resource in terms of cost or time. Examples: Cost per
workbook produced; cost per meal delivered.

Productivity: Measure of the rate of production per some specific
unit of resource (e.g., staff or employee). The focus is labor
productivity. Example: Number of meals delivered per volunteer.

Cost Effectiveness: Measure that relates outcomes to costs.
Example: Cost per number of elderly persons improving nutrient
intake.
57
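Each work-quality measure above is a single ratio. The sketch below shows all three with hypothetical cost and count figures (a meals program is assumed purely as an example).

```python
# Hypothetical meal-delivery figures; none are from a real SUA program.
total_cost = 12000.0               # program spending ($)
meals_delivered = 3000             # output
volunteers = 25                    # resource unit (labor)
elders_with_improved_intake = 150  # outcome

# Efficiency: input per unit of output (cost per meal delivered)
efficiency = total_cost / meals_delivered

# Productivity: output per unit of resource (meals per volunteer)
productivity = meals_delivered / volunteers

# Cost effectiveness: cost per unit of outcome achieved
cost_effectiveness = total_cost / elders_with_improved_intake

print(efficiency, productivity, cost_effectiveness)  # 4.0 120.0 80.0
```

Note the distinction the table draws: efficiency and productivity relate inputs to *outputs*, while cost effectiveness relates inputs to *outcomes*.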
Defending your Impact
Claim
 Did we observe a change in the
anticipated outcome(s) as seen in
our performance measures?
 Can we connect any element of our
program (what we did) to that change
using our performance measures?
 Are there any rival explanations
(usually in the context)?
58
Definitions:
Performance Measurement:
The ongoing monitoring & reporting of
program progress & accomplishments,
using pre-selected performance measures.
Performance measure – a metric used to
gauge program or project performance.
Indicators – measures, usually quantitative,
that provide information on program
performance and evidence of a change in
the “state or condition” in the system.
59
Definitions:
Program Evaluation:
A systematic study that uses
measurement & analysis to answer
specific questions about how well a
program is working to achieve its
outcomes & why.
60
STANDARD / DESIRED LEVEL OF PROGRAM PERFORMANCE
vs.
ACTUAL LEVEL OF PROGRAM PERFORMANCE
= + DISCREPANCY or - DISCREPANCY
61
Example
 Standard: 95% of targeted community-based
treatment facilities will adopt BMPs by June 2006.
 Performance: 65% of targeted community-based
treatment facilities adopt BMPs by June 2006.
 Managers’ Question: Should we act and if so,
what should we do?
 Prospective Evaluation Question: What impact
will this observed level of performance have on
predicted longer-term impacts?
 Retrospective Evaluation Question: What
programmatic or contextual factors influenced
the observed level of performance?
62
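The discrepancy in the BMP example is just the signed gap between actual and desired performance; a one-line sketch using the 95% standard and 65% actual from the slide above.

```python
# Discrepancy model: actual performance minus the standard.
# A negative value flags performance below the desired level.

standard = 0.95  # desired level: 95% of facilities adopt BMPs
actual = 0.65    # observed level: 65% of facilities adopt BMPs

discrepancy = actual - standard
print(f"{discrepancy:+.2f}")  # -0.30
```

The sign answers the managers' question of whether to act; the evaluation questions then probe why the gap exists and what it implies going forward.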
The Logic Model &
Evaluation
HOW ←→ WHY
Resources/Inputs → Activities → Outputs → Customers →
Short-term outcome → Intermediate outcome → Longer-term
outcome (STRATEGIC AIM)
PROGRAM → RESULTS FROM PROGRAM
EXTERNAL CONDITIONS INFLUENCING PERFORMANCE (+/-)
63
Assessing Strength of
Evaluation Design for Impact
 Is the population representing the counterfactual
equivalent in all pertinent respects to the program
population before that population is exposed to
the intervention? (selection bias)
 Is the intervention the only force that could cause
systematic differences between the 2 populations
once exposure begins?
 Is the full force of the intervention applied to the
program population, and is none applied to the
counterfactual?
• Implementation evaluation
• Independence
64
In the end Logic Models:
Enable planners to:
 Develop a more convincing, plausible argument
regarding how their program is supposed to work to
achieve their outcomes & communicate this to
funding agencies & other stakeholders.
 Focus their PM/PE on the right elements of
performance to enable program improvement &
the estimation of causal relationships between &
among elements.
 Be better positioned to present & defend their
claims about their program performance to
external stakeholders.
65