Focus Your Evaluation
What do you want to know? What will you measure?
This is STEP 2 of the five steps of evaluation.
Use pages 2-10 in the booklet, Planning a Program Evaluation, for additional information related to this set of slides.
© 2009 University of Wisconsin-Extension, Cooperative Extension, Program Development and Evaluation
What do you think is involved in focusing an evaluation? (check all that apply)
1. getting everyone on the same page
2. deciding what we will evaluate
3. stating our evaluation purpose
4. figuring out who wants to know what
5. deciding on key questions that we want the evaluation to answer
6. identifying indicators
7. selecting an evaluation design
See the next slide for the answer
If you thought all of those items were involved in
‘focusing an evaluation’, you are RIGHT!
Focusing your evaluation is like focusing a camera.
What do you want to know?
– Describe the program – logic model
– What is the purpose of your evaluation?
– Determine use/users – who wants to know what?
– Determine key questions (your research questions, not the questions you put on a questionnaire)
– Select indicators
– Determine evaluation design
Describe the program that you will evaluate
• Do you want to evaluate the whole program or a particular component of a program?
• Is it a new program, or one that is well established and whose outcomes are evident?
Remember, “program” is any planned series of
activities intended to promote change/improvements.
“Program” can refer to: small projects; a conference or
an event; a comprehensive, multi-year youth
development effort; community-wide initiatives;
organizational or policy development activities;
partnership building activities; community mobilization
efforts; communication campaigns; etc.
It might be a statewide initiative or a specific/focused
county program.
WHAT “program” or part of a program will you
evaluate?
Describe your program using a logic model
[Logic model diagram: Situation → INPUTS (program investments) → OUTPUTS (Activities, Participation) → OUTCOMES (Short, Medium, Long-term); with Assumptions and External factors noted]
We use the logic model in Extension to describe our programs. A logic
model helps show the connections between inputs, outputs and
outcomes to ensure strong programming with a focus on results. If
you are unfamiliar with logic models or want some refreshing, see the
other resources listed on this web site, or go to
www.uwex.edu/ces/pdande and click on logic model.
Once you’ve determined ‘what’ you will
evaluate, then be clear about your purpose.
What is the purpose of your evaluation?
We are conducting an evaluation of ___________________ (name of program) because ___________________________ in order to __________________________.
See the following slide for some examples
Purpose of the evaluation. Here are 3
examples. How do they differ?
We are conducting an evaluation of the Service Learning
Program because we want to know to what extent youth who
participate increase their leadership skills and service ethic in
order to report program outcomes to the county board.
OR…
We are conducting an evaluation of the Service Learning
Program because we want to know which activities help
youth the most in gaining leadership skills and a service ethic
in order to improve our program’s effectiveness.
OR…
We are conducting an evaluation of the Service Learning
Program because we want to know what participants see as
its value in order to get more youth and adults to participate.
Why is it important to be clear about your
evaluation purpose?
Go back to the previous slide. Think about
how each example differs (each focuses on
something different) and what it means for the
type of data that you would need to collect.
If you are not clear about your purpose, you
may end up with data that you don’t need or
want!
Take five…
Write a purpose statement for
an evaluation you are planning.
Based on purpose, we determine use and users: Who wants to know what?

For each audience, ask: WHAT do you/they want to know? HOW will they use the info?
• You – staff
• Participants
• Funder
• Other stakeholders??
Read the following scenario.
Has this ever happened to you?
We are a group of five people who want to evaluate our
after-school program. We know evaluation is important. We
want to develop a questionnaire that we will distribute to the
youth who participate in the program.
So we sit down to write the questionnaire. Soon it becomes
apparent that we each have different ideas about what we
want to find out. Mark, the school administrator, wants to
know if the students who attend do better academically.
Marg, Dick and I, as the program staff, want to find out what
the kids like and dislike so we can plan better for the next
program. Gloria, our funder, is interested in knowing how
many youth attend regularly, which youth they are, and
whether the program serves those in greatest need.
The previous scenario is an example
of what can happen when you decide
to ‘write a questionnaire’ before you
are clear about what the program is
and WHO WANTS TO KNOW WHAT
(determine use and users).
In our scenario, Mark, Marg, Dick,
Gloria and I got really frustrated and we wasted
a lot of time until we realized that we needed to
start with purpose and determine who wanted to
know what BEFORE we could write the
questionnaire.
What do your evaluation stakeholders want
to know?
Try asking them…
“I would like to know _______________ about (name of program).”
Or,
“I need to know ___________ because I need to decide __________.”
Go back to your logic model – what do you want to know about this program?

INPUTS: Staff; Grant; Partners; Time
OUTPUTS – Activities: Promote community service; Plan and organize a community service initiative; Provide assistance, mentoring, best practices; Foster positive youth-adult interactions; Reflect and evaluate
OUTPUTS – Participation: Youth ages 12-16; Adults
OUTCOMES – Short-term: Youth increase knowledge about community needs and issues; Youth increase skills in planning, decision making, problem solving; Youth gain confidence in their leadership skills; Youth feel more valued and involved; Adults increase skills in working with youth; Adults increase their appreciation for youth and their role in communities
OUTCOMES – Medium-term: Youth put their knowledge and skills into action and meet a real community need; Youth engage in additional community activities; Adults automatically think of youth in community roles; Adults expand opportunities for youth involvement
OUTCOMES – Long-term: Youth are connected with and contribute to communities

What do you want to know?
• Inputs: What amount of $ and time is invested? Quality? Timeliness?
• Activities: To what extent are activities implemented? What is the quality and effectiveness of activities?
• Participation: Which youth and adults participate? Who doesn't? What is the scope and intensity of participation? How satisfied are they?
• Short-term outcomes: To what extent are short-term outcomes achieved? For whom? Why or why not? What else is happening?
• Medium-term outcomes: To what extent are these outcomes occurring? For whom? Why or why not? What else is happening?
• Long-term outcomes: To what extent are youth more connected with and contributing to communities?
Evaluation questions
we ask in 4-HYD are about…
Outcomes and impacts
• What difference have we made? In what ways?
To what extent? For whom?
• Have youth and community outcomes improved?
What changes are there in knowledge, skills,
behaviors, practices, conditions?
• What role did we play in the change?
• Did we achieve our goal?
Questions about…
Satisfaction – reactions:
• How useful was the information provided?
• How well did the program meet needs?
• What were participants' reactions to the facility and logistics?
• How convenient was the location and timing?
Questions about…
Program Activities
• What program activities were implemented?
• How? What was the quality of implementation?
• Did it go as planned?
• What can we do better?
• Who came/did not come? Were they satisfied?
• What resources did we mobilize?
• What did Extension contribute/make possible?
• What were the challenges/barriers?
Questions about…
Program Inputs
• What did Extension contribute?
• Who collaborated?
• How much did things cost?
• What resources were used?
• Were inputs provided in a timely and efficient way?
• Who else contributed what?
• Was the level of inputs sufficient?
Questions for…
Program planning
• What are future needs?
• What are people currently doing?
• What do people prefer?
• How do people learn the best?
• What are priorities?
• What are possible alternative approaches?
Questions about…
Your teaching or facilitation
• How effective was the teaching/facilitating?
• How can you improve?
• What worked well? What didn’t?
Remember, in Extension we don't consider “satisfaction” an outcome. Satisfaction and a positive reaction may be necessary for learning and for desired changes to occur, but they are not sufficient. For example, youth may really like attending, have a great time at the program, and gush with enthusiasm: “I really love this,” “This is great,” or “I'm going to get all my friends to get involved.”
But such satisfaction does not mean that the youth learned anything, have grown in any way, or can or will do anything differently. Having fun and “liking” the program may be an important and necessary step along the way…but it is not an outcome.
Questions about
Our participants
• Personal information such as age, gender,
race/ethnicity
• Level of experience, training, expertise
• Employment status, position
• Residence
• Etc.
So many questions and so little time….
• Differentiate between what you NEED to know
and what you’d LIKE to know!
• Prioritize: We can’t and don’t want to collect
information about everything!
Once you’ve decided on your questions,
then determine what information you
need in order to answer the question.
These are the indicators.
An indicator is the specific information, or
evidence, that represents the phenomenon you
are asking about.
Indicator of fire = smoke
Indicator of academic achievement = passing grades
Everyday example of indicator
Meet my cat, Calley.
How do I know when she has been fed?
What are the indicators that Calley has been fed?
I know that she has been fed when she:
• Lies contentedly on her pillow
• Purrs
• Doesn't meow and beg for food
• Doesn't paw my leg and cry
These are indicators that Calley has been fed.
For each evaluation question, determine the information you need to answer it.

Question: To what extent did the program increase youth-adult partnerships?
Indicators:
• #,% of Boards with youth participating in meetings before and after
• #,% of Boards with youth on committees before and after
• #,% of Boards with youth in leadership positions
• Change in quality of youth-adult interactions

Question: To what extent does the mentoring program lead to improved school performance?
Indicators:
• #,% of participants whose grades improve
• #,% of participants who have improved attendance
• #,% of participants with decreased # of behavioral reports
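For example (hypothetical numbers): if 12 of the 20 youth in the mentoring program improved their grades from one term to the next, the indicator “#,% of participants whose grades improve” would be 12 of 20, or 60%.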
Remember:
• Indicators may be quantitative or qualitative. “Not everything that counts can be counted.”
• Several indicators may be necessary.
• Indicators must be culturally relevant. For example, indicators for a youth leadership program might be very different depending upon the age, ethnic background, and abilities of the youth.
• The more specific the indicator is, the more specific your results will be.
Finally, as part of FOCUS, choose an evaluation design: when and how data will be collected.
Typical designs in youth program evaluation include:
– Single point in time (e.g., survey, end-of-program questionnaire)
– Pre-post program or retrospective post-then-pre (comparison of before to after)
– Multiple points in time (e.g., pre, post and follow-up)
– Comparison group designs (two or more groups)
– Case study design (in-depth analysis of a single case)
– Mixed method (e.g., the use of a survey, observation and testimonials)
Contribution vs. attribution
“Contribution” refers to what Extension contributes in helping to achieve desired outcomes.
“Attribution” refers to being able to attribute desired outcomes to our work, that is, drawing causal links and explanatory conclusions about the relationships between observed changes and our Extension intervention.
Seldom can we draw direct cause-effect
relationships; often many forces are involved in
producing outcomes. It may be more realistic to
think about what Extension ‘contributes’ than what
we ‘caused’.
Reflection time
What is one thing you learned
(or had reinforced) from going through this
presentation that you
hope not to forget?