Wilder Research
Evaluation 101 Workshop
With Nicole MartinRogers, Ph.D.
Great Lakes Culture Keepers conference
@ the Mille Lacs Indian Museum and Trading Post
April 28, 2015
Agenda
 Introductions and framing questions
 Evaluation overview
 Logic models
(lunch break)
 Evaluation plans
 Collecting data
 Using evaluation results to improve programs
 Evaluation in the real world
Framing questions
Introduce yourself, the organization you are from,
and answer any or all of the following:
 What are your biggest concerns about
evaluation?
 What is your best evaluation experience?
 What is one thing you hope to learn from today’s
session?
Evaluation Overview
Why we evaluate
What it is
The evaluation process
Terminology and approaches
Considerations and limitations
Why evaluate?
 Ongoing learning and program improvement
 Guide programming decisions
 Assess effectiveness, identify best practices
 Demonstrate program impact
 Meet funder requirements and seek funding
 Generate support for the program
 Recognize a job well done
What is evaluation?
 Systematic process
 Collects information about questions/issues
 Improves knowledge and decision-making
 Asks questions about issues that come from
your everyday practices
Approaches to evaluation
 Developmental evaluation
 Program implementation (process)
 Satisfaction
 Program impact (outcomes)
Approaches to evaluation (cont.)
Initiative is innovating and in development (exploring, creating, emerging):
 Implementers are experimenting with different approaches and activities.
 There is a degree of uncertainty about what will work, where, and with whom.
 New questions, challenges, opportunities, successes, and activities continue to emerge.
Try Developmental Evaluation
Initiative is forming and under refinement (improving, enhancing, standardizing):
 Core elements of the initiative are taking shape.
 Implementers are refining their approach and activities.
 Outcomes are becoming more predictable.
 The context is increasingly well known and understood.
Try Formative or Process Evaluation
Initiative is stabilizing and well-established (established, mature, predictable):
 The initiative’s activities are definable and established, and do not change significantly as time passes.
 Implementers have significant experience with, and an increasing sense of certainty about, what works.
 The initiative is ready for a determination of merit, worth, value, or significance.
Try Summative or Outcomes Evaluation
An Indigenous framework for evaluation*
 Engaging the community
 Cultural metaphors
 Ways of knowing
 Core cultural values
 Telling a story (and knowing your audience)
 Responsive information gathering
 Looking to our gifts (“strengths-based”)
 Interpreting and sharing the information
*Adapted from LaFrance & Nichols (2009)
Evaluation process
Engaging stakeholders in evaluation
 Stakeholders are:
– Organizations and individuals who care about the
program and/or the evaluation findings
– In general, anyone who has something to gain or lose
from the program
 Not everyone can be or has to be at the table
– Stakeholders can be engaged in different ways and at
different levels
Logic models
Program theory
Terminology
Uses
Cultural metaphors
Activity
Fill in the Blanks
Program theory
You are here. You need to be there.
What needs to happen to get from here to there?
• IF the activity/program is provided THEN what should be
the result (impact) for participants?
• What ACTIVITIES need to happen, and in what
INTENSITY and DURATION, for participants to
experience the desired OUTCOME?
• What EVIDENCE do you have that this activity/program
will lead to the desired result?
Logic models help you…
 Build consensus and clarity about essential
program activities and outcomes
 Identify opportunities for program improvement
 Be clear about beliefs and assumptions that
underlie program design
 Promote evidence-based thinking
 Avoid scope creep or mission drift
 Evaluate your impact
Elements of a logic model
 Inputs: any resources or materials used by the
program to provide its activities
 Activities: any services or programs provided by
the program
 Outputs: any quantifiable documentation of the
activities of a program
Elements of a logic model (cont.)
 Outcomes: any characteristics of the participants
that, according to the theory and goals of the
services, can be reasonably expected to change
as a result of the participant’s receiving services
 A common progression (sketched in code below) is:
– Short-term outcomes: Changes in knowledge or awareness
– Intermediate outcomes: Behavioral changes
– Long-term outcomes: Sustained behavior change,
individual impacts, community impacts
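To make these elements concrete, here is a minimal sketch (Python, not part of the original workshop) of a logic model as a simple data structure, borrowing the Maple Sap Harvest Workshop from the afternoon activity as a stand-in; every specific input, output, and outcome below is invented for illustration only.

from dataclasses import dataclass

@dataclass
class LogicModel:
    """A logic model as a simple record: inputs flow into activities,
    which yield countable outputs and, over time, expected outcomes."""
    inputs: list      # resources/materials used by the program
    activities: list  # services or programs provided
    outputs: list     # quantifiable documentation of the activities
    outcomes: dict    # timeframe -> expected change in participants

# Hypothetical example; entries are invented for illustration.
maple_sap = LogicModel(
    inputs=["instructor time", "taps and buckets", "access to a sugarbush"],
    activities=["hands-on sap harvesting workshop"],
    outputs=["# of workshops held", "# of participants"],
    outcomes={
        "short-term": "increased knowledge of sap harvesting",
        "intermediate": "participants harvest sap on their own",
        "long-term": "sustained practice shared in the community",
    },
)
print(maple_sap.outcomes["short-term"])

The point of the sketch is the shape, not the code: each element of the slide maps to one field, and the outcomes field captures the short-term/intermediate/long-term progression above.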
Outcomes: consider the following
 To what extent are the activities likely to create change
given dosage?
 Which users are likely to be impacted?
 What other factors may influence whether change
occurs?
A logic model should provide a realistic statement of
outcomes that are likely to occur given the inputs and
activities, as well as the circumstances of participants and
the context within which the program occurs.
How to build a logic model
 Program staff: state in plain language how and
why you are doing what you are doing and what
you expect to come from it
– Working sessions to identify key inputs, activities, and
outcomes can be great team-building exercises
 Participants: ask them what they got out of the
program and how it could be improved
 Literature and other programs: the field and prior
research can tell you what outcomes can be
expected from your program (and strengthen
your model)
Logic models
A few examples
Lunch break!
Enjoy your lunch
See you back here at 1:00!!
Logic model activity
Maple Sap Harvest Workshop
Use your logic model to…
 Communicate to key stakeholders:
– Board
– Funders
– Staff and volunteers
– Participants
 Share it on your website, in your annual report, on social
media, in evaluation reports, in funding requests
 Convert your logic model to narrative using evaluation
language (“outcomes”)
 Design your evaluation plan (part 2 of this training)
Evaluation plans
Evaluation questions
Methods
Community engagement
Activity
Why you need an evaluation plan
The evaluation plan is where you document
your evaluation questions and your specific
plan for gathering, analyzing, and reporting
the data to answer those questions.
Documentation of these steps is critical for
accountability and to avoid evaluation
scope creep.
Evaluation questions
What is it you want to learn about your
program?
“To what extent does (what we do)
affect/change (a behavior or characteristic)?”
Process issues – what happened?
 Was the program implemented as intended?
 Were the targeted participants served – number
and type?
 Which aspects of the program worked well?
 Which aspects were problematic? Why?
** Consider setting specific output targets for the activities in
your logic model (some funders may require this)**
Process issues (cont.)
Prioritize your process evaluation questions based
on how much the answer to the question will…
 Influence participant outcomes or satisfaction
 Concern staff members or other key stakeholders
 Help with planning or improvement decisions
 Add contextual understanding of the program
Satisfaction – what do people think?
 Do elements of participant satisfaction make a
difference in positive outcomes?
 Will you be able to do anything with your
satisfaction results? Or is it beyond your
resources or control?
 Are there key stakeholders
whose satisfaction will influence
your program’s sustainability?
Outcomes – what changed?
 Did the program achieve the desired outcomes?
** Use your logic model to determine which outcomes to
measure in which timeframe AND to demonstrate which
outcomes you do not need to measure because the link
has already been demonstrated in the literature **
Outcomes (cont.)
Prioritize your evaluation questions based on
which outcomes will be the most . . .
 Important to participants
 Important to other stakeholders, including funders
 Useful in understanding success and guiding
improvements
 Achievable
 Feasible to measure (last but not least!)
Evaluation plan activity
Developing evaluation questions
Measure things right!
 Valid (measuring what you want to measure)
 Reliable (consistent measurement)
 Culturally responsive questions and approach (give gifts)
 Ethical and legal
 Useful for multiple purposes
 Sensitive to change
 Easy to use
 Accessible
 Relevant
 Focused
Methods
Qualitative data
 Word information:
– Interviews
– Focus groups
 Analyzed by:
– Grouping data by key themes
Quantitative data
 Numerical information:
– Closed-ended surveys
– Administrative data
 Analyzed by:
– Calculating/computing data
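As an aside not in the original slides, a minimal Python sketch of the two analysis styles: quantitative data get calculated (counts, averages), while qualitative responses get grouped by theme. The ratings, responses, and theme keywords are all hypothetical.

from statistics import mean
from collections import defaultdict

# Quantitative: hypothetical 1-5 satisfaction ratings from a closed-ended survey
ratings = [4, 5, 3, 4, 5, 2, 4]
print(f"n = {len(ratings)}, average rating = {mean(ratings):.1f}")
print(f"rated 4 or 5: {sum(r >= 4 for r in ratings) / len(ratings):.0%}")

# Qualitative: hypothetical open-ended responses, grouped by keyword themes
responses = [
    "Loved the hands-on activity",
    "Wish there was more time for questions",
    "The hands-on part was the best",
    "More time with the presenter would have helped",
]
themes = {"hands-on": "hands-on learning", "time": "wanting more time"}
grouped = defaultdict(list)
for text in responses:
    for keyword, theme in themes.items():
        if keyword in text.lower():
            grouped[theme].append(text)
for theme, quotes in grouped.items():
    print(f"{theme}: {len(quotes)} response(s)")

In practice, qualitative coding is done by a person reading for meaning rather than by keyword matching; the sketch only shows the calculate-versus-group distinction.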
Data sources
Primary data sources
 Information collected specifically for your evaluation:
– Surveys
– Interviews
– Focus groups
– Administrative/program data
Secondary data sources
 Data that have already been collected about the target population:
– Existing population-level data sets
– Previous research and program evaluations
Data collection
 Surveys
– Paper, Web, Phone
 Interviews
– Informal, Semi-structured, Structured
 Focus groups and talking circles
 Observation
 Interactive observation
 Administrative/program records
Interactive observation example – Wordle
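A word cloud like Wordle is, underneath, just a word-frequency count rendered with more frequent words drawn larger. A rough Python sketch of that counting step (the sample responses and stopword list are invented for illustration):

import re
from collections import Counter

# Hypothetical responses gathered during an interactive observation exercise
responses = [
    "Learning from elders about maple sap harvesting",
    "Harvesting sap together as a community",
    "Sharing what we are learning with the community",
]
stopwords = {"from", "about", "as", "a", "the", "what", "we", "are", "with"}
words = [w for w in re.findall(r"[a-z]+", " ".join(responses).lower())
         if w not in stopwords]
# The most frequent words would be drawn largest in the word cloud
print(Counter(words).most_common(5))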
Data collection resources
 Survey Monkey and other online survey tools
– But most of these tools won’t help you design a good
survey; they just make it easier to administer
– Check out: https://www.surveymonkey.com/mp/survey-guidelines/
 Focus groups:
– http://www.eiu.edu/~ihec/Krueger-FocusGroupInterviews.pdf
 Wordle:
– http://www.wordle.net/
Using evaluation results to improve
your program
Data-based decision-making
Continuous quality improvement
and a learning culture
Using evaluation for program improvement
 What are the most important findings?
 How should the program be improved, if at all?
 What things about the program worked really well? How
great are the outcomes? How strong is the evidence?
 Do the results lead to additional questions about the
program?
 Who should see the results? In what format?
*Helpful hint: You need to take time to do
these things as part of your evaluation plan
Evaluation use activity
Analyzing, interpreting, and using
evaluation data
Evaluation in the real world
Considerations for doing evaluation in-house
Working with professional evaluators
Evaluation resources
Considerations
 What skills do you and your staff have?
 What additional skills might you need?
 What is your budget for evaluation?
 Do you and your staff have time to spend on the
evaluation work in addition to your other tasks?
 How can you use volunteers in your evaluation?
 How can you incorporate evaluation data
collection into standard operating procedures?
Working with professional evaluators
 What specific skills or expertise will add value?
– Topical/program expertise?
– Experience with similar grants?
– Cultural knowledge?
 What capacity does the contractor have?
– Do you want an independent contractor or a larger
evaluation firm?
 What is your budget for evaluation?
 Who will manage and guide the work of the
contractor?
Evaluation resources
 www.wilderresearch.org – find evaluation tip
sheets, evaluation reports on a variety of topics,
and more!
 LaFrance, J., & Nichols, R. (2009). Indigenous
Evaluation Framework: Telling Our Story in Our Place
and Time. American Indian Higher Education
Consortium (AIHEC)
 Visitor Studies Association (VSA):
http://visitorstudies.org/resources
 Institute of Museum and Library Services (IMLS):
http://www.imls.gov/research/evaluation_resources.aspx
Wrap-up
Evaluation & feedback on this session
Next steps
Full circle: follow-up questions
 What do you think of when you think of
evaluation? What has changed after today’s
session?
 What do you hope to get out of today? Is there
anything that was not covered that you still
want to learn about evaluation?
 What went well today?
 Was there anything that could have been
improved?
Next steps
 Which of these tools, if any, do you think you
plan to use in your work?
 What additional support would you need to be
able to use these tools?
 What additional evaluation training or resources
do you plan to seek out in the future?
 Anything else?
Chi’miigwech! (Thank you very much!)
Nicole MartinRogers, Ph.D.
Senior Research Manager
WILDER RESEARCH
651-280-2682
nicole.martinrogers@wilder.org
Follow me on Twitter @NMartinRogers
www.wilderresearch.org