Extension Program Evaluation

Michigan Planning and Reporting System (MI PRS)
Winter 2011 Training
Part of the Planning Process
• Evaluation is an upfront activity in
the design or planning phase of a
program.
• Evaluation is not an after-program
activity.
Why Outcomes?
Today, in a time of continued
reduction in government funding,
Extension professionals are
challenged more than ever before to
document outcomes of programs
and address stakeholder demands
for accountability.
Review of Part of Bennett's Hierarchy
As one moves up the hierarchy, the evidence of program impact gets stronger. The levels, from top (strongest evidence) to bottom:
• End Results
• Practice Change
• KASA (knowledge, attitudes, skills, aspirations)
• Reactions
• People Involvement
• Activities
• Resources
• Collecting impact data on programs is costly, time-consuming, and requires skill. (But not impossible!)
• Extension professionals are expected to evaluate a minimum of one program a year at the impact level.
Example
• A pre/post measure can assess short-term outcomes on knowledge, attitudes, skills, and aspirations (motivation to change); a minimal analysis sketch follows below.
• A plan for participant follow-up is required to assess behavior or practice change.
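As one illustration (not part of MI PRS), a paired t-test is a common way to analyze pre/post scores collected from the same participants. The sketch below uses Python with SciPy; the participant scores are hypothetical.

```python
# Minimal pre/post analysis sketch for a short-term KASA outcome.
# Scores are hypothetical, for illustration only.
from scipy import stats

# Knowledge scores (0-10 scale) for the same eight participants,
# measured before and after the program.
pre  = [4, 5, 3, 6, 5, 4, 6, 5]
post = [7, 6, 5, 8, 7, 6, 8, 7]

# A paired t-test asks whether the mean pre-to-post change differs from zero.
t_stat, p_value = stats.ttest_rel(post, pre)
mean_change = sum(post) / len(post) - sum(pre) / len(pre)

print(f"Mean change: {mean_change:.2f} points")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```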
Plan early
• Plan early for the cost, time, skills (data collection, analysis, interpretation), and resources needed to evaluate an Extension program.
• Work with Institute teams/groups.
• Evaluating programs at the lower levels (inputs, participation, collaboration, activities, and reactions) may require little effort and is less expensive. This is process evaluation.
Process Evaluation
• Process evaluation, also called formative evaluation, helps program staff assess how ongoing programs are implemented and how they can be improved.
• Examples: program fidelity,
reaching target audiences
Outcome Evaluation
• Documenting impact or community-level outcomes requires skills in questionnaire development, data collection and analysis, interpretation, and reporting.
• Summative evaluation, also called impact or outcome evaluation, may require understanding of evaluation designs, data collection at multiple points, and sophisticated statistical analyses such as analysis of covariance (ANCOVA) and the use of covariates; a minimal sketch follows below.
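To make the ANCOVA idea concrete, here is a minimal sketch (one possible approach, not a prescribed MI PRS method) using Python's statsmodels: post-program scores are modeled by group, with the pre-program score as the covariate. The group labels, variable names, and data are hypothetical.

```python
# Minimal ANCOVA sketch: estimate a program effect on post-program scores,
# adjusting for pre-program scores (the covariate). Data are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "group": ["program"] * 5 + ["comparison"] * 5,
    "pre":  [4, 5, 3, 6, 5, 4, 5, 3, 6, 5],
    "post": [7, 8, 6, 9, 8, 5, 5, 4, 6, 5],
})

# ANCOVA as a linear model: post ~ group + pre.
# The group coefficient is the program effect adjusted for pre scores.
model = smf.ols("post ~ C(group) + pre", data=df).fit()
print(model.summary())
```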
A Framework for Linking Costs and Program
Outcomes Using Bennett's Hierarchy
Process (formative) evaluation covers Inputs, Activities, Participation, and Reactions; outcome evaluation covers KASA, Practice/Behavior Change, and SEEC (social, economic, and environmental conditions).

Cost & Outcomes | Inputs | Activities | Participation | Reactions | KASA | Practice/Behavior Change | SEEC
Short Term      |   X    |     X      |       X       |     X     |  XX  |           XXX            | ----
Intermediate    |   X    |     X      |       X       |   ----    |  XX  |           XXX            | XXXX
Long Term       |   X    |     X      |       X       |   ----    |  XX  |           XXX            | XXXX
X = low cost, effort, and evidence;
XX = requires questionnaire development, data collection, and analysis skills;
XXX = requires understanding of evaluation designs, multiple data collections, additional analysis skills, and interpretation;
XXXX = all of the above, plus time and increased costs, potentially resulting in stronger evidence of program impact.
Professional Development
• Plans for professional development are captured in MI PRS; consider building skills in evaluation.
• Work with Institute work teams to develop program evaluation plans that fit their logic models.
To make an Evaluation Plan:
1. Decide if the program is ready for
formative/process or
summative/outcome evaluation.
2. Link program objectives to
evaluation questions that address
community outcomes.
To make an Evaluation Plan, Cont.
3. Identify key indicators for
evaluation (make sure they are
measurable and relevant).
4. Consider evaluation costs (follow-up techniques and comparison groups used in summative designs are more expensive).
5. Develop a cost matrix.
• Tracking program and project processes and outputs, as well as outcomes, will require data collection and analysis systems outside of MI PRS.
• Link program costs and the cost of evaluation to the outcomes; a minimal cost-matrix sketch follows below.
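As a hedged sketch of what such a cost matrix might look like outside of MI PRS, the example below keeps one row per outcome level and links program and evaluation costs to each; all figures and column names are hypothetical.

```python
# Hypothetical cost matrix linking program and evaluation costs to the
# outcome levels in the framework table above. Figures are illustrative.
import pandas as pd

cost_matrix = pd.DataFrame(
    {
        "program_cost":    [2000, 3500, 6000],
        # Evaluation cost rises with follow-up and comparison-group designs.
        "evaluation_cost": [300, 1200, 4500],
    },
    index=["Short Term", "Intermediate", "Long Term"],
)
cost_matrix["total_cost"] = cost_matrix["program_cost"] + cost_matrix["evaluation_cost"]
print(cost_matrix)
```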
Conclusion
In the end, evaluation questions that address the “so what” issue are connected to outcomes and costs, and ultimately justify the value of Extension programs to the public good.
Key Reference
Radhakrishna, R., & Bowne, C. (2010). Viewing Bennett’s hierarchy from a different lens: Implications for Extension program evaluation. Journal of Extension, 48(6). Retrieved January 24, 2011, from http://www.joe.org/joe/2010december/tt1.php
MSUE Resources
• Organizational Development webpage
– Planning, Evaluation, and Reporting section
Evaluation Resources Will Grow!
Other Extension materials on evaluation are available, with MSU-specific resources to be released in 2011.
MSU Evaluation Specialist
• Assists work teams in developing logic model objectives and evaluation strategies
• Consults on evaluation designs
• Provides guidance on data analysis and selecting measures
• Develops and delivers educational programs related to Extension program evaluation
• Facilitates evaluation plan development or brainstorming for Institute work teams
Organizational Development team member
• Dr. Cheryl Peters, Evaluation Specialist
• Statewide coverage
• cpeters@anr.msu.edu
• 989-734-2168 (Presque Isle); 989-734-4116 (fax)
• Campus office: Room 11, Agriculture Hall
• Campus phone: 517-432-7605
• Skype: cpeters.msue
MI PRS Resources