Review: Alternative Approaches II


What three approaches did we last cover?

Describe one benefit of each approach

Which approach focuses on the marginalized?

What were the five cautions the authors shared
about the alternative approaches to evaluation?
Guidelines for Planning Evaluations:
Clarifying the Evaluation Request
and Responsibilities
Dr. Suzan Ayers
Western Michigan University
(courtesy of Dr. Mary Schutten)
Individuals who Affect or are Affected
by an Evaluation Study

Sponsor: authorizes the evaluation, provides
resources for its conduct

Client: requests the evaluation

Stakeholders: those who have a stake in the
program or in the evaluation’s results

Audiences: individuals, groups, agencies who
have an interest in the evaluation and receive
its results
Understanding Reasons for
Initiating Evaluation

Understanding the purpose of the evaluation is
an important first step
– Did a problem prompt the evaluation?
– Did some stakeholder demand it?
– Who has the need to know?
– What does s/he want to know? Why?
– How will s/he use the results?

Clients are often unfamiliar with evaluation
procedures and may not have thought through
the ramifications of an evaluation

Frequently, the purpose is not clear until the
evaluator has carefully read the relevant
materials, observed the evaluation object, and
interviewed stakeholders
Practical Application to YOUR Plan:
Questions to Begin
1) Why is this evaluation being requested? What
questions will it answer?
2) To what use will the evaluation findings be
put? By whom? What others should receive the
information?
3) What is to be evaluated? What does it include?
Exclude? During what time period? In what
settings? Who will participate?
4) What are the essential program activities?
How do they link with the goals and objectives?
What is the program theory?
5) How much time and money are available for
the evaluation? Who can help with it? Is any
information needed immediately?
6) What is the political climate and context
surrounding the evaluation? Will any political
factors and forces interfere in gaining
meaningful and fair information?
Informational Uses of Evaluation

Needs Assessment
– Determine whether sufficient need exists to initiate a
program and describe the target audience
– Assist in program planning by identifying potential program
models

Monitoring/Process Study
– Describe program implementation and whether changes
from the initial model have occurred

Outcomes Study
– Examine whether certain goals are being achieved at desired
levels

Cost Effectiveness Study
– Judge overall program value & its relative cost:value ratio
compared to competing programs
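The cost-effectiveness comparison above can be sketched as a small calculation. This is a hypothetical illustration, not a method from the text: the program names, costs, and outcome counts are invented, and "cost per outcome unit" stands in for whatever value measure a real study would define.

```python
# Hypothetical sketch: ranking competing programs by cost-effectiveness.
# All figures below are invented for illustration only.

programs = {
    "Program A": {"cost": 50_000, "outcome_units": 200},  # e.g., participants meeting a goal
    "Program B": {"cost": 80_000, "outcome_units": 250},
}

def cost_per_outcome(program):
    """Dollars spent per unit of outcome achieved (lower is better)."""
    return program["cost"] / program["outcome_units"]

# Rank programs from most to least cost-effective.
ranked = sorted(programs, key=lambda name: cost_per_outcome(programs[name]))
for name in ranked:
    print(f"{name}: ${cost_per_outcome(programs[name]):.2f} per outcome unit")
```

A real cost-effectiveness study would also weigh factors this sketch omits, such as differences in outcome quality and in who each program reaches.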
Noninformational Uses

Postponement of a decision

Ducking responsibility [decision already made,
but it needs to look justified]

Public Relations [justify the program]

Fulfilling grant requirements


Covert, nefarious, political uses of information:
– Typically more common in federal/national
evaluations
Conditions under which evaluation
studies are inappropriate

Evaluation would produce trivial information
– Low impact program, one-time effort

Evaluation results will not be used
– Regardless of outcome, political appeal/public support…

Cannot yield useful, valid information
(bad information is worse than none)
– Well-intentioned efforts, “mission impossible” evals

Evaluation is premature for the stage of the program
– Fitness program evaluation in first 6 weeks will not yield
meaningful information
– Premature summative evaluations are among the
most insidious misuses of evaluation

Motives of the evaluation are improper
– Ethical considerations, “hatchet jobs”
(propriety: the evaluation respects the rights and
dignity of data sources and helps organizations
address all clients’ needs)
Determining Appropriateness

Use a tool called evaluability assessment
– Clarify the intended program model or theory
– Examine the program implementation to
determine whether it matches the program
model and could achieve the program goals
– Explore different evaluation approaches to
match needs of stakeholders
– Agree on evaluation priorities and intended
uses of the study
Methods

Create working group to clarify program model
or theory, define information needs, evaluation
expectations
– Personal interviews with stakeholders
– Reviews of existing program documentation
– Site visits

Figure 10.1 (p. 186): checklist to determine
when to conduct an evaluation

Who will Evaluate?

External
– impartial, credible, expertise, fresh look
– participants may be more willing to reveal sensitive
information to outsiders
– more comfort presenting unpopular
information/advocating changes, etc.

Internal
– Knowledge of program, history, context, etc.
– familiarity with stakeholders
– Serve as advocates to use findings
– quick start up
– Known quantity

Combination
– Internal evaluators collect contextual information
– Internal evaluators collect data
– External evaluator directs data collection and
organizes the report
– Internal evaluator remains to advocate and provide
support after the external evaluator is gone
Evaluator Qualifications/Skills

Does the evaluator have the ability to use the
methodologies and techniques needed in the
study?

…have the ability to help articulate the
appropriate focus for the study?

…have the management skills to carry out the
study?

…maintain proper ethical standards?

…communicate results to audiences so that
they will be used?
