Directors School 2012 - North Carolina State University

Karla A. Henderson, Ph.D.
Professor, North Carolina State University
karla_henderson@ncsu.edu
Framework
 Evaluation Inventory (handout)
 Needs Assessment Regarding Program Evaluation (types of needs explained)
 Small Group Needs Identification (6 people per group)
  • Importance (Very, Moderately, Slightly, Not)
  • 5-10 topics to discuss together
What is Evaluation?
Evaluation= systematic collection and analysis of data to
address criteria to make judgments about the worth or
improvement of something; making decisions based on
identified criteria and supporting evidence
Assessment= the examination of some type of need that
provides a foundation for further planning
Evaluation is sometimes like solving a mystery
Types of Program Evaluation
 Macro (System Evaluation)
 Micro (Activity or Event Evaluation)
Formal (Systematic)
Evaluation
MACRO and MICRO approaches provide rigor
 Systematic gathering (procedures and methods) of evidence
 Leads to decisions and action
  • Criteria
  • Evidence
  • Judgment
Assessments
 Why assess?
 eventually use the input to design programs and
objectives
 generates new program ideas
 gives constituents a say
 helps you be responsive
 What is a need? A want? An intention?
 Assessments determine all three of these so YOU
can figure out how to promote what you are doing
Types of needs (Caution!)
 Normative needs - what should be available
 Felt needs - what individuals believe they would like
 Expressed needs - needs fulfilled through participation
1. Why Do Systematic Evaluation?
Potential Purposes:
 Efficiency - How are we doing? (Management/Process Focused)
 Effectiveness - What difference do our efforts make? (Impact/Outcome Focused)
Accountability Era—DO YOU AGREE?
 What gets measured gets done
 If you don’t measure results, you
can’t tell success from failure
 If you can’t see success, you can’t
reward it
 If you can’t reward success, you’re
probably rewarding failure
Reinventing Government, Osborne and Gaebler, 1992
University of Wisconsin-Extension, Program Development and Evaluation
Accountability Era
 If you can’t see success, you can’t
learn from it
 If you can’t recognize failure, you
can’t correct it.
 If you can demonstrate results, you
can win public support.
Reinventing Government, Osborne and Gaebler, 1992
University of Wisconsin-Extension, Program Development and Evaluation
Evaluation Process: “Begin
with the end in mind.”
Covey (1990), 7 Habits of Highly Effective People
2. How to Think Like
an Evaluator
A thinking process used by evaluators (from Richard Krueger):
 Reflecting
  • Develop a theory of action: a logical sequence that results in change.
  • Begin with what is supposed to happen--the results.
 Listening
  • Share the theory of action with others.
 Measuring
  • Determine your measurement strategies--how you are going to look at the program.
 Adding value to the program
  • What can evaluation do to contribute to the program? How can evaluation make the program better, more enjoyable, focused on results, accountable, and satisfying to participants and educators?
Ways of Thinking:
 Goal Based Thinkers - "We look for goals"
 Audit Thinkers - "We investigate and find out what's wrong"
 Utilization Focused Thinkers - "We make evaluation useful"
 Empowerment Focused Thinkers - "We empower local people"
 Positivistic Thinkers - "We are scientists"
 Number Thinkers - "We count--and we do it well"
 Qualitative Thinkers - "We tell stories" (Richard Krueger)
Practical tips for successful evaluation (Richard Krueger):
 Involve others. Utilization, impact, and believability emerge from involving colleagues and clientele. If you want the information used, then involve others!
 Ask yourself: Do I have a program? And is it worthy of evaluation?
 Consider your purpose for evaluating (see earlier slide)
 Consider who wants the evaluation - who requested it?
 Use a variety of evaluation methods when possible.
 Keep costs low by sampling strategically.
 Keep interest high by adding payoff for the participant.
 Start with goals, but don't be unduly limited by goals.
 Consider "early evaluation."
 Design the evaluation carefully. The evaluation should:
  • Enhance the program
  • Yield information beneficial to stakeholders
  • Conserve resources
3. Differences Between Assessment, Evaluation, and (Action) Research
Evaluation= systematic collection and analysis of data to
address criteria to make judgments about the worth or
improvement of something; making decisions based on
identified criteria and supporting evidence
Assessment= the examination of some type of need that
provides a foundation for further planning
Action Research = evaluation leading to decisions/changes
4. Steps Involved in
Evaluation Process
Steps
 Problem, Idea Identified
 Problem Statement/Purpose Determined
 Instrument/Method Chosen
 Data Sources
 Data Collection
 Data Analysis
 Conclusions/Recommendations
Evaluation Process: “Begin
with the end in mind.”
Covey (1990), 7 Habits of Highly Effective People
5. What Should be
Evaluated
Areas to Evaluate
Personnel
Places
Policies
Programs
Participant Outcomes
Potential Purposes:
 Efficiency - How are we doing? (Management/Process Focused)
 Effectiveness - What difference do our efforts make? (Impact/Outcome Focused)
Levels of Evaluation:
 END RESULTS (Impact)
 PRACTICE CHANGE (Outcomes)
 KASA CHANGE (knowledge, attitudes, skills, and aspirations) (Outcomes)
 REACTIONS (SATISFACTION) (Outputs)
 PEOPLE INVOLVEMENT (Outputs)
 ACTIVITIES (Outputs)
 INPUTS - RESOURCES
What is sampling?
 A population is the theoretically specified aggregation
of study elements.
 A sample represents or is representative of a
population
Types of Sampling
 Probability
 Non-Probability
 Theoretical
Probability
 Probability sampling
Samples are selected in accord with probability theory, typically
involving some random selection mechanism.
Random
Stratified Random
Systematic
Cluster
 Representativeness
Quality of a sample having the same distribution of characteristics
as the population from which it was selected.
Nonprobability
 Technique in which samples are selected in a way that
is not suggested by probability theory.
Purposive
Convenience
Quota
Expert
Snowball
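For directors who want to see how the probability methods above play out in practice, here is a minimal sketch (not from the handout; the participant list and the age_group labels are invented) of a simple random sample and a proportional stratified random sample, using only Python's standard library.

```python
# Sketch only: compare simple random sampling with stratified random sampling.
import random

random.seed(42)  # reproducible example

# Hypothetical sampling frame: 200 program participants with an age-group label
population = [{"id": i, "age_group": "youth" if i % 3 else "adult"} for i in range(200)]

# Simple random sample: every member has an equal chance of selection
simple_sample = random.sample(population, k=20)

# Stratified random sample: sample within each age group, proportional to its size
def stratified_sample(pop, strata_key, total_n):
    strata = {}
    for person in pop:
        strata.setdefault(person[strata_key], []).append(person)
    sample = []
    for group in strata.values():
        n = round(total_n * len(group) / len(pop))  # proportional allocation
        sample.extend(random.sample(group, k=n))
    return sample

strat_sample = stratified_sample(population, "age_group", 20)
print(len(simple_sample), len(strat_sample))
```

The stratified version guards representativeness on the chosen characteristic, which is the quality described above.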
6. Who Should
Conduct Evaluations
WHO DOES THE EVALUATIONS?
 Internal
 You!
 Staff
 Agency Evaluation Personnel
 External
 Consultants
 University Students!
Regardless—YOU have to know your purpose, goals, and
appropriate methods!
7. When Should Evaluations Be Done
Timing
 Assessments (planning) – find out
where to begin based on what you
know
 Formative (process)- concerned with
efficiency and effectiveness
 Summative (product) – overall
performance
Approaches to Needs
Assessments
 Literature/Professional Development
 Advisory Groups
 Structured Interviews (individual and
focus groups)
 Surveys
Formative Evaluation
 Evaluation in Process
 Allows Changes to be Made
Immediately
 Most often Focused on Inputs and
Outputs
Summative Evaluation
At the END of something
“What was?”
Recommendations for the Future
8. What Tools are
Available for
Evaluations
Data Collection Methods
MAJOR ONES:
 Questionnaires/Surveys
 Interviews (Individual and Focus Groups)
(Pros and Cons)
Other Methods:
 Systematic Observation
 Checklists
 Field Observations
 Unobtrusive Measures
 Physical Evidence
 Archives
 Covert Observations
 Visual Analyses
 Experimental Designs
 Case Studies
9. What “Cutting
Edge” Strategies
Exist for Evaluation
To consider (see below for further info)
 Logic Models—”outcome focused”
 Trends Analysis
 Benchmarking
 PRORAGIS (NRPA)
10. CAPRA Standards
related to Evaluation
CAPRA Standards
 10.1 Systematic Evaluation Program
  There shall be a systematic evaluation plan to assess outcomes and the operational efficiency and effectiveness of the agency.
 10.2 Demonstration Projects and Action Research
  There shall be at least one experimental or demonstration project or involvement in some aspect of research, as related to any part of parks and recreation operations, each year.
CAPRA Standards
 10.3 Evaluation Personnel
There shall be personnel either on staff or a
consultant with expertise to direct the
technical evaluation/research process.
 10.4 Employee Education
There shall be an in-service education program
for professional employees to enable them to
carry out quality evaluations.
11. Evaluating Inputs,
Outputs, Outcomes,
and Impacts
Evaluation Approaches - KEY POINTS!
 Multiple LEVELS of evaluations:
 Inputs (costs, personnel, etc.)
 Outputs
  • Activities
  • People involvement
  • Reactions
 Outcomes
  • KASA - knowledge, attitudes, skills, aspirations
  • Behavior CHANGE
 Impacts
  • Long-term benefits
12. Using Goals and
Objectives as Basis
for Evaluation
 What are the goals of the
program?
 What do we expect to
happen?
 What do we want
participants to do, gain,
learn?
BEGIN WITH THE END IN MIND!!
Goals and Objectives
 Goals
  • Broad, long-range statements that define the programs/services that are going to be provided
 Objectives
  • Specific statements (about the attainable parts of the goal) that are measurable and have some dimension of time.
Objectives
 Specific
  • Must be clear and concrete
 Measurable
  • Must be some way to determine whether or not the desired results have been achieved
 Achievable
  • Must be attainable and reality-based!!!
 Relevant
  • Must be useful; must have worth to your organization
 Time-limited/Time-connected
  • Must specify a time frame for accomplishment
Adapted from Edginton, Hanson, & Edginton, 1980
13. Using Logic
Models as Basis for
Evaluation
Logic Model (Preview)
 Goal
 Strategies/Activities
 Short-term Outcomes
 Long-term Outcomes
 Data Sources & Performance Measures
Moving Toward Success: Framework for After-School Programs, 2005
A logic model is…
 A HOT TOPIC in (Program) Evaluation
 A depiction of a program showing what the
program will do and what it is to
accomplish.
 A series of “if-then” relationships that, if
implemented as intended, lead to the
desired outcomes
 The core of program planning and
evaluation
University of Wisconsin-Extension, Program Development and
Evaluation
Simplest Form of a Logic Model
INPUTS -> OUTPUTS -> OUTCOMES
Everyday example
Situation: headache
 INPUTS: get pills
 OUTPUTS: take pills
 OUTCOMES: feel better
Where are you going?
How will you get there?
What will show that you’ve
arrived?
“If you don’t know
where you are going,
how are you gonna’
know when you get
there?”
Yogi Berra
How will activities lead to desired outcomes?
A series of if-then relationships
Tutoring Program Example:
 IF we invest time and money, then
 IF we can provide afterschool programs to 50 children, then
 IF students will have opportunities for active recreation, then
 IF they will learn and improve their recreation skills, then
 IF they will be more active, then
 they will become fit and develop lifelong skills.
University of Wisconsin-Extension, Program Development and Evaluation
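As a rough illustration (an addition here, not part of the original slides), the tutoring-program chain above can be written down as plain data and replayed as if-then statements, which makes the assumed logic easy to review with stakeholders.

```python
# Sketch only: the tutoring-program logic chain as a simple list of links.
chain = [
    "we invest time and money",
    "we can provide afterschool programs to 50 children",
    "students will have opportunities for active recreation",
    "they will learn and improve their recreation skills",
    "they will be more active",
    "they will become fit and develop lifelong skills",
]

# Render each adjacent pair as an "IF ... THEN ..." statement
for cause, effect in zip(chain, chain[1:]):
    print(f"IF {cause}, THEN {effect}")
```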
Logical chain of connections showing what a program is to accomplish:
 INPUTS - program investments - what we invest
 OUTPUTS - activities, participation - what we do, who we reach
 OUTCOMES - short, medium, long-term - what results
OUTPUTS
What we do (ACTIVITIES):
 • Train, teach
 • Deliver services
 • Develop products and resources
 • Network with others
 • Build partnerships
 • Assess
 • Facilitate
 • Work with the media
 • …
Who we reach (PARTICIPATION):
 • Participants
 • Clients
 • Customers
 • Agencies
 • Decision makers
 • Policy makers
 • Satisfaction
University of Wisconsin-Extension, Program Development and Evaluation
OUTCOMES
What results for individuals, families, communities…
 SHORT-TERM (Learning) - changes in: awareness, knowledge, attitudes, skills, opinion, aspirations, motivation, behavioral intent
 MEDIUM-TERM (Action) - changes in: behavior, decision-making, policies, social action
 LONG-TERM (Conditions) - changes in conditions: social (well-being), health, economic, civic, environmental
CHAIN OF OUTCOMES
Limitations
Logic Model…
 Doesn’t address:
Are we doing the right thing?
University of Wisconsin-Extension, Program Development and Evaluation
14. Using
Benchmarking as a
Basis for Evaluation
Benchmarking
 Agencies or services compared to other agencies or services to determine a relative value
 Set goals against a "standard" of the best
 Benchmarks are the statistics; benchmarking is the process
 Information examples: budgets, salaries, fees charged, number of participants, number of staff
 Decide who you want to compare, design a questionnaire, analyze, and write a report
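A minimal sketch of the idea that "benchmarks are the statistics; benchmarking is the process," using invented budget and participation figures to compare a cost-per-participant statistic across agencies:

```python
# Sketch only: compute one benchmark statistic and compare agencies on it.
agencies = {
    "Our agency": {"budget": 450_000, "participants": 9_000},
    "Agency B":   {"budget": 600_000, "participants": 15_000},
    "Agency C":   {"budget": 300_000, "participants": 5_000},
}

# The statistic (benchmark) is cost per participant; comparing is the benchmarking
for name, stats in agencies.items():
    cost_per_participant = stats["budget"] / stats["participants"]
    print(f"{name}: ${cost_per_participant:.2f} per participant")
```

The same pattern works for salaries, fees charged, or staffing levels gathered through the comparison questionnaire.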
15. Using NRPA’s
PRORAGIS Tool for
Evaluation
16. Doing
Cost/Benefit and
Economic Impacts
Economic Impact Analyses
Determining the dollar value a
program or service has on the
economy
The net change of money in a
community from the spending of
visitors—only visitors are part of
the analyses
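As a very rough first-cut sketch (the attendance, visitor share, and spending figures are invented, and real analyses are more involved), the "new money from visitors only" idea looks like this:

```python
# Sketch only: count spending by outside visitors as the net new money.
attendees = 2_500            # total event attendance (hypothetical)
share_visitors = 0.40        # fraction who are visitors from outside the community
avg_spend_per_visitor = 65   # dollars spent locally per visitor (hypothetical)

visitors = attendees * share_visitors            # residents are excluded
net_new_spending = visitors * avg_spend_per_visitor
print(f"Estimated new visitor spending: ${net_new_spending:,.0f}")
```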
Cost-Benefit Analyses
Focus on Monetary Forms
Sometimes Difficult to Do
Social Benefits may outweigh
Monetary Benefits
17. Using Trends
Analysis for
Evaluation
Trends analysis
 Examining the future by looking at the past
 Measuring data over time
 The secret is good record keeping
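A minimal sketch of "measuring data over time," with assumed record-keeping data: fit a least-squares line to yearly participation counts and read off the trend.

```python
# Sketch only: simple linear trend from yearly records (invented numbers).
years = [2007, 2008, 2009, 2010, 2011, 2012]
participants = [410, 455, 430, 490, 520, 545]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(participants) / n
num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, participants))
den = sum((x - mean_x) ** 2 for x in years)
slope = num / den                    # change in participants per year
intercept = mean_y - slope * mean_x

print(f"Trend: about {slope:.1f} more participants per year")
print(f"Projected 2013: {slope * 2013 + intercept:.0f}")
```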
18. Designing Valid, Reliable, and Usable Questionnaires
Kinds of Questions
 Open-ended
 Close-ended
 Fixed alternatives
 Likert Scale
 Semantic differential
 Ranking
 Partially close-ended
Examples
 Open-ended: “What did you like best about day
camp?”
 Close-ended, fixed alternatives: “What did you like
best about day camp this summer?”
 Counselors
 Arts
 Snacks
 Swimming
 Sports
Another Example
 Close-ended, Likert: "What did you think about each of these parts of day camp?"

                  Poor  Fair  Good  Great
  Counselors       1     2     3     4
  Arts & Crafts    1     2     3     4
  Snacks           1     2     3     4
  Swimming         1     2     3     4
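Once responses to an item like the one above are collected, summarizing them is straightforward. A minimal sketch with invented responses on the 1-4 scale:

```python
# Sketch only: average each camp component and rank from highest to lowest rated.
responses = {
    "Counselors":    [4, 3, 4, 4, 2],
    "Arts & Crafts": [3, 3, 2, 4, 3],
    "Snacks":        [2, 2, 3, 1, 2],
    "Swimming":      [4, 4, 4, 3, 4],
}

means = {item: sum(scores) / len(scores) for item, scores in responses.items()}
for item, mean in sorted(means.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{item}: {mean:.1f} (1 = Poor ... 4 = Great)")
```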
Wording Advice
 One idea per question
 Clear, brief, and simple
 Avoid leading questions
 Avoid estimates if possible
 Use words familiar to the respondent
 Avoid fancy words and jargon
 Be clear about meanings of words
 Avoid negative questions
 Do a pilot study
 State alternatives precisely - mutually exclusive responses
 Use stages if needed, but give GOOD directions
Format and Layout Design for Good Response
 Give clear directions
 Start with something easy & familiar (NOT demographics, though)
 Have white space & an easy-to-read font
 Colored paper if easy to read - font size appropriate for audience
 Have easy-to-follow directions for staged questions
 Have a "professional look"
 Front page should include title, date or time or year, perhaps a graphic
 Keep the length manageable for the info desired
 Anchor all numbered responses - e.g., 5 = strongly agree, 4 = agree, etc.
 NO TYPOGRAPHICAL ERRORS
PILOT TESTING
 Is the survey valid?
 How much time does it take to complete?
 Do the respondents feel comfortable
answering the questions?
 Is the wording of the survey clear?
 Are the answer choices compatible with
the respondents’ experience in the
matter?
19. Using Individual
Interviews
Types of Interviewing
 Personal (in-depth)
 Telephone
 Part of Field Observation
 Focus Groups (Group Interviews)
Interviews
 Approaches:
 Structured, close-ended
 Standardized, open-ended
 Interview guide (semi-structured)
 Informal conversational (unstructured)
Asking Open-Ended Questions
 NEVER allow a person to answer with just "yes" or "no"
 Purpose is to get people to talk
 Start with noncontroversial questions
 Demographics at the end; don't ask what you already know (e.g., gender)
 Avoid "WHY"
 Use probes and follow-ups ("Tell me more about…")
 Use listening skills
 Maintain control of the interview
 Share yourself with the interviewee
Structuring Interviews
 Purpose of interviews
 Individual or group
 If group - number of persons involved
 Structured or unstructured?
 Audio-taping/Note-taking
 Facilitator/Recorder roles
 Rapport-building
 Timing
Things to remember
 Pay attention to the content of your questions
(keep your focus)
 Give thought to setting up the interview
 Pay attention to how you plan to record the
data
 Take notes-on-notes
 Conduct interviewer training
 Go over possible problem areas
Training for Interviewers
 Discussion of general guidelines and procedures.
 Specify how to handle difficult or confusing situations.
 Conduct demonstration interviews.
 Conduct “real” interviews.
Guidelines for Structured Survey Interviewing
 Dress in a similar manner to the people who will be interviewed.
 Study and become familiar with the questions.
 Follow question wording exactly (if it is quantitative).
 Record responses exactly.
 Probe for responses when necessary.
Telephone Surveys
Advantages:
 Money and time.
 Control over data collection.
Disadvantages:
 Surveys that are really ad campaigns.
 Answering machines.
20. Conducting
Focus Groups
 Focus Group
A group of people are brought together in a room to
engage in guided discussion of a topic.
Pros and Cons
 Pros
  • Socially oriented
  • Flexible
  • High face validity
  • Speedy
  • Low in cost?
 Cons
  • Researcher has less control
  • Data may be difficult to analyze
  • Need a good moderator
  • Must do more than one group - great differences between groups
  • Groups difficult to assemble
  • Must have a conducive environment
What about group interviews?
 Focus groups encourage ideas by interacting
with other individuals
 Pay attention to the same issues in a personal
interview
 Have a clear purpose for the group meeting
(criteria)
 Have clear structured, open-ended questions
Points for Conducting a
Focus Group
 Be a good moderator
 Set ground rules
 The beginning of the focus group process is crucial
to its success
 Keep the process focused and flowing
 Probe where needed
 Be sure to thank participants
Other Focus Group Tips
 Usually 5-10 individuals
 Key is to get people to talk to each other
 Usually two moderators (one to direct discussion and another to take care of technical aspects)
 Use incentives to get people to participate (maybe)
 Pilot test, just like ALL surveys
 USE OPEN-ENDED questions
 Be familiar with group process - getting people to talk, not letting some take over
 Ground rules - speaking one at a time, confidentiality, breaks taken, positive and negative
Summary
 Focus groups encourage ideas by interacting
with other individuals
 Pay attention to the same issues in a personal
interview
 Have a clear purpose for the group meeting
(criteria)
 Have clear structured, open-ended questions
21. Using
Observational Tools
Types of Data
 Qualitative
 Quantitative
Qualitative Observations
 Takes a great deal of field time
 Need in-depth notes on your observations
 Use of key informants is often helpful
 Criteria often change and become re-defined
Tips on (Field) Observations
 Only unscientific if inappropriate techniques are used
 Highly reliable and valid IF systematically done
 Time-consuming
 Anecdotal records or critical incidents describe FACTS of situations
 Field notes are collected - MANY, MANY, MANY
 Researcher is the data collection instrument
 SEPARATE facts from interpretations
Notetaking
 Record as soon as possible
 Take notes in detail - running descriptions, conversations, incidents
 Make copies of notes ASAP
 Indicate whose language - yours, quotes, paraphrases
 Rule of thumb - several (5-6) single-spaced typed pages for an hour of observation
 Can use a tape recorder and transcription
 Record ALSO what you do not understand
 MONITOR yourself as a data collector
 Length of time spent depends on the research question and theoretical saturation
 "IF it is not written down, it never happened!"
Dimensions of Approaches
 Full participant observer-> Outside observer
 Overt observer-> Covert observer
 Single observation-> Long-term multiple obs.
 Narrow focus-> Broad focus
Quantitative: Duration Sampling Checklist
Participant: _________  Activity: _____________  Time: _____ to ________

  Behavior                          Duration                       Total
  On task (appropriate)             2 sec., 3 sec., 19 sec.        24 sec.
  Approp. conversation (w/ peers)   14 sec., 29 sec.               43 sec.
  Walks around room                 112 sec., 143 sec., 157 sec.   412 sec.
  Argues w/ leader                  0 sec.                         0 sec.
  Argues w/ peers                   43 sec.                        43 sec.
Frequency Sampling Checklist
Client: _________  Activity: _____________  Time: _____ to ________

  Behavior                                               Frequency of Behavior
  Initiates a conversation                               //
  Shifts attention to new project                        /
  Asked question by peer                                 ////
  Responds appropriately when asked question by peer     ///
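A minimal sketch of tallying the two checklists above (the durations come from the duration example; the frequency counts simply restate the tally marks):

```python
# Sketch only: total observed durations and report behavior frequencies.
durations_sec = {
    "On task (appropriate)": [2, 3, 19],
    "Approp. conversation (w/ peers)": [14, 29],
    "Walks around room": [112, 143, 157],
    "Argues w/ leader": [0],
    "Argues w/ peers": [43],
}
for behavior, spells in durations_sec.items():
    print(f"{behavior}: {sum(spells)} sec total")

frequencies = {
    "Initiates a conversation": 2,
    "Shifts attention to new project": 1,
    "Asked question by peer": 4,
    "Responds appropriately when asked question by peer": 3,
}
for behavior, count in frequencies.items():
    print(f"{behavior}: {count} occurrences")
```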
Rating Scales
  Behavior                                                 Excellent   Average   Poor*
  How well does the client respond to questions?
  How appropriately does the client interact with peers?

  *Excellent = top third; Average = middle third; Poor = lowest third
22. Using On-line
Tools
(SurveyMonkey etc.)
Question Structures - SurveyMonkey
 Question 1: Have you used a swimming pool in the last year?
  • Yes
  • No
 Question 2: How many times in the last year have you gone to a public swimming pool?
  • Never
  • Once
  • Twice
  • More than 3 times
 Question 3: How many times have you gone to a public swimming pool this past year? (Please write in an estimated number) __________
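As a generic illustration only (this is plain data, not the SurveyMonkey API), the three question structures above, together with the kind of skip logic noted under the on-line pros below, can be represented like this:

```python
# Sketch only: question structures plus a simple skip rule for Question 2.
survey = [
    {"id": "Q1", "text": "Have you used a swimming pool in the last year?",
     "type": "closed", "choices": ["Yes", "No"]},
    {"id": "Q2", "text": "How many times in the last year have you gone to a public swimming pool?",
     "type": "closed", "choices": ["Never", "Once", "Twice", "More than 3 times"],
     "skip_if": ("Q1", "No")},  # skip this question if Q1 was answered "No"
    {"id": "Q3", "text": "How many times have you gone to a public swimming pool this past year?",
     "type": "open_numeric"},
]

answers = {"Q1": "No"}  # hypothetical respondent
for q in survey:
    skip = q.get("skip_if")
    if skip and answers.get(skip[0]) == skip[1]:
        continue  # respondent is routed past this question
    print(q["id"], q["text"])
```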
On-Line Considerations--Pros
 Same strengths as a paper version
 Better at addressing sensitive issues
 Cost efficient
 Faster delivery
 Design options
 Dynamic
 Ability to track
 Quick response time
 Easier to use for skip logic
On-Line Considerations--Cons
 Spam/privacy concerns
 Technical issues
 Multiple submissions from the same respondent
 No interviewer present to clarify questions or issues
New Technologies and Survey
Research
 CAPI - computer assisted personal interviewing.
 CASI - computer assisted self interviewing.
 CSAQ - computerized self-administered
questionnaires.
 TDE - touchtone data entry.
 VR - voice recognition.
23. “Fun” Strategies
for Doing
Evaluations
Importance-Performance Evaluation
 How important is something?
 How well did the organization perform it?
 Matrix of:
  • Keep up the good work (high importance, high performance)
  • Possible overkill (low importance, high performance)
  • Needs work (high importance, low performance)
  • Low priority (low importance, low performance)
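A minimal sketch of sorting services into the four quadrants (the services and ratings are invented, and the midpoint of an assumed 1-5 scale is used as the cut-off):

```python
# Sketch only: place each service in an importance-performance quadrant.
items = {  # (mean importance, mean performance) on a 1-5 scale (hypothetical)
    "Clean restrooms":  (4.6, 2.8),
    "Program variety":  (4.2, 4.4),
    "Fancy brochures":  (2.1, 4.0),
    "Late-night hours": (2.4, 2.2),
}
MID = 3.0  # midpoint of the 1-5 scale, used as the quadrant cut-off

def quadrant(importance, performance):
    if importance >= MID and performance >= MID:
        return "Keep up the good work"
    if importance >= MID:
        return "Needs work"
    if performance >= MID:
        return "Possible overkill"
    return "Low priority"

for name, (imp, perf) in items.items():
    print(f"{name}: {quadrant(imp, perf)}")
```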
Other Ideas:
 Use Grading Scale A, B , C, D, F
 Use Pie Chart and distribute money for preferences
 Use Happy Faces
 Computer Assisted
 Others…
Measuring Physical Activity
SOPLAY or SOPARC
Pedometers or
Accelerometers
24. Selecting
Evaluation
Instruments
Selecting an Evaluation/Measurement Tool
 Is it reliable and valid?
 Does it measure what you want?
 Appropriate for the participants?
 Reasonable to administer and in your price range?
 Directions clear, concise, unambiguous?
 Easy to analyze?
 Is it the best way to measure the objectives?
 Is the activity reaction form put together by Cary, NC the one you should use?
25. Contracting with
Outside Consultants
Outside:
 You have to know what you want done
 You have to be able to evaluate results and make
decisions
 You probably need some amount of financial resources
 Could save you a lot of time if you do it right
26. Developing Your
Own Evaluation Tool
Steps:
 Define problem/level of evaluation
 Determine contents (including criteria) and broad questions
 Identify and categorize respondents
 Develop items, structure format
 Write directions
 Ensure response
Purpose of Data Collection
 What do you want to know?
 Who has the information?
 What is the best approach (based on purpose,
time, money, resources, your expertise) to use?
 How will you use the results?
 Are you interested in outputs and/or outcomes?
Kinds of Information Sought
 Behavior info
 Knowledge info
 Attitudes/beliefs/values info
 Demographic info
**Pay attention to the relationship of the respondent to the question (their past, present, future)
27. Ethical
Considerations
related to Evaluation
Issues
 Ethical
 Political
 Legal
 Moral
Evaluator must be Competent
 Knowledge about area to be researched
 Knows about evaluation design
 Knows wide range of methods
 Knows how to analyze and interpret data and
relate them back to conclusions/recommendations
Developing Competencies (continued)
 Knows how to use the results
 Understands how to handle political, legal, and ethical
concerns encountered
 Must have certain personal qualities: trustworthy,
strives for improvement, responsive to sensitive issues
“Doing the Right Thing”
 Political Issues (science is touched by politics but goes on
anyway; social change is always political; values matter)
 Supports/refutes views and values
 Personal contacts
 Value-laden definitions
 Controversial findings
 Pressures to produce certain findings
 Know the organization’s position, don’t go beyond the data
in conclusions, have clear purpose for research
“Doing the Right Thing”
 Legal Issues
  • Not many legal concerns except around illegal behaviors
 Moral Issues
  • Unintentional mistakes made through bias or error
  • Cultural & procedural biases
  • Letting bias, prejudice, or friendships influence outcomes
  • Dealing with negative findings
  • Taking too long to get the results out
  • Be prepared to recognize the possibility of statistical errors and know how to explain them
“Doing the Right Thing”
 Voluntary Participation
 No Harm
 Anonymity and Confidentiality
 Issues of Deception
 Analysis and Reporting
Ethics Principles
 Be open with people
 Don't promise more than you can do
 Protect the rights and privacy of your subjects
 Guard against coercion
 Get written consent & Board approval
 Guard against harm
 Let participants know the results
 Anonymity: the researcher cannot identify a given response with a given respondent.
 Confidentiality: the researcher can identify a given person's responses but promises not to do so publicly.
Ethical Considerations
 Avoid bias (settings, questions, populations)
 Consider question content
 Protect confidentiality (if appropriate)
 Manage data/records
 Report results of evaluation
 Respect/report different opinions
28. Devising an
Evaluation Plan for
an Agency
Purposeful Planning for Evaluation
 Schedule time on your yearly calendar
 Involve planning Board, committees,
colleagues, staff, volunteers, etc.
 Identify what you hope to achieve
(desired outcomes)
 Identify your goals and objectives
Purposeful Planning for Evaluation
 Identify the methods you will use to
measure whether or not your program
met objectives
 Identify how often evaluation will be
undertaken
 Identify WHO will do the evaluation
and how they will get the appropriate
resources
Formal (Systematic) Evaluation--Review
MACRO and MICRO approaches provide rigor
 Systematic gathering (procedures and methods) of evidence
 Leads to decisions and action
  • Criteria
  • Evidence
  • Judgment
29. How to Use
Information from
Evaluation Projects
Using Data-KEY ADMONISHMENTS
 If you collect the data, you need to analyze
the data!
 Record the results of your data collection
 Compare your findings to your original goals
and objectives
 Share with appropriate people
 Incorporate into your planning cycle