What is Evaluation? - Arkansas State University

Center for Community Engagement
Office of Behavioral Research and Evaluation
Who We Are
 Becky
 Brittany
 Christy
 Eleny
 Phyllis
 Trisha
Brief History
 Where we began (small evaluation office in Education)
 How we evolved to where we are now
So, where/what are we now?
 CCE …
 Is a centralized resource for faculty and students at ASU
 Provides a means to interact and work with university units and
faculty and with the broader community through partnerships
 OBRE…
 Is a centralized resource for education faculty and their partners
across colleges, schools, centers, co-ops, etc.
 Provides consultation for and evaluation of educationally-related
programs
What is our purpose?
To foster…
 activity at ASU
 collaboration at ASU
 program development
 community partnerships
 opportunities to engage in society in meaningful ways
 an exchange of resources for mutual benefit
 multi-faceted outreach for the community at large
What We Do
Examples of CCE’s Work
 Training
 PBIS
 Grant Writing
 Grant Writing
 DFC
 Youth Drug Court
 Evaluation & Research
 SOC
 Beck PRIDE
 ELF
 FYE iPad Initiative
 Consultation & Program Development
 College of Education
 Out of the Dark
 STEM
 Service Learning/Research
 Instrument Development
 ERZ
 Asset Mapping
 © instrument for Wraparound (AR)
 Cultural and Linguistic Competence
Examples of Past Collaborations
 Recent Past
 LIFT
 Second Chance
 Out of School Network
 ACTION for Kids
 REACH
 BRAD
 Jonesboro Auditorium Commission
 ERZ
 DFWC
 CCC
 Rural and Delta STEM Centers
How We Operate
 With Whom?
 In What Ways?
 How is the Work Funded?
What Is It That We Do?
 We help write grants
 We help develop programs
 We help evaluate programs
 We help develop plans
 We develop training
 We map assets & analyze needs
 We do research
 We help make schools safer
 We help students achieve
 We don’t do floors or dishes
 We work for coffee, tea, & chocolate
Email: cce@astate.edu
Website: cce.astate.edu
2013 Institute for Research Development
May 16, 2013
The Story Line
 In the grant development system, the
submitters are represented by two
separate, yet equally important groups.
The program directors who implement the
program and the evaluators who assess
the implementation. These are their
stories.

(with apologies to Law and Order)
First Questions to Ask Yourself
 Why does your project deserve money?
 Who will benefit from your work? How?
 Will you be a good steward of the money?
 How do we know the money was worth it?
 How will you know if you accomplished
anything?
What You Will Learn Today
 Evaluation Needs
 Why logic models are important
Institute for Research Development
WHAT IS EVALUATION & WHY DO WE NEED IT?
What is Evaluation?
“Evaluation is a systematic process to
understand what a program does and how well
the program does it.”
http://www.cdc.gov/healthycommunitiesprogram/tools/pdf/eval_planning.pdf
What is Evaluation? (continued)
 Evaluation is…
 Measurement
 Assessment
 Data collection
What is Evaluation? (continued)
 Evaluation is…
 Making sure your activities are moving forward
as they should
 Making sure you will accomplish something
 Letting others know if you succeeded
What is Evaluation? (continued)
Evaluation should…
 be the core of your project
 challenge your project implementation
 assess your program, organization, or plan
 help you decide how to proceed
What is Evaluation? (continued)
 Types of evaluation
 Process—monitoring procedures &
implementation
 Is the activity or program progressing?
 Schedule of activities
 Attendance at meetings
 Articles published
What is Evaluation? (continued)
 Formative
 Assessing efforts to allow revision
 Strengths
 Weaknesses
 Progress
What is Evaluation? (continued)
 Outcome (Summative)
 Did the project/activity work?
 Short term
 Accomplish objectives
 Change knowledge/attitudes/behavior
What is Evaluation? (continued)
 Impact
 Long-range results
 Broader target (e.g., community-level, school-level)
Evaluation Model Example
[Figure: types of evaluation arranged by Scope (vertical axis) and Time (horizontal axis), with performance measurement and program monitoring running along the timeline.]
 Process evaluations investigate the process of delivering the program, focusing primarily on inputs, activities, and outputs.*
 Outcome evaluations investigate whether the program causes demonstrable effects on specifically defined target outcomes.*
 Impact evaluations are broader and assess the overall or net effects—intended or unintended—of the program as a whole.*
* Evaluation definitions excerpted from: Trochim, William M. The Research Methods Knowledge Base, 2nd Edition. Internet WWW page, at URL: <http://trochim.human.cornell.edu/kb/index.htm> (version current as of Aug. 02, 2000).
Steps in Evaluation (Programs)
http://learningstore.uwex.edu/assets/pdfs/G3658-1W.PDF
So Why do We Need Evaluation?
 To show that the money spent was
worthwhile
 To provide evidence that your project works
(e.g., for future funding)
Who Expects You to do Evaluation?
 Everyone
 Government (NSF, NIH, SAMHSA, DoE, HUD, DoJ, …)
 Foundations
Example—NSF
 Online Evaluation Resource Library (oerl.sri.com)
 Project Outcomes Report for the General Public
(same goals as evaluation)
(www.nsf.gov/pubs/policydocs/porfaqs.jsp)
Example—SAMHSA
 SAMHSA grantees are required to collect
and report … data …under the Government
Performance and Results Act (GPRA) …
 Performance data will be reported to the public,
the Office of Management and Budget (OMB) and Congress as
part of SAMHSA’s budget request.
Example—SAMHSA cont’d
 Grantees must … review … performance data …. The
assessment should…help you determine whether
you are achieving the goals, objectives and
outcomes you intend to achieve and whether
adjustments need to be made to your project.
 You will be required to report on your progress
achieved, barriers encountered, and efforts to
overcome these barriers…
Example—Project Intercept
SO… WHAT DO PEOPLE DO FOR EVALUATION?
Evaluation Examples
 Differs across fields
 Formative types
 Monitor meetings
 Monitor timeline
 CQI
 Needs assessment
 Asset mapping
 Strengths and weaknesses (or SWOT)
Evaluation Examples
 Summative
 Behaviors
 Attitudes
 Knowledge
 Costs (cost-benefit)
 Policies
 Institutional changes
 Ideally want “counts”
Caveat
 Formative & summative are just heuristics
 What is what? Depends on project & funding
agency
 Summative can be formative when used within
project delivery
 Formative can be summative when used as
outcome
 Needs assessment often needed in advance of
project
Example—ELF Program
Formative evaluation—quantitative
 Goal #1: Increase retention of freshman STEM majors, especially those students from underrepresented groups.
 Use demographic and other information (e.g., financial eligibility) to count the number of freshmen from underrepresented groups from the Ark-LSAMP program.
 Count the number of freshmen who start in the ELF Program and progress to the sophomore year.
 Goal #2: Provide faculty and peer mentoring to students in the ELF Program
 Count the number of upper-division students in the ELF Program who become team leaders for research projects.
 Count the number of faculty mentors for each student in the ELF Program.
Example—ELF Program cont’d
Formative Evaluation cont’d:
Additional assessments for student and program portfolios
 Monitor student grades (as a measure of achievement).
 Monitor progress toward graduation (as a measure of
achievement).
 Measure student interest, involvement, and investment
using surveys and focus groups each spring to assess
student opinions, attitudes, and suggestions for program
improvement.
Example—ELF Program cont’d
 Summative Evaluation
 Less emphasis
 Sum of formative
 e.g., graduation rate by year summed across years
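To make the “sum of formative” idea concrete, here is a minimal Python sketch that counts graduates by cohort year and then sums across years; the roster file and its column names are hypothetical placeholders, not the ELF Program’s actual records.
```python
# Minimal sketch: a summative number built as the sum of yearly (formative)
# counts. The CSV file and its columns ("cohort_year", "graduated") are
# hypothetical placeholders, not the ELF Program's actual records.
import pandas as pd

roster = pd.read_csv("elf_roster.csv")  # one row per student (hypothetical)

# Formative view: graduates counted separately for each cohort year.
grads_by_year = roster.groupby("cohort_year")["graduated"].sum()
print(grads_by_year)

# Summative view: the same yearly counts summed across years.
overall_rate = grads_by_year.sum() / len(roster)
print(f"Overall graduation rate: {overall_rate:.1%}")
```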
Example—Project HOPE
Asset Mapping Summary as Summative Product
HOW DO YOU DETERMINE WHAT YOUR EVALUATION SHOULD BE?
Virtually All Projects Must Have…
 Goals (abstract, what do you want to accomplish?)
 Objectives (concrete, measurable components of goals)
 Outcomes (optional, measured result)
 Objectives or Outcomes are what you evaluate
 Goal attainment is inferred from those measures
Example
 Goal: Improve math achievement
 Objective: Increase understanding of algebra
 Outcome: Student performance on end-of-course exams will increase by 10%
OR
 Goal: Improve math achievement
 Objective: Increase student performance on end-of-course exams
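As a worked check of the first outcome above, a few lines of Python are enough to test whether the 10% target was met; the mean scores are made up for illustration.
```python
# Minimal sketch: checking the outcome "performance on end-of-course exams
# will increase by 10%". The mean scores below are made-up illustration data.
baseline_mean = 62.0   # mean exam score before the program (hypothetical)
followup_mean = 69.5   # mean exam score after the program (hypothetical)

pct_change = (followup_mean - baseline_mean) / baseline_mean * 100
print(f"Change: {pct_change:+.1f}% -> target {'met' if pct_change >= 10 else 'not met'}")
```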
Evaluation Data
Also need to know…
 What kind of data will you collect?
 How will they be analyzed?
 To whom will they be reported?
Data Collection, Analysis, & Reporting
Types of Data
 Record Review Data
 Surveys
 Interviews
 Focus Groups
 Measurements
 Tests
 Behaviors
Data Collection, Analysis, & Reporting (cont.)
Say you will measure only things you can measure
Examples of saving the world and failing in your outcomes
 Increase self esteem
 Too hard to define (baseline vs. barometric?)
 Too hard to measure (Rosenberg’s classic)
 Too hard to change
 Improve family functioning
 What does it mean?
 Increase collaboration—Huh?
 Increase achievement scores
 Designed to be stable
 General tests have too many different items; your program would need to target all of those items
 Norm- vs. criterion-referenced
 Increase student involvement
Data Collection, Analysis, & Reporting (cont.)
How will you analyze the data?
 Pre-Post?
 More frequent intervals?
 Fewer intervals?
 What assumptions will be made?
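For the simplest pre-post design, the analysis can be a paired comparison of the same participants before and after the program; below is a minimal sketch using SciPy, with made-up scores standing in for real participant data.
```python
# Minimal sketch of a pre-post analysis: a paired t-test on the same
# participants measured before and after the program. Scores are made up.
from scipy.stats import ttest_rel

pre  = [54, 61, 48, 70, 65, 59, 63, 52]   # baseline scores (hypothetical)
post = [60, 66, 55, 74, 64, 67, 70, 58]   # follow-up scores, same people

t_stat, p_value = ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# Key assumption: roughly normal difference scores. With more frequent
# intervals, a repeated-measures or mixed model would be needed instead.
```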
Data Collection, Analysis, & Reporting (cont.)
To whom will you report the data?
 Funding Agency (that’s a given)
 Stakeholders? Community?
 Participants?
 Possible future funders?
Assume you need to report at many levels.
EVALUATION VERSUS RESEARCH
Evaluation versus Research
 Evaluation differs from ‘research’ minimally
 Most often for programs
 Often not generalizable knowledge (but often is!)
 Both focus on assessment & measurement
Evaluation versus Research
[Figure: the research PROCESS as a cycle: RESEARCH QUESTION → HYPOTHESIS → PREDICTIONS → RESEARCH METHOD → EXPERIMENT → ANALYZE RESULTS & DRAW CONCLUSIONS → REPORT THE RESULTS → REPLICATE & EXTEND RESEARCH]
Evaluation versus Research
PROGRAM
GOAL
OBJECTIVES
MODIFY…REPLICATE…
& EXTEND
EXPECTED
EVALUATION
OUTCOMES
PROCESS
REPORT THE
RESULTS
PROGRAM
IMPLEMENTATION
ANALYZE RESULTS
& DRAW
CONCLUSIONS
EVALUATION
Elements of an Evaluation Design
(from Agency for Toxic Substances & Disease Registry, CDC)
1. Statement of objectives
2. Definition of data to be collected
3. Methodology
4. Instrumentation
5. Data Collection
6. Data Processing
7. Data Analysis
8. Report
LOGIC MODELS: ROADMAP FOR PROGRAMS
Logic Model Example: Water quality
University of Wisconsin Extension
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
Logic Models
(Benefits from OJJDP: http://www.ojjdp.gov/grantees/pm/logic_models_understanding.html)
 Why Create a Logic Model?
 Identify program goals, objectives, activities, and
desired results
 Clarify assumptions and relationships between
program and expected outcomes
 Communicate key elements of program
 Help specify what to measure in evaluation
 Guide assessment of underlying project assumptions and promote self-correction
Logic Models
(OJJDP Info: http://www.ojjdp.gov/grantees/pm/Logic_models_safeplay1.pdf)
Everyday logic model: Family Vacation
INPUTS: Family members, budget, car, camping equipment
OUTPUTS: Drive to state park; set up camp; cook, play, talk, laugh, hike
OUTCOMES: Family members learn about each other; family bonds; family has a good time
University of Wisconsin Extension
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
Logic Models: Kellogg Foundation Guide
http://www.compact.org/wp-content/uploads/2010/03/LogicModelGuidepdf1.pdf
Logical chain of connections showing what the program is to accomplish:
INPUTS (program investments): what we invest
OUTPUTS (activities and participation): what we do and who we reach
OUTCOMES (short, medium, and long-term): what results
University of Wisconsin Extension
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
How will activities lead to desired outcomes?
A series of if-then relationships
Tutoring Program Example
IF we invest time and money, THEN we can provide tutoring 3 hrs/week for 1 school year to 50 children;
IF we can provide that tutoring, THEN students struggling academically can be tutored;
IF they can be tutored, THEN they will learn and improve their skills;
IF they improve their skills, THEN they will get better grades;
IF they get better grades, THEN they will move to the next grade level on time.
University of Wisconsin Extension
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
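A logic model is also easy to carry around as a plain data structure; the sketch below stores the tutoring chain and prints it back as if-then statements. The class and field names are invented for illustration and come from none of the cited guides.
```python
# Minimal sketch: a logic model as a plain data structure. The class and
# field names are invented for illustration, not taken from any guide.
from dataclasses import dataclass

@dataclass
class LogicModel:
    inputs: list[str]
    outputs: list[str]
    outcomes: list[str]

    def if_then_chain(self) -> str:
        """Render the model as a series of if-then statements."""
        steps = self.inputs + self.outputs + self.outcomes
        return "\n".join(f"IF {a}, THEN {b}" for a, b in zip(steps, steps[1:]))

tutoring = LogicModel(
    inputs=["we invest time and money"],
    outputs=["we provide tutoring 3 hrs/week for 1 school year to 50 children"],
    outcomes=[
        "students struggling academically are tutored",
        "they learn and improve their skills",
        "they get better grades",
        "they move to the next grade level on time",
    ],
)
print(tutoring.if_then_chain())
```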
Generic Logic Model
(OJJDP: http://www.ojjdp.gov/grantees/pm/generic_logic_model.pdf)
Logic Model Used in Evaluation of the Children At Risk Program
From the Urban Institute via USDOJ: http://www.ojp.usdoj.gov/BJA/evaluation/guide/documents/evaluation_strategies.html#p6
Parent Education Program – Logic model
SITUATION: During a county needs assessment, a majority of parents reported that they were having difficulty parenting and felt stressed as a result.
INPUTS: Staff, money, partners, research
OUTPUTS: Assess parent ed programs; design & deliver an evidence-based program of 8 sessions; facilitate support groups; parents of 3-10 year olds attend
OUTCOMES (short-term): Parents increase knowledge of child dev; parents better understand their own parenting style; parents gain skills in new ways to parent; parents identify appropriate actions to take
OUTCOMES (medium-term): Parents use effective parenting practices; parents gain confidence in their abilities
OUTCOMES (long-term): Reduced stress; improved child-parent relations
University of Wisconsin Extension
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
www.ojjdp.gov/dmc/tools/dmcpresentations1_17_06.ppt
BACK TO EVALUATION
Planning your Evaluation
 Know your solicitation
 Know your Goals & Objectives
 Answer 2 questions
1. How will you know your project is on the right
track?
2. How will you know your project was effective?
 Don’t forget the dissemination plan.
General Heuristics for Success
 Pay attention to solicitation
 Keep multi-methods in mind
 Most organizations like reports that provide…
 3 bullet points, a graph, & a quote
 Provide
 some quantitative & some qualitative stuff
 some objective & some subjective data
General Heuristics for Success
 Operationally define concepts in advance
 Don’t try to change the world, maybe just some measurable behaviors
 Pick measures/outcomes that literature shows can be
changed
 Promise basics, add value
 Objective better than subjective
 e.g., caregivers think the family improved, but the children still have behavioral problems & are going to jail; the latter matters
 Include evaluator from beginning.
Fin, Finito, Ende
 Questions?
Thank You!
David A. Saarnio, Ph.D.
Christy J. Brinkley, Ed.S.
Center for Community Engagement
Arkansas State University
Phone: 870-680-8420
Fax: 870-972-3071
dsaarnio@astate.edu
cbrinkley@astate.edu
Email: cce@astate.edu
Website: cce.astate.edu