Evaluating Your Program:
Developing Successful and Sustainable
Programs
OJJDP New Grantee Orientation
April 6–7, 2010
Washington, DC
CSR, Incorporated
Outline
• What Is Program Evaluation?
• The Logic Model
• Evaluation Questions
• Types of Program Evaluation
• Data Collection and Analysis
• Evidence-Based Programs
Program Evaluation
• Objective measurement and systematic
analysis to determine the manner and extent to
which a program achieves its intended goal.
Definition adapted from OMB Circular No. A-11, Part 6, Section 200
http://www.ojjdp.ncjrs.gov/grantees/pm/glossary.html
What Is Program Evaluation?
• There are many ways to define program evaluation
• Evaluation can be defined as the systematic process of
collecting and analyzing data in order to:
– Determine whether program objectives have been achieved;
– Make a decision about the program;
– Improve the delivery of services.
Boulmetis, J., & Dutwin, P. (2002). The ABC's of Evaluation. San Francisco: Jossey-Bass.
Why Evaluate Your Program?
• Satisfy funding agency’s grant requirements
• Document program accomplishments
• Justify program
• Improve program activities and services
Why Evaluate a Program?
• Evaluation results provide useful feedback to:
– Sponsors and Funders
– Program Staff
– Clients
– Donors
– The Larger Community
First Steps in Program Evaluation
• Construct a logic model
• Identify performance measures/outputs
• Identify outcomes
Logic Model: Definition
• “Tool used to visually describe the
linkages between program goals,
activities, and expected outcomes.”
Definition from: http://www.jrsainfo.org/jjec/resources/definitions.html
Logic Model
• Conditions (Needs or Problems): What community need will the project fill? (e.g., youth need for positive social outlets)
• Activities: What services will the project offer? (e.g., tutoring, after-school sports, mentoring)
• Outputs: How will you measure project services? (e.g., number of youth served)
• Outcomes: How will you know that the community need has decreased? (e.g., youth have improved attitudes and increased self-esteem; youth delinquency or substance use rates have decreased; youth school attendance has improved)
Developing a Framework
• Conditions (Needs or Problems): What community need will the project fill? Example: high rates of truancy among elementary students (K to 5th grade).
• Activities: What services will the project offer? Examples: after-school tutoring; services of a family support coordinator.
• Outputs: How will you measure project services? Examples: (1) hours of tutoring; (2) number of youth tutored; (3) number of support sessions provided.
• Outcomes: How will you know that the community need has decreased? Examples: (1) improved attitudes toward school; (2) improved self-esteem; (3) improved school attendance.
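To make the framework concrete, a grantee could keep a logic model like the truancy example in a small script or spreadsheet so that each output has an explicit counter attached to it. The Python sketch below is only an illustration of that idea, not an OJJDP tool or format; the field names are assumptions made for this example.

# Illustrative sketch only: the truancy logic model recorded as a structure,
# with a counter for each output the evaluation plans to measure.
# Field names ("condition", "activities", ...) are assumed, not an OJJDP schema.
truancy_logic_model = {
    "condition": "High rates of truancy among elementary students (K to 5th grade)",
    "activities": ["After-school tutoring", "Family support coordinator services"],
    "outputs": {
        "hours_of_tutoring": 0,
        "youth_tutored": 0,
        "support_sessions_provided": 0,
    },
    "outcomes": [
        "Improved attitudes toward school",
        "Improved self-esteem",
        "Improved school attendance",
    ],
}

# Example use: update output counters as services are delivered, then report them.
truancy_logic_model["outputs"]["hours_of_tutoring"] += 2
truancy_logic_model["outputs"]["youth_tutored"] += 1
for measure, value in truancy_logic_model["outputs"].items():
    print(f"{measure}: {value}")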
Developing a Framework: Advanced Model
ASSUMPTIONS: Principles, evidence-based approaches, and theories that guide the program.
Planning steps: Situation Analysis → Problem Identification → Priority Setting → Mission → Goals
• INPUTS: Resources and contributions.
• ACTIVITIES: Tasks conducted by the grantee's staff, subcontractors, or volunteers. Activities are directly linked to outputs.
• OUTPUTS: Products and services delivered.
• OUTCOMES: Changes in individuals, agencies, systems, and communities. Outcomes may be intended or unintended.
  – Initial (Learning): Awareness, Knowledge, Attitude, Skills, Opinions, Aspirations, Motivations
  – Intermediate (Action): Behavior, Practice, Policies, Social Action, Decision-Making
• IMPACTS (Long-Term Conditions): Social, Economic, Civic, Environmental
ENVIRONMENT: External and contextual factors that influence the program.
Sources: GAO-02-923, Strategies for Assessing How Information Dissemination Contributes to Agency Goals; GAO-03-9, Efforts to Strengthen the Link Between Resources and Results at the Administration for Children and Families; GAO/GGD-00-10, Managing for Results: Strengthening Regulatory Agencies' Performance Management; Ellen Taylor-Powell (2000), "A logic model: A program performance framework," University of Wisconsin-Cooperative Extension Program Evaluation Conference.
Why Develop a Program Logic Model?
• Clearly identifies program elements: goals, inputs, activities, outputs, and outcomes
• Shows how program elements fit together
• Illustrates the logic of how your program works
• Helps specify what to measure in an evaluation
Why Develop a Program Logic Model?
• Builds a common and shared understanding
of the program’s goals, structure, need for
resources, and expected results among
program stakeholders;
• Provides a valuable and concise tool when
presenting on the program to other groups;
• Provides a record of your program.
How to Construct a Program Logic Model
There are several ways to do so:
• Convene a meeting of your program’s stakeholders
– Develop the model as a group.
(Building the model helps build consensus and
support for your program.)
• Program staff develop the logic model
– Meet separately with various stakeholder groups to
elicit their perceptions and expectations.
Different stakeholder groups may hold different views
about the program’s activities and expected results.
Where to Begin in Constructing a Logic Model
• A program logic model is a type of flowchart
  – One may begin constructing the model at either end of the chart.
• Try to construct a program logic model as early as possible during program planning.
• A program may be up and running before a logic model is developed.
• A logic model is a living document
  – Revisions may be necessary as the program changes.
Next Steps in Program Evaluation
Once you have the logic model:
• Decide on your evaluation questions
(These relate directly to the goals of your program)
• How effective is your program?
Moving From Goals to Evaluation Questions
• Project: After-school recreation program designed to promote self-esteem and reduce delinquency
• Program Goals: Increase youth protective factor of self-esteem. Lower arrest rates among program youth.
• Evaluation Goal: Determine whether youth served increase their self-esteem and decrease delinquency
• Evaluation Questions: Do youth who complete the program show increased self-esteem and decreased likelihood of delinquency?
Types of Program Evaluation
• Process Evaluation
– How does the program work?
– What services are provided?
– Is the program implemented as planned?
– What are the outputs?
– What works or does not work in the program?
Types of Program Evaluation
• Outcome Evaluation
– Assess program effectiveness
– What are the results?
• Short term
• Long term
Next Steps in Program Evaluation
Choose the type of evaluation to conduct:
• Decide on the data to be collected to monitor
performance measures/outputs and outcomes
• Create instruments needed to collect data
• Decide how to analyze and use data
Data Collection
• Requires up-front planning
• Requires a clear sense of what you are trying to accomplish
• What data will you collect, and why?
• What data sources are available, and which will you use?
Types of Data Collection Tools
• Activity Logs/Skill Sheets: Written documentation of participants' attendance, achievement, or acquisition of skills. Good for "What?" and "How many?" questions.
• Document/Records Review: Review of written documents such as performance ratings, program logs, tally sheets, and other existing indicators. Good for "What?" and "How many?" questions.
• Focus Groups: Moderated discussions on a particular topic or issue. Good for "What?" "How?" and "Why?" questions.
• Interviews: Data collection through oral conversations. Good for "What?" and "Why?" questions.
• Observation: Watching people engaged in activities and recording what occurs. Good for "How?" "What?" and "How many?" questions.
• Questionnaires: Written responses to clearly defined questions. Good for "What?" and "How many?" questions.
www.evaluationtools.org/tools_main.asp
Create Instruments
• Construct your own instruments
– Attendance sheets
– Activity logs
– Surveys for clients
• Use existing instruments
– Templates available online
– Modify to suit your particular program
Ways to Collect Data
• Paper
  – Internal forms/checklists
• Electronic
  – Excel, Access, SPSS (see the sketch below for a simple scripted tally)
• Online
  – Web-based data collection tools
  – OJJDP’s Data Collection & Technical Assistance Tool (DCTAT)
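As a minimal illustration of the "Electronic" option above, entries keyed from a paper activity log into a CSV file can be tallied with a short script to produce output counts such as youth served and tutoring hours. The file name and column headers below are assumptions made for this sketch; they are not a DCTAT or OJJDP format.

# Illustrative sketch: tally outputs from a CSV activity log.
# "tutoring_log.csv" and its columns (youth_id, hours) are assumed for this example.
import csv

youth_served = set()
total_hours = 0.0

with open("tutoring_log.csv", newline="") as f:
    for row in csv.DictReader(f):        # one row per tutoring session
        youth_served.add(row["youth_id"])
        total_hours += float(row["hours"])

print(f"Number of youth tutored: {len(youth_served)}")
print(f"Total hours of tutoring: {total_hours:.1f}")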
Analyze and Use Data
• Analyze Data
– How will the data be analyzed?
– Who will be responsible for data analysis?
• Use Data
– How will the data be used?
– Will the data be made available in reports?
– Who will receive the reports?
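As one simple way to look at outcome data (a sketch of one possible analysis, not a recommended or required method), pre- and post-program scores for the same youth can be compared to see whether self-esteem moved in the expected direction. The scores below are made-up placeholders used only to show the calculation.

# Illustrative sketch: compare pre- and post-program self-esteem survey scores
# for the same youth. All numbers are hypothetical placeholders, not program data.
from statistics import mean

pre_scores = [12, 15, 11, 14, 10, 13]    # hypothetical baseline scores
post_scores = [14, 17, 12, 16, 13, 15]   # hypothetical follow-up scores

changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
print(f"Mean score before the program: {mean(pre_scores):.1f}")
print(f"Mean score after the program:  {mean(post_scores):.1f}")
print(f"Mean change per youth:         {mean(changes):.1f}")

# A paired test (e.g., scipy.stats.ttest_rel) could check whether the change is
# statistically significant; attributing it to the program also requires a
# comparison group or other design controls.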
Evidence-Based Programs
• A program is evidence-based if:
– evaluation research shows that it produces the
expected positive results;
– the results can be attributed to the program itself, not
to other factors or events;
– the evaluation is peer-reviewed by experts in the
field;
– the program is “endorsed” by a federal agency or
respected research organization and included in its
list of effective programs.
From: Evidence-based Programs: An Overview
http://www.uwex.edu/ces/flp/families/whatworks_06.pdf
The Importance of Evidence-Based Practices
• Government-wide move toward accountability
• Government Performance and Results Act
– Shift from accountability for process to accountability
for results
– Programs must show effectiveness to justify funding
• President’s Management Agenda
• Evidence-based practices are cost-effective
– It is important to provide programs that work
Program Evaluation: Success and Sustainability
• Use data to create interest and support for
program continuation
– From funders
– From the community
• Program evaluation helps identify the most
successful components of your program
– Maintain effective staff, procedures, or activities
Conclusion
• Program evaluation is an ongoing process
• Start with a logic model to identify the main
elements of the program
• Choose the type of evaluation needed
• Plan data collection and analysis activities in
advance
• Use program evaluation results to justify and
market the program
Resources
Boulmetis, J., & Dutwin, P. (2002). The ABC's of Evaluation. San Francisco: Jossey-Bass.
The Program Manager’s Guide to Evaluation
Office of Planning, Research and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
http://www.acf.hhs.gov/programs/opre/other_resrch/pm_guide_eval
Planning a Program Evaluation
University of Wisconsin Cooperative Extension
http://learningstore.uwex.edu/assets/pdfs/G3658-1.pdf
The Global Social Change Research Project
http://gsociology.icaap.org/methods/evaluationbeginnersguide.pdf
Contact Information
Monica Robbers, Ph.D.
Patricia San Antonio, Ph.D.
CSR, Incorporated
703.312.5220
mrobbers@csrincorporated.com
psanantonio@csrincorporated.com