A Grant Evaluation Toolbox for Institutional Researchers

Daylene Meuschke, College of the Canyons
Bri Hays, San Diego Mesa College
RP Conference, April 11, 2014
Session objectives:
Explore evaluation tools and resources
Discuss challenges and strategies for overcoming them in grant development and evaluation
Compare formative and summative evaluation approaches
Illustrate a program theory using a logic model
Summarize evaluation goals and activities using an evaluation plan
When should evaluation planning begin?
During the grant development phase?
After the grant is awarded?
Just before the annual performance report is due?
3 Major Categories of Evaluation:
Needs Evaluation/Needs Assessment
Process Evaluation
Outcomes Evaluation
Logic Models
Framing the evaluation: What is the context?
Evaluation Plan Templates
Communication/Dissemination Plan Templates
What is a logic model?
Clearly identifies the specific aims of the program or project
Outlines the program/project theory, rationale, or model
Provides a visual of how the program/project will work
Inputs → Activities → Outputs → Outcomes → Impact

Inputs: What do you need to achieve your intended outcomes/impact?
Activities: What are you going to do? What services, events, etc. will you provide?
Outputs: What are your operational targets?
Outcomes: What changes/benefits may be expected for participants and other stakeholders?
Impact: What large-scale, overarching change or benefit will the project produce?
Visual outline can serve as the basis for program planning and development
Communicate to internal and external stakeholders how your program/project works
Clearly identify links between program activities and outcomes, both short and long term
Set the stage for the evaluation plan
And…logic models are a component of an increasing number of federal grant applications
Example logic model (STEM program grant):

Inputs: .5 FTE STEM Learning Communities Coordinator
Activities: Outreach for faculty; faculty trainings/workshops
Outputs: 10 faculty will participate in learning communities each semester; 5 to 7 Basic Skills and STEM learning communities will be offered each semester
Outcomes: Increased faculty collaboration; increased student engagement; improved success in STEM gateway courses; improved student persistence in STEM programs

Inputs: .5 FTE STEM Cohort Counselor
Activities: High school outreach and recruitment for STEM cohort; family orientation; educational planning
Outputs: 10 workshops at 5 feeder high schools; 2 family orientations each fall; 75%+ of cohort students have education plans in first semester
Outcomes: 5% increase in number of STEM majors over baseline (Latino students and overall); increased student/family awareness of college resources; 60%+ of cohort students with an ed plan follow the plan for first two semesters

Inputs: 5 Peer Mentors for STEM Cohort Students
Activities: Weekly follow-ups with cohort students; College Success Workshops
Outputs: 75%+ of cohort students will log at least 1 hour with a peer mentor per week; 80% of cohort students will participate in 1+ workshops
Outcomes: Improved social support; increased student engagement

Inputs: Supplemental Instruction Faculty Training (UMKC); 4 STEM SI Leaders
Activities: Faculty trainings and workshops; SI Leader training course; SI sessions for gateway STEM courses
Outputs: 4 faculty members will partner with an SI leader; 40%+ of students in selected STEM courses will participate in 1+ SI sessions
Outcomes: Increased academic engagement in STEM courses; increased success rates in gateway STEM courses (overall and for Latino students); increased persistence in STEM programs

Impact (project-wide): Increase in fall-to-fall persistence of first-year STEM students; improved completion of the math Basic Skills sequence (overall and for Latino students); increase in number of STEM program graduates (overall and for Latino students); increase in number of STEM program transfers (overall and for Latino students)
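For research offices that keep grant planning artifacts in structured files, a logic model row can also be captured as data so that each output target later maps to something measurable. The sketch below is illustrative only and not part of the presentation; the dataclass and field names are assumptions, and it simply mirrors the peer-mentor row from the example above.

```python
# Minimal sketch (illustrative, not from the presentation): one logic model row
# stored as structured data so its output targets can be tracked during evaluation.
from dataclasses import dataclass

@dataclass
class LogicModelRow:
    inputs: list[str]       # resources needed to achieve outcomes/impact
    activities: list[str]   # what you are going to do
    outputs: list[str]      # operational targets
    outcomes: list[str]     # expected changes/benefits for participants

peer_mentor_row = LogicModelRow(
    inputs=["5 Peer Mentors for STEM Cohort Students"],
    activities=["Weekly follow-ups with cohort students", "College Success Workshops"],
    outputs=[
        "75%+ of cohort students log at least 1 hour with a peer mentor per week",
        "80% of cohort students participate in 1+ workshops",
    ],
    outcomes=["Improved social support", "Increased student engagement"],
)

for target in peer_mentor_row.outputs:
    print("Track:", target)
```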
Formative Evaluation
How is it working?
Is it working as intended?
Is it reaching the intended population?
What is working well?
What can be improved?
Continuous quality improvement!
Improvement cycle: Assess (Collect Data) → Analyze Results → Reflect and Refine → Implement Changes → repeat
Summative Evaluation
What worked?
How do we know that it worked?
What didn’t work?
What are the lessons learned from the program or project?
How does this fit into the larger body of knowledge about this intervention/program/project?
Program/Project Objectives
What are the specific aims of the program or project?
Key Stakeholders
Evaluation Goals
What information does the grant team need?
Needs assessment?
Process evaluation?
Outcome evaluation?
Evaluation Plan Outline/Timeline
What are the major steps or tasks in the evaluation?
When do the internal and external stakeholders need the information?
Examine the grant’s Request for Proposal (RFP) to determine the evaluation requirements.
Each grant will have different requirements for reporting and evaluation.
Develop a comprehensive evaluation plan, including:
Responsible parties
Data collection
Timelines
Dissemination
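One hedged way to make these plan elements concrete is to record each evaluation task as a small structured entry with exactly those fields. The record layout and the example values below are assumptions for illustration only, not requirements from any funder or from the presentation.

```python
# Illustrative sketch: one evaluation-plan task captured with the elements listed
# above (responsible parties, data collection, timelines, dissemination).
# All values are hypothetical.
evaluation_plan = [
    {
        "task": "Fall survey of STEM learning community students",
        "responsible_party": "Institutional Research Office",
        "data_collection": "Online survey administered in week 12",
        "timeline": "Data collected by Dec 15; analyzed by Jan 31",
        "dissemination": "Summary brief to project director and steering committee",
    },
]

for task in evaluation_plan:
    print(f"{task['timeline']}: {task['task']} ({task['responsible_party']})")
```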
Upward Bound Grant Evaluation
Step One: Determine Requirements
• Determined the requirements for the evaluation as set forth by the Department of Education.
Step Two: Build Evaluation Model
• Built these specifications into an evaluation template.
Step Three: Data Collection
• Developed a spreadsheet to track student data through post-secondary completion.
• Identified who was responsible for grant objectives.
• Coordinated with the Upward Bound director and staff to ensure proper data collection.
• Created a shared network drive to enable shared access to the data collection spreadsheet.
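How such a spreadsheet gets summarized depends entirely on its layout, but as a minimal sketch (the file name, column names, and indicator below are hypothetical, not the actual Upward Bound data), the shared tracking file could be read and checked like this:

```python
# Illustrative sketch only: summarize one tracked indicator from a shared
# student-tracking spreadsheet. File name and column names are hypothetical.
import pandas as pd

# e.g., a CSV export of the spreadsheet kept on the shared network drive
students = pd.read_csv("upward_bound_tracking.csv")

# share of tracked students with a recorded post-secondary completion
# (column assumed to hold 0/1 flags)
completion_rate = students["postsecondary_completion"].mean() * 100

print(f"{completion_rate:.1f}% of tracked students have completed a post-secondary credential")
```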
How will you share the results of the evaluation?
What types of information need to be reported, and to whom?
When does information need to be shared with internal and external stakeholders?
What levels of detail are required for each report?
How will the information be used?
San Joaquin Delta College FIPSE Report on Learning Communities (see Appendices)
A brief is written to evaluate the grant.
The brief is often used as a stand-alone document.
A project director may also use the brief to support reporting to the funding source.
Sections of the brief can be referenced in an ad hoc report.
Recommendations based on the evaluation are often included in the brief.
Consider the capacity of your office and the availability of data when developing the evaluation plan.
Is an external evaluator required?
If so, what are their roles and responsibilities?
What are the research office’s responsibilities?
Is a data sharing agreement required?
What are the external (i.e., funding agency) reporting deadlines?
How much lead time does the project director need to review annual reports?
American Evaluation Association
Kellogg Foundation Evaluation Handbook
Kellogg Foundation Logic Model Development Guide
University of Wisconsin-Extension Program
Development and Evaluation Web Page
Centers for Disease Control and Prevention Program
Performance and Evaluation Office Web Page
Western Michigan University Evaluation Center
Daylene Meuschke
Director, Institutional Research
College of the Canyons
Daylene.Meuschke@canyons.edu

Bri Hays
Campus Based Researcher
San Diego Mesa College
bhays@sdccd.edu