Program Evaluation
Just enough knowledge to be dangerous
Belinda De La Rosa and Megan Mustafoff
Office of the Dean of Students and Counseling Center
What to expect
• Definitions
• Know your Audience & Population
• Goals
• Methodologies
• Report Writing
• If time allows, a walk-through of an evaluation thought process
What is program evaluation?
• Program evaluation is the systematic assessment of the processes
and/or outcomes of a program with the intent of furthering its
development and improvement. As such, it is a collaborative process
in which evaluators work closely with program staff to craft and
implement an evaluation design that is responsive to the needs of
the program. For example, during program implementation,
evaluators can provide formative evaluation findings so that
program staff can make immediate, data-based decisions about
program implementation and delivery. In addition, evaluators can,
towards the end of a program or upon its completion, provide
cumulative and summative evaluation findings, often required by
funding agencies and used to make decisions about program
continuation or expansion.
Source: http://www.washington.edu/oea/services/research/program_eval/faq.html
How is evaluation different from research?
• Evaluators use many of the same qualitative and
quantitative methodologies used by researchers in other
fields. Indeed, program evaluations are as rigorous and
systematic in collecting data as traditional social
research. That being said, the primary purpose of
evaluation is to provide timely and constructive
information for decision-making about particular
programs, not to advance more wide-ranging knowledge
or theory. Accordingly, evaluation is typically more
client-focused than traditional research, in that
evaluators work closely with program staff to create and
carry out an evaluation plan that attends to the particular
needs of their program.
Source: http://www.washington.edu/oea/services/research/program_eval/faq.html
How is evaluation different from assessment?
• The primary difference between evaluation and
assessment lies in the focus of examination. Whereas
evaluation serves to facilitate a program's development,
implementation, and improvement by examining its
processes and/or outcomes, the purpose of an
assessment is to determine an individual's or group's
performance by measuring their skill level on a variable
of interest (e.g., reading comprehension, math, or social
skills, to mention just a few). In line with this
distinction, and quite common in evaluating educational
programs where the intended outcome is often some
specified level of academic achievement, assessment
data may be used in determining program impact and
success.
Source: http://www.washington.edu/oea/services/research/program_eval/faq.html
Other Types of Evaluations
• Needs Assessments
• Return on Investment (ROI)
• Environmental Scan
• Learning Outcomes
• Can you think of others?
Know the Program
• Important to work closely with program
staff
• Educate yourself as much as possible
(content analysis, document analysis,
procedural audits, interviews with all
stakeholders, peer-reviewed journal
articles)
Goals of Program Evaluation
• Formative
• Process
• Summative
Know Your Audience
• Audience
– Who are you doing this evaluation for?
– Are they stakeholders?
– Do they benefit directly from the program?
– Are there political considerations (can you mediate them)?
– Internal or external evaluators
– Formal or informal investigation
– When is Institutional Review Board paperwork required (security)?
Know Your Target Population
• Get to know as much as you can about
your target population
• Work with program staff closely
• Observe, research, investigate, talk
• Ask questions
• Once you know as much as you can, decide
on a sampling technique based on the
goals of the evaluation
• Incentives
Office of the Dean of Students
Assessment Inquiry
• Do you routinely evaluate your unit?
– If so, on what cycle?
• Type of Evaluation
• If you evaluate routinely how far back do you have data?
• Do you generate any routine reports?
– If so, on what cycle, to whom, and how is it disseminated?
• Are other types of data collected or are one-time studies
available?
• Does your unit have an individual who works primarily
with data? If so, please provide their name and contact
information.
Theory and planning
• Know or establish program goals
– Logic model (inputs, outputs, and 1-, 3-, and 5-year goals)
– Learning outcomes (specific and measurable)
• Measure program goals, gaps, and needs
– What information do you need to know?
– Is this information already collected somewhere?
– How will you collect the information you don't already have? Spend more time using data than collecting it.
– What will you do with the information collected?
Methods & Tools
• Quantitative
– Frequencies (head counts)
– Surveys (phone, mail, online, in person)
• Qualitative
– Focus groups
– Individual interviews (phone, in person)
– Case Studies
• Mixed methods research
– Uses both qualitative and quantitative methods
Project designs
• Pre-test/post-test (sketch below)
• Case-control
• Cost-benefit analysis
• Procedural audit
• Quasi-experimental
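
For a pre-test/post-test design, the simplest analysis is a paired comparison of each participant's scores before and after the program. A minimal Python sketch, using made-up scores and SciPy's paired t-test:

# Minimal sketch: pre-test/post-test comparison for one group of
# participants. The scores below are made-up illustration data.
from scipy import stats

pre = [52, 60, 45, 70, 58, 63, 49, 55]   # hypothetical pre-test scores
post = [61, 64, 50, 75, 66, 70, 52, 60]  # same participants after the program

# Paired t-test: did scores change significantly after the program?
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean gain: {sum(post)/len(post) - sum(pre)/len(pre):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")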
Quantitative methods
• Surveys
– Likert scales (strongly agree to strongly disagree)
– Dichotomous response (yes or no)
– Continuous response (income)
– Categorical response (partner status)
• Frequencies (sketch below)
– Survey data
– Head counts
– Time counts
– Space size and quantity (e.g., two 20x20 rooms)
– Can you think of other frequencies relevant to your project?
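
A minimal pandas sketch of the frequency tabulations above; the column names and responses are hypothetical:

# Minimal sketch: tabulating frequencies from survey data with pandas.
import pandas as pd

responses = pd.DataFrame({
    "satisfaction": ["Agree", "Strongly agree", "Neutral", "Agree",
                     "Disagree", "Agree", "Strongly agree", "Neutral"],
    "used_service": ["Yes", "Yes", "No", "Yes", "No", "Yes", "Yes", "No"],
})

# Frequencies (head counts) and percentages for a Likert item
print(responses["satisfaction"].value_counts())
print(responses["satisfaction"].value_counts(normalize=True).round(2))

# Cross-tab of a dichotomous item against the Likert item
print(pd.crosstab(responses["used_service"], responses["satisfaction"]))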
Qualitative methods
• Case studies
• One-on-one interviews
• Focus groups
• Content analysis (websites, policies, internal documents)
Sampling
• Probability sample (sketch below)
– Random sample
– Stratified random sample
• Non-probability sample
– Convenience sample
– Snowball sample
– Quota sample
– Judgment/expert samples
• Methods to recruit
– Flyers
– Listservs
– Newspapers
– Classrooms
– Can you think of others?
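
A minimal sketch of the two probability samples above, drawn with pandas; the population frame and its column names are assumptions:

# Minimal sketch: simple random and stratified random sampling from a
# hypothetical population frame of 1,000 students.
import pandas as pd

frame = pd.DataFrame({
    "student_id": range(1, 1001),
    "college": ["LAS", "Engineering", "Business", "Education"] * 250,
})

# Simple random sample: every student has an equal chance of selection
simple = frame.sample(n=100, random_state=42)

# Stratified random sample: 10% drawn from each college, so each
# stratum is represented in proportion to its size
stratified = frame.groupby("college").sample(frac=0.10, random_state=42)
print(len(simple), stratified["college"].value_counts())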
Bias and Power
• Non-response bias
• Coverage bias
• Selection bias
• Sample size (sketch below)
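
Sample size is usually settled with a power analysis before data collection. A minimal sketch using the standard normal-approximation formula for comparing two group means; the alpha, power, and effect-size values are conventional defaults, not recommendations for any particular study:

# Minimal sketch: approximate sample size per group for a two-group
# comparison, via n = 2 * ((z_(1-alpha/2) + z_(1-beta)) / d)^2,
# where d is the expected standardized effect size.
from scipy.stats import norm

alpha, power, effect_size = 0.05, 0.80, 0.5  # conventional defaults
z_alpha = norm.ppf(1 - alpha / 2)
z_beta = norm.ppf(power)

n_per_group = 2 * ((z_alpha + z_beta) / effect_size) ** 2
print(f"~{n_per_group:.0f} participants per group")  # ~63 for d = 0.5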
Ethical Considerations
• The IRB must be consulted for exemptions and reviews
• Special protected populations
– Children, elderly, minorities, students
• Fact check with your participants and be sensitive to
positional power.
• Secure data and consent forms physically and
electronically.
• Destroy data according to institutional or IRB
recommendations
• Follow FERPA, HIPAA or other federal, state or local
laws regarding information gathering, storage, and
sharing for your population
http://irb.illinois.edu/
Recruiting
• Incentives paid in advance or upon completion
– Individual incentive
• Money, coupons, gift cards, swag
– Lottery incentive
• Support of community/preliminary notification
• Get the support of respected individuals in that
population to encourage participation
• Emotional appeal: what will this do for them or for people
they care about?
• Follow up requests and deadline extensions
• Brevity of instrument
Analysis of qualitative data
• Coding
– Data is reviewed and segmented. Each segment is assigned a
code or theme. Reports are prepared by summarizing the
frequency and distribution of the codes or themes.
• Summative evaluation
– Review the objectives of the evaluation, then use content
analysis of the coded notes to pull recurring concepts, judged
on being both internally homogeneous and externally
heterogeneous. The process is then repeated by two other
independent researchers to determine and verify the reliability
of the themes pulled; inter-coder reliability increases the
reliability of this analytical method (sketch below).
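
A minimal sketch of both steps, tallying code frequencies and checking agreement between two coders with Cohen's kappa (one common inter-coder reliability statistic); the segments and codes are made up:

# Minimal sketch: code frequencies plus Cohen's kappa for two coders
# who each coded the same eight interview segments (codes are made up).
from collections import Counter

coder_a = ["access", "cost", "access", "staff", "cost", "access", "staff", "cost"]
coder_b = ["access", "cost", "staff", "staff", "cost", "access", "staff", "access"]

# Frequency and distribution of codes for the report
print(Counter(coder_a))

# Cohen's kappa: observed agreement corrected for chance agreement
n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
kappa = (observed - expected) / (1 - expected)
print(f"agreement = {observed:.2f}, kappa = {kappa:.2f}")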
Analysis of Quantitative data
• Frequencies
• Contingency tables (categorical variables)
• Odds ratios (sketch below)
• Regression
• Power analysis
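
A minimal SciPy sketch of a contingency table with a chi-square test and an odds ratio; the counts are hypothetical (program participants vs. non-participants, retained vs. not retained):

# Minimal sketch: 2x2 contingency table, chi-square test, odds ratio.
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

#                 retained  not retained
table = np.array([[180, 20],     # participated (hypothetical counts)
                  [150, 50]])    # did not participate

chi2, p, dof, expected = chi2_contingency(table)
odds_ratio, fisher_p = fisher_exact(table)

print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
print(f"odds ratio = {odds_ratio:.2f}")  # (180*50)/(20*150) = 3.0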
Report writing outline
• Executive summary and abstract
• Literature review
• History of the program
• Evaluation
– Methodology
– Results
– Discussion
• Limitations
• Outcomes
• Recommendations
Report writing
• Present data in ways administrators can utilize it to
transform operations, re-envision goals, expand services
or collapse services.
• Always include an executive summary and never just
write an executive summary.
• For frequently written reports that are just summaries of
data, be sure to document methods and procedures for
institutional memory. Don't forget to record outcomes
and changes made in response to reports.
• Use visual aids such as charts, graphs, flow charts, etc. as
necessary and at a level that will be usable by your
audience (sketch below)
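
A minimal matplotlib sketch of one such visual aid, a bar chart of survey frequencies; the labels and counts are hypothetical:

# Minimal sketch: bar chart of Likert-item frequencies for a report.
import matplotlib.pyplot as plt

labels = ["Strongly agree", "Agree", "Neutral", "Disagree", "Strongly disagree"]
counts = [42, 87, 31, 12, 5]  # hypothetical response counts (n = 177)

fig, ax = plt.subplots(figsize=(7, 4))
ax.bar(labels, counts)
ax.set_ylabel("Number of respondents")
ax.set_title("'The program improved my study skills' (n = 177)")
plt.xticks(rotation=20)
plt.tight_layout()
plt.savefig("satisfaction_chart.png", dpi=150)  # embed in the report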
Remember that Data Flows
The assessment cycle: Planning & Budgeting → Assessable Outcomes → Instrumentation → Implementation → Tracking / Data Collection → Analysis / Evaluation → Application of Findings → Improvement, then back to planning.

Planning & Budgeting
• Mission, vision, and goals developed
• Unit goals aligned with the Division
• Programs based on assessable goals, with performance indicators
• Planning to available financial resources
• Funding based on priorities and results

Implementation
• Everyone in the Division implements goals

Tracking / Data Collection
• Needs/interest assessments
• Web-based surveying
• Assessment of learning outcomes
• Surveys: NSSE, Student Satisfaction, IPEDS, CORE, etc.
• Administrative program reviews
• Benchmarking studies
• HR data
• Program, service, and facilities evaluations
• Contributions to institutional accreditations
• Student record data

Analysis / Evaluation
• Judging the data against desired results
• Comparing data to peers
• Reporting data to internal constituents
• Demonstrating accountability to external stakeholders

Application of Findings
• Proposing improvement initiatives based on assessment findings
• Determining and effecting change based on the data

Improvement
• Separating data reporting from planning for improvement
• Improving assessment methods
Whitney, Karen. “Fear not the data…use the data to tell your story.” University of Illinois Student Affairs. Champaign. 25 Jan. 2010.
For More Information
See these Websites
• http://gsociology.icaap.org/methods/evaluationbeginnersguide.pdf
• http://managementhelp.org/evaluatn/fnl_eval.htm
• http://www.tgci.com/magazine/A%20Basic%20Guide%20to%20Program%20Evaluation.pdf
• http://extension.psu.edu/evaluation/
• http://www.programevaluation.org/
• http://www.uwex.edu/ces/pdande/evaluation/
• http://www.portal.mohe.gov.my/portal/page/portal/ExtPortal/MOHE_MAIN_PAGE/Tender_Contract/BUDGET/files/PLANNING_PROGRAM_EVALUATION.pdf
• http://ocw.jhsph.edu/courses/fundamentalsprogramevaluation/lectureNotes.cfm
• http://www.musc.edu/vawprevention/research/programeval.shtml
• http://teacherpathfinder.org/School/Assess/assess.html
• Sudman, S., & Bradburn, N. (1986). Asking Questions: A Practical Guide to Questionnaire Design. San Francisco: Jossey-Bass.