Anchoring Essentials: PowerPoint 3 Overview

How Do You Know When
Your Programs Really Work?
Evaluation Essentials for Program Managers
Session 3: DATA ANALYSIS
Anita M. Baker, Ed.D.
Evaluation Services
Hartford Foundation for Public Giving,
Nonprofit Support Program: BEC
Bruner Foundation
These materials are for the benefit of any 501c3
organization. They MAY be used in whole or in part
provided that credit is given to the Bruner Foundation.
They may NOT be sold or redistributed in whole or part
for a profit.
Copyright © by the Bruner Foundation 2012
* Please see supplementary materials for a sample agenda, activities and
handouts
Bruner Foundation
Rochester, New York
How to Use the Bruner Foundation Evaluation Essentials for Program Managers PowerPoint Slides
The Evaluation Essentials for Program Managers slides were developed as part of a Bruner Foundation
special project by evaluation trainer Anita Baker (Evaluation Services) and jointly sponsored by the
Hartford Foundation for Public Giving. They were tested initially with a single organization in Rochester, NY
(Lifespan) as part of the Evaluation Support Project 2010. The materials were revised and re-tested with
three nonprofit organizations as part of the Anchoring Evaluation project in 2011-12. The slides, intended for
use in organizations that have already participated in comprehensive evaluation training, include key basic
information about evaluation planning, data collection, and analysis in three separate presentations.
Organization officials or evaluation professionals working with nonprofit organization managers are
encouraged to review the slides, modify their order, and add or remove content according to training needs.
(Please note that the final session includes general information about analysis planning as well as analysis
of both quantitative and qualitative data, and presentation of findings. Specific strategies related to data
collection, e.g., analysis of survey data or interview data, and information about development of tables and
graphs are included in the supplementary PowerPoint presentation.)
Additional Materials
To supplement these slides there are sample agendas, supporting materials for activities, and other
handouts. "Placeholder" slides, showing a picture of a target with an arrow in the bullseye, signify places
where activities can be undertaken; be sure to move or eliminate these depending on the planned agenda.
Other, more detailed versions of the Evaluation Essentials materials are also available in
Participatory Evaluation Essentials: An Updated Guide for Nonprofit Organizations and Their Evaluation
Partners and the accompanying 6-session slide presentation. These materials are also available on the
Bruner Foundation and Evaluation Services websites free of charge.
Whether you are an organization leader or an evaluation professional working to assist nonprofit organization
staff, we hope that the materials provided here will support your efforts.
When you have finished using the Evaluation Essentials for Program Managers series, have
trainees take our survey: https://www.surveymonkey.com/s/EvalAnchoringSurvey
What is Evaluation Anyway?
Program Evaluation: Thoughtful, systematic collection and analysis of information about activities,
characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties and
inform decisions.

Participatory Evaluation: Trained evaluation personnel and practice-based decision-makers coming
together to learn about, design, conduct, and use results of program evaluation.
How are evaluation data collected?
• Interviews
• Surveys
• Observations
• Record Reviews

All have limitations and benefits, and all require preparation on the front end:
• Instrument development and testing
• Administration plan development
• Analysis plan development
Evaluation Data Collection Options
Qualitative Data

Interviews: Conducting guided conversations with key people knowledgeable about a subject.

Focus Groups: Facilitating a discussion about a particular issue/question among people who share
common characteristics.

Observations: Documenting visible manifestations of behavior or characteristics of settings.

Quantitative Data

Surveys: Administering a structured series of questions with discrete choices.

Record Review: Collecting and organizing data about a program or event and its participants from
outside sources.

External Record Review: Utilizing quantitative data that can be obtained from existing sources.
Surveys:
• Series of items with pre-determined response choices
• Can be completed by administrator or respondents
• Can be conducted "paper/pencil," by phone or internet (e-survey), or using alternative strategies

USE SURVEYS TO:
• Study attitudes and perceptions
• Collect self-reported assessment of changes in response to program (pre/post)
• Collect program assessments
• Collect some behavioral reports
• Test knowledge
• Determine changes over time

Instruments are called surveys, "evaluations," questionnaires.
Interviews:
• One-sided conversation with questions mostly predetermined, but open-ended; respondents answer
in their own terms
• Can be conducted in person or by phone, one-on-one or in groups

USE INTERVIEWS TO:
• Study attitudes and perceptions
• Collect self-reported assessment of changes in response to program
• Collect program assessments
• Document program implementation
• Determine changes over time

Instruments are called protocols, schedules, or guides.
Observations:
• Conducted to view and hear actual program activities, so users of reports will know what events
occur and how
• Can be focused on programs overall, participants, or pre-selected features

USE OBSERVATIONS TO:
• Document program implementation
• Witness levels of skill/ability, program practices, behaviors
• Determine changes over time

Instruments are called protocols, guides, checklists.
Record Reviews:
• Accessing existing internal information, or information collected for other purposes
• Can be focused on own records, records of other organizations, or adding questions to existing
documents

USE RECORD REVIEW TO:
• Collect some behavioral reports
• Conduct tests, collect test results
• Verify self-reported data
• Determine changes over time

Instruments are called protocols.
What happens after data are collected?
1. Data are analyzed, results are summarized.
2. Findings must be converted into a format that can be shared with others.
3. Action steps should be developed from findings:
   "Now that we know _____, we will do _____."
Important Data-Related Terms
• Data can exist in a variety of forms:
  – Records: numbers or text on pieces of paper
  – Digital/computer: bits and bytes stored electronically
  – Memory: perceptions, observations or facts stored in a person's mind
• Qualitative, Quantitative
• Primary v. Secondary Data
• Variables (Items)
• Unit of Analysis
• Duplicated v. Unduplicated
• Unit Record (Client-level) v. Aggregated
Plan your Analysis in Advance!
• What procedures will be conducted with each set of data, and who will do them?
• How will data be coded and recoded?
• How will data be disaggregated (i.e., broken out, for example by participant characteristics or time)?
• How will missing data be handled?
• What analytical strategies or calculations will be performed (e.g., frequencies, cross-tabs)?
• How will comparisons be made?
• What statistical testing, if any, is needed?
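One way to make such a plan concrete is to write it down in a structured form before analysis begins. The sketch below is a minimal illustration in Python; every file name, variable, code, and grouping in it is a hypothetical example, not part of the original materials.

```python
# A hypothetical analysis plan captured as a simple Python dictionary.
analysis_plan = {
    "dataset": "participant_survey.csv",                # assumed file name
    "procedures_and_owner": "frequencies and cross-tabs; program manager",
    "coding": {"q2": {"Yes": 1, "No": 0}},              # how items are coded
    "disaggregate_by": ["gender", "age_group", "site"], # planned break-outs
    "missing_data": "exclude case from that item (report valid percents)",
    "calculations": ["frequencies", "valid percentages", "cross-tabs"],
    "comparisons": ["within group over time", "against program targets"],
    "statistical_tests": ["chi-square on intensity x outcome"],
}

for step, detail in analysis_plan.items():
    print(f"{step}: {detail}")
```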
Analysis Plan Specifics,
You Must Decide . . .
• What procedures will be conducted with each set of data, and who will do them
• How data will be grouped or partitioned
• What types of codes will be applied to the data
• How comparisons will be made:
  – Data to other project data (within group)
  – Data to expectations
  – Data to data from other sources (across groups)

There is no single process!
Analyzing (Quantitative) Data:
A Few Important Terms*
• Case: individual record (e.g., 1 participant, 1 day, 1 activity)
• Demographics: descriptive characteristics (e.g., gender)
• Disaggregate: to separate or group information (e.g., to look at data for males separately from
females); conducting crosstabs is a strategy for disaggregating data
• Duplicated/Unduplicated: e.g., counting the number of individuals at events (duplicated) vs.
counting the number of events for each individual (unduplicated)
• Partition (v.): another term that means disaggregate
• Unit of Analysis: the major entity of the analysis, i.e., the what or the whom being studied
(e.g., participants, groups, activities)
• Unit Record (i.e., client level) v. Aggregate (i.e., group level)
• Variable: something that changes (e.g., number of hours of attendance)

*common usage
Quantitative Data Analysis: Basic Steps
1. Organize and arrange data (number cases as needed).
2. Scan data visually.
3. Code data per analysis plan.
4. Enter and verify data.
5. Determine basic descriptive statistics.
6. Recode data as needed (including missing data).
7. Develop created variables.
8. Re-calculate basic descriptive statistics.
9. Conduct other analyses per plan.
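Steps 4 through 8 can be pictured with a small example. The sketch below is a minimal illustration in Python with pandas; the attendance data and intensity cut-offs are invented for demonstration.

```python
import pandas as pd

# Step 4: enter data (invented attendance records; None marks missing data).
df = pd.DataFrame({
    "case_id": [1, 2, 3, 4, 5, 6],
    "hours":   [12, 30, None, 45, 8, 30],
})

# Step 5: basic descriptive statistics.
print(df["hours"].describe())

# Step 6: handle missing data per the analysis plan (here: drop the case).
valid = df.dropna(subset=["hours"]).copy()

# Step 7: develop a created variable (hypothetical intensity groups).
valid["intensity"] = pd.cut(valid["hours"], bins=[0, 10, 30, 100],
                            labels=["low", "medium", "high"])

# Step 8: re-calculate descriptive statistics with the created variable.
print(valid["hours"].describe())
print(valid["intensity"].value_counts())
```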
Quantitative Data Analysis Strategies
Important Things to Look at or Summarize
• Frequencies: how often a response or status occurs
• Total and Valid Percentages: frequency / total × 100
• Measures of Central Tendency: mean, median, (modes)
• Distribution: minimum, maximum, groups (percentiles, quartiles, etc.)
• Cross-Tabulations: relationship between two or more variables (also called contingency analyses;
can include significance tests such as chi-square analyses)

Useful Second-Level Procedures (see the sketch below)
• Means testing (ANOVA, t-tests)
• Correlations
• Regression analyses
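For the second-level procedures, a statistics package is typically used. The sketch below is a minimal illustration in Python with scipy on invented numbers; it shows a t-test and a correlation, not a procedure prescribed by these materials.

```python
from scipy import stats

# Hypothetical outcome scores for two program groups.
group_a = [3, 4, 4, 5, 6, 7]
group_b = [2, 3, 3, 4, 4, 5]

# Means testing: independent-samples t-test.
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Correlation: invented hours-attended vs. outcome-score pairs.
hours = [5, 10, 15, 20, 25, 30]
score = [2, 3, 4, 4, 6, 7]
r, p = stats.pearsonr(hours, score)
print(f"r = {r:.2f}, p = {p:.3f}")
```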
Analyzing Quantitative Data
Important Things to Look at or Summarize
Calculate frequencies
What that means: count how many there are of something; count how often something (e.g., a
response) occurs.
Example questions you could answer: How many participants were in each group? What were the
demographics of participants? How many answered "Yes" to Question 2?

Calculate total and/or valid percentages
What that means: frequency / total × 100.
Example questions you could answer: What proportion of participants met intensity targets? What
proportion of all those who answered Question 2 said "Yes"?
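As a worked example, the sketch below (Python with pandas, on invented survey responses) computes frequencies, total percentages, and valid percentages for one item; the "valid" figures use only the cases that answered.

```python
import pandas as pd

# Hypothetical responses to Question 2 (None = did not answer).
q2 = pd.Series(["Yes", "No", "Yes", None, "Yes", "No", None, "Yes"])

counts = q2.value_counts()                         # frequencies
total_pct = counts / len(q2) * 100                 # % of all 8 cases
valid_pct = q2.value_counts(normalize=True) * 100  # % of the 6 who answered

print(counts)      # Yes: 4, No: 2
print(total_pct)   # Yes: 50.0% of all cases
print(valid_pct)   # Yes: ~66.7% of valid answers
```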
Analyzing Quantitative Data
Important Things to Look at or Summarize
Determine central tendencies
What that means: calculate the average (mean), or identify the median (middle value) or mode
(most common value).

  Mean = sum of values / total number of values
  (e.g., average hours = total # of hours / total # of people with hours)

Example questions you could answer: What is the average number of hours participants attend?
What is the most common number of days attended in a week? (mode)
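A minimal sketch of the three measures, using Python's standard-library statistics module on invented attendance figures:

```python
import statistics

# Hypothetical hours of attendance for 7 participants.
hours = [10, 12, 12, 15, 20, 25, 40]

print(statistics.mean(hours))    # average: sum of values / number of values
print(statistics.median(hours))  # middle value: 15
print(statistics.mode(hours))    # most common value: 12
```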
Analyzing Quantitative Data
Important Things to Look at or Summarize
Determine distributions
What that means: determine the minimum value, the maximum, and/or how the data are grouped
(e.g., high, medium, or low values, quartiles, percentiles, etc.).
Example questions you could answer: What was the least amount of attendance for the group? What
was the most? How many participants fall into low, medium, and high intensity groups?

Cross-tabulations (pivot tables are crosstabs)
What that means: relationship between 2 or more variables (also called contingency analyses; can
include significance tests such as chi-square analyses).
Example questions you could answer: Are there relationships between participant characteristics and
outcome changes?
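Distributions and cross-tabulations, sketched in Python with pandas on invented data; the intensity cut-offs are hypothetical.

```python
import pandas as pd

# Invented participant data.
df = pd.DataFrame({
    "gender":   ["F", "M", "F", "F", "M", "M", "F", "M"],
    "hours":    [5, 12, 22, 35, 8, 28, 40, 15],
    "improved": ["yes", "no", "yes", "yes", "no", "yes", "yes", "no"],
})

# Distribution: minimum, maximum, and grouping into intensity levels
# (the cut-offs 10 and 25 are hypothetical).
print(df["hours"].min(), df["hours"].max())
df["intensity"] = pd.cut(df["hours"], bins=[0, 10, 25, 50],
                         labels=["low", "medium", "high"])
print(df["intensity"].value_counts())

# Cross-tabulation: participant characteristic vs. outcome change.
# (scipy.stats.chi2_contingency could test this table for significance.)
print(pd.crosstab(df["intensity"], df["improved"]))
```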
Coding and Data Entry
1. Create codebook(s) as needed (identify codes and affix them to instrument copies).
2. Create electronic database when possible (use Excel, SPSS, SAS).
3. ID/create unique identifiers for cases and affix or enter as needed.
4. Enter or extract data as needed (do not recode as data are entered).
5. Make (electronic or paper) copies of your data.
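A small sketch of these steps in Python with pandas: a hypothetical codebook, unique case IDs, and data entered exactly as recorded, with recoding deferred to analysis time (per step 4). All names and values are invented.

```python
import pandas as pd

# Step 1: codebook mapping raw responses to numeric codes (hypothetical).
codebook = {"Strongly agree": 4, "Agree": 3, "Disagree": 2, "Strongly disagree": 1}

# Steps 2-3: electronic database with unique case identifiers.
df = pd.DataFrame({
    "case_id": ["P001", "P002", "P003"],
    "q1_raw":  ["Agree", "Strongly agree", "Disagree"],  # step 4: entered as recorded
})

# Recoding happens at analysis time, not during entry.
df["q1_coded"] = df["q1_raw"].map(codebook)

# Step 5: keep a copy of the data.
df.to_csv("survey_data_backup.csv", index=False)
```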
Analysis of Qualitative Data
Analytical Strategies Similar for Qualitative and Quantitative Data
• Consider how you plan to use findings: who is the audience? what format works best?
• Plan your analysis in advance:
  – How do the data fit within the overall evaluation plan and relate to other data?
  – How will findings fit in the overall report plan?
  – How will you code, display, and draw conclusions about data?
  – How will you validate/verify and adjust your findings?
• Be careful interpreting data!
Steps to Take When Analyzing
Qualitative Data
1. Segment or partition data (i.e., divide it into meaningful analytical units).
2. Reduce data: code data, compare data.
3. Organize, summarize, and display data.
4. Draw conclusions, verify/validate results.
5. Revise summaries and displays accordingly.

The process is iterative.
Coding Qualitative Data
1. A priori or deductive codes: predetermined categories based on accepted theory or program
knowledge
2. Inductive codes: based on raw data (not predetermined)
3. Hierarchical codes: larger categories with subcategories in each

You can combine inductive and deductive codes within a hierarchical coding scheme, as sketched
below.
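Below is one way to picture such a combined scheme: a hypothetical hierarchical codebook sketched in Python, in which the parent categories are a priori and the starred subcodes emerged inductively. All category names are invented.

```python
# Hypothetical hierarchical coding scheme. Parent categories are a priori
# (from the program model); starred subcodes emerged inductively from raw data.
codebook = {
    "BARRIERS": ["transportation", "scheduling", "childcare*"],
    "BENEFITS": ["new skills", "confidence*", "social connection*"],
    "PROGRAM FEEDBACK": ["staff", "facilities", "curriculum"],
}

for parent, subcodes in codebook.items():
    print(parent, "->", ", ".join(subcodes))
```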
Coding Strategies and Reminders
1. Keep a master list of codes.
   – Distinguish a priori and inductive codes.
   – Re-apply codes to all segments.
2. Use multiple codes, but keep coding schemes as simple as possible.
3. Test out sample entries to identify potential problems before finalizing code selections.
4. Check for inter-/intra-coder reliability (consistency); a simple agreement check is sketched below.
   – Coding is not exact (expect differences).
   – Co-occurring codes (more than one applies).
   – Face-sheet codes (descriptors).
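A common, simple consistency check is percent agreement: the share of segments to which two coders assigned the same code. The sketch below (Python, with invented codes) computes it; more formal indices such as Cohen's kappa exist but are not covered in these materials.

```python
# Codes assigned by two coders to the same ten segments (hypothetical).
coder_1 = ["BARRIER", "BENEFIT", "BENEFIT", "BARRIER", "OTHER",
           "BENEFIT", "BARRIER", "BENEFIT", "OTHER", "BENEFIT"]
coder_2 = ["BARRIER", "BENEFIT", "BARRIER", "BARRIER", "OTHER",
           "BENEFIT", "BARRIER", "OTHER", "OTHER", "BENEFIT"]

matches = sum(a == b for a, b in zip(coder_1, coder_2))
agreement = matches / len(coder_1) * 100
print(f"Inter-coder agreement: {agreement:.0f}%")  # 80% in this example
```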
Enumeration
• A strategy for organizing, summarizing, and displaying qualitative data
• Quantify frequency of codes (e.g., none, some, a lot, or as a percentage) or types
• Use counts to define results (e.g., most responses were positive; all responses fell into 4
categories and the category most exemplified was __________)
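Enumeration can be as simple as counting coded segments. A minimal sketch in Python, with invented codes:

```python
from collections import Counter

# Codes applied to interview segments (hypothetical).
coded_segments = ["positive", "positive", "barrier", "positive",
                  "suggestion", "barrier", "positive", "positive"]

counts = Counter(coded_segments)
total = len(coded_segments)
for code, n in counts.most_common():
    print(f"{code}: {n} ({n / total:.0%})")
# e.g., "most responses were positive" (5 of 8 segments)
```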
Negative Findings
• Explain the results and what they mean, and why they occurred if possible
• Clarify how negative the findings are
• Don't blame negative findings on bad evaluation
• Clarify the next course of action
• Clarify what did work, and for whom
• Avoid a milquetoast approach
• Don't be reluctant to report
Inaccurate Findings
• Determine cause
• Disseminate errata if necessary, or recall the report
• Communicate with stakeholders why results will not be usable
Inconclusive Findings
• Present in an unbiased fashion
• Indicate that conclusions cannot be drawn
• Develop a plan to correct evaluation or program problems if necessary
Positive Findings
• Explain the results and what they mean, and why they occurred if possible
• Clarify how positive the results are, who they worked for, and how
• Don't distrust positive results (but be careful to avoid biased designs)
• Report positive results and celebrate accomplishments
• Clarify the next course of action
• Resist making assumptions about the next iteration
• Design careful follow-up
Evaluation Reporting: Initial Steps
1. Clearly identify your audience. Staff? Funders? Board? Participants? Multiple audiences?
2. Determine what presentation strategies work best:
   • PowerPoint
   • Newsletter
   • Fact sheet
   • Oral presentation
   • Visual displays
   • Video
   • Storytelling
   • Press releases
   • Report (full report, executive summary, or stakeholder-specific report?)
Components of a Strong Program Evaluation Report

Introduction
• Description of the subject program
• Clear statement about the evaluation questions and the purpose of the evaluation

Methods
• Description of actual data collection methods

Findings
• Summary of key findings (including tables, graphs, vignettes, quotes, etc.)
• Discussion or explanation of the meaning and importance of key findings

Conclusions
• Suggested action steps
• Next steps (for the program and the evaluation)
• Issues for further consideration (loose ends)
Think About Communication Strategies
Are there natural opportunities for sharing
(preliminary) findings with stakeholders?
• At a special convening
• At regular or pre-planned meetings
• During regular work interactions (e.g., clinical
supervision, staff meetings, board meetings)
• Via informal discussions
Additional Reporting Tips
 Convert findings to shareable form(s).
 Think about internal and external reporting.
 Plan for multiple reports.
 Before you start writing, be sure to develop an outline
and pass it by some stakeholders.
 If you’re commissioning an evaluation report, ask to see
a report outline in advance.
 Review the evaluation reports of others carefully for the
important components and meaningfulness.
Before You Present Your Findings,
Answer These Questions
• Do your findings accurately reflect the data you collected?
• How might your interpretation be inaccurate?
• Are there any unintended consequences that might result from sharing these findings?
• Are there any missing voices you overlooked?
Sharing Findings: ‘ate’ Steps
1. Deliberate – Spend time with people close to the evaluation work and confirm the findings. You
must convince yourself (or yourselves) first.
2. Anticipate – Determine how you want to use the findings and what
value might be derived from the findings for the program/process.
3. Investigate – Once you have findings, test them with key
stakeholders. They will shed light on perceived value of the findings.
4. Calibrate – Develop a result sharing mechanism that can convey the
message you want to convey to your chosen audience.
5. Illuminate – Remove any unnecessary details and highlight the ‘key
findings’.
6. Substantiate – Take a step away from the work and come back to it
later with fresh eyes. Ask yourself, “Do the findings still resonate?”
7. Annotate – Proofread the final draft. Misteaks can distract from
results.
8. Communicate – Share the results!