School Counselors and Program Evaluation
Washington School Counselors Association
John Carey
National Center for School Counseling Outcome Research
UMass Amherst
www.cscor.org
Data-Driven School Counseling Programs
• Implement comprehensive programs based on national design and local need
• Use data to determine directions (data-driven decision making, needs assessment)
• Measure results (program evaluation)
• Share successes
What Data Do We Use?
The ASCA National Model asserts there are three broad categories of data sources:
1. Student Achievement Data
2. Achievement-Related Data
3. Standards and Competency Data
Student Achievement Data
1. Norm-Referenced Standardized Tests
   – Scores referenced to national average
   – PSAT, SAT, ACT, Iowa, Metropolitan
   – Predictive validity
2. Criterion-Referenced Standardized Tests
   – Scores referenced to performance standards
   – State achievement tests (AIMS)
   – Content related to state curriculum frameworks
   – Content validity
Student Achievement Data
3. Performance tests or changes in achievement levels (advancement in Math or English, for example)
4. Portfolios
5. Course grades and GPA
6. Completion of college prep requirements
7. Drop-out rate
Achievement-Related Data
1. Attendance rates
2. Behavioral problems
3. Student attitudes
4. Discipline referrals
5. Suspension rates
6. Drug, tobacco, and alcohol use patterns
7. Parent involvement
8. Extracurricular activities
Standards- and Competency-Related Data
1. College Placements
2. Financial Aid Offers
3. Vocational Placements
4. Percentage of students who:
   – Have 4- or 6-year plans
   – Participate in job shadowing
   – Have completed career interest inventories
5. ASCA National Standards
Data Sources vs. Data Types
The ASCA National Model identifies three data types for use in program evaluation:
– Process Data – What was done for whom?
– Perception Data – Attitudes, opinions, and beliefs; generally self-report data
– Results Data – Objective and measurable student outcomes such as academic achievement, attendance, and disciplinary interventions
Program Evaluation: Process Data
Process Data: What was done for whom?
– Who received services? (Ninth graders? Students at risk of failing math?)
– What did they receive? (A curriculum intervention? A small-group intervention?)
– When did they receive it? (All year? Twice? For 30 minutes?)
– Where and how was it provided? (In the classroom? After school?)
Program Evaluation: Process Data
• Process data alone does not tell us whether the student is different (in behavior, attitude, or knowledge) as a result of the activity.
• Coupled with results data, process data can help identify what factors may have led to an intervention's success.
Program Evaluation: Perception Data
Perception data measures how students are different as a result of an intervention.
– Did students gain competencies?
  • Every 10th-grade student completed a career interest inventory.
  • 85% of 10th graders identified the 4 steps in the career decision-making process.
– Did they gain knowledge?
  • 87% of 9th graders demonstrated knowledge of graduation requirements.
– Were there changes in their attitudes or beliefs?
  • 86% believe that pursuing a career that is non-traditional for their gender is acceptable.
Program Evaluation: Perception Data
Differences in student knowledge, competency, and attitudes are measured through:
• Pre-post tests – What do students know/believe before and after the intervention?
• Completion of an activity – Completion of a 4-year plan, for example
• Surveys – What do students say they believe or know?
Program Evaluation: Results Data
Results data is the proof that the intervention has or has not influenced behavior.
– An intervention may occur (process data), and students may know the information (perception data), but the final question is whether the students are able to apply the knowledge, attitudes, and skills to affect behavior (results data).
Program Evaluation: Results Data
Results data can be complex because many factors influence behavior change.
– An increase in enrollment at a vocational high school may be due to an intervention implemented for middle school students. Conversely, finding no changes in results data does not necessarily mean that an intervention has been unsuccessful.
Clarifying Terms
• Research
• Action Research
• Program Evaluation
Program Evaluation Process
• Program evaluation allows practitioners to evaluate programs and interventions in their specific contexts.
• Practices can change immediately and in an ongoing manner as data are collected and analyzed.
Program Evaluation Process
1. Identify the construct(s)
2. Review what is known
3. Develop specific hypotheses or questions you would like to answer and plan the program evaluation accordingly
4. Gather the data
5. Analyze the data
6. Interpret results and disseminate and use findings
7. Evaluate the process
Conducting Program Evaluation: Identify the Construct(s)
1. Review the mission and goals of your program to identify the constructs you would like to look at. Ask yourself:
   – How are people different as a result of the school counseling program?
   – What is a question I want to answer?
   – What do I wish were different?
Conducting Program Evaluation: Identify the Construct(s)
• Constructs often have "sub-constructs."
• Define the construct/question in clear, specific language. The more specific you are, the easier it will be to measure later.
• You can use the ASCA National Model to help you define your construct.
Conducting Program Evaluation: Identify the Construct(s)
The ASCA National Model domains:
1. Personal/Social Development Domain
2. Academic Domain
3. Career Domain
Conducting Program Evaluation: What Is Already Known?
2. Review what is known about your questions.
   – Has anyone in your organization asked this question before?
   – Who might have information?
   – What is the relevant research in professional journals?
   – What does an Internet search find on this topic?
Conducting Program Evaluation: Develop Hypotheses
3. Develop hypotheses or questions you would like to answer and plan the research process accordingly.
   – Ask yourself what you think the answer(s) to your question(s) will be.
   – Identifying the hypothesis helps you identify your biases, which may impact your process.
   – What is the opposite of your hypothesis (the "null hypothesis")? What would the data look like if you were wrong?
Considerations
• What are your biases/mental models? How are they impacting the questions you're asking and the places you're looking?
• Evaluating findings from research literature and Internet searches:
  – What is the source? How reliable is it?
  – What are the strengths/weaknesses of the research design, sampling, effect size, measures used, treatment fidelity, researcher bias, and instrument reliability and validity?
Considerations
• Data:
  – How accurate is the data you've chosen to use?
  – What's missing?
• Instruments:
  – Reliability and validity
  – Just because it exists doesn't mean it's well done
Considerations
• Sampling and research design:
  – Size of sample
  – Comparability of sample and control
  – Matching vs. random assignment
  – Assuring fidelity of treatment (Are you doing the same thing across different groups?)
Considerations
• Ethical considerations:
  – Consent
  – Human Subjects Review
  – Denying access to interventions or remediation?
• Data analysis:
  – Do you have the capacity to analyze the data?
Conducting Program Evaluation: Develop Hypotheses
EXAMPLE:
Project Explorers – an after-school program at a vocational high school for middle school students
Hypotheses/Questions: Does participation in Project Explorers lead to:
• Increased enrollment?
• Increased awareness of the offerings of the vocational high school (VHS)?
• Career exploration?
• Improved career decision-making abilities?
Activity #1
With your school counseling program in mind, think about these questions to help you identify constructs:
– How are people different after participating in my program?
– What is a question I want to answer?
– What do I wish were different?
Use these questions to develop a specific, measurable question you would like to answer, and place the answer on the planning tool.
Conducting Program Evaluation: Gather Data
4. Gather the data.
   – What information do you need in order to answer your question?
   – Use multiple sources of data, or multiple outcome measures, wherever possible (triangulation).
   – Decide whether you need to consider student achievement data, psychosocial data, career data, school data, process data, perception data, and/or results data.
Conducting Program Evaluation: Triangulation
[Diagram, built up across four slides: THE QUESTION sits at the center of a triangle whose corners are Process Data, Perception Data, and Results Data. In the Project Explorers example, the corners are filled in as Results Data = increased enrollment; Process Data = participation (number of sessions, number of students); Perception Data = survey data.]
Project Explorers Example
• Results Data
  – Did the total number of students enrolling in vocational programs increase compared to the last three years?
• Perception Data
  – Survey using a combination of validated items from a pre-existing survey (MCGES) and "hand-written" items
Project Explorers Example (Perception Data)
Circle the correct response for questions 1–3.
1) Which school(s) prepares its graduates for skilled employment, such as a mechanic?
   VHS / My local high school / Both schools
2) Which school(s) prepares its graduates for 2-year colleges, such as Middlesex Community College?
   VHS / My local high school / Both schools
3) Which school(s) prepares its graduates for 4-year colleges, such as UMASS?
   VHS / My local high school / Both schools
4) List as many of the shops at VHS as you are aware of:
Project Explorers Example (Perception Data)
5) Check all that apply. I have used the following resources to learn about careers:
   ___ Internet
   ___ Expert or person working in the field
   ___ Counselor/Teacher
   ___ Other (please specify) _____________
   ___ Family member
   ___ I have not researched any career
6) Circle the answer that indicates the level of your confidence for each item.
   (Copyright laws prohibit duplication of these items – Missouri Comprehensive Guidance Evaluation Survey (MCGES), Lapan.)
7) Please check the 3 areas of employment that you are most interested in at this time.
   ___ Agriculture
   ___ Arts & Communication
   ___ Construction
   ___ Manufacturing
   ___ Education
   ___ Business and Computer Services
   ___ Health and Hospitality Services
   ___ Transportation
Project Explorers Example (Process Data)
Process data documents "what was done for whom":
– Number of students participating (overall)
– Attendance at each shop
Activity #2
• Identify the PROCESS, PERCEPTION, and RESULTS data needed to answer your question/hypotheses
• Use the planning tool!
Conducting Program Evaluation: Gather Data
• Where is the data?
  – Does it already exist? (School records, intake forms, test results)
  – Will you generate your own data? (Surveys, interviews, observations)
• Multiple data sources help you more accurately get at the complexity of a situation, whereas one measure or data source will give you only a snapshot view.
Conducting Program Evaluation: Gather Data
• Select and/or develop the instruments you will use to gather the data. Possibilities include (more later):
  – Surveys
  – Tests (of achievement, aptitude, attitude, etc.)
  – Behavioral checklists or observations
  – School records
  – Performance assessments
  – Interviews
Conducting Program Evaluation: Gather Data
• Identify and follow ethical and legal standards:
  – No participant should be exposed to physical or psychological harm.
  – Permission to use confidential data must be obtained.
  – Participation in a study is always voluntary.
  – Participants may withdraw from the study at any time.
  – Participants' privacy rights must be respected.
Conducting Program Evaluation: Gather Data
• Identify the group/sample to be studied (a sketch of these sampling approaches follows this list):
  – Ideally, either the entire population is involved in the study (the class, grade, or school) or the group studied is a random sample of the population.
  – Stratified sampling uses a smaller sample that has the same proportions as the larger population.
  – Systematic random sampling is when every xth student is chosen from the whole population (every 4th student on the attendance list, for example).
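A minimal Python sketch of these sampling approaches; the roster, group sizes, and sampling fractions are invented for illustration:

```python
import random

# Hypothetical roster standing in for a school's attendance list
roster = [f"student_{i}" for i in range(1, 401)]

# Simple random sample: every student has an equal chance of selection
simple_sample = random.sample(roster, k=100)

# Systematic random sample: every 4th student from a random starting point
start = random.randrange(4)
systematic_sample = roster[start::4]

# Stratified sample: sample each subgroup in proportion to its size
strata = {"grade_9": roster[:200], "grade_10": roster[200:]}
stratified_sample = [s for group in strata.values()
                     for s in random.sample(group, k=len(group) // 4)]
```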
Conducting Program Evaluation: Gather Data
• Once data sources and measures are identified, ethical standards are considered, and the sample is identified, data can be gathered!
• Ways to collect data include:
  – Questionnaires or surveys
  – School records
  – Interviews
  – Observation
Collecting Data: Demographics
• Regardless of what other types of data you collect and analyze, demographic data is a critical first step.
• Collect the types of demographic data that are important for making sense of your results, and for describing who contributed to the data you collected.
Collecting Data: Demographics
Questions to consider:
– What information will I need to adequately describe my sample?
  • Think of the information you will need to provide to let unfamiliar people know who your program served.
– What information will I need to analyze the data the way I want to?
  • Think of the various ways you can describe your results: for example, the impact on different ethnic groups, special education vs. regular education, etc.
Demographic Variables
• Ethnicity
• Gender
• Class (Parent Educational Level)
• Language Level (Limited English Proficient)
• Low Income (Free or Reduced School Lunch)
• Acculturation, Migration (Mobility)
• Special Needs
• School Performance (GPA, Achievement Quartile)
Demographic Variables (school)
• Student Participation in School Programs
• Student Participation in Extracurricular Activities
• Grade Level
• Age
• Number of Years in Current School
• Parent Participation Level
Identifying Effective Surveys
Key questions to consider:
1. How does the survey relate specifically to my program's goals?
2. Is the survey appropriate for my age group of students?
3. How reliable and valid is the survey?
Creating Your Own Surveys
1. Define Constructs
2. Select/Write Items
3. Write Instructions
4. Test Survey
5. Edit Items and Instructions
Developing Your Own Surveys: Writing Items
Sometimes pre-existing surveys and measures are not able to adequately capture the attainment of your program's goals, or the question you would like to answer.
Open-ended vs. closed questions:
– Open-ended questions can provide rich data but are hard to summarize.
  "What do you think about Project Explore?"
  "Why is career planning important?"
– Closed questions are most common in surveys because the results are easy to summarize.
Developing Your Own Surveys: Writing Items
Characteristics of Good Likert Items
• The Likert technique presents a set of attitude statements. Subjects are asked to express agreement or disagreement on a five-point scale. Each degree of agreement is given a numerical value from one to five, so a total numerical value can be calculated from all the responses or for subsets of responses.
• 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Disagree nor Agree, 4 = Agree, 5 = Strongly Agree
Developing Your Own Surveys: Writing Items
Characteristics of Good Likert Items
• Simple words – readability.
• Use precise words – avoid ambiguity.
• Reverse-stated items – guard against response set (see the scoring sketch below).
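As an illustration of the one-to-five scoring and of reverse-stated items, here is a minimal Python sketch; the item names and responses are hypothetical:

```python
# Responses on a 1-5 Likert scale (1 = Strongly Disagree ... 5 = Strongly Agree)
responses = {"item1": 4, "item2": 5, "item3": 2}  # hypothetical respondent
reverse_items = {"item3"}  # items worded in the opposite direction

def scored(item, value):
    # Flip reverse-stated items so that 5 always represents the same attitude
    return 6 - value if item in reverse_items else value

total = sum(scored(item, value) for item, value in responses.items())
print(total)  # 4 + 5 + (6 - 2) = 13
```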
Developing Your Own Surveys: Writing Items
• ISOMORPHISM – Each statement should clearly map onto one construct definition.
• SINGULARITY – Each statement should contain one idea; avoid "double-barreled" items, such as:
  "I work hard in school because I have high expectations for myself"
Developing Your Own Surveys: Writing Items
• SOCIAL DESIRABILITY MANAGEMENT – Avoid questions that may have more socially appropriate responses:
  "Teachers are good people to go to for help"
• KNOWLEDGE LIABILITY – Each statement should be answerable by the potential respondents:
  "My career decision-making abilities have increased because of Project Explore"
Demographic Items
• "Closed" format whenever possible
• Meaningful categories
• Within the knowledge of respondents
Developing Your Own Surveys: Writing Instructions
Instructions should be clearly defined, easy to understand, and as brief as possible. Include the following:
• Purpose of the survey
• How to answer the items
• Whether responses are anonymous
• That honest answers are requested
• That respondents should answer all items
• That the survey is not a test
Developing Your Own Surveys: Testing the Survey
Why test the survey?
1. Identify possible problems
2. Evaluate wording of items
3. Ensure clarity
4. Assess the amount of time required to complete the survey
Developing Your Own Surveys: Testing the Survey
• Pre-testing items (AKA "pilot testing" – not the same as pre/post testing):
  – The survey is administered to a small number of "readily available" people.
  – Respondents answer items as though part of the study.
  – Feedback is given about the survey and the items themselves.
Developing Your Own Surveys: Revising the Survey
• Make changes to survey items based on "pilot test" information and participant feedback.
• Administer the surveys to participants.
Activity #3
• Identify the source of each type of data:
  – Process data
  – Perception data
  – Results data
• What demographic data do you need to collect?
  – Is the impact of your program different for different groups?
  – Will the results of your evaluation vary based on demographic characteristics?
Conducting Program Evaluation: Analyze the Data
5. Analyze the data:
   – Before data can be analyzed, they may need to be edited, coded, and organized for analysis.
   – Data are often entered into a software program such as Excel or SPSS.
   – EZAnalyze is an inexpensive (free for you) data analysis program designed specifically for educators – for more information, visit www.ezanalyze.com
Conducting Program Evaluation: Analyze the Data
Descriptive statistics describe the data and can provide information about how a group has changed over time (see the sketch after this list):
– Measures of central tendency: mean, median, mode
– Measures of variability: variance, standard deviation, range
– Measures of relative standing: percentile rank
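A minimal sketch of these statistics using Python's standard library; the scores are made-up example data:

```python
import statistics

scores = [70, 85, 85, 90, 100, 100, 100, 115, 130]  # hypothetical test scores

print(statistics.mean(scores))    # central tendency: mean
print(statistics.median(scores))  # central tendency: median
print(statistics.mode(scores))    # central tendency: mode
print(statistics.stdev(scores))   # variability: standard deviation
print(max(scores) - min(scores))  # variability: range

# Percentile rank: the share of scores at or below a given score
print(100 * sum(s <= 100 for s in scores) / len(scores))
```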
Conducting Program Evaluation: Analyze the Data
Inferential statistics provide additional information (a sketch of the group-comparison case follows below):
– Looking carefully at how a group changes over time: use t-tests or chi-square
– Looking at differences between control and intervention groups: use t-tests or chi-square
– Looking at differences among more than 2 groups: use analysis of variance (ANOVA)
Finding a member of the evaluation team who is already comfortable with data input and analysis can make this part much less intimidating!
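A minimal sketch of the control-vs.-intervention comparison using scipy's independent-samples t-test; scipy is an assumption here (EZAnalyze or Excel can run the same test), and the data are invented:

```python
from scipy import stats

# Hypothetical days absent for intervention and control groups
intervention = [3, 5, 2, 4, 6, 3, 2, 5]
control = [7, 9, 6, 8, 10, 7, 9, 8]

t, p = stats.ttest_ind(intervention, control)
print(f"t = {t:.2f}, p = {p:.3f}")  # a small p (e.g., < .05) suggests a reliable difference
```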
Statistics 101
Using data requires an understanding of some basic statistics – nothing too fancy is needed!
Having a handle on some common terms will allow you to make sense of all the numbers and increase your ability to use data.
Statistics 101 – Common Terms
• N – number of participants
• Mean – the "average" score: all scores are added up and divided by N
• Standard Deviation – how far, on average, a single score deviates from the mean score
Statistics 101 – Common Terms
A frequency histogram can be used to show how people scored on a variable – this is useful for demonstrating how several of these concepts work.
[Histogram: number of people (y-axis) by score on an IQ test (x-axis, 55–145).]
Statistics 101 – Common Terms
[Three histograms of IQ scores, each with Mean = 100, showing SD = 15, SD = 30, and SD = 10: the larger the standard deviation, the wider the spread of scores around the mean.]
Statistics 101 – Common Terms
• Median – the "middle" number, obtained by putting all the observed values in order and finding the one that lands in the middle. Useful for describing "skewed" distributions.
• Mode – the most frequent number.
Statistics 101 – Common Terms
[Histograms illustrating central tendency: a symmetric IQ distribution (Mean = 100, Median = 100, Mode = 100); a skewed IQ distribution (Mean = 100, Median = 90, Mode = 80); and a skewed household income distribution, in thousands (Mean = 62, Median = 47, Mode = 40).]
Statistics 101 – Common Terms
• Z-Score – a "standardized score": the person's score minus the mean, divided by the standard deviation (a sketch of the calculation follows below)
• Percentile Rank – tells you the relative position of a person's score compared to other people's scores
[Bell curve image from http://www.webenet.com/bellcurve2.gif]
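A minimal sketch of the z-score arithmetic and the percentile rank it implies under a normal curve; the IQ numbers are illustrative and scipy is an assumption:

```python
from scipy.stats import norm

score, mean, sd = 115, 100, 15  # hypothetical IQ score and norms
z = (score - mean) / sd         # z-score: distance from the mean in SD units
percentile = 100 * norm.cdf(z)  # percentile rank under the normal curve
print(z, round(percentile))     # 1.0 84
```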
Statistics 101 – Common Applications
• Categorical Variable – a variable that divides data into groups; has little or no numeric meaning
• Dependent Variable – a variable that contains information you are interested in and that has numeric value
• Disaggregation – sorting a dependent variable by a categorical variable (or variables)
• Correlation – a number between -1 and +1 used to describe the relationship between two variables (see the sketch below)
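A minimal pandas sketch of disaggregation and correlation; the column names and values are hypothetical:

```python
import pandas as pd

df = pd.DataFrame({
    "gender": ["M", "F", "M", "F", "M", "F"],  # categorical variable
    "days_absent": [8, 4, 10, 5, 6, 3],        # dependent variable
    "gpa": [2.5, 3.4, 2.1, 3.2, 2.9, 3.6],
})

# Disaggregation: mean of the dependent variable within each category
print(df.groupby("gender")["days_absent"].mean())

# Correlation between two numeric variables (falls between -1 and +1)
print(df["days_absent"].corr(df["gpa"]))
```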
Statistics 101 – Common Applications
[Disaggregation graphs: mean number of days absent disaggregated by ethnicity (African American, Latino, White, Asian/PI, American Indian, Other), and then disaggregated by ethnicity and gender (male/female).]
Statistics 101 – Common Applications
• T-Tests
  – One sample – compare a group to a known value (for example, comparing the IQ of convicted felons to the known average of 100)
  – Paired samples – compare one group at two points in time (for example, comparing pretest and posttest scores)
  – Independent samples – compare two groups to each other
• ANOVA – compare two or more groups, OR compare one group at two or more points in time (repeated measures); a sketch of each test follows below
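A minimal scipy sketch of each variant; all numbers are invented and scipy is an assumption:

```python
from scipy import stats

pre = [40, 55, 48, 60, 52]    # hypothetical pretest scores
post = [58, 70, 63, 74, 66]   # posttest scores for the same students
boys = [9, 11, 8, 12, 10]     # days absent, two independent groups
girls = [5, 6, 4, 7, 5]
groups = ([3, 5, 4], [7, 8, 6], [10, 12, 11])  # three groups for ANOVA

print(stats.ttest_1samp(pre, popmean=50))  # one sample vs. a known value
print(stats.ttest_rel(pre, post))          # paired samples (pre/post)
print(stats.ttest_ind(boys, girls))        # independent samples
print(stats.f_oneway(*groups))             # ANOVA across three groups
```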
Statistics 101 – Common Applications
[Bar chart: paired samples t-test comparing mean scores on the pretest and posttest.]
Statistics 101 – Common Applications
[Bar charts: independent samples t-test comparing mean days absent for male and female students, shown twice with different y-axis scales.]
Statistics 101 – Common Applications
[Overlaid histograms of days absent for male and female students illustrating a non-significant t-test, a significant t-test, and a t-test that becomes non-significant when the standard deviations increase.]
Statistics 101 – Common Applications
[Bar chart: ANOVA comparing mean days absent across ethnicity groups (African American, Latino, White, Asian/PI, American Indian, Other).]
Statistics 101 – Common Applications
[Scatterplots: days absent vs. grade total shows a strong negative correlation (r = -.92, shown on two slides); days absent vs. IQ shows a weak correlation (r = .12).]
Analyzing Data: Creating the Data Template
For the data to work with EZAnalyze, they need to be structured a certain way:
– The first row MUST contain variable labels.
– The remaining rows MUST contain the data from the surveys, one row for each person responding.
– A few rules can be applied to help you properly structure your data template.
Analyzing Data: Creating the Data Template
Each survey should have a method for coding that will allow you to match what is entered into the Excel file with the paper-and-pencil survey:
– Number the surveys.
– Use a coding system.
– If identifying information is available, take steps to ensure that the confidentiality of respondents is maintained.
Analyzing Data: Creating the Data Template
Each POSSIBLE RESPONSE should be given its own column in the data template:
– For each question that allows only one response, each question will become one column.
– For each question that has multiple responses, each possible response will require a column.
Analyzing Data: Creating the Data Template
[Screenshot: sample of what the data will look like when only one response is possible for each question.]
Analyzing Data: Creating the Data Template
Question: Who do you trust in school?
trust1 = counselor
trust2 = teacher
trust3 = principal
trust4 = cafeteria
trust5 = custodian
A sketch of this layout follows below.
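A minimal pandas sketch of the one-column-per-response layout, using the trust1–trust5 coding above; the responses and the q1 column are invented:

```python
import pandas as pd

# One row per respondent; the multi-response trust item gets one 1/0
# column per option, while a single-response item (q1) gets one column.
data = pd.DataFrame({
    "id": [1, 2, 3],      # matches the number written on each paper survey
    "q1": [2, 1, 3],      # single-response item: one column
    "trust1": [1, 0, 1],  # counselor
    "trust2": [0, 1, 1],  # teacher
    "trust3": [0, 0, 1],  # principal
    "trust4": [1, 0, 0],  # cafeteria
    "trust5": [0, 0, 0],  # custodian
})
data.to_csv("survey_data.csv", index=False)  # open in Excel/EZAnalyze
```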
Analyzing Data: Creating the Data Template
If you have PRETEST and POSTTEST data for a group of students, you will want to have both the pretest and the posttest for each student in the same row.
Analyzing Data: Checking for Accuracy
Once all of your data are entered, you need to check to make sure the data were entered accurately:
– If you have a lot of data, you can select a sample (10%) to spot-check for accuracy.
– You can use EZAnalyze's Descriptive Statistics function to get the range of scores contained in your dataset, then use the SORT function of Excel to find problem data (see the sketch below).
– This is where having the ID number on BOTH the survey and in the Excel file comes in handy!
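A minimal sketch of an automated range check for spotting entry errors; the file name, column, and valid range are hypothetical:

```python
import pandas as pd

df = pd.read_csv("survey_data.csv")

# Flag rows whose q1 value falls outside the valid 1-3 range
problems = df[(df["q1"] < 1) | (df["q1"] > 3)]
print(problems)  # use the id column to pull the matching paper survey
```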
Analyzing Data
This is initially difficult to get a handle on if you have not done this sort of thing before.
Using the EZAnalyze manual and tutorials combined with your own data will make the process more concrete.
Activity #4
• How will you analyze the data to answer your questions?
  – Will statistical significance be determined?
  – Will disaggregating the results be useful? If so, how?
  – Remember to use data analysis procedures that will answer your hypothesis.
• Key terms and statistics (usually true):
  – Relationship = correlation
  – Increase = improvement from pretest to posttest: a paired samples t-test
  – Differences among groups (based on demographic characteristics or treatment/control groups) = independent samples t-test, ANOVA, or chi-square
Conducting Program Evaluation: Interpret Results
6. Interpret your results, and disseminate and use findings to inform practice.
   – What do the results of your analyses mean?
   – What did you find out?
   – Were your hypotheses correct?
Conducting Program Evaluation: Interpret Results
The goal of program evaluation is to use the results to inform practice. Some options include:
– Make recommendations that will resolve a problem.
– Make plans and decisions about interventions based on the findings.
– Make program plans based on the findings.
– Develop action plans based on the findings.
Conducting Program Evaluation
• Report conclusions (the accountability part):
  – Decide on audience(s).
  – Structure the report/presentation so that the most relevant information is presented to the audience.
  – Don't exclude important information.
  – Present all relevant results, even if they don't support your hypotheses.
  – Relate findings to the purposes of the evaluation, the hypotheses, and previous research.
  – Make recommendations and decisions based on conclusions.
Thank You!
Center for School Counseling Outcome Research
www.cscor.org
Visit the "Resources" section of the website for copies of the materials.