School-wide Positive Behavior Support
Evaluation Template
October, 2005
Rob Horner, George Sugai, and Teri Lewis-Palmer
University of Oregon
Purpose
This document is prepared for individuals who are implementing School-wide Positive
Behavior Support (PBS) in Districts, Regions or States. The purpose of the document is
to provide a formal structure for evaluating if School-wide PBS implementation efforts
are (a) occurring as planned, (b) resulting in change in schools, and (c) producing
improvement in student outcomes.
The organization of this template provides (a) an overview of the context within which
School-wide PBS is being used, (b) a common set of evaluation questions, (c) evaluation
instruments/procedures that address each question, and (d) samples of evaluation data
summaries that can be provided and used to build formal evaluation reports.
Context
School-wide positive behavior support (SW-PBS) includes a range of systemic and
individualized strategies for achieving social and learning outcomes while preventing or
reducing problem behavior for all students. School-wide PBS includes universal
prevention of problem behavior through the active teaching and rewarding of appropriate
social skills, consistent consequences for problem behavior, and on-going collection and
use of data for decision-making. In addition, School-wide PBS includes an array of more
intensive supports for those students with more severe behavior support needs. The goals
within School-wide PBS are to prevent the development of problem behavior, to reduce
on-going patterns of problem behavior, and to improve the academic performance of
students through development of a positive, predictable and safe school culture.
School-wide PBS is being implemented today in over 4300 schools throughout the
United States. Each of these schools has invested in training on school-wide PBS
strategies, has a team that is coordinating implementation, and is actively monitoring the
impact of implementation on student outcomes.
As more schools, districts, states and regions adopt School-wide PBS there will be an
increasing need to formally evaluate if these training and technical assistance efforts (a)
result in change in the way schools address social behavior, and (b) result in
change in the behavioral and academic outcomes for students.
Need for Evaluation
School-wide PBS will continue to be adopted across the U.S. only if careful, on-going
evaluation of the process and outcomes remains a central theme. Evaluation outcomes
will both document the impact of School-wide PBS, and guide improvement in the
strategies and implementation procedures. Evaluation may occur at different scales (one
school, versus a district, versus a state or region), and different levels of precision (Local
self-assessment, versus state outcome assessment, versus national research-quality
analysis). The major goal of evaluation is always to provide accurate, timely, valid and
reliable information that is useful for decision-making. The stakeholders and decisions
being made will always shape the evaluation. We recognize that the type, amount and
level of information gathered for an evaluation will vary. It is very likely that no two
evaluation efforts will be exactly the same. At the same time, there will be value in
identifying common evaluation questions, data sources, and reporting formats that may
be useful across evaluation efforts. This evaluation template is intended to benefit those
building evaluation plans to assess school-wide PBS. Our hope is that the measures and
procedures defined in the template will make it easier for others to design evaluation
plans, and that over time a collective evaluation database may emerge that will benefit all
those attempting to improve the social culture of schools.
In building an evaluation plan we recommend (a) beginning with the decisions that will
be made by stakeholders, (b) organizing the core evaluation questions that will guide
decision-making, (c) defining valid, reliable and efficient measures that address the
evaluation questions, and (d) presenting information in an iterative, timely and
consumable format.
Evaluation Decisions
Our experience suggests that most efforts to implement School-wide PBS begin with a
“special” investment by the state, region or federal government in a demonstration effort
designed to assess (a) if School-wide PBS can be implemented effectively in the local
area, (b) if School-wide PBS results in valued outcomes for children, families and
schools, and (c) if School-wide PBS is an approach that can be implemented in a cost-effective manner on a large scale.
The decisions that guide a formal evaluation will focus simultaneously on issues of (a)
accountability and oversight (e.g., did the project conduct the activities proposed?), (b)
the impact of the project (e.g., was there change in school practices, student behavior, or academic outcomes?), and (c) implications for further investment needed to take the effort
to a practical scale.
An on-going challenge for any evaluation of School-wide PBS is that the individual
behavior of children and adults functions as the target of intervention efforts, but the
“whole school” is the unit of most evaluation analyses. In essence, the goal of School-wide PBS is to create a “whole school” context in which individuals (both faculty and
students) are more successful. Most evaluations will reflect this attention to individual
behavior, with summaries that reflect the global impact on the whole school.
As evaluation plans are formed there are some common traps that are worth avoiding.
1. Evaluation plans are strongest when focused on real outcomes (change in school
practices and student behavior).
a. It is possible for evaluation reports to focus only on counts of training
events and participant satisfaction. These are necessary, but insufficient
pieces of information.
2. Evaluation plans should examine student outcomes only when School-wide PBS
practices have been implemented.
a. It is important to know first if training and technical assistance resulted in
change in the behavior support practices used in schools
b. An important “next” question is if those schools that implemented to
criterion saw change in student outcomes. If schools did not implement
School-wide PBS practices, we do not expect to see changes in student
outcomes.
3. Evaluation plans often focus only on initial training of demonstration sites, and
ignore the capacity development needed for large-scale implementation.
a. School-wide PBS efforts focus simultaneously on establishing
demonstrations of effectiveness (individual schools), and the capacity to
expand to a socially important scale. There often is the assumption that
initiatives start with a small demonstration, and only after the
demonstration is documented as viable and effective does work begin on
large-scale capacity building. Within School-wide PBS there is an
immediate emphasis on building the (a) coaching network, (b) local
trainers, and (c) formal evaluation structure that will be keys to taking
School-wide PBS to scale. Establishing a Leadership Team with broad
vision and mandate is part of the first step toward implementation of
School-wide PBS.
Evaluation Questions
In different contexts different evaluation questions will be appropriate. In general,
however, School-wide PBS will be implemented within the context of an initiative to
change school discipline across a portion of schools in a geographic area (district, state,
region). Efforts to provide evaluation of the School-wide PBS implementation effort
often will address the following evaluation questions:
1. Who is receiving training and support?
a. What schools are receiving implementation support?
b. What proportion of schools in the target area is implementing school-wide
PBS?
2. What training and technical assistance has been delivered as part of the
implementation process?
a. What training events have been conducted?
b. Who participated in the training events?
c. What was the perceived value of the training events by participants?
3. Has the training and TA resulted in change in the behavior support practices used
in schools?
a. Are the faculty in participating schools implementing universal school-wide PBS?
b. Are the faculty in participating schools implementing targeted and
intensive individual positive behavior support?
4. If schools are using SW-PBS is there an impact on student behavior?
a. Has there been a change in reported student problem behavior?
1. Office discipline referrals
2. Suspensions
3. Expulsions
4. Referrals to special education
b. Has there been change in student attendance?
c. Has there been change in student academic performance?
d. Has there been a change in perceived risk factors and protective factors
that affect mental health outcomes?
5. Has the Training and Technical Assistance resulted in improved capacity for the
state/district/region to sustain SW-PBS, and extend implementation to scale?
a. To what extent has the implementation effort resulted in improved
capacity of the area to train others in school-wide PBS?
b. To what extent has the implementation effort resulted in improved
capacity to coach teams in school-wide PBS procedures?
c. To what extent do local teams have evaluation systems in place that will
allow them to monitor and improve school-wide PBS?
d. To what extent does the state or district Leadership Team have an
evaluation system in place to guide broad scale implementation efforts?
6. Are faculty, staff, students, families, and community stakeholders satisfied?
a. Are faculty satisfied that implementation of school-wide PBS is worth the
time and effort?
b. Are students satisfied that implementation of school-wide PBS is in their
best interest?
7. Policy impact
a. Have changes in student behavior resulted in savings in student and
administrator time allocated to problem behavior?
b. Have policies and resource allocation within the area (district, school,
state) changed?
8. Implications
a. Given evaluation information, what recommendations exist for (1)
expanding implementation, (2) allocating resources, (3) modifying the
initiative or evaluation process?
Evaluation Measures/Instruments
Evaluation plans often incorporate an array of measures to address the core evaluation
questions. Some measures are purely descriptive, or uniquely tied to the local context.
Other measures may be more research-based, standardized, and experimentally rigorous.
Issues of cost, time, and stakeholder needs will affect which measures are adopted. To
accommodate variability in evaluation needs, a comprehensive model should offer
multiple measures (some more rigorous, and some more efficient) for key questions. The
list of measures provided below is not offered with the assumption that all measures
would be used in every evaluation plan, but that each plan may find a sampling of these
measures to be useful. We also recognize and encourage the use of additional, locally
relevant measures.
Each entry below gives the evaluation question or focus, the measure(s), the typical data collection cycle, and the metric and use of data.

Evaluation Question/Focus: Who is receiving training and technical support?
Measure: School Profile
Data Collection Cycle: Completed when a school begins implementation; updated annually.
Metric and Use of Data: Name, address, contact person, enrollment, grades, and student ethnicity distribution.

Evaluation Question/Focus: What training and TA has been delivered? Was the training and TA identified as useful by participants?
Measures: List of training events, persons participating, and training content; Training Evaluation Form.
Data Collection Cycle: Collected as part of each major School-wide PBS Team Training Event.
Metric and Use of Data: Documents the teams and individuals present, the content of training, and the participant perception of workshop usefulness.

Evaluation Question/Focus: Has the training and TA resulted in change in the behavior support practices used in schools? (Change in adult behavior)
Measure: Team Implementation Checklist (TIC)
Data Collection Cycle: The TIC is collected at the first training event, and at least quarterly thereafter until the 80% criterion is met.
Metric and Use of Data: The TIC provides a % implementation of Universal Level SW-PBS practices, plus a sub-scale score for each of the SET subscales.

Measure: EBS Self-Assessment Survey
Data Collection Cycle: The EBS Self-Assessment Survey is completed during initial training, and annually thereafter.
Metric and Use of Data: The EBS Survey produces the % of staff indicating whether School-wide, Specific Setting, Classroom, and Individual Student systems are in place, and important for improvement.

Measure: School-wide Evaluation Tool (SET)
Data Collection Cycle: The SET is completed annually as an external, direct observation measure of SW-PBS practice implementation.
Metric and Use of Data: The SET produces a total % score and a % score for seven subscales related to Universal level SW-PBS practices.

Measure: Individual-Student Systems Evaluation Tool (I-SSET)
Data Collection Cycle: The I-SSET is administered with the SET annually by an external evaluator.
Metric and Use of Data: The I-SSET produces three scores: the % to which “foundation” practices are in place for individual student support; the % to which “Targeted” practices are in place; and the % to which “Individualized, Function-based Support” practices are in place.

Measure: School-wide Benchmarks of Quality (BoQ) (Florida)
Data Collection Cycle: The BoQ is completed by school teams, and assesses the same features as the SET, but based on team perception.
Metric and Use of Data: The BoQ produces a summary score for implementation, and sub-scale scores for SET factors.

Evaluation Question/Focus: If SW-PBS is implemented at criterion, is there improvement in the social and academic outcomes for students?
Measure: School-wide Information System (SWIS)
Data Collection Cycle: SWIS data are collected and summarized continuously.
Metric and Use of Data: SWIS data indicate the frequency and proportion of office discipline referrals, suspensions, and expulsions.

Measure: School Safety Survey (SSS)
Data Collection Cycle: The SSS typically is administered annually by an external observer at the same time as SET evaluations.
Metric and Use of Data: The SSS produces a perceived Risk Factor score and a perceived Protective Factor score. The SSS is one index of the overall “safety” of the school.

Measure: Yale School Climate Survey (SCS)
Data Collection Cycle: The SCS is a direct survey of students and/or adults that is collected annually, or on a formal research/evaluation schedule.
Metric and Use of Data: The SCS produces a standardized score indexing the perceived quality of the social culture of the school.

Measure: State Academic Achievement Scores (unique to each state)
Data Collection Cycle: Annual assessment of literacy, math, etc.
Metric and Use of Data: The typical outcome is the proportion of students within identified grades (e.g., 3, 5, 8, 10) who meet state standards.

Evaluation Question/Focus: Have the training and TA efforts resulted in improved local capacity to implement SW-PBS?
Measure: SW-PBS Registry
Data Collection Cycle: Completed when the initiative begins, and maintained as new people are identified.
Metric and Use of Data: Provides a listing of the Leadership Team, Coordinators, Local Trainers, Coaching Network, Evaluation Team, and Schools Implementing SW-PBS.

Measure: Leadership Team Self-Assessment Survey
Data Collection Cycle: Completed by the Leadership Team at least annually.
Metric and Use of Data: Provides a summary score and sub-scale scores.

Evaluation Question/Focus: Are faculty, staff, students, families, and community stakeholders satisfied?
Measure: Faculty Impact Assessment
Data Collection Cycle: Completed 3-6 weeks after teaching school-wide expectations.
Metric and Use of Data: Provides a Likert-like rating of perceived usefulness of SW-PBS practices.

Measure: Student Impact Assessment
Data Collection Cycle: Completed 3-6 weeks after teaching school-wide expectations.
Metric and Use of Data: Provides an index of whether students learned school-wide expectations, and whether they find the SW-PBS process useful.

Evaluation Question/Focus: Do improvements sustain over time?
Measures: TIC, SET, BoQ, I-SSET; SWIS, SSS, Standardized Tests; Leadership Team Self-Assessment.
Data Collection Cycle: Annual assessment of the proportion of schools that meet criterion from one year to the next.
Metric and Use of Data: Provides a summary of the extent to which schools that reach criterion and produce valued gains sustain those achievements.

Evaluation Question/Focus: Policy and future implementation evaluation concerns.
Measure: Cost Analysis (unique to each initiative)
Data Collection Cycle: Collected annually.
Metric and Use of Data: Documents whether savings accrue as a function of investing in SW-PBS.
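
Several of the fidelity measures above (TIC, SET, BoQ) report percent-implemented scores against an 80% target criterion. The following is a minimal Python sketch of how a TIC-style summary score might be tallied; the item labels ("achieved", "in progress", "not started") and function names are illustrative assumptions, not the published instrument's format.

    # Minimal sketch of a TIC-style summary score. Assumes each checklist
    # item is scored "achieved", "in progress", or "not started" (an
    # illustrative convention; adapt to the actual checklist form).
    from collections import Counter

    def tic_summary(item_scores: list[str]) -> dict[str, float]:
        """Return % of items fully and partially implemented."""
        counts = Counter(item_scores)
        total = len(item_scores)
        return {
            "pct_fully_implemented": 100 * counts["achieved"] / total,
            "pct_partially_implemented": 100 * counts["in progress"] / total,
        }

    # Hypothetical 17-item checklist snapshot for one school.
    scores = ["achieved"] * 12 + ["in progress"] * 3 + ["not started"] * 2
    summary = tic_summary(scores)
    print(summary)  # ~70.6% fully implemented, ~17.6% partially implemented

    # A school is commonly treated as implementing "to criterion" once the
    # fully implemented score reaches 80% (the target criterion named above).
    meets_criterion = summary["pct_fully_implemented"] >= 80.0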
Evaluation Report Outline
This section provides an overview of how a School-wide PBS Evaluation Report may be
organized, and the evaluation data summaries that fit each section.
Purpose of Evaluation
The purpose section should indicate the intended audience (e.g., stakeholders) and
core decisions that are intended to be influenced by the evaluation report.
Description of School-wide PBS and History of Implementation
A section is recommended that provides the reader with a short overview of School-wide PBS, lists the major features of the approach, and offers a short history of School-wide PBS implementation in the District, Region, or State. This section may also
be the place to indicate the funding sources that support implementation efforts.
Evaluation Questions
A short section is recommended that operationally defines a list of specific
evaluation questions.
Evaluation Measures and Activities
An evaluation report typically covers a specific time period (e.g., six months, one year, a 3-year project). Indicate the timeframe of the overall project, and the specific timeframe covered in the report. Within this timeframe provide a table of the School-wide PBS implementation activities that were proposed, and those carried out.
List the specific measures used to collect data, and provide copies of the measures
in appendices. Where appropriate consider including published articles defining the
psychometric properties of research-quality instruments.
Evaluation Results
List the evaluation questions under consideration, the data source addressing each
question, and a summary of the current data. Take care to both present the data in an
objective and complete manner, and summarize the results to directly answer each
evaluation question.
Examples of possible evaluation questions and data sources are provided below
from Evaluation Reports prepared in Illinois (Dr. Lucille Eber), New Mexico (Cathy Jones and Carlos Romero), Iowa (Drs. Marion Panyon and Carl Smith) and Maryland
(Susan Barrett, Jerry Bloom and Milt McKenna).
Evaluation Question: Who is adopting School-wide PBS?
Data Source: School Profile; Registry of Teams; State List of Schools
Data Display: From Eber et al., Illinois Positive Behavior Interventions and Support
Progress Report 03-04.
Figure 1: Number of Schools Adopting PBIS by academic year.
[Bar graph: schools adopting PBIS per year. Year 1 (9/98): 23; Year 2 (9/00): 120; Year 3 (9/01): 184; Year 4 (6/02): 303; Year 5 (6/03): 394; Year 6 (6/04): 444.]
Table 1: Percentage of schools in Illinois adopting PBIS by region by implementation
year.
Percent of Total IL Schools Implementing PBIS Regionally as of June 2004

                            Chicago      North        Central      South       Total
Total # of Schools          602          1874         1049         666         4149
% of total in PBIS 9/99     0% (0)       0.7% (14)    0.6% (6)     0.5% (3)    0.6% (23)
% of total in PBIS 9/00     0.4% (2)     1.7% (32)    4.2% (44)    6.3% (42)   3.0% (120)
% of schools in PBIS 9/01   1.0% (5)     3.6% (68)    7.1% (74)    7.1% (47)   4.7% (194)
% of schools in PBIS 9/02   2.7% (15)    7.1% (133)   9.4% (99)    8.4% (56)   7.3% (303)
% of schools in PBIS 6/03   3.8% (21)    10.7% (201)  10.9% (114)  8.7% (58)   9.5% (394)
% of schools in PBIS 6/04   4.3% (24)    12.1% (227)  12.6% (132)  9.2% (61)   10.7% (444)
Evaluation Question: Are schools implementing School-wide PBS?
Data Source: Team Implementation Checklist (Target Criterion = 80%)
Data Display: Iowa Elementary School Team Implementation Checklist Data
[Bar graph: "Iowa Elementary Schools, Team Checklists 02-04, % Items Fully & Partially Implemented." Percent of checklist items fully and partially implemented at each administration (Sep '02 through Mar '04) for ten schools: Adams ES-D, Douds ES*, Iowa Valley ES*, Jackson ES-D, MLK ES-D, Monroe ES-D, Park Ave. ES-D, Prescott ES*, Stockport ES-P, and Stowe ES-D.]

Data Source: Team Implementation Checklist (Target Criterion 80% Total)
Data Display: Individual School Report from Iowa 03-04 (Panyon & Smith, 2004)

Data Source: EBS Survey Data (Target Criterion 50% "In Place")
Data Display: Illinois 02-03 District Evaluation (Eber et al., 2003)

[Bar graph: "2002-03 EBS Surveys, Systems 'In Place' Scores." Percent of survey items rated In Place, Partial, and Not in Place for ten schools: Bottenfield ES, Dr. Howard ES, Kenwood ES, Robeson ES, South Side ES, Washington ES, Westview ES, Columbia Ctr., Franklin MS, and Central HS.]

Data Source: EBS Survey
Data Display: Illinois 02-03 Individual Schools Evaluation (Eber et al., 2003)

[Bar graph: EBS Survey "In Place" scores for each of the ten schools, broken out by system: SW (School-wide), NC (Non-Classroom), CR (Classroom), and IS (Individual Student).]
Data Source: School-wide Evaluation Tool (SET) (Target Criteria 80% Total plus 80% Teaching)
Data Display: Illinois District-Level Assessment 03-04 (Eber et al., 2004)
[Bar graph: "SET Teaching & Mean Scores 03-04." SET Mean and Teaching subscale percentages for eleven schools: Carrie Busey ES, Columbia Ctr., Dr. Howard ES, Garden Hills ES, Kenwood ES, Edison MS, Franklin MS, Robeson ES, South Side ES, Stratton ES, and Vernon L. Barkstall ES.]
Data Source: School-wide Evaluation Tool (SET)
Data Display: Illinois Individual Schools Assessment 03-04 (Eber et al., 2004)
[Bar graph: "Middle School B, 2001-02 SET Scores." Percent of implementation for each SET subscale (Expect. Defined, Expect. Taught, Rewards, Violations System, Monitoring, Management, District Support) and the Mean Score, Fall 2001 versus Spring 2002.]
Summary Evaluation Questions from SET Scores:
1. What proportion of schools receiving training and TA have implemented SW-PBS to criterion (see the sketch following this list)?

[Bar graph: "Illinois Schools with SET Scores." Frequency of schools meeting the SET criteria versus total schools with SET scores, for 01-02, 02-03, and 03-04.]
2. What proportion of schools meeting SW-PBS criteria sustain across time?
a. Of 52 schools meeting the SET criteria in Illinois during 02-03, 85%
sustained or improved performance in SET in 03-04.
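
The 80/80 rule used in these summaries (a total SET score of at least 80% plus a Teaching subscale score of at least 80%) is the criterion named above. A minimal sketch of tallying the proportion of schools at criterion follows; the record layout and function names are illustrative assumptions, not a SET/SWIS data format.

    # Minimal sketch: proportion of schools meeting the SET "80/80"
    # criterion (>= 80% total score and >= 80% on the Teaching subscale).
    # The record layout is illustrative, not a SET/SWIS data format.

    def meets_set_criterion(total_pct: float, teaching_pct: float) -> bool:
        return total_pct >= 80.0 and teaching_pct >= 80.0

    def proportion_at_criterion(schools: list[dict]) -> float:
        met = sum(
            meets_set_criterion(s["set_total"], s["set_teaching"])
            for s in schools
        )
        return met / len(schools)

    # Hypothetical SET scores for four schools.
    schools = [
        {"name": "A", "set_total": 92, "set_teaching": 85},
        {"name": "B", "set_total": 88, "set_teaching": 70},  # Teaching < 80
        {"name": "C", "set_total": 81, "set_teaching": 80},
        {"name": "D", "set_total": 65, "set_teaching": 90},  # Total < 80
    ]
    print(proportion_at_criterion(schools))  # 0.5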
Evaluation Question: Is Implementation of SW-PBS Improving Student Outcomes?
Data Source: SWIS
Data Display: New Mexico SWIS summary compared to national averages for schools initiating SW-PBS implementation 03-04 (Jones, Romero, & Howarth, 2004)
                     National SWIS Schools                                           New Mexico Schools
Level                N          ODR/100 students/Year        ODR/100 students/Day    ODR/100/Day
Elementary           N = 508    Mean = 76; Median = 54.5     Mean = .43              Mean = .46 (N = 6 schools)
Middle (Jr. High)    N = 153    Mean = 199; Median = 132     Mean = .95              Mean = 1.32 (N = 5 schools)
High School          N = 29     Mean = 171; Median = 151.9   Mean = .99              Mean = .82 (N = 4 schools)
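
The ODR metrics in the table above normalize raw referral counts by enrollment (per 100 students) and then by instructional days, so schools of different sizes and calendars can be compared. A minimal sketch of that arithmetic (the function names are ours, not SWIS's):

    # Minimal sketch of the ODR rate metrics used above. The normalization
    # (referrals per 100 students, then per school day) follows the table;
    # function names are illustrative, not a SWIS API.

    def odr_per_100_per_year(total_odrs: int, enrollment: int) -> float:
        """Office discipline referrals per 100 students per year."""
        return 100 * total_odrs / enrollment

    def odr_per_100_per_day(total_odrs: int, enrollment: int,
                            school_days: int) -> float:
        """Office discipline referrals per 100 students per school day."""
        return odr_per_100_per_year(total_odrs, enrollment) / school_days

    # Hypothetical elementary school: 450 students, 180 school days,
    # 380 referrals across the year.
    print(odr_per_100_per_year(380, 450))      # ~84.4 ODR/100 students/year
    print(odr_per_100_per_day(380, 450, 180))  # ~0.47 ODR/100 students/day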
Data Source: SWIS
Data Display: Illinois Individual School Comparison of SET and ODR rates for one
school across three years.
[Line graph: "SET Total Score and ODR/100 Students/Year: One Chicago School." SET total score and ODR per 100 students per year across 01-02, 02-03, and 03-04.]
Data Source: SWIS
Data Display: Hawaii and Illinois SWIS summaries for Schools meeting and not meeting
SET criteria 03-04 (from Sugai et al., 2004)
[Bar graph: "Mean ODRs per 100 students per school day, Illinois and Hawaii Elementary Schools 2003-04 (No Minors)." Schools meeting the SET 80/80 criterion (N = 87) versus schools not meeting it (N = 53).]
Data Sources: SWIS
Data Display: Comparison of Triangle Summary from SWIS for Schools meeting and
not meeting SET criteria in Central Illinois, 03-04 (Eber et al., 2004).
[Stacked bar graph: "Central Illinois Elem, Middle Schools Triangle Summary 03-04." Mean proportion of students with 0-1, 2-5, and 6+ ODRs for schools that met the SET criteria (N = 23) and schools that did not (N = 12).]
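
The "triangle" summary above bands students by annual referral count (0-1, 2-5, 6+ ODRs), a common proxy for universal, targeted, and intensive support needs. A minimal sketch of computing those proportions from per-student counts (band boundaries follow the display; the names are illustrative):

    # Minimal sketch of a "triangle" summary: the proportion of students
    # with 0-1, 2-5, and 6+ office discipline referrals. Band boundaries
    # follow the display above; the code itself is illustrative.

    def triangle_summary(odrs_per_student: list[int]) -> dict[str, float]:
        n = len(odrs_per_student)
        bands = {"0-1 ODR": 0, "2-5 ODR": 0, "6+ ODR": 0}
        for count in odrs_per_student:
            if count <= 1:
                bands["0-1 ODR"] += 1
            elif count <= 5:
                bands["2-5 ODR"] += 1
            else:
                bands["6+ ODR"] += 1
        return {band: k / n for band, k in bands.items()}

    # Hypothetical per-student referral counts for a 200-student school.
    counts = [0] * 160 + [1] * 20 + [2] * 10 + [4] * 5 + [7] * 3 + [12] * 2
    print(triangle_summary(counts))
    # {'0-1 ODR': 0.9, '2-5 ODR': 0.075, '6+ ODR': 0.025}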
Data Sources: SWIS and Team Checklist
Data Display: Elementary Schools in New Mexico at Different Stages of SW-PBS
Implementation, 03-04 (Jones, Romero, & Howarth, 2004).
[Bar graph: "New Mexico 03-04 ODR and TIC (r = -.698)." TIC % and ODR per 100 students per day for ten schools.]
Data Sources: SWIS, and SET
Data Display: National Database on Out of School Suspension Rates per 100 Students
with an ODR, for Schools that do and do not meet the SET Criteria, and for students with
and without IEPs.
[Bar graph: "Elementary Schools 03-04: Rate of OSS per 100 Students (145 schools; 106,822 students)." Mean out-of-school suspension rate per 100 students for SW-PBS schools (N = 89) and non-SW-PBS schools (N = 56), shown for students without IEPs, with IEPs, and all students.]
Evaluation Question: Is Implementation of SW-PBS associated with improved
mental health outcomes (e.g., reduced risk factors and increased protective factors)?
Data Source: School Safety Survey
Data Display: Illinois summary of SSS scores for schools that met and did not meet SET
Criteria.
[Bar graphs: "SSS Mean Protective Factor Score: Illinois Schools 03-04" (t = 7.21; df = 172; p < .0001) and "SSS Mean Risk Factor Score: Illinois Schools 03-04" (t = -5.48; df = 134; p < .0001), each comparing schools that met the SET criteria with schools that did not.]
Evaluation Question: Is there improvement in Student Academic Outcomes when
SW-PBS is implemented to criterion?
Data Sources: SET and Standardized Achievement Scores
Data Display: Illinois mean proportion of 3rd Graders achieving state ISAT reading
standards for 8 schools meeting and 23 schools not meeting SET Criteria 02-03 (Eber et
al., 2004).
[Bar graph: "Proportion of 3rd Graders who meet or exceed state reading standards (ISAT) in Illinois schools 02-03" (t = 9.20; df = 27; p < .0001), comparing schools meeting and not meeting the SET criteria.]
Data Sources: SET and Standardized Achievement Scores
Data Display: Change in percentage of students meeting 3rd grade reading standards for
Elementary schools in one Oregon school district for schools that had met or not met SET
criteria for four consecutive years.
[Bar graphs: "Change in Percentage of Students Meeting Standards." One panel for 13 elementary schools with School-wide PBS and one panel for 6 elementary schools without School-wide PBS.]
School-wide PBS Training Evaluation:
Date of Training___________________________
                                                            Disagree            Agree
The training event was efficiently organized                1    2    3    4    5    6
The presenter(s) was knowledgeable                          1    2    3    4    5    6
The presenter(s) was organized and effective                1    2    3    4    5    6
The training materials were well organized                  1    2    3    4    5    6
The training materials were useful                          1    2    3    4    5    6
The physical accommodations for the training
were acceptable                                             1    2    3    4    5    6
The most helpful/useful features of the training were:
Features that would have improved the training were:
Other comments:
School-wide PBS
Faculty Evaluation Survey
The purpose of this survey is to gather information about the effectiveness of the school-wide positive behavior support training held within the last month, and to guide team efforts to organize future training in school-wide discipline.
Date:_____________________________
                                                            Disagree            Agree
The training activities focused on important
social messages for students.                               1    2    3    4    5    6
The training activities were well organized.                1    2    3    4    5    6
There was adequate time to plan the training
activities.                                                 1    2    3    4    5    6
The students responded well to the training
activities.                                                 1    2    3    4    5    6
The school-wide training helped with classroom
behavior of students.                                       1    2    3    4    5    6
We should do the school-wide training again
next year.                                                  1    2    3    4    5    6
The most valuable aspects of the training were:
Suggestions for improving training in the future:
School-wide PBS
Student Evaluation Survey
The purpose of this survey is to learn what you found useful about the training you
received on school-wide expectations, and how the training can be improved for the
future.
Date:___________________________
Your Grade:______________
Please list the behavioral expectations for your school:
How well do you understand what is expected of you at school?
    Not Clear                                    Very Clear
    1      2      3      4      5      6

How well do you believe other students follow the behavioral expectations?
    Not at all                                   Almost Always
    1      2      3      4      5      6

Do you believe it is worthwhile to provide orientation to the behavioral expectations for students coming to our school?
    Not important                                Very important
    1      2      3      4      5      6

What did you find most valuable about the training?

What would you recommend to improve the training?
Registry
Leadership Team, Trainers, Coaches
Leadership Team
    Name                          Contact Information

Implementation Coordinator(s)
    Name                          Contact Information

Implementation Trainers
    Name                          Contact Information

Coaches
    Name                          Contact Information

Evaluation Team
    Name                          Contact Information

School Implementation Teams
    Names            Roles            Contact Information