
2010-2011 SWPBIS
Executive Summary
Timothy J. Runge, Co-Principal Investigator
Mark J. Staszkiewicz, Co-Principal Investigator
Kevin H. O'Donnell, Research Assistant
Indiana University of Pennsylvania
January 2012
Acknowledgements
The following agencies and organizations are acknowledged for their collaboration with
the authors on this project: Pennsylvania Department of Education (PDE); Pennsylvania Bureau
of Special Education (BSE); Pennsylvania Training and Technical Assistance Network
(PaTTAN); Pennsylvania Positive Behavior Support Network (PAPBS Network); Office of
Special Education Programs (OSEP) Technical Assistance Center on Positive Behavioral
Interventions and Supports (PBIS); Educational and Community Supports at the University of
Oregon; the Educational and School Psychology Department at Indiana University of
Pennsylvania (IUP); the IUP Research Institute, and the IUP School of Graduate Studies and
Research.
Specific recognition is extended to Dr. James Palmiero, Director of PaTTAN Pittsburgh;
Ronald Sudano, Statewide SWPBIS Coordinator; Dr. Tina Lawson, Eastern Regional SWPBIS
Coordinator; Lisa Brunschwyler, PaTTAN Consultant; and Teresa Stoudt, Central Regional
SWPBIS Coordinator. We acknowledge past and present IUP Educational and School
Psychology Research Assistants Rebecca Tagg, Aleksey Aleskeev, Melissa Gilroy, Cong Xu,
Kevin O'Donnell, and Stephen McFall. Thanks are also offered to Celeste Dickey and the staff
of the University of Oregon's Educational and Community Supports.
Most importantly, our deepest admiration is bestowed on PAPBS Network schools,
supportive local communities, and collaborating mental health agencies that work tirelessly to
develop, implement, and improve the SWPBIS framework in their educational buildings.
Data analysis and the summation of results that formed the basis of this Executive
Summary were supported in part by a contract from PaTTAN / Intermediate Unit 1, PDE,
and BSE. Opinions expressed within are solely those of the authors and do not necessarily
reflect the position of the funding agencies or the Pennsylvania Department of Education;
no endorsement should be inferred.
2010-2011 PAPBS Network SWPBIS Executive Summary
Preface
The purpose of this Executive Summary is to present outcome data related to the
implementation of School-Wide Positive Behavioral Interventions and Supports (SWPBIS) in
schools that are members of the Pennsylvania Positive Behavior Support Network (PAPBS
Network). The general framework for the 2010-2011 evaluation of SWPBIS in PAPBS Network
schools is based on recommendations of leaders in the field of large-scale SWPBIS evaluations
(i.e., Algozzine et al., 2010).
Methodology
Data for this program evaluation come from a combination of direct and indirect sources.
When publicly available, data were collected independently by the researchers. Confidential
data (e.g., referrals to special education, office discipline referrals) were voluntarily submitted by
participating PAPBS Network schools. Consistent with the IUP Institutional Review Board
approval of this research (Log No. 08-251), schools were assured that these sensitive
data would be aggregated and publicly reported in a manner that ensured their anonymity. In no
case will an individual school or group of schools be identified with specific outcome results.
Analyses of outcomes were performed using inferential statistics when complete
longitudinal data were available or when cross sectional analyses were appropriate.
Statistically significant findings indicate a high probability that the changes over time are
real changes rather than a statistical artifact due to random sampling error. In those cases,
the conclusion is that SWPBIS is related to the observed changes over time.
Caveats
Readers should consider results and interpretations contained within this Executive
Summary with some caution due to the following limitations.

• SWPBIS was implemented in a small number of PAPBS Network schools beginning in
fall 2007. A considerable number of schools were trained in 2009-2010 with
implementation occurring thereafter. As such, data are aggregated into two distinct
cohorts given the different contexts in which SWPBIS was implemented across cohorts
(R. Horner, personal communication, December 9, 2011). Throughout this report, the 33
schools that began implementation in fall 2007 are referred to as "cohort 1" and the
schools that began implementing as early as 2009-2010 are referred to as "cohort 2."

• Given that significant changes in outcomes related to school reform efforts such as
SWPBIS are not expected for a few years (Curtis, Castillo, & Cohen, 2008; McGlinchey
& Goodman, 2008), present results should reveal changes in outcomes within cohort 1
schools but not cohort 2 schools.

• Absence of complete longitudinal data for certain outcome variables limited the extent to
which pre- and post-SWPBIS implementation changes could be analyzed. A longitudinal
methodology is preferred as it provides more confidence that the observed changes in the
data are associated with SWPBIS implementation and therefore generalizable to other
schools. Cross sectional approaches were used when robust longitudinal data were
unavailable from schools. Within cross sectional designs, the number of schools for
which data are reported will change from one year to the next as a consequence of which
and how many schools submitted data. Cross sectional analyses require a much more
cautious and conservative interpretation of SWPBIS effects on outcome variables.

• An inherent bias may be present in these results as a product of a school's willingness to
voluntarily share its data with the researchers. Readers should, therefore, be mindful that
most schools complying with data submission requests were schools that were
successfully implementing SWPBIS.

• The method by which the implementation status of SWPBIS was measured presents a
threat to the validity of findings. From fall 2007 to fall 2008, fidelity of SWPBIS
implementation was documented via the Team Implementation Checklist (TIC; Sugai,
Horner, & Lewis-Palmer, 2002, 2009). Many schools began using the more
psychometrically sound Benchmarks of Quality (BoQ; Kincaid, Childs, & George, 2005)
in spring 2009. In spring and fall 2010 many schools were independently audited using
the Schoolwide Evaluation Tool (SET; Sugai, Lewis-Palmer, Todd, & Horner, 2005).
When multiple sources of fidelity data were available for a school during the same
general period of time, a greater reliance was placed on SET and BoQ data over TIC data.

• This evaluation utilized an ex post facto design in which schools were not randomly
assigned to treatment and control groups. It is possible that PAPBS Network schools
implementing SWPBIS differ from the typical, non-SWPBIS school for any number of
reasons. In the absence of a true experimental design, with random
assignment of SWPBIS and non-SWPBIS schools, cause and effect relationships cannot
be concluded.

• Release of this Executive Summary is at the discretion of PDE, BSE, and PaTTAN.
Analyses and interpretations contained within are solely the opinions of the authors
and do not necessarily reflect the views of the sponsoring agencies. Additionally, the
authors are not responsible for any misrepresentation of these results.
Introduction
School-Wide Positive Behavioral Interventions and Supports (SWPBIS) is a three-tiered
system that "establish[es] the social culture and individualized behavior supports needed for a
school to be a safe and effective learning environment for all students" (Sugai & Horner, 2009, p.
309). All students and staff in a building are exposed to the Tier 1, or universal school-wide,
practices which are intended to prevent problematic and disruptive behavior from occurring.
These school-wide practices include careful consideration of all school environments to increase
adult supervision and minimize inappropriate behavior; systematic and explicit instruction of
behavioral rules and expectations in all school settings; reinforcement of desirable behavior
through a token economy system and educative verbal recognition; and frequent review of
multiple data to evaluate efficacy of these school-wide practices. Considerable empirical
evidence (Bradshaw, Mitchell, & Leaf, 2010; Luiselli, Putnam, Handler, & Feinberg, 2005;
Muscott, Mann, & LeBrun, 2008; Spaulding et al., 2010; Sugai & Horner, 1999) documents that
for elementary school populations, over 80% of the student population responds well to this
universal, school-wide level of prevention. That is, these students receive one or no disciplinary
referrals for inappropriate behavior in an entire school year. The percentage of students for
whom this is the case in middle and high schools drops to approximately 73% and 67%,
respectively. Despite this relative decline in the upper grades, school-wide prevention efforts
work for large majorities of students.
Approximately 15-30% of all students (depending on elementary or secondary grade
status) do not respond favorably to school-wide universal prevention efforts as evidenced by
cumulative office discipline referrals of two to five per academic year (Spaulding et al., 2010).
These students require supplemental behavioral intervention and supports in addition to the
school-wide prevention techniques (Crone, Horner, & Hawken, 2004; Lewis & Sugai, 1999;
March & Horner, 2002). Tier 2, or strategic, interventions typically include small group
counseling or therapy, implementation of commercially-available standard protocol
interventions, or interventions tailored from brief functional behavioral assessments (Walker et
al., 1996). The goal of these strategic interventions is to provide students with academic,
behavior, social, and emotional skills to minimize the barriers they face, thus augmenting the
effectiveness of tier 1 school-wide techniques.
Even with high fidelity tier 1 school-wide behavioral supports and strategic tier 2
interventions fully in place, a small percentage of students still fail to respond appropriately.
That is, these students exhibit chronic externalizing behaviors in schools and typically receive six
or more office discipline referrals in an academic year. For these students, highly individualized
and intensive supports are needed in conjunction with the tier 1 and 2 supports. This tertiary
level of intervention is student-centered and family-oriented in that supports are implemented not
only for the student, but also for the family given that there are often significant needs that
extend across all the student's ecologies. Positive behavior support plans (PBSP) and intensive
wrap-around services are typically implemented across multiple life domains (Eber, Sugai,
Smith, & Scott, 2002). Research suggests that 3-8% of students require this level of support,
with higher percentages occurring in secondary grades (Spaulding et al., 2010).
Purpose of PA SWPBIS Evaluation
PDE, BSE, and PaTTAN selected an initial cohort of 33 schools to implement a SWPBIS
framework beginning in fall 2007. Participating schools received training, onsite and on-going
technical assistance, and other resources from PaTTAN in exchange for their long-term
commitment to this project and willingness to submit data on key outcome variables. A second
group of schools was trained by PaTTAN, Intermediate Units (IU), and PAPBS Network
Facilitators beginning in fall 2009. Some of these second cohort schools began implementing in
2009-2010, although most did not commence implementation until 2010-2011.
No Child Left Behind (NCLB; 2002) and the Individuals with Disabilities Education Act
(IDEA; 2004) firmly place the onus on educational systems to document the effects of practices
implemented in public schools, and SWPBIS receives no special exemption from this
requirement. Annual reports of the PAPBS Network SWPBIS efforts were synthesized for two
consecutive years (Runge & Staszkiewicz, 2009, 2010). An executive summary report (Runge &
Staszkiewicz, 2011) was released to the public in January 2011 via the PAPBS Network website
www.papbs.org. The present Executive Summary highlights results from the 2006-2011
evaluation report (Runge, Staszkiewicz, & O'Donnell, 2011).
Framework of PAPBS Network SWPBIS Evaluation
The general framework offered by the OSEP Technical Assistance Center on PBIS
(Algozzine et al., 2010) served as the organizational structure for the present Executive
Summary. This structure is based on five broad domains: Context; Input; Fidelity; Impact; and
Replication, Sustainability, and Improvement. The data reported within are very similar to data
presented in previous reports; however, the organization of this program evaluation is different.
Each domain is summarized below and has a corresponding section within this Executive
Summary:
1. Context - explicitly stated goals of SWPBIS implementation; documentation of what
training and support were provided for implementation; documentation of who provided
this training and support to schools; which school staff attended the team training; and
which schools received support to implement
2. Input - documentation of professional development content and activities; participants'
satisfaction with team training and on-going support; and the depth and breadth of
technical support provided to participating schools
3. Fidelity - the quality with which SWPBIS framework was implemented as prescribed
4. Impact - effect of SWPBIS on outcomes including office discipline referrals (ODRs),
out-of-school suspensions, in-school suspensions, instructional time regained, school
safety, school organizational health, staff retention rates, and academic achievement
5. Replication, Sustainability, and Improvement - the sustainability of implementation in
schools; the capacity to replicate, or scale-up, SWPBIS in other schools and districts; and
documentation of economic, political, and legislative efforts to establish SWPBIS as a
core mechanism by which schools operate
Context of PA SWPBIS
Stated Goals
Implementation of SWPBIS in Pennsylvania is under the leadership of a diverse set of
stakeholders including PDE, BSE, PaTTAN, Pennsylvania Governor's Commission on Children
and Families, private providers, Pennsylvania Departments of Health and Public Welfare,
advocacy groups, and higher education. As indicated on the PAPBS Network (n.d.) website
(http://www.papbs.org), the goals are:

• Develop and implement a school-wide cross-system approach for supporting the academic
and emotional well-being of all students using research-based positive behavioral supports
and strategies of varying intensity: 1) universal or preventative strategies for the benefit of all
students; 2) secondary strategies for those who will achieve with enhanced supports; and 3)
tertiary or intensive services for those who will achieve with intensive and coordinated
supports.
• Achieve sustainability by seeking funding and legislative support for demonstration models,
providing training and technical assistance, and encouraging the facilitation of collaborative
partnerships among schools, families, youth and agencies.
• Foster a consistent application of best practice standards among schools, families and
agencies.
• Promote shared values that are consistently demonstrated through practice and partnerships of
schools, agencies and families.
• Develop and embed opportunities for collaboration between systems partners and families.
• Establish a dialogue that will inform ongoing training needs.
• Reduce fragmentation of training resources.
• Conduct cross-systems professional development to ensure a common language, knowledge
base, and understanding of supports and services available to children, youth and families.
• Develop a cross-systems/integrated planning process for individual child/family needs.
• Develop a cross-systems progress monitoring/data collection system to ensure accountability
to the academic achievement and well-being of all children, youth and families.
• Ensure that youth and families will have opportunities for meaningful participation in all
PAPBS Network activities, including the development, provision and monitoring of services,
policies and procedures.
Documentation of Training
Training to implement SWPBIS began in June 2007 with 28 schools. Six more schools
from the eastern region received the same initial training in January 2008 and, for evaluative
purposes, are included in the original cohort of SWPBIS schools. One school team removed
itself from the project during the first year due to a realignment of district priorities, and this
school is not included in any of the programmatic analyses. The focus of training and onsite
technical assistance during that period was on developing the infrastructure to implement
and sustain universal, tier 1 SWPBIS. Training for the subsequent 2008-2009 year focused on
tier 2 supports. From that point onward, training has revisited issues related to tiers 1 and 2 with
new training on intensive, tier 3 supports.
Beginning in 2009-2010, a larger second cohort of schools was trained on universal, tier
1 SWPBIS using the same model as with cohort 1. Consistent with the training for cohort 1
schools, these trainings were regionalized at the PaTTAN and IU offices.
Who Provided Training and Technical Support
Coordination of Pennsylvania's SWPBIS effort has been under the direct leadership of
the PAPBS Network in consultation with Dr. Lucille Eber, Marla Dewhirst, and Steve Romano
from the Illinois Positive Behavior Intervention and Support Network. PaTTAN and IU
consultants have provided the bulk of training and onsite technical assistance to implement
SWPBIS since Marla Dewhirst and Steve Romano provided initial training in 2007-2008. A
growing number of certified SWPBIS Facilitators from PAPBS Network affiliates has provided
training and technical assistance to newer schools.
Schools Receiving Training and Technical Support
A breakdown of participating SWPBIS schools, Local Education Agencies (LEAs), and
IUs is summarized in Table 1 by cohort and region of the Commonwealth. Totals indicated for
the combined cohorts are not arithmetic sums of the respective cohorts because some LEAs and
IUs have multiple schools in a cohort and/or schools in both cohorts. A visual display of
schools receiving training is presented in Figure 1.
Table 1
Participating Buildings / LEAs / IUs by Cohort and Region

                        West   Central   East   Total
Cohort 1
  Schools                12       4       17      33
  LEAs                    7       4       12      23
  Collaborating IUs       4       4        7      15
Cohort 2
  Schools                29      81       56     166
  LEAs                   14      31       23      68
  Collaborating IUs       4       8        8      20
Combined Cohorts
  Schools                41      85       73     199
  LEAs                   19      34       32      85
  Collaborating IUs       4       9        9      22
Interpretation of these data needs to be considered in the context of the varying internal
capacity of the three PaTTAN offices and 29 IUs to adequately support SWPBIS training and
implementation efforts. These data indicate that the number of schools trained in SWPBIS has
greatly expanded since the original 33 schools trained in summer 2007. As noted in the data
above, nearly 49% of cohort 2 schools came from the central region (N = 81). Approximately
34% of cohort 2 (N = 56) schools were from the eastern region. The western region was
represented by 29 schools in cohort 2, accounting for 17.5% of all cohort 2 schools. The
combined cohorts indicate the largest number of schools trained in SWPBIS is located in the
central region (N = 85), with the eastern region adding 73 schools out of the total 199 schools
trained in SWPBIS. Forty-one of all the schools trained in SWPBIS are located in the western
region. Readers of this report are reminded that these data indicate the number of schools
participating in the SWPBIS training and are not necessarily indicative of implementation status
subsequent to that training.
Figure 1
Map of PAPBS Network Schools
[Map of Pennsylvania showing the locations of cohort 1 and cohort 2 schools.]
The number of IUs collaborating with SWPBIS schools in the combined cohorts indicates
that expansion has occurred in the central and eastern regions with stable participation in the
western region. Closer inspection of these data indicates that every IU in the eastern region is
collaborating with at least one school trained in SWPBIS. All but two IUs in the central region
are likewise collaborating with a school trained in SWPBIS. Regarding the western region, IU
collaboration has remained unchanged across the two cohorts.
The grade-level composition of schools in the separate and combined cohorts is presented in Table
2. The total number of schools indicated in this table exceeds the 199 SWPBIS-trained schools
given that many schools span multiple grade ranges. Arbitrary categorization of preschool,
elementary, middle, and high school was made by these authors purely for reporting purposes.
The trend in schools trained in SWPBIS from cohort 1 continued with the second cohort. The
majority of trained schools educate students in the elementary grades (K-5). Middle schools
accounted for the largest minority of schools trained in SWPBIS, followed by high schools.
These data and trends are consistent with other large-scale implementation efforts in SWPBIS
(e.g., Bradshaw et al., 2010; Eber et al., 2010).
The relatively recent downward extension of SWPBIS into preschool settings, termed
Program-Wide Positive Behavior Support (PWPBS), is exciting in light of the unique setting,
services delivered, and resources available within early childhood programs (Fox & Hemmeter,
2009). Although this program evaluation report will not specifically focus on the effects of PWPBS
for a number of methodological and practical reasons, it is hoped that elements of this report
will provide compelling, albeit indirect, support for this effort.
Table 2
Number of Participating Buildings by Grade Level

                    Preschool   Elementary (K-5)   Middle (6-8)   High School (9-12)
Cohort 1                2              23                9                 5
Cohort 2               13             114               88                28
Combined Cohorts       15             137               97                33
Interested readers are directed to www.papbs.org for a complete list of PAPBS Network
schools. The total number of students attending these schools is approximately 118,000,
representing approximately 6% of the 1.78 million students educated in Pennsylvania's public
schools (PDE, n.d.).
Schools were strongly encouraged to collaborate with a community mental health agency
when designing and implementing the SWPBIS framework. These agencies are listed in Table
3, although some collaborating agencies may have been inadvertently omitted.
Staff that Attended Training
Data pertaining to the professional roles of SWPBIS training session attendees were not
available for detailed review. A fairly certain assumption is that the majority of SWPBIS
training attendees were general and special education teachers and building-level administrators
(e.g., principals, assistant principals). Related service providers, including school counselors,
school psychologists, school social workers, behavioral specialists, Title I staff, nurses, home
and school visitors, and central administrators, also attended many of the sessions. Non-school
district attendees included PaTTAN and IU consultants and administrators, members of the
PAPBS Network State Leadership Team (SLT), child and family advocates, higher education
faculty, and service providers from collaborating mental health agencies.
Table 3
Listing of Collaborating Mental Health Agencies
5Star
Achievement Center
Aldersgate Youth Services Bureau
Allegheny Children's Initiative
Alternative Community Resources Program, Inc.
Behavioral Specialists, Inc.
Bradley Center
CCRES, Inc.
Centre County Can Help Agency
Chester County ARC
Child and Family Focus
Child Behavioral Health
Child Guidance
Children's Aid Society
COMHAR, Inc.
Community Care Behavioral Health
Community Counseling
Comprehensive Counseling Services
Creative Health
D. T. Watson
Delaware Valley Children's Center
Devereux Center for Effective Schools
Family-Based Services
Family Links
Fellowship Health Resources
Genelle Sweetser, LCSW
Holcomb Behavioral Health
Lycoming Therapeutic
Mercer County Behavioral Health
Mercy Behavioral Health
MGC, Inc.
Milestones
Mon Yough Community Services
New Life Counseling
Northwestern Human Services
PA Counseling Service
Paoletta Mental Health
Pendell Mental Health
Penn Psychiatric
Ponessa
Pressley Ridge
Resolve Behavioral Health
Sharon Regional Health System
Southwest Behavioral Health
St. Anthony's Point
Staunton Clinic
Team Care Behavioral Health
Value Behavioral Health
Vocational and Psychological Services
Watson
Wesley Spectrum Services
Youth Advocate Programs, Inc.
Input of PA SWPBIS
The focus of this domain is on the content of professional training provided to schools
and the level and type of support provided to SWPBIS schools (Algozzine et al., 2010). A related
evaluative component in this domain, participants' perceived value of and satisfaction with
that training, could not be reviewed because the necessary data were unavailable.
Content of the Professional Training and Technical Support
Schools participating in the PAPBS Network received their training and onsite technical
assistance from at least one SWPBIS Facilitator. SWPBIS Facilitators agree to follow the same
general training and technical support framework endorsed by the PAPBS Network SLT which,
notably, was developed in consultation with PBIS Technical Assistance Consultants and is
available for review in the PBIS Professional Development Blueprint (Lewis, Barrett, Sugai, &
Horner, 2010). Delivery specifications are at the discretion of the SWPBIS Facilitator based on
training needs, location of training, size of the group, etc. A synopsis of training content is
provided in Table 4.
Level of Support Provided to Schools
Schools trained on SWPBIS typically attended large-group workshops on content
summarized in Table 4. Follow-up, targeted technical assistance was then provided by the
assigned SWPBIS Facilitator directly to schools on a regular basis. The focus of this support ran
the gamut from guiding the writing of behavioral lesson plans to facilitating efficient team
meetings using the Team-Initiated Problem Solving (TIPS) model (Newton, Todd, Algozzine,
Horner, & Algozzine, 2009). This technical assistance typically occurred once a month
during the initial year or two of SWPBIS framework development and implementation.
Onsite technical assistance was scaled back as schools improved and sustained high fidelity
implementation. By the time schools were implementing for two or three years, onsite technical
assistance by the SWPBIS Facilitator occurred very irregularly, perhaps only two or three times
a school year.
PaTTAN often compensated teams for travel expenses incurred in attending the large-group
workshops. A small number of schools were awarded School-Based Behavioral Health
Performance Grants to, among other things, support implementation and maintenance of a
SWPBIS framework. These competitive grants have been available each academic year from
2007-2008 to the present.
Table 4
Sample Content of Training for SWPBIS Implementation

Year I (Pre-Implementation)
  Day 1: Overview of SWPBIS; Data Review; Expectations; Behavioral Matrix; Acknowledgement Systems
  Day 2: Academic and Behavior Connection; Lesson Planning; Action Planning and Next Steps
  Day 3: Success and Challenges; Data-Based Decision-Making; Classroom- vs. Office-Managed Behavior; Procedures for Dealing with Problem Behavior; Defining Majors and Minors; Planning for Kick-Off

Year I (Initial Implementation)
  Day 1: Classroom Management; Expectations; Routines; Active Engagement; Acknowledgement Systems; Behavior Reduction Strategies
  Day 2: Behavior Reduction Strategies; De-escalation Techniques
  Day 3: Bully Prevention; Integration with SWPBIS

Year II (Post-Implementation)
  Day 1: Building Tier 2/3 Systems; Universal Screening; CICO / BEP; Initial Line of Inquiry; Simple FBA
  Day 2: BEP / CICO; Working Smarter Not Harder
  Day 3: PAPBS Data Evaluation Plan; Team Initiated Problem Solving
Fidelity of PA SWPBIS
Of particular interest is how many schools implement with fidelity upon receipt of initial
SWPBIS training, and how long it takes schools to reach full implementation status.
Results reported in the previous executive summary documented that full implementation in cohort 1
schools occurred within one to three years, although the majority of schools took at least two years
(Runge & Staszkiewicz, 2011).
Fidelity refers to the relative match between how schools implement SWPBIS and the
original design presented in large-scale trainings. As Algozzine et al. (2010) stated, fidelity is
"the extent to which professional development has resulted in change in the practices used in
participating schools" (p. 12) relative to core features of SWPBIS.
Universal, tier 1 SWPBIS fidelity was assessed using three different measures: the SET,
BoQ, and TIC. The distinction between partially and fully implementing schools was important
given previous program evaluation results indicating differential effects of SWPBIS as a
function of the degree to which the framework was implemented as designed (Childs, Kincaid, &
George, 2010; Runge & Staszkiewicz, 2010, 2011).
Cross sectional fidelity data for cohorts 1 and 2 are presented in Figures 2 and 3,
respectively. Data from cohort 1 suggest that high fidelity implementation gradually increased
for the first two school years after initial training, with a peak of 22 schools (66.7%) fully
implementing SWPBIS by spring 2009. At the same time, an additional 24.2% of schools (N = 8)
were designated as partially implementing. Since spring 2009, a downward trend is observed in
the number of cohort 1 schools fully implementing SWPBIS, accompanied by
an upward trend in the number of schools for which fidelity data were
unavailable. It is not entirely clear how to interpret missing data as such an occurrence could
reflect one of two events: (1) the school is not implementing SWPBIS and thus did not complete
fidelity checks; or (2) the school is implementing at some level but failed to submit their fidelity
data for analytic purposes. In either case, valid interpretation of the level of implementation
cannot be made when data are missing. A consistent downward trend across five years is
observed relative to the number of schools designated as partially implementing, with one school
designated as partially implementing four years after the initial training. Data from cohort 2
schools are similar to those of cohort 1: an increasing number of schools achieved full SWPBIS
implementation status each year subsequent to initial training.
Figure 2
Cross Sectional Analysis of Cohort 1 Implementation Fidelity
[Stacked bar chart: number of schools (N = 33) at full, partial, not implementing, or unknown
status at each time point from spring 2007 (before training) through spring 2011.]
Figure 3
Cross Sectional Analysis of Cohort 2 Implementation Fidelity
[Stacked bar chart: number of schools (N = 166) by implementation status in spring of each year:
  Spring 2009 - Full: 0; Partial: 1; Unknown + Not: 165
  Spring 2010 - Full: 6; Partial: 29; Unknown + Not: 131
  Spring 2011 - Full: 33; Partial: 26; Unknown + Not: 107]
Longitudinal analysis of fidelity for both cohorts is presented in Figures 4 and 5. Note
that only schools that were not implementing before the initial training and for which complete
longitudinal data were available were included in this analysis. Data from cohort 1 schools
indicate that the majority of schools achieved full implementation status within one academic
year of initial training, and all 10 schools achieved full implementation status within four years
of initial training. Only a small number (N = 3) of schools were able to achieve full
implementation in just a few months post initial training.
Figure 4
Longitudinal Analysis of Cohort 1 Implementation Fidelity
[Stacked bar chart, N = 10. Number of schools with complete longitudinal data classified as not,
partially, or fully implementing at spring 2007 (before training) and at fall 2007 through spring
2011 (after training).]
Complete longitudinal data from spring 2010 to spring 2011 were available for 23
schools in cohort 2. Seventeen (73.9%) of these schools were partially implementing SWPBIS
in spring 2010 with an additional five (21.7%) established at full implementation. One school
was not implementing in spring 2010. Just one year later, a majority of these schools were fully
implementing (N = 13; 56.5%). The remaining 10 (43.5%) were partially implementing.
In summary across cohorts, it appears that high fidelity SWPBIS can be implemented
relatively soon after initial training although it takes most schools two or three years to achieve
such designation. These summative conclusions, however, are made with some caution as the
absence of data from large numbers of schools, especially from cohort 2, might temper the above
conclusions. For example, it is equally plausible to conclude that a large number of schools do
not implement after initial training. This alternative conclusion suggests that training does not
necessarily result in high- or even low-fidelity implementation. What appears to matter more is
onsite technical assistance and regular coaching, both of which are embedded within the
PAPBS Network plan for implementing and sustaining SWPBIS on a large scale.
Figure 5
Longitudinal Analysis of Cohort 2 Implementation Fidelity
[Stacked bar chart, N = 23. Number of schools with complete longitudinal data classified as not,
partially, or fully implementing at spring 2010 and spring 2011.]
Impact of PA SWPBIS
The PAPBS Network SLT and PaTTAN identified the specific direct and indirect
variables thought to be affected by SWPBIS. These results are reviewed within this section and
represent a variety of behavioral and academic outcomes. Many of the results reported below are
from cohort 1 schools only because few cohort 2 schools had submitted these data at the time of
this report. Moreover, data from cohort 2 schools represented only one year of baseline.
Staff Perceptions of Status of Behavioral Support
Schools were asked to complete the Effective Behavior Support: Self-Assessment Survey
(EBS: SAS; Sugai, Horner, & Todd, 2003) each year as a means of evaluating staff perceptions
about the current status of, and need for improving, behavioral support at the School-Wide, Non-Classroom, Classroom, and Individual Student levels. The only data available for this report were
those related to the School-Wide dimension of the EBS: SAS.
Staff perceptions of status of behavioral support at the school-wide level. Staff were
asked to select one of three levels of implementation they believed their school had achieved at
that particular point in time. The three choices were full (fully implemented), partial (partially
implemented), or none (not implemented). It is important to note that these data represent staff
perceptions and are not necessarily reflective of actual level of implementation of SWPBIS. As
such, the EBS: SAS is not considered a tool to evaluate SWPBIS implementation.
Eight cohort 1 schools provided EBS: SAS and fidelity data for four consecutive years,
and this allowed for a longitudinal assessment of staff perceptions as a function of
implementation fidelity. In this analysis, all eight schools were not implementing SWPBIS in
2006-2007 and fully implementing three years after initial training. Figure 6 contains the results
of this longitudinal comparison. These data confirm that staff perceptions of SWPBIS
implementation status were directly related to the actual fidelity with which the framework was
implemented. A repeated measures analysis of variance for this four-year period confirms that
this is a statistically significant change, F(1, 7) = 5.29, p < .007. Follow-up pairwise
comparisons revealed that two pairs differed significantly: 1 Year with 2 Years, and
Pre-Implementation with 3 Years. In other words, the change from pre-implementation to the
first year of implementation was not significant, but after the first year of implementation, staff
were much more likely to indicate that
universal, School-Wide support systems were fully implemented. Moreover, after three years of
full implementation, a significant proportion of staff viewed SWPBIS as fully established as
compared to perceptions at pre-implementation. Results from cross sectional analyses from
larger numbers of schools, although not reported in this executive summary, were comparable to
these longitudinal analytic results. This finding confirms the hypothesis that faculty perceptions
of full SWPBIS implementation are consistent with more objective evaluations of
implementation via the SET, BoQ, or TIC. As schools fully implement SWPBIS, their staff are
aware of their success in implementing this framework across all settings in a school building.
Figure 6
Longitudinal Comparison of Self-Report Pre- and Post-Implementation Level for "School Wide"
– Cohort 1
[Bar chart, N = 8. Percentage of respondents at Pre-Implementation, 1 Year, 2 Years, and 3
Years reporting School-Wide support as fully implemented (63.58%, 59.54%, 71.47%, 76.70%),
partially implemented (27.82%, 31.14%, 24.37%, 19.41%), and not implemented (8.60%, 9.33%,
4.16%, 3.89%).]
Staff perceptions of need for improvement in support at the school-wide level. In
addition to being asked for perceptions of the extent of behavioral support implementation,
participating staff were also asked to rate how important it was to improve school-wide systems
of support at their respective schools. Once again, these data were collected
each spring. A longitudinal comparison was performed on the eight cohort 1 schools for which
complete EBS: SAS and fidelity data were available in each of the four years. Figure 7 contains
the summary for these schools. As evident in the graph, the percentage of respondents rating
improvement as a low priority rose from 50.64% to 64.07% across the four years.
Figure 7
Longitudinal Comparison of Self-Report Pre- and Post-Implementation for Importance of
Improving "School Wide" – Cohort 1
[Bar chart, N = 8. Percentage of respondents at Pre-Implementation, 1 Year, 2 Years, and 3
Years rating improvement as a low priority (50.64%, 44.41%, 57.33%, 64.07%), medium priority
(35.69%, 38.79%, 31.20%, 28.49%), and high priority (13.68%, 16.80%, 11.48%, 7.44%).]
The repeated measures analysis of variance confirms that for these eight cohort 1 schools
the changes over time are most likely not due to chance, F(1, 7) = 5.421, p = .006. Follow-up
paired comparisons revealed these significant paired differences: Pre-Implementation and 3
Years; Year 1 and Year 2; Year 1 and Year 3. These data suggest that as schools sustain full
implementation of SWPBIS, as documented by objective measures, staff perceived the need for
professional development dedicated specifically to universal, tier 1 SWPBIS as less important. It
is believed that, as schools sustain full SWPBIS implementation, staff perceptions regarding
areas in need of professional development switch to nonclassroom settings and individual
students. This supposition, however, is an educated speculation in the absence of any data to
support or refute this claim.
Differences in staff perceptions on status and need for improvement in partially
implementing schools compared to fully implementing schools. Staff perceptions of the
integrity with which the SWPBIS framework was implemented and priorities for improvement
were found to correlate with the actual level of implementation confirmed via objective means.
Schools objectively designated as fully implementing SWPBIS had staff who reported
significantly higher confidence in the integrity with which SWPBIS was implemented than staff
in schools objectively classified as partially implementing SWPBIS. That is, staff in
schools that, according to objective assessments of fidelity, were partially implementing
SWPBIS recognized and reported a significantly lower level of implementation integrity and
indicated a significantly higher degree of importance placed on improving SWPBIS fidelity. As
expected, once a school reached full implementation status, as documented via SET or BoQ,
staff overwhelmingly recognized this accomplishment and reported a shift in professional
development needs away from the universal SWPBIS framework toward other levels of behavior
support within a comprehensive PBIS model.
Staff Perceptions of School Safety
Existing research suggests that SWPBIS schools have staff that report fewer risk factors
within the school building and the surrounding community. Such risk factors include drug and
gang activity, vandalism, truancy, community poverty and crime, and instances of child abuse.
Concurrently, staff at schools implementing SWPBIS perceive increased protective factors
within their school building and the surrounding community over time. Examples of protective
factors include opportunities for students to engage in extracurricular activities, parental
involvement, school-community collaboration, acceptance of diversity, and high expectations for
student learning and productivity (e.g., Eber, Lewandowski, Hyde, & Bohanon, 2008).
Personnel at participating schools were asked to voluntarily and anonymously complete
the School Safety Survey (SSS; Sprague, Colvin, & Irvin, 2002) each year after receiving the
initial training. Larger amounts of data were available using a cross sectional approach in cohort
1 schools; however, only results from longitudinal analyses are reported within this executive
summary given that statistical analyses can be performed on these more complete data sets.
Cross sectional descriptions, nonetheless, were found to parallel those from longitudinal
analyses.
Complete longitudinal SSS and fidelity data for 10 cohort 1 schools were available for
analysis across a four-year span. In this analysis, all schools were designated as not
implementing in spring 2007 and fully implementing three years after initial training. The mean
percentage of Risk Factors and Protective Factors for the 10 schools that submitted complete
data are presented in Figures 8 and 9, respectively.
The mean Risk Factor scores for the 10 schools were 41.04%, 41.68%, 39.13%, and
39.03% across the four years. This trend is in the expected direction, with the average Risk
Factor score dropping by 2.01 percentage points from pre-implementation to year three of
implementation. Results of a one-way repeated measures analysis of variance, however, were not
statistically significant, indicating that, although in the desired direction, the changes may be due
to chance.
Mean Protective Factors for the 10 schools rose from 78.02% to 79.30% across the four
years. As with Risk Factors, the trend in Protective Factors was not significant at the .05 level,
but was in the expected direction. Follow-up paired sample t-tests revealed some significant
improvements in staff perceptions of Protective Factors in Years 2 and 3 compared to the first
year of full implementation. These results suggest that as a school fully implements SWPBIS, its
staff perceive more protective factors to exist in the school and local community.
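The follow-up comparisons above rest on paired-sample t statistics. As a minimal sketch of how such a statistic is computed (the school scores below are hypothetical, not the study's SSS data):

```python
import math
import statistics

def paired_t(before, after):
    """Paired-sample t statistic: mean of the differences over its standard error."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical Protective Factor percentages for five schools,
# comparing Year 1 of full implementation with Year 3.
year1 = [74.0, 78.5, 75.2, 80.1, 76.0]
year3 = [79.3, 80.0, 78.8, 82.4, 77.5]
t_stat = paired_t(year1, year3)  # positive t indicates an increase over time
```

In practice the statistic is compared against a t distribution with n - 1 degrees of freedom to obtain a p value.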
A final note about the data is related to the ratio of Protective Factors to Risk Factors
(Figure 10). It can be argued that as this Protective to Risk Ratio gets larger, the school becomes
safer. As expected given the independent trends in Risk and Protective Factors, the ratio of
Protective Factors to Risk Factors increases over time. That is, as SWPBIS was implemented for
longer periods of time, the proportion of Protective Factors increased in comparison to Risk
Factors.
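The ratio described above is simply the quotient of the two mean percentages. A minimal sketch using the means reported in Figures 8 and 9 (small discrepancies from the figure values reflect rounding of the reported means):

```python
# Mean Risk and Protective Factor percentages (Figures 8 and 9).
risk = {"Pre": 41.04, "1 Year": 41.68, "2 Years": 39.13, "3 Years": 39.03}
protective = {"Pre": 78.02, "1 Year": 76.54, "2 Years": 78.59, "3 Years": 79.30}

# Protective-to-Risk ratio; a larger ratio suggests a safer school.
ratios = {t: round(protective[t] / risk[t], 2) for t in risk}
# e.g. ratios["Pre"] -> 1.9 and ratios["3 Years"] -> 2.03
```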
Figure 8
Longitudinal Comparison of Average Percentage of Risk Factors – Cohort 1
[Bar chart, N = 10. Average percentage of Risk Factors: Pre-Implementation 41.04%, 1 Year
41.68%, 2 Years 39.13%, 3 Years 39.03%.]
Figure 9
Longitudinal Comparison of Average Percentage of Protective Factors – Cohort 1
[Bar chart, N = 10. Average percentage of Protective Factors: Pre-Implementation 78.02%, 1
Year 76.54%, 2 Years 78.59%, 3 Years 79.30%.]
Figure 10
Ratio of Average Protective to Risk Factors in Implementing Schools – Cohort 1
[Bar chart, N = 10. Average Protective-to-Risk ratio: Pre-Implementation 1.90, 1 Year 1.83, 2
Years 2.01, 3 Years 2.03.]
Staff perceptions of school safety as a function of SWPBIS fidelity. Additional
analyses were conducted to evaluate the differences between schools that partially and fully
implement SWPBIS with regard to the Risk and Protective Factor scores. The t-test for
independent samples revealed statistically significant differences in Risk and Protective Factors
by SWPBIS fidelity after one and two years of implementation. In other words, for two
consecutive years after the initial training, the schools identified as fully implementing SWPBIS
reported significantly more Protective Factors and fewer Risk Factors than the schools identified
as partially implementing SWPBIS. It thus appears that Protective Factors increase and Risk
Factors decrease as a function of the fidelity of SWPBIS implementation.
Student and Staff Attendance
Cross sectional and longitudinal analyses of average daily student and staff attendance
rates in cohort 1 schools were not significant. While changes over time may not be significant,
the non-significant findings may be a function of ceiling effects at baseline when schools were
already observing rather high rates of student and staff attendance.
Office Discipline Referrals
Cross sectional review. A cross sectional view of ODRs for cohort 1 schools as a
function of SWPBIS fidelity is found in Figure 11. To control for school size and the length of
the school year, raw data obtained from schools were converted to a common metric: ODRs per
100 students per school day.
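As a sketch of that conversion (the referral and enrollment figures below are hypothetical, not from a participating school):

```python
def odr_rate(total_odrs: int, enrollment: int, school_days: int) -> float:
    """Convert raw ODR counts to ODRs per 100 students per school day."""
    return total_odrs / (enrollment / 100) / school_days

# Hypothetical school: 972 referrals, 600 students, 180-day school year.
rate = odr_rate(972, 600, 180)  # -> 0.9 ODRs per 100 students per school day
```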
Figure 11
Cross Sectional Average ODRs/100 Students/School Day - Cohort 1
[Bar chart comparing average ODRs per 100 students per school day in partially versus fully
implementing schools before training (N = 8, N = 2) and 1 year (N = 5, N = 10), 2 years (N = 4,
N = 15), 3 years (N = 2, N = 11), and 4 years (N = 8) after initial training; values range from
1.060 before training to 0.244, with fully implementing schools averaging fewer ODRs at each
point.]
While less than half of the schools provided data in each year, it is clear that fully
implementing schools average fewer ODRs per 100 students per school day than do partially
implementing schools. These results suggest that, relative to baseline rates, schools that fully
implement SWPBIS see greater reductions in ODR rates than schools that partially implement
SWPBIS.
Longitudinal analyses. Complete longitudinal pre- and post-implementation fidelity and
ODR data were available for four schools in cohort 1 from 2006-2007 through 2009-2010. This
analysis is perceived to be the purest indication of the effects of SWPBIS on ODR rates given
that both fidelity and ODR data were available across multiple years. ODRs expressed as the
average number of ODRs per 100 students per 180 school days are presented in Figure 12.
Repeated measures analysis of variance (ANOVA) on these data revealed no significant
difference from baseline through 3 years after initial training. The small number of schools for
which complete longitudinal data were available greatly reduced the power with which
significant differences could be detected, however. So although results cannot be generalized
beyond these four schools, it is important to note that for them, implementation of SWPBIS had a
real and substantial effect on ODRs, with dramatic decreases observed in the initial year of
implementation and sustained decreases across multiple years.
Figure 12
ODR Rates from Pre- to Post-Implementation - Longitudinal Analysis Cohort 1
[Bar chart, N = 4. Number of ODRs per 100 students per academic year: Baseline 254.0, 1 Year
63.6, 2 Years 59.9, 3 Years 18.3.]
The data in Figure 12 clearly indicate a practical decrease in the initial year of full
implementation with an overall downward trend across time. These four schools saw a reduction
from 254.0 ODRs per 100 students per school year during baseline to a rate of 18.3 ODRs per
100 students per school year after three years of implementation.
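Expressed as a relative change, those two endpoints imply a reduction of roughly 93%:

```python
baseline, year3 = 254.0, 18.3  # ODRs per 100 students per academic year
pct_reduction = (baseline - year3) / baseline * 100
# -> approximately 92.8% fewer ODRs after three years of full implementation
```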
ODR rates as a function of improved fidelity. Complete longitudinal data from six
elementary schools in cohort 2 were available for analysis regarding how ODR rates change as a
function of improved SWPBIS integrity. These results, presented in Figure 13, demonstrate that
as schools improve SWPBIS fidelity, their ODR rates actually significantly increase in the first
year of full implementation. It is hypothesized that this significant increase in ODR rates as
schools fully implement SWPBIS may be related to a heightened awareness and consistency
among faculty regarding which behaviors should be subject to administrator management. One of
the key features of universal SWPBIS is a process by which school staff agree to apply
classroom-removal consequences consistently for the same behaviors across settings. This process, in
turn, may explain the significant increase in ODR rates after one year of full implementation in
elementary schools.
Figure 13
ODR Rates as Elementary Schools Improve Fidelity - Cohort 2
[Bar chart, N = 6. ODRs per 100 students per school day: 0.35 at partial implementation
(2009-2010) versus 0.71 at full implementation (2010-2011).]
Recoupment of time. An analysis of the time regained as a result of reductions in ODRs
suggested by Scott and Barrett (2004) was conducted using the complete longitudinal data from
the four schools in cohort 1. Conversion of minutes and hours to days was estimated based on an
8-hour workday for administrators and teachers and a 6-hour school day for students. Assuming
these conservative estimates of time and using the average ODR rate per 100 students per
academic year, the estimated savings are offered in Table 5.
These data suggest rather robust savings in administrators', teachers', and students' time
otherwise lost to disruptive behavior. On average, administrators regained 4.3 work days per
100 students in an academic year. This regained time allows principals to provide more
instructional supervision and accomplish other administrative duties. Likewise, teachers
regained that same amount of time, thus providing them with more time for instruction and
instructional planning. The savings for students amount to 11.6 instructional days recouped for
every 100 students in a building.
Table 5
Estimated Time Saved for Administrators, Teachers, and Students When Fully Implementing
SWPBIS

                                                                        Average
                                Baseline   1 Year    2 Years   3 Years  Yearly Savings
Average ODR/100 Students/Year      254        64        60        18
  (Savings from Baseline)                   (190)     (194)     (236)      (207)
Administrator Minutes            2,540       640       600       180
  (Savings from Baseline)                 (1,900)   (1,940)   (2,360)    (2,067)
  Hours                           42.3      10.7        10         3
  (Savings from Baseline)                  (31.6)      (32)      (39)     (34.2)
  Days                             5.3       1.4       1.3       0.4
  (Savings from Baseline)                   (3.9)       (4)     (4.9)      (4.3)
Teacher Minutes                  2,540       640       600       180
  (Savings from Baseline)                 (1,900)   (1,940)   (2,360)    (2,067)
  Hours                           42.3      10.7        10         3
  (Savings from Baseline)                  (31.6)      (32)      (39)     (34.2)
  Days                             5.3       1.4       1.3       0.4
  (Savings from Baseline)                   (3.9)       (4)     (4.9)      (4.3)
Student Minutes                  5,080     1,280     1,200       360
  (Savings from Baseline)                 (3,800)   (3,880)   (4,720)    (4,133)
  Hours                           84.7      21.3        20         6
  (Savings from Baseline)                  (63.4)    (64.7)    (78.7)     (68.9)
  Days                            14.2       3.6       3.3         1
  (Savings from Baseline)                  (10.6)    (10.9)    (13.2)     (11.6)

Note. N = 4; Results are computed using the average ODR / 100 Students / Academic Year;
Estimated time for Administrators = 10 minutes per ODR; Estimated time for Teachers = 10
minutes per ODR; Estimated time for Students = 20 minutes per ODR; 8-hour workday assumed
for Administrators and Teachers; 6-hour school day assumed for Students.
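The conversions in Table 5 can be reproduced from the assumptions stated in the note (10 minutes per ODR for administrators and teachers, 20 minutes for students; 8-hour workdays and 6-hour school days). A minimal sketch:

```python
# Assumptions taken from the Table 5 note.
MINUTES_PER_ODR = {"administrator": 10, "teacher": 10, "student": 20}
MINUTES_PER_DAY = {"administrator": 480, "teacher": 480, "student": 360}

def days_recouped(baseline_odr: float, current_odr: float, role: str) -> float:
    """Days regained per 100 students when ODRs drop from a baseline rate."""
    saved_minutes = (baseline_odr - current_odr) * MINUTES_PER_ODR[role]
    return round(saved_minutes / MINUTES_PER_DAY[role], 1)

# Three years after training: 254 -> 18 ODRs per 100 students per year.
admin_days = days_recouped(254, 18, "administrator")  # -> 4.9
student_days = days_recouped(254, 18, "student")      # -> 13.1
```

Table 5 reports 13.2 student days for this comparison, which appears to come from subtracting the rounded day totals (14.2 - 1.0) rather than converting the raw minutes.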
ODR Triangle Data. An analysis of ODR Triangle Data is frequently cited in the
SWPBIS literature. Using the student population of a building, the ODR Triangle Data represent
the percentage of the total student body who receive 6+ ODRs, 2-5 ODRs, and 0-1 ODR in a
school year. Results from national studies (e.g., Kaufman et al., 2010; Spaulding et al., 2010)
indicate significant differences in ODR Triangle Data related to elementary versus secondary
schools. Comparison of cross sectional mean ODR Triangle Data for cohort 1 as a function of
grade level (i.e., elementary v. secondary) revealed statistically significant differences.
Therefore, subsequent analyses were disaggregated by grade level. Additionally, note that
pre-implementation ODR data were not available for any school, thus comparisons as a function
of implementation integrity are not made.
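A minimal sketch of how these three tiers might be tallied from per-student ODR counts (the counts below are illustrative, not study data):

```python
def odr_triangle(odr_counts):
    """Percentage of students with 0-1, 2-5, and 6+ ODRs in a school year."""
    tiers = {"0-1": 0, "2-5": 0, "6+": 0}
    for count in odr_counts:
        if count <= 1:
            tiers["0-1"] += 1
        elif count <= 5:
            tiers["2-5"] += 1
        else:
            tiers["6+"] += 1
    n = len(odr_counts)
    return {tier: round(100 * k / n, 1) for tier, k in tiers.items()}

# 200 hypothetical students: 190 with 0-1 ODRs, 8 with 2-5, 2 with 6+.
sample = [0] * 150 + [1] * 40 + [3] * 8 + [7] * 2
shares = odr_triangle(sample)  # -> {"0-1": 95.0, "2-5": 4.0, "6+": 1.0}
```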
Elementary schools. Longitudinal analyses of ODR Triangle Data in cohort 1
elementary schools resembled results from cross sectional analyses. Complete longitudinal data
for fully implementing schools were available from six elementary schools. Average percentage
of students earning zero or one ODR, two to five ODRs, and six or more ODRs are displayed in
Figure 14. These data indicate that elementary schools that consistently implement SWPBIS over
multiple years observe approximately 95% of all students earning zero or one ODR in a year,
with 3-4% of students receiving two to five ODRs in a year. Students with the most frequent
disruptive behavior in the school setting account for just over 1% of the entire student
population.
Figure 14
Longitudinal ODR Triangle Data for Cohort 1 Elementary Schools
[Stacked bar chart, N = 6. Percentage of all students at 1, 2, and 3 years of full SWPBIS
implementation: 0-1 ODR 95.19%, 94.80%, 95.12%; 2-5 ODRs 3.46%, 3.85%, 3.66%; 6+ ODRs
1.35%, 1.35%, 1.22%.]
Secondary schools. A longitudinal approach to analyzing ODR Triangle data in
secondary schools could not be accomplished given limited data and the need to maintain school
anonymity. Therefore, a cross sectional approach was employed. ODR Triangle Data for cohort
1 secondary schools is presented in Figure 15.
Within the cross sectional analyses of cohort 1 secondary schools, an average of 82% of
all secondary students received zero or one ODR in a year. On average, approximately 11% and
6% of secondary students received two to five and six or more ODRs, respectively, in an
academic year. These data are slightly better than those reported by Spaulding et al. (2010)
using a much larger national dataset. Overall, PAPBS Network schools' data suggest that
secondary schools implementing SWPBIS will observe large majorities of students exhibiting
few, if any, disruptive behaviors. Moreover, only a small percentage of secondary students
chronically display disruptive behavior.
Summary of ODR data. When SWPBIS effects on ODRs are compared across grade
levels, evidence suggests that improvements are more pronounced in the earlier grades. Most
notably, ODR Triangle Data were significantly more favorable at the elementary level compared
to the secondary level. Statistically more elementary students received zero or one ODR
compared to their older peers. Likewise, elementary schools implementing SWPBIS
experienced far fewer percentages of students demonstrating occasional or chronic disruptive
behavior.
Figure 15
Cross Sectional Triangle Data for Cohort 1 Secondary Schools
[Stacked bar chart. Percentage of all students at 1 year (N = 4), 2 years (N = 3), and 3 years
(N = 2) of full SWPBIS implementation: 0-1 ODR 79.82%, 81.43%, 84.74%; 2-5 ODRs 11.51%,
12.72%, 9.68%; 6+ ODRs 8.68%, 5.85%, 5.58%.]
After multi-year full implementation of SWPBIS, daily ODR rates decrease substantially.
Additionally, large majorities of elementary students receive one or no ODRs in an academic
year. Very small percentages, approximately 5% of all students, receive more than one ODR
while an elementary school sustains full SWPBIS implementation. The net result is that
considerable administrative and instructional time is recouped when SWPBIS is implemented.
Moreover, students spend more time in the instructional setting, thus increasing opportunities for
learning academic content.
Results at the secondary level were comparable, although these conclusions are not as
strong given smaller sample sizes. Sustained SWPBIS implementation appears to result in
approximately 82% of all students receiving one or no ODR, with 11% and 7% of the remaining
student body receiving 2-5 ODRs and 6+ ODRs, respectively, in an academic year.
Out of School Suspensions
Implementation of high fidelity SWPBIS has been shown to decrease the prevalence of
exclusionary disciplines such as out-of-school suspensions (OSS; Bradshaw et al., 2010; Luiselli
et al., 2005; Muscott et al., 2008). Comparisons of OSS rates disaggregated by elementary and
secondary schools were conducted given work by Spaulding et al. (2010) who noted significantly
higher rates of OSS in secondary schools compared to elementary schools. Notably, three years
after initial implementation, secondary schools in the PAPBS Network used OSS at a statistically
higher rate (35.79 OSS Days per 100 students) than elementary schools (4.51 OSS Days per 100
students).
Elementary schools. Longitudinal OSS rate data from three cohort 1 elementary schools
were available for four consecutive years. Schools included in these analyses were not
implementing SWPBIS at baseline (2006-2007) and maintained full implementation status for
three consecutive years following initial training. Data are presented in Figure 16 with a
superimposed linear trendline.
Figure 16
Longitudinal OSS Rates in Three Elementary Schools - Cohort 1
[Bar chart with linear trendline, N = 3. Days of OSS served per 100 students: Baseline 3.87, 1
Year 3.93, 2 Years 3.38, 3 Years 1.35.]
Results from repeated measures ANOVA were not statistically significant, thus limiting
the generalizability of these results to other schools. For these three schools, however, a
downward trend is observed from pre-implementation to three years of full implementation. At
pre-implementation, an average of 3.87 days of OSS were served per 100 students in a year; but
at the third year of full implementation, rates were reduced to 1.35 days of OSS served per 100
students. For these schools, this translates into increased instructional time for students, less
classroom time spent dealing with major behavioral violations, and less administrative time spent
processing out-of-school suspensions.
Secondary schools. Complete longitudinal OSS data for cohort 1 secondary schools
across multiple years were available for two schools. Given that data submission was predicated
on the researchers maintaining the anonymity of schools, longitudinal analyses were not
conducted.
Referrals to / Eligibility for Special Education
It is hypothesized that implementation of a SWPBIS framework will affect the number of
students referred for a special education eligibility evaluation and the number of students
declared eligible for special education and related services. To date, no empirical evidence has
demonstrated such a relationship; however, from a theoretical perspective, it is believed that as
an entire school staff is better able to accommodate students with special education needs in
general education settings, fewer students will require restrictive special educational services.
Differences in requests for evaluations to determine special education eligibility between
partially and fully implementing SWPBIS schools within each year were not significant;
therefore, data were aggregated across levels of implementation integrity. Three years after
initial training, a pattern of referrals for special education evaluation and identification for
special education services is not apparent.
Complete longitudinal data for referral and identification rates for special education were
available from two elementary schools. These data lend themselves to detection of trends across
time. The two schools included in these analyses were not implementing SWPBIS at baseline
(2006-2007) and maintained full implementation status for three consecutive years following
initial training. Data are presented in Figure 17.
Figure 17
Longitudinal Rate of Referrals to Special Education and Students Newly Identified for Special
Education Services in Two Elementary Schools
[Chart, N = 2. Rates per 100 students from baseline through 3 years after initial training.
Referrals rose from 6.09 at baseline to a peak of 7.6 at 1 year before declining to 2.98 and 2.81;
newly identified students rose from 2.18 to 2.64 before declining to 2.07 and 1.91.]
Statistically significant changes over time were not observed, although such a finding
would be extremely difficult to detect given the small sample size. For these two schools,
however, a decreasing trend for both referrals and newly identified students is observed after an
initial increase in the first year after initial training. Similar findings were noted when
incomplete four-year longitudinal data were analyzed from seven cohort 1 schools. It is
hypothesized that the initial increase observed in Year 1 was due, in part, to a heightened
awareness among all staff about the behavioral needs of all students. Once school staff were
comfortable with their own skills and the school's capacity to address the vast majority of
students' needs, declining rates of referrals for special education eligibility and of students newly
identified for special education and related services were observed in the subsequent two years.
Generalization of these results to other schools, however, cannot be made.
Special Education Placements in the Least Restrictive Environment
Educating students with disabilities in the least restrictive environment (LRE) is a
fundamental right of all students and an important goal that schools attempt to achieve.
Educating a student in his or her LRE is a determination that is made by each Individualized
Education Program (IEP) team and is a function of the child's needs and the capacity for the
regular education environment to meet those needs effectively and efficiently. A school that,
prior to SWPBIS implementation, did not have staff that could prevent or intervene effectively
with students who presented with significant behavioral challenges would traditionally
implement an IEP that placed that child in a very restrictive setting. Once staff systematically
adopted SWPBIS principles, that same student may be included with non-disabled peers more
frequently because the staff have the skills and competencies necessary to effectively educate
that student in a less restrictive setting. Therefore, it is hypothesized that SWPBIS contributes to
improved LRE indicators for that building. This theoretical implication of SWPBIS on LRE has
yet to be empirically tested in the scholarly literature.
Longitudinal analyses, the ideal method to evaluate the efficacy of SWPBIS, could not be
performed given incomplete data. Cross-sectional comparisons were therefore employed. In
these comparisons, the percentage of students with disabilities in three broad categories of LRE
was averaged across all schools for which data were available in each year. The average
percentage of all students with IEPs educated in the least restrictive placement (≥80% of the
school day included with non-disabled peers), moderately restrictive placement (40-79% of the
school day), and most restrictive placement (<40% of the school day) are presented in Figure 18.
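The banding and school-level averaging described above can be sketched as follows. This is a minimal sketch: the function names and the example placements are illustrative assumptions, not drawn from the evaluation dataset.

```python
def lre_band(pct_with_peers: float) -> str:
    """Classify a placement by the percentage of the school day the
    student spends with non-disabled peers, using the three bands."""
    if pct_with_peers >= 80:
        return ">=80%"      # least restrictive placement
    elif pct_with_peers >= 40:
        return "40-79%"     # moderately restrictive placement
    else:
        return "<40%"       # most restrictive placement

def school_band_percentages(placements):
    """Percentage of a school's students with IEPs in each LRE band."""
    counts = {">=80%": 0, "40-79%": 0, "<40%": 0}
    for pct in placements:
        counts[lre_band(pct)] += 1
    n = len(placements)
    return {band: 100.0 * c / n for band, c in counts.items()}

# Hypothetical school with six students holding IEPs
example = school_band_percentages([95, 85, 70, 50, 30, 90])
```

The cross-sectional comparison then averages these per-school percentages over all schools reporting in a given year.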
Cross-sectional LRE data were available for 10 to 12 schools across the baseline and
three years after initial training, although it is important to note that different schools reported
data for any given year. Statistically significant differences were observed between non- and
fully-implementing schools during baseline for two of the three LRE categories: ≥80% of day
included with non-disabled peers and 40-79% of the day included with non-disabled peers.
Similarly significant differences were noted for 1 Year post-initial training on the least restrictive
setting, ≥80% of day included with non-disabled peers. Although a statistically significant
difference was likewise noted for this same reporting category 2 Years post-initial training, these
interpretations are cautiously offered given that there was only one partially implementing school
for which LRE data were available at that time.
Figure 18
Cross Sectional Comparisons by LRE Reporting Category and SWPBIS Fidelity - Cohort 1

[Stacked bar chart: average percentage of students with IEPs in each LRE category (≥80%,
40-79%, and <40% of the school day with non-disabled peers) by implementation status and
years since initial training. Groups: Baseline - Not (N = 9) and Full (N = 2); 1 Year - Partial
(N = 5) and Full (N = 5); 2 Years - Partial (N = 1) and Full (N = 9); 3 Years - Full (N = 12).]
Out-of-School Placements
As SWPBIS becomes more institutionalized within a school and its staff, it is believed
that fewer students with extreme behavioral challenges would require specialized interventions
typically offered in an out-of-school placement. For example, a student with significant
behavioral excesses would, in a traditional school setting, experience repeated classroom
removals, followed by numerous out-of-school suspensions, until a decision was made to place
the student in a specialized school setting such as a partial hospitalization program, residential
treatment facility, or on home-bound instruction. From a theoretical perspective, schools that
implement effective SWPBIS interventions with strong tier 2 and 3 supports should observe
fewer students requiring out-of-school placements.
Two-year longitudinal out-of-school placement data were available from a subset of fully
implementing schools from both cohorts, which allows for examination of trends over time.
Average rates of out-of-school placements per 100 students are presented in Figure 19. The
percentage of all students placed outside of their neighborhood school that were identified under
the special education category of emotional disturbance is displayed in Figure 20.
Some conclusions are offered, although these trends are interpreted cautiously
given the limited number of years for which data were available. Schools that fully implemented
SWPBIS for two years, on average, placed just over one student per 100 total students in an
out-of-school educational program. The observed increase from 2008-2009 to 2009-2010 was not
statistically significant; therefore, this trend cannot be generalized to other schools. With regard
to the data in Figure 20, an increasing proportion of students with emotional disturbance were
placed in out-of-school educational programs. Again, this change over the two years was not
statistically significant; however, for these schools, it appears that SWPBIS is having the effect
of increasing the ratio of students with emotional disturbance being placed outside their
neighborhood school compared to all other students placed in these restrictive settings. It is
hypothesized that SWPBIS schools are better able to address the behavioral and academic needs
of students with many other exceptionalities, but still rely on external placements for students
with the most challenging behavioral needs.
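The two metrics plotted in Figures 19 and 20 reduce to simple rates; the sketch below uses hypothetical counts rather than the evaluation data.

```python
def placements_per_100(n_placed: int, enrollment: int) -> float:
    """Out-of-school placements per 100 enrolled students
    (the Figure 19 metric)."""
    return 100.0 * n_placed / enrollment

def ed_share_of_placements(n_ed_placed: int, n_placed: int) -> float:
    """Percentage of all out-of-school placements that involve students
    identified with emotional disturbance (the Figure 20 metric)."""
    return 100.0 * n_ed_placed / n_placed

# Hypothetical building: 600 students enrolled, 7 placed out of school,
# 3 of those 7 identified with emotional disturbance
rate = placements_per_100(7, 600)
ed_pct = ed_share_of_placements(3, 7)
```

Both figures report these quantities averaged across the reporting schools.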
Figure 19
Longitudinal Comparison of Out-of-School Placement Rates in Fully Implementing SWPBIS
Schools - Combined Cohorts

[Bar chart (N = 7): out-of-school placements per 100 students were 1.05 in 2008-2009 and
1.11 in 2009-2010.]
Figure 20
Longitudinal Comparison of the Percentage of All Students Placed Outside of Their
Neighborhood School That Are ED - Combined Cohorts

[Bar chart (N = 4): 22.12% of placed students were identified with emotional disturbance in
2008-2009 and 45.51% in 2009-2010.]
Secondary Level of Support in SWPBIS – Check-In Check-Out
Countless types of tier 2, or selected, interventions are available to a school implementing
universal SWPBIS. One standard protocol secondary intervention is Check-In Check-Out
(CICO), also known as the Behavior Education Program (see Crone et al., 2004, for a
review). Numerous research studies have documented the efficacy of CICO in positively and
proactively addressing the behavior of students who are at risk for academic and/or
behavioral challenges (Hawken, Adolphson, MacLeod, & Schuman, 2009).
Effect of CICO on Student Behavior. Students supported in the CICO standard
protocol intervention earn points for positive behaviors displayed within pre-specified time
periods throughout the day. Although there is some discretion afforded to the individual teams
implementing CICO, the typical daily targets are set at 80% of the total available points (Crone
et al., 2004). Individual students' data are then tracked across days and weeks to determine
whether CICO is having the desired effect of maintaining appropriate levels of prosocial
behavior. Data summarized in Table 6 represent the percentage of students enrolled in CICO
who met their average daily goal of earning at least 80% of the available points.
Table 6
Number of Students Involved in and Effectiveness of CICO Across Time

                     2008-2009              2009-2010              2010-2011
Level  School    N   # Met   % Met      N   # Met   % Met      N   # Met   % Met
                     Goal    Goal           Goal    Goal           Goal    Goal
Elem.  A         7     7     100%      13    12     92.3%     20     8     40.0%
       B         4     2     50.0%      9     9     100%       3     2     66.7%
       C                               23    22     95.7%     26    12     46.2%
       D                               46    39     84.8%     68    36     52.9%
       E                                                      12    11     91.7%
       F                                                       5     4     80.0%
       G                                                      14    14     100%
       H                                                       2     1     50.0%
       TOTAL    11     9     81.8%     91    82     90.1%    150    88     58.7%
Sec.   I         6     6     100%                             10     5     50.0%
       J                               13     7     53.8%     16    11     68.9%
       TOTAL     6     6     100%      13     7     53.8%     26    16     61.5%
TOTAL           17    15     88.2%    104    89     85.6%    176   104     59.1%

Note. Level = grade range for the particular school; Elem. = elementary (K-5); Sec. = Secondary
(6-12); N = number of students enrolled in CICO; # Met Goal = number of students who met the
goal of ≥80% points over a pre-specified period of time; % Met Goal = percentage of students in
CICO who met the pre-specified goal of ≥80% points.
Data in Table 6 indicate that a majority of students enrolled in CICO are in the
elementary grades. These data also suggest that CICO is effectively addressing the behavioral
needs of a majority of students enrolled in the intervention (59.1% of all students achieving the
minimum average daily criterion in 2010-2011). From the perspective of targeted interventions,
CICO, implemented concurrently with high fidelity SWPBIS, is meeting the needs of a majority
of these students.
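Under the scoring convention described above, goal attainment can be computed as in the sketch below. The data structures are assumptions for illustration; only the 80% criterion comes from the source (Crone et al., 2004).

```python
def average_daily_percentage(daily_points):
    """Average of the daily percentages of CICO points earned.
    daily_points: list of (earned, available) tuples, one per day."""
    pcts = [earned / available for earned, available in daily_points]
    return 100.0 * sum(pcts) / len(pcts)

def met_goal(daily_points, criterion=80.0):
    """True if the student's average daily percentage meets the
    criterion (typically 80% of available points)."""
    return average_daily_percentage(daily_points) >= criterion

def percent_meeting_goal(students):
    """Percentage of enrolled students meeting their goal -- the
    '% Met Goal' quantity reported in Table 6.
    students: list of per-student daily_points lists."""
    met = sum(1 for s in students if met_goal(s))
    return 100.0 * met / len(students)
```

For example, a student earning 8 of 10 and 9 of 10 points on two days averages 85% and meets the goal, while a student averaging 65% does not.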
Academic Achievement
To investigate the impact of PA SWPBIS on the academic performance of students, data
from the 33 schools in cohort 1 and the 166 schools in cohort 2 were kept separate. This was
done for several reasons. First, the year of implementation for the schools across cohorts varied.
For example, schools in cohort 2 were in their first year of implementation at the same time that
schools in cohort 1 would be in their second or third year of implementation. This meant that
academic data (e.g., PSSA results) in any given year of implementation were collected using
different tests and norms. Secondly, some schools within the same district were in different
cohorts, which meant that very likely there were interactions among these schools that would
violate any assumptions of independence. Finally, the number of years for which comparisons
could be made varied based on the cohort in which a school was a member. Schools in cohort 1
had baseline plus three years of implementation but schools in cohort 2 had only baseline plus
two years of implementation, with PSSA scores available only for the first year of
implementation by the time this evaluation was conducted. Consequently, this evaluation
included PSSA achievement data only for cohort 1.
During 2007, the baseline year for cohort 1 schools, two schools were already
implementing SWPBIS and are excluded from the baseline data. However, since the analyses
for subsequent years were based on a cross sectional approach, these two schools were included
in the analyses for 2008 through 2010. Therefore, the 26 schools used in 2007 are the schools
that had not yet implemented SWPBIS. In subsequent years, 2008 through 2010, the number of
schools in each category reflects the number of the original 28 schools that met the
implementation criteria. Note also that the number of schools identified reflects the number
for which implementation data were available.
PSSA Reading performance for cohort 1. In comparing cohort 1 schools with the
performance of all Pennsylvania schools, the following comparisons were made for each year,
2007 through 2010:
a. Percentage of students scoring "Below Basic or Basic" for partially implementing schools
compared to all Pennsylvania schools (Figure 21);
b. Percentage of students scoring "Proficient or Advanced" for partially implementing
schools compared to all Pennsylvania schools (Figure 22);
c. Percentage of students scoring "Below Basic or Basic" for fully implementing schools
compared to all Pennsylvania schools (Figure 23);
d. Percentage of students scoring "Proficient or Advanced" for fully implementing schools
compared to all Pennsylvania schools (Figure 24).
Figure 21
Percentage of Students Scoring Below Basic or Basic on PSSA Reading for Partially
Implementing Schools vs. Pennsylvania State Average

                                    2007     2008     2009     2010
State Averages                      31       30       29       29
Partially Implementing Schools      33.27    33.79    42.71    34.25
Figure 22
Percentage of Students Scoring Proficient or Advanced on PSSA Reading for Partially
Implementing Schools vs. Pennsylvania State Average

                                    2007     2008     2009     2010
State Averages                      69       70       71       72
Partially Implementing Schools      66.75    66.21    57.29    66.75
Figure 23
Percentage of Students Scoring Below Basic or Basic on PSSA Reading for Fully Implementing
Schools vs. Pennsylvania State Average

                                    2007     2008     2009     2010
State Averages                      31       30       29       29
Fully Implementing Schools          33.27    31.05    25.39    24.18
Figure 24
Percentage of Students Scoring Proficient or Advanced on PSSA Reading for Fully Implementing
Schools vs. Pennsylvania State Average

                                    2007     2008     2009     2010
State Averages                      69       70       71       72
Fully Implementing Schools          66.75    68.95    74.6     75.83
One-sample t-tests were used to determine the significance of the differences between
partially implementing schools and Pennsylvania state averages with regard to reading
performance. The following results were found:
• Prior to implementation in 2006-2007, the percentage of students in all cohort 1 schools
  scoring Below Basic or Basic on the Reading PSSA (M = 33.27%, SD = 17.80) was
  significantly higher than the State average, t (25) = -878.4, p < .001.
• After one year of implementation in 2007-2008, the percentage of students in partially
  implementing schools scoring Below Basic or Basic on the Reading PSSA (M = 33.79%,
  SD = 22.03) was significantly higher than the State average, t (13) = -503.9, p < .001.
• After two years of implementation in 2008-2009, the percentage of students in partially
  implementing schools scoring Below Basic or Basic on the Reading PSSA (M = 42.71%,
  SD = 26.01) was significantly higher than the State average, t (6) = -290.7, p < .001.
• After three years of implementation in 2009-2010, the percentage of students in partially
  implementing schools scoring Below Basic or Basic on the Reading PSSA (M = 34.25%,
  SD = 27.93) was significantly higher than the State average, t (1) = -145.0, p < .004.
• Prior to implementation in 2006-2007, the percentage of students in all cohort 1 schools
  scoring Proficient or Advanced on the Reading PSSA (M = 66.75%, SD = 17.82) was
  significantly lower than the State average, t (25) = -1955.9, p < .001.
• After one year of implementation in 2007-2008, the percentage of students in partially
  implementing schools scoring Proficient or Advanced on the Reading PSSA (M =
  66.21%, SD = 22.03) was significantly lower than the State average, t (13) = -1177.9, p <
  .001.
• After two years of implementation in 2008-2009, the percentage of students in partially
  implementing schools scoring Proficient or Advanced on the Reading PSSA (M =
  57.29%, SD = 25.97) was significantly lower than the State average, t (6) = -717.4, p <
  .001.
• After three years of implementation in 2009-2010, the percentage of students in partially
  implementing schools scoring Proficient or Advanced on the Reading PSSA (M =
  66.75%, SD = 27.93) was significantly lower than the State average, t (1) = -361.2, p <
  .002.
In each year, the partially implementing SWPBIS schools in cohort 1 performed less
favorably than the State average.
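The comparisons above use a one-sample t statistic, treating the State average as a fixed population value. A minimal sketch of that statistic follows; the sample percentages are illustrative, not the actual school data.

```python
import math

def one_sample_t(sample, popmean):
    """One-sample t statistic: t = (mean - popmean) / (s / sqrt(n)),
    where s is the sample standard deviation (n - 1 denominator).
    Degrees of freedom are n - 1, matching the t(25), t(13), etc.
    reported for 26, 14, ... schools."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    return (mean - popmean) / math.sqrt(var / n)

# Hypothetical Below Basic or Basic percentages for three schools,
# tested against an assumed state average of 31%
t = one_sample_t([33.0, 35.0, 37.0], 31.0)
```

The sign of t indicates whether the school mean sits above or below the State average; significance is then judged against the t distribution with n - 1 degrees of freedom.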
One-sample t-tests were also used to determine the significance of the differences between
fully implementing schools and Pennsylvania State averages with regard to reading performance.
The following results were found:
• Prior to implementation in 2006-2007, the percentage of students in all cohort 1 schools
  scoring Below Basic or Basic on the Reading PSSA (M = 33.27%, SD = 17.80) was
  significantly higher than the State average, t (25) = -878.4, p < .001.
• After one year of implementation in 2007-2008, the percentage of students in fully
  implementing schools scoring Below Basic or Basic on the Reading PSSA (M = 31.05%,
  SD = 10.05) was significantly higher than the State average, t (12) = -1065.6, p < .001.
• After two years of implementation in 2008-2009, the percentage of students in fully
  implementing schools scoring Below Basic or Basic on the Reading PSSA (M = 25.39%,
  SD = 9.9) was significantly lower than the State average, t (18) = -1261.3, p < .001.
• After three years of implementation in 2009-2010, the percentage of students in fully
  implementing schools scoring Below Basic or Basic on the Reading PSSA (M = 24.18%,
  SD = 10.08) was significantly lower than the State average, t (16) = -1176.6, p < .001.
• Prior to implementation in 2006-2007, the percentage of students in all cohort 1 schools
  scoring Proficient or Advanced on the Reading PSSA (M = 66.75%, SD = 17.82) was
  significantly lower than the State average, t (25) = -1955.9, p < .001.
• After one year of implementation in 2007-2008, the percentage of students in fully
  implementing schools scoring Proficient or Advanced on the Reading PSSA (M =
  68.95%, SD = 10.04) was significantly lower than the State average, t (12) = -2490.1, p <
  .001.
• After two years of implementation in 2008-2009, the percentage of students in fully
  implementing schools scoring Proficient or Advanced on the Reading PSSA (M = 74.6%,
  SD = 9.9) was significantly higher than the State average, t (18) = -3086, p < .001.
• After three years of implementation in 2009-2010, the percentage of students in fully
  implementing schools scoring Proficient or Advanced on the Reading PSSA (M = 75.8%,
  SD = 10.1) was significantly higher than the State average, t (16) = -2908, p < .001.
Prior to implementation, and even after the first year of full implementation, schools that
were fully implementing SWPBIS were not performing as well as the State average with regard
to PSSA Reading scores. In the second and third years of full implementation, however, the
participating schools significantly outperformed the State averages, with a significantly
smaller percentage of students at the Below Basic or Basic levels and a significantly larger
percentage of students at the Proficient or Advanced levels on the PSSA Reading.
PSSA Math performance for cohort 1. In comparing cohort 1 schools with the
performance of all Pennsylvania schools on the PSSA Math test, the following comparisons were
made for each year, 2007 through 2010:
a. Percentage of students scoring Below Basic or Basic for partially implementing schools
compared to all Pennsylvania schools (Figure 25);
b. Percentage of students scoring Proficient or Advanced for partially implementing schools
compared to all Pennsylvania schools (Figure 26);
c. Percentage of students scoring Below Basic or Basic for fully implementing schools
compared to all Pennsylvania schools (Figure 27);
d. Percentage of students scoring Proficient or Advanced for fully implementing schools
compared to all Pennsylvania schools (Figure 28).
Figure 25
Percentage of Students Scoring Below Basic or Basic on PSSA Math for Partially Implementing
Schools vs. Pennsylvania State Average

                                    2007     2008     2009     2010
State Averages                      32       28       27       25
Partially Implementing Schools      30.44    30.68    42.56    27.45
Figure 26
Percentage of Students Scoring Proficient or Advanced on PSSA Math for Partially
Implementing Schools vs. Pennsylvania State Average

                                    2007     2008     2009     2010
State Averages                      67       71       73       76
Partially Implementing Schools      69.55    69.32    57.4     72.55
Figure 27
Percentage of Students Scoring Below Basic or Basic on PSSA Math for Fully Implementing
Schools vs. Pennsylvania State Average

                                    2007     2008     2009     2010
State Averages                      32       28       27       25
Fully Implementing Schools          30.44    24.63    20.25    15.79
Figure 28
Percentage of Students Scoring Proficient or Advanced on PSSA Math for Fully Implementing
Schools vs. Pennsylvania State Average

                                    2007     2008     2009     2010
State Averages                      67       71       73       76
Fully Implementing Schools          69.55    75.37    79.73    84.23
One-sample t-tests were used to determine the significance of the differences between
partially implementing schools and Pennsylvania State averages with regard to math
performance. The following results were found:
• Prior to implementation in 2006-2007, the percentage of all students in cohort 1 schools
  that scored Below Basic or Basic on the Math PSSA (M = 30.44%, SD = 17.51) was
  significantly lower than the State average, t (25) = -9228, p < .001.
• After one year of implementation in 2007-2008, the percentage of students in partially
  implementing schools scoring Below Basic or Basic on the Math PSSA (M = 30.68%, SD
  = 20.77) was significantly higher than the State average, t (13) = -499, p < .001.
• After two years of implementation in 2008-2009, the percentage of students in partially
  implementing schools scoring Below Basic or Basic on the Math PSSA (M = 42.56%, SD
  = 22.45) was significantly higher than the State average, t (6) = -313.2, p < .001.
• After three years of implementation in 2009-2010, the percentage of students in partially
  implementing schools scoring Below Basic or Basic on the Math PSSA (M = 27.45%, SD
  = 2.90) was significantly higher than the State average, t (1) = -1206.1, p < .001.
• Prior to implementation in 2006-2007, the percentage of all students in cohort 1 schools
  that scored Proficient or Advanced on the Math PSSA (M = 69.55%, SD = 17.52) was
  significantly higher than the State average, t (25) = -1929.6, p < .001.
• After one year of implementation in 2007-2008, the percentage of students in partially
  implementing schools scoring Proficient or Advanced on the Math PSSA (M = 69.32%,
  SD = 20.8) was significantly lower than the State average, t (13) = -1266.5, p < .001.
• After two years of implementation in 2008-2009, the percentage of students in partially
  implementing schools scoring Proficient or Advanced on the Math PSSA (M = 57.4%,
  SD = 22.46) was significantly lower than the State average, t (6) = -853.3, p < .001.
• After three years of implementation in 2009-2010, the percentage of students in partially
  implementing schools scoring Proficient or Advanced on the Math PSSA (M = 72.55%,
  SD = 2.90) was significantly lower than the State average, t (1) = -3671.9, p < .002.
Notably, prior to implementation, the participating schools performed more favorably than
the State averages on the Math PSSA. Once SWPBIS was implemented, however, schools that
only partially implemented SWPBIS performed significantly less favorably than the State
averages.
One-sample t-tests were also used to determine the significance of the differences between
fully implementing schools and Pennsylvania State averages with regard to math performance.
The following results were found:
• Prior to implementation in 2006-2007, the percentage of students in all cohort 1 schools
  that scored Below Basic or Basic on the Math PSSA (M = 30.44%, SD = 17.51) was
  significantly lower than the State average, t (25) = -9228, p < .001.
• After one year of implementation in 2007-2008, the percentage of students in fully
  implementing schools scoring Below Basic or Basic on the Math PSSA (M = 24.63%, SD
  = 9.28) was significantly lower than the State average, t (12) = -1077.8, p < .001.
• After two years of implementation in 2008-2009, the percentage of students in fully
  implementing schools scoring Below Basic or Basic on the Math PSSA (M = 20.25%, SD
  = 10.24) was significantly lower than the State average, t (18) = -1140.8, p < .001.
• After three years of implementation in 2009-2010, the percentage of students in fully
  implementing schools scoring Below Basic or Basic on the Math PSSA (M = 15.79%, SD
  = 9.29) was significantly lower than the State average, t (16) = -1102.0, p < .001.
• Prior to implementation in 2006-2007, the percentage of all students in cohort 1 schools
  that scored Proficient or Advanced on the Math PSSA (M = 69.55%, SD = 17.52) was
  significantly higher than the State average, t (25) = -1929.6, p < .001.
• After one year of implementation in 2007-2008, the percentage of students in fully
  implementing schools scoring Proficient or Advanced on the Math PSSA (M = 75.37%,
  SD = 9.34) was significantly higher than the State average, t (12) = -2713, p < .001.
• After two years of implementation in 2008-09, the percentage of students in fully
  implementing schools scoring Proficient or Advanced on the Math PSSA (M = 79.73%,
  SD = 10.24) was significantly higher than the State average, t (18) = -3074.4, p < .001.
• After three years of implementation in 2009-10, the percentage of students in fully
  implementing schools scoring Proficient or Advanced on the Math PSSA (M = 84.23%,
  SD = 9.28) was significantly higher than the State average, t (16) = -3340.3, p < .001.
While schools that only partially implemented SWPBIS tended to perform less favorably
than the State averages on the Math PSSA, the opposite pattern emerged for schools that had
fully implemented SWPBIS. In each year, the fully implementing schools outperformed the
State averages, and with each additional year of implementation, the gap between the State
averages and the fully implementing schools grew larger.
Comparison of PSSA Reading in partial and full implementing cohort 1 schools.
Independent sample t-tests were used to test the significance of the difference between partially
implementing and fully implementing schools for each of the three years of actual
implementation. Comparisons were made for the percentage of students scoring Below Basic
or Basic and also for the percentage of students scoring Proficient or Advanced. The results of
these tests for PSSA Reading appear in Table 7.
Table 7
Comparison of PSSA Reading Scores by Schools Identified as Partial and Full Implementing
SWPBIS for 2008 - 2010

Variable                                               N      M        SD       t
Percentage of Students Below Basic or Basic 2008                               .41
    Partial                                           14    33.79%    22.03
    Full                                              13    31.05%    10.05
Percentage of Students Proficient or Advanced 2008                            -.41
    Partial                                           14    66.21%    22.03
    Full                                              13    68.95%    10.04
Percentage of Students Below Basic or Basic 2009                              2.51*
    Partial                                            7    42.71%    26.01
    Full                                              19    25.39%     9.93
Percentage of Students Proficient or Advanced 2009                           -2.51*
    Partial                                            7    57.29%    25.97
    Full                                              19    74.60%     9.92
Percentage of Students Below Basic or Basic 2010                              1.13
    Partial                                            2    34.25%    27.93
    Full                                              17    24.18%    10.08
Percentage of Students Proficient or Advanced 2010                           -1.13
    Partial                                            2    66.75%    27.93
    Full                                              17    75.83%    10.10

Note. N = number of schools designated as either partial or full implementers.
* p < .05
For each of the three years of implementation, the fully implementing schools had a
lower percentage of students scoring Below Basic or Basic and a larger percentage of students
scoring Proficient or Advanced on the PSSA Reading test than did the partially implementing
schools. In the third year of implementation, the average percentage of students in these two
categories for the partially implementing schools was almost identical to the percentage in the
first year of implementation. However, the percentages for fully implementing schools followed
a trend in which the percentages were more favorable each year - with a lower percentage of
students in the Below Basic or Basic category and more students in the Proficient or Advanced
category.
In only the second year of implementation (2008-09), however, was there a significant
difference between the partially and fully implementing schools. In the second year of
implementation, with respect to the percentage of students scoring Below Basic or Basic, the 19
fully implementing schools (M = 25.39%, SD = 9.93) had a significantly lower percentage (t [24]
= 2.51, p < .05) than the seven partially implementing schools (M = 42.71%, SD = 26.01).
Similarly with respect to the percentage of students scoring Proficient or Advanced, the 19 fully
implementing schools (M = 74.6%, SD = 9.92) had a significantly larger percentage (t [24] =
2.51, p < .05) than the seven partially implementing schools (M = 57.29%, SD = 25.97).
In the second year of implementation, it is also noteworthy how much more homogeneous
the fully implementing schools were compared to the partially implementing schools, with
standard deviations of approximately 10 and 26, respectively.
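The independent-samples comparison uses the pooled-variance t statistic with df = n1 + n2 - 2 (here 7 + 19 - 2 = 24, matching the t[24] values above). A minimal sketch with illustrative inputs, not the actual school percentages:

```python
import math

def pooled_t(sample1, sample2):
    """Independent-samples t with pooled variance (equal-variance form):
    t = (m1 - m2) / sqrt(sp2 * (1/n1 + 1/n2)),
    where sp2 = (SS1 + SS2) / (n1 + n2 - 2)."""
    n1, n2 = len(sample1), len(sample2)
    m1 = sum(sample1) / n1
    m2 = sum(sample2) / n2
    ss1 = sum((x - m1) ** 2 for x in sample1)
    ss2 = sum((x - m2) ** 2 for x in sample2)
    sp2 = (ss1 + ss2) / (n1 + n2 - 2)   # pooled variance, df = n1 + n2 - 2
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Hypothetical Below Basic or Basic percentages for two small groups
t = pooled_t([42.0, 44.0], [25.0, 27.0])
```

Because the pooled estimate weights each group's sum of squares, the much larger spread of the partially implementing group (SD ≈ 26 vs. ≈ 10) widens the standard error and works against detecting a difference.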
Comparison of PSSA Math in partial and full implementing cohort 1 schools.
Independent sample t-tests were used to test the significance of the difference between partially
implementing and fully implementing schools for each of the three years of actual
implementation. Comparisons were made for the percentage of students scoring Below Basic or
Basic and also for the percentage of students scoring Proficient or Advanced. The results of
these tests for PSSA Math appear in Table 8.
For each of the three years of implementation, the fully implementing schools had a
lower percentage of students scoring Below Basic or Basic and a larger percentage of students
scoring Proficient or Advanced on the PSSA Math test than did the partially implementing
schools.
Once again, however, only in the second year of implementation (2008-09) was there a
significant difference between the partially and fully implementing schools. In that year,
with respect to the percentage of students scoring Below Basic or Basic, the
19 fully implementing schools (M = 20.25%, SD = 10.24) had a significantly lower percentage (t
[24] = 3.53, p < .05) than the seven partially implementing schools (M = 42.56%, SD = 22.45).
Similarly with respect to the percentage of students scoring Proficient or Advanced, the 19 fully
implementing schools (M = 79.73%, SD = 10.24) had a significantly larger percentage (t [24] = 3.53, p < .05) than the seven partially implementing schools (M = 57.40%, SD = 22.46).
As was found with PSSA Reading scores with regard to the percentage of students in the
second year of implementation, fully implementing schools were more homogeneous than the
partially implementing group with their standard deviations of approximately 10 and 22,
respectively.
Table 8
Comparison of PSSA Math Scores by Schools Identified as Partial and Full Implementing
SWPBIS for 2008 - 2010

Variable                                               N      M        SD       t
Percentage of Students Below Basic or Basic 2008                               .96
    Partial                                           14    30.68%    20.77
    Full                                              13    24.63%     9.28
Percentage of Students Proficient or Advanced 2008                            -.96
    Partial                                           14    69.32%    20.77
    Full                                              13    75.37%     9.34
Percentage of Students Below Basic or Basic 2009                              3.53*
    Partial                                            7    42.56%    22.45
    Full                                              19    20.25%    10.24
Percentage of Students Proficient or Advanced 2009                           -3.53*
    Partial                                            7    57.40%    22.46
    Full                                              19    79.73%    10.24
Percentage of Students Below Basic or Basic 2010                              1.75
    Partial                                            2    27.45%    28.99
    Full                                              17    15.79%     9.29
Percentage of Students Proficient or Advanced 2010                           -1.73
    Partial                                            2    72.55%    28.99
    Full                                              17    84.23%     9.28

Note. N = number of schools designated as either partial or full implementers.
* p < .05
Caveat regarding comparison of PSSA performance. It is difficult to determine why
there were significant differences between fully and partially implementing schools in only the
second year of implementation. It is entirely possible that it takes more than one year for the
impact of SWPBIS to become evident, which would explain why the trend is in the desirable
direction in the first year, but not statistically significant. In the third year, it is most likely that
the lack of significance is a product of the lack of complete data; implementation data on only 19
of the original 26 schools were available and only two schools were identified as partial
implementers.
In the analyses for PSSA Reading and Math, several words of caution are needed. First,
it is important to note that while the schools are the same across all three years, the students do
differ. That is, the students tested in the pre-implementation year are not the same as the
students tested two years later in spring 2009. Secondly, these comparisons were performed on
percentage data rather than on the raw frequency data. Had the raw data (i.e., number of students
in each category) been available, the actual number of students performing at each level for each
school and year would have provided a much more stable and accurate picture. Nonetheless,
these results are very encouraging and certainly suggest that future analyses be performed to
confirm these findings. Finally, since the percentage of students who score Below Basic or
Basic and the percentage of students who score Proficient or Advanced are complementary
(i.e., they sum to 100%), changes in one variable necessarily mirror changes in the other.
Replication, Sustainability, and Improvement of PA SWPBIS
The final salient domain offered by Algozzine et al. (2010) for large-scale SWPBIS
evaluation is how well the framework can be replicated in other settings while still observing
similar effects (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; McIntosh, Horner, & Sugai,
2009). The first of three central evaluation questions within this domain is how quickly schools
achieve full implementation status. The second is the extent to which SWPBIS is replicated in
other schools over time. Finally, whether implementation is sustained once a school achieves full
implementation status is important for understanding the maintenance of school reform efforts.
Improvement of Fidelity
McKevitt and Braaksma (2008) noted that the length of time for a school to achieve full
implementation status is directly related to the training schedule for teams to develop the
SWPBIS infrastructure and materials. Sugai and Horner (2005) suggested that full
implementation cannot be accomplished until core school teams have completed the three days
of pre-implementation training previously summarized in Table 4. Descriptive data regarding
the number of cohort 1 schools that achieved full implementation status as a function of time
after completing the three days of pre-implementation training are presented in Figure 29. Two of
the cohort 1 schools were removed from the analysis because they were fully implementing prior
to the pre-implementation training. Only data from cohort 1 were summarized because all
schools in this cohort received training at the same time, thus facilitating accurate designation of
full implementation status as a function of time. Additionally, many cohort 2 schools have yet to
implement fully because they completed training just a few months ago.
Fifteen of the 31 cohort 1 schools achieved full implementation in less than one year after
completing the three-day pre-implementation training. An additional eight schools achieved full
implementation within two years, accounting for 74.2% of all possible schools. Only one school
required more than three years to achieve full implementation status. Importantly, five of the 31
(16.1%) schools never achieved full implementation status. These data indicate that it is
very plausible for a dedicated, committed core team and school staff to achieve full
implementation status within one year of completing the three-day pre-implementation
training. Still, a small percentage of schools experience a two- to three-year process to reach full
implementation. Lastly, despite high quality pre-implementation training and onsite technical
assistance, a small number of schools never achieve high fidelity SWPBIS implementation.
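These descriptive figures can be recomputed from the category counts reported above; a minimal sketch (the 2-3 year count of 2 is inferred by subtraction from the cohort total, since it is not stated directly in the text):

```python
# Distribution of time to full implementation for cohort 1 (counts from the
# text above; the 2-3 year count is inferred by subtraction).
time_to_full = {"< 1 year": 15, "1-2 years": 8, "2-3 years": 2,
                "> 3 years": 1, "Never": 5}

total = sum(time_to_full.values())  # 31 cohort 1 schools
within_two_years = time_to_full["< 1 year"] + time_to_full["1-2 years"]

print(f"Within two years: {within_two_years}/{total} "
      f"({100 * within_two_years / total:.1f}%)")        # 23/31 (74.2%)
print(f"Never fully implemented: "
      f"{100 * time_to_full['Never'] / total:.1f}%")      # 16.1%
```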
Replication
Replication of any school reform effort deemed effective and valuable is critical to
ensuring that more individuals come into contact with the positive outcomes associated with that
reform. Within the context of SWPBIS, replication refers to the number of schools that achieve
full implementation status across time. Importantly, replication also refers to witnessing the
same positive outcomes in later-adopting schools as those observed in the original schools.
Judgment of this latter aspect, replication of effects in later-adopting schools, was reviewed in
the previous domain, Impact of PA SWPBIS.
Figure 29
Time to Reach Full Implementation After Completing Pre-Implementation Training - Cohort 1
[Bar chart, N = 31. Number of schools by time to full implementation: < 1 year = 15; 1-2 years = 8; 2-3 years = 2; > 3 years = 1; never = 5.]
Fidelity data from schools in both cohorts were combined in a cross-sectional analysis to
describe the expansion of SWPBIS in Pennsylvania since its early adoption in summer 2007.
Schools for which no fidelity data were available were excluded from this analysis. A visual
display of these data is presented in Figure 30. Two dramatic increases occurred, one at spring
2008 and another at spring 2011. These two increases correspond to approximately one full year
after completion of pre-implementation training for each cohort. By spring 2011, 45 of the
199 trained schools had achieved full implementation status.
Sustainability
A final evaluation question relevant to this domain is the capacity for schools and
districts to sustain SWPBIS implementation over multiple years once full implementation status
is achieved. SWPBIS, like any other educational reform effort, faces many challenges to long-term
sustainability. McIntosh, Filter, Bennett, Ryan, and Sugai (2010) suggested that multiple
influences may inhibit sustained SWPBIS implementation, including lack of contextual fit to the
school setting, competing initiatives, loss of funding, and failure to report efficacy data back to
stakeholders and administrators who make decisions about allocating resources for SWPBIS.
Analyses of sustainability one to four years after initially implementing with integrity
were calculated across schools for which fidelity data were available; schools with missing data
were omitted. These data
are presented in Figure 31 and suggest that a majority of schools are able to sustain
implementation three to four years after initially achieving full implementation. Again, these
interpretations are made with caution given incomplete data from a number of other schools.
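A sketch of the sustainability computation underlying Figure 31 follows; the sustained counts are hypothetical reconstructions chosen to be consistent with the percentages reported in the figure, since the raw counts are not restated here.

```python
# For each number of years post initial full implementation, the share of
# schools with fidelity data that still met full-implementation criteria.
# Counts are assumed reconstructions: (sustained, N with data).
cohorts = {1: (18, 19), 2: (13, 14), 3: (7, 8), 4: (2, 2)}

for years, (sustained, n) in cohorts.items():
    print(f"{years} year(s) post full implementation: "
          f"{100 * sustained / n:.1f}% sustained (N = {n})")
```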
Figure 30
Cross-Sectional Analysis of Full Implementation Status for Combined Cohorts
[Bar chart. Number of schools achieving full implementation status by time: spring 2007 = 2; spring 2008 = 16; spring 2009 = 22; spring 2010 = 23; spring 2011 = 45.]
Figure 31
Longitudinal Analysis of the Percentage of Schools Sustaining Implementation Once Achieved
[Stacked bar chart. Percentage of schools sustaining versus regressing, by academic years post initial full implementation: 1 year (N = 19) = 94.7% sustained, 5.3% regressed; 2 years (N = 14) = 92.9% sustained, 7.1% regressed; 3 years (N = 8) = 87.5% sustained, 12.5% regressed; 4 years (N = 2) = 100.0% sustained.]
Summary
A total of 199 schools across Pennsylvania have been trained to implement School-Wide
Positive Behavioral Interventions and Supports (SWPBIS). Thirty-three schools designated as
cohort 1 were trained in summer 2007. The remaining 166 schools, categorized into cohort 2,
received training beginning in 2009, although many did not complete initial training until the
2010-2011 academic year. Technical support and onsite consultation have been offered to these
schools by a number of educational agencies, namely the three PaTTAN offices, various IUs,
and members of the PAPBS Network SLT. Most schools implementing SWPBIS have also
received invaluable assistance and expertise from local community mental health agencies, under
a collaborative arrangement with individual schools and districts.
The express purpose of the PAPBS Network is to establish cross-systems of care for all
students and families using a three-tiered public health model delivered in public school settings.
Schools affiliated with the PAPBS Network received training and technical assistance from a
certified SWPBIS Facilitator using the same set of training materials endorsed by the OSEP
PBIS Network. Onsite technical assistance was provided on a regular basis during
infrastructure-building activities and initial implementation with a fading of such support over
time contingent on improved and sustained implementation.
Most schools trained in SWPBIS across both cohorts were elementary schools, which
account for approximately 69% of all schools involved in the Network.
Most recently, preschool settings have initiated program-wide PBIS which is an encouraging
trend given the importance of prevention and early intervention services. Schools involved in
the initiative span rural, suburban, and urban settings. Approximately 118,000 students are
educated in PAPBS Network schools, representing 6% of all students in Pennsylvania.
The number of schools achieving full SWPBIS implementation status has increased, with
a total of 45 schools at full implementation status in spring 2011. Data from cohort 1
schools confirm that a majority of schools achieve full implementation status within two years.
Just a small percentage of cohort 1 schools were unable to achieve full implementation status
after four years. It thus appears that a school reform effort such as SWPBIS can be fully
implemented in an academic year by some schools, but it is much more likely that schools will
require two years to fully establish a strong SWPBIS framework.
As schools fully implemented SWPBIS and staff perceptions reflected that improvement,
priorities for behavior support most likely shifted toward improving behavior in non-classroom
settings and for individual students. The latter relates directly to tier 2 and tier 3 levels of
support in a PBIS framework.
Staff perceptions of school risk factors decreased slightly, while perceptions of protective
factors increased slightly, over a four-year period for all schools; however, neither trend was
statistically significant. Significantly different ratings of risk and protective factors were
observed between staff from partially and fully implementing SWPBIS schools,
however, suggesting fewer risk factors and more protective factors are perceived when a school
fully implements SWPBIS.
Analysis of ODR rates across multiple years provides strong evidence to conclude that
SWPBIS results in a reduction of ODRs fairly quickly, typically in the first year of full
implementation. The net effect of this reduction of ODRs, therefore, is a substantial recoupment
of time for administrators, teachers, and students. Using complete longitudinal data from four
cohort 1 schools, conservative estimates of regained time per 100 students in a building
amounted to an average yearly savings of 4.3 work days for administrators and teachers and 11.6
days of instruction for students when SWPBIS is fully implemented. As one can see, a
considerable economic and academic effect is observed when SWPBIS is implemented with
integrity.
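The time-recoupment arithmetic follows the general approach of Scott and Barrett (2004); the per-ODR time costs and the ODR reduction below are illustrative assumptions for the sketch, not the exact values used in this report's analyses.

```python
# Illustrative recoupment arithmetic in the spirit of Scott & Barrett (2004).
# All constants below are assumptions, not this report's actual parameters.
MIN_PER_ODR_ADMIN = 10       # assumed administrator minutes per ODR
MIN_PER_ODR_STUDENT = 20     # assumed lost instructional minutes per ODR
WORK_DAY_MIN = 480           # 8-hour administrator work day
INSTRUCTION_DAY_MIN = 360    # 6-hour instructional day

odr_reduction_per_100 = 200  # assumed yearly drop in ODRs per 100 students

admin_days = odr_reduction_per_100 * MIN_PER_ODR_ADMIN / WORK_DAY_MIN
student_days = odr_reduction_per_100 * MIN_PER_ODR_STUDENT / INSTRUCTION_DAY_MIN

print(f"Administrator days regained per 100 students: {admin_days:.1f}")   # ~4.2
print(f"Instructional days regained per 100 students: {student_days:.1f}")  # ~11.1
```

Under these assumed values, the results fall in the same range as the savings reported above, which is the point of the sketch: modest per-ODR time costs compound into days of regained time once ODRs decline.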
ODR Triangle Data were available for a handful of cohort 1 and 2 schools, and these data
shed light on the proportion of all students who exhibit behavior disruptive to the learning
environment. Statistically significant differences in the proportion of students receiving an ODR
in a year occurred between elementary and secondary schools. Longitudinal analyses of
elementary ODR Triangle Data suggest that approximately 94-95% of all elementary students
receive one or no ODRs in a year, 4% of all elementary students receive two to five ODRs in a
year, and 1% of all elementary school students receive six or more ODRs in a year. Cross-sectional
comparisons of secondary school ODR Triangle Data indicate that approximately 82% of
all secondary students receive one or no ODRs in a year, 11% of all secondary students receive
two to five ODRs in a year, and almost 7% of all secondary students receive six or more ODRs
in a year.
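The ODR Triangle banding described above can be sketched as follows; the student ODR counts are hypothetical.

```python
# Band students into the three ODR Triangle categories used above
# (0-1, 2-5, and 6+ ODRs per year) and report the share in each band.
from collections import Counter

def triangle_band(odrs: int) -> str:
    if odrs <= 1:
        return "0-1 ODRs (tier 1)"
    if odrs <= 5:
        return "2-5 ODRs (tier 2)"
    return "6+ ODRs (tier 3)"

student_odrs = [0, 0, 1, 0, 3, 0, 1, 7, 0, 2]   # hypothetical students
bands = Counter(triangle_band(n) for n in student_odrs)

for band, count in sorted(bands.items()):
    print(f"{band}: {100 * count / len(student_odrs):.0f}%")
```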
Secondary schools implementing SWPBIS used OSS as a disciplinary consequence for
an ODR significantly more often than elementary schools. Within elementary school settings, a
decreasing frequency of using OSS was observed over multiple years, although this change over
time was not statistically significant. Detailed analysis of secondary school OSS trends and
levels across time could not be conducted given small samples and the desire to maintain school
anonymity.
Longitudinal analysis of referrals for special education evaluation and identification of
new students for special education and related services indicated an increase for both during the
initial year of SWPBIS implementation. A downward trend was then observed in the subsequent
two years. These conclusions should not, however, be generalized to other schools given that
data from only two schools were available.
Encouraging results were obtained when investigating the effects of SWPBIS on indices
of LRE. Most noteworthy were the statistically significant differences in the percentage of
students with IEPs educated in the least restrictive educational setting. Data from 10 schools
indicated that fully implementing schools placed significantly higher proportions of students
with disabilities in the LRE compared to schools that did not implement or only partially
implemented SWPBIS. Longitudinal trends were observed in the desired direction, although
changes over time were not statistically significant. Small sample sizes, again, compromised
statistical analyses. These findings suggest that SWPBIS may facilitate a learning environment
in which more students with disabilities are educated in the LRE compared to non- or partially
implementing schools.
Related to LRE is the rate of out-of-school educational placements for all students and
those identified for special education services under the emotional disturbance exceptionality.
Longitudinal comparisons with just a few schools were comparable to cross-sectional
comparisons. Percentages of all students placed in out-of-school educational facilities averaged
approximately 1% across two years. The proportion of all students placed outside their
neighborhood school who were identified as emotionally disturbed increased from 22% to 45%
during this same time period. These trends over time are not statistically significant, however.
Across a three-year span, eight elementary schools and two secondary schools
implemented CICO. The relative success of the CICO intervention at curbing inappropriate
behavior and reinforcing prosocial behavior was 82% and 90% for elementary schools in the first
two years, respectively. Success rates dropped in year 3 to 58.7%. Similar trends were observed
in the secondary schools that implemented CICO. Eighty-two percent of secondary students
enrolled in CICO in year 1 were successful, with an 86% success rate in year 2. Success was
much lower in year 3 as evidenced by 59% of all enrolled secondary students achieving
behavioral goals. It is important to remember, however, that the data indicate that CICO is
effective for a large majority of students in both elementary and secondary settings.
Academic achievement data were compared among cohort 1 schools as a function of
implementation fidelity across four years. Additionally, cohort 1 schools' PSSA performance
levels were compared to State averages. Prior to implementing SWPBIS, cohort 1 schools had
higher percentages of students performing in the Below Basic or Basic categories and lower
percentages of students performing in the Advanced or Proficient categories on PSSA Reading
compared to State averages. Across time, schools that partially implemented SWPBIS continued
to perform significantly less favorably than State averages on PSSA Reading. For schools that
fully implemented SWPBIS, however, gains in reading performance were significantly better
than State averages. Over time, significantly lower percentages of students in fully
implementing SWPBIS schools performed at Below Basic or Basic and significantly higher
percentages of students performed at Proficient or Advanced on PSSA Reading. It thus appears
that high fidelity SWPBIS is at least one component of school reform that creates a learning
environment fostering stronger-than-average acquisition of reading skills.
Cohort 1 schools were stronger than State averages on PSSA Math prior to SWPBIS
implementation. Most other trends on PSSA Math were comparable to results from PSSA
Reading. Schools that only partially implemented SWPBIS in the subsequent three years
performed significantly worse than State averages on PSSA Math. This finding, similar to that
found on PSSA Reading, suggests that partial implementation of SWPBIS is not associated with
positive gains in the acquisition of mathematics skills. Similar to PSSA Reading, though, was
the trend over time indicating high fidelity SWPBIS implementation is associated with
significantly lower percentages of students performing at Below Basic or Basic and significantly
higher percentages of students performing at Advanced or Proficient on PSSA Math compared to
State averages. These results provide rather compelling evidence that high fidelity SWPBIS
implementation is at least one factor associated with significant gains in mathematics
performance on state accountability measures.
Comparison of partially and fully implementing schools on PSSA Reading and Math
indicated that, overall, more desirable results are observed when schools implement SWPBIS
with integrity. As with most empirically supported interventions, it is best to adhere to the
prescribed treatment, in this case the SWPBIS framework. Such adherence will most likely
produce far more beneficial academic outcomes for students. High fidelity SWPBIS is closely
related to significantly stronger gains in reading and mathematics than State averages.
As noted earlier, nearly three-quarters of all cohort 1 schools improved from not
implementing SWPBIS to achieving full implementation status within two years. A few more
schools continued to partially implement SWPBIS four years after initial training, and five have
ceased to implement at this point. When fidelity data from both cohorts are combined, not
surprisingly, a marked increase in the number of fully implementing schools corresponds directly
to the expansion of training efforts and increase in the number of certified SWPBIS Facilitators.
Data regarding sustained implementation of high fidelity SWPBIS were incomplete at three and
four years post initial training; however, considerable evidence exists to suggest that once a
school achieves full implementation of SWPBIS, it can sustain implementation for multiple
years. In fact, two schools in cohort 1 have sustained full implementation for four consecutive
years.
Implications for Further Investment
A number of encouraging trends and positive outcomes were observed in the data for
this third annual evaluation of the PAPBS Network's SWPBIS framework. Conclusions about
some important variables, however, could not be made given the limitations of the available data.
Incomplete data from several participating schools remain the primary limitation of the annual
program evaluation, preventing these researchers from drawing firmer conclusions. Present
efforts by the PAPBS Network SLT have focused on increasing schools' compliance with data
requests, developing a more efficient means to track schools involved in the Network, and
streamlining communication between the PAPBS Network SLT, Network schools, SWPBIS
Facilitators, and these researchers.
It is clear that SWPBIS is rapidly expanding across the Commonwealth. As new schools
are added to the PAPBS Network, it will be imperative for PDE, PaTTAN, and the PAPBS
Network SLT to emphasize to schools that collecting baseline and annual data per the PAPBS
Network guidelines is best practice for making decisions about improvement and sustainability at
the local level. In turn, these same data should be shared at the state-wide level to bolster
aggregated state-wide analyses.
These researchers can always incorporate data that schools submit from previous years,
even if those data were omitted from the analyses in this report. It is known, for example, that
some schools completed annual fidelity checks of SWPBIS implementation; however, those data
were never uploaded to the secure website from which these researchers extract data. Schools
should audit what data they have at the local level and make sure all data are submitted to the
appropriate website. These researchers, in collaboration with the PAPBS Network SLT, will
work in the 2011-2012 school year to more efficiently identify missing data at the school level
so that some of those key data points may be located and included in future program evaluations.
Results contained in this program evaluation provide compelling evidence that high
fidelity SWPBIS is associated with strong academic performance in reading and mathematics.
By its own admission, however, PDE recommends that PSSA data not be the sole indicator of
students' academic development. The inclusion of Pennsylvania Value Added Assessment
System (PVAAS) data will aid in more precise analyses of the effects of SWPBIS on academic
development. PVAAS data were collected for this program evaluation report but were not
analyzed given that only one year's worth of data was available. Future program evaluations will
include analyses of PVAAS and PSSA data to provide a clearer picture of SWPBIS and
academic performance.
References
Algozzine, B., Horner, R. H., Sugai, G., Barrett, S., Dickey, S. R., Eber, L., … Tobin, T. (2010).
Evaluation blueprint for school-wide positive behavior support. Eugene, OR: National
Technical Assistance Center on Positive Behavior Interventions and Support. Retrieved
from www.pbis.org
Bradshaw, C. P., Mitchell, M. M., & Leaf, P. J. (2010). Examining the effects of schoolwide
positive behavioral interventions and supports on student outcomes: Results from a
randomized controlled effectiveness trial in elementary schools. Journal of Positive
Behavior Interventions, 12, 133-149. doi: 10.1177/1098300709334798
Childs, K. E., Kincaid, D., & George, H. P. (2010). A model for statewide evaluation of a
universal positive behavior support initiative. Journal of Positive Behavior Interventions,
12, 198-210. doi: 10.1177/1098300709340699
Crone, D. A., Horner, R. H., & Hawken, L. S. (2004). Responding to problem behavior in
schools: The behavior education program. New York: Guilford Press.
Curtis, M. J., Castillo, J. M., & Cohen, R. M. (2008). Best practices in system-level change. In
A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 887-901).
Bethesda, MD: National Association of School Psychologists.
Eber, L., Lewandowski, H., Hyde, K., & Bohanon, H. (2008). Illinois positive behavior
interventions & supports network. Springfield, IL: Illinois State Department of
Education. Retrieved September 12, 2008, from http://www.pbisillinois.org
Eber, L., Sugai, G., Smith, C. R., & Scott, T. M. (2002). Wraparound and positive behavioral
interventions and supports in the schools. Journal of Emotional and Behavioral
Disorders, 10, 171-180.
Eber, L., Phillips, D., Hyde, K., Breen, K., Rose, J., Lewandowski, H., & Upreti, G. (2010). Illinois
positive behavior interventions & supports (PBIS) network: FY10 annual progress
report. Springfield, IL: Illinois PBIS Network. Retrieved July 1, 2011 from
http://www.pbisillinois.org.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005).
Implementation research: A synthesis of the literature. Tampa, FL: Florida Mental Health
Institute, The National Implementation Research Network (FMHI Publication #231).
Fox, L., & Hemmeter, M. L. (2009). A programwide model for supporting social emotional
development and addressing challenging behavior in early childhood settings. In W.
Sailor, G. Sugai, G. Dunlap, and R. Horner (Eds.). Handbook of positive behavior
support (pp. 177-202). New York: Springer Science + Media.
Hawken, L. S., Adolphson, S. L., MacLeod, K. S., & Schumann, J. (2009). Secondary-tier
interventions and supports. In W. Sailor, G. Dunlap, G. Sugai, & R. Horner (Eds.).
Handbook of positive behavior support (pp. 395-420). New York: Springer.
Individuals with Disabilities Education Act, 20 U. S. C. § 1400 (2004).
Kaufman, J. S., Jaser, S. S., Vaughan, E. L., Reynolds, J. S., Di Donato, J., Bernard, S. N., &
Hernandez-Brereton, M. (2010). Patterns in office referral data by grade, race/ethnicity,
and gender. Journal of Positive Behavior Interventions, 12, 44-54. doi:
10.1177/1098300708329710
Kincaid, D., Childs, K., & George, H. (2005). School-wide benchmarks of quality. Unpublished
instrument, University of South Florida.
Lewis, T. J., Barrett, S., Sugai, G., & Horner, R. H. (2010). Blueprint for school-wide positive
behavior support training and professional development. Eugene, OR: National
Technical Assistance Center on Positive Behavior Interventions and Support. Retrieved
August 9, 2011 from www.pbis.org.
Lewis, T. J., & Sugai, G. (1999). Effective behavior support: A systems approach to proactive
schoolwide management. Effective School Practices, 17, 47-53.
Luiselli, J. K., Putnam, R. F., Handler, M. W., & Feinberg, A. B. (2005). Whole-school positive
behaviour support: Effects on student discipline problems and academic performance.
Educational Psychology, 25, 183-198. doi: 10.1080/0144341042000301265
March, R. E., & Horner, R. H. (2002). Feasibility and contributions of functional behavioral
assessment in schools. Journal of Emotional and Behavioral Disorders, 10, 158-170.
McGlinchey, M. T., & Goodman, S. (2008). Best practices in implementing school reform. In
A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 983-994).
Bethesda, MD: National Association of School Psychologists.
McIntosh, K., Filter, K. J., Bennett, J. L., Ryan, C., & Sugai, G. (2010). Principles of sustainable
prevention: Designing scale-up of school-wide positive behavior support to promote
durable systems. Psychology in the Schools, 47, 5-21. doi: 10.1002/pits.20448
McIntosh, K., Horner, R. H., & Sugai, G. (2009). Sustainability of systems-level evidence-based
practices in schools: Current knowledge and future directions. In W. Sailor, G. Dunlap,
G. Sugai, & R. Horner (Eds.). Handbook of positive behavior support (pp. 327-352). New
York: Springer.
McKevitt, B. C., & Braaksma, A. D. (2008). Best practices in developing a positive behavior
support system at the school level. In A. Thomas & J. Grimes (Eds.), Best practices in
school psychology V (pp. 735-747). Bethesda, MD: National Association of School
Psychologists.
Muscott, H. S., Mann, E. L., & LeBrun (2008). Positive behavioral interventions and support in
New Hampshire: Effects of large-scale implementation of schoolwide positive behavior
support on student discipline and academic achievement. Journal of Positive Behavior
Interventions, 10, 190-205. doi: 10.1177/1098300708316258
Newton, J. S., Todd, A. W., Algozzine, K., Horner, R. H., & Algozzine, B. (2009). The team
initiated problem solving (TIPS) training manual. Unpublished manual. Eugene, OR:
Educational and Community Supports.
No Child Left Behind Act of 2001, 20 U.S.C. § 6301 et seq. (2008).
Pennsylvania Department of Education. (n.d.). Public school enrollment reports. Retrieved on
July 28, 2011 from http://www.portal.state.pa.us/portal/server.pt/community/enrollment/
7407/public_school_enrollment_reports/620541
Runge, T. J., & Staszkiewicz, M. J. (2009). Programmatic evaluation of Pennsylvania's schoolwide positive behavior support project: A summary of implementation fidelity and impact
on behavioral and academic outcomes on cohort 1 schools in school years 2007-2009.
Unpublished technical report, Department of Educational and School Psychology,
Indiana University of Pennsylvania, Indiana, PA.
Runge, T. J., & Staszkiewicz, M. J. (2010). Pennsylvania school-wide positive behavior
support: 2nd annual summary of implementation fidelity and impact on behavioral and
academic outcomes on cohort 1 schools in school years 2006-2010. Unpublished
technical report, Department of Educational and School Psychology, Indiana University
of Pennsylvania, Indiana, PA.
Runge, T. J., & Staszkiewicz, M. J. (2011). Pennsylvania school-wide positive behavior support
initiative: 2009-2010 executive summary. Indiana, PA: Indiana University of
Pennsylvania.
Runge, T. J., Staszkiewicz, M. J., & O'Donnell, K. H. (2011). Pennsylvania school-wide positive
behavioral interventions & supports: 3rd annual summary of implementation fidelity and
impact on PA PBS network schools in years 2006-2010. Unpublished technical report,
Department of Educational and School Psychology, Indiana University of Pennsylvania,
Indiana, PA.
Scott, T. M., & Barrett, S. B. (2004). Using staff and student time engaged in disciplinary
procedures to evaluate the impact of school-wide PBS. Journal of Positive Behavior
Interventions, 6, 21-27. doi: 10.1177/10983007040060010401
Spaulding, S. A., Irvin, L. K., Horner, R. H., May, S. L., Emeldi, M., Tobin, T. J., & Sugai, G.
(2010). Schoolwide social-behavioral climate, student problem behavior, and related
administrative decisions: Empirical patterns from 1,510 schools nationwide. Journal of
Positive Behavior Intervention, 12, 69-85. doi: 10.1177/1098300708329011
Sprague, J., Colvin, G., & Irvin, L. (2002). The school safety survey version 2.0. Eugene, OR:
The Institute on Violence and Destructive Behavior.
Sugai, G., & Horner, R. H. (1999). Discipline and behavioral support: Preferred processes
and practices. Effective School Practices, 17, 10-22.
Sugai, G., & Horner, R. H. (2005). School-wide positive behavior supports: Achieving and
sustaining effective learning environments for all students. In W. H. Heward (Ed.),
Focus on behavior analysis in education: Achievements, challenges, and opportunities
(pp. 90-102). Upper Saddle River, NJ: Pearson Prentice Hall.
Sugai, G., & Horner, R. H. (2009). Schoolwide positive behavior support. In W. Sailor, G. Sugai,
G. Dunlap, and R. Horner (Eds.). Handbook of positive behavior support (pp. 307-326).
New York: Springer Science + Media.
Sugai, G., Horner, R. H., & Lewis-Palmer, T. (2002). Effective behavior support: Team
implementation checklist version 2.2. Eugene, OR: Educational and Community
Supports.
Sugai, G., Horner, R. H., & Lewis-Palmer, T. (2009). Effective behavior support: Team
implementation checklist version 3.0. Eugene, OR: Educational and Community
Supports.
Sugai, G., Horner, R. H., & Todd, A. (2003). Effective behavior support: Self-assessment survey
version 2.0. Eugene, OR: Educational and Community Supports.
Sugai, G., Lewis-Palmer, T., Todd, A. W., & Horner, R. H. (2005). School-wide evaluation tool
version 2.1. Eugene, OR: Educational and Community Supports.
Walker, H. M., Horner, R. H., Sugai, G., Bullis, M., Sprague, J. R., Bricker, D., & Kaufman, M.
(1996). Integrated approaches to preventing antisocial behavior patterns among school
age children and youth. Journal of Emotional and Behavioral Disorders, 4, 193-256. doi:
10.1177/106342669600400401