UNIVERSITY OF CALIFORNIA
Santa Barbara
Improving Instructional Assistant Effectiveness in Inclusive Settings
A dissertation submitted in partial satisfaction of the
requirements of the degree of Doctor of Philosophy
in Education
by
Kimberly Beth Weiner
Committee in Charge:
Professor Michael Gerber, Chair
Professor George Singer
Professor Mian Wang
December 2010
The dissertation of Kimberly Beth Weiner is approved.
___________________________________________
Mian Wang, Ph.D.
____________________________________________
George Singer, Ph.D.
____________________________________________
Michael Gerber, Ph.D., Committee Chair
December 2010
Improving Instructional Assistant Effectiveness in Inclusive Settings
Copyright © 2010
by
Kimberly Beth Weiner
DEDICATION
This dissertation is dedicated to my family, who have shown nothing but unwavering support, enthusiasm and understanding throughout my entire journey. I
am well aware that I am quite possibly the luckiest daughter in the world to have
parents who have continually inspired me to dream big no matter what. Mum, Dad,
your support, in all of its forms, has made this dream possible. Thank you for
listening to me, loving me and pushing me to be everything that you already knew I
was.
Thank you Grammie and Papa for always thinking everything I do is so
amazing. You make me proud of who I am and what I have accomplished.
To my second family, the Patalingjugs, thank you for continually telling me
how proud of me you were. For those words of encouragement I am forever grateful.
Thank you to my girls Catie, Crystal and Brandy for laughing with me, crying
with me and reminding me to keep my head up even when life seemed impossible.
Your friendships served as a constant reminder that I was not alone. Without you I
would not have stayed grounded or sane through these last six years.
Finally, my deepest and most heartfelt thanks to my greatest champion and
defender, Jeremy, who made the choice every day to live with me through this
process. You accepted me for my craziness, nerdiness, crankiness, and
forgetfulness—I love you.
ACKNOWLEDGEMENTS
First and foremost I want to profess my deepest gratitude to my mentor and
advisor Michael Gerber. What started with a simple question evolved into a graduate
experience that was both profound and enlightening during every step of the journey.
Throughout my graduate career you listened to all of my questions and pushed me to
find the answers. My constant fretting over every decision was unfailingly met with a simple phrase by Mother Gerber that I will take with me throughout my entire life: “Don’t try to make the right decision. Make a decision
and make it the right one.” Thank you for giving me the eyes to see what the right
decisions were and having the strength and confidence to make those decisions. Your
guidance has taught me to test my own limits and challenge myself in ways that I
would have never dreamed of before.
Thank you to my committee members, Dr. George Singer and Dr. Mian Wang
for your knowledge, advice and thoughtfulness (and for wading through the murky
waters of a Masters Thesis that didn’t go exactly as planned). I can only hope that this
dissertation serves as evidence that your guidance and support have encouraged me to
be a meticulous and thoughtful researcher.
I would not and could not have completed this dissertation without the insight,
support and encouragement of Amber Moran. Thank you does not begin to
acknowledge the time and energy you invested to support the development and
implementation of this dissertation. Thank you for listening, reading, editing and
being there every time I needed you.
My thanks to my “older” mentors who showed me the path and pointed me
towards the light. Thank you Russ Lang for putting your hand out and saving me in
the second before the dissertation was about to go under. You are as inspiring as you
are intelligent; I will forever hold you in the highest regard. Thank you, April Regester,
for your immediate guidance and advice from the very first day of my doctoral
program. I frequently referred to you as my “second advisor” and will always be
grateful to you for being an amazing example of how to succeed in the program.
Thank you to all of the administrators, schools, principals, teachers and
instructional assistants that made this dissertation possible. Your dedication to
improving instructional assistant effectiveness and your willingness to allow my
fellow researchers and me into your classrooms were fundamental to the success of this
dissertation. Finally, a special thank you to the Philip and Aida Siff Foundation for
supporting my research. Your assistance has been greatly appreciated.
VITA OF KIMBERLY BETH WEINER
December 2010
EDUCATION
Behavior Analyst Certification Board
Board Certified Behavior Analyst – Doctoral (BCBA-D), 2010
University of California, Santa Barbara, Santa Barbara, Ca
Ph.D. in Education, Fall 2010
Emphasis: Special Education, Disabilities and Risk Studies
University of California, Santa Barbara, Santa Barbara, Ca
M.A. in Education, February 2008
Emphasis: Special Education, Disabilities and Risk Studies
Thesis Advisor: Prof. Michael Gerber, Ph.D. Department Chair, Fmr.
Emphasis Leader
University of California, Santa Barbara, Santa Barbara, Ca
California Teaching Certification: Education Specialist Credential,
June 2007
Emphasis: Moderate to Severe
California State University, Sacramento, Sacramento, Ca
B.A. General Psychology, 2003
Graduate of University Honors Program, 2003
CURRENT RESEARCH
University of California, Santa Barbara, Santa Barbara, Ca
Dissertation Research, Department of Education
Advisor: Prof. Michael Gerber
Fall 2009-Fall 2010
• Develop a specific training model composed of research-based components of effective professional development for instructional assistants working in inclusive settings in three local districts.
University of California, Santa Barbara, Santa Barbara, Ca
Research Apprentice/Project Assistant, Department of Education
Advisor: Prof. Michael Gerber
Fall 2007-Fall 2010
• Assist in the specification, development and field testing of a specific tutoring method designed to teach listening comprehension skills to at-risk English Language Learners.
• Provide specialized instruction in the Core Intervention Model (Gerber, 2004) and behavior management strategies to effectively teach struggling students in a small group setting.
• Co-develop hypotheses, statistical models, and data analyses to examine the relationship between specialized academic instruction, behavior management in a small group setting and student academic outcomes.
PREVIOUS RESEARCH EXPERIENCE
Research Assistant, Tangible Symbols Systems
Oregon Health & Science University, Portland, Or
Supervisor: Philip Schweigert, M. Ed.
November 2005-January 2007
• Assisted in the development and dissemination of publicity for a Tangible Symbol Systems training opportunity in the Santa Barbara community and surrounding areas.
• Attended Tangible Symbol Systems workshop in Anchorage, Alaska.
• Engaged in continual communication with researchers (in Oregon), web designers and graphic artists to ensure success of training in Santa Barbara.
• Recruited participants to attend training while coordinating location, set up and technical equipment.
• Assigned participants to groups, monitored participants throughout training and provided follow-up with offsite participants.
• Assisted in data entry and analysis.
• Conducted review of relevant literature and wrote introduction section of paper for submission.
California State University, Sacramento, Sacramento, Ca
Research Assistant, Partner Violence by Females
Supervisors: Professor Lee Berrigan, Ph.D., & Dr. Edwards
Fall 2003-Winter 2003
• Investigated partner violence committed by females against male partners.
• Analyzed data collected during structured interviews, which included such tests as the Conflict Tactics Scale.
Applied Behavior Consultants, Inc.
Research Assistant
Supervisors: Dr. Joe Morrow, Owner & Dr. Williamson, Executive
Director
Fall 2002-Spring 2002
• Investigated the time interval between when parents first noticed atypical behaviors in their children and when the child was professionally diagnosed with autism.
• Reviewed pertinent research concerning the onset of autism-like behaviors, statistically analyzed data, made representative graphs, wrote the abstract, relayed significant information to other researchers and interpreted results of significant correlations.
TEACHING EXPERIENCE
University of California, Santa Barbara, Santa Barbara, Ca
Teaching Assistant, Practicum in Individual Differences
Instructor: Dr. Michael Gerber, Department Chair, Fmr. Emphasis Leader,
Advisor
Fall 2007-Fall 2010
• Co-developed course, classroom activities and assignments designed to examine the development and use of a specific tutoring method to teach listening comprehension to at-risk English Language Learners.
• Teach specific tutoring method and evidence-based behavior management practices implemented in individual and small group settings.
• Supervise undergraduate tutors working with small groups of students at risk for academic and behavioral difficulties in general education classrooms.
University of California, Santa Barbara, Santa Barbara, Ca
Teaching Assistant, Reading and Language Arts
Instructor: Dr. Ann Lippincott, Associate Director, Teacher Education
Program
Fall 2007
• Assisted in development of course, classroom activities and assignments for general and special education pre-service teachers.
• Provided continual feedback to students regarding lesson plan development.
• Facilitated general and special education collaboration.
University of California, Santa Barbara, Santa Barbara, Ca
Student Teacher, Teacher Education Program
Supervisors: April Regester, M.A. and Dr. Joanne Singer
Summer 2006-Summer 2007
• Served as Teacher Assistant for SH and Mild/Moderate classes.
• Developed and trained staff in continual progress monitoring for academic and behavioral interventions.
• Trained staff to provide effective academic instruction, maintenance and generalization of instruction.
• Participated in grade level curriculum and team meetings, parent conferences, IEP meetings, speech and related service meetings and field trips.
• Trained staff to facilitate inclusion in a kindergarten classroom.
INTERNSHIPS
Santa Barbara County Education Office, Santa Barbara, Ca
Intern
Supervisor: Dr. Florene Bednersh, Assistant Superintendent of Special
Education
Spring 2008-Summer 2008
• Observed the assistant superintendent’s duties, responsibilities, management and leadership styles in coordinating special education services for students with disabilities for twenty-three school districts in the county.
• Participated in and observed Joint Powers Agency meetings, Support Team meetings, Coordinating Committee meetings, Superintendents meetings, teacher trainings and evaluations, Cabinet meetings, psychologist interviews, Special Education Administrators of County Offices (SEACO) meetings, California Beginning Teacher Support and Assessment (BTSA) meetings, Nurse Day conference, and Job Alike meetings.
Harding Elementary School, Santa Barbara, Ca
Intern
Supervisor: Dr. Sally Kingston, Principal
Fall 2007-Winter 2008
• Observed principal’s duties, responsibilities, and leadership skills throughout the school day to gain understanding of the principal’s role in an elementary school setting.
• Participated in and observed Student Study Team meetings, IEP meetings, professional development for teachers, student discipline, collaboration between service providers and teachers, reading intervention groups, yard supervision, teacher supervision, new employee training, school and district collaboration, teacher discipline, and grade level meetings.
TRAININGS PROVIDED
Santa Barbara County Special Education Local Planning Agency
(SELPA), Santa Barbara, Ca
Autism/Behavior Specialist
Fall 2008-Summer 2010
• Provide trainings for instructional assistants, teachers, related therapists and professionals SELPA-wide and for individual districts. Trainings have been provided on the following research-based topics: autism, Positive Behavior Supports, DTT, incidental teaching, social skills, prompting, differential reinforcement, data collection, academic supports, adaptations and modifications for students with disabilities, reinforcement procedures, classroom management and communication systems.
University of California, Santa Barbara, Santa Barbara, Ca
Behavior Management Training
Fall 2007, Winter 2008, Spring 2008
• Provided Behavior Management Training for undergraduate tutors working in a small group setting with students at risk for reading failure. Training focused on setting up individual and group contingencies for students in order to maximize instruction time in small groups. Training targeted several research-based strategies and techniques useful in reducing problematic student behavior interfering with learning.
Harding Elementary School, Santa Barbara, Ca
Behavior Management Training for Reading Intervention
Specialists
Fall 2007
• Provided Behavior Management Training for Reading Intervention Specialists working with a variety of students in individual and small group settings. Training focused on individual and group contingencies for students in order to maximize instruction time in small groups. Specialists learned how to incorporate Behavior Management strategies and techniques into instruction to maximize learning time.
OTHER EXPERIENCE
Santa Barbara County Special Education Local Planning Agency
(SELPA), Santa Barbara, Ca
Autism/Behavior Specialist
Supervisor: Dr. Jarice Butterfield, SELPA Director
2008-2010
• Provide behavioral training and programming based on the principles and procedures of Applied Behavior Analysis for 23 school districts and the Santa Barbara County Education Office.
• Provide district and SELPA-wide trainings for instructional assistants, teachers, related professionals and parents.
• Develop behavior support plans and conduct and/or supervise Functional Analysis Assessments.
• Oversee the development and implementation of behavioral programs while coordinating with school personnel and related service providers.
Regester Supported Living, Santa Barbara, Ca
In-Home Support Service Provider
Supervisor and owner: April Regester
2005-2007
• Provide in-home and community assistance to a 31-year-old male diagnosed with Autism.
• Promote independence and self-determination in everyday activities.
• Actively involved in person-centered planning to create individualized, natural and creative supports based on the individual’s requests, preferences and needs.
Applied Behavior Consultants, Inc. Sacramento, Ca
Transition Teacher
Supervisor: Megan Manternach
2004-2005
• Taught students diagnosed with Autism and related disorders and behavioral difficulties.
• Trained in Skinner method of ABA therapy.
• Assessed student skill level and identified goals necessary for transition to least restrictive environment.
• Interviewed, hired, trained and managed 6 behavioral technicians.
• Communicated regularly with Behavior Analyst, district representatives, receiving teacher and parents concerning proposed placements.
• Taught implementation of current behavior plans and helped to make necessary revisions and modifications to fit new environment and ensure success.
Applied Behavior Consultants, Inc. Sacramento, Ca
Senior Behavior Consultant
Supervisor: Megan Manternach/Dr. Joseph Morrow
7/2004-10/2004
• Supervised the in-home and community programs of students with Autism.
• Ensured program goals, behavior interventions and lesson plans were individualized to the child’s needs, correctly implemented and behaviorally sound.
• Monitored behavior intervention plans for effectiveness through continual data collection and review.
• Provided continual training and feedback to Behavior Consultants while coordinating necessary changes based on data analysis and student progress.
• Troubleshot in conjunction with Behavior Consultants to remedy lack of progress and ascending trends in academic and behavioral areas.
Applied Behavior Consultants, Inc. Sacramento, Ca
Teacher
Supervisor: Dr. Joseph Morrow
2003-2004
• Taught students with Autism in a private school setting.
• Trained in Skinner method of ABA therapy.
• Interviewed, hired, trained and managed 6 behavioral technicians.
• Wrote and trained implementation of first, second and third degree behavior plans along with emergency intervention procedures.
• Coordinated with related service providers to ensure consistency across environments.
• Trained new employees to be competent in ABA through individual and continual education.
Applied Behavior Consultants, Inc. Sacramento, Ca
Behavior Technician
1999-2003
• Worked with students diagnosed with Autism and behavioral difficulties in private school setting.
• Implemented Skinner method of ABA therapy with students of differing abilities and ages.
• Implemented PECS with students while working on oral motor activities to promote verbal communication.
VOLUNTEER WORK
Santa Barbara, Ca
Behavior Consultant
2006-2007
• Provided behavioral consulting at Harding Elementary School and local Head Start Programs. Consulted and assessed the needs of individuals displaying inappropriate behaviors. Provided information on strategies and techniques to reduce inappropriate student behavior interfering with learning.
CONFERENCE PARTICIPATION
Bridging the Gap between Research and Practice: A Conference
for Teachers of Students with Learning Disabilities
Presenter
October 2008
Provided a workshop session on a previously researched small group tutoring method and individual and small group Behavior Management strategies for teachers, clinicians, and administrators working with students who have Learning Disabilities.
AWARDS
• Thomas Haring Fellowship Award, University of California, Santa Barbara, 2010
• Philip & Aida Siff Education Foundation Fellowship, Dean’s Scholar recipient, 2007-2008
• Teacher Education Program University Block Grant, University of California, Santa Barbara, 2006-2007
• Gevirtz Graduate School of Education Department of Education Block Grant, University of California, Santa Barbara, 2006-2007
• Gevirtz Graduate School of Education General Fee Fellowship, University of California, Santa Barbara, 2006-2007
• Gevirtz Graduate School of Education Department of Education Block Grant, University of California, Santa Barbara, 2005-2006
• Gevirtz Graduate School of Education Department of Education Emergency Block Grant, University of California, Santa Barbara, 2005-2006
AFFILIATIONS
Council of Exceptional Children
o Council of Children with Behavioral Disorders
o Teacher Education Division
•
National Autism Society, Santa Barbara Chapter (Student
Affiliate)
•
ABSTRACT
Improving Instructional Assistant Effectiveness in Inclusive Settings
by
Kimberly Beth Weiner
As of 2007, 718,119 instructional assistants were employed in the United
States (National Center for Education Statistics, 2009b). Of those instructional
assistants, 373,466 were classified as full-time special education instructional
assistants (Data Accountability Center, 2009a). As the employment of instructional
assistants continues to grow, particularly in special education, so does the inclusion of
students with disabilities into general education classrooms (U.S. Department of
Education, Office of Special Education and Rehabilitative Services, & Office of
Special Education Programs, 2005). Although much of the United States continues to
see increases in both the employment of instructional assistants and the inclusion of
students with disabilities, existing training protocols do not adequately prepare
instructional assistants to support these students (Causton-Theoharis & Malmgren,
2005; Giangreco, Broer, & Edelman, 2002; Petscher & Bailey, 2006; Pickett, Likins,
& Wallace, 2003; Schepis, Reid, Ownbey, & Parsons, 2001). Effective and efficient
instructional assistant training is critically needed to reduce the detrimental effects of
inexperienced and untrained instructional assistants on students with disabilities in general
education classrooms.
TABLE OF CONTENTS
Chapter 1: Introduction…………………...…………………………………..……..1
Evolution of the Instructional Assistant……………………………..…..…….3
Instructional Assistant Statistics…………………………………………....…5
Inclusion……………………..………………………………………………..7
Legislation and Instructional Assistants……………..……………………....10
Instructional Assistant Training and Outcome…..…………...…......……….13
Chapter II: Literature Review……..…………………………………………………20
Instructional Assistant Professional Development…………..…...............….25
Duration of Professional Development…………..……………………….….32
Instructional Assistant Instructional Support Behaviors……………...….......34
Student Academic Engagement…………………..…………………….……60
Training Recommendations………………………..……...…………………68
Major Research Questions and Related Hypotheses………...………………68
Chapter III: Method….................................................................................................70
Districts……………………………..……………………..…………..……..70
Instructional Assistant Employment, Skills and Requirements…..………….74
Pre-Service Training and Orientation Procedures………………..………….76
Participants……………………………...……………………………………79
Experimental Design…………………………...…………………...….…….83
Measuring Instruments……………………………………………………….84
Procedures……………………………...………………………...…………..95
Chapter IV: Results…………………………………………...……….....…………112
Chapter V: Discussion………………………………………….…...…...…………152
References……………………………….………………………………………….200
Appendices…………..……………………………………………………….……..227
LIST OF FIGURES
Figure 1. Pretest and Posttest Total Correct Vignettes by Group…………..………131
Figure 2. Pretest to Posttest Total Observable Behaviors by Group……………..…135
LIST OF TABLES
Table 1. Vignette Type, Answer and Description………………………….......…….87
Table 2. Instructional Assistant Demographics…………..……………………...…113
Table 3. Student Participant Demographics……………..………………...........…..116
Table 4. Correlations of Total Observable Behaviors at Pretest for
All Participants………………………………..……………………………..……...118
Table 5. Correlations of Total Observable Behaviors for Treatment Participants
at Posttest………………………..…………………………………………………120
Table 6. Correlations of Total Observable Behaviors for Control Participants
at Posttest……………………..………………………………………….…………122
Table 7. Correlations of Pretest Cognitive and Behavioral Measures for
All Participants……………………..…………………………………….…………125
Table 8. Correlations between Posttest Vignettes and Observed Behaviors for
Treatment Participants………………..…………………………………….………126
Table 9. Correlations between Posttest Vignettes and Observed Behaviors for
Control Participants………………………..…………………………….…………127
Table 10. Descriptive Statistics for Pretest and Posttest Correct Vignettes
by Group……………..…………………………………………………………..…129
Table 11. Repeated Measures Analysis of Variance of Total Cognitive
Measure………………..……………………………………………………………129
Table 12. Descriptive Statistics for Pretest and Posttest Total Observable Behaviors
by Group…………………………..…………………………………………..……134
Table 13. Repeated Measures Analysis of Variance for Main Effects and Interaction
Effects of Time and Group………………………………...……………...……..…134
Table 14. Treatment Group Differences from Posttest to Maintenance for
Total Behaviors Observed……………………………………………………..……137
Table 15. Means, Medians, Gain Scores and Effect Sizes for Observable
Effective Behaviors……………………………………………………..………..…140
Table 16. Means, Medians, Reduction Scores and Effect Sizes for Ineffective
Observable Behaviors………………………………………………………...…….143
Table 17. Summary of Standard Multiple Regression Analysis for Variables
Predicting Posttest Total Behavioral Measure……………………………………...146
Table 18. Summary of Standard Multiple Regression Analysis for Variables
Predicting Posttest Total Cognitive Measure…………………………………….…147
Table 19. Summary of Standard Multiple Regression Analysis for Variables
Predicting Student Academic Gain………………………………………....………148
LIST OF APPENDICES
Appendix A. Instructional Assistant Survey……………………..…………………227
Appendix B. Instructional Assistant Vignettes.……………..……………….……..230
Appendix C. Direct Observation Instrument…………..…………………………...242
Appendix D. Final Interview for Instructional Assistants…………..……………...243
Appendix E. Instructional Assistant Social Validity Questionnaire..……………....244
Appendix F. Observation Questionnaire..………………………………..…………246
Appendix G. 1st Coaching Form……………………..…………………..…………247
Appendix H. 2nd Coaching Form……………………………………..……….……249
CHAPTER 1
INTRODUCTION
Currently, the terms paraprofessional, instructional aide, instructional
assistant, paraeducator and teacher’s aide are used interchangeably to describe
personnel who 1) assist with the delivery of instruction and other direct services to
individuals in need and 2) work under the direct supervision of a licensed or certified
professional who is responsible for developing, implementing and assessing student
educational programs (Downing, Ryndak, & Clark, 2000; Katsiyannis, Hodge, &
Lanford, 2000; Pickett & Gerlach, 2003). Over the past two decades, instructional
assistants have become an essential component of the educational landscape. This is
most evident in the general education classroom, where instructional assistant support
is often the only way to facilitate the successful inclusion of a student with disabilities
(Giangreco & Broer, 2007; Giangreco, Edelman, & Broer, 2001).
This movement toward inclusive education has drastically altered the role of
the instructional assistant, from clerical assistant to direct instructor and moderator of student behavior, in order to facilitate the inclusion of students with disabilities in the
general education classroom (Giangreco & Broer, 2005a; Giangreco, Broer, &
Edelman, 2001b; Jones & Bender, 1993; Pickett & Gerlach, 2003; Riggs & Mueller,
2001). In response to this expansion of the instructional assistant’s roles and
responsibilities, recent literature underscores the importance of effective and efficient
training for this class of personnel (Causton-Theoharis & Malmgren, 2005;
Giangreco, Broer et al., 2001b; Giangreco, Smith, & Pinckney, 2006; Pickett et al.,
2003; Riggs & Mueller, 2001; Wadsworth & Knight, 1996; Young, Simpson, Myles,
& Kamps, 1997). Unfortunately, both pre-service and in-service trainings tend to be
scarce, often allowing the least qualified personnel to provide the majority of
instruction to students who demonstrate the greatest learning challenges (Brown,
Farrington, Knight, Ross, & Ziegler, 1999; Giangreco, 2009; Giangreco & Broer,
2005b; Giangreco, Broer, & Edelman, 1999; Giangreco, Broer et al., 2001b;
Giangreco et al., 2006).
In the late 1990s, researchers began to document the inadvertent detrimental
effects of untrained instructional assistants who provide one-on-one support for
students with disabilities in a general education classroom. Well-documented
negative effects include unnecessary dependence on support personnel, separation
from and interference with peer interactions, stigmatization, restricted access to
proficient instruction from a qualified teacher and interference with teacher
interactions (Giangreco et al., 2002; Giangreco, Edelman, Broer, & Doyle, 2001;
Giangreco, Yan, McKenzie, Cameron, & Fialka, 2005; Malmgren & Causton-Theoharis, 2006; Marks, Schrader, & Levine, 1999). These findings cast doubt on
instructional assistants’ ability to facilitate student independence in a general
education classroom.
Equally concerning is the lack of research on the relationship between
instructional assistant use and student outcome, despite abundant literature on
instructional assistants’ roles, responsibilities, and attitudes (Boomer, 1994; Doyle,
1995; French, 1999a, 1999b; Giangreco, Broer, & Edelman, 2001a; Giangreco,
Edelman, Luiselli, & MacFarland, 1997; Marks et al., 1999; McKenzie & Lewis,
2008; Pickett & Gerlach, 2003; Pickett et al., 2003). This gap in the literature is
alarming, considering that instructional assistants have reported spending a majority
of the student’s day providing direct instruction to the student and making
instructional decisions without the oversight of the general education teacher
(Downing et al., 2000; Giangreco & Broer, 2005b; Riggs & Mueller, 2001).
Evolution of the Instructional Assistant
Although there continues to be little consensus regarding the exact role of the
instructional assistant (Giangreco, Broer et al., 2001b; Jones & Bender, 1993; Pickett
et al., 2003; Riggs & Mueller, 2001), several factors have clearly contributed to the
reliance on instructional assistants to provide both behavioral support and direct
instruction to students with disabilities in general education classrooms since the
1950s. The literature suggests that these factors include changes in federal
legislation, spurred by the lack of qualified teachers during WWII; the inclusion of
students with disabilities into less restrictive, general education settings; and the
demand for individualized instruction and compensatory education for students from
impoverished backgrounds. Other contributors include the growing number of
Limited English Proficient Learners, a shortage of highly qualified special education
teachers, increased special education teacher caseloads, and the cost-effectiveness of
utilizing instructional assistants (Giangreco, Edelman, Broer et al., 2001; Giangreco
et al., 2006; Katsiyannis et al., 2000; Pickett et al., 2003).
During the 1950s, the shortage of qualified teachers created by WWII, combined with increasing pressure from parents and advocates to create community-based services for individuals with disabilities, led school boards and administrators to begin employing instructional assistants to assist teachers with clerical and administrative tasks (French & Pickett, 1997; Jones & Bender, 1993). This arrangement continued until the sixties and seventies, when the role of the instructional assistant was transformed by several changes in legislation, including the federal Title I and Head Start programs.
Title I and Head Start sought to employ and train instructional assistants to
provide support to students who came from economically or educationally
disadvantaged backgrounds (French & Pickett, 1997). Instructional assistants from
similar linguistic backgrounds served as bilingual teaching assistants and liaisons
between home and school (Pickett, Likins, & Wallace, 2003). In 1975, the passage of Public Law 94-142, the Education for All Handicapped Children Act, brought about several changes in how schools served students with disabilities. In order to provide a free, appropriate public education in the least restrictive environment possible, the law ensured that students who were in need of special education services had an Individualized Education Program detailing the personalized services required to ensure the most effective education (Pickett & Gerlach, 2003). In order to provide
individualized instruction and ample support to these students, general and special
education teachers required additional classroom support (French & Pickett, 1997;
Pickett et al., 2003); therefore, the employment and utilization of instructional
assistants gained significant momentum (Pickett & Gerlach, 2003).
Although instructional assistants continued to assist with housekeeping tasks,
the role and responsibilities of special education instructional assistants have
expanded over the last thirty years to include instructional and direct services
(Giangreco et al., 2001; Jones & Bender, 1993; Pickett et al., 2003). This was meant
to increase student independence, assist with the transition from school to work, and
facilitate the inclusion of students with disabilities into their general education
classrooms (French, 1998). Currently, some of the primary roles of instructional
assistants include: a) providing direct instruction in academics, b) managing student
behaviors, c) assisting with personal care, d) facilitating social interactions between
peers, e) collecting student data and f) supporting students in general education
classrooms (French, 1999b).
Instructional Assistant Statistics
Although there is evidence to substantiate the claim that employment of
instructional assistants is rising, past and current statistics must be interpreted with
caution (Pickett et al., 2003). One of the biggest hindrances to understanding
instructional assistant employment rates throughout the years is the way that data is
collected by both federal and state agencies. Lack of a systematic and widely adopted
approach to collecting instructional assistant data at the federal and state level,
coupled with the lag time in producing relevant data, creates a high degree of
variability, depending on the data source (Pickett & Gerlach, 2003; Pickett et al.,
2003). It is also sometimes difficult to determine what type of program an
instructional assistant works in, such as general education, special education, Title I,
English Language Learner or early childhood and transition services (Pickett et al.,
2003). Rather than these particulars, the data typically provides only the yearly
employment of full-time instructional assistants in the United States and, in some
cases, by state. For this reason, the data should be interpreted with caution, as the variation in definitions and diverse interpretations can lead to under- or over-reporting (Giangreco, Hurley, & Suter, 2009).
Databases such as the National Center for Education Statistics (NCES) (2009) provide statistical information on the yearly employment of instructional assistants in the U.S. and, beginning in the early 1990s, in each state. In the 1969-70 school year, 57,418 instructional assistants were employed in the U.S. By 1980, only ten years later, that number had risen to 325,755 due to critical changes in legislation.
In 1990, the employment of instructional assistants continued to increase to 395,959.
In 1995, only five years later, employment increased by almost 100,000 to 494,289.
By the year 2000, the number had grown to 641,392 instructional assistants employed
in the U.S., with 63,852 of that number employed in California (NCES, 2009). As of
2007, 718,119 instructional assistants were employed throughout the U.S., with
373,466 of this number serving as full-time special education instructional assistants
in the United States and 65,846 in California alone (Data Accountability Center,
2009a; National Center for Education Statistics, 2009a). Although recent years have
yet to be reported, it can be assumed that instructional assistant employment will
continue to rise.
With the increase of instructional assistant employment came an increase in
the number of instructional assistants supporting students with disabilities in the
general education setting. The percentage of students educated in general education
classrooms for a majority of the day (80 percent or more) rose from 45.3 percent in
1995 to 57.21 percent in 2007 (Data Accountability Center, 2009b; U.S. Department
of Education, 2006). In California specifically, as of 2007, 52.35 percent of students
with disabilities were included in their general education classrooms at least 80
percent of the day (Data Accountability Center, 2009b). Although the number of full-time instructional assistants in the U.S. in 2007 has yet to be reported, statistics indicate that nearly 56.84 percent of students with disabilities spend 80 percent or more of their day in the general education classroom (Data Accountability Center, 2009b).
Although there are discrepancies in the literature regarding the actual growth
rate of instructional assistants since the 1950s, it is clear that there has been and
continues to be a significant increase in the employment of instructional assistants in
the U.S. due to the increase in student enrollment. The U.S. Bureau of Labor
Statistics (2010) foresees a 10 percent increase in the employment of instructional
assistants from 2008-2018, particularly due to the number of special education and
Limited English Proficient students requiring specialized services and supports.
Inclusion
This upward trend in instructional assistant employment can be partially
explained by the increased inclusion of students with disabilities in general education
classrooms. The 1997 reauthorization of the Individuals with Disabilities Education Act (IDEA) included amendments ensuring that students with disabilities had access to a free, appropriate public education (Yell, 2006). Specifically, the reauthorization ensured that students with disabilities would be included to the maximum extent appropriate in general education settings with typically developing peers. It
mandated that removal from the regular educational environment “occur only when
the nature or severity of the disability of a child is such that education in a regular
class with the use of supplemental aids and services cannot be achieved satisfactorily”
(Section 1412 (a) (5)).
Statistics from the Data Accountability Center (2009) indicate that as of 2008,
671,095 students in California were identified as disabled and being served under
IDEA. Of those students, 435,326 were included in a general education classroom for
at least 40 percent of the school day. In the effort to provide an appropriate education
for students with disabilities in this setting, much responsibility has fallen on the
instructional assistant (Downing et al., 2000; Giangreco et al., 1997; Hughes & Valle-Riestra, 2008; Marks et al., 1999). One potentially problematic aspect of this shift is that the general education setting differs from the special education setting, both
in structure and staffing. For example, instructional assistants who work in the
general education classroom are not under the direct supervision of a special education teacher, as they typically would be in a special education classroom
(Downing et al., 2000; Giangreco, Backus, CichoskiKelly, Sherman, & Mavropoulos,
2003).
Research on the supervision and management of instructional assistants
indicates that in most cases the special education teacher supervises the instructional
assistant. Although this remains the most common current practice, the literature
suggests that these special education teachers are not adequately prepared to take on a
supervisory role, due to a lack of pre-service training, professional development
opportunities and necessary ongoing support (French, 1998; French, 2001; Pickett &
Gerlach, 2003). Regardless of their readiness to supervise, teachers also contend with
other hindrances, such as the undefined, evolving role of the instructional assistant
and insufficient time to effectively train instructional assistants (Wallace, Shin,
Bartholomay, & Stahl, 2001).
This lack of supervision for instructional assistants is even more apparent in
the inclusive classroom. Special education teachers can rarely accompany an
instructional assistant into the general education setting to provide ongoing
supervision, as they might in the self-contained special education classroom.
Although the general education teacher is the classroom leader, it cannot be assumed
that general education teachers can provide adequate guidance to the instructional
assistant; they may lack the specific training and knowledge required to help an
instructional assistant carry out a student’s individual educational program (Downing
et al., 2000; Giangreco et al., 1997). Therefore, general education teachers have come
to rely on or require instructional assistants to accompany students with disabilities in
the general education classroom (Causton-Theoharis & Malmgren, 2005; Egilson &
Traustadottir, 2009; Giangreco & Broer, 2007; Giangreco, Broer et al., 2001b).
It remains unclear how instructional assistants are to be trained to effectively
support a student with disabilities in the general education classroom. As a result of
this uncertainty, instructional assistants assume the responsibility of making
important decisions about a student’s educational program (Downing et al., 2000).
The lack of well-designed, effective and efficient pre-service and in-service trainings
(Downing et al., 2000; French, 2001; Giangreco et al., 2002; Giangreco et al., 2006;
Passaro, Pickett, Latham, & Hongbo, 1994; Wadsworth & Knight, 1996) is an issue
cited by numerous authors, who continually reiterate the concern that the least
qualified personnel are assigned to provide the majority of instruction and assistance
to students with the most challenging educational needs (Brown et al., 1999;
Giangreco & Broer, 2005b; Giangreco et al., 1999; Giangreco, Broer et al., 2001b;
Giangreco et al., 2006).
Legislation and Instructional Assistants
In response to the drastically increased reliance on instructional assistants to
provide support and direct instruction for students with disabilities, federal laws have
attempted to incorporate safeguards to ensure that instructional assistants are provided
with adequate training. In 1997, in addition to ensuring that students with disabilities
were educated in the least restrictive environment, the reauthorization of the
Individuals with Disabilities Education Act (IDEA) included provisions to ensure that
instructional assistants receive appropriate training in instruction and related
educational services. The 1997 IDEA mandates instructional assistants “who are
appropriately trained and supervised in accordance with state law, regulations, or
written policies to assist in the provision of special education and related services for
children and youth with disabilities” (Pickett & Gerlach, 2003, p. 49). In order for
instructional assistants to carry out such duties, training is not just essential but
mandatory.
In an attempt to ensure that all instructional assistants are proficient in
providing instructional and behavioral support for students with disabilities, the 1997
IDEA reauthorization also included a provision requiring instructional assistants who
support children with disabilities to be provided with in-service and pre-service
preparation. Each state must develop a plan including policies and procedures to
ensure that instructional assistants are sufficiently trained and that the training is
consistent with state-approved or state-recognized certification or licensure
(Katsiyannis et al., 2000). Unfortunately, the law is unclear regarding how to
“appropriately train and supervise” instructional assistants, leaving the specifics of
these procedures up to state and local education agencies (Giangreco et al., 2003).
As of 2009, the state of California does not have a credential or licensing
program for instructional assistants. The result is a system with no current standards defining instructional assistants’ and teachers’ roles, or the skill sets and protocols instructional assistants must master at the pre-service and in-service levels (Pickett et al., 2003).
In addition to IDEA, The No Child Left Behind Act of 2001 (NCLB) sought
to address the employment standards and supervision of instructional assistants. More
specifically, amendments were made regarding the employment of and preparation of
instructional assistants, detailing specific tasks that an instructional assistant can
perform and how instructional assistants should be supervised (Pickett & Gerlach,
2003; Pickett et al., 2003). NCLB set out the minimal qualifications an instructional assistant must meet in order to be employed under this title: instructional assistants employed after January 2001 must either 1) complete two or more years of study at an institution of higher education, 2) obtain an associate or higher degree, or 3) meet a rigorous standard of quality demonstrated through a formal state or local assessment of knowledge in reading, writing, and mathematics [20 U.S.C. 119 (1)(c)(1)]. Currently
employed instructional assistants must also meet these requirements within four years
of the law’s enactment.
The duties an instructional assistant can perform are clearly defined in the
2004 NCLB. They include the following tasks: 1) providing one-on-one instruction and assisting with behavior management, 2) serving as a translator, and 3) performing assistive roles in various environments such as the computer lab, library and media center. Although instructional assistants are permitted to provide academic instruction
and behavioral support under the direct supervision of a teacher, it is the legal
responsibility of the teacher to plan, ensure proper implementation, and evaluate a
student’s educational program (Pickett et al., 2003). In order to ensure that
instructional assistants are delivering appropriate instruction, NCLB requires that they
provide instructional services to students with disabilities under the direct supervision
of a teacher. This requires the supervising teacher to use the results of formal and
informal assessments to determine student needs. The supervising teacher must also
develop lesson plans and corresponding activities aligned with the curriculum, make
the necessary modifications and adaptations to the curriculum, deliver or oversee
instruction, and ultimately evaluate student performance (Etscheidt, 2005; Pickett &
Gerlach, 2003). Although instructional assistants can assist with these duties, the duties themselves can at no time be delegated to an instructional assistant (Pickett & Gerlach, 2003).
In 2004, the Individuals with Disabilities Education Act (IDEA) was again reauthorized
to address the training and supervision of instructional assistants. The reauthorization
proved to be similar to the 1997 reauthorization, with provisions to ensure
instructional assistants were appropriately prepared to assist with the delivery of early
intervention, special education and related services, while also ensuring instructional
assistants met the state requirements of employment in terms of certification and
licensure (Etscheidt, 2005). This was intended to ensure that instructional assistants
could help provide special education services to children with disabilities (Deardorff,
Glasenapp, Schalock, & Udell, 2007). In order to ensure that personnel are able to
effectively serve students with disabilities, states are required to develop instructional
assistant standards, training protocols and documentation requirements. Although this
law was enacted in 2004, states continue to struggle to find relevant and cost-effective ways to train instructional assistants for their ever-expanding duties (Deardorff et al., 2007).
Instructional Assistant Training and Student Outcome
Despite these laws, instructional assistant pre-service and in-service training
has been either non-existent or minimal, with dismal effects (Giangreco, Edelman,
Broer et al., 2001). This suggests instructional assistants are neither trained nor
prepared to support students with disabilities (Deardorff et al., 2007; Giangreco et al.,
1999; Petscher & Bailey, 2006; Pickett & Gerlach, 2003; Pickett et al., 2003). As
stated by Riggs and Mueller (2001), lack of appropriate training is cause for great
concern, especially in light of their research, which indicates that instructional
assistants spent more than 50 percent of their time delivering direct instruction to
students with disabilities. Giangreco and Broer (2005) found that close to 70 percent
of special education instructional assistants in 12 inclusive schools in Vermont made
instructional and curricular decisions without the oversight of the general or special
education teacher. Research by Downing, Ryndak and Clark (2000) also indicates
that instructional assistants make a majority of decisions regarding a student’s
educational program without the supervision of a qualified teacher. Marks, Schrader
and Levine (1999) also point out that instructional assistants have reported being in charge of developing and adapting the curriculum and assuming the role of the primary educator in order to provide one-on-one instruction. In their study, one
instructional assistant reported that it was their responsibility to make certain the
student kept up academically, an extremely difficult task for an instructional assistant
who has little or no training.
According to Jones and Bender (1993), it is common for instructional
assistants to start working in the classroom without proper training or preparation to
support students with disabilities. As many as 70-90 percent of instructional assistants
begin working without any initial training (Council for Exceptional Children, 2001).
Instructional assistants acknowledge they are typically “thrown into” their jobs
without an understanding of the student’s disability, how the disability impacts the student’s performance in the classroom, and what the student’s IEP goals are (Giangreco, Edelman, & Broer, 2001).
When pre-service training is offered, instructional assistants are sometimes
required to attend a teachers’ in-service, which typically does not align with the
training needs of the instructional assistant (Giangreco et al., 2002; Pickett et al.,
2003; Riggs & Mueller, 2001). Yet it is well-documented by many researchers that
pre-service training of instructional assistants, especially those working in inclusive
settings, is essential (Blalock, 1991; French, 1998; Marks et al., 1999; Pearman,
Huang, & Mellblom, 1997; Pickett et al., 2003; Riggs & Mueller, 2001).
Research indicates that most instructional assistants learn on the job and from
teachers and other instructional assistants (Deardorff et al., 2007; Giangreco et al.,
2002; Giangreco et al., 1997; Katsiyannis et al., 2000; Riggs & Mueller, 2001).
Giangreco and Broer (2005) surveyed instructional assistants and found that special
education instructional assistants typically received minimal training and supervision
by the special educator; on average, less than 2 percent of the special educator’s time
was given to critical training and supervision practices. What has been considered “on-the-job training” consists of instructional assistants talking with or shadowing other instructional assistants (Giangreco et al., 1997).
Unfortunately, this type of training results in untrained personnel providing support
and guidance for new instructional assistants (Riggs & Mueller, 2001).
While some instructional assistants found this training valuable, there is still a
need for more explicit and systematic instructional assistant training to competently
support students with disabilities in inclusive settings (Pickett et al., 2003; Riggs &
Mueller, 2001). Downing, Ryndak and Clark (2000) found that the strategies instructional assistants used in place of on-the-job training, such as teaching themselves about the student, watching others, and recalling their own school experiences, were unsuccessful.
In some cases, in-service trainings increase the knowledge and skill sets of currently employed instructional assistants. However, Pickett, Likins and Wallace (2003) note that, like pre-service trainings, in-service trainings are highly variable, lack appropriate focus, fail to target the specific needs of instructional assistants and are
rarely competency-driven. These training opportunities also fail to provide adequate
explicit instruction in the areas in which many instructional assistants work, such as
reading, language arts, math, communication, social behavior and living skills
(Giangreco et al., 2002).
As Malmgren and Causton-Theoharis (2006) acknowledge, the need for continual in-service training is particularly important when the academic or behavioral support of the instructional assistant begins to interfere with the goals of the student. This sentiment is echoed by other researchers who indicate a need for in-service trainings that allow for ongoing monitoring and on-the-job feedback (Downing et al., 2000; Pickett & Gerlach, 2003).
Historically, a common type of in-service occurs during one-day workshops.
Even though a decade of research demonstrates the ineffectiveness of this type of
training, its continual use can be attributed to its cost and time-efficient design
(Darling-Hammond, Wei, Andree, Richardson, & Orphanos, 2009; Lerman,
Vorndran, Addison, & Kuhn, 2004). Unfortunately, the disadvantages of these “one-shot” workshops significantly outweigh the benefits. One-day workshops are
fragmented, limited in time, and disconnected from classroom practices (Garet,
Porter, Desimone, Birman, & Yoon, 2001; Yoon, Duncan, Lee, Scarloss, & Shapley,
2007).
As made evident in the literature, instructional assistants require training that
is both efficient and effective in order to perform the various tasks required to support
a student with disabilities in a general education classroom. Though they may appear
efficient at first glance, one-day workshops rarely meet the specific training needs of
the instructional assistant. For this reason, specific and explicit training on basic
instructional assistant skills must be devised to decrease the likelihood of inadvertent
detrimental effects on students with disabilities. Research has yet to establish training
that is cost-effective and time-efficient and that allows for transferability and retention in the
general education classroom.
An equally important, unanswered question is whether changes in
instructional assistant behavior (as a result of training) will positively affect student
outcome. Current literature indicates that little is known about the short- and
long-term effects of instructional assistant training on student performance, although
an increased reliance on special education instructional assistants has generally been
viewed as a logical, beneficial and economical resource (Myles & Simpson, 1989;
Young et al., 1997). With the growing requests made by parents to have an
instructional assistant accompany their child to general education (Giangreco et al.,
2005), school districts can no longer afford to agree to this level of support without
understanding both the benefits and drawbacks. This is made particularly evident by
researchers French and Cabel (1993), Kaplan (1987) and Young et al. (1997), who,
after careful observation, acknowledge that instructional assistants are typically the
main contact for students with disabilities, taking charge of the student’s instruction
and learning in an inclusive setting.
In more recent literature, a small number of researchers are beginning to cite
the critical need for research on the impact of instructional assistants on students with
disabilities, particularly in the general education class. Young, Simpson, Myles and Kamps (1997), for example, strongly advocate the need for additional research on
how to effectively train and guide instructional assistants to facilitate the successful
inclusion of students with disabilities into general education settings. This line of
research is particularly important; earlier literature suggests that both the over-reliance on instructional assistants and their lack of training may hinder student
growth and development (Boomer, 1994; Hall, McClannahan, & Krantz, 1995). Tews
and Lupart (2008) also acknowledge a growing need for research examining
instructional assistants’ ability to produce a positive effect on student outcome,
particularly as instructional assistants assume greater responsibilities. These findings
warrant additional research on the effects of instructional assistant training on student
outcome.
CHAPTER II
LITERATURE REVIEW
For many years researchers have attempted to clarify what constitutes
effective professional development for teachers and how to transfer and sustain these
practices in the classroom setting (Quick, Holtzman, & Chaney, 2009).
Understanding the components of professional development is of particular
importance, as high-quality professional development has been linked to increased
student and teacher performance (Darling-Hammond et al., 2009; Yoon et al., 2007).
Although several studies have been conducted over the past decade to establish what
effective professional development looks like, a small number of recently published
studies have provided the field of education with a research-based conceptual model
of effective professional development.
In 2001, a large-scale study was conducted by Garet, Porter, Desimone,
Birman and Yoon with the goal of increasing both teacher effectiveness and student
learning by ensuring teachers were exposed to effective, efficient and continual
professional development. The authors acknowledged that although the literature
recognizes common components of effective professional development, evidence of
the relationship between these components and student and teacher outcomes is rarely
provided. In order to determine which components led to positive outcomes, the
authors used data from a national evaluation of the Eisenhower Professional
Development Program, a model grounded in an extensive body of literature and
acknowledged as a reliable source regarding professional development (Quick et al.,
2009).
After an extensive review of the literature, the authors developed measures
based on the literature’s “best practices” regarding professional development in both
Mathematics and Language Arts. What Works Clearinghouse identified only nine studies of sufficient quality, and the authors sought to identify elements common to all nine. These common elements were grouped into what the authors defined as structural and core features.
Structural features were defined as form, duration and collective participation.
Form relates to the type of activity, represented by newer forms of professional
development activities such as study groups or networking, or traditional forms such
as workshops or conferences. Workshops of varying lengths are one of the most
utilized and traditional forms. Workshops have been defined as a training given by a
trained expert in a specific area of study outside of the classroom during scheduled
sessions (Garet et al., 2001). Although this is a common form, this type of
professional development has been criticized for its ineffectiveness, lack of focus,
limited time, insufficient content and lack of follow-up. Unfortunately, research shows that workshops rarely result in significant changes in the classroom (Loucks-Horsley, Love, Stiles, Mundry, & Hewson, 2003). Reform types of professional
development, such as mentoring or coaching, have gained popularity as a method of
assisting with the transferability of skills into the classroom environment (Darling-Hammond et al., 2009; Garet et al., 2001). As stated by the authors Garet, Porter,
Desimone, Birman and Yoon (2001), these types of in-vivo activities provide a much-needed link between classroom instruction and real-life practices in order to increase
the likelihood of sustainability.
Interestingly, although workshops have been widely recognized as
insufficient, each of the nine studies that demonstrated student improvement
following professional development involved workshops that focused on the use of
research-based practices, active learning and teacher adaptation to fit the needs of
their classroom (Garet et al., 2001; Guskey & Yoon, 2009). Although the use of the
word “workshop” has been replaced with “in-service” in numerous studies, the components are similar, consisting of a training given by an expert in a specific area of study outside of the classroom after school (Bolton & Mayer, 2008; Causton-Theoharis & Malmgren, 2005; Maag & Larson, 2004; Matheson & Shriver, 2005;
Petscher & Bailey, 2006). This illustrates that workshops can be an effective form if
the curriculum is comprehensive, systematic, contains variety, and reflects current
research-based practices (George & Kincaid, 2008). Therefore, it appears important
to differentiate between workshops of high quality and workshops of poor design.
For professional development to be effective, it must not only include the
essential training components but also be efficient (Parsons & Reid, 1996, 1999).
Duration, the second structural feature, is recognized as an essential component to the
overall success of professional development. As identified in the literature, the
duration or “dosage” of professional development is linked to its effectiveness. As
expected, 30-100 hours of professional development has a significantly greater
impact than 10-14 hours (Garet et al., 2008), although other reports suggest that 14
hours of professional development can also positively impact student achievement
(Yoon et al., 2007). Researchers emphasize that increased duration allows for detailed
instruction, in-depth discussion, and opportunities to practice and receive feedback
(Darling-Hammond et al., 2009; Garet et al., 2001).
The final structural feature, collective participation, suggests that professional
development activities should consider grouping professionals who work together in
similar settings. This facilitates collaboration and discussion, which can help to
sustain newly learned practices over time (Darling-Hammond et al., 2009; Garet et
al., 2001; Yoon et al., 2007). Quick, Holtzman and Chaney (2009) tested this model
in the San Diego school districts to improve professional development. Teachers from
the participating schools indicated that the opportunity to collaborate was one of the most
important components of professional development. It allowed them to share
information regarding the usability and effectiveness of specific practices, plan
instruction and review student work, and discuss practices and adapt them for
students with varying needs in a small group format.
Although the structural features of professional development provide a basis
to organize, plan and implement effective professional development, the core features
indicate what should be taught and how it should be taught to ensure an increase in
teacher skills, knowledge and retention. Core features of high-quality professional
development are a focus on content, active learning, and coherence (Garet et al.,
2001).
The authors state that the content of professional development should focus on
explicitly teaching professionals what students should learn and ways in which
students can learn it. Depending on the targeted outcome, activities can vary in focus
from enhancing teacher knowledge in a particular subject to improving general
teaching practices such as behavior management or lesson planning (Garet et al.,
2001). Darling-Hammond, Wei, Andree, Richardson and Orphanos (2009) confirm
that professional development is most successful when it is concrete, focused and can
be related to classroom practices in a meaningful way.
As indicated in the literature, active learning activities are essential to the
overall success of professional development. Active learning has been described as
teachers’ active involvement throughout the training. It includes watching experts
model practices, discussing and practicing newly learned skills, and receiving
feedback from experts (Garet et al., 2001; Quick et al., 2009). Quick, Holtzman and
Chaney (2009) found that teachers considered this progression of training a necessary
component of professional development. As part of this continuum, teachers also
agreed that immediate feedback and reflection were crucial to the understanding and
application of newly learned skills. Darling-Hammond et al. (2009) substantiate
teachers’ assertions that newly learned practices have a greater probability of
transferring into the classroom if the procedures are modeled during the training.
Similarly, teachers indicate that professional development is most effective when
there are opportunities for active participation, in order to solidify their understanding and application of newly learned practices.
The final core feature of professional development is the connection of
training objectives to larger goals (Garet et al., 2001). Professional development has
often been criticized for its lack of coherence with teacher learning and other facets of
the profession, such as grade level standards, curriculum, monitoring and evaluation
(Quick et al., 2009). For this reason, Garet et al. (2001) contend that coherent
programs of professional development have a greater probability of influencing
classroom practices when teachers understand the connection between professional
development activities and school and state requirements (Darling-Hammond et al.,
2009).
The findings from Garet et al. (2001) reveal that intensive, ongoing and
connected professional development is more likely to have an effect than short
workshops that lack coherence and do not allow for practice, feedback and reflection
(Darling-Hammond et al., 2009). Recent studies continue to reinforce the findings of
this large-scale study in hopes of affirming what constitutes high-quality professional
development and “best practices” to influence classroom practices and improve
student learning (Quick et al., 2009). In order to understand how these components
influence instructional assistant professional development, a synthesis of the current
literature follows.
Instructional Assistant Professional Development
Instructional assistants, like teachers, are lifelong learners who need continual
professional development that is intensive, ongoing and connected to practice. By
providing professional development in this way,
instructional assistants can learn, practice and integrate new skills into everyday
practice while staying abreast of the most recent, research-based, effective practices
for students with disabilities (Lang & Fox, 2003). The following literature review will
provide a thorough understanding of how the previously identified components of
effective professional development are carried out for instructional assistants and
teachers.
Researchers have acknowledged that didactic instruction alone does not lead
to measurable retention or application of skills in the classroom (Harchik, Sherman,
Hopkins, Strouse, & Sheldon, 1989; Leach & Conto, 1999; Lerman et al., 2004).
Instead, numerous professional development models use a combination of didactic
instruction (Bolton & Mayer, 2008; Lerman et al., 2004), reading materials (Hiralall
& Martens, 1998; Maag & Larson, 2004), written and verbal instructions (Hall et al.,
1995; Sarokoff & Sturmey, 2004; Schepis et al., 2001), modeling (Causton-Theoharis
& Malmgren, 2005; Petscher & Bailey, 2006), role-playing (Parsons & Reid, 1999;
Schepis et al., 2001), practice (Harchik et al., 1989), rehearsal (Sarokoff & Sturmey,
2004), coaching (Schepis et al., 2001) and feedback (Arco, 2008; Deardorff et al.,
2007; Leblanc, Ricciardi, & Luiselli, 2005).
Didactic instruction, when part of a multi-component training package, can
provide school personnel with a solid theoretical foundation and understanding of the
procedure and its demonstrated effectiveness (Pickett & Gerlach, 2003). During the
initial training, didactic instruction typically includes definition and detailed
descriptions of the targeted techniques, their significance and relevance in the
classroom, and an overview of the current evidence of their effectiveness (Bingham,
Spooner, & Browder, 2007; Bolton & Mayer, 2008; Lane, Fletcher, Carter, Dejud, &
DeLorenzo, 2007; Sarokoff & Sturmey, 2004; Wood, Luiselli, & Harchik, 2007).
For example, Bolton and Mayer (2008) provided 30 minutes of didactic
instruction as part of a multi-component training package. The initial component of
the training package included a general introduction to Applied Behavior Analysis, a
definition of discrete trial training, and a synopsis of the current evidence supporting
the use of such practices in treating children with autism. Similarly, Sarokoff and
Sturmey (2004) included didactic instruction in a training package on discrete trial
teaching, providing participants a copy of the procedures followed by a review of
each component.
Research shows that breaking up specific teaching practices into manageable
components and providing instruction on each component ensures adequate
understanding of techniques and increases the likelihood of retention (Northup et al.,
1994). In some cases, researchers have broken up trainings into three or five steps.
They provide an instructional sequence for each strategy, model specific examples of
the strategy, and in some cases present a written description of the strategy, outlining
research that supports the strategy’s effectiveness in the classroom (Hiralall &
Martens, 1998; Nelson, 1996). Although didactic instruction has proven to be
relatively ineffective in isolation, this type of instruction can complement a more
comprehensive training program (Bingham et al., 2007; Bolton & Mayer, 2008; Lane
et al., 2007; Sarokoff & Sturmey, 2004; Wood et al., 2007).
Prior to practicing newly-learned strategies, some trainers model or
demonstrate the procedures for the participants to illustrate correct implementation
(Codding, Feinberg, Dunn, & Pace, 2005; Lang & Fox, 2003; Petscher & Bailey,
2006; Sarokoff & Sturmey, 2004; Wood et al., 2007). Immediately following
modeling, it is common for school personnel to actively participate in the training
through rehearsal and practice of newly learned strategies with the supervision of
expert trainers. Although the method most often used to reinforce newly learned skills
is role-playing (Hiralall & Martens, 1998; Moore et al., 2002; Wallace et al., 2004),
in some instances trainings use videotapes to model correct implementation prior to
guided practice (Erbas, Tekin-Iftar, & Yucesoy, 2006; Wallace, Doney, Mintz-Resudek, & Tarbox, 2004).
Parsons and Reid (1999) used role-playing during a classroom-based training
to teach instructional assistants a variety of skills designed to create an inclusive
setting for students with disabilities, including prompting, reinforcement, error
correction and task analysis. Likewise, Schepis, Reid, Ownbey and Parsons (2001)
used role-playing to teach support staff to embed instruction into regularly
occurring routines in an inclusive preschool. These studies, as well as others
(Bingham et al., 2007; Engelman, Altus, Mosier, & Mathews, 2003; Harchik et al.,
1989; Lerman et al., 2004), employed role playing in a multi-component intervention
to ensure proper implementation and the transferability of newly-learned practices
into the classroom.
Feedback in professional development increases the likelihood that desired
changes will transfer from training into the classroom (Leach & Conto, 1999). It is
common for trainers to provide immediate supervisory feedback to school personnel
regarding performance during training (Erbas et al., 2006; Hiralall & Martens, 1998;
Maag & Larson, 2004; Moore et al., 2002; Wallace et al., 2004). In the literature,
supervisory feedback is described as verbal, written, or graphic, and is typically
delivered with social consequences, such as approval, praise, correction, clarification
and/or additional direction (Arco, 1991, 2008; Leblanc et al., 2005). Arco (2008)
finds that this feedback is intended to immediately alter the behavior of staff, and is
effective when provided during role-playing and practice sessions. However,
although feedback during role-playing can be effective, researchers have also
acknowledged the importance of utilizing in-class performance feedback to ensure the
demonstration of targeted behaviors in a natural classroom setting (Arco, 1991,
2008; Arco & Millett, 1996; Codding et al., 2005; Fleming & Sulzer-Azaroff, 1989;
Leach & Conto, 1999; Leblanc et al., 2005).
Performance feedback, also known as process feedback, is defined as
quantitative or qualitative information about an individual’s past performance used to
adjust or maintain specific behaviors. Performance feedback differs from outcome
feedback, which is defined as information given regarding the effects on student
behavior rather than instructor performance (Prue & Fairbank, 1981). Performance
feedback is a reasonably non-intrusive training component used to shape newly
learned behaviors or increase existing behaviors (Leach & Conto, 1999). In
residential and general education settings, performance feedback about teacher
behaviors has been used to maximize student engagement (Leach & Conto, 1999),
improve the implementation of behavior support plans (Codding et al., 2005),
increase client interactions (Arco, 1991), maintain instructional behaviors following
on-the-job training (Arco & Millett, 1996) and improve discrete trial instruction by
instructional assistants (Leblanc et al., 2005).
Leach and Conto (1999) compared the effects of a short workshop with and
without performance feedback to determine its effects on teachers’ instructional and
managerial behaviors. Immediately following a half-day in-service training
composed of research-based methods to ensure training effectiveness, teachers were
either provided with outcome feedback, performance feedback or a combination of
the two. Results indicate an immediate increase in the use of target behaviors when
performance feedback was introduced, as well as continued maintenance of the
targeted skills after termination of feedback.
Extending this research, Leblanc, Ricciardi and Luiselli (2005) implemented
an abbreviated performance feedback intervention with the aim of improving
instructional assistant delivery of discrete trial instruction. Results indicate that the
training strategy was incorporated effortlessly into the classroom and had a lasting impact
on the behavior of the support personnel. Codding, Feinberg, Dunn and Pace (2005)
found similar results when they provided performance feedback immediately after the
observation and every other week. Constructive and corrective feedback followed
verbal praise for correct implementation. After receiving performance feedback, all
participating teachers demonstrated accurate implementation during the follow-up
phase, maintaining accuracy for up to 15 weeks after the withdrawal of performance feedback.
These studies show that performance feedback during professional development
increases the likelihood that newly learned behaviors will be maintained in the
classroom environment even after performance feedback is terminated.
Of the previous studies reviewed in this section, few used follow-up training,
which can help preserve the integrity of the practices after the initial training
(Guskey, 1986). Although the literature does not indicate why this is, it might be that
follow-up training requires paying instructional assistants to stay longer, a legitimate
concern considering the limited budget of schools. Rather than follow-up, most of the
studies used on-the-job training, also known as coaching, and performance feedback
to reinforce the accurate implementation of learned skills (Schepis et al., 2001).
Because performance feedback has gained popularity and has demonstrably
benefited instructional assistant behaviors, it may have supplanted follow-up
training.
Didactic instruction, modeling, role-playing, practice, rehearsal, feedback and
coaching have been identified in the literature as the effective components of
professional development. As demonstrated by several of the studies, professional
development can be delivered efficiently and effectively with limited time and
resources. This is an important finding given that many school districts face budget
restrictions alongside increasing pressure to provide professional development for
instructional assistants.
Duration of Professional Development
Although duration is an important factor in professional development, school
districts have limited ability to provide extensive professional development to
instructional assistants, due to budget and time constraints. Fortunately, the reviewed
components of high quality professional development have been successfully
implemented in short duration trainings for a variety of teachers and support staff.
Schepis, Reid, Ownbey and Parsons (2001) used a classroom-based, multi-component
training, followed by on-the-job training, to instruct instructional assistants to embed
instruction within classroom activities. Specifically, participants were shown how to
prompt, correct and reinforce child behavior during a one-on-one, 60-90 minute
training session. The training session included didactic and written instruction,
examples of teaching opportunities in the context of natural classroom routines, role-playing, and feedback. Following the classroom instruction, participants returned to
the classroom to demonstrate their newly learned skills. Each participant received on-the-job training in 2-4 20-minute sessions. Each session included 5-15 minutes of
feedback from a supervisor. At baseline, participants’ percentage of teaching
opportunities with correct teaching ranged from 7-30 percent. Following the training,
all participants increased their use of correct teaching procedures to 75-100 percent.
This study demonstrates that short duration trainings can be effective when a targeted
skill is explicitly taught, then followed by on-the-job training.
Similar to Schepis, Reid, Ownbey and Parsons’ 2001 research, Parsons, Reid
and Green (1996) trained support staff in a residential setting to work with people
with severe disabilities during a one-day training. More specifically, participants
learned to implement a task analysis with particular emphasis on correct order,
correct prompt, correct reinforcement and correct error correction. Although the
authors do not specify the exact duration of the one-day training, the intervention
included classroom-based training and on-the-job monitoring and feedback similar to
the previous study. After the training, participants demonstrated an increase in task
analysis teaching behaviors from baseline to intervention. All participants maintained
high rates of correct teaching behaviors five weeks after the initial training.
Bolton and Mayer (2008) used a brief, multi-component training package to
increase instructional assistants’ correct implementation of discrete trial teaching
skills in the classroom environment. Three participants received a 3-hour training in
discrete trial implementation; the training incorporated didactic instruction, modeling,
general case instruction, practice and performance feedback. At baseline, the three
participants implemented discrete trial teaching behaviors at low rates, between 50-63
percent. Following training and performance feedback, participants’ correct
implementation increased to 98-100 percent in the classroom. During follow-up, the
three participants maintained high rates of correct implementation (91-100 percent)
for up to 5 months.
Although an effective duration for professional development is considered to
be at least 14 hours (Darling-Hammond et al., 2009), these studies illustrate that brief,
multi-component training programs followed by performance feedback are an
efficient and effective method of training instructional assistants and support staff in a
variety of skills. As indicated by Bolton and Mayer (2008), efficient and effective
staff training programs are necessary to increase the likelihood that students will
receive competent instruction by trained personnel.
Although the literature demonstrates that instructional assistants can learn a
multitude of skills during short duration trainings, research has yet to identify a
training package that encompasses the basic skills instructional assistants should
possess to effectively support a student with disabilities in a general education
classroom. Following a thorough search of the literature, five basic skills (from this
point on referred to as instructional assistant support behaviors) were determined to
be necessary to increase the effectiveness of instructional assistants working in a
general education classroom.
Instructional Assistant Instructional Support Behaviors
Prompting procedures. For a student to emit an independent response to
naturally occurring stimuli, the stimulus must acquire control over the student’s
behavior. In order for this to happen, a teacher or support person must devise a way to
get the student to respond adaptively in the presence of naturally occurring stimuli,
and then reinforce the occurrence of the behavior (Wolery & Gast, 1984). One way to
do this is through the use of prompts and prompting procedures. Prompts are
additional stimuli used to increase the probability that a student will eventually emit a
desired response in the presence of a discriminative stimulus (Alberto & Troutman,
2006; Cooper, Heron, & Heward, 2007; West & Billingsley, 2005). Teachers often
use prompts in the form of verbal directives, gestures, modeling or physical guidance
to assist students with disabilities during instructional tasks (West & Billingsley,
2005). These prompts, which operate directly on the student’s response, are typically known as
response prompts (Cooper et al., 2007; Wolery & Gast, 1984). The use of such
prompts allows teachers to be more efficient by providing additional opportunities for
correct responding and reinforcement, which leads to faster skill acquisition (West &
Billingsley, 2005).
Prompts are considered a temporary crutch that should be systematically
withdrawn as soon as the student begins to demonstrate increasing ability (Alberto &
Troutman, 2006). Prompts must be gradually faded in order to transfer stimulus
control from the supplementary prompt to the natural cue (Cooper et al., 2007; West
& Billingsley, 2005). It is imperative that instructional assistants learn to reduce or
eliminate their prompts in order to allow students to independently function in the
classroom. This may pose a challenge for many instructional assistants, whose central
role is to provide support and assistance (Hall et al., 1995). For this reason, it seems
plausible that a portion of instructional assistant training should focus on prompting
procedures and prompt dependence. As stated by Hall et al. (1995), “Specific
prompting strategies can not only increase children’s skills, but can also promote
independent engagement of these skills” (p.215).
System of Least Prompts. One of the methods used most frequently to
systematically fade prompts is the System of Least Prompts (SLP) (also known as
“least-to-most” or “increased assistance” prompting) (Alberto & Troutman, 2006;
Cooper et al., 2007; Doyle, Wolery, Ault, & Gast, 1988). This method employs a
prompt hierarchy which begins with the least intrusive prompt, provides wait time to
allow the student to respond, then gradually increases the intensity of successive
prompts following an error (Alberto & Troutman, 2006; Cooper et al., 2007; Doyle et
al., 1988).
Although the number and type of prompts vary, the procedure typically
includes: 1) gradually increasing the intrusiveness of the prompt following an error or
non-response, 2) an initial discriminative stimulus followed by three or four
escalating verbal, visual or physical prompts, 3) the simultaneous delivery of the
discriminative stimulus and prompt at each level, 4) a consistent response interval
following each prompt and 5) the delivery of reinforcement after a correct
independent or prompted response (Doyle et al., 1988).
In a review of procedural parameters, Doyle, Wolery, Ault and Gast (1988)
analyzed ninety studies to determine the populations with which the SLP has been
used, the skills taught, and the results obtained. As indicated in this evaluation, the
SLP has been successfully used to teach discrete and chained tasks, although a
majority of the studies used the procedure to teach chained tasks. In the late 1980s, a
majority of discrete and chained skills taught by the SLP method were community
and daily living skills (29%), social skills and leisure (20%), vocational skills (12%),
self-care (11%) and motor skills (5%).
More recent literature supports the use of this procedure to teach various skills
in a variety of domains. Researchers have used SLP to teach self-help skills, such as
hand washing, eating, drinking and dressing (Engelman et al., 2003; Horner &
Keilitz, 1975; McConville, Hantula, & Axelrod, 1998; Parsons & Reid, 1999), social
skills such as telephone skills and using an augmentative communication device
(Manley, Collins, Stenhoff, & Kleinert, 2008) and academic skills such as letter
identification, receptive skills and letter writing (Collins, Branson, Hall, & Rankin,
2001; Doyle, Wolery, Gast, Ault, & Wiley, 1990; West & Billingsley, 2005).
Although a variety of skills can be taught using this procedure, all of the
reviewed studies employed similar methods. For example, a hierarchy of prompts
typically starts with a natural cue or discriminative stimulus. If the student does not
respond or responds incorrectly after a predetermined amount of time (between 3-5
seconds, depending on the study), an indirect or direct verbal prompt is given.
Following the prompt, if the student responds incorrectly or does not respond within
3-5 seconds, a verbal prompt plus a gestural or model prompt is given. Again, if an
incorrect or no response occurs within the designated time, a physical prompt in the
form of hand-over-hand guidance occurs. The procedure continues throughout the
task until a correct or independent response is given. Once this response is emitted,
the instructor reinforces the student with either verbal praise or tangible items (Collins et al.,
2001; Doyle et al., 1990; Horner & Keilitz, 1975; Manley et al., 2008; McConville et
al., 1998; Parsons & Reid, 1999; West & Billingsley, 2005). Two of the reviewed
studies used four levels of prompting instead of three, including both a gestural and
model prompt as part of the prompt continuum (Collins et al., 2001; McConville et
al., 1998). An indirect verbal prompt has also been used to evoke the correct response
without explicitly stating what behavior should occur. This type of prompt gives the
student a hint, allowing time to think and problem solve independently before a
direct verbal prompt indicates what response should occur. While using a hierarchy
of least-to-most prompting,
researchers have used an indirect verbal cue as the first prompt after the
discriminative stimulus, to initiate play or correct an error after an incorrect response
has been made prior to providing a direct verbal prompt (Breen & Haring, 1991; Coe,
Matson, Fee, Manikam, & Linarello, 1990; Cuvo, 1981). In another variation of the
procedure, West and Billingsley (2005) compared the
effects of the traditional least-to-most procedure and a revised least-to-most
procedure to teach four students with autism to appropriately respond to task stimuli
in order to complete a motor task. The revised procedure eliminated the verbal
prompts throughout the sequence of the least-to-most procedure, a slight variation
from the traditional procedure. Although both procedures were effective, the number
of errors and number of sessions to reach criterion were less for the revised
procedure. This revision, although minor, eliminated the opportunity for verbal
prompts to exert control as the discriminative stimulus, allowing for a quicker transfer
of stimulus control to the intended discriminative stimuli. These variations are not
uncharacteristic of the procedure; variability in the number of prompts and the way
in which they are delivered can be attributed to researcher preference, task difficulty
or student ability (Doyle et al., 1988).
The System of Least Prompts has gained popularity due to its ease of use and
successful outcomes with a variety of disabilities and age groups. The literature
confirms that there is no specific age group or disability category with which the
procedure works best. It has been used with a range of ages from
preschoolers to adults, and a range of mild to severe disabilities (Doyle et al., 1988).
More recent literature suggests that the SLP would be a useful procedure for
classroom personnel such as instructional assistants, who do not possess extensive
knowledge in delivering instruction but are assigned to provide educational assistance
to students with disabilities (West & Billingsley, 2005).
Although research indicates that SLP is a potentially useful procedure for
support staff, few studies have investigated how to provide training in SLP in an
efficient and effective manner. Engelman, Mathews and Altus (2002) trained certified
nursing assistants (CNAs) to use the SLP in a dementia care unit to assist three
residents during dressing routines. CNAs attended a 45-minute, multi-component in-service that included an introduction to the goal of the SLP, the steps of the SLP, role-play,
feedback, and coaching in-vivo. As a result of the training, the CNAs were able to
help the residents achieve increased independence during dressing routines, although
the effectiveness of the procedure varied among residents. Engelman, Altus, Mosier
and Mathews (2003) repeated a similar study with CNAs and residents in a dementia
care unit with a reduced training of 30 minutes and the addition of a self-monitoring
procedure. This study validated previous results: CNAs demonstrated an increased
ability to effectively prompt while residents demonstrated increased independence
during dressing routines.
Parsons and Reid (1999) evaluated a training program in basic teaching skills
for instructional assistants working with students with disabilities. Specifically, this
study aimed to teach instructional assistants working in inclusive settings to use “best
practice” teaching strategies. Initially, the Teaching Skills Training Program (TSTP)
was developed to teach direct support staff in a residential program to assist people
with severe disabilities (Parsons, Reid, & Green, 1993). Upon conclusion of the
successful training program, the researchers used the same training procedures in one
eight-hour training to teach group home personnel, instructional assistants and
undergraduate teaching interns (Parsons & Reid, 1996). Since then, trainers have used
TSTP to train more than 300 instructional assistants and support staff.
The TSTP program includes training support staff in four basic teaching
skills: 1) task analysis, 2) SLP, 3) reinforcement, and 4) error correction. The training
format consisted of a classroom component where instructional assistants were given
the rationale for each skill and corresponding terminology. Following this
information, participants were provided with explicit training and practice in each of
the four performance skills. Similar to previous training formats, the instructors used
didactic instruction, modeling of correct and incorrect implementation, and practice
through role-play and performance feedback. During 15-30 minute observations,
instructors observed each instructional assistant in class and provided feedback
regarding each of the skills implemented. Skills insufficiently performed by the
participant were described and demonstrated by the instructor. Follow-up observation
demonstrated that participating instructional assistants maintained an 80 percent
proficiency in trained skills.
Other studies have used SLP or a modified version of SLP to train
instructional assistants and residential support staff to embed teaching within natural
routines (Schepis et al., 2001), generalize the use of discrete trial instruction into the
natural environment (Bolton & Mayer, 2008; Taubman et al., 2001), and use tactile
prompts to increase the implementation of a token economy (Petscher & Bailey,
2006). Although these studies did not focus primarily on teaching the SLP, the
procedure was an important part of each intervention.
Most to Least Prompting. Also known as “decreasing assistance,” most-to-least
prompt systems require an initial level of prompting that ensures the student will
produce the desired response. Over successive trials, the amount of assistance is
gradually reduced, as the student demonstrates success, until no prompts are
provided (Cooper, Heron, & Heward, 1987; Schoen, 1986). Unlike the least-to-most
procedure, this prompt system requires the use of the most intrusive prompt first,
followed by a systematic presentation of less intrusive prompts until the learner
demonstrates the skill independently. For this reason, some researchers believe that
most-to-least prompting is advantageous when compared to increasing assistance,
since the student receives more intensive instruction and less opportunity to err at an
earlier point in time (Schoen, 1986). Since the student has fewer opportunities to err,
it is likely that this procedure produces more opportunities for reinforcement,
compared to other procedures (Vuran, 2008). In 1984, Wolery and Gast stated that
this prompting procedure was the most frequently used to teach response chains to
students with developmental disabilities. However, researchers acknowledge that both
single-step and chained behaviors can be taught to individuals of varied ages, with a
range of disabilities, using this procedure (Batu, Ergenekon, Erbas, & Akmanoglu,
2004; Vuran, 2008). In conclusion, the research suggests that the most-to-least
prompting procedure is easy to use and effective in teaching individuals with
moderate to severe disabilities (Vuran, 2008).
Whether most-to-least prompting employs a highly structured or more loosely
structured training format, this form of teaching helps to transfer stimulus control
from the response and stimulus prompts to the natural stimulus. The prompt hierarchy
customarily moves from physical guidance to partial physical guidance, modeling,
gesture prompts, and, finally, verbal directives (MacDuff, Krantz, & McClannahan,
2001), although slight variations of this procedure exist, depending on the student and
which prompt evokes the correct response (Wolery & Gast, 1984). For example, in
some cases, most-to-least prompting transitions from physical guidance to visual
prompts and ends with verbal instructions (Cooper et al., 2007).
Regardless of the specific prompts used, several of the teacher’s moment-to-moment
decisions and corresponding behaviors influence the effectiveness of this
procedure. Initially, the teacher must identify the target skill and the necessary
component behaviors required to perform the skill. After she decides on the right
order of steps, she must determine the level of prompting required to evoke the
correct response. In order to ensure the student does not become dependent on a
specific level of prompting, the teacher must also decide upon the criterion level at
which it is appropriate to transition from one prompt level to the next. For instance,
research suggests that a common criterion to transition to the next step is when the
student demonstrates independence on 90 percent or more of the trials for two
consecutive days (Wolery & Gast, 1984). The final step in most-to-least prompting is
to determine how errors will be corrected and when reinforcement will be given.
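The decision sequence above (select a hierarchy, set a fading criterion, transition between prompt levels) can likewise be sketched schematically. The class below is a hypothetical illustration built around the 90-percent, two-consecutive-day criterion noted by Wolery and Gast (1984); its names and structure are assumptions of the illustration, not an implementation drawn from the literature.

```python
# Illustrative sketch of most-to-least (decreasing assistance) fading.
# The hierarchy labels and class design are hypothetical; the fading
# criterion follows the 90%-over-two-consecutive-days example above.

MOST_TO_LEAST = [
    "full physical guidance",     # start with the MOST intrusive prompt
    "partial physical guidance",
    "model",
    "gesture",
    "verbal directive",
    "no prompt",                  # independent responding
]


class PromptFader:
    def __init__(self, hierarchy=MOST_TO_LEAST, criterion=0.90,
                 days_required=2):
        self.hierarchy = list(hierarchy)
        self.level = 0
        self.criterion = criterion
        self.days_required = days_required
        self.successful_days = 0

    @property
    def current_prompt(self):
        return self.hierarchy[self.level]

    def record_day(self, correct, total):
        """Record one day's trials at the current prompt level and fade
        to a less intrusive prompt once the criterion has been met on
        the required number of consecutive days."""
        if total and correct / total >= self.criterion:
            self.successful_days += 1
        else:
            self.successful_days = 0  # criterion days must be consecutive
        if (self.successful_days >= self.days_required
                and self.level < len(self.hierarchy) - 1):
            self.level += 1           # transition to less intrusive prompt
            self.successful_days = 0
```

For example, two consecutive days at or above 90 percent correct would move a student from full physical guidance to partial physical guidance, while a below-criterion day resets the count without ever reverting to a more intrusive level, mirroring the one-way fading the procedure describes.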
This procedure is well suited for teaching chained skills such as dressing,
bathing and playing, although it has also been used successfully to teach
communication and cognitive skills. For example, this procedure has been used to
teach children with autism to follow photographic activity schedules (Massey &
Wheeler, 2000), pedestrian skills (Batu et al., 2004), leisure skills (Vuran, 2008),
writing skills (Dooley & Schoen, 1988), and instruction-following (Striefel &
Wetherby, 1973).
Dooley and Schoen (1988) examined the effects of most-to-least prompting on
pre-writing skills while concurrently employing differential reinforcement of other
behaviors (DRO) to decrease talking-out behavior of a seven year-old boy with
cerebral palsy and brain damage. The authors provided praise for the demonstration
of appropriate behaviors by both the target student and other students in the class, and
inappropriate behaviors were placed on extinction. The authors used decreasing
assistance to encourage pre-writing skills, moving from physical guidance to verbal
prompts to no cues. The criterion for moving to a less intrusive prompt was that the
student demonstrate four successful consecutive attempts at the current level of
prompting. At baseline, talking-out behaviors averaged 84 percent and pre-writing
behavior averaged 8 percent. The DRO system reduced talking-out behavior from 84
percent to 7 percent by the end of the eight-day intervention and remained at this rate
for the remainder of the study. The decreasing assistance procedure increased pre-writing behavior from 8 percent to 100 percent by the end of the intervention. This
rate was also maintained for the duration of the study. This study demonstrates the
effectiveness of a combined intervention of most-to-least prompting and DRO to
decrease inappropriate behaviors and increase academic writing behaviors.
In a study conducted by Massey and Wheeler (2000), a four year-old
diagnosed with autism attending an integrated public pre-school classroom learned to
use an activity schedule using most-to-least prompting. Through physical, gestural,
and verbal prompting, the student learned how to use an activity schedule to locate
each task. In order to determine the effectiveness of the procedure, researchers
concurrently measured task engagement, number and type of prompts used, and
challenging behavior. Using a 15s momentary time sampling data collection
procedure, the researchers demonstrated an increase in task engagement and a
concurrent decrease in prompts and challenging behavior, suggesting that the use of a
most-to-least prompting procedure that teaches students with autism to use activity
schedules can help increase levels of task engagement.
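The 15s momentary time sampling procedure used in this study scores whether the target behavior is occurring at the instant each interval ends. A minimal sketch, assuming a simplified record of engagement by second (the function name and data representation are illustrative, not taken from the study):

```python
def momentary_time_sample(engaged_seconds, session_length, interval=15):
    """Momentary time sampling: at the end of each interval (every 15 s,
    as in Massey & Wheeler, 2000), record whether the target behavior
    (e.g., task engagement) is occurring at that instant. Returns the
    percent of sample points scored as engaged.

    engaged_seconds: set of whole seconds during which the student was
    engaged -- a simplified, illustrative representation of the record.
    """
    sample_points = range(interval, session_length + 1, interval)
    scored = [t in engaged_seconds for t in sample_points]
    return 100.0 * sum(scored) / len(scored)

# Engaged during seconds 10-40 of a 60 s session; samples fall at 15, 30, 45, 60.
engaged = set(range(10, 41))
print(momentary_time_sample(engaged, 60))  # 50.0 (engaged at 15 and 30 only)
```

Because only the sample instants are scored, this method estimates, rather than directly counts, total engaged time.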
In 2008, researchers Libby, Weiss, Bancroft and Ahearn conducted a study to
compare the most-to-least prompting procedure (with and without a 2s delay) and a
least-to-most prompting procedure on the solitary play skills of five students with
autism. Each of the prompting procedures consisted of four prompts: 1) hand over
hand, 2) manual guidance at the forearm, 3) manual guidance at the upper arm and 4)
light touch and shadow. Depending on which prompting procedure was being
implemented, these prompts were given from most intrusive to less intrusive or vice
versa. The criterion for moving to the next prompt in a most-to-least procedure was set at two consecutive correct responses at the targeted prompt level. Likewise, two
consecutive errors made by a student signaled the researcher to increase the
intrusiveness of the prompt. The same procedure was used for the least-to-most
system in the reverse order. A slight variation was used with the 2s time delay
procedure, in that the student was given 2s to respond prior to the use of a more
intrusive prompt. Results indicated that for two students, the most-to-least procedure
was more effective than the least-to-most; however, three of the students
demonstrated more rapid acquisition during the least-to-most procedure. Although
this study suggests that the least-to-most procedure was more effective, the authors
indicated that the two students who benefited from the most-to-least procedure
showed little to no progress when the least-to-most procedure was employed. This
suggests, for some students, a most-to-least procedure is both more effective and
efficient in teaching chained behaviors.
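The transition rules in the Libby et al. (2008) comparison (two consecutive correct responses move to a less intrusive prompt; two consecutive errors move back to a more intrusive one) amount to a small state machine over the four-prompt hierarchy. A hypothetical sketch (function and variable names are mine, not the authors'):

```python
# Prompt hierarchy ordered from most to least intrusive, following the
# four prompts described in Libby et al. (2008); labels are illustrative.
PROMPTS = ["hand over hand", "forearm guidance", "upper-arm guidance",
           "light touch and shadow"]

def next_prompt_level(level, recent):
    """Apply the study's transition rules: two consecutive correct
    responses shift to a less intrusive prompt; two consecutive errors
    shift back to a more intrusive one. `recent` holds the last two
    trial outcomes (True = correct)."""
    if recent == [True, True] and level < len(PROMPTS) - 1:
        return level + 1
    if recent == [False, False] and level > 0:
        return level - 1
    return level

level = 0  # begin with the most intrusive prompt (most-to-least)
level = next_prompt_level(level, [True, True])
print(PROMPTS[level])  # forearm guidance
```

The 2s-delay variant differs only in that the learner is given 2s to respond before the scheduled prompt is delivered; the transition logic itself is unchanged.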
Upon further investigation, the authors indicated that the most-to-least
procedure led to fewer errors per training session when compared to the least-to-most
procedure, particularly when time delay was inserted. As suggested by the authors,
staff may want to minimize errors in order to help facilitate discrimination learning,
particularly during the beginning trials of discrimination. It should be remembered
that each student error requires additional training trials coupled with ongoing and
complex decision-making by teachers to determine how to reset training steps and
prompting levels to facilitate correct responding. Also, continual errors lower the rate
of reinforcement, which can result in slower or impaired learning. For this reason, the
authors state that the most-to-least procedure is best used when the student’s learning
history is unknown and when errors are known to impede the student’s learning or
increase problem behaviors.
Although there is limited information regarding error rates and the number of
trials required to reach criterion for a given skill or task, some research indicates that
the most-to-least procedure produces a stable rate of correct responding. It is viewed
favorably by staff due to its ease of implementation (Demchak, 1990). Demchak
(1990) believed the decreasing assistance procedure to be the most efficient prompt
fading procedure, since fewer errors are made and skill acquisition is faster, compared
to the least-to-most procedure. A prompting procedure that results in fewer errors
prior to task mastery is important for two reasons. First, learning should be errorless
during the skill acquisition stage. Once an error has been made, it is likely to be
repeated. Second, each time an error is made, the instructional session must stop in
order to correct the error, which results in loss of instructional time (Demchak, 1990).
It is common for a supervising adult to consequate the error by stopping the student
when the error is made, returning the student to the start of the task, restating the
discriminative stimulus and physically guiding the student through the entire response
(Wolery & Gast, 1984). Continual use of this error correction procedure could result
in the student engaging in aberrant behavior to escape from the task due to frustration.
Therefore, procedures that require less instructional time, fewer trials or sessions and
fewer errors are favored when instructional time is limited (Demchak, 1990).
Graduated Guidance. Similar to decreasing assistance, graduated guidance
provides the student with the level of physical assistance required for the student to
complete the response (Cooper et al., 2007; Demchak, 1990; Schoen, 1986; Wolery &
Gast, 1984). However, graduated guidance differs from the decreasing assistance procedure in that the intensity or location of the prompt can vary within a given trial (Demchak, 1990; MacDuff, Krantz, & McClannahan, 2001). Although variations of
the procedure exist, the general guidelines are as follows: 1) Apply the necessary
amount of pressure needed to initiate each trial. 2) As the student begins to respond,
immediately begin to fade out physical prompts to transfer stimulus control to the
natural environmental stimuli (Cooper et al., 2007). 3) If the student begins to resist
or respond incorrectly, immediately increase the level of physical assistance to ensure
correct responding. 4) Continue each trial uninterrupted until the student completes
the targeted response. 5) Once the response has been completed, provide a reinforcer
if the student has completed any portion of the response independently. Verbal praise
can also be used during the trial if the student is actively participating, but should be
withheld if the student is resisting assistance as the response is completed (Demchak,
1990; Wolery & Gast, 1984). Although these steps suggest the typical sequence of the
procedure, the key to the correct implementation of graduated guidance is to
immediately begin to adjust the manual pressure as the student begins to respond.
The intensity or location of the prompt may change from moment to moment, depending on the student’s responsiveness. One way of fading
physical prompts is by moving the location of the prompt, also known as spatial
fading. For example, physical guidance may begin at the hand and fade to the wrist,
elbow and finally to the shoulder (Demchak, 1990; Schoen, 1986). Another way of
fading physical prompts is to decrease the intensity of the prompt. If the intensity of
the prompt is to be faded, physical guidance can be decreased by initially guiding the
entire hand followed by guiding only the thumb and index finger and lastly
shadowing the response (Demchak, 1990). Shadowing the response allows the
instructor to follow the student’s movement without actually touching the student
(Schoen, 1986). This allows the instructor to quickly prompt the student if the student
stops moving or begins to move in the wrong direction (Wolery & Gast, 1984).
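The two fading dimensions described above, spatial fading (hand to wrist to elbow to shoulder) and intensity fading (full-hand guidance to thumb-and-index guidance to shadowing), can each be represented as an ordered hierarchy. A minimal illustrative sketch (names are mine, not from the cited sources):

```python
# Two fading hierarchies for graduated guidance, ordered from most to
# least assistance (after Demchak, 1990; Schoen, 1986). Labels are
# illustrative shorthand for the prompts described in the text.
SPATIAL_FADING = ["hand", "wrist", "elbow", "shoulder"]
INTENSITY_FADING = ["full-hand guidance", "thumb-and-index guidance", "shadowing"]

def fade(hierarchy, step):
    """Return the prompt at a given fading step, clamped to the least
    intrusive level (e.g., shadowing) once the end of the hierarchy is
    reached."""
    return hierarchy[min(step, len(hierarchy) - 1)]

print(fade(SPATIAL_FADING, 1))    # wrist
print(fade(INTENSITY_FADING, 5))  # shadowing (clamped at the final level)
```

In practice the instructor moves along (and back along) such a hierarchy continuously within a trial, rather than in discrete scheduled steps.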
Graduated guidance is best suited to teaching motor and chained responses,
since the entire chained response can be taught in sequence without interruption
(Demchak, 1990; Wolery & Gast, 1984). When compared with decreasing assistance,
this procedure is advantageous in that student participation is both encouraged and
permitted at any point during the task. However, attempts made by the instructor to exert
physical control over a student may result in countercontrol (Schoen, 1986).
Much of the research on graduated guidance focuses on teaching self-help
skills to students with moderate to severe disabilities (Azrin & Armstrong, 1973;
Azrin, Schaeffer, & Wesolowski, 1976; Diorio & Konarski, 1984). More recent
literature has used the graduated guidance procedure to teach children with autism
and moderate intellectual disabilities to use a picture activity schedule to increase on-task and on-schedule behaviors (Bryan & Gast, 2000; MacDuff, Krantz, &
McClannahan, 1993; Spriggs, Gast, & Ayres, 2007). Bryan and Gast (2000) studied a
two-component teaching package, comprised of graduated guidance and picture
activity schedules, to increase on-task and on-schedule behaviors of four boys (ages
7-8) with high functioning autism. They employed a 60s momentary-time sampling
procedure to score on-task and on-schedule behaviors, and used a graduated guidance
procedure to teach the students how to use the picture activity schedule. Guided by
the MacDuff et al. (1993) observation that verbal prompts were frequently used by
staff but largely ineffective in facilitating independent responses for students with
autism, the researchers used graduated guidance instead of verbal prompts. The
researchers believed the use of manual prompts would be more effective during a
chained task since it decreased the likelihood that verbal prompts would be used.
Graduated guidance began once the student did not respond to the teacher’s
initial directive to transition to literacy centers. After waiting 10s for the student to
respond, the teacher provided a manual prompt by placing her hand on the student’s
shoulder in order to guide him to the picture-activity schedule. This procedure was
used for all steps of the sequence, which included manipulating the picture-activity
schedule, completing the activity, returning materials and returning to the picture-activity schedule before transitioning to the next activity. The student received
manual prompts from behind, in order to avoid verbal prompts and verbal exchanges
between the teacher and the student. Throughout the procedure, the teacher made
decisions regarding prompt intensity and location on a moment-to-moment basis. To
begin to fade the use of manual prompts, the location and intensity of the prompt were systematically withdrawn as the student began to respond independently. The result
of this study was a dramatic increase in both on-task and on-schedule behaviors for
all participants. Similar to the results of MacDuff et al. (1993), who also increased
participant on-task and on-schedule behaviors via a two-component teaching package
utilizing photographs and graduated guidance, students in this study quickly learned
to manipulate the picture-activity schedule. After the withdrawal of the procedure,
student on-task and on-schedule behaviors maintained and generalized across novel
activity schedules with no additional training or prompting.
When researchers Spriggs, Gast and Ayres (2007) sought to extend the
knowledge of picture activity schedules to include 12-13 year-old students with
moderate intellectual disabilities, they found similar results. They used a systematic
replication of the Bryan and Gast (2000) study, with slight variations in participants,
setting, materials, methods and activities completed during centers. One notable
variation was the use of paraprofessionals as the instructional agents instead of the
classroom teacher. Paraprofessionals learned the same graduated guidance procedure as in the Bryan and Gast (2000) study in order to help the student use a picture activity schedule book to transition between centers during a 40-minute training session.
Similar to the previous study, researchers believed that adult attention during a
sequence may accidentally reinforce the student, pulling the student’s attention away
from the task. For this reason, they believed manual prompts would decrease student
dependence.
Unlike the Bryan and Gast (2000) study, which used no verbal prompts and
only graduated guidance to teach students to use a picture-activity schedule book,
Spriggs, Gast and Ayres (2007) used a system of least-to-most prompts during the
second training session, following the initial training session, which used only manual
guidance. This procedural difference was unintended but warranted when students
began to resist hand-over-hand assistance. As a result of the modification, after the
second training session, which utilized a system of least prompts, all prompts except
for a controlling verbal prompt were faded. Similar to the results of the Bryan and
Gast (2000) study, students quickly learned to manipulate their picture schedules,
teacher prompting was rapidly faded and students continued to demonstrate high
levels of on-task and on-schedule behaviors with the picture schedule following the
withdrawal of the intervention. The success of this approach suggests that a
combination of graduated guidance and least-to-most prompting may help to elicit
independent responses from students, particularly those who may become less
tolerant of hand-over-hand assistance.
Wait time. Wait time is the amount of time between the trainer’s instruction
and the learner’s response (Tincani & Crozier, 2008). The SLP and most-to-least
prompting systems include wait time for two reasons: 1) to allow the learner to
respond to the natural cue prior to providing a supplementary prompt (McConville et
al., 1998) and 2) to allow the learner to respond prior to introducing a more intrusive
prompt (Wheeler & Richey, 2005). As indicated in the literature, wait time is
typically 3 to 5 seconds from the presentation of the natural cue or last prompt
(Collins et al., 2001; Cooper et al., 2007; Doyle et al., 1988; Doyle et al., 1990;
Manley et al., 2008; McConville et al., 1998; Wheeler & Richey, 2005).
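The role of wait time within a prompting trial can be sketched as a timed loop: after the natural cue, and again after each prompt, the learner is given the wait interval to respond before a more intrusive prompt is delivered. The sketch below is hypothetical; `attempt_response` stands in for whatever observation the trainer makes, and no such function appears in the cited studies.

```python
import time

def run_trial(attempt_response, prompts, wait_time=4.0):
    """Sketch of wait time in a least-to-most trial: after the natural
    cue (and after each subsequent prompt), the learner has `wait_time`
    seconds (3-5 s is typical in the literature) to respond before the
    next, more intrusive prompt is delivered. `attempt_response` is a
    hypothetical callback returning True when the learner responds
    correctly. Returns the level at which the learner responded, or
    None if no response occurred at any level.
    """
    deliveries = ["natural cue"] + list(prompts)
    for delivered in deliveries:
        deadline = time.monotonic() + wait_time
        while time.monotonic() < deadline:
            if attempt_response():
                return delivered
        # Wait time elapsed with no response -> escalate to next prompt.
    return None
```

For example, `run_trial(observe, ["verbal", "gesture", "physical"])` would deliver the verbal prompt only after 4 s without a response to the natural cue.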
Several studies support the use of wait time to improve academic responses
and the acquisition of skills (Bakken, Whedon, Aloia, Aloia, & Edmonds, 2001;
Tincani & Crozier, 2008; Tobin, 1986; Valcante, Roberson, Reid, & Wolking, 1989).
A majority of the research on wait time was conducted in the 1980s and focused
primarily on comparing the effects of brief and extended wait times on student
performance. Results of these studies indicate that a longer wait time (typically 3-5
seconds) is superior to a shorter wait time (typically 1 second) (Lee, O'Shea, &
Dykes, 1987; Rowe, 1987; Tobin, 1986; Valcante et al., 1989).
Similarly, wait time has been used in other instructional methods such as
Carnine, Silbert and Kameenui’s (1997) Direct Instruction (DI) and modified versions
of direct instruction. This instructional method requires a group of students to
participate in small group instruction that is well designed and systematically
delivered. Wait time (also known as a thinking pause) gives students a moment to
think about an answer prior to responding. As defined by Carnine, Silbert and
Kameenui (1997), wait time can be as long as the teacher feels is needed. Although
this may vary depending on the skill and the level of the learners, the authors suggest
that wait time should be around 2-4 seconds in order to keep pacing adequate and
student frustration low.
Reinforcement. As Cooper, Heron and Heward (2007, p.36) assert,
“Reinforcement is the most important principle of behavior and a key element of
most behavior change programs designed by behavior analysts.” Reinforcement is a consequence that strengthens the behavior it follows under similar conditions. Specifically, positive
reinforcement is the contingent presentation of a stimulus, immediately following a
behavior, that increases the likelihood of the behavior occurring in the future (Cooper
et al., 2007; Wheeler & Richey, 2005). Positive reinforcers, the actual stimuli presented following the behavior, take a variety of forms, such as verbal praise, social attention, tangible items, privileges or activities, edibles, and tokens or points redeemable for items or activities (Alberto & Troutman, 2006; Cooper et al., 2007; Wheeler &
Richey, 2005).
The first article in the inaugural 1968 issue of the Journal of Applied Behavior Analysis investigated the effects of contingent teacher attention on the study behavior of students who demonstrated a high frequency of disruptive behaviors (Hall, Lund, & Jackson, 1968).
The simple procedure required teachers to provide a student with social reinforcement
contingent upon study behavior, using verbal and descriptive praise or a pat on the
shoulder.
In order to train teachers to use this procedure, weekly meetings and graphs of
each day’s session were given to the teachers, followed by verbal reinforcement for
using the procedure. Initially, the teachers were signaled by the researcher to provide
praise during the reinforcement condition. After a reversal to baseline, the
reinforcement procedure was reinstated; however, this time the teacher continued to
provide reinforcement without the researcher’s prompts. The results indicated a high
rate of appropriate study behavior among the students following teacher praise,
providing evidence that contingent teacher attention can reduce inappropriate
behavior efficiently and effectively, and that teachers can learn to use the procedure
quickly and observe immediate effects (Hall et al., 1968).
Following this introductory study, Madsen, Becker and Thomas (1968)
trained teachers to use rules, praise and ignoring to decrease high rates of disruptive
behavior by several children in a public elementary school. In a workshop on the use
of behavioral principles in the classroom, teachers were instructed to “catch the child
being good” and provide praise in the form of contact, verbal praise or facial attention
contingent upon appropriate behavior. Teachers were instructed to “start small” and
give praise at the first instance of appropriate behavior, then gradually increase the
amount of praise when the child began to demonstrate increasing levels of appropriate
behavior. The results of this study indicate that the combination of ignoring
inappropriate behavior and acknowledging appropriate behavior increased student
time on task. Teacher approval of appropriate student behavior was found to be
pivotal to effective classroom management. This research added to the growing
literature supporting the idea that teachers can be taught to use contingent
reinforcement and other effective classroom management procedures to increase
student responsiveness and on-task behaviors.
Although researchers initially focused on decreasing inappropriate student
behavior, in the early seventies they began to explore the use of contingent
reinforcement to improve academic performance (Chadwick & Day, 1971; Ferritor,
Buckholdt, Hamblin, & Smith, 1972). Ferritor, Buckholdt, Hamblin and Smith (1972)
demonstrated that reinforcing students solely for attending decreased disruptive
behavior and increased attending behavior, but did not have an effect on the
correctness of academic work. When the contingencies were reversed and placed
solely on correct work, the number of problems correct increased while disruptive
behavior increased. When both contingencies were addressed, attending behavior and
percent of problems done correctly increased simultaneously.
Several studies further explored the effects of contingent praise on both
academic and social student behaviors (Ferguson & Houghton, 1992; Freeland &
Noell, 1999; Martens, Lochner, & Kelly, 1992; Northup, Vollmer, & Serrett, 1993;
Sawyer, Luiselli, Ricciardi, & Gower, 2005). Although a majority of the research
focuses on teachers’ use of praise, Parsons and Reid (1999) trained instructional
assistants to use reinforcement as one of the four basic teaching practices for students
with disabilities. Although the study indicates that praise is typically an effective
form of reinforcement for most students, they also note the importance of determining
what constitutes reinforcement for each student on an individual basis.
More recent studies have shown that praise is a necessary part of a multi-component intervention. Matheson and Shriver (2005) compared the effects of teacher praise plus effective commands versus effective commands alone on student
compliance and academic behaviors. After implementing the effective commands
condition, teachers were trained to give effective commands accompanied by verbal
praise. Verbal praise was defined as providing a positive statement or using
descriptive praise within 5 seconds of the student’s behavior. The 30-50 minute
training included didactic instruction, video examples of teachers either praising or
not praising students, practice identifying appropriate behaviors to reinforce,
corrective feedback, and modeling from the trainer. Following the training, teachers
were provided written feedback and coaching during subsequent observations.
Although the researchers did not require the teachers to follow a specific
schedule of praise, teachers were asked to give a minimum of 10 praise statements
per academic activity. Verbal praise was to be given only upon the demonstration of
appropriate classroom behaviors and compliance with teacher commands. Although
effective commands alone did result in small increases in student compliance and
academic behaviors, the addition of contingent teacher praise significantly increased
both desired academic behaviors and student compliance. This study demonstrates the
importance of including teacher praise as part of a training package for teachers.
Likewise, the existing literature on instructional assistant training typically
includes the use of praise as part of a multi-component training. It has been employed
in the SLP (Doyle et al., 1988; Parsons & Reid, 1996; West & Billingsley, 2005),
discrete trial training (Bolton & Mayer, 2008), embedded instruction (Schepis et al.,
2001), social skills instruction (Causton-Theoharis & Malmgren, 2005; Sawyer et al.,
2005), and behavioral and academic interventions (Lowe, 1974). As demonstrated in
the literature, providing praise contingent on positive behaviors is a necessary skill
for instructional assistants working with students with disabilities. The appropriate
use of praise as reinforcement is crucial to student inclusion in the classroom, because
the use of external rewards is likely to cause disruption in the class and temporarily
remove the learner from the current activity.
Proximity and student attention. Due to deficiencies in pre-service and in-service training, instructional assistants are known to have inadvertent detrimental
effects on students with disabilities and their ability to function independently,
especially when instructional assistants are assigned to provide one-to-one support in
inclusive settings (Giangreco & Broer, 2005b; Giangreco et al., 1997; Giangreco et
al., 2005).
In particular, over-prompting typically occurs when an instructional assistant
is in close and continual proximity to the student (Giangreco & Broer, 2005a;
Giangreco, Broer et al., 2001b; Giangreco, Edelman, Broer et al., 2001; Giangreco et
al., 2005; Harper & McCluskey, 2003). Over-prompting can inadvertently strengthen
the student’s dependence on adult cues (Giangreco et al., 1997), increasing the
likelihood that the student’s attention will be focused on the instructional assistant
and not on the teacher.
Excessive proximity has been shown to have several other unfavorable
effects, such as: (a) interference with the general education teacher’s ownership and
sense of responsibility for the student, (b) separation from peers and inhibition of peer
relationships, (c) unnecessary reliance on adults, (d) decreased levels of teacher
attention and access to quality instruction, (e) loss of personal control, (f) loss of
gender identity, (g) interference with the instruction of other students, (h) limited
relationships, and (i) an increased likelihood of problematic behaviors demonstrated
by the student (Giangreco & Broer, 2005b; Giangreco, Broer et al., 2001b; Giangreco
et al., 1997; Giangreco et al., 2005; Marks et al., 1999).
A small number of researchers have investigated the detrimental effects of
instructional assistant proximity on disabled students in the general education
classroom (Giangreco & Broer, 2005a; Giangreco, Broer et al., 2001b; Giangreco,
Edelman, Broer et al., 2001; Giangreco et al., 1997; Giangreco et al., 2005; Harper &
McCluskey, 2003). Giangreco et al. (1997) reported that after several observations
and interviews, instructional assistants were in unnecessarily close proximity to their
assigned student on a constant basis. As noted by several researchers, this constant
close proximity can hinder both academic independence and social interactions
(Giangreco et al., 1997; Giangreco et al., 2005; Harper & McCluskey, 2003;
Malmgren & Causton-Theoharis, 2006).
Particularly during academic tasks, research suggests that disabled students
who are supported by one-to-one instructional assistants are likely to have fewer
interactions with the general education teacher, thereby limiting the student’s access
to effective instruction given by a certified teacher (Giangreco, Broer et al., 2001b;
Giangreco et al., 1997). In addition, general education teachers who appear less
engaged with disabled students commonly transfer the responsibility of the student to
the instructional assistant.
The researchers observed that this shift of responsibility caused instructional
assistants to design and implement a majority of the student’s curricular and
instructional decisions, inevitably limiting the teacher-student interactions
(Giangreco, Broer et al., 2001b; Giangreco et al., 1997). In such cases, they found
that the general education teacher becomes less knowledgeable about the student’s
current skill levels and areas of deficit and mastery of IEP goals. This lack of
awareness presumably contributes to the lack of instructional decision-making and
lesson planning. Possibly for these reasons, teachers who were not engaged with the
students did not support fading out instructional assistant supports; rather, they
believed the supports to be a necessary part of student inclusion (Giangreco, Broer et
al., 2001b).
In sum, when instructional assistants assume ownership for a disabled student
and his/her education in a general education classroom, it seems that close and
consistent proximity becomes inevitable. This close proximity and over-prompting by
instructional assistants can inadvertently result in the student looking to the
instructional assistant, rather than the teacher, for directives, reinforcement, and
clarification.
It is important to note that researchers find that constant proximity of the
instructional assistant to the student typically results in over-prompting in both
academic and social situations. In a study by Giangreco et al. (1997), instructional
assistants did not fade prompts in order to facilitate independence or responsiveness
to natural cues in the environment. Instead, instructional assistants were seen to
prompt throughout writing tasks, after teacher directions were given, when using
materials and while on the playground. In Giangreco, Edelman, Luiselli and
MacFarland (1997), student dependence was observed in several instances where the
student would look back at the instructional assistant and in some cases attempt to
grab the instructional assistants’ hand, possibly indicating a reliance on the
instructional assistant.
To understand instructional assistants’ perspectives on proximity and its effect
on student performance, Giangreco and Broer (2005) surveyed 153 instructional
assistants in twelve public schools in Vermont. Results of the survey revealed that
less than 15 percent of the instructional assistants felt that their close proximity was
unnecessary or hindered teacher and peer interactions, although 37 percent did
indicate a concern that their student was dependent on them. This is of particular
interest since these same instructional assistants also indicated that they spent more
than 86 percent of their day in close proximity to the student.
Student Academic Engagement
The concept of academic engagement (also known as engagement in academic
responding or on-task behavior) has been constructed as a way of making sense of
how student behaviors are linked to academic outcomes and influenced by the way in
which instruction is delivered and received (Greenwood, Horton, & Utley, 2002).
Within this framework, one way to assess special education students’ engagement in
general education academic tasks is to measure the time that students are actively
engaged in the curricular activity (Bardin & Lewis, 2008; Martella, Nelson, &
Marchand-Martella, 2003).
Academic engagement has been defined in a variety of ways. Most
commonly, academic engagement is defined by a combination of specific student
behaviors, such as appropriately engaging in a task, manipulating objects or work
materials as designed, facing the instructor or task, listening to instruction, engaging
in a social activity, reading aloud or silently, writing, talking about an academic task,
asking and answering questions and in some cases, play behaviors (Conroy, Asmus,
Ladwig, Sellers, & Valcante, 2004; Logan & Malone, 1998; McDonnell, Thorson,
McQuivey, & Kiefer-O'Donnell, 1997; Pelios, MacDuff, & Axelrod, 2003).
Although some researchers include both passive and active behaviors in a
general definition of student engagement, others divide passive and active behaviors
into two separate categories. For example, passive responding is said to occur
when a student is attending to an instructional activity but not actively engaged.
Student behaviors such as waiting, listening, giving direct eye contact and attending
to the teacher would indicate a passive level of engagement. Active responding requires the student to respond to the task directly, such as through verbal communication or physical movement, indicating direct and active involvement in the activity (Hollowood, Salisbury, Rainforth, & Palombaro, 1994;
Hunt, Farron-Davis, Beckstead, Curtis, & Goetz, 1994; Kamps, Leonard, Dugan,
Boland, & Greenwood, 1991; Tindal & Parker, 1987).
Although academic engagement varies slightly by definition and does not
directly measure student academic progress, students demonstrating high levels of
engagement during classroom tasks exhibit a higher degree of academic achievement
(Brophy & Good, 1986) while students demonstrating low levels of academic
engagement are more prone to school failure (Cooper & Spence, 1990). Brophy and
Good (1986) report that general education students who are actively engaged
demonstrate increased achievement. For students with autism and more severe
disabilities, a link has been demonstrated between academic engagement and
criterion-referenced assessments (Kamps et al., 1991; Logan, Bakeman, & Keefe,
1997). In sum, for students with disabilities, the more time spent academically
engaged, the greater the achievement (Bulgren & Carta, 1992; Logan et al., 1997;
Sindelar, Smith, Harriman, Hale, & Wilson, 1989).
Researchers have found a positive correlation between academic engagement
and academic gain. The literature also indicates that in order to function within
various environments, students with autism and related disabilities typically require a
supervising adult to continually prompt engagement (Dunlap & Johnson, 1985;
Dunlap, Koegel, & Johnson, 1987). This has been found to be particularly true in
educational settings (Pelios et al., 2003). In some cases, the elimination of a
supportive adult can have undesirable consequences on student behavior, such as the
demonstration of off-task behaviors, the emergence of stereotypic behaviors, and a
decrease in academic engagement (Dunlap & Johnson, 1985).
Research pertaining to this topic has focused on the effects of teacher-specific
instructional variables, instructional context and interaction patterns on the engaged
behavior of students with and without disabilities in general and special education
classrooms (Bardin & Lewis, 2008; Keefe, 1994; Sindelar, Shearer, Yendol-Hoppey,
& Libert, 2006). Specifically, numerous studies have examined academic engagement
and student achievement of students with and without severe disabilities (Hollowood
et al., 1994; Logan & Malone, 1998), students at risk of school failure (Cooper &
Spence, 1990; Greenwood, 1991), mildly handicapped students included in general
education classes (Alves & Gottlieb, 1986; Thompson, White, & Morgan, 1982),
students with learning disabilities (Bulgren & Carta, 1992), students with autism
spectrum disorders (Conroy et al., 2004; Kamps et al., 1991; McDonnell et al., 1997),
students with moderate, severe and profound disabilities (Keefe, 1994; Logan et al.,
1997) and students with visual impairments (Bardin & Lewis, 2008).
Although many of these studies vary slightly in targeted students, definitions of
academic engagement, and the environments in which observations were
conducted, most utilized an ecobehavioral assessment, an
observational research method used to describe, compare, and assess relationships
between classroom variables, teacher variables and student behavior (Greenwood,
Carta, Kamps, & Arreaga-Mayer, 1990; Logan et al., 1997). This assessment provides
information regarding instructional conditions within a classroom, compares
instructional conditions across classrooms, discerns instructional variables occurring
within the classroom that correlate with high levels of student academic engagement,
and documents changes in classroom variables as a result of experimental
manipulation (Kamps et al., 1991; Logan et al., 1997). The primary focus of such
analysis is to evaluate active student response and engagement (Logan et al., 1997).
Although there are variations of ecobehavioral assessment among the studies
(Cooper & Spence, 1990; Logan & Malone, 1998), the observational data system
used to code each of the variables remains constant. The Code for Instructional
Structure and Student Academic Response (CISSAR), developed by the Juniper
Gardens Children's Project (Stanley & Greenwood, 1981), or its mainstreamed
version, the Code for Instructional Structure and Student Academic
Response-Mainstreamed Version (MS-CISSAR) (Bulgren & Carta, 1992; Carta,
Greenwood, Arreaga-Mayer, & Terry, 1988; Greenwood, 1991; Greenwood et al.,
2002), is a direct observation tool that
employs a momentary time sampling procedure to record students' opportunities to
engage in academic responding as a result of different instructional arrangements in
the classroom (Bulgren & Carta, 1992).
Using this tool, researchers found comparable rates of engagement for
students with a range of disabilities in various general education classrooms. When
student engagement was defined as either active engagement alone or both active
and passive engagement combined, engagement rates were 36 percent for students
with moderate to severe disabilities (Hollowood et al., 1994; Logan et al., 1997), 34
percent for students with moderate to profound disabilities who also engaged in
competing behaviors (Keefe, 1994), 34 percent for students with severe disabilities
with the exclusion of students with disruptive behaviors (Logan & Malone, 1998) and
32 percent for students with low-incidence disabilities (i.e., mental retardation,
physical disabilities and autism) (McConville et al., 1998). Although
these overall engagement scores are similar, these results should be interpreted with
caution due to the variance in the definition of academic engagement.
McDonnell et al. (1997) investigated the academic engaged time, task
management and competing behaviors of six students with and without disabilities in
a general education classroom. For this study, academic engaged time was comprised
of the proportion of intervals in which the student was engaged in academic
responding and task management activities as set forth by the MS-CISSAR.
Examples of academic responding behaviors were writing, manipulating objects
relevant to the task, reading silently or aloud, and engaging in verbal exchanges
regarding the subject matter. Task management behaviors, similar to the previous
definition of passive engagement behaviors, were defined as behaviors required in
order to engage in the assigned task. For example, raising a hand, requesting help,
playing with items approved by the teacher, getting out materials, transitions,
listening and attending were all considered task-management behaviors. An
additional measure, competing behaviors (defined as aggression, disruption,
noncompliance, self-stimulatory behavior and not attending), captured behaviors
considered incompatible with academic engagement and task management. A
20-second momentary time-sampling procedure was employed weekly during
observations of at least 20 minutes across five consecutive months. During the
20-second intervals,
the observers recorded data on events occurring in the classroom as well as teacher
and student behaviors.
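The proportion-of-intervals metric used in such momentary time-sampling systems can be illustrated with a brief, purely hypothetical sketch (the category labels below are stand-ins, not the actual MS-CISSAR codes):

```python
# Illustrative scoring of 20-second momentary time samples.
# Hypothetical codes: "AR" = academic responding, "TM" = task
# management, "CB" = competing behavior.

def engagement_proportions(intervals):
    """Return the percentage of intervals coded in each category."""
    total = len(intervals)
    counts = {}
    for code in intervals:
        counts[code] = counts.get(code, 0) + 1
    return {code: 100.0 * n / total for code, n in counts.items()}

# A 20-minute observation sampled every 20 seconds yields 60 intervals.
observation = ["AR"] * 20 + ["TM"] * 28 + ["CB"] * 12

percents = engagement_proportions(observation)
print(round(percents["AR"], 1))                   # 33.3 (academic responding)
print(round(percents["AR"] + percents["TM"], 1))  # 80.0 (combined engagement)
```

In this invented example, academic engaged time would be reported as the combined academic responding and task management percentage, analogous to the combined figures discussed in this section.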
This study included a noteworthy evaluation of the effect of instructional
assistant support staff on the academic engagement of students with low-incidence
disabilities in inclusive classrooms. The experimental group was divided into two
subgroups, students with disabilities who received continual support from a
one-to-one instructional assistant and those who did not, to determine whether
instructional assistant support affected student engagement in academic tasks.
Students with disabilities in the experimental group were, on average, engaged
in academic responding during 32 percent of the observed intervals, compared with
their typical peers, who were academically engaged for 37 percent of the observed
intervals. The combination of academic responding and task management behaviors
for students with and without disabilities revealed an average rate of engagement of
78.5 percent. An analysis comparing the three dependent measures for students with
disabilities who either did or did not receive support from an instructional assistant
showed that there were no statistically significant differences between the two groups
of students for any of the three variables. Results indicated that students with
disabilities were not statistically different from their grade level peers in terms of
academic engagement and task management behaviors, although students with
disabilities were known to demonstrate higher levels of competing behaviors. It is
interesting to note that this finding has been replicated by other researchers, who have
found students with disabilities to be equally engaged when compared with their
typically developing peers (Hollowood et al., 1994; Ysseldyke, Christenson, Thurlow,
& Skiba, 1987). In a further analysis of instructional grouping and time engaged,
Logan, Bakeman and Keefe (1997) found that the percentage of academic
engagement varied depending on the instructional arrangement. For instance, during
whole class instruction, students with severe disabilities in the general education class
were engaged 23 percent of the time, 43 percent during one-to-one activities, 42
percent during small group and 50 percent during independent work. For the three
instructional contexts that produced the highest level of engagement (one-to-one,
small group instruction and independent work), there was a mean score of 45 percent
engagement across these activities. This suggests that academic engagement can vary
depending on instructional context.
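The context comparison reported by Logan, Bakeman and Keefe (1997) amounts to averaging engagement rates across instructional arrangements; a minimal sketch using only the percentages quoted above:

```python
# Engagement percentages by instructional context, as quoted in the
# text above from Logan, Bakeman and Keefe (1997).
engagement_by_context = {
    "whole class": 23,
    "one-to-one": 43,
    "small group": 42,
    "independent work": 50,
}

# Mean engagement across the three highest-engagement contexts.
top_three = ["one-to-one", "small group", "independent work"]
mean_top = sum(engagement_by_context[c] for c in top_three) / len(top_three)
print(mean_top)  # 45.0, matching the 45 percent mean reported above
```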
Although these findings suggest that students with disabilities are able to
function similarly to their peers in terms of academic engagement throughout the day,
it is unknown whether students who have the assistance of a one-to-one instructional
assistant would be able to demonstrate similar levels of independent academic
engagement without the assistance or prompting from an instructional assistant. Much
research has focused on teacher-delivered instruction within a variety of contexts,
leaving little known about the effects of instructional assistants on student
independent academic engagement during various classroom academic activities
throughout the day.
Training Recommendations
Review of the literature reveals a need for instructional assistant training
programs that are efficient, effective, ongoing, and support the transferability of
newly learned practices into the classroom setting. In particular, instructional
assistants who support a student with disabilities in the general education classroom
need explicit instruction in increasing behaviors that facilitate student
independence and decreasing behaviors that promote dependence.
Major Research Questions and Related Hypotheses
The current study investigates a more effective, efficient instructional assistant
training program that increases instructional assistant behaviors that facilitate student
academic engagement, and decreases behaviors that interfere with students’ academic
engagement in the general education setting. Specifically, this study evaluates
whether the use of a multi-component training package, consisting of components
considered essential for successful staff training, will lead to an increase in effective
behaviors and reduction of behaviors known to contribute to student dependence. As
noted in the literature, instructional assistants (Downing et al., 2000; Hall et al., 1995;
Parsons & Reid, 1999) are likely to require explicit training on basic instructional
behaviors in order to effectively deliver instruction and teach the student to respond
to natural cues in the environment. The major research questions are listed below,
followed by their associated hypotheses.
Question 1
Will instructional assistant training be generalized into the natural environment?
• Instructional assistants in the treatment condition will outperform those in the
control condition who did not receive the training on both a total cognitive
and behavioral posttest measure.
Question 2
Which specific instructional assistant observable behaviors are more likely to
change as a result of training?
• Of the five observable instructional assistant behaviors, the most likely to
show statistically significant improvements are descriptive praise, proximity
and prompting.
Question 3
Which instructional assistant characteristics are most susceptible to training
effects?
• Years of experience, age and gender are hypothesized to affect training
receptiveness.
CHAPTER III
METHOD
An experimental, 2 x 2 repeated measures, pretest-posttest design was used to
test the main hypotheses. The independent variable was a training package, which
included a total of three hours of in-class training and one hour of in-class coaching.
The purpose of the design was to measure both cognitive and behavioral differences
between the experimental and control group, in order to inform a more effective
model of instructional assistant training and practice.
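As a rough illustration of how the design's two factors (group × time) organize the data, the sketch below computes pretest-to-posttest gains by condition. All scores are invented for illustration; the actual analysis would rely on repeated measures statistics, not this simple gain comparison.

```python
# Hypothetical scores arranged in a 2 (group) x 2 (time) layout.
# Every number here is invented purely for illustration.
scores = {
    "treatment": {"pre": [10, 12, 9, 11], "post": [16, 17, 14, 15]},
    "control":   {"pre": [10, 11, 10, 12], "post": [11, 12, 10, 13]},
}

def mean(xs):
    return sum(xs) / len(xs)

# Gain score = posttest mean minus pretest mean, per condition.
for group, times in scores.items():
    gain = mean(times["post"]) - mean(times["pre"])
    print(group, round(gain, 2))
```

A larger treatment-group gain relative to the control group is the pattern the hypotheses above predict; the design tests whether such a difference is statistically reliable.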
Participating Districts
The study took place in three school districts that participate in one of
California’s Special Education Local Plan Areas (SELPA).
District A
As of the 2009-2010 school year, District A served 3,618
kindergarten-6th grade elementary school students in nine local elementary schools.
The district has a diverse population, with Hispanic (49%) and White students (41%)
representing the two largest ethnic populations (Ed-Data, 2010). In schools of District
A, 27 percent of the total enrollment is classified as English Learners, 40 percent
qualify for free or reduced meals, and 26 percent receive compensatory education.
Within the district, there are 193 full-time teachers and a pupil-teacher ratio of
18:1. Of the 193 full-time teachers, 186 are assigned to work in self-contained
classrooms, 15 work as special education teachers and 14 are assigned as “other,”
which includes resource, independent study and alternative program teachers.
Currently, there are 138 instructional assistants employed by the district (60% of
classified staff). The district’s average class size is 20 students school-wide. For
grades kindergarten, first, second, and third, the class size is under 20 students. In the
upper grades, class size averages about 24 students (Ed-Data, 2010).
In 2009, the Local Education Agency (LEA) Overview from the California
Department of Education, Policy and Evaluation Division (2010) indicated that
District A did not make Adequate Yearly Progress (AYP) in percent proficient in
either English-Language Arts or Mathematics. Students with disabilities did not meet
proficiency in either the English-Language Arts or the Mathematics category (CDE Policy
and Evaluation Division, 2010).
As reported by the district, of the 3,618 students, 408 students currently
qualify for special education services, constituting approximately 11 percent of the
total district population. Of the identified students with disabilities, 64 are currently
assigned a one-on-one instructional assistant. Of the 64 students, 31 are included in
the general education classroom with the support of an instructional assistant for a
portion of their day; others receive support in self-contained special education
classrooms. Twenty-five of the 31 students are included in their general education
classrooms for 80 percent or more of the day.
District B
District B currently serves 1,046 kindergarten-6th grade elementary school
students in three local elementary schools. The district has a predominately Hispanic
population (69%), with the second largest population being Caucasian students (24%)
(Ed-Data, 2010). Of the students who attend a school in District B, 40 percent of the
total enrollment is classified as English Learners, 60 percent qualify for free or
reduced meals and 28 percent receive compensatory education.
District B employs 127 full-time teachers and has a pupil-teacher ratio of 20:1. Of
the 127 full-time teachers, 186 are assigned to work in self-contained classrooms, 12
work as Special Education teachers and 13 are assigned as “other,” similar to the
previous district. There are 58 instructional assistants employed by the district, which
make up 43 percent of the classified staff (Ed-Data, 2010). The district’s average
class size is 24 students school-wide. For grades kindergarten, first, second, and third,
the class size is under 20 students. In the upper grades, class size averages about 27
students (Ed-Data, 2010).
In 2010, the Local Education Agency (LEA) Overview from the California
Department of Education, Policy and Evaluation Division (2010) indicated that
District B did not make Adequate Yearly Progress (AYP). The district did meet the
percent proficient standards in English-Language Arts but not in Mathematics.
Students with disabilities did not meet proficiency in Mathematics but did meet
proficiency in English-Language Arts (CDE Policy and Evaluation Division, 2010).
As reported by the district, of the 1,046 students, 119 students currently
qualify for special education services, constituting approximately 11 percent of
the total student population. Of the identified students with disabilities, six are
currently assigned a one-on-one instructional assistant throughout the entire school
day (as designated in their IEPs). The special education director states, however, that
all special education students within the district are mainstreamed for some portion of
the day, even when this is not written into the IEP. Four of the six students with 1:1
aide support designated in the IEP are included in the general education classroom for
the entire day with a part-time assistant. The students who do not have 1:1 aide
support specified in their IEPs either attend their general education classroom without
the support of an aide, or an assistant accompanies two to three students within the
same general education classroom.
District C
District C, the largest of the three districts, currently serves 5,758 kindergarten
through 6th grade elementary school students in thirteen local elementary schools. The
district has a large Hispanic population (66%), with the second largest population
being Caucasian students (28%) (Ed-Data, 2010). Of the students who attend a school
in District C, 40 percent of the total enrollment is classified as English Learners, 46
percent qualify for free or reduced meals and 68 percent receive compensatory
education.
District C employs 282 full-time teachers and has a pupil-teacher ratio of 20.4:1. Of
the 282 full-time teachers, 272 are assigned to work in self-contained classrooms, 19
work as Special Education teachers and 15 are assigned as “other,” as defined by both
District A and District B. There are 140 instructional assistants employed by the
district, which make up 42 percent of the classified staff (Ed-Data, 2010). The
district’s average class size is 22 students school-wide. For grades kindergarten, first,
second, and third, the class size is under 20 students. In the upper grades, class size
averages about 27 students (Ed-Data, 2010).
As of 2010, the Local Education Agency (LEA) Overview from the California
Department of Education, Policy and Evaluation Division (2010) indicated that
District C did not make Adequate Yearly Progress (AYP) for the 2008-2009 school
year. The district did meet the percent proficient standards in English-Language Arts
but not in Mathematics. Students with disabilities did not meet proficiency in
Mathematics or in English-Language Arts (CDE Policy and Evaluation Division,
2010).
As reported by the district, of the 5,758 students, 724 students currently
qualify for special education services, constituting approximately 13 percent of the
total population. Of the identified students with disabilities, 28 are currently assigned
a one-on-one instructional assistant in their IEPs. All 28 students are included in the
general education classroom with the support of an instructional assistant for a
portion of their day; the Special Education Director states that numerous other
special education students also receive aide support at some point in the day, even
though it is not stated in their IEPs. Eighteen of the 28 students are included in their
general education classrooms for 80 percent or more of the day.
Instructional Assistant Employment, Skills and Requirements
The requirements and skills necessary to obtain employment as an
instructional assistant in any of the three districts are similar. Although the position
varies slightly in the basic duties and responsibilities, interested applicants must
possess a basic understanding of special needs, issues and requirements of students
with disabilities, basic knowledge of various school-taught subjects, correct use of
English grammar, spelling and punctuation, interpersonal relations skills,
record-keeping techniques and appropriate conduct.
Instructional assistants must also have the ability to assist with and reinforce
teacher-provided instruction; help with the development of self-help and social skills;
assist with the physical needs of the students; tutor individual or small groups of
students; perform clerical duties such as preparing, typing and duplicating materials;
print legibly and compute simple math calculations quickly and correctly; understand
and follow oral and written directions; communicate effectively; observe and control
student behavior, including restraining and disciplining students according to
approved policies and procedures; report student academic and behavioral progress;
operate a computer; and maintain cooperative and effective working relationships
with others. Previous education must include graduation from high school and two
years of experience working with school-aged children in a structured setting.
Adhering to changing state requirements, instructional assistants must also hold a
Bachelor’s degree or an Associate’s degree, or pass a general examination developed
by the State of California.
There are only minor differences between the districts in terms of hiring
instructional assistants. In District A, instructional assistants can be hired as either a
Special Education Instructional Assistant or a Severely Handicapped Special
Education Assistant. Both types of instructional assistants support students in their
general education classroom, but differ in the environments in which they provide the
majority of this support. A Special Education Instructional Assistant works primarily
in the resource, speech or learning handicapped classrooms, while the Severely
Handicapped Instructional Assistant works in inclusion and the special day class
(SDC) classroom. Districts B and C do not differentiate the title or environment to
which the instructional assistant will be assigned.
District Pre-Service Training and Orientation Procedures
Although the districts are similar in their hiring processes and requirements,
once an instructional assistant is hired the districts vary in their pre-service training
and orientation procedures.
District A. Newly-hired instructional assistants receive an Instructional
Assistant Training Manual which provides information such as procedures for
absences, confidentiality requirements, instructional assistant do’s and don’ts, tips for
working in the general education setting, procedures for addressing concerns, child
abuse reporting and tips on providing academic, behavioral, social and peripheral
support. At the beginning of the year, the district’s two full-time inclusion specialists
and Special Education Director provide an initial training for all of the instructional
assistants based on the manual’s contents. The training lasts either
one or two full days, and all instructional assistants are required to attend.
Instructional assistants also receive a one-day training in Brain Gym, which is a
movement-based program used to promote play and the joy of learning (Dennison &
Dennison, 1980) and Crisis Prevention Intervention (CPI) to teach crisis prevention
and de-escalation. During the year in which the study was conducted, the district
supported two inclusion aides to attend a one-day Applied Behavior Analysis (ABA)
training and two inclusion aides to attend a six-day ABA training, although none
of the instructional assistants who participated in the study reported receiving either
of these ABA trainings.
Throughout the year, both district inclusion specialists occasionally observe
and provide feedback to their instructional assistants as a way of providing ongoing
training. Other trainings are offered to instructional assistants throughout the year but
are not required by the district. If a specific school principal requests it, the inclusion
specialists can provide training for the school’s instructional assistants. For the 2009-2010 school year, the inclusion specialists or other professionals in the field provided
training on general characteristics of autism, writing substitute plans, and
communicating with parents. Each year the goal is to enhance the aides’ knowledge
in specific areas such as behavior management, teaching social skills, sensory
concerns, writing social stories and information regarding various disabilities.
District B. New instructional assistants also receive a training manual,
although the contents vary slightly and include information regarding autism and
Asperger’s, general guidelines for interventions and best practices, techniques to
promote independence and fade support, tips for handling tantrums and behavioral
outbursts, and adaptations and modifications for various school subjects. Instructional
assistants could also check out a smaller manual, The Paraprofessional’s Role in
Inclusive Classrooms: Support Manual, which is accompanied by a video. Once
an assistant is hired, the inclusion/resource specialist provides orientation training and the manual.
The inclusion/resource specialist also stated that much of the time the instructional
assistants are not able to observe the student prior to working with them, although she
does provide on-the-job orientation training to ensure their comfort in their new
position.
Throughout the year, the inclusion/resource specialist provides monthly
trainings for an hour after school with the instructional assistants. She provides some
of these trainings, and an expert or professional in the field is asked to give others. In
the beginning of the year, trainings are focused on reviewing IEPs and creating “fast
fact” sheets about the strengths, challenges and goals of each student. Instructional
assistants also receive training early on in the year regarding basic behavior
management strategies, facilitating social skills and CPI training. Follow up trainings
throughout the year continue to reiterate the themes and skills from the beginning of
the year and primarily focus on autism, behavior management and academic support
strategies. Similar to District A, various other trainings are offered throughout the
year, ranging from an hour or two to an all-day training, either district-wide or
SELPA-wide, but are not required.
District C. Administrators are currently putting together a training manual for
newly hired instructional assistants. At the time of the study, no initial training or
manual was provided. During the year of the study, the district held an all day
training on autism entitled, “Autism Spectrum Disorder (ASD): What every
paraeducator should know” presented by the Southern California Diagnostic Center.
This training was well-attended by a variety of instructional assistants, related service
providers and administrators but was not required by the district. Although the
resource teacher from a participating school within the district attended, the two
participants from the district who took part in the study did not attend. This district
reported that it typically sends aides to various professional development activities;
however, for this school year it is unclear whether any instructional assistants were
sent or which trainings they attended. As with the previous districts, District C also provides
trainings primarily focusing on autism, classroom management and behavior.
Although the three participating districts provided some form of training for
instructional assistants, these trainings lacked targeted instruction on specific
instructional support behaviors used to increase student independent academic
engagement. District or school trainings frequently provided conceptual information
regarding general practices but lacked detailed information on specific instructional
supports as well as how and when to employ such behaviors in the classroom. These
trainings did not include a coaching component or a plan to assist with the
transferability of skills from the training environment to the classroom.
Participants
Instructional assistant participants. The primary participants were 31
instructional assistants (25 female and 6 male) employed by one of the eight
elementary schools within the three school districts participating in the study.
The Superintendent, Assistant Superintendent and/or Director of Special
Education for each school district referred instructional assistants who were currently
supporting students with disabilities in the general education classroom during the
2009-2010 school year. Although most of the instructional assistants had indicated
they had received some type of training either prior to or during employment, it was
clear that none of the participants had previously attended a training focusing
primarily on the fundamental support behaviors for instructional assistants.
In order to participate in the study, instructional assistants had to meet the
following inclusion criteria: (1) employment in a public elementary school in one of
the participating school districts, (2) female or male and between the ages of 18
and 65 years old, (3) currently supporting a student with disabilities in an inclusive
setting for at least 50 percent of the school day, (4) no previous training in
fundamental support behaviors, and (5) interest in participating.
To determine which instructional assistants met these criteria, the researcher
contacted the case managers for each included student within each of the participating
school districts. Case managers were asked the following questions: (1) if the student
was included in the general education setting, (2) for what percentage of the day, (3)
what activities the student was included in, (4) if the assigned instructional assistant
accompanied and assisted the student in the general education class and (5) if the
instructional assistant provided continual support to the student or monitored the
student while assisting other students. This information was used to determine which
instructional assistants met the criteria for participation and how instructional
assistants were assigned throughout each of the districts. Instructional assistants who
met the criteria were asked to participate through written consent during the
2009-2010 school year. When an adequate sample was obtained, all participants were asked
to fill out an instructional assistant questionnaire to gain pertinent demographic and
job-related information (see Appendix A).
Of the included instructional assistant participants, ages ranged from 20 to 65
years, with an ethnic composition of 20 Caucasian, 11 Hispanic and 2 other
participants. The amount and type of schooling completed by each instructional
assistant ranged from completion of high school to graduate school (see Table 1).
Years of experience as an instructional assistant ranged from less than a year
to 19 years with the number of years working in an inclusive setting ranging from less
than a year to 15 years. Although participants may have worked several years as an
instructional assistant, the three districts were known to move instructional assistants
to different schools or different students within the district both yearly and throughout
the school year, depending on student needs and determined “best fit” between the
instructional assistant and the student. The number of years at the instructional
assistant’s current school ranged from less than a year to 9 years.
It should be noted that although 31 participants were recruited, the distribution
of instructional assistants was not equivalent across districts with 20 from District A,
9 from District B and 2 from District C. Interestingly, of the 31 instructional
assistants participating, 22 (71%) were able to correctly identify their student’s
primary diagnosis, while 6 (19%) indicated the wrong disability and 3 (10%) did not
answer the question on the initial demographics questionnaire.
Students with disabilities: The secondary participants were 31 students with
disabilities (6 female and 25 male) ranging from 5-12 years old diagnosed with either
a low or high-incidence disability (see Table 2). The student participants selected for
inclusion in the study met the following criteria: 1) diagnosed with one of the thirteen
disabilities as defined by IDEA, 2) participated in a general education classroom for
50% or more of the school day, 3) received one-on-one support from a participating
instructional assistant and attended an elementary school in one of the three
participating districts and 4) was given written consent by a parent/guardian to
participate.
As to ethnic composition, students were predominately White (52%) or
Hispanic (36%) and attended a kindergarten through sixth grade elementary school
within one of the three school districts. Each of the general education classrooms the
student participants attended varied in size, number of mainstreamed students with disabilities, and achievement level. No notable differences were observed between
schools and classrooms although appearance, set up and daily schedules varied
depending on the teacher and grade level. The general education teachers were
predominately female, varying in age, education and professional experience.
Although specific curriculum was not analyzed, all teachers taught the basic subjects
of Language Arts, Math, and Social Sciences.
Students had a combination of low- and high-incidence disabilities, with their primary diagnoses verified through each of their IEPs. Of these students, 12 also had a
secondary diagnosis. Only one student, diagnosed with Emotional Disturbance, had a
current Behavior Support Plan (BSP) as indicated in the student IEPs.
Although disability and age varied, all students were able to communicate verbally. Verbal communication skills ranged from limited verbal ability (could indicate a response using one- to two-word answers) to grade-level verbal ability (able to use verbal language in a variety of ways similar to general education peers). No student used augmentative communication, although one student diagnosed with a visual impairment did use a magnifier to enlarge worksheet pages on a computer screen in order to better see the page.
Experimental Design
An experimental, 2 x 2 repeated measures, pretest-posttest design was used to test the main hypotheses. Specifically, the design compared the effects of a treatment condition, consisting of carefully designed instructional assistant training, with a control condition in which instructional assistants received no in-class instruction. Consenting participants were randomly assigned to either the treatment or control condition.
The control condition included instructional assistants who received no in-class training. Instructional assistants in the treatment condition received the carefully designed training package. This type of group training represented a prototypical training method commonly utilized by schools to provide instruction to groups of similar school personnel.
Measuring Instruments
Four kinds of measures were utilized in this study. The first measure, a
cognitive measure, included twelve short vignettes. These were developed by the
researcher and validated by fellow graduate students and experts with background in
Applied Behavior Analysis (see Appendix B). This measure was “cognitive” in that it
asked instructional assistants to think about a scenario and to thoughtfully evaluate it.
This measure served as a pretest-posttest measure to determine if a proximal training
effect was achieved, and as an initial assessment to determine each instructional
assistant’s current skill level. Each vignette tested whether the instructional assistant
could correctly assess another fictitious instructional assistant, based on one of the
observable instructional assistant support behaviors.
Each of the observable instructional assistant behaviors was measured three
times, using three different vignettes to ensure an accurate measurement for each of
the behaviors. The vignettes were randomly ordered in a packet given to each of the
participating instructional assistants prior to the training. Each of the vignettes was
written from the perspective of a supervisor conducting an in-class observation,
creating a hypothetical situation likely to be observed in an elementary school
classroom. In order to eliminate variability in instructional assistant experience that
might influence responses, the introduction to each vignette stated that the identified
instructional assistant in the fictional scenario had worked with the target student for
two years. Each vignette also included the instructional assistant’s name, the student’s
name, and the student’s grade level.
In order to ensure that the vignettes were intelligible, each vignette was
written at a sixth grade reading level. Instructional assistants were asked to respond to
each of the twelve vignettes using a 6-point Likert scale ranging from not effective to
very effective.
The purpose of the cognitive measure was to determine if the participant could distinguish between an effective and an ineffective instructional assistant. Although the instructional assistants were asked to score the vignettes on a Likert scale of one to six, instructional assistant effectiveness is not easily quantified as one correct answer. Therefore, participants received a categorical score of either correct or incorrect for each vignette: correctness was defined as a rating in the high range (4-6) or the low range (1-3), according to the predetermined effectiveness of the instructional assistant in each vignette.
Table 1 provides the instructional assistant observable behavior category of each
vignette, description and whether the correct answer was in the high or low range.
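The categorical scoring rule just described reduces to a simple comparison, sketched below. The function name and signature are illustrative assumptions for exposition, not part of the study's materials.

```python
def score_vignette(rating: int, predetermined_effective: bool) -> bool:
    """Return True (correct) when a 6-point Likert rating falls in the
    range matching the vignette's predetermined effectiveness:
    high range (4-6) for effective, low range (1-3) for ineffective."""
    if not 1 <= rating <= 6:
        raise ValueError("rating must be on the 1-6 Likert scale")
    rated_effective = rating >= 4  # 4-6 counts as the high range
    return rated_effective == predetermined_effective
```

For example, a rating of 5 on a vignette predetermined to depict an effective instructional assistant would be scored correct, while a rating of 2 on the same vignette would be scored incorrect.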
It should be noted that although three vignettes targeted instructional assistant wait time, this behavior was not specifically measured. Preliminary observations deemed it too difficult to observe and record accurately in combination with the other seven instructional assistant and student behaviors. Although wait time was not directly observed, the researcher felt it should be included in the vignettes because preliminary observations revealed that instructional assistants often provided no wait time between prompts. The absence of wait time was typically accompanied by high levels of verbal prompts, repeated until the student responded.
Table 1
Vignette Type, Answer and Description

1. Proximity (Ineffective): IA sits next to the student throughout the entire lesson, even when the student is working independently.
2. Prompting (Effective): IA uses a prompting sequence to prompt the correct response.
3. Wait Time (Ineffective): IA does not provide wait time between prompts.
4. Descriptive Praise (Effective): IA gives descriptive praise for appropriate behavior and academic behaviors.
5. Proximity (Effective): IA adjusts proximity when the student begins to demonstrate independence.
6. Descriptive Praise (Ineffective): IA continually uses generic praise to reinforce student behavior.
7. Prompting (Ineffective): IA immediately begins to use physical prompting to prompt the student to task.
8. Wait Time (Effective): IA provides wait time after teacher instruction or prompt.
9. Wait Time (Effective): IA provides wait time after teacher instruction or prompt.
10. Descriptive Praise (Ineffective): IA provides redirection back to task but no descriptive praise.
11. Prompting (Ineffective): IA uses the same verbal prompt repeatedly to evoke a student response.
12. Proximity (Ineffective): IA adjusts proximity prior to the student demonstrating independence with the task.
The vignette measure was not a substitute for direct observation but instead a measure of each instructional assistant's susceptibility to training. Susceptibility to training may be best understood as the instructional assistant's conscious understanding of good practices, even if the instructional assistant does not demonstrate these practices. Therefore, it was assumed that direct observation alone could not capture this understanding apart from each instructional assistant's actual teaching ability.
The second measure used was a 10-minute direct observation instrument to
capture the dyadic interaction between the student and the instructional assistant. In
order to quantify the effectiveness of the interaction, both student and instructional
assistant behaviors were directly observed for ten minutes. In this study, the term total observable behaviors describes the student's and instructional assistant's combined behaviors, both effective and ineffective.
To measure the total observable behaviors, an omnibus score was calculated as the simple sum of the interval counts for the four effective observable behaviors (student academic engagement, prompts to the teacher or natural cue, descriptive praise and movement of proximity of two or more feet) minus the simple sum of the interval counts for the three ineffective behaviors (student challenging behavior, prompts to the task and general (non-descriptive) reinforcement) for each participant. Blank intervals were counted as neither desirable nor undesirable and were therefore withheld from this analysis. This scoring yields an omnibus score that is easily interpretable: a positive omnibus score indicates a high rate of effective behaviors, while a negative omnibus score reflects a high rate of ineffective behaviors.
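As a minimal sketch of this calculation (assuming per-behavior interval counts as input; the variable names are illustrative, not the study's instrument):

```python
# Behavior categories as described in the text; names are assumptions.
EFFECTIVE = ("academic_engagement", "prompt_to_teacher_or_cue",
             "descriptive_praise", "proximity_adjustment")
INEFFECTIVE = ("challenging_behavior", "prompt_to_task",
               "general_praise")

def omnibus_score(counts: dict) -> int:
    """Sum of effective-behavior interval counts minus sum of
    ineffective-behavior interval counts. Blank intervals simply do
    not appear in the counts, so they affect neither side."""
    effective = sum(counts.get(k, 0) for k in EFFECTIVE)
    ineffective = sum(counts.get(k, 0) for k in INEFFECTIVE)
    return effective - ineffective

# Example: 35 effective intervals minus 18 ineffective intervals = 17,
# a positive score indicating mostly effective behaviors.
counts = {"academic_engagement": 30, "descriptive_praise": 5,
          "prompt_to_task": 10, "general_praise": 8}
```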
After numerous preliminary observations, the ten-minute observation was
broken down into sixty ten-second intervals. The researcher determined that this
interval length allowed enough time for at least one of the behaviors to occur, but also
limited the number of times that the behavior could occur.
The direct observation instrument was made up of seven observable
behaviors: two student behaviors and five instructional assistant behaviors (see
Appendix C). These student and instructional assistant behaviors were derived from
the literature on student engagement, effective instructional assistant supports and
extensive direct observation by the primary researcher.
Two student behaviors were observed and recorded: independent academic
engagement and the demonstration of challenging behavior. The five target
instructional assistant behaviors were prompting, verbal praise and proximity, but
both prompting and verbal praise were further broken down to reflect the type of
prompt or type of praise given, and further differentiated by prompts to the teacher or
natural cue or prompts to the task. Prompts to the natural cue (also known in the
literature as a indirect prompt) were considered an important initial prompt that
should be given prior to a prompt to the task, in order to direct the student’s attention
to the teacher or environmental cue. For this reason, prompts to the teacher or natural
cues were scored separately. As noted previously, although wait time was not directly
measured, the researcher assumed that increased use of wait time by the instructional
assistant would reduce the frequency of prompts given in a particular interval. Thus,
the more wait time, the less opportunity to prompt. Verbal praise was also broken
down into both descriptive and general praise, to determine the frequency of each.
The researcher collected data on one student and instructional assistant dyad at a time, using either whole interval or partial interval recording for each targeted behavior in each 10s interval of a 10-minute observation. A 10s whole interval recording procedure was used to measure the continuous and independent academic engagement of each of the students with disabilities in the general education classroom. Whole interval recording is a conservative measure known to slightly underestimate the occurrence of the targeted behavior and is typically used when the goal is to increase the behavior. An earpiece that cued the observer at the beginning and end of each interval assured strict adherence to the 10s time frame. At the end of each interval, the observer circled “academic engagement” if the student had been engaged independently throughout the entire 10s interval.
For the purposes of this study, “independent academic engagement” was
defined as actively or passively engaging in the assigned activity. For instance, the
student might be actively engaged by complying with teacher-delivered directives,
using instructional materials as designed and appropriately interacting with the
assignment or task by writing, responding, tracing, talking to a peer or teacher about
the current activity, and/or reading out loud. The student may have been passively
engaged by orienting toward and attending to the adult delivering instructions,
reading silently, looking at the overhead or looking at the instructional materials
(Baker, Lang, & O'Reilly, 2009; Chafouleas et al., 2010).
Challenging behavior was defined as verbal or physical aggression, property
destruction, tantrums, verbal refusal to complete the task and elopement. Stereotypic
behaviors such as repetitive body rocking, hand flapping, vocalizations or other
rhythmic patterns of behavior were not counted as challenging behavior. Since these
types of behaviors are known to provide automatic reinforcement for the student, it
would be difficult to discriminate whether the stereotypic behavior was a function of
automatic reinforcement or escape/avoidance behavior.
Researchers used a partial interval recording procedure to measure the
occurrence or non-occurrence of each of the five measures of instructional assistant
behavior during each 10s interval. This data collection procedure produces a slight
overestimate of the behavior and is typically used to determine whether a behavior
did or did not occur within a given interval. Since the observer was recording
occurrence versus nonoccurrence, it was then possible to measure multiple behaviors
concurrently (Cooper et al., 2007).
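The contrast between the two recording rules can be sketched as follows, assuming each 10s interval is represented as ten per-second observations. This is an illustrative simplification: the observers recorded interval-level judgments, not per-second data.

```python
def whole_interval(interval_seconds: list) -> bool:
    """Score the interval only if the behavior occurred throughout the
    entire interval (conservative; slightly underestimates occurrence,
    used here for student academic engagement)."""
    return all(interval_seconds)

def partial_interval(interval_seconds: list) -> bool:
    """Score the interval if the behavior occurred at any point
    (slightly overestimates occurrence, used here for the five
    instructional assistant behaviors)."""
    return any(interval_seconds)

# Behavior present for 7 of the 10 seconds in one interval:
seconds = [True] * 7 + [False] * 3
```

With the example above, whole interval recording would not score the interval, while partial interval recording would.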
Partial interval data was collected on two types of academic prompts: prompt
to the teacher or natural cue, and prompts to the task. It should be noted that observers
did not count any prompts given to manage student behavior. For example, prompts
referring to stereotypy, vocalizations or challenging behaviors were not scored since
these were prompts to redirect student behavior and not prompts relating to the
academic task. Also, since instructional assistant prompting and student independent
academic engagement were incompatible behaviors, there were no intervals in which
an occurrence of a prompt and student independent academic engagement were
recorded in the same interval.
A prompt which directs a student to the teacher or a natural cue is also known
in the literature as an indirect verbal prompt (Breen & Haring, 1991; Coe et al., 1990;
Cuvo, 1981). This type of prompt helps to guide the student to the correct behavior
without explicitly stating what specific behavior is to be performed (Alan, 2010;
Kraemer, Morton, & Wright, 1998). For the purposes of this study, a prompt to the
teacher or natural cue was defined as the instructional assistant providing the student
with verbal hints about the expected behavior. For example, an instructional assistant
might say, “What’s next?” or, “What are your friends doing?” in order to evoke the
correct response. This type of prompt was scored separately from the prompt to task
variable for two reasons. First, it is commonly known that students with autism and
related disabilities do not attend to or take cues from their environment. Therefore, it
is vital that students with disabilities in general education classrooms learn to attend
to and respond to environmental cues as the other students do. It is equally important
that instructional assistants first attempt to direct the student’s attention to the teacher
prior to providing prompts to the task itself. A student with disabilities will not learn
to function independently in a general education classroom if the student only takes
direction from the instructional assistant. Therefore, instructional assistants were
advised to prompt the student to the teacher or the natural cues in the environment
prior to using prompts directed towards the task.
A “prompt to task” was defined as the use of a verbal, gestural, model, visual
or physical prompt to evoke a correct academic response from the student. A single
prompt, a combination of prompts or the repetition of the same prompt were all
recorded as an occurrence of a prompt to the task for the ten-second interval in which
it occurred. It should be noted that the data collected on prompts is not a reflection of
the number of prompts occurring in each interval, but instead whether or not a prompt
was given in the designated 10 seconds.
Descriptive reinforcement, also known as behavior-specific praise or descriptive praise, was defined as a specific verbal statement regarding the appropriate student behavior (Chalk & Bizo, 2004; Conroy, Sutherland, Snyder, Al-Hendawi, & Vo, 2009; Sutherland, Wehby, & Copeland, 2000). For the praise to be counted as descriptive, the statement must have explicitly named the student's behavior and been contingent upon a desired academic behavior. For example, statements such as, “I like the way you wrote your name,” or “Excellent job counting to ten by yourself” were counted.
In contrast, the definition of general praise was taken from Sutherland, Wehby and Copeland (2000), who defined non-behavior-specific praise as a verbal praise statement given by the teacher that did not specify the behavior for which the student was being praised. Examples included, but were not limited to, phrases such as “Great job,” “Way to go,” or “You did it,” and tactile praise such as a pat on the back or a high five.
Instructional assistants were determined to be out of a student's proximity when the assistant was two or more feet away from the student at any point in the ten-second interval. Because it would be difficult to determine whether an instructional assistant's movement away from the student was purposeful, a more conservative approach was taken: regardless of whether the instructional assistant appeared to be moving away from the student intentionally or accidentally, the interval was coded as an adjustment in proximity once the instructional assistant stepped two or more feet away from the targeted student. If the instructional assistant remained out of proximity across several intervals, each of those intervals was recorded as a change in proximity.
In instances where none of the targeted behaviors were occurring, the interval
was left blank. Blank intervals typically included incidents where the student was not
academically engaged, not being prompted and not interacting with the instructional
assistant.
The third measure was a posttest interview for selected participants from the
treatment group. This was a semi-structured interview with open-ended questions
designed to assess the instructional assistant’s behavior and decision-making process
(see Appendix D). Because all of the participants in the treatment group demonstrated
similar gains in knowledge from pretest to posttest on the vignette measure, a random
selection of five instructional assistants was asked to participate in an interview.
The fourth measure was a social validity questionnaire to determine if the
instructional assistants felt that the components of the training were acceptable and
could be reasonably implemented in the classroom (see Appendix E). As stated by Lane and Beebe-Frankenberger (2003), “If an intervention is viewed as socially acceptable there is higher probability that it will be implemented with treatment integrity than if the intervention procedures were initially viewed to be unacceptable” (p. 33). The responses to the 12-item questionnaire were measured on a five-point Likert scale (strongly disagree, disagree somewhat, neutral, agree somewhat, and strongly agree) to determine instructional assistant acceptability.
Procedures
Informed consent. The Superintendent of each of the three school districts
provided the names of the students with disabilities included in general education,
instructional assistants who supported these students and the corresponding general
education teacher. A written overview of the purpose of the study, the minimal risks
involved, and the consent procedures was delivered to each of the schools and placed in the boxes of the instructional assistants and teachers who met the criteria for participation. Parent consent forms were sent home with each of the prospective
students.
Instructional assistants and parents were informed that their identities, their children's identities, pre-posttest measures, and direct observations would be kept confidential and used for scoring purposes only. Instructional assistants were asked to consent to participate on a voluntary basis only and were informed that declining participation would not affect their jobs in any way. Similarly, a parent or guardian was asked to provide voluntary consent in order for their child to participate in the study. Parents were informed that the alternative to their child participating in this research was that their child's instructional assistant would not receive training and would continue to work with their child as before.
Since instructional assistant voluntary participation was initially low, the
researcher personally recruited instructional assistants from each of the districts by
speaking at school and district meetings, meeting personally with small groups or
individual instructional assistants and speaking with special education teachers to
describe the purpose, procedures and requirements of the study.
Intervention procedures. Instructional assistants who consented to participate
in the study were initially asked to complete the following tasks: 1) respond to the
twelve vignettes, 2) complete a questionnaire with demographic and job-related information and 3) provide a rank-ordered list of academic activities in which the student required instructional assistant support, for observation purposes (see
Appendix F). The list of academic activities was utilized to identify classroom
academic activities that the student required instructional assistant support to
complete. Instructional assistants were instructed to rank the activities from one to
three, from most to least support required. All instructional assistants in the treatment group indicated Language Arts, Mathematics or Science as the primary academic activity with which the student was known to struggle. For this reason, Language Arts, Mathematics or Science was observed for each participating student and instructional assistant pair.
In order to ensure that the instructional assistants responded to the packet of
vignettes and questionnaires and returned them in a timely manner, each instructional
assistant who submitted their completed packets within a two-day time limit was
entered into a raffle for a $50 gift card.
After receiving the returned packets, instructional assistant responses to the
vignettes were scored. The primary researcher then randomly assigned instructional assistants to the control or treatment condition. To ensure that the instructional assistants did not significantly differ in their knowledge at pretest, the researcher conducted a one-way Analysis of Variance (ANOVA). Results of this test showed no significant differences in knowledge between groups; therefore, no further action was necessary to balance the levels of experience and skill of the instructional assistants across groups.
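The pretest-equivalence check can be illustrated with the F statistic underlying a one-way ANOVA; with two groups, this reduces to the ratio of between-group to within-group variance shown below. This is a pure-Python sketch for exposition; the study presumably used statistical software, and the function name is an assumption.

```python
from statistics import mean

def one_way_f(group_a: list, group_b: list) -> float:
    """F statistic for a one-way ANOVA with exactly two groups:
    F = (SS_between / df_between) / (SS_within / df_within)."""
    grand = mean(group_a + group_b)
    # Between-group sum of squares: group sizes times squared
    # deviations of group means from the grand mean.
    ssb = (len(group_a) * (mean(group_a) - grand) ** 2
           + len(group_b) * (mean(group_b) - grand) ** 2)
    # Within-group sum of squares: deviations from each group's mean.
    ssw = (sum((x - mean(group_a)) ** 2 for x in group_a)
           + sum((x - mean(group_b)) ** 2 for x in group_b))
    df_between = 1                              # two groups - 1
    df_within = len(group_a) + len(group_b) - 2
    return (ssb / df_between) / (ssw / df_within)
```

An F statistic near zero (and a correspondingly large p value) indicates no meaningful pretest difference between conditions, which is the outcome reported above.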
After participants were assigned to one of the two conditions, the primary researcher and research assistants collected direct observation data to establish a baseline measure for each participating instructional assistant during Language Arts, Math or Science
in the general education classroom. Although the scheduling for these subjects varied,
observations typically occurred in the morning and before lunch, during the more
intensive academic time. Each student was observed during the same content area
across all observations. If the student, instructional assistant or teacher was absent,
the observation was rescheduled for another day in order to avoid any confounding
variables.
The primary researcher trained two undergraduate research assistants and a fellow graduate student in data collection procedures and observation methods. Training occurred across two months, both in a small group and individually with the researcher.
Training included memorization of behavioral definitions, examples and non-examples, videotape practice, and in-vivo training with the researcher. Data collectors
were considered sufficiently trained when 85 percent or greater agreement was
reached for each of the direct observation variables. Since the fellow graduate student
had previously worked with students with disabilities and was familiar with the study,
behavior change procedures and data collection procedures, significantly less time
was required to meet an adequate agreement with the primary researcher. Each data
collector reached and maintained agreement with all other data collectors prior to
collecting data independently.
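The 85 percent criterion is consistent with interval-by-interval percent agreement, a common interobserver agreement index for interval data. Since the exact formula used is not reported, the following is an assumed sketch:

```python
def percent_agreement(obs_a: list, obs_b: list) -> float:
    """Interval-by-interval agreement between two observers' records:
    number of intervals scored identically, divided by the total
    number of intervals, times 100."""
    if len(obs_a) != len(obs_b) or not obs_a:
        raise ValueError("records must be non-empty and equal length")
    agreements = sum(a == b for a, b in zip(obs_a, obs_b))
    return 100.0 * agreements / len(obs_a)
```

For example, two observers who agreed on 51 of 60 ten-second intervals would reach 85 percent agreement, the training criterion described above.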
All baseline sessions occurred in the natural classroom environment between
February 22nd and March 9th. Each instructional assistant was observed during a 10-minute activity previously identified by the instructional assistant. The observer
entered the classroom at the scheduled time, found a seat close to the student and
instructional assistant, and either waited for the scheduled activity to start, or
immediately began recording data if the activity was already occurring. The observer
watched the activity for 10 minutes and subsequently rated the student’s and
instructional assistant’s behaviors using the direct observation instrument. If the
student received a break during the 10-minute observation, the observation was
stopped and a line was drawn under the last interval to indicate that a break had
occurred. Once the student returned from the break, data collection resumed after the
instructional assistant gave the first prompt, or the student’s behavior indicated
engagement.
During each observation, the observer sat close enough to the instructional
assistant and student to hear the directives given by the instructional assistant to the
student, the types of prompt(s) being utilized and the student’s response to
instructional assistant directives and prompts. In addition to rating the targeted
instructional assistant behaviors and student behaviors, observers also recorded
descriptive information regarding the time of day, the activity being observed, the
grade level and any other environmental factors that could potentially influence the
interactions between the instructional assistant and student. Observers also noted any
interactions or behaviors of the instructional assistant that were of interest. For example, because the partial interval recording procedure confirmed only the occurrence or non-occurrence of prompting, during baseline the observers transcribed the prompting sequences and verbal prompts used by the instructional assistant to reflect the number of prompts given within a 10s interval.
Following baseline, the treatment participants received three one-hour
trainings over a seven-week period for Districts A and B. Participants in District C
received the trainings over an eight-week period due to scheduling conflicts. The
training employed a multi-component training package consisting of didactic
instruction, modeling, practice, feedback and coaching. The first training occurred
shortly after the last of the baseline data had been collected. The second training
occurred during the fourth week for Districts A and B and fifth week for District C.
The final training occurred during the seventh week for Districts A and B and eighth
week for District C. Training occurred immediately after school hours in either a
classroom or a conference room. Each of the three districts was trained separately in
order to accommodate the instructional assistants and eliminate any drive time and
inconvenience. To ensure the trainings were the same across districts, the researcher assessed fidelity of implementation using a checklist and presented only the information on the slides or in the notes written prior to the training. The only variation in the trainings derived from questions and comments made by the participants.
Instructional assistants in the treatment group were allowed to sit in self-selected seats for each of the trainings. Instructional assistants were asked to bring examples of their student's work to use and refer to during practice sessions in the training. Examples of student work included writing samples, math worksheets and reading tasks for which the participants felt they needed to provide continual and ongoing support. Bringing these examples was intended to motivate participants to think critically about how to use the skills learned in training when supporting their student in the general education classroom.
The primary researcher presupposed no prior knowledge of basic instructional assistant support behaviors. Given the timing of the training, instructional assistants had likely been exposed to some training at the beginning of the 2009-2010 school year, although it is highly unlikely that any instructional assistant had received this type of focused training during their employment. It was assumed that participating instructional assistants had had time to become familiar with the student and the student's academic and social abilities.
Prior to the beginning of the initial training, the primary researcher
introduced herself and described her training and experience. In an attempt to ensure
that instructional assistants in the control group would not receive any information
from the training, participants in the treatment group were asked to not share any
information from the training with other instructional assistants. This instruction was
also printed on the copies of the PowerPoint slides and handouts from the first day of
training.
The introduction included a brief definition of the role of the instructional
assistant, the goals of inclusion, the benefits of learning fundamental support
behaviors and the inadvertent detrimental effects of one-on-one support on students
with disabilities. This was reiterated in more detail in handouts given to the
instructional assistants. Instructional assistants were asked to comment on their
understanding of their role as an instructional assistant and the goal of inclusion, in
order to assess how instructional assistants perceive their job, what their duties are,
and how their job relates to inclusion.
The researcher discussed the current utilization of instructional assistants, and
then presented a more effective model to increase student performance. The
researcher described the current role of the instructional assistant as an underutilized support staff member who typically functions as a reactive monitor. This was illustrated through examples of instructional assistants who monitor a student until an error occurs: once an error occurs, the instructional assistant steps in to correct it and continues to monitor until another error is made. This model
most likely occurs for numerous reasons, the most pertinent being that instructional
assistants are rarely informed of effective instructional supports and how they relate
to the final outcome. Although utilizing instructional assistants in this way can
provide some benefits to the student, a more intensive model could increase student
performance.
Next, the participants were introduced to behavioral theory while concurrently
translating this theory into real-life classroom practices. All participants were asked to
actively participate in group activities and to draw on their current or previous
experiences in providing one-to-one support to students with disabilities in the
general education setting. Repeatedly referring back to classroom experiences not
only reinforced the theory being taught in a meaningful way, but also motivated the
participants to continue to think critically about their own behavior and how their
behavior relates to the goal of inclusion and supporting their specific student.
From the wealth of topics that would be relevant and helpful to instructional
assistants, the researcher carefully selected fundamental instructional support
behaviors. The training walked instructional assistants through these effective
instructional support skills based on Applied Behavior Analysis principles. The
researcher intended to teach instructional assistants to make conscious decisions
about the level and type of support they provide, in order to facilitate student
independence. For this reason, the researcher frequently used the term “automatic
pilot” to describe instructional assistants’ unconscious use of prompts and varying
support strategies that may hinder student independent academic engagement.
Participants were reminded that to be an effective instructional assistant, they must be
constantly aware of and thinking about how and when they provide support. The
researcher then stated the five goals of the training. These goals were: 1) increase
student independence, 2) prompt less, 3) prompt to the teacher, natural cue, or other
students first, 4) give specific praise, and 5) don't hover. These goals were reiterated
at the beginning of each of the trainings and referred to continually throughout the
training.
Following the introduction, the researcher systematically introduced each of
the instructional support behaviors determined to be necessary for effective
instructional assistance. The behaviors were taught in a natural progression: 1)
prompting and prompting procedures, 2) reinforcement, and 3) proximity. For each of
the one-hour trainings, one instructional support behavior was introduced and taught
using a systematic format.
Each of the behaviors was introduced and discussed, with attention to the
following factors: (1) how each instructional support behavior related to academic
student behaviors, (2) the benefits and possible drawbacks of its use in the classroom,
and (3) how the specific strategy could be implemented in the classroom. Following
this description, the researcher provided examples of the behavior to illustrate correct
and incorrect usage.
To increase the likelihood that these newly learned skills would transfer to the
classroom, instructional assistants were asked to participate in an activity following
the description of each instructional support behavior using information from their
current placement. Instructional assistants worked with one another to devise a
response to each activity. After completing the activity, instructional assistants
practiced in pairs. Each pair was given five minutes to respond to the activity and an
additional five minutes to practice.
To illustrate this procedure, the sequence used to teach prompting is
described here. Each type of prompt (verbal, gesture, model, visual
and physical) was defined and described in relation to academic activities. Following
these descriptions, a brief video clip showed the participants what each specific
prompt looked like in the classroom environment. Next, a short description of the
benefits and potential drawbacks of each prompt was given. For example, although
verbal prompts are known to be the least intrusive, in some instances, such as chained
tasks, verbal prompts can be one of the most intrusive prompts, since they interrupt
the chain and pull the student out of the sequence to attend to the instructional
assistant. The researcher modeled how to use each prompt and demonstrated
examples and non-examples of the prompt.
This procedure was also used to describe three prompting procedures and the
use of wait time within the following procedures: 1) System of Least Prompts, also
known as Least to Most Assistance, 2) Most to Least Assistance, and 3) Graduated
Guidance. As an activity, participants were asked to choose a lesson that their student
currently struggled with. With this lesson in mind, participants were asked to
determine how they could use prompts or a prompting procedure to facilitate student
independence. After five minutes, participants were then asked to share their answers.
Participants then were asked to practice how they would implement this procedure in
the class with their partner. During this time, the researcher walked around the room
to answer any questions and correct any misuse of the procedure. Also during this
time, participants asked questions about a specific student or specific activity with
which they were having difficulty. The researcher attempted to keep answers to
questions brief and restate any particular points of the training that would help to
answer the question.
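As a rough illustration, the least-to-most escalation logic described above can be sketched in code. Everything here is hypothetical: the hierarchy ordering, the wait-time comment, and the `student_responds` callback are stand-ins for classroom observation, a sketch of the decision rule rather than part of the training materials.

```python
# Illustrative sketch only: a least-to-most (System of Least Prompts)
# decision rule. PROMPT_HIERARCHY and student_responds are hypothetical
# stand-ins for what an instructional assistant observes in the classroom.

PROMPT_HIERARCHY = ["verbal", "gesture", "model", "visual", "physical"]

def least_to_most(student_responds, hierarchy=PROMPT_HIERARCHY):
    """Return the least intrusive level of support the student needed.

    student_responds(prompt) stands in for observing the student; it
    returns True once the student performs the step. The first call
    passes None: an opportunity to respond independently.
    """
    if student_responds(None):
        return "independent"
    for prompt in hierarchy:
        # In practice the assistant would pause here (wait time) before
        # judging whether the prompt produced a response.
        if student_responds(prompt):
            return prompt
    return "no response"
```

For example, a student who responds only once a model prompt is delivered would yield `"model"`, signaling that the two less intrusive prompt types were attempted first.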
Following the initial and final training, each instructional assistant received a
thirty-minute coaching session given by the primary researcher and/or a fellow
graduate research assistant. Coaching sessions were arranged with the general
education teacher and instructional assistant to ensure that the same activity would
occur, in order to avoid observing on any day that was not a typical classroom day
(e.g., field trip, minimum day, end-of-the-year celebrations, specialist day, therapy
session, etc.). The first coaching session was scheduled immediately after the first
training. All participants received coaching within one week of the initial training.
The second and final coaching session proved to be difficult to schedule, due to state
testing and closed campuses for the participating schools. For this reason, the final
coaching sessions were scheduled one to two weeks after the last training with an
average lag time of a week-and-a-half following training. Coaching sessions were
again coordinated with the teacher and instructional assistant.
During each coaching session, a form was used to ensure fidelity of
implementation (see Appendix G). The form included these items: (1) an initial
positive statement, (2) performance-based positive feedback and descriptive praise,
(3) performance-based corrective feedback and suggestions regarding specific
behaviors in need of improvement, (4) a concluding positive statement, (5) answers to
any questions (Leblanc et al., 2005), and (6) the five goals of the training printed at the
bottom. The observer entered the classroom during the same instructional time as
previously observed and watched the instructional assistant for ten to twenty minutes,
depending on what was occurring in the class at that time. For the first coaching
session, which focused on prompting, the observer recorded an initial positive
statement and pertinent performance-based positive feedback on prompting with
regard to the following occurrences: 1) the types of prompts used, 2) if prompts were
given to the teacher or natural cue, 3) if a prompting procedure was used, 4) if
appropriate wait time was given after or between prompts, 5) if prompts were given
and time was allowed for an independent response, 6) if the student was allowed to
work independently and 7) any notes or other observations relating to prompting. The
instructional assistant also received performance-based, corrective feedback on any
aspect of prompting that needed improvement. For instance, if the instructional
assistant continually repeated the same prompt, or did not use wait time in between
prompts, the observer noted this in the corrective feedback portion of the form.
Instructional assistants were also given suggestions regarding a specific scenario that
could be improved, relating to prompting and student academic engagement. The
observer wrote a concluding positive statement and asked the instructional assistant to
step outside of the class to review the form for 10-15 minutes. The observer reviewed
both the performance-based positive and corrective feedback with the instructional
assistant. Any notes were discussed, followed by a short discussion regarding a
specific scenario observed and prompting strategies to use to facilitate student
independence. The instructional assistant was asked if he or she had any questions
and was given a copy of the form to refer to.
The two subsequent trainings followed the same procedures as the initial
training, with the addition of a review of each of the instructional support behaviors
previously taught. A 10-15 minute review allowed participants to ask questions
concerning the implementation of the instructional support behaviors and clarify any
misinterpretations or vagueness. Prior to beginning any new instruction, the
participants were reminded of their five goals for the training.
The second coaching session, which followed the third and final training, was
similar to the first, with the addition of both reinforcement and proximity in the
performance-based positive and corrective feedback sections of the form.
Specifically, the observer recorded what type of verbal reinforcement was given
(descriptive or general) and the frequency of the reinforcement. In terms of proximity,
the observer recorded where the majority of the instructional assistant’s time was
spent and if the instructional assistant made an attempt to change proximity. If
proximity was adjusted during the observation, the observer noted in what way
proximity had changed. The discussion format was the same as the previous coaching
session, with the addition of an average of five more minutes required to discuss and
provide feedback about all three of the instructional support behaviors learned in the
training.
At the conclusion of the three trainings and second coaching session, all
participants in either the treatment or control group were asked to respond to the same
twelve vignettes from pretest. The vignettes were delivered to the school and either
put in the instructional assistants' boxes or handed to the office manager to be handed
out to the instructional assistants. Instructional assistants were asked to respond to the
packet of vignettes within two days to again be entered into a drawing for a $50 gift
certificate. To ensure social validity of treatment procedures, instructional assistants
in the treatment condition were also given a twelve-item questionnaire to rate the
training and its components along with the vignettes.
Post-test data collection. For the participants in the treatment group, posttest
data collection began within one to two weeks following the final coaching session,
with an average of one-and-a-half weeks. For the participants in District C, posttest
data were collected two weeks after those two participants' coaching sessions due to
scheduling conflicts and end-of-the-year field trips. Posttest data for participants in the
control group were also collected during the same two weeks as the treatment
participants. Posttest data collection observations were set up with the general
education teacher and the instructional assistant. As the end of the school year
approached, classroom schedules varied from day to day. This made scheduling and
data collection increasingly difficult, due to changing schedules and end-of-the-year
activities.
Follow-up. Following the seven- or eight-week period, follow-up probes were
collected during the subsequent one to two weeks for participants in the treatment
group. Because the end of the school year was approaching, the researcher was not
able to obtain follow-up data on control participants. It was also for this reason that a longer
maintenance data collection period was not permitted for treatment participants.
These follow-up observations were used to assess the maintenance of all observable
instructional assistant and student behaviors following the withdrawal of the
treatment. Similar to baseline, no instructions or coaching were given during this
time. The primary researcher and research assistants conducted observations in the
same manner as baseline observations.
Interviews. In order to understand instructional assistant knowledge of
effective practices as it relates to observable behaviors, twenty-minute interviews
were conducted with a small sample of instructional assistants from the treatment
condition. Initially, participants were to be selected based on responsiveness versus
non-responsiveness on the knowledge measure. Since treatment participants
demonstrated similar gains, ranging from an increase of one correct vignette to three
correct vignettes, a random selection of five instructional assistants was asked to
participate in an interview. The interview took place in a quiet area of the classroom
or just outside of the classroom to ensure privacy. This semi-structured interview
asked instructional assistants to respond to eight open-ended questions in order to
understand the instructional assistant’s decision-making process and behaviors. This
interview also provided information regarding any discrepancy between instructional
assistant behaviors and instructional assistant knowledge of effective practices. The
interview was used as a way of determining what other variables might have hindered
the implementation of effective practices in the classroom.
Inter-observer Agreement
The researcher conducted inter-observer agreement checks both across all
observations and separately for each of the seven observable behaviors, yielding
overall agreement as well as agreement by behavior. During these checks, two
observers independently and simultaneously
collected data on each of the behaviors of interest. Following data collection, the data
were compared to determine to what extent the two data collectors agreed. For the
total inter-observer agreement, an agreement was scored when both observers
recorded either the presence or absence of all of the same observational behaviors
during the ten-second interval. To determine inter-observer agreement for the
individual behaviors, an agreement was scored when both observers indicated the
presence or absence of a specific observable behavior during the ten-second interval.
Inter-observer agreement was collected for 37 percent of the observations.
The total overall inter-observer agreement (agreements divided by agreements plus
disagreements, multiplied by 100) averaged 93 percent (range: 85%-100%). For each
of the observed behaviors, inter-observer agreement was calculated in the same way
as overall inter-observer agreement. The agreement for each of the behaviors was as
follows: 1) independent academic engagement averaged 98 percent, 2) challenging
behavior averaged 99 percent, 3) prompts to the teacher or natural cue averaged 99
percent, 4) prompts to the task averaged 97 percent, 5) general reinforcement
averaged 99 percent, 6) descriptive praise averaged 99 percent, and 7) proximity
averaged 99 percent.
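The agreement formula used above can be sketched as a short function. This is an illustrative sketch, assuming each observer's record is a list of per-interval sets of behavior codes; the function name and data layout are hypothetical, not taken from the study's coding software.

```python
# Illustrative sketch: total interval-by-interval agreement, computed as
# agreements / (agreements + disagreements) * 100. Each observer's record
# is assumed to be a list of per-interval sets of behavior codes.

def interval_agreement(observer_a, observer_b):
    """Percent agreement across paired ten-second intervals."""
    if len(observer_a) != len(observer_b):
        raise ValueError("observers must code the same number of intervals")
    # An agreement is scored only when both observers recorded exactly
    # the same set of behaviors (presence and absence) for the interval.
    agreements = sum(a == b for a, b in zip(observer_a, observer_b))
    disagreements = len(observer_a) - agreements
    return 100 * agreements / (agreements + disagreements)
```

Two observers who agreed on 56 of a session's 60 intervals would score roughly 93 percent, in line with the averages reported above.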
CHAPTER IV
RESULTS
This study addressed the following questions: (1) Will instructional assistant
training be generalized into the natural environment? (2) Which specific instructional
assistant and student observable behaviors are more likely to change as a result of
training? (3) Which instructional assistant characteristics are most susceptible to
training effects?
The following analyses will address these three questions. First, an analysis of
the intervention effects of the total observable behaviors and total vignette measure
will be conducted. Next, an analysis of the specific instructional assistant and student
observable behaviors will be conducted to understand which behaviors were impacted
the most as a result of the training. A separate analysis of the specific vignettes more
likely to change as a result of training will also be conducted. Lastly, a post hoc
analysis will be performed to determine which characteristics contributed the most to
instructional assistant susceptibility to training effects.
Participant Demographic Information
Table 2 lists the demographic information for the primary participants.
Table 2

Instructional Assistant Demographics

Characteristic                                            n     %
Group
    Treatment                                             16    52
    Control                                               15    48
Gender
    Female                                                25    81
    Male                                                  6     19
Age
    20-29                                                 7     23
    30-39                                                 9     29
    40-49                                                 9     29
    50-59                                                 5     16
    60-69                                                 1     3
Ethnicity
    White                                                 20    65
    Hispanic                                              10    32
    Other                                                 1     3
Highest level of education completed
    High School                                           5     16
    Some Undergraduate                                    11    36
    Associates Degree                                     3     10
    Undergraduate Degree (BA or BS)                       6     19
    Some Graduate School                                  1     3
    Graduate School                                       5     16
Years employed as an inclusion instructional assistant
    Less than 6 months                                    2     7
    1-3 years                                             15    48
    4-6 years                                             6     19
    7-9 years                                             4     13
    10-12 years                                           3     10
    13-15 years                                           1     3
Years supporting current student in inclusive setting
    Less than a month                                     2     7
    Less than 6 months                                    6     19
    Less than a year                                      13    42
    1-3 years                                             7     23
    4-6 years                                             3     10
The instructional assistant participants were primarily female, with a wide
distribution of ages ranging from 20 to 65 years old. Interestingly, 15 of the 31
participants had obtained an associate's degree or higher. Although a majority of the
instructional assistants had been employed as an inclusion instructional assistant for
one to three years, over half of the participants had spent less than a year supporting
their current student. Table 3 lists the demographic information for the secondary
participants.
Table 3

Student Participant Demographics

Characteristic                    n     %
Gender
    Female                        6     19
    Male                          25    81
Grade
    K                             6     19
    First                         5     16
    Second                        7     23
    Third                         6     19
    Fourth                        2     7
    Fifth                         4     13
    Sixth                         1     3
Ethnicity
    White                         16    52
    Hispanic                      11    36
    Other                         4     12
Disability
    Autism                        19    61
    Other Health Impairment       4     13
    Emotional Disturbance         2     7
    Specific Learning Disability  2     7
    Visually Impaired             1     3
    Speech and Language           1     3
    Traumatic Brain Injury        1     3
    Orthopedic Impairment         1     3
Students were predominantly White or Hispanic. The majority of the students
were male. The population consisted mostly of students in kindergarten through third
grade, with a small number of students in fourth through sixth grade. Of the
participating students, a large number were diagnosed with Autism (61% of the total
sample), with the remaining students diagnosed with a range of disabilities.
Correlation of Total Observable Behaviors at Pretest
In order to assess the degree of the relationship between the total observed
behaviors at pretest, a Pearson Product-Moment Correlation Coefficient was
calculated for all participants. Each correlation represents the relationship between
two directly observed behaviors during a ten-minute observation broken down into
sixty ten-second intervals. Some of these behaviors, by definition, could not be
observed within the same interval, which suggests that they are incompatible
behaviors. Therefore, negative correlations were presumed to represent this
incompatibility. For example, intervals in which prompting was observed could not
have been coded in the same interval as student independent academic engagement,
since prompting and student independent engagement are incompatible behaviors.
Table 4 displays the significant correlations for instructional support
behaviors at pretest.
Table 4

Correlations of Total Observable Behaviors at Pretest for All Participants

Variable                                1       2     3     4       5      6     7     8
1. Academic engagement                  --
2. Challenging behavior                 -       --
3. Prompts to teacher or natural cue    -       -     --
4. Prompts to task                      -.73**  -     -     --
5. Non-descriptive praise               -       -     -     .49**   --
6. Descriptive praise                   -       -     -     -       .74**  --
7. Proximity                            .73**   -     -     -.65**  -      -     --
8. Blank intervals                      -       -     -     -.50**  -      -     -     --

Note. ** p < .01
Of the two student behaviors, five instructional support behaviors, and the blank
interval, there were three statistically significant negative correlations and three
statistically significant positive correlations. There was a significant
negative correlation between student independent academic engagement and
instructional assistant prompts to the task [r = -.730, n = 31, p < .01]. The second
negative correlation existed between instructional assistant prompts to the task and
instructional assistant proximity to the student [r = -.652, n = 31, p < .01]. The final
negative correlation was between instructional assistant prompts to task and blank
intervals in which no behaviors were recorded [r = -.502, n = 31, p < .01]. These
negative correlations suggest the incompatibility of these behaviors, since the
occurrence of one behavior negates the occurrence of the other behavior. There were
also three statistically significant positive correlations. The first was between
academic engagement and instructional assistant proximity [r = .730, n = 31, p < .01]
and the second was between instructional assistant prompts to the task and general
(non-descriptive) reinforcement [r = .489, n = 31, p < .01]. The final correlation
existed between the use of descriptive praise and non-descriptive praise
[r = .740, n = 31, p < .01].
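For reference, the Pearson product-moment coefficients reported throughout this section can be computed directly from two equal-length lists of per-participant interval counts. The following is a minimal pure-Python sketch of the standard formula, not the study's analysis code:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length
    lists, e.g. per-participant counts of coded intervals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Covariance term and the two standard-deviation terms
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

With this formula, counts of two incompatible behaviors (one can rise only when the other falls) drive r toward -1, which is why the prompting and independent-engagement measures correlate negatively.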
Correlation of Total Observable Behaviors at Posttest by Group
Since this was an experimental intervention, the researcher anticipated that
there would be a relationship between pretest and posttest, which would differ by
group; therefore, posttest correlation measures were analyzed by group. Table 5
illustrates the significant correlations for the treatment group at posttest.
Table 5

Correlations of Total Observable Behaviors for Treatment Participants at Posttest

Variable                                1       2     3     4     5      6     7     8
1. Academic engagement                  --
2. Challenging behavior                 -       --
3. Prompts to teacher or natural cue    -       -     --
4. Prompts to task                      -.76**  -     -     --
5. Non-descriptive praise               -       -     -     -     --
6. Descriptive praise                   -       -     -     -     -      --
7. Proximity                            -       -     -     -     -.60*  -     --
8. Blank intervals                      -.60**  -     -     -     -      -     -     --

Note. ** p < .01, * p < .05
There were three negative correlations for the total observable behaviors at
posttest for the treatment group. There was a statistically significant negative
correlation between student academic engagement and instructional assistant prompts
to the task [r = -.759, n = 16, p < .01]. A reliable negative correlation also existed
between non-descriptive reinforcement and instructional assistant proximity
[r = -.603, n = 16, p < .05]. The final negative correlation for the treatment group was
between blank intervals and student independent engagement
[r = -.603, n = 16, p < .01]. As previously noted, the negative correlations represent
incompatible behaviors that could not occur within the same interval.
Correlations of total observable behaviors at posttest for the control
participants are shown in Table 6.
Table 6

Correlations of Total Observable Behaviors for Control Participants at Posttest

Variable                                1       2     3     4       5     6      7     8
1. Academic engagement                  --
2. Challenging behavior                 -       --
3. Prompts to teacher or natural cue    -       -     --
4. Prompts to task                      -.69**  -     -     --
5. Non-descriptive praise               -       -     -     -       --
6. Descriptive praise                   -       -     -     -       -     --
7. Proximity                            .52*    -     -     -.82**  -     -      --
8. Blank intervals                      -.55*   -     -     -       -     .71*   -     --

Note. ** p < .01, * p < .05
For the control group at posttest, there were three statistically significant
negative correlations and two positive correlations between the total observed
behaviors. The first and strongest statistically significant negative correlation was
between instructional assistant prompts to task and instructional assistant proximity
[r = -.823, n = 15, p < .01]. The second negative correlation existed between
instructional assistant prompts to a task and student independent academic
engagement [r = -.687, n = 15, p < .01]. A statistically significant moderate negative
correlation was also found between blank intervals where none of the targeted
behaviors were occurring and student academic engagement
[r = -.547, n = 15, p < .05]. A reliable, moderate positive correlation existed between
student independent academic engagement and instructional assistant proximity
[r = .521, n = 15, p < .05].
Correlations between Total Observed Behaviors and Vignette Scores at Pretest
The correlations between observed behaviors and vignette scores at pretest
were analyzed in two ways. First, correlations between a calculated total cognitive
measure and the total observed behavior measure were analyzed. This was done in
order to make results of these measures more easily interpretable. A total omnibus
vignette score was calculated for each participant as a sum across the twelve
vignettes. To calculate this score, the participants
were given a categorical variable of either correct or incorrect for each of the twelve
vignette responses. As previously stated in the methods, although the instructional
assistants were asked to score the vignettes on a Likert scale of one to six, the purpose
of this measure was to determine if the participant could identify an effective or ineffective instructional assistant. Therefore, this omnibus score indicated the number
of correct responses to the vignettes. The researcher then calculated a mean for the
entire sample on the total vignette score.
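The omnibus scoring rule might be sketched as follows. Note that the cutoff used here (Likert ratings of four or higher treated as endorsing the depicted instructional assistant) is an assumption made for illustration; the dissertation states only that responses were recoded as correct or incorrect.

```python
# Hedged sketch of the omnibus vignette score. The source states that each
# Likert response was recoded as correct/incorrect; the endorsement cutoff
# below is an illustrative assumption, not the study's stated rule.

def omnibus_score(ratings, vignette_is_effective, endorse_cutoff=4):
    """Number of correct responses across the twelve vignettes.

    ratings: list of twelve Likert ratings (1-6).
    vignette_is_effective: list of twelve booleans, True when the
    vignette depicts an effective instructional assistant.
    """
    correct = 0
    for rating, effective in zip(ratings, vignette_is_effective):
        endorsed = rating >= endorse_cutoff  # participant rated the IA favorably
        if endorsed == effective:            # judgment matches the vignette's design
            correct += 1
    return correct
```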
A Pearson product-moment correlation was used to understand the relationship
between the total cognitive and total behavior measures, and revealed a weak positive
correlation between the total cognitive measure and the total behavioral measure at
pretest [r = .392, n = 31, p = .029].
Next, a Pearson product-moment correlation was used to investigate the significant
correlations between individually observed behaviors and responses to each of the
twelve vignettes for all participants at pretest. Table 7 highlights the significant
correlations.
Table 7

Correlations of Pretest Cognitive and Behavioral Measures for All Participants

Vignette                   AE      CB    P:T/NC  P:Task  Sr      SrDes  Prox    Blank
#10: Descriptive praise    -       -     -       -       -       -      -.61**  -
#11: Prompting             -       -     -       -       -       -      -       -
#3: Wait time              -.36*   -     -       -       -       -      -       -.40*
#6: Descriptive praise     -.38*   -     -       -.41*   -       .36*   -       -
#8: Wait time              .41*    -     -       -       -.45*   -      -       -

Note. AE = Student independent academic engagement; CB = Student challenging
behavior; P:T/NC = Instructional assistant prompt to teacher or natural cue;
Sr = Non-descriptive praise; SrDes = Descriptive praise; Prox = Proximity;
Blank = Blank interval.
** p < .01, * p < .05
At pretest, there were significant correlations between five of the vignettes
and six of the observed behaviors. Most notably, there was a moderate negative
correlation between vignette ten and instructional assistant proximity. Weak
correlations between five of the vignettes and six of the observable behaviors were
significant at the .05 level.
Correlation of Observed Behaviors and Vignette Scores at Posttest by Group
A Pearson Product-Moment Correlation Coefficient was used to determine the
relationship between the observed behaviors and responses to posttest vignettes
following the intervention. Since this was an experimental intervention study,
researchers expected that pretest and posttest scores would differ by group. Therefore,
the following tables present the posttest findings by treatment and control group.
Table 8 shows the significant correlations for the treatment group.
Table 8

Correlations between Posttest Vignettes and Observed Behaviors for Treatment
Participants

Vignette          AE    CB    P:T/NC  P:Task  Sr    SrDes  Prox  Blank
#1: Proximity     -     -     -.52*   -       -     -      -     -
#8: Wait time     -     -     -       -       -     -.51*  -     -
#11: Prompting    -     .51*  -       -       -     -      -     -
#12: Proximity    -     -     -       -       .51*  -      -     -

Note. AE = Student independent academic engagement; CB = Student challenging
behavior; P:T/NC = Instructional assistant prompt to teacher or natural cue;
Sr = Non-descriptive praise; SrDes = Descriptive praise; Prox = Proximity;
Blank = Blank interval.
** p < .01, * p < .05
For participants in the treatment group, there were significant correlations
between four of the vignettes and four of the observed behaviors. There was a
significant negative correlation between vignette one, ineffective proximity, and
instructional assistant prompts to the teacher or natural cue. There was a second
negative correlation between vignette eight, effective wait time, and the use of
descriptive praise. A significant positive correlation was found between vignette
eleven, ineffective prompting, and student challenging behavior. The second positive
correlation was between vignette twelve, ineffective proximity, and the use of
non-descriptive praise.
Table 9

Correlations between Posttest Vignettes and Observed Behaviors for Control
Participants

Vignette                   AE    CB      P:T/NC  P:Task  Sr      SrDes  Prox  Blank
#3: Wait time              -     -.72**  -       -       -       -      -     -
#6: Descriptive praise     -     -       -       -       .56*    -      -     -
#7: Prompting              -     -.54*   -       -       .59*    -      -     -
#10: Descriptive praise    -     -       -       -       .75**   -      -     -
#11: Prompting             -     -.67**  -       -       -.64**  -      -     -

Note. AE = Student independent academic engagement; CB = Student challenging
behavior; P:T/NC = Instructional assistant prompt to teacher or natural cue;
Sr = Non-descriptive praise; SrDes = Descriptive praise; Prox = Proximity;
Blank = Blank interval.
** p < .01, * p < .05
For participants in the control group, there were several significant
correlations between five of the vignettes and two of the observed behaviors. Most
notably, a significant negative correlation was found between vignette three,
ineffective wait time, and student challenging behavior. There was also a statistically
significant positive correlation between vignette ten, ineffective descriptive praise,
and non-descriptive reinforcement. Other moderate correlations also existed between
the five vignettes and two observable behaviors (see Table 9).
Analysis of Vignette Scores at Pretest
Prior to analyzing the cognitive measure at posttest, two analyses were
conducted to ensure that there were no differences between groups on the number of
total and individual correct vignettes at pretest. First, a one-way ANOVA was
computed to determine if there was a statistically significant difference between
groups on the total cognitive measure. Although the mean number of correct vignettes
was slightly larger for the treatment group versus the control group (Treatment, 8.13;
Control, 7.87), a one-way ANOVA showed that, at pretest, there were no significant
differences [F(1, 29) = .154, p = .698] between participants in the control and
treatment conditions. A second analysis was conducted to determine if there were
any significant differences between individual vignettes at pretest by group. A
one-way ANOVA confirmed there were no statistically significant differences between
groups on each of the twelve vignettes.
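The one-way ANOVA used for these pretest comparisons reduces to the ratio of between-group to within-group mean squares. The following is a minimal pure-Python sketch of the F statistic; the data used to exercise it would be illustrative, not the study's raw scores.

```python
# Illustrative sketch of a one-way ANOVA F statistic; not the study's
# analysis code (the dissertation's analyses were presumably run in a
# standard statistics package).

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across the given groups."""
    all_scores = [s for g in groups for s in g]
    grand_mean = sum(all_scores) / len(all_scores)
    # Between-groups sum of squares and degrees of freedom
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    df_between = len(groups) - 1
    # Within-groups sum of squares and degrees of freedom
    ss_within = sum((s - sum(g) / len(g)) ** 2 for g in groups for s in g)
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)
```

With two groups, as here, this F is equivalent to the square of an independent-samples t statistic on the same data.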
Training Effect on Vignette Scores
The first hypothesis stated that instructional assistants who received the
intervention would outperform participants in the control condition on both a total
cognitive and behavior measure. The first analysis will examine the cognitive
measure followed by the second analysis, which will evaluate the behavior measure.
At posttest, to test for main effects of time and group and the interaction effect
for the total cognitive measure, a Repeated Measures ANOVA was performed (See
Tables 10 and 11).
Table 10

Descriptive Statistics for Pretest and Posttest Correct Vignettes by Group

                       Vignette Pretest         Vignette Posttest
Group         N        M        SD        N     M        SD
Treatment     16       8.13     1.96      16    9.31     1.78
Control       15       7.87     1.68      15    7.27     1.67

Table 11

Repeated Measures Analysis of Variance of Total Cognitive Measure

Source               df     MS        F        Sig       η2
Time effect          1      1.336     .901     .350      .030
Group effect         1      20.55     4.232    .049*     .127
Interaction effect   1      12.368    8.338    .007**    .223

Note. ** p < .01, * p < .05
Figure 1
Pretest and Posttest Total Correct Vignettes by Group
Of the main effects for time and group, only the main effect for group was statistically significant, F (1, 29) = 4.232, p = .049. The interaction effect between time and group was also statistically significant, F (1, 29) = 8.338, p = .007, suggesting that participants in the treatment group scored higher on the posttest vignettes than participants in the control group (See Table 11).
The estimated effect size for each of the main effects and interaction effect
was calculated using a Partial Eta Squared. Three percent of the total variance for the
cognitive measure was explained by the main effect for time, while 12.7 percent of
the total variance was explained by the main effect for group. The interaction effect
of time and group accounted for 22.3 percent of the total variance.
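Partial eta squared can be recovered directly from a reported F statistic and its degrees of freedom. The sketch below uses the values reported in Table 11 (df_effect = 1, df_error = 29) and reproduces the effect sizes cited above.

```python
# Partial eta squared from F and degrees of freedom:
# eta_p^2 = (F * df_effect) / (F * df_effect + df_error).
# F values below come from Table 11; df_effect = 1, df_error = 29.

def partial_eta_squared(f_stat, df_effect, df_error):
    return (f_stat * df_effect) / (f_stat * df_effect + df_error)

for label, f in [("time", .901), ("group", 4.232), ("interaction", 8.338)]:
    print(label, round(partial_eta_squared(f, 1, 29), 3))
# time .03, group .127, interaction .223 -- matching Table 11
```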
Analysis of Observed Instructional Support Behaviors at Pretest
Prior to analyzing the behavioral measure at posttest, two analyses were
computed to ensure there were no differences between groups on the total behavioral
measure at pretest.
First, a one-way ANOVA was computed to determine if a significant
difference between groups existed on the total behavioral measure. Although the mean was slightly lower for the treatment group than for the control group (Treatment, -38.13; Control, -25.93), reflecting more ineffective behaviors, a one-way ANOVA showed, when collapsing all of the instructional support behaviors into a total score, that there were no statistically significant differences between the treatment and control groups at pretest [F (1,29) = .691, p = .412] (See Table 12).
A second analysis was conducted on each of the seven observable behaviors
to determine if statistically significant differences existed between the groups on any
of the seven directly observed behaviors prior to the intervention. The results of the
one-way ANOVA indicated there was a statistically significant difference at the
p<.05 level in student challenging behavior [F (1,29) = 5.87, p = .022]. Further
analysis showed students of the participants in the treatment group (M = 11.5, SD = 13.35, n = 16) had more incidents of challenging behavior at pretest than students of the participants in the control group (M = 2.87, SD = 3.54, n = 15). The remaining six
measures showed no significant difference between the groups at pretest.
Training Effect on Observed Instructional Support Behaviors
The initial hypothesis stated instructional assistants in the treatment group
would outperform those in the control group following training on both a cognitive
and behavioral measure. The following analysis will first examine the behavioral
measure at posttest.
A Repeated Measures ANOVA on the total observable behavior measure was
conducted to determine whether or not the intervention produced a statistically
significant difference between the treatment and control groups. At pretest, the range for this measure was -99 to 36 for the treatment participants and -59 to 83 for the control participants. At posttest, the range was 11 to 99 for treatment participants and -56 to 35 for control participants. A negative score indicates an abundance of ineffective behaviors, that is, many poor-quality interactions throughout the 10-minute observation. A positive score indicates a considerable number of effective behaviors and interactions between the instructional assistant and the student during the 10-minute observation.
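The omnibus scoring just described can be sketched as follows. Which behavior categories count as effective versus ineffective here is an assumption inferred from the seven categories named in this chapter, and the counts are hypothetical.

```python
# Sketch of the omnibus behavioral score: effective behavior counts
# are summed and ineffective counts subtracted, so a negative total
# signals mostly poor-quality interactions in one 10-minute
# observation. Category membership is an assumption, not the
# study's exact coding scheme.

EFFECTIVE = {"descriptive_praise", "prompt_to_teacher",
             "proximity_change", "academic_engagement"}
INEFFECTIVE = {"prompt_to_task", "general_praise", "challenging_behavior"}

def omnibus_score(counts):
    """counts: dict mapping behavior name -> frequency in one observation."""
    effective = sum(v for k, v in counts.items() if k in EFFECTIVE)
    ineffective = sum(v for k, v in counts.items() if k in INEFFECTIVE)
    return effective - ineffective

obs = {"descriptive_praise": 3, "proximity_change": 8,
       "prompt_to_task": 38, "general_praise": 5}
print(omnibus_score(obs))  # 11 effective - 43 ineffective = -32
```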
Tables 12 and 13 present the means and standard deviations for the total
behavioral measure by group as well as the repeated measures ANOVA results.
Table 12
Descriptive Statistics for Pretest and Posttest Total Observable Behaviors by Group

                Total Observable Behaviors        Total Observable Behaviors
                        Pretest                           Posttest
Group         N      M        SD      Range      N      M        SD      Range
Treatment    16   -38.13    40.88   -99 to 36   16    59.69    26.60   11 to 99
Control      15   -25.93    40.71   -59 to 83   15   -28.00    22.70   -56 to 35

Note. Negative numbers represent the number of undesirable behaviors
Table 13
Repeated Measures Analysis of Variance for Main Effects and Interaction Effects of Time and Group

Variable                  df        MS          F        Sig     η2
Main effect of time        1    35486.19     61.40*     .00     .68
Main effect of group       1    22063.05     12.97*     .00     .31
Interaction effect         1    38616.19     66.82*     .00     .70

Note. *p < .01
Figure 2
Pretest to Posttest Total Observable Behaviors by Group (panels: Pretest Total Observable Behaviors; Posttest Total Observable Behaviors)
The mean differences are a result of the adjusted omnibus score, which reflects the sum of the effective behaviors minus the sum of ineffective behaviors for each
participant. Mean differences at pretest show that treatment participants demonstrated
a mean of 38.13 ineffective behaviors, with a standard deviation of 40.88 during a
ten-minute observation. Similarly, the control group demonstrated a mean of 25.93
ineffective behaviors with a standard deviation of 40.71 within a ten-minute
observation. At posttest, treatment participants demonstrated a substantial and
significant increase in effective observable behaviors with a mean of 59.69 and a
standard deviation of 26.60 within the ten-minute observation. Control participants
had a slight increase in ineffective observable behaviors at posttest, with a mean of 28
and a standard deviation of 22.70 although this finding is not statistically significant
(See Table 12).
The ANOVA summary table (See Table 13) indicates that there were statistically significant main effects for both time [F (1, 29) = 61.40, p < .001] and group [F (1,29) = 12.97, p = .001] as well as an interaction effect between time and group [F (1,29) = 66.82, p < .001]. These results demonstrate the effectiveness of the
training to not only reduce the number of less effective instructional support
behaviors, but also increase the number of effective instructional support behaviors.
The estimated effect size for each of the main effects and interaction effect
was calculated using a Partial Eta Squared. The main effect for time explained 68
percent of the total variance for the behavioral measure, while the main effect for
group explained 31 percent of the total variance. The interaction effect of time and
group accounted for 70 percent of the total variance.
Short-term maintenance data results indicate that the participants in the
treatment group maintained a high rate of total observable behaviors from posttest (M = 59.69) to maintenance (M = 68.81). A Paired Samples T-test showed that
although the group mean of total observable behaviors was higher at maintenance,
there was no significant difference between posttest and maintenance means. After
independently analyzing each of the seven observable behaviors, there was only one
behavior for which a significant difference was found (See Table 14).
Table 14
Treatment Group Differences from Posttest to Maintenance for Total Behaviors Observed

                                        Posttest          Maintenance
Observable Behavior                    M        SD        M        SD         t
Academic Engagement                  40.13     7.39     44.19     6.60     -1.47
Challenging Behavior                  2.44     4.72      2.31     4.35       .10
Prompts to Teacher or Natural Cue     1.50     1.67       .25      .58      2.71*
Prompts to Task                      12.69     4.92      9.69     4.88      1.94
General Praise                        2.44     2.48      2.19     2.14       .43
Descriptive Praise                    3.44     3.37      2.63     2.87      1.01
Proximity                            32.19    16.67     35.94    16.70      -.81

Note. *p < .05
Instructional assistants in the treatment group significantly decreased their
prompting to the natural cue or environment from posttest to maintenance.
Instructional assistants also demonstrated a mean decrease in prompts to the task,
from 12.69 at posttest to 9.69 during maintenance, but the decrease was not
statistically significant. Similar, smaller gains in group means were also found for
student academic engagement and proximity adjustment (See Table 14).
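A paired-samples t statistic of the kind used for these posttest-to-maintenance comparisons can be computed as follows. The scores are hypothetical, not the study's data.

```python
# Minimal paired-samples t statistic, comparing two measurements of
# the same participants (e.g., posttest vs. maintenance).
# Scores below are hypothetical.
import math

def paired_t(post, maint):
    diffs = [p - m for p, m in zip(post, maint)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the difference scores (n - 1 denominator)
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

posttest    = [12, 15, 10, 14, 11, 13]
maintenance = [10, 12, 9, 13, 10, 11]
print(round(paired_t(posttest, maintenance), 2))  # 5.0
```

A nonsignificant t, as found for most behaviors in Table 14, indicates that posttest gains were maintained.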
Post Hoc Analysis of Individual Observable Behaviors
The next section will address the second hypothesis regarding which of the
seven observable behaviors were most sensitive to change as a result of the
intervention. In this study, a Pearson Chi-Square test was calculated for each of the
seven observable behaviors to determine which instructional assistant and student
observable behaviors were most sensitive to training effects. The researcher took
several steps to determine the number of instructional assistant participants who made
notable gain or reduction in specific behaviors. First, the researcher calculated the
gain or reduction scores for each of the seven observed behaviors for each participant.
Gain scores were calculated for academic engagement, prompts to the teacher or
natural cue, descriptive reinforcement and proximity, since these behaviors were
intended to increase as a result of the intervention. To calculate the gain score, each participant's pretest score was subtracted from his or her posttest score for each variable. Conversely, reduction scores were calculated for challenging behavior, prompts to task and general reinforcement, since these behaviors were intended to decrease as a result of the intervention. These change scores were computed the same way, pretest subtracted from posttest, so that a decrease in a behavior yielded a negative score. Next, the median gain or reduction
for the entire sample was calculated for each of the seven observable behaviors. Last, the number of participants in each group who gained or lost more than the median for each observable behavior was tallied, and a Chi-square test was computed on these counts. The following table presents the results of these calculations for
the behaviors targeted to increase.
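The gain-score and median-split steps just described can be sketched as follows, with hypothetical scores; a 2x2 chi-square on the resulting group-by-outcome counts would complete the procedure.

```python
# Sketch of the gain-score procedure: compute posttest minus pretest
# for each participant, find the whole-sample median gain, then tally
# how many participants in each group exceeded it. Data are
# hypothetical placeholders.
import statistics

def gain_scores(pre, post):
    return [b - a for a, b in zip(pre, post)]

def exceeds_median(gains, median):
    return sum(1 for g in gains if g > median)

pre_t, post_t = [7, 8, 6, 10], [40, 35, 42, 38]      # treatment
pre_c, post_c = [12, 11, 13, 12], [13, 12, 14, 11]   # control
all_gains = gain_scores(pre_t + pre_c, post_t + post_c)
med = statistics.median(all_gains)
print(med,
      exceeds_median(gain_scores(pre_t, post_t), med),
      exceeds_median(gain_scores(pre_c, post_c), med))
# 14.0 4 0 -- all treatment, no control participants exceed the median
```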
Table 15
Means, Medians, Gain Scores and Effect Sizes for Observable Effective Behaviors

Group       Behavior                           Pretest   Posttest   Median   # significant   Effect
                                               Mean      Mean       Gain     Gain*           Size
Treatment   Academic Engagement                7.13      40.12      15       15              4.55
(n=16)      Prompt to teacher or natural cue   .75       1.5        0        9               .42
            Descriptive Praise                 .50       3.44       0        12              3.28
            Proximity > 2 ft.                  8.19      32.19      0        13              1.50
Control     Academic Engagement                12        13.07      15       0               .09
(n=15)      Prompt to teacher or natural cue   .27       .27        0        3               0
            Descriptive Praise                 .20       .33        0        4               .32
            Proximity > 2 ft.                  10.27     5.47       0        1               -.30

Note. *Significant gain was defined as a gain score greater than the median for that measure
As a result of the intervention, participants in the treatment group not only
outperformed those in the control group on both the total cognitive and behavioral
measures, but also made significant gains for the targeted individual behaviors. For
student academic engagement, the pretest mean shows that students were typically
academically engaged for an average of 7.13 out of 60 ten-second intervals.
Following the intervention, the mean number of intervals in which a student was
independently academically engaged increased to 40.12 out of 60. Fifteen out of the
sixteen, or 94 percent, of the treatment participants’ students demonstrated a
significant increase in academic engagement following the intervention by increasing
student academic engagement by sixteen or more intervals. To determine the strength
of the relationship between the variables, Cohen's d was calculated. According to
Cohen’s (1988) guidelines, an effect size of 2.0 would indicate that the mean of the
treatment group is at the 97.7 percentile of the control group, which can also be
described as 81.1 percent of nonoverlap between the two distributions. For this
particular behavior, the effect size of 4.55 would suggest a substantial difference
between the treatment and control group for this measure (See Table 15).
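Cohen's d and the percentile interpretation used above can be computed as follows. The group scores are hypothetical, but the U3 conversion reproduces the 97.7th-percentile figure cited for d = 2.0.

```python
# Cohen's d between two groups, plus the U3 "percentile of the
# comparison distribution" interpretation (Cohen, 1988), which
# requires the standard normal CDF. Group scores are hypothetical.
import math

def cohens_d(group1, group2):
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    # Pooled standard deviation
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def u3(d):
    """Percentile of the comparison distribution at the focal group mean."""
    return 0.5 * (1 + math.erf(d / math.sqrt(2)))  # standard normal CDF

print(round(u3(2.0), 3))  # 0.977 -> the 97.7th percentile cited above
```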
A majority of the instructional assistants also demonstrated significant gain
for both adjustment of proximity (n = 13) and the use of descriptive praise (n = 12).
The mean difference from pretest (M = 8.19) to posttest (M = 32.19) for proximity is
sizable as is the effect size (d = 1.50). An effect size of 1.50 indicates that the mean of
the treatment group is at the 93.3 percentile of the control group. There is a
nonoverlap of 70.7 percent in the two distributions (Cohen, 1988). Although the mean
differences from pretest (M = .50) to posttest (M = 3.44) for descriptive praise was
not as large as proximity and academic engagement, twelve participants made
significant gain, with a large effect size of 3.28. Nine of the participants demonstrated
a significant gain for the final gain behavior, prompts to the teacher or the natural cue.
The pretest (M = .75) to posttest (M = 1.5) mean gain was small, with a medium
effect size of .42. This suggests that the mean of the treatment group is in the 66th
percentile of the control group (Cohen, 1988) (See Table 15).
None of the control group participants made significant gain in academic
engagement as defined by the median score. Pretest (M = 12) to posttest (M = 13.07)
mean differences and a .09 effect size suggest that the posttest mean falls near the 54th percentile of the pretest distribution. Although one participant
significantly increased proximity from pretest to posttest, mean differences suggest
participants in this group had fewer instances of proximity change from pretest
(M = 10.27) to posttest (M = 5.47). Interestingly, four participants did significantly
increase their use of descriptive praise, although mean pretest (M = .20) to posttest
(M = .33) scores show little change, with only a moderate effect size of .32. This
indicates a 21.3 percent nonoverlap of the two distributions. Similarly, three participants
increased prompts to the teacher or the natural cue although the pretest (M = .27)
posttest (M = .27) group means showed no change (See Table 15). The following
table will present the findings for the reduction scores for the undesirable observable
behaviors.
Table 16
Means, Medians, Reduction Scores and Effect Sizes for Ineffective Observable Behaviors

Group       Behavior               Pretest   Posttest   Median      # significant   Effect
                                   Mean      Mean       Reduction   reduction*      Size
Treatment   Challenging Behavior   11.5      2.44       -1          9               -.68
(n=16)      Prompt to task         37.75     12.69      -13         13              -1.49
            General Praise         5.44      2.44       -1          6               -.41
Control     Challenging Behavior   2.87      3.40       -1          4               .15
(n=15)      Prompt to task         39.20     39.40      -13         2               .01
            General Praise         6.60      4.33       -1          9               -.48

Note. *Significant reduction was defined as a loss score greater than the median for that measure
As with the previous gain behaviors, many of the participants in the treatment
condition significantly reduced undesirable behaviors following the intervention.
Most notably, thirteen of the sixteen, or 81 percent, of the participants reduced their prompts to the task by fourteen or more prompts. Likewise, mean differences from
pretest (M = 37.75) to posttest (M = 12.69) for this behavior demonstrate a large
reduction in the number of prompts following the intervention. A large effect size of
-1.49 suggests that the mean of the treatment group is in the 93.3 percentile of the
control group (Cohen, 1988).
Nine of the participants significantly reduced challenging student behavior as
a result of the intervention. Mean pretest (M = 11.5) to posttest (M = 2.44) scores
indicate a noteworthy reduction in challenging behavior, which is of particular
interest since the mean student challenging behavior was significantly higher at
pretest for treatment participants than that of control participants. A reduction of
general praise was significant for six of the treatment participants. Mean pretest
(M = 5.44) to posttest (M = 2.44) differences indicate that the use of general praise
decreased overall although the reduction of general praise may have been the result of
an increase in descriptive praise (See Table 16).
For the control group, although the mean group differences showed nearly no
change from pretest (M = 39.20) to posttest (M = 39.40), two participants decreased
their prompts to task without intervention. Challenging behavior was significantly
reduced for four of the participants although mean pretest and posttest scores
demonstrate only a slight reduction, with a small effect size of .15. Similarly, nine
participants in the control group decreased their use of general praise. A moderate
effect size of -.48 suggests that the mean of the control group is within the 66th
percentile of the treatment group, which indicates a nonoverlap of 27.4 percent.
Analysis of Instructional Assistant Characteristics and Susceptibility to Training
Using a standard multiple regression, the next section addresses the third hypothesis regarding which instructional assistant characteristics were most susceptible to training effects. This was done to determine if any specific instructional assistant characteristics reliably predicted positive outcomes on both the cognitive and behavioral measures. A final standard multiple regression determined whether any particular instructional assistant characteristics predicted student academic gain. The researcher selected group, gender, age, previous education, years of experience as an instructional assistant, and years working in inclusion as the independent variables predicting the specific dependent variable analyzed in each of the three analyses.
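A simultaneous-entry (standard) multiple regression of the kind reported below can be sketched in pure Python via the normal equations: all predictors enter at once, so each coefficient reflects its unique contribution with the others held constant. Predictors and data here are hypothetical.

```python
# Standard (simultaneous-entry) multiple regression via ordinary
# least squares: solve b = (X'X)^-1 X'y with Gaussian elimination.
# Predictors and outcomes below are hypothetical placeholders.

def ols(X, y):
    """Return [intercept, b1, b2, ...] for predictor rows X and outcome y."""
    X = [[1.0] + list(row) for row in X]          # prepend intercept column
    k = len(X[0])
    # Normal equations: X'X b = X'y
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for row in range(col + 1, k):
            f = xtx[row][col] / xtx[col][col]
            xtx[row] = [a - f * b for a, b in zip(xtx[row], xtx[col])]
            xty[row] -= f * xty[col]
    # Back substitution
    beta = [0.0] * k
    for row in range(k - 1, -1, -1):
        s = sum(xtx[row][j] * beta[j] for j in range(row + 1, k))
        beta[row] = (xty[row] - s) / xtx[row][row]
    return beta

# Two hypothetical predictors: group (0/1) and years of experience
X = [(0, 2), (0, 5), (1, 3), (1, 7)]
y = [5, 11, 17, 25]
print([round(b, 2) for b in ols(X, y)])  # [1.0, 10.0, 2.0]
```

In practice the full model would include all six predictors (group, gender, age, previous education, years of experience, years in inclusion).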
Table 17
Summary of Standard Multiple Regression Analysis for Variables Predicting Posttest Total Behavioral Measure

Independent Variable    B         SEb      Beta     t        p
Group                   -82.41    10.45    -.82     -7.89    .00***
Gender                  -21.62    12.40    -.17     -1.74    .09
Age                     -.03      .51      -.01     -.06     .96
Previous education      -1.92     2.52     -.07     -.76     .45
Years experience        2.46      2.65     .93      .93      .36
Years in inclusion      -2.85     2.98     -.96     -.96     .35

Note. Adjusted R2 = .76, ***p < .001
Results of the multiple regression suggest that a single significant correlation existed between group and the posttest total behavioral measure [r = -.877, n = 31, p < .001]. Two weak, nonsignificant correlations were also found for the independent variables age [r = .381, n = 31, p = .956] and gender [r = -.307, n = 31, p = .094] (See Table 17). For the remaining variables, no significant correlations were found.
The results of the standard multiple regression demonstrated that 76 percent of
the variance in the total behavioral measure was explained by the predictors in this
model, which include group, gender, age, last degree completed, years experience as
an instructional assistant and years working as an inclusion instructional assistant.
When analyzing the contribution of each independent variable, group makes the
strongest and only significant contribution in explaining the total posttest behavioral
scores, when the variance explained by all the other variables was controlled for. The
remaining variables are not statistically significant, indicating that none of the other
variables made a unique contribution to the prediction of the posttest total behavioral
measure.
Table 18
Summary of Standard Multiple Regression Analysis for Variables Predicting Posttest Total Cognitive Measure

Independent Variable    B        SEb     Beta    t        p
Group                   -2.20    .65     -.56    -3.41    .00**
Gender                  1.13     .77     .23     1.47     .16
Age                     .05      .03     .29     1.70     .10
Previous education      .03      .16     .03     .20      .85
Years experience        -.16     .16     -.33    -.97     .34
Years in inclusion      .37      .18     .68     2.01     .06

Note. Adjusted R2 = .39, **p < .05
The multiple regression analysis, which was conducted to identify the predictive variables for the posttest total cognitive measure, showed that group [r = -.522, n = 31, p < .001] and age [r = .418, n = 31, p = .102] were only moderately correlated with the dependent variable. Although group was significant at the p < .001 level, age was not found to be statistically significant. No other noteworthy correlations were found.
The results of this analysis suggest that 39 percent of the variance in the
posttest total cognitive measure was explained by the independent variables included
in this model (See Table 18). When examining the contribution of each independent
variable in this model, the group variable made the strongest unique contribution
when the variance explained by the remaining variables was controlled for.
Table 19
Summary of Standard Multiple Regression Analysis for Variables Predicting Student Academic Gain

Independent Variable    B         SEb     Beta    t        p
Group                   -31.62    4.50    -.85    -7.03    .00**
Gender                  -4.84     5.34    -.10    -.91     .37
Age                     -.11      .22     -.06    -.51     .62
Previous education      -.05      1.08    .01     .05      .96
Years experience        -.01      1.14    -.00    -.01     .99
Years in inclusion      -.33      1.28    -.06    -.26     .80

Note. Adjusted R2 = .68, ***p < .001
A standard multiple regression was used to determine which, if any,
instructional assistant characteristics predicted student academic gain. Of the selected
independent variables, group was the only variable that showed a strong correlation with the dependent variable [r = -.853, n = 31, p < .001]. For the remaining variables,
no significant correlations were found.
The outcome of this analysis suggests that 68 percent of the variance in
student academic gain was explained by the independent variables included in this
model (See Table 19). When the other variables were controlled for, group was the
only variable that was a significant predictor of student academic gain. The remaining
variables did not uniquely contribute to the prediction of student academic gain and
were not statistically significant.
Social Validity
Participants in the treatment group reported favorably in the social validity
measure. In response to the eleven 5-point and six 3-point Likert type items, there
was a mean score of 62 out of 67. All sixteen participants agreed somewhat or
strongly agreed with the usefulness of the intervention as a whole, as well as specific
intervention components. Further analysis showed a mean score of 52 out of 55 for
the initial eleven questions, which indicated that instructional assistants were 1)
satisfied with the training, 2) felt more confident and better prepared to support their
student, 3) felt that the training was critical for supporting students with disabilities in
general education and 4) felt that the skills were both transferable to the classroom
and were useful with a variety of students.
Question twelve asked participants to rate each of the six components of the
training as highly valuable, moderately valuable or not valuable to determine which
components participants felt were most beneficial. A total mean score of ten out of twelve suggests that participants felt that each of the training components was moderately or highly valuable. In response to the open-ended questions, all of the
assistants made positive comments regarding their participation.
For example, one instructional assistant wrote:
I have been an IA for 20 years now and truly feel that if I had gone through
this training and others like it years ago, my students would have been more
independent learners. I too would have been a more effective teacher!
PLEASE continue this kind of training for ALL instructional assistants…for
the advancement of education and for our children!!!...
Another instructional assistant wrote, “…We seem to have the same training
over and over each year. It gets old and ineffective after 10 years. The training rarely
targets the issues as well.”
One of the instructional assistants who demonstrated a high rate of effective
support behaviors prior to the training wrote, “Although I have learned some of these
methods before, it was helpful to be challenged to use them and think more
purposefully as I am working. I have become more aware of when I am effective and
ineffective, and am more apt to correct myself.”
During post-intervention interviews, a small number of instructional assistants
indicated that being observed was initially uncomfortable although repeated
observation appeared to reduce or eliminate this discomfort. One instructional
assistant stated, “You get nervous when someone is watching you and don’t act how
you would normally act, but after awhile its ok.” Another instructional assistant also
indicated similar feelings by commenting, “Coaching does make you uncomfortable
but makes you really aware of what you are doing and how to be better and anyone
can benefit from someone coming in and observing.” Although instructional assistants initially felt uncomfortable, they also suggested that the coaching sessions continue once a month to ensure the ongoing correct implementation of the newly learned skills and to get them “back on track.”
CHAPTER V
DISCUSSION
The current study expanded on the growing need for a training model that
provides instructional assistants the skills necessary to provide effective academic
supports in inclusive settings. The results suggest that instructional assistants can
learn to a) increase student independent academic engagement through the use of
effective supports, b) decrease the use of less effective supports after participating in
three hours of in-service training and two short coaching sessions and c) demonstrate
a change in thinking and increased awareness of their own behavior and its
corresponding effects on the student.
Directly Observed Changes in Instructional Support Behavior and the Impact on
Students
Following the intervention, the data indicate a significant change in instructional assistants' observable behaviors. At baseline, the instructional assistants were observed implementing few, if any, effective support behaviors. Instead, the majority of the time was spent a) providing the bulk of instruction to the student, b) hovering over the student, c) providing an abundance of verbal and gestural prompts, d) using little to no wait time after an instruction or prompt and e) employing prompts in a way that interfered with student independent academic engagement. As indicated
in the literature (Conroy et al., 2004; Egilson & Traustadottir, 2009; Giangreco, Broer
et al., 2001a; Giangreco et al., 1997; Malmgren & Causton-Theoharis, 2006; Young
et al., 1997), these behaviors are common among untrained instructional assistants
and can have lasting detrimental effects on student academic and behavioral gains. As
indicated in the literature (Giangreco, 2009; Giangreco et al., 1997; Giangreco et al.,
2005), it is likely that most instructional assistants are unaware of these detrimental
effects, due to lack of training. Consistent with this literature, instructional assistants
in this study rarely received any pre-service or in-service training specifically
designed to teach research-based strategies and techniques to successfully facilitate
the inclusion of students with disabilities (Deardorff et al., 2007; Giangreco, Broer et
al., 2001b; Pickett & Gerlach, 2003; Pickett et al., 2003). For example, prior to
training, the researcher observed many instructional assistants repeat variations of the
same verbal prompt to evoke a response from students. The continual use of an
ineffective prompt reflects either an inability to execute an instructional plan around
classroom tasks, or an inability to notice or problem-solve the failure of repetitive
prompting.
Following the intervention, instructional assistants in the treatment group
showed improvements in effective support behaviors and fewer ineffective behaviors.
For example, instructional assistants decreased the use of prompts to the task, which
increased independent student academic engagement. Also, following training,
instructional assistants made appropriate proximity adjustments more frequently,
allowing students greater opportunity to be independently engaged in academic
learning. These direct observation results from the natural environment strongly
suggest that the experimental intervention successfully taught instructional assistants
the intended instructional strategies and skills. Results also showed the expected
positive, collateral effects of changed instructional support behaviors on the
likelihood of students’ academic engagement and appropriate classroom behaviors.
This finding begins to explain the correlation between instructional assistant use and
student outcome, although further research is needed to investigate the longer-term
effects of measurable achievement or behavioral adjustment.
Although not directly measured, it is also possible that an increase in student
academic engagement served to reinforce the newly learned skills of the instructional
assistants. During interviews, for example, trained instructional assistants made
comments such as,
a. He has become more independent and can take a direction and
follow through.
b. He was more independent on certain tasks that he wasn’t able
to do before. A complete change.
Comments like these suggest a noticeable change in students’ behavior, which could
encourage the instructional assistant to continue behaving in ways that result in
student independence.
Although it seems reasonable to associate changes in students’ academic
engagement with changes in observable instructional assistant behaviors, comments
made by several of the instructional assistants during the post-intervention interviews
also suggest that the instructional assistants’ skills improved beyond observable
behaviors. Many instructional assistants stated that they had been unaware of their
own behavior and how it affected students. Once the notion of consciously thinking
about how and when to provide support was introduced, instructional assistants were more conscious of their actions and the positive or negative effects those actions had on student behavior. One such comment was made by an instructional assistant who said, “It is
vital for aides to know why they are doing what they are doing and what the specific
reason for doing that action was.” Another instructional assistant stated, “The training
brought some awareness to what I was doing and why.” These comments suggest that the training produced overall improvement in the instructional assistants, improvement that was more than the sum of discrete behaviors: a shift in thinking combined with the generalization of newly learned behaviors to applied settings.
Directly Observed Changes in Student Challenging Behavior
Although not targeted specifically by experimental training, in some cases
students’ challenging behaviors decreased in association with observed changes in
instructional support behaviors. Observation of some instructional assistants in the
treatment condition at pretest showed a statistically significant higher rate of student
challenging behavior than that of the control group at pretest. Although this can be
attributed to a small number of students who demonstrated high rates of inappropriate
behavior, the fact that these behaviors diminished in association with the observation
of more effective instructional behavior may be important. That is, an increase in
instructional effectiveness may serve to redirect or remediate undesirable student
behaviors indirectly by better directing attention, pacing the task, or matching
prompts to student behavior as task engagement proceeds.
It is also possible that spontaneous improvements in student behaviors
prompted changes in what were coded as more effective support behaviors by the
instructional assistant. However, this is unlikely given the histories and baseline
observations of assistant-student dyads prior to intervention. The specific
instructional support behaviors taught to assistants in this study are most successfully
employed when assistants can quickly recognize that a specific support is no longer
effective, what alternative supports might be effective, and what decisions will result
in movement towards more independent student responding. In contrast,
undifferentiated and repetitive prompting that is contingent on a variety of student
responses is much more likely to be under the control of student behavior. It is
therefore unlikely that spontaneous improvements in student behavior would have
elicited the generalized improvement observed in the instructional assistants. In
addition, self-report by some assistants supports the interpretation that the trained instructional behaviors prompted student behavior change, and not vice versa. For example, one instructional assistant was asked how
she knew when to use certain methods of support. She stated,
It certainly depends on how he (the student) reacts; it can be a minute-to-minute decision.
Such reflective responses further suggest that trained instructional assistants were
aware that they should and did consciously and quickly make decisions regarding the
type and level of support, instead of unconsciously repeating ineffective behaviors.
Directly Observed Changes in Descriptive Praise
At baseline, across all participants, even casual observation showed that the use of
“good job” and similarly generic, non-descriptive phrases was most frequent.
Although the use of descriptive praise is one of the easiest strategies to employ, at
pretest instructional assistants not only infrequently used any verbal praise but also
rarely if ever used descriptive praise to reinforce appropriate behavior. Anecdotal
evidence suggests that without being specifically taught to do so, instructional
assistants began to generalize their use of descriptive praise to all appropriate student
behaviors, not purely academic behaviors, following the intervention. This is not
surprising, considering that instructional assistants were given an hour-long training
on the importance of descriptive praise to increase student appropriate academic
behaviors. During this time, participants were trained not only to increase their use of
praise but to use descriptive statements contingent on the demonstration of desirable
academic behaviors. More specifically, participants were instructed to avoid the
“good job trap” in which the use of “good job” became an overused and automatic
response given to the student following the occurrence of a behavior.
If non-descriptive praise did in fact serve as a reinforcer for the student either
academically or behaviorally, the continual use of this same type of phrase may have
lost reinforcement value for students, possibly leading some students to engage in
more challenging behavior. It is possible, therefore, that observed decreases in
students’ challenging behaviors might also have resulted from increases in the trained
use of more descriptive praise. This interpretation should be viewed with caution,
however, because it is unclear whether any specific verbal praise functioned
simultaneously as a reinforcer of appropriate social and academic behaviors.
As suggested previously, for the purposes of this study general praise was
defined as an undesirable behavior. However, it should be noted that the use of
general praise, particularly when it serves as a reinforcer, might be sufficient to
increase or maintain appropriate student behavior. Therefore, general praise should
not be thought of as a behavior that instructional assistants should eliminate
altogether; instead, further investigation should determine which type of praise is more
effective in increasing academic engagement and the demonstration of appropriate
student behaviors. These data only show that treatment participants decreased their
use of general praise from pretest to posttest, which may have been a result of a
corresponding (substitutive) increase in descriptive praise. Similarly, participants in
the control group demonstrated a decrease in the use of general praise from pretest to
posttest but did not show a corresponding increase in their use of descriptive praise at
posttest. This suggests that instructional assistants who did not receive the training
provided fewer general praise statements at posttest when compared to pretest. This is
of some interest; it appears that without training, the use of general praise may
decrease over time because assistants begin to view verbal praise as nonfunctional.
Further investigation is necessary to clarify these issues.
It should also be noted that although a reliable difference was found between
the groups at posttest for the use of descriptive praise, the actual mean change from
pretest to posttest for the treatment group was relatively small. This is of interest
because the delivery of descriptive verbal praise was assumed by the researcher to be
the easiest of the targeted behaviors to train. One possible explanation for the low
rates of descriptive praise following intervention may be that participants did not
interpret this behavior to be of high value. The effects of using descriptive praise on
student academic engagement may not have been as apparent as other behaviors, such
as adjusting proximity and decreasing prompting to the task. This would likely be true
if verbal praise given by the instructional assistant did not serve as a reinforcer for the
student. For instance, older students may discourage the use of verbal praise by an
adult since it may be perceived as socially stigmatizing. Another possible explanation
may be that within a ten-minute observation, it may not be appropriate to emit such
high rates of descriptive praise. Continually verbally reinforcing a student during a
small period of time may feel contrived to the students or the instructional assistants,
or inappropriate in the context of the particular observation.
Conversely, a small number of the instructional assistants commented on the
ease and effectiveness of descriptive praise. One such instructional assistant stated,
“Descriptive praise was the easiest to implement since he (her student) responded so
well to it.” This may have been due to the fact that descriptive praise, as previously
suggested, served as a reinforcer for this particular student. If this was the case, the
instructional assistant may have observed a more evident change in the student’s
behavior, increasing the likelihood that the instructional assistant would continue to
use descriptive praise. This may help to explain why some instructional assistants
demonstrated high rates of descriptive praise while others continued to show limited
use at posttest.
Interestingly, the instructional assistant quoted above also acknowledged the
lack of praise given by other instructional assistants and the notable difference in
student behavior. This reflection may suggest that the trained instructional assistant
became increasingly aware of the absence of descriptive praise from other
instructional assistants. Although this warrants further investigation, it is conceivable
that increased awareness of the instructional assistant’s own behaviors concurrently
increased the awareness of the behaviors of other instructional assistants. These
insights into the instructional assistant’s thinking support the possibility that the
training brought an awareness of what behaviors the instructional assistant was
emitting and how such behaviors affected the student’s level of independence.
Directly Observed Changes in Prompts to the Teacher or Natural Cue
The finding that prompts to the teacher or natural cue were significantly
different between the groups at posttest suggests that participants in the treatment
group learned to initially prompt the student to environmental cues instead of
immediately providing task-specific prompting. Although this finding was
statistically significant, it is interesting to note that the actual mean change from
pretest to posttest for the treatment group was small. This may be because
instructional assistants have historically begun their positions without appropriate
training, leaving them to fend for themselves and develop their own strategies for
supporting the student (Council for Exceptional
Children, 2001; Deardorff et al., 2007; Jones & Bender, 1993; Petscher & Bailey,
2006; Pickett & Gerlach, 2003; Pickett et al., 2003). These independently developed
patterns of support behavior are typically aimed at keeping the student quiet and out
of the teacher’s way, which may be difficult patterns of behavior to break after a
history of reinforcement from the environment and teacher (Marks et al., 1999). For
these reasons, initial prompting to the teacher may have been viewed as causing a
disruption in the flow of the class or teacher instruction, while prompting to the
environment may significantly slow down the lesson, leaving the student behind and
rushing to catch up. All of these factors inevitably put pressure on the instructional
assistant to quickly and quietly keep the student moving through the task, which may
be easiest done by immediately providing prompts directly related to the task.
The increase in treatment participants’ prompts to the teacher or natural cue is
important in light of the literature, which states that teachers purposefully or
unintentionally cede much of the responsibility for managing students with
disabilities to instructional assistants, leaving the instructional assistants to deliver the
bulk of instruction by default. Teachers end up having only limited instructional
interactions with these students (Giangreco, 2009; Giangreco et al., 1997). It is
therefore significant that trained assistants directed attention more frequently to the
instruction being delivered by teachers in classrooms. Such an increase in prompts to
the teacher may help to focus the student’s attention on the teacher and her instruction
and thereby increase the benefits of inclusion. For these reasons, a portion of the
training reminded instructional assistants that the goal of inclusion is for the student
to function as part of a teacher-led class, and the instructional assistant does not have
to conduct all instruction in order to maintain inclusion. Future development may
build on methods for training instructional assistants to better understand how to
transfer instructional control to the classroom teacher and, indeed, to better solicit
teachers’ instructional attention to students with disabilities in their classrooms.
Directly Observed Changes in Prompts to the Task
Although changes in prompts to the teacher or natural cue were small, results
show that prompts directed to the task were reduced following the intervention. As
with praise, instructional assistants prior to training rarely used task-specific prompts
effectively, rarely adjusted them to the current circumstances, and rarely showed
consideration in their choices of prompts. For example, at baseline all participants
were observed to engage in a high rate of prompting to the task. Because the
frequency of prompting was so high and instructional assistants used multiple
prompts at once, it was impossible to code in real time the number and type of
prompts to a task given during each ten-second interval without a permanent video
record. Anecdotal evidence confirmed that the most commonly observed prompts
were verbal and gestural prompts, which were repeatedly used in combination at a
high frequency by all participants. This is consistent with research conducted by Hall,
McClannahan and Krantz (1995), which indicated that instructional assistants
engaged in high levels of multiple prompts to the task prior to an intervention and
continued to use high levels of verbal and gestural prompting following the
intervention phase. Only when the researchers specifically told the instructional
assistants to use only physical prompts did the frequency of verbal and gestural
prompts decline.
The changes in instructional assistant prompting from pretest to posttest
suggest that the intervention successfully taught instructional assistants to be aware of
how often they were prompting, how many prompts they were giving at a time, and if
the prompt was necessary to facilitate student independence.
Although instructional assistant prompts to the task showed substantial
change, instructional assistants agreed that this was the hardest behavior to modify.
For example, when asked what the most difficult strategy to implement was, one
instructional assistant responded, “Not doing a verbal and gestural
prompt at the same time. The training brought some awareness to what I am doing
and why.” This instructional assistant acknowledged that she still occasionally
prompted using this combination of prompts but now catches herself and consciously
stops. This suggests an increasing awareness of how prompts are being
given, an important characteristic of an effective instructional assistant. Also of
interest, this instructional assistant determined that the least intrusive prompt was not
a verbal prompt but instead a visual prompt. Following the training, she recognized
that her student had become dependent on verbal prompts and found that a visual
prompt was much easier to fade. Although the observable behavior changes were
quantitatively demonstrated in the data, these comments also reflect a qualitatively
different instructional assistant capable of targeting the reason for the lack of student
independence and problem solving in order to determine the most effective course of
action.
Another instructional assistant also recognized the difficulty in using only one
prompt at a time by stating, “The most difficult strategy to implement was not
verbally prompting along with additional prompts (gestures).” This statement is
important because observations during coaching sessions, and comments made by
instructional assistants while debriefing after coaching, indicated that this was a
difficult pattern of behavior to break. This difficulty may be due to the fact that
instructional assistants were left to develop their own strategies or observed other
instructional assistants using multiple prompts to ensure the student responded.
When a variety of prompts is used concurrently, for example verbal and gestural
prompts, the student is more likely to respond to at least one of them. Although this
may initially seem like a favorable method of prompting,
instructional assistants were reminded that any prompt inserted into the environment
must be faded in order for the student to demonstrate independent engagement. This
was a critical portion of the training: instructional assistants were reminded that the
use of multiple prompts simultaneously made it increasingly difficult to determine
which prompt served as the controlling prompt. For example, many students with
autism are characterized by their difficulty processing and responding to verbal
language. Therefore, the use of a gesture prompt may be more effective. If an
instructional assistant were to use multiple prompts concurrently, it would be difficult
to determine which prompt was controlling the behavior and which prompt should be
faded first.
Wait time, although not directly measured, was also an important variable in
the instructional assistants’ ability to effectively prompt. At baseline, the repetition of
multiple prompts without wait time left little chance for the student to demonstrate
independent academic engagement. For instance, the following sequence
demonstrates an abundance of prompting that occurred within forty seconds by an
instructional assistant.
1) “What number goes here?”
2) “How many hoses?”
3) “Is that 26?”
4) “Can you write 26?”
5) Hand over hand prompting to write two and then six.
6) “What is that?”
7) “Do you know?”
8) “Say, ‘I don’t know.’”
9) “What is it?”
10) “Let’s count.”
11) “Eyes on your paper.”
12) “Eyes on your paper.” (With gesture prompt)
13) “Let’s count, ‘20, 21…’” (Student begins to count)
14) “Good job, give me a high five.”
This illustrates a typical sequence of prompting observed during baseline,
which supports the previous statement that, at baseline, prompts were typically
ineffective, not specific to the circumstances and task, and not deliberately selected
and utilized. This also suggests that such high levels of prompting incorporated little
to no wait time between the prompts, which inevitably interfered with the student’s
ability to independently respond. During the interview, this instructional assistant
acknowledged that providing wait time as well as not repeating the same prompt were
two of the most difficult strategies to implement. Although the data did not
specifically measure wait time due to the complexity of the behavior, it is likely that
instructional assistants in the treatment group were using increased amounts of wait
time following the intervention. This was supported by the data, which showed an
increase in student academic engagement, a behavior incompatible with prompting.
Directly Observed Changes in Proximity
The distance between the instructional assistant and the student substantially
increased following the intervention for treatment participants when compared to
control participants. The training components effectively taught instructional
assistants to both recognize their proximity to the student and to make decisions in
regards to when it was appropriate to adjust proximity. This was a key factor in
differentiating between teaching the instructional assistants a discrete behavior and
definitive rule, versus teaching instructional assistants to consciously make decisions
regarding their proximity and its corresponding effects on student engagement. One
instructional assistant commented in the interview, “I always sat right next to or
beside the student to make them work,” which suggests that she wrongly learned that
proximity should be used to control student engagement. Undeniably, this is a faulty
assumption for two reasons. First, close proximity can lead to over-prompting and
student dependence simply by being close to the student (Egilson & Traustadottir,
2009; Giangreco et al., 1997; Malmgren & Causton-Theoharis, 2006). Second, it is
unrealistic to use proximity to control engagement since students in general education
are frequently expected to work independently and without the close supervision of
an adult. Although teachers do use proximity to redirect students back to the task, the
continual use of close proximity may begin to serve as a nonverbal prompt for the
student to engage in the task. This type of prompt, as all other prompts, must be faded
to ensure student engagement is not controlled by the presence of an adult. Anecdotal
evidence suggests that this was the case for one of the instructional assistants. During
posttest data collection, it was clear that the student had come to rely on the presence
of the instructional assistant in order to engage in the task. Student behaviors such as
turning around to look at the instructional assistant or disengaging from the task were
evident once the instructional assistant moved four or more feet away. The
instructional assistant correctly identified that her proximity must be slowly faded in
order to ensure the student’s continued engagement in the task.
Although instructional assistants were deemed out of the student’s proximity when
two or more feet away, it was also evident that many of the instructional assistants
were able to move much farther than two feet without interrupting the student’s
academic engagement. Many of the
instructional assistants in the treatment group were able to monitor the student from
two or more feet away, for several ten-second intervals in a row, following the
intervention. This suggests that in some cases student academic engagement was not
controlled by instructional assistant proximity. This seems to support the notion that
many of the instructional assistants were not aware of their continual close proximity
and the importance of distancing themselves from the student. This is consistent with
a study by Giangreco and Broer (2005), which found that instructional assistants
were typically within close proximity to the student for 86 percent or more of the
day. This is of interest since fewer than 15 percent of that sample reported worrying
that their close proximity might be unnecessary or might interfere with teacher and
student interactions.
Instructional Assistant and Teacher Interactions
Although no data were collected on instructional assistant and teacher
interactions, a small number of instructional assistants commented on the general
education teacher’s response to the student’s increased academic engagement. These
comments serve to reinforce the notion that student behavior change was evident and
observable. For example, one such instructional assistant stated, “she noticed that he
was quieter and more independent. She talked more about what she wanted him to do
academically. She noticed a huge difference before and after the training.” Another
instructional assistant made the comment, “I am being more informative to her (the
teacher) as to what the student is accomplishing. He (the student) has made such great
strides that I wanted to share it with her.” These comments imply an awareness, by
both the instructional assistant and the general education teacher, of the changes in
the student. This is of great importance since research has
indicated that many teachers have historically avoided ownership and responsibility
for the student due to the fact that an instructional assistant is readily available
(Giangreco, Broer et al., 2001b; Giangreco et al., 1997).
Impact on Instructional Assistant Thinking
An experimental measure, developed by the researcher and validated by
experts in Applied Behavior Analysis, was aimed at determining whether the training
intervention changed how instructional assistants thought about instructional
interactions. A series of realistic vignettes were devised to represent common
behavioral interactions between instructional assistants and students during inclusive
instruction. Although this was an experimental measure and should be further
evaluated to determine its reliability and validity, how instructional assistants process
observations of students’ as well as their own behaviors is a potentially important
component of evaluating overall instructional assistant effectiveness. Instructional
assistants can learn specific behavioral techniques, as demonstrated by this study, but
their appropriate generalization of these techniques to changing circumstances can be
facilitated if they also learn to attend to, recognize, and effectively respond to
instructional opportunities as they occur. In this study, the researcher hoped that the
vignettes might capture the degree to which training experiences and feedback had
generalized to untrained examples of common classroom episodes.
At baseline, in a series of vignettes representing a variety of behavioral
scenarios related to inclusive instruction, a majority of the participants demonstrated
only a limited ability to judge whether depicted instructional support behaviors were
likely to be “ineffective” or “effective.” This result was to be expected, as the
literature has indicated historically that instructional assistants are typically the least
trained personnel assisting students with the most demanding and complex
educational needs (Giangreco, 2009). Although not specifically instructed to
comment, a small number of the instructional assistants nevertheless wrote comments
on the vignettes either justifying their judgments or giving examples of behaviors
thought to be more efficacious. For example, for a vignette depicting ineffective
prompting, one of the instructional assistants wrote,
Does Eric (the student) need hand over hand? IA should modify
assignment for Eric with a model to follow and use positive
reinforcement.
This type of comment served as an indicator that the instructional assistant could
identify the ineffective behavior and also suggest alternative effective behaviors,
although it should be noted that at baseline correct comments such as this one were
rare.
Although a few instructional assistants were able to identify the ineffective
instructional assistant behavior, many of the instructional assistants who wrote such
comments focused predominately on student behaviors and skill levels depicted in the
vignettes. For instance, for a vignette depicting ineffective proximity, an instructional
assistant incorrectly indicated the hypothetical instructional assistant was “highly
effective” and wrote,
…but…Matt is too dependent after 2 years. Child should be able to
write name himself.
This comment is of interest since it suggests that the instructional assistant recognized
only that the student was too dependent but did not attend at all to the correlated
behaviors of the instructional assistant being depicted. In fact, the instructional
assistant rated the hypothetical aide as “highly effective.” One aim of this study,
therefore, was to prompt instructional assistants to see such instructional episodes – in
the vignettes, but also in real life – as involving behavioral interactions in which their
own behavior could and should be observed and modified.
A few instructional assistants did correctly identify when the fictional
instructional assistant’s behaviors were ineffective, but nevertheless recorded an
incorrect judgment. For example, for an ineffective proximity vignette, one
instructional assistant incorrectly rated the vignette as “somewhat effective” although
her comments clearly indicated that she knew that close proximity was unnecessary.
She wrote,
…fourth graders don’t need you right next to them.
She also wrote,
…Why not wander around the class and give him a chance to be
independent and confident in his work.
The discrepancy between her comments and her recorded rating of IA effectiveness
raises a question of whether she was confused by the rating scale or, more
specifically, what was meant by “somewhat effective.” She may have felt that
although the close proximity was inappropriate, the hypothetical instructional
assistant might still be doing a relatively good job, indicating that she may not have
realized how adjusting proximity could be employed dynamically as part of an aide’s
instructional repertoire.
It is interesting also to note that, although some instructional assistants were
able to identify effective or ineffective instructional assistant behaviors on the various
vignettes, this apparent knowledge did not appear to translate to actual practice when
they were observed in classrooms. Quite the contrary, during baseline direct
observation, the instructional assistants displayed high rates of ineffective and low
rates of effective behaviors. For instance, although all 31 instructional assistants were
able to identify a hypothetical instructional assistant as effective in the effective
descriptive praise vignette, only eight instructional assistants (26%) were observed to
deliver even a few descriptive praise statements during baseline direct observations.
For the remainder of the instructional assistants, no descriptive praise was observed at
baseline. These examples suggest at best that instructional assistants may possess
some formal, declarative knowledge about classes of instructional behavior that are
desirable and “effective” while not possessing the procedural knowledge necessary to
generate these behaviors appropriately when they were working with students
(Anderson, 1976).
At posttest, the data indicated that instructional assistants in the treatment
group had significantly improved compared to the control group. Although the mean
gain for the treatment group was small, the posttest vignettes did reflect an increase in
instructional assistant awareness and recognition of effective and ineffective
instructional assistant behaviors. These findings were further supported by many of
the written comments made by the instructional assistants who could accurately
detect the ineffective and effective behaviors of the hypothetical instructional
assistants. Written comments provided additional information with regard to what
information instructional assistants attended to, how such information was
interpreted, and how they reasoned through a specific scenario that had a number of
different possible interpretations. In response to the vignette about ineffective wait
time, for example, one instructional assistant wrote, “no wait time” and “she gave him
the answer, he didn’t have to do any work.” For the vignette on ineffective descriptive
praise, another instructional assistant wrote, “It seemed boring just saying ‘good job.’
It did not really explain…needed to be more specific.” For the vignette concerning
effective wait time, a different instructional assistant wrote, “I think that this was very
good-1st she gave him a few seconds to get going-when she saw that he just sat there,
that’s when she made her move.” These comments assisted in the interpretation of the
quantitative changes detected in the “correct” characterizations of the interactions the
vignettes were meant to illustrate.
Other written comments that detailed specific elements of the vignettes
support the interpretation that instructional assistants were attending to discriminative
information and interpreting that information as the training intended. Such
comments identified ineffective use of multiple prompts at once, the use of more
intrusive prompts before less intrusive prompts and a lack of wait time between
prompts. Likewise, in vignettes illustrating descriptive praise, instructional assistants
were able to identify the absence of specific praise, the occurrence of reprimands
instead of praise and the use of praise to motivate the student to continue working.
Comments regarding proximity suggested that instructional assistants were able to
identify the appropriate adjustment of proximity when the student was independently
engaged, a hovering instructional assistant and the incorrect adjustment of proximity
before the student demonstrated independence on the task. These written comments
confirm that instructional assistants in the experimental training group gained
increased awareness of salient features of the interactions depicted in vignettes and
that they reasoned differently about what they noticed in vignettes.
In contrast, some participants in the control group continued to pick out
insignificant features of the vignettes at posttest, and commented only on them. For
instance, one instructional assistant rated the ineffective proximity vignette as very
effective and wrote, “Very effective but level of mathematics seems so low for a 4th
grader.” Although this and similar comments may make valid observations, they
overlook the salient feature of the instructional interaction the vignette was intended
to illustrate: an instructional assistant unnecessarily maintaining close proximity to
the student. For an ineffective
prompting vignette, an instructional assistant in the control group underlined the
sentence in which the instructional assistant said to the student, “Carlos, write roots
on your chart or you are going to be behind everyone else.” Next to this sentence she
wrote, “Never compare your child to another. Not ok!” Although this observation
may be valid, the salient feature of this vignette was the ineffective instructional
behavior of continually repeating the same verbal prompt.
It should be noted that many of the instructional assistants in the control group
were able to indicate correct answers on the vignettes that were written to reflect an
instructional assistant engaging in effective support behaviors. This appears to
indicate that training may have had a differential effect on participants’ ability to
identify ineffective instructional behaviors. This differential effect may explain why
some control group members tended to comment on a student’s behavior or academic
ability for relevant vignettes. In short, these results support the potential of carefully
designed, sufficient training to increase the salience of both ineffective and effective
instructional behaviors and their consequences for student outcomes.
Follow Up
Results of the follow-up probes indicate that instructional assistants in the
treatment group were able to maintain their skills for 2-3 weeks following the
intervention. Maintenance of the instructional support behaviors following the
training indicates that instructional assistants can sustain lasting behavior change
following the withdrawal of training. This may have occurred for several reasons.
First, the training was carefully designed to provide job specific skills to instructional
assistants working in inclusive settings. This may have made the retention of the
newly learned behaviors more likely. Second, the training package incorporated
coaching as a way of transferring newly learned behaviors from the training
environment to the classroom. Providing a means of transferring knowledge from
training into the classroom may have solidified understanding of how to use such
skills in the natural environment, thereby increasing the likelihood of lasting behavior change. Lastly, the
design of the training package enabled instructional assistants to learn and practice
one skill in isolation prior to the introduction of a new skill. This may have been a
simple and readily understandable training format for instructional assistants, thereby
increasing the likelihood for transferability and retention over time. For these
reasons, instructional assistants were able to successfully acquire and maintain the
skills necessary to demonstrate the targeted behaviors following the training.
Although these factors may have contributed to instructional assistant retention, it
should also be noted that follow-up was assessed with a single probe and was
conducted shortly after posttest. Future research is required to determine the
durability of maintenance over progressively longer periods of time.
In conclusion, the program demonstrated both efficiency and effectiveness,
particularly in relation to the increase in instructional assistants’
effective support behaviors and student independent academic engagement. In
contrast to other efficient training programs, this study contributed to the literature by
demonstrating positive changes in both instructional assistant behavior and student engagement
from low baseline levels to high post-intervention levels, following only three hours
of training and an average of 50-60 minutes of in-class coaching. If such gains can be
achieved in a relatively short training period, it is likely that ongoing training and
coaching throughout the course of a school year would produce larger and more
substantive gains in student engagement and instructional assistant effectiveness.
Social Validity
Social validity was demonstrated by the instructional assistants’ favorable
ratings. This is of great importance since instructional assistant acceptance is likely to
translate to greater susceptibility to training. In this study, favorable ratings were
likely a result of a training package that provided job specific skills tailored to the
needs of instructional assistants working in inclusive classrooms. This type of
targeted instruction is a crucial aspect of providing instructional assistant training that
is both effective and durable.
It may also be assumed that instructional assistants felt comfortable and
confident using the strategies learned in the classroom, since the training was
presented in an easily understandable yet structured format. In essence, the training
emphasized the rationale, the correct usage of each of the targeted behaviors and how
and when to use such behaviors. Presenting the information in this way not only
enabled instructional assistants to learn the specifics of how to effectively use each
behavior but also reinforced the notion that utilizing such behaviors would improve
instructional assistant effectiveness and increase student independence. To further
support this claim, one instructional assistant made the comment, “I needed to hear
and see how these strategies can help a student become more independent.” This type
of comment suggests that future trainings should include clearly presented
concepts, as well as precisely defined behaviors that can be broken down so that
each of their component parts can be taught.
Instructional assistants indicated that this training was useful not only with their
target student but also with a variety of other students, including those without disabilities.
For instance, one instructional assistant wrote the comment, “This has helped my
student and with the other students in the class.” It may be that instructional assistants
gave such favorable ratings because this training provided instruction on pivotal
behaviors that could be used with a variety of students. This may be of particular
importance since instructional assistants are frequently moved within the district to
support a wide range of students. For this reason, future trainings for instructional
assistants should provide targeted instruction on foundational skills that can be used
to support students with varying disabilities and ages.
It is also important to take into account the evolving role of instructional
assistants, which has made it increasingly common for instructional assistants to
deliver instruction to small groups of students both with and without disabilities.
Therefore, a training that provides instruction on a set of pivotal
skills usable with a variety of students may be highly valuable. Conceivably, this could
increase the likelihood of treatment acceptability as well as the durability of behavior
change over time since such skills would likely be utilized frequently with a variety
of students.
Overall, instructional assistants felt the training was critical for supporting
students with disabilities in general education and would also recommend the training
to other instructional assistants. Written comments such as, “It’s nice to have a
beneficial training. Our trainings rarely target our issues all that well” also provide
further evidence that trainings specifically tailored to the needs of instructional
assistants are regarded as highly valuable. This further suggests that districts
can deliver targeted and carefully designed trainings, in a brief and cost-effective
format, that instructional assistants find both useful and highly valuable.
Finally, when asked to rate the specific components of the training as not
valuable, moderately valuable or highly valuable, many of the instructional assistants
indicated that practice (during the training) was only moderately valuable. This is an
interesting finding, considering that practicing a newly learned behavior under the
supervision of the trainer would help ensure correct use of the behavior prior to
using it in the classroom. During practice time, the researcher observed instructional
assistant discomfort when asked to practice a specific skill with a fellow instructional
assistant. If the researcher was not standing within close proximity, the instructional
assistants were observed to talk with one another or engage in
other off-task behaviors, such as checking their phones or looking outside. One
possible explanation for this discomfort may have been feelings of embarrassment in
demonstrating the new behaviors in front of others. In one such case, this fear caused
such anxiety that one instructional assistant almost chose not to participate in the
study. She made a comment to the researcher that she did not want to participate if
she was going to be “put on the spot” or asked to stand in front of the group and
demonstrate a behavior. This provides further evidence that instructional assistants
may lack confidence in their jobs, due to a lack of training both at the onset of the
job and throughout their time as instructional assistants.
A majority of the instructional assistants indicated that coaching and feedback
were highly valuable. This is consistent with past literature, which indicates that
coaching and feedback are regarded as widely accepted training components (Arco,
1991). For this study, this component of the training appeared to be pivotal in
ensuring the transferability of skills from the training to the natural environment.
After reviewing the coaching forms, it was clear that many of the incorrect patterns of
prompting behavior observed at pretest were still being demonstrated following the
initial training on prompting. Participants continued to engage in the use of multiple
prompts, little wait time and the repetition of an ineffective prompt. Similarly, as
evidenced by anecdotal notes and coaching feedback, after the final training a number
of participants continued to demonstrate a low frequency of descriptive
reinforcement. This suggests the importance of coaching and feedback following
training to ensure the transferability of newly learned skills.
Although many of the instructional assistants indicated that coaching was
initially uncomfortable, upon conclusion of the study, coaching was regarded as a
necessary component of the training. Although feelings of discomfort are presumably
common while being observed, this may also provide some evidence that
instructional assistants are not accustomed to being observed or receiving feedback in
the natural classroom setting. This notion is supported in the literature, which
indicates that special education teachers are rarely able to train and supervise
instructional assistants in the general education classroom (Giangreco et al., 1997).
Unfortunately, lack of supervision and feedback may further reinforce the notion that
instructional assistants are to rely on their own ideas of how and when to provide
support. As research has suggested, this is neither effective nor appropriate and can
result in unfavorable effects on student independence and performance (Giangreco et
al., 2005).
It may be that consistent coaching and feedback would ease instructional
assistant discomfort and thereby be viewed more positively. As evidenced in this
study, over time instructional assistants became accustomed to observation and
feedback, which resulted in a shift in instructional assistant opinions regarding
observation and coaching. Comments made by instructional assistants during
interviews indicated that being observed was helpful because the presence of the
observer served as a prompt to utilize newly learned skills and to continue to be
cognizant of student behavior. Although further investigation is needed to determine
the effects of coaching and feedback on instructional assistant performance, this study
demonstrated that brief coaching sessions can be both efficient and highly effective.
Instructional Assistant Characteristics and Susceptibility to Training
For both the total cognitive measure and the total behavioral measure,
instructional assistant susceptibility to training was predicted only by group
assignment. The other characteristics thought to predict significant
changes in vignette scores from pretest to posttest, such as instructional assistant
gender, age, previous education, years of experience as an instructional assistant and
years working in inclusion, were not predictive of instructional assistant susceptibility
to training for these measures. This is a noteworthy finding since it may be assumed
that instructional assistants who have higher levels of education or have worked
longer in the job are more equipped to provide effective academic supports for
students with disabilities. This finding suggests otherwise: this type of specialized
training is needed and can be beneficial for a variety of instructional
assistants with a range of experience and education. This also supports the claim that
long-term employment does not equate to increased knowledge or the
demonstration of effective support behaviors.
Summary
As continually reiterated in the literature, instructional assistants rarely
receive job-specific training that targets the necessary skills to effectively support
a student with disabilities (Deardorff et al., 2007; Doyle, 1995; Giangreco et al.,
2002; Riggs & Mueller, 2001). Fortunately, a small portion of the training literature has
focused on providing systematic training to instructional assistants, in order to
increase their ability to implement a variety of skills used to facilitate social
interactions (Causton-Theoharis & Malmgren, 2005; Kohler, Anthony, Steghner, &
Hoyson, 2001), teach literacy skills (Lane et al., 2007), teach communication skills
(Bingham et al., 2007; Wood et al., 2007) and implement discrete trial teaching
(Bolton & Mayer, 2008).
Several prior researchers have demonstrated that instructional assistants can
learn such skills and successfully implement new strategies after a relatively short
training period (Bingham et al., 2007; Bolton & Mayer, 2008; Causton-Theoharis &
Malmgren, 2005; Hall et al., 1995; Petscher & Bailey, 2006). Supported by previous
research, this study suggests that a training package comprised of didactic instruction,
modeling, role-playing, practice, coaching and feedback can efficiently and effectively
teach instructional assistants to implement evidence-based strategies.
Although brief trainings on a variety of topics have been shown to be successful, a
growing number of researchers have indicated the vital need for more specific
research detailing the effects of instructional assistant supports on students with
disabilities in order to develop targeted professional development that is both
effective and efficient (Causton-Theoharis & Malmgren, 2005; Giangreco et al.,
1997; Hall et al., 1995; Malmgren & Causton-Theoharis, 2006; Young et al., 1997).
In recent literature, researchers have given high priority to studies investigating
student outcomes as a result of instructional assistant utilization (Conroy et al., 2004;
Giangreco, Broer et al., 2001b; Giangreco et al., 1997; Malmgren & Causton-Theoharis, 2006; Young et al., 1997) rather than focusing primarily on student,
instructional assistant and related professionals’ perspectives and attitudes regarding
utilization (Giangreco & Broer, 2005b; Giangreco, Edelman, & Broer, 2001; Marks et
al., 1999; Riggs & Mueller, 2001; Tews & Lupart, 2008) and teacher and
instructional assistant roles and responsibilities (Carroll, 2001; Lamont & Hill, 1991;
Pickett & Gerlach, 2003; Pickett, Gerlach, Morgan, Likins, & Wallace, 2007). This
research takes the current understanding of the utilization of instructional assistants a
step further, to determine the effects of a carefully designed instructional assistant
training package on student academic behavior in the general education setting. To
date, no research has specifically been conducted on this topic.
This study contributes to the literature on instructional assistant utilization by
examining the effects of student academic engagement prior to and following a
carefully designed yet brief training package. To date, a majority of the research on
the targeted behaviors in this study has been observational and qualitative in nature
and has not evaluated the effects of an intervention to determine if changes in
instructional assistant behavior consistently alter student academic behavior (Egilson
& Traustadottir, 2009; Giangreco, Broer et al., 2001b; Giangreco et al., 1997;
Malmgren & Causton-Theoharis, 2006; Young et al., 1997). This study provides a
foundation for future research to determine the effects of instructional assistant
training on student academic outcomes.
Limitations of Findings
Findings in this study should be interpreted with caution due to several
limitations, which threaten the interpretations and generality of the results. The
following limitations should be addressed in future research. First, the sample size
was relatively small, which likely reduced the precision of measurement and
thereby the power of the statistical analysis. Replications with independent
samples are needed to reinforce the findings of this study.
Also to be considered is the possibility that participants in this study may not
be representative of instructional assistants in other settings, as this study drew a
small sample from a single location. Although a small
sample size may not generalize to the larger population of instructional assistants,
these instructional assistants may show similarities to other instructional assistants.
Detailed demographic information from a small number of studies suggests that
instructional assistants typically vary in age, education level and years
of experience but are predominantly female and tend to begin the job with
limited job-specific training (Bolton & Mayer, 2008; Causton-Theoharis &
Malmgren, 2005; Deardorff et al., 2007; Wood et al., 2007). Previous observational
research supports the similarity of this sample to other such samples of instructional
assistants who have also demonstrated behaviors such as continual close proximity to
the student, more assistance than necessary and other detrimental behaviors, which
negatively affect student achievement and social integration (Egilson & Traustadottir,
2009; Giangreco et al., 1997; Malmgren & Causton-Theoharis, 2006). Past
experimental research also suggests that this sample resembles previous samples of
instructional assistants, who likewise demonstrated low levels
of effective support behaviors prior to intervention, followed by an increase in
effective behaviors following intervention (Bingham et al., 2007; Bolton & Mayer,
2008; Causton-Theoharis & Malmgren, 2005; Petscher & Bailey, 2006).
Second, although instructional assistants were randomly assigned to the
treatment condition, they volunteered to take part in the study;
therefore, sampling bias poses a potential limitation. Voluntariness may have
clouded the interpretation of results since these participants may have had an initial
interest in learning new skills. This may suggest that there is a difference between
instructional assistants who volunteer and instructional assistants who are required to
attend such trainings. Although this may be seen as a limitation, all previous research
conducted with instructional assistants has required voluntary consent to participate in
research. Therefore, the participants in this study may be comparable to other
instructional assistants who have participated in similar research, although it is
unclear if similar results would be achieved if a school or district required
participation.
Third, due to a lack of standardized measures for determining change in
instructional assistant thinking prior to and following an intervention, the
development of an experimental measure was required. Since this measure was
devised for this study and had never been used before, its validity is untested.
Further research must be conducted to establish the measure’s
validity and reliability.
Fourth, it should be noted that because the training package included a
combination of training procedures (e.g., instruction, modeling, role playing, practice,
coaching, feedback), it is unclear what role each of the components played in the
observed improvements in instructional assistant behavior. In order to determine the
predominant components, future research could conduct a component analysis of
each of the training components. This would allow for the evaluation of each
component and the elimination of those components considered less important or
unnecessary.
Lastly, additional research is needed to determine the maintenance and
generalization of such skills, particularly in light of the low baseline performance of a
majority of the instructional assistants. Although the data indicate continued use of
effective behaviors after the withdrawal of treatment, it is necessary to determine the
lasting effects of such a treatment and instructional assistants’ ability to generalize
these newly learned skills to different activities and different environments. Research
on staff-training indicates that the continuation of improved performance is not likely
without some form of ongoing feedback (Reid & Parsons, 2000), although it remains
unclear how much feedback is necessary (Schepis et al., 2001).
Implications for Practice
That instructional assistant training in basic instructional support procedures
is necessary for the successful inclusion of students with disabilities into general
education is generally agreed upon in the literature (Giangreco et al., 1997; Young et
al., 1997). Attempts at “one-shot” didactic trainings have been shown to be
ineffective in ensuring that instructional assistants retain and transfer newly learned
instructional support skills in the general education classroom (Leach & Conto,
1999). However, research has consistently shown that instructional assistants can be
taught to implement effective support strategies in a relatively short amount of time,
with high fidelity, using a combination of specific training components (Bingham et
al., 2007; Bolton & Mayer, 2008; Causton-Theoharis & Malmgren, 2005; Petscher &
Bailey, 2006; Schepis et al., 2001). The current study contributes to this last body of
literature by demonstrating that a carefully designed yet relatively short training,
coupled with a small amount of coaching, can have substantial effects on both
instructional assistant behavior and student outcome.
With respect to training efficiency, the low-cost training package considerably
improved instructional assistant effectiveness in a brief period of time. That is,
following a total of three hours of in-class training and an average of an hour
of coaching, instructional assistants were able to increase their use of effective
support strategies, while simultaneously decreasing their use of ineffective strategies.
These changes also produced collateral positive effects on student behavior by
increasing independent academic engagement and decreasing the occurrence of
challenging behavior. This suggests that the training package could help to address
the need for rapid and effective instructional assistant training procedures, responding
to continued budget cuts and evolving laws requiring the adequate training of
instructional assistants.
In light of these promising findings, there are several future directions in
which this research may be further developed to improve its applicability. Ideally, this
type of training ought to be attended by instructional assistants, their corresponding
general education teachers and the special education teachers. There are many reasons
this would be advantageous, including but not limited to increased awareness of
effective and detrimental instructional assistant supports, and reinforcement of
collaborative working relationships. Both factors contribute to effective instruction
for students with disabilities in general education classrooms (Devlin & Toledo,
2005). Lack of time to collaborate is frequently cited as one of the predominant
hindrances to effective teaming (Giangreco et al., 1997; Lerman et al., 2004; Rueda &
Monzo, 2002; Wallace et al., 2001); therefore, training attended by both the teacher
and instructional assistant may provide a starting point for collaborative discussion regarding each team member’s basic roles and responsibilities, how and
when student progress will be discussed, and an understanding of how to promote
effective supports in the classroom.
It may also benefit the special education teacher to attend these trainings, in
order to provide seamless services between special education and general education.
Although research indicates that special education teachers frequently bear the burden
of a large caseload of students, leaving little time to supervise or train instructional
assistants (Giangreco & Broer, 2005a), special education teacher attendance would
ensure that instructional assistants are, at minimum, receiving the same information
regarding effective and ineffective supports, which can then be easily reinforced by
the special education teacher. This may help both the special and general education
teachers effectively supervise instructional assistants, and determine which
instructional assistants require more explicit training in basic skills.
It should be noted that this study was originally intended to include a third
condition comprised of instructional assistants and their general education
counterparts, but recruitment resulted in dismal general education teacher
participation rates. Therefore, the condition had to be eliminated. Districts wishing to
implement such a training may first need to acknowledge and understand how general
education teachers view the role of the instructional assistant, and assess whether
these views correspond with teacher unwillingness to partake in joint trainings with
instructional assistants.
Districts wishing to train instructional assistants to provide more effective
supports should also be cautious of providing trainings solely to instructional
assistants, as this may inadvertently perpetuate the faulty assumption that
instructional assistants are to assume the educational responsibility for the student
with disabilities (Giangreco et al., 2003; Marks et al., 1999). For this reason, which is
continually reiterated in the literature, it may also be necessary to define teacher and
instructional assistant responsibilities during such a training. This may help to ensure
that instructional assistants are viewed as support personnel and that general
education teachers clearly understand the situations in which an instructional assistant
may be utilized (Pickett & Gerlach, 2003; Pickett et al., 2007).
Although a joint training of instructional assistants and general education
teachers appears to be warranted for several reasons, districts must also consider who
will be directly supervising instructional assistants and if the general education
teacher will either assume or share in this role. This brings into question a host of
other, overlooked complications; research has indicated that teachers are generally
unprepared to supervise and support instructional assistants within the general
education classroom. If teachers are to assume these roles, they should be afforded
specific training in effective supervision practices (Devlin, 2005; Marks et al., 1999;
Wallace et al., 2001).
It may also be beneficial to extend the training for instructional assistants
throughout the entire school year. Ongoing training and professional development
may help ensure that instructional assistants’ behavior changes endure in the long term. Taking
into account that continuing education would most likely need to be cost effective, it
may be beneficial to provide monthly maintenance sessions to safeguard against the
possible drift from correct implementation. A monthly meeting of this sort would also
provide a forum for instructional assistants to share ideas and effective strategies with
the oversight of a trainer, a practice typically employed by teachers but yet to be
instituted for instructional assistants.
Similarly, there may be several ways to increase the effectiveness of coaching.
First, modeling by the trainer during coaching sessions may enhance their value and
help instructional assistants transfer newly learned skills into the
classroom. Particularly for instructional assistants who continue to struggle with the
transferability of skills, modeling gives the instructional assistant an opportunity to
observe implementation of skills by an expert. Second, the addition of self-generated feedback may heighten instructional assistant awareness of their own
behavior and its corresponding effects on the student (Arco, 2008). This could be of
particular use in light of this study, which suggests that the development of an
effective instructional assistant requires the instructional assistant be aware of and
make conscious decisions regarding how and when support is provided. Allowing the
instructional assistant to reflect on and analyze their own behavior may reinforce the
importance of making deliberate decisions on the level and type of support provided.
Lastly, in addition to on-the-job feedback, the use of post-session videotaped
feedback may offer another means of delivering ongoing feedback in a more flexible
manner. A small amount of research indicates that this may be a useful tool that does
not require the presence of a trainer (Roter et al., 2004). This could be of assistance,
as it is likely that a trainer would be supervising multiple instructional assistants
working in several different schools (Robinson, 2007). Video feedback may be more
practical than in-person feedback, since the time in which feedback is given is more
flexible and it is less disruptive to the teacher and students when given outside of the
classroom. Other benefits of including this method of delivering feedback include
allowing the instructional assistant to critique their own behavior, acknowledging
effective and ineffective support strategies, and generating possible solutions to
ineffective supports. Instructional assistants should be constantly reflecting on and
analyzing their own behavior and the effects that it has on the student.
Finally, providing induction sessions similar to teachers’ sessions, in order to
consolidate skills prior to the beginning of the school year, may be a practical
alternative to waiting for bad habits to form. It may require more intensive
intervention to unlearn ineffective strategies and replace them with effective ones
than to train well in the beginning. This may be feasible for many districts because
the beginning of the year typically permits one or two in-service days for school
personnel. Although this may be a viable alternative to providing training after the
school year has begun, it remains important that training continue
throughout the school year, beyond the “honeymoon period” in which the student
and instructional assistant are establishing their relationship.
Implications for Policy
Although Federal and State policies encourage the inclusion of students with
disabilities, the practical realities of this mandate have proven difficult, if not
impossible, without the assistance of instructional assistants. With the increasing
employment of instructional assistants, policy makers should take into account the
growing need to prepare and support instructional assistants to provide the most
effective education for students with disabilities. Although amendments from the
1997 and 2004 reauthorizations of IDEA require that states develop policies and
procedures to ensure instructional assistants are adequately trained to assist with the
provision of special education services, the vagueness of the law allows states and local
education agencies to devise their own interpretation of what is considered to be
“appropriate training” (Giangreco et al., 2003).
Notable recent legislation put forth by the California Commission on Teacher
Credentialing deems it necessary to add an autism authorization for teachers
instructing students with autism (Commission on Teacher Credentialing, 2009). It has
become increasingly evident that California is attentive to the drastic increase in
students requiring services for autism spectrum disorders but fails to acknowledge the
likely corresponding rise in employment of instructional assistants necessary to
support such students throughout their school day. This poses a question: If teachers
must be certified to provide adequate instruction to students with autism, shouldn’t
instructional assistants, who spend the majority of the day supporting the student, also
be required to meet some level of competency?
Although California continues to uphold the Title I requirements for hiring
qualified instructional assistants, the state falls short in developing specific standards,
credentialing and training requirements. If California continues to permit
increased reliance on inadequately trained personnel, strict training requirements
and/or certifications should be devised to ameliorate the current disparities in instructional
assistant preparation and continuing education.
It may be that institutions of higher education will devise programs addressing
the required competencies of instructional assistants in order to prepare them
before they enter the classroom. Faculty within institutions of higher
education can play a pivotal role in preparing instructional assistants for their roles
and responsibilities, as well as in teaching them to provide the most effective and
efficient supports within both special and general education classrooms. Similarly, these same
institutions may also wish to include similar preparation programs for pre-service
teachers who will be working with and supervising instructional assistants (Wallace
et al., 2001).
To help policymakers, state education agencies and districts address the need
to devise guidelines for the responsibilities and competencies of a variety of school
personnel, Pickett et al. (2007) have developed guidelines covering the professional,
ethical and legal responsibilities of teachers and instructional assistants, as well as the
management responsibilities of district administrators and school principals. Although this is a
first step toward a more sophisticated and detailed system of accountability, it is also
necessary to ensure that comprehensive systems of professional development are in
place, which continue to provide the necessary pre-service and in-service
opportunities.
State licensing units may wish to consider the findings of this study when
devising standards for instructional assistants. Currently, only 14 states have
incorporated policies that require some level of credentialing or licensure (Pickett et
al., 2003). As state education agencies and local districts continue in their attempts to
institute practices that support initiatives for improving the quality of education and
related services for all students, there is a need to identify policies, procedures and
practices that will strengthen the capacity of instructional assistants to effectively
perform their continually expanding duties and responsibilities (Pickett & Gerlach,
2003; Wallace et al., 2001). Wallace et al. (2001) point out that we are often quick to
institute new initiatives for educational improvements without considering the impact
such changes will have on both the staff and systems that must implement and
support these changes. It is imperative to identify the skills required to implement
such initiatives, the trainings needed to teach these skills and how the systems will
continue to support them.
Implications for Research
The next step for this research is to assess the durability of the treatment
effects from the prescribed intervention over time. For example, future studies should
investigate the maintenance of newly learned skills over the course of the school year
and the necessary amount of coaching to maintain skills. These findings would
provide districts with the information needed to inform decisions about how and
when training and coaching can be provided in order to achieve durable and lasting
behavior change.
Additional research on this particular training package needs to be conducted
with a larger sample of instructional assistants from varying locations to determine
the generalizability of the study’s findings. This would help to identify whether the
prescribed intervention is appropriate for instructional assistants with a range of
characteristics. An expansion of this study could also investigate the generalizability
of instructional assistant skills across different students and environments. For
instance, future studies should focus on how these skills generalize to varying
academic activities that occur throughout the day, which frequently include both
highly structured and loosely structured learning opportunities. Generalization should
also include the transferability of these skills to non-academic activities, such as
social skills training, to determine the effectiveness of this training package on
student non-academic performance in contrast to student academic performance. This
would provide further evidence that the training package produced an overall better
instructional assistant and did not provide a set of discrete skills only applicable to the
targeted student and specific activity.
Independent replications are also needed to investigate both the behavioral
and cognitive measures used to determine instructional assistant effectiveness and their
corresponding effects on student academic engagement. In particular, further
investigation ought to refine the behavioral measure in order to determine the exact
prompts the instructional assistant gives within a 10-second interval. This would require the
use of videotape to precisely code each prompt but would provide stronger evidence
of both the large and small changes in instructional assistant behavior and its relation
to student outcome.
It would also be valuable to develop a measure of student academic growth in
response to changes in instructional assistant behavior. Although this study measured
student academic engagement, it was not designed to assess student academic gains.
A measure of student academic gain would provide substantial evidence that changes
in instructional assistant behavior could produce quantitative changes in student
outcome.
Also, because the vignette measure was experimental, it requires further
replication and validation to examine its accuracy in detecting change in instructional
assistant thinking. It may be beneficial to include a section in which instructional
assistants provide reasoning for their selected score on each of the vignettes. This
may provide further insight into instructional assistant thinking and their ability to
differentiate between effective and ineffective instructional assistant behaviors.
Future research should collect data on both students with disabilities and their
general education peers, in order to compare the rates of independent academic
engagement and challenging behavior for both types of students. For instance, such
research could investigate the number of intervals in which peers are independently
academically engaged, and compare this to the number of intervals in which a student with
disabilities was engaged within the same classroom. This could provide further
evidence of the effectiveness of the training package, as well as useful information
about how students with disabilities differ from or are similar to their peers in the
general education classroom, in terms of independent academic engagement and
challenging behavior.
Given the very positive effects of the training package, a component analysis
would provide additional information regarding the effects of each component of
training, to determine which components correlate with the largest treatment gains. In
light of continual budget cuts to education, this may provide increasingly valuable
information regarding the minimum components required to produce significant
changes in instructional assistant effectiveness.
In summary, the shift towards inclusive education for students with
disabilities has resulted in an increase in both the employment of and reliance on
instructional assistants. This is particularly evident in the general education setting
where it is frequently presupposed that the inclusion of a student with disabilities
requires the accompaniment of an instructional assistant (Causton-Theoharis &
Malmgren, 2005; Egilson & Traustadottir, 2009; Giangreco & Broer, 2007).
Although the support of such personnel provides many benefits for a student with
disabilities in a general education classroom, past and current research has
questioned the appropriateness of assigning the least qualified personnel to support a
student with the most intricate and complex learning needs (Giangreco, 2009). This
concern becomes particularly controversial in light of our limited understanding of
how instructional assistant support affects student outcomes.
Consequently, researchers have emphasized the importance of effective and
efficient trainings for instructional assistants, as well as the need to examine the
relationship between instructional assistant use and student outcome (Deardorff et al.,
2007; Giangreco et al., 1999; Petscher & Bailey, 2006; Young et al., 1997). This
study provides preliminary evidence that a brief instructional assistant training
package incorporating empirically validated training approaches can increase
instructional assistant effectiveness.
Past research in this area has also overlooked the increasing need for
investigation of the effects of carefully-designed and job-specific training packages
on instructional assistants’ ability to foster student independence and engagement.
Although further replication is required to validate the specific type of training
package used in this study, districts can no longer afford to ignore the impact, both
positive and negative, that instructional assistants have on the education of students
with disabilities. Continual research into and validation of proper training protocols
for instructional assistant effectiveness will increase the likelihood that students with
disabilities can integrate successfully into general education classrooms.
REFERENCES
Alan, S. (2010). Section 2: Cues, Prompts, and Objectives: Author Stream.
Alberto, P. A., & Troutman, A. C. (2006). Applied Behavior Analysis for Teachers
(7th ed.). Columbus, Ohio: Pearson Prentice Hall.
Alves, A. J., & Gottlieb, J. (1986). Teacher interactions with mainstreamed
handicapped students and their nonhandicapped peers. Learning Disability
Quarterly, 9(1), 77-83.
Arco, L. (1991). Effects of outcome performance feedback on maintenance of client
and staff behavior in a residential setting. Behavioral Residential Treatment,
6(4), 231-247.
Arco, L. (2008). Feedback for improving staff training and performance in behavioral
treatment programs. Behavioral Interventions, 23, 39-64.
Arco, L., & Millett, R. (1996). Maintaining instructional behavior after on-the-job
training with process-based performance feedback. Behavior Modification,
20(3), 300-320.
Azrin, N. H., & Armstrong, P. M. (1973). The "mini-meal"-A method for teaching
eating skills to the profoundly retarded. Mental Retardation, 11, 9-13.
Azrin, N. H., Schaeffer, R. M., & Wesolowski, M. D. (1976). A rapid method of
teaching profoundly retarded persons to dress by a reinforcement-guidance
method. Mental Retardation, 14(6), 29-33.
Baker, S. D., Lang, R., & O'Reilly, M. (2009). Review of video modeling with
students with emotional and behavioral disorders. Education & Treatment of
Children, 32(3), 402-420.
Bakken, J. P., Whedon, C. K., Aloia, G. F., Aloia, S. D., & Edmonds, C. (2001).
Characteristics of effective teachers: Perceptions of students and professionals
in the field. Journal of Asia-Pacific Special Education, 1(2), 19-32.
Bardin, J. A., & Lewis, S. (2008). A survey of the academic engagement of students
with visual impairment in general education classes. Journal of Visual
Impairment and Blindness, 102(8), 472-483.
Batu, S., Ergenekon, Y., Erbas, D., & Akmanoglu, N. (2004). Teaching pedestrian
skills to individuals with developmental disabilities. Journal of Behavioral
Education, 13(3), 147-164.
Bingham, M. A., Spooner, F., & Browder, D. (2007). Training paraeducators to
promote the use of augmentative and alternative communication by students
with significant disabilities. Education and Training in Developmental
Disabilities, 42(3), 339-352.
Blalock, G. (1991). Paraprofessionals: Critical team members in our special education
programs. Intervention in School and Clinic, 36, 200-214.
Bolton, J., & Mayer, M. D. (2008). Promoting the generalization of paraprofessional
discrete trial teaching skills. Focus on Autism and Other Developmental
Disabilities, 23(2), 103-111.
Boomer, L. W. (1994). The utilization of paraprofessionals in programs for students
with disabilities and their teachers. Focus on Autistic Behavior, 9, 1-9.
Breen, C. G., & Haring, T. G. (1991). Effects of contextual competence on social
initiations. Journal of Applied Behavior Analysis, 24(2), 337-347.
Brophy, J., & Good, T. L. (1986). Teacher behavior and student achievement. In
M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 328-375). New
York: Macmillan.
Brown, L., Farrington, K., Knight, T., Ross, C., & Ziegler, M. (1999). Fewer
paraprofessionals and more teachers and therapists in educational programs
for students with significant disabilities. Journal of the Association for
Persons with Severe Handicaps, 24, 249-252.
Bryan, L. C., & Gast, D. L. (2000). Teaching on-task and on-schedule behaviors to
high functioning children with autism via picture activity schedules. Journal
of Autism and Developmental Disorders, 30(6), 553-567.
Bulgren, J. A., & Carta, J. J. (1992). Examining the instructional contexts of students
with learning disabilities. Exceptional Children, 59, 182-191.
Carroll, D. (2001). Considering paraeducator training, roles and responsibilities.
Teaching Exceptional Children, 34(2), 60-64.
Carta, J. J., Greenwood, C. R., Arreaga-Mayer, D., & Terry, B. (1988). Code for
instructional structure and student academic response: Mainstream version
(MS-CISSAR). Kansas City: Juniper Gardens Children's Project, Bureau of
Child Research, University of Kansas.
Causton-Theoharis, J. N., & Malmgren, K. W. (2005). Increasing peer interactions for
students with severe disabilities via paraprofessional training. Council for
Exceptional Children, 71(4), 431-444.
CDE Policy and Evaluation Division. (2010). Local Educational Agency (LEA)
Overview: 2009 Adequate Yearly Progress (AYP) Report: California
Department of Education: Dataquest.
Chadwick, B. A., & Day, R. C. (1971). Systematic reinforcement: Academic
performance of underachieving students. Journal of Applied Behavior
Analysis, 4(4), 311-319.
Chafouleas, S. M., Briesch, A. M., Riley-Tillman, T. C., Christ, T. J., Black, A. C., &
Kilgus, S. P. (2010). An investigation of the generalizability and dependability
of Direct Behavior Rating Single Item Scales (DBR-SIS) to measure academic
engagement and disruptive behavior of middle school students. Journal of
School Psychology, 48(3), 219-246.
Chalk, K., & Bizo, L. A. (2004). Specific praise improves on-task behaviour and
numeracy enjoyment: A study of year four pupils engaged in the numeracy
hour. Educational Psychology in Practice, 20(4), 335-351.
Codding, R. S., Feinberg, A. B., Dunn, E. K., & Pace, G. M. (2005). Effects of
immediate performance feedback on implementation of behavior support
plans. Journal of Applied Behavior Analysis, 38, 205-219.
Coe, D., Matson, J., Fee, V., Manikam, R., & Linarello, C. (1990). Training
nonverbal and verbal play skills to mentally retarded and autistic children.
Journal of Autism and Developmental Disorders, 20(2), 177-187.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.).
Hillsdale, NJ: Lawrence Earlbaum Associates.
Collins, B. C., Branson, T. A., Hall, M., & Rankin, S. W. (2001). Teaching secondary
students with moderate disabilities in an inclusive academic setting. Journal
of Developmental and Physical Disabilities, 13(1), 41-59.
Commission on Teacher Credentialing. (2009). Press Release: Commission Approves
New Autism Authorization for Teachers. Sacramento, Ca:
www.ctc.ca.gov/briefing-room/pdf/PR-2009-01-30-a.pdf.
Conroy, M. A., Asmus, J. M., Ladwig, C. N., Sellers, J. A., & Valcante, G. (2004).
The effects of proximity on the classroom behaviors of students with autism in
general education settings. Behavioral Disorders, 29(2), 119-129.
Conroy, M. A., Sutherland, K. S., Snyder, A., Al-Hendawi, M., & Vo, A. (2009).
Creating a Positive Classroom Atmosphere: Teachers' Use of Effective Praise
and Feedback. Beyond Behavior, 18, 18-26.
Cooper, D. H., & Spence, D. L. (1990). Maintaining at-risk children in general
education settings: Initial effects of individual differences and classroom
environment. Exceptional Children, 54, 117-128.
Cooper, J. C., Heron, T. E., & Heward, W. L. (1987). Applied Behavior Analysis.
Upper Saddle River, New Jersey: Prentice-Hall, Inc.
Cooper, J. O., Heron, T. E., & Heward, W. L. (2007). Applied Behavior Analysis (2nd
ed.). Upper Saddle River, New Jersey: Prentice-Hall, Inc.
Council for Exceptional Children. (2001). National Clearinghouse for Professions in
Special Education. Retrieved June 8th, 2009 from http://www.special-edcareers.org/career_choices/profiles/professions/para_edu.html
Cuvo, A. (1981). Teaching laundry skills to mentally retarded students. Education
and Training of the Mentally Retarded, 16(1), 54-64.
Darling-Hammond, L., Wei, R. C., Andree, A., Richardson, N., & Orphanos, S.
(2009). Professional learning in the learning profession: A status report on
teacher development in the United States and abroad: National Staff
Development Council and The School Redesign Network.
Data Accountability Center. (2009a). Personnel employed (FTE) to provide special
education and related services to children and students 3 through 21 under
IDEA, Part B, by personnel type, certification status and state: Fall 2007.
Washington D.C.: U.S. Department of Education, Office of Special Education
Programs, Data Analysis Systems (DANS) Available from
http://www.ideadata.org/arc_toc10.asp#partbCC.
Data Accountability Center. (2009b). Table 2-2. Students ages 6 through 21 served
under IDEA, Part B, by educational environment and state: Fall 2007.
Washington D.C.: U.S. Department of Education, Office of Special Education
Programs, Data Analysis Systems (DANS) Available from
http://www.ideadata.org/TABLES31ST/AR_2-2.htm.
Deardorff, P., Glasenapp, G., Schalock, M., & Udell, T. (2007). TAPS: An innovative
professional development program for paraeducators working in early
childhood special education. Rural Special Education Quarterly, 26(3), 3-14.
Demchak, M. (1990). Response prompting and fading methods: A review. American
Journal of Mental Retardation, 94(6), 603-615.
Dennison, P., & Dennison, G. (1980). Brain Gym International.
Devlin, P. (2005). Effects of continuous improvement training on student interaction
and engagement. Research & Practice for Persons with Severe Disabilities,
30(2), 47-59.
Diorio, M. S., & Konarski, E. A. (1984). Evaluation of a method for teaching dressing
skills to profoundly mentally retarded persons. American Journal of Mental
Deficiency, 89, 307-309.
Dooley, D. M., & Schoen, S. F. (1988). Decreasing talking-out and increasing
academic behavior in a 7-year-old child (No. ED299724): La Salle
University.
Downing, J. E., Ryndak, D. L., & Clark, D. (2000). Paraeducators in inclusive
classrooms: Their own perceptions. Remedial and Special Education, 21(3),
171-181.
Doyle, M. B. (1995). A qualitative inquiry into the roles and responsibilities of
paraeducators who support students with severe disabilities in inclusive
classrooms: Unpublished doctoral dissertation, University of Minnesota,
Minneapolis.
Doyle, P. M., Wolery, M., Ault, M. J., & Gast, D. L. (1988). System of least prompts:
A literature review of procedural parameters. Journal of the Association for
Persons with Severe Handicaps, 13, 28-40.
Doyle, P. M., Wolery, M., Gast, D. L., Ault, M. J., & Wiley, K. (1990). Comparison
of constant time delay and the system of least prompts in teaching
preschoolers with developmental delays. Research in Developmental
Disabilities, 11, 1-22.
Dunlap, G., & Johnson, J. (1985). Increasing independent responding of autistic
children with unpredictable supervision. Journal of Applied Behavior
Analysis, 18, 227-236.
Dunlap, G., Koegel, R. L., & Johnson, J. (1987). Maintaining performance of autistic
children in community settings with delayed contingencies. Journal of
Applied Behavior Analysis, 20, 185-191.
Ed-Data. (2010). District Reports: Education Data Partnership.
Egilson, S. T., & Traustadottir, R. (2009). Assistance to pupils with physical
disabilities in regular schools: promoting inclusion or creating dependency?
European Journal of Special Needs Education, 24(1), 21-36.
Engelman, K. K., Altus, D. E., Mosier, M. C., & Mathews, R. M. (2003). Brief
training to promote the use of less intrusive prompts by nursing assistants in a
dementia care unit. Journal of Applied Behavior Analysis, 36, 129-132.
Erbas, D., Tekin-Iftar, E., & Yucesoy, S. (2006). Teaching special education teachers
how to conduct functional analysis in natural settings. Education and Training
in developmental disabilities, 41(1), 28-36.
Etscheidt, S. (2005). Paraprofessional services for students with disabilities: A legal
analysis of issues. Research & Practice for Persons with Severe Disabilities,
30(2), 60-80.
Ferguson, E., & Houghton, S. (1992). The effects of contingent teacher praise, as
specified by Canter's Assertive Discipline Programme, on children's on-task
behaviour. Educational Studies, 18(1), 83-94.
Ferritor, D. E., Buckholdt, D., Hamblin, R. L., & Smith, L. (1972). The noneffects of
contingent reinforcement for attending behavior on work accomplished.
Journal of Applied Behavior Analysis, 5(1), 7-17.
Fleming, R. K., & Sulzer-Azaroff, B. (1989). Enhancing the quality of teaching by
direct care staff through performance feedback on the job. Behavioral
Residential Treatment, 4(4), 377-393.
Freeland, J. T., & Noell, G. H. (1999). Maintaining accurate math responses in
elementary school students: The effects of delayed intermittent reinforcement
and programming common stimuli. Journal of Applied Behavior Analysis,
32(2), 211-215.
French, N. (1998). Working together: Resource teachers and paraeducators. Remedial
and Special Education, 19, 357-368.
French, N. (1999a). Paraeducators and teachers: Shifting roles. Teaching Exceptional
Children, 32, 65-69.
French, N. (1999b). Paraeducators: Who are they and what do they do. Teaching
Exceptional Children, 32, 65-69.
French, N. K. (2001). Supervising paraprofessionals: A survey of teacher practices.
The Journal of Special Education, 35(1), 41-53.
French, N. K., & Pickett, A. L. (1997). Paraprofessionals in special education: Issues
for teacher educators. Teacher Education and Special Education, 20(1), 61-73.
Garet, M. S., Cronen, S., Eaton, M., Kurki, A., Ludwig, M., Jones, W., et al. (2008).
The impact of two professional development interventions on early reading
instruction and achievement. Retrieved from
ies.ed.gov/ncee/pubs/20084030/.
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What
makes professional development effective? Results from a national sample of
teachers. American Educational Research Journal, 38(4), 915-945.
George, H. P., & Kincaid, D. K. (2008). Building district-level capacity for Positive
Behavior Support. Journal of Positive Behavior Interventions, 10(1), 20-32.
Giangreco, M., & Broer, S. (2007). School-based screening to determine overreliance
on paraprofessionals. Focus on Autism and Other Developmental Disabilities,
22(3), 149-158.
Giangreco, M. F. (2009). Critical issues brief: Concerns about the proliferation of
one-to-one paraprofessionals. Arlington, VA: Council for Exceptional
Children, Division on Autism and Developmental Disabilities
Giangreco, M. F., Backus, L., CichoskiKelly, E., Sherman, P., & Mavropoulos, Y.
(2003). Paraeducator training materials to facilitate inclusive education: Initial
field-test data. Rural Special Education Quarterly, 22(1), 17-27.
Giangreco, M. F., & Broer, S. M. (2005a). Questionable utilization of
paraprofessionals in inclusive schools: Are we addressing symptoms or causes?
Focus on Autism and Other Developmental Disabilities, 20(1), 10-26.
Giangreco, M. F., & Broer, S. M. (2005b). Questionable utilization of
paraprofessionals in inclusive schools: Are we addressing symptoms or
causes? Focus on Autism and Other Developmental Disabilities, 20(1), 10-26.
Giangreco, M. F., Broer, S. M., & Edelman, S. W. (1999). The tip of the iceberg:
Determining whether paraprofessional support is needed for students with
disabilities in general education settings. The Journal of the Association for
Persons with Severe Handicaps, 24, 280-290.
Giangreco, M. F., Broer, S. M., & Edelman, S. W. (2001a). Teacher engagement with
students with disabilities: Differences between paraprofessional service
delivery models. Journal of the Association for Persons with Severe
Handicaps, 26(2), 75-86.
Giangreco, M. F., Broer, S. M., & Edelman, S. W. (2001b). Teacher engagement with
students with disabilities: Differences between paraprofessional service
delivery models. Journal of the Association for Persons with Severe
Handicaps, 26(2).
Giangreco, M. F., Broer, S. M., & Edelman, S. W. (2002). "That was then, this is
now!" Paraprofessional supports for students with disabilities in general
education classrooms. Exceptionality, 10(1), 47-64.
Giangreco, M. F., Edelman, S. W., & Broer, S. M. (2001). Respect, appreciation, and
acknowledgment of paraprofessionals who support students with disabilities.
The Council for Exceptional Children, 67(4), 485-498.
Giangreco, M. F., Edelman, S. W., Broer, S. M., & Doyle, M. B. (2001).
Paraprofessional support of students with disabilities: Literature from the past
decade. Council for Exceptional Children, 68(1), 45-63.
Giangreco, M. F., Edelman, S. W., Luiselli, T. E., & MacFarland, S. Z. C. (1997).
Helping or hovering? Effects of instructional assistant proximity on students
with disabilities. Exceptional Children, 64(1), 7-18.
Giangreco, M. F., Hurley, S. M., & Suter, J. C. (2009). Special education personnel
utilization and general class placement of students with disabilities: Ranges
and ratios. Intellectual and Developmental Disabilities, 47, 53-56.
Giangreco, M. F., Smith, C. S., & Pinckney, E. (2006). Addressing the
paraprofessional dilemma in an inclusive school: A program description.
Research & Practice for Persons with Severe Disabilities, 31(3), 215-229.
Giangreco, M. F., Yan, S., McKenzie, B., Cameron, P., & Fialka, J. (2005). "Be
careful what you wish for...": Five reasons to be concerned about the
assignment of individual paraprofessionals. Teaching Exceptional Children,
37, 28-34.
Greenwood, C. R. (1991). Longitudinal analysis of time, engagement, and
achievement in at-risk versus nonrisk students. Exceptional Children, 57, 521-535.
Greenwood, C. R., Carta, J. J., Kamps, D., & Arreaga-Mayer, C. (1990).
Ecobehavioral analysis of classroom instruction. In S. Schroeder (Ed.),
Ecobehavioral analysis and developmental disabilities (pp. 33-63).
Baltimore: Paul H. Brookes.
Greenwood, C. R., Horton, B. T., & Utley, C. A. (2002). Academic engagement:
Current perspectives and research and practice. School Psychology Review,
31(3), 328-349.
Guskey, T., & Yoon, K. (2009). What works in professional development? Phi Delta
Kappa, 90, 495-500.
Guskey, T. R. (1986). Staff development and the process of teacher change.
Educational Researcher, 15(10), 12-15.
Hall, L. J., McClannahan, L. E., & Krantz, P. J. (1995). Promoting independence in
integrated classrooms by teaching aides to use activity schedules and
decreased prompts. Education and Training in Mental Retardation and
Developmental Disabilities, 30(3), 208-217.
Hall, R. V., Lund, D., & Jackson, D. (1968). Effects of teacher attention on study
behavior. Journal of Applied Behavior Analysis, 1(1), 1-12.
Harchik, A. E., Sherman, J. A., Hopkins, B. L., Strouse, M. C., & Sheldon, J. B.
(1989). Use of behavioral techniques by paraprofessional staff: A review and
proposal. Behavioral Residential Treatment, 4(4), 331-352.
Harper, L. V., & McCluskey, K. S. (2003). Teacher-child and child-child interactions
in inclusive preschool settings: do adults inhibit peer interactions? Early
Childhood Research Quarterly, 18, 163-184.
Hiralall, A. S., & Martens, B. K. (1998). Teaching classroom management skills to
preschool staff: The effects of scripted instructional sequences on teacher and
student behavior. School Psychology Quarterly, 23(2), 94-115.
Hollowood, T. M., Salisbury, C. L., Rainforth, B., & Palombaro, M. M. (1994). Use
of instructional time in classrooms serving students with and without severe
disabilities. Exceptional Children, 61(3), 242-253.
Horner, R. D., & Keilitz, I. (1975). Training mentally retarded adolescents to brush
their teeth. Journal of Applied Behavior Analysis, 8(3), 301-309.
Hughes, M. T., & Valle-Riestra, D. M. (2008). Responsibilities, preparedness, and
job satisfaction of paraprofessionals: working with young children with
disabilities. International Journal of Early Years Education, 16(2), 163-173.
Hunt, P., Farron-Davis, F., Beckstead, S., Curtis, D., & Goetz, L. (1994). Evaluating
the effects of placement of students with severe disabilities in general
education versus special classes. Journal of the Association for Persons with
Severe Handicaps, 19(3), 200-214.
Jones, K. H., & Bender, W. N. (1993). Utilization of paraprofessionals in special
education: A review of the literature. Remedial and Special Education, 14, 7-14.
Kamps, D. M., Leonard, B. R., Dugan, E. P., Boland, B., & Greenwood, C. R. (1991).
The use of ecobehavioral assessment to identify naturally occurring effective
procedures in classrooms serving students with autism and other
developmental disabilities. Journal of Behavioral Education, 1(4), 367-397.
Katsiyannis, A., Hodge, J., & Lanford, A. (2000). Paraeducators: Legal and practice
considerations. Remedial and Special Education, 21, 5.
Keefe, E. (1994). Ecobehavioral analysis of integrated settings for students with
moderate to profound disabilities. The University of New Mexico,
Albuquerque, New Mexico.
Kohler, F. W., Anthony, L. J., Steghner, S. A., & Hoyson, M. (2001). Teaching social
interaction skills in the integrated preschool: An examination of naturalistic
tactics. Topics in Early Childhood Special Education, 21, 93-113.
Kraemer, B. R., Morton, K., & Wright, D. B. (1998). Prompting. Southern California:
Diagnostic Center & California Department of Education.
Lamont, I. L., & Hill, J. L. (1991). Roles and responsibilities of paraprofessionals in
regular elementary classrooms. Journal of Special Education, 15(1), 1-24.
Lane, K. L., Fletcher, T., Carter, E. W., Dejud, C., & DeLorenzo, J. (2007).
Paraprofessional-led phonological awareness training with youngsters at risk
for reading and behavioral concerns. Remedial and Special Education, 28(5),
266-276.
Lang, M., & Fox, L. (2003). Breaking with tradition: Providing effective professional
development for instructional personnel supporting students with severe
disabilities. Teacher Education and Special Education, 26(1), 17-26.
Leach, D. J., & Conto, H. (1999). The additional effects of process and outcome
feedback following brief in-service teacher training. Educational Psychology,
19(4), 441-462.
Leblanc, M.-P., Ricciardi, J. N., & Luiselli, J. K. (2005). Improving discrete trial
instruction by paraprofessional staff through abbreviated performance
feedback intervention. Education and Treatment of Children, 28(1), 76-82.
Lee, J., O'Shea, L., & Dykes, M. K. (1987). Teacher wait-time: Performance of
developmentally delayed and non-delayed young children. Education &
Training in Mental Retardation, 22(3), 176-184.
Lerman, D. C., Vorndran, C. M., Addison, L., & Kuhn, S. C. (2004). Preparing
teachers in evidence-based practices for young children with autism. School
Psychology Review, 33(4), 510-526.
Logan, K. R., Bakman, R., & Keefe, E. B. (1997). Effects of instructional variables
on engaged behavior of students with disabilities in general education
classrooms. Exceptional Children, 63(4), 481-497.
Logan, K. R., & Malone, D. M. (1998). Comparing instructional contexts of students
with and without severe disabilities. Exceptional Children, 64(3), 343-358.
Loucks-Horsley, S., Love, N., Stiles, K., Mundry, S., & Hewson, P. (2003).
Designing professional development for teachers of science and mathematics
(2nd ed.). Thousand Oaks, CA: Corwin Press.
Lowe, T. (1974). The use of verbal reinforcement by paraprofessionals in the
treatment of underachieving elementary school students. Journal of Student
Personnel Association for Teacher Education, 12(3), 95-101.
Maag, J. W., & Larson, P. J. (2004). Training a general education teacher to apply
functional assessment. Education and Treatment of Children, 27(1), 26-36.
MacDuff, G. S., Krantz, P. J., & McClannahan, L. E. (1993). Teaching children with
autism to use photographic activity schedules: Maintenance and generalization
of complex response chains. Journal of Applied Behavior Analysis, 26(1), 89-97.
MacDuff, G. S., Krantz, P. J., & McClannahan, L. E. (2001). Prompts and Prompt-Fading Strategies for People with Autism. In C. Maurice, G. Green & R. M.
Foxx (Eds.), Making a Difference: Behavioral Intervention for Autism.
Austin, TX: Pro-Ed.
Malmgren, K. W., & Causton-Theoharis, J. N. (2006). Boy in the bubble: Effects of
paraprofessional proximity and other pedagogical decisions on the interactions
of a student with behavioral disorders. Journal of Research in Childhood
Education, 20(4), 301-312.
Manley, K., Collins, B. C., Stenhoff, D. M., & Kleinert, H. (2008). Using a system of
least prompts procedure to teach telephone skills to elementary students with
cognitive disabilities. Journal of Behavioral Education, 17, 221-236.
Marks, S. U., Schrader, C., & Levine, M. (1999). Paraeducator experiences in
inclusive settings: Helping, hovering or holding their own? Exceptional
Children, 65(3), 315-318.
Martella, R. C., Nelson, J. R., & Marchand-Martella, N. E. (2003). Managing disruptive
behavior in schools: A schoolwide, classroom, and individualized social
learning approach. Boston: Allyn & Bacon.
Martens, B. K., Lochner, D. G., & Kelly, S. Q. (1992). The effects of variable-interval reinforcement on academic engagement: A demonstration of
matching theory. Journal of Applied Behavior Analysis, 25(1), 143-151.
Massey, N. G., & Wheeler, J. J. (2000). Acquisition and generalization of activity
schedules and their effects on task engagement in a young child with autism in
an inclusive pre-school classroom. Education and Training in Mental
Retardation and Developmental Disabilities, 35(3), 326-335.
Matheson, A. S., & Shriver, M. D. (2005). Training teachers to give effective
commands: Effects on student compliance and academic behaviors. School
Psychology Review, 34(2), 202-219.
McConville, M. L., Hantula, D. A., & Axelrod, S. (1998). Matching training
procedures to outcomes: A behavioral and quantitative analysis. Behavior
Modification, 22(3), 391-414.
McDonnell, J., Thorson, N., McQuivey, C., & Kiefer-O'Donnell, R. (1997).
Academic engaged time of students with low-incidence disabilities in general
education classes. Mental Retardation, 35, 18-26.
McKenzie, A. R., & Lewis, S. (2008). The role and training of paraprofessionals who
work with students who are visually impaired. Journal of Visual Impairment
& Blindness, 102(8), 459-471.
Moore, J. W., Edwards, R. P., Sterling-Turner, H. E., Riley, J., DuBard, M., &
McGeorge, A. (2002). Teacher acquisition of functional analysis
methodology. Journal of Applied Behavior Analysis, 35, 73-77.
Myles, B. S., & Simpson, R. L. (1989). Regular educators' modification preferences
for mainstreaming mildly handicapped children. Journal of Special Education,
22(4), 479-492.
National Center for Education Statistics. (2009a). Staff employed in public and
elementary school systems, by functional area: Selected years, 1949-50
through Fall 2007. Washington, D.C.: U.S. Department of Education,
National Center for Education Statistics, Statistics of State School Systems,
various years; Statistics of Public Elementary and Secondary Schools, various
years; and Common Core of Data (CCD), "State Nonfiscal Survey of Public
Elementary/Secondary Education," 1986–87 through 2007-2008.
National Center for Education Statistics. (2009b). Staff employed in public
elementary and secondary school systems, by type of assignment and state or
jurisdiction: Fall 2007. Washington, D.C.: U.S. Department of Education,
National Center for Education Statistics, Statistics of State School Systems,
various years; Statistics of Public Elementary and Secondary Schools, various
years; and Common Core of Data (CCD), "State Nonfiscal Survey of Public
Elementary/Secondary Education," 1986–87 through 2007-2008.
Nelson, R. R. (1996). Effects of direct instruction, cooperative learning, and
independent learning practices on the classroom behavior of students with
behavioral disorders: A comparative analysis. Journal of Emotional &
Behavioral Disorders, 4(1), 53-63.
Northup, J., Vollmer, T. R., & Serrett, K. (1993). Publication trends in 25 years of the
Journal of Applied Behavior Analysis. Journal of Applied Behavior Analysis,
26(4), 527-537.
Northup, J., Wacker, D. P., Berg, W. K., Kelly, L., Sasso, G., & DeRaad, A. (1994).
The treatment of severe behavior problems in school settings using a technical
assistance model. Journal of Applied Behavior Analysis, 27, 33-47.
Parsons, M. B., & Reid, D. H. (1996). Training basic teaching skills to community
and institutional support staff for people with severe disabilities: A one-day
program. Research in Developmental Disabilities, 17(6), 467-485.
Parsons, M. B., & Reid, D. H. (1999). Training basic teaching skills to paraeducators
of students with severe disabilities. Teaching Exceptional Children, 31(4), 48-54.
Parsons, M. B., Reid, D. H., & Green, C. W. (1993). Preparing direct service staff to
teach people with severe disabilities: A comprehensive evaluation of an
effective and acceptable training program. Behavioral Residential Treatment,
8, 163-185.
Passaro, P. D., Pickett, A. L., Latham, G., & Hongbo, W. (1994). The training and
support needs of paraprofessionals in rural special education. Rural Special
Education Quarterly, 13(4), 2-9.
Pearman, E. L., Huang, A. M., & Mellblom, C. I. (1997). The inclusion of all
students: Concerns and incentives of educators. Education and Training in
Mental Retardation and Developmental Disabilities, 17, 11-20.
Pelios, L. V., MacDuff, G. S., & Axelrod, S. (2003). The effects of a treatment
package in establishing independent work skills in children with autism.
Education and Treatment of Children, 26(1), 1-21.
Petscher, E. S., & Bailey, J. S. (2006). Effects of training, prompting, and self-monitoring on staff behavior in a classroom for students with disabilities.
Journal of Applied Behavior Analysis, 39, 215-226.
Pickett, A. L., & Gerlach, K. (2003). Supervising Paraeducators in Educational
Settings: A Team Approach (2nd ed.). Austin, Texas: Pro-ed.
Pickett, A. L., Gerlach, K., Morgan, R., Likins, M., & Wallace, T. (2007).
Paraeducators in Schools: Strengthening the Educational Team. Austin,
Texas: PRO-ED, Inc.
Pickett, A. L., Likins, M., & Wallace, T. (2003). The Employment & Preparation of
Paraeducators: The State-of-the-Art-2003. New York: Center for Advanced
Study in Education, Graduate School and University Center, City University of
New York.
Prue, D. M., & Fairbank, J. A. (1981). Performance feedback in organizational
behavior management: A review. Journal of Organizational Behavior
Management, 23, 1-16.
Quick, H. E., Holtzman, D. J., & Chaney, K. R. (2009). Professional development
and instructional practice: Conceptions and evidence of effectiveness. Journal
of Education of Students Placed at Risk, 14, 45-71.
Riggs, C. G., & Mueller, P. H. (2001). Employment and utilization of paraeducators
in inclusive settings. The Journal of Special Education, 35(1), 54-62.
Robinson, S. E. (2007). Training Paraprofessionals of Students with Autism to
Implement Pivotal Response Treatment Using a Video Feedback Training
Package. University of California, Santa Barbara.
Roter, D. L., Larson, S., Shinitzky, H., Chernoff, R., Serwint, J. R., Adamo, G., et al.
(2004). Use of an innovative video feedback technique to enhance
communication skills training. Medical Education, 38(2), 145-157.
Rowe, M. B. (1987). Wait time: Slowing down may be a way of speeding up.
American Educator: The Professional Journal of the American Federation of
Teachers, 11(1), 38-43.
Rueda, R., & Monzo, L. D. (2002). Apprenticeship for teaching: Professional
development issues surrounding the collaborative relationship between
teachers and paraprofessionals. Teaching and Teacher Education, 18, 503-521.
Sarokoff, R. A., & Sturmey, P. (2004). The effects of behavioral skills training on
staff implementation of discrete-trial teaching. Journal of Applied Behavior
Analysis, 37, 535-538.
Sawyer, L. M., Luiselli, J. K., Ricciardi, J. N., & Gower, J. L. (2005). Teaching a
child with autism to share among peers in an integrated preschool classroom:
Acquisition, maintenance and social validation. Education and Treatment of
Children, 28(1), 1-10.
Schepis, M. M., Reid, D. H., Ownbey, J., & Parsons, M. B. (2001). Training support
staff to embed teaching within natural routines of young children with
disabilities in an inclusive preschool. Journal of Applied Behavior Analysis,
34, 313-327.
Schoen, S. F. (1986). Assistance procedures to facilitate the transfer of stimulus
control: Review and Analysis. Education and Training of the Mentally
Retarded, 21(1), 62-74.
Sindelar, P. T., Shearer, D. K., Yendol-Hoppey, D., & Liebert, T. W. (2006). The
sustainability of inclusive school reform. Exceptional Children, 72(3), 317-331.
Sindelar, P. T., Smith, M. A., Harriman, N. E., Hale, R. L., & Wilson, R. J. (1989).
Teacher effectiveness in special education programs. The Journal of Special
Education, 20, 195-207.
Spriggs, A. D., Gast, D. L., & Ayres, K. M. (2007). Using picture activity schedule
books to increase on-schedule and on-task behaviors. Education and Training
in Developmental Disabilities, 42(2), 209-223.
Stanley, S. O., & Greenwood, C. R. (1981). CISSAR: Code for Instructional
Structure and Student Academic Response: Observer's manual. Kansas City:
University of Kansas, Juniper Gardens Children's Project, Bureau of Child
Research.
Striefel, S., & Wetherby, B. (1973). Instruction-following behavior of a retarded child
and its controlling stimuli. Journal of Applied Behavior Analysis, 6, 663-670.
Sutherland, K. S., Wehby, J. H., & Copeland, S. R. (2000). Effect of varying rates of
behavior-specific praise on the on-task behavior of students with EBD.
Journal of Emotional and Behavioral Disorders, 8(2), 2-8.
Taubman, M., Brierley, S., Wishner, J., Baker, D., McEachin, J., & Leaf, R. B.
(2001). The effectiveness of a group discrete trial instructional approach for
preschoolers with developmental disabilities. Research in Developmental
Disabilities, 22, 205-219.
Tews, L., & Lupart, J. (2008). Students with disabilities' perspectives of role and
impact of paraprofessionals in inclusive education settings. Journal of Policy
and Practice in Intellectual Disabilities, 5(1), 39-46.
Thompson, R., White, K. R., & Morgan, D. P. (1982). Teacher-student interaction
patterns in classrooms with mainstreamed handicapped students. American
Educational Research Journal, 19(2), 220-236.
Tincani, M., & Crozier, S. (2008). Comparing brief and extended wait-time during
small group instruction for children with challenging behavior. Journal of
Behavioral Education, 17, 79-92.
Tindal, G., & Parker, R. (1987). Direct observation in special education classrooms:
Concurrent use of two instruments and their validity. Journal of Special
Education, 21(2), 43-58.
Tobin, K. (1986). Effects of teacher wait time on discourse characteristics in
mathematics and language arts classes. American Educational Research
Journal, 23(2), 191-200.
U.S. Bureau of Labor Statistics. (2010). Occupational Employment Statistics, Division
of Occupational Employment. Washington, D.C.: Available at
http://www.bls.gov/oes/current/oes258041.htm.
U.S. Department of Education. (2006). Table 1-3. Students ages 6 through 21 served
under IDEA, Part B, by disability category, educational environment and
state: Fall 2006 [Data file]. Washington, DC: Available from
https://www.ideadata.org/PartBdata.asp.
U.S. Department of Education, Office of Special Education and Rehabilitative
Services, & Office of Special Education Programs. (2005). 25th Annual
Report to Congress on the Implementation of the Individuals with Disabilities
Education Act, Vol. 1. Washington, DC.
Valcante, G., Roberson, W., Reid, W. R., & Wolking, W. D. (1989). Effects of wait-time and intertrial interval durations on learning by children with multiple
handicaps. Journal of Applied Behavior Analysis, 22, 43-55.
Vuran, S. (2008). Empowering leisure skills in adults with autism: An experimental
investigation through the most to least prompting procedure. International
Journal of Special Education, 23(1), 174-181.
Wadsworth, D. E., & Knight, D. (1996). Paraprofessionals: The bridge to successful
inclusion. Intervention in School and Clinic, 31, 166-171.
Wallace, M. D., Doney, J. K., Mintz-Resudek, C. M., & Tarbox, R. S. F. (2004).
Training educators to implement functional analyses. Journal of Applied
Behavior Analysis, 37, 89-92.
Wallace, T., Shin, J., Bartholomay, T., & Stahl, B. J. (2001). Knowledge and skills
for teachers supervising the work of paraprofessionals. Exceptional Children,
67(4), 520-533.
West, E. A., & Billingsley, F. (2005). Improving the system of least prompts: A
comparison of procedural variations. Education and Training in
Developmental Disabilities, 40(2), 131-144.
Wheeler, J. J., & Richey, D. D. (2005). Behavior Management: Principles and
Practices of Positive Behavior Supports. Upper Saddle River, New Jersey:
Prentice Hall, Inc.
Wolery, M., & Gast, D. L. (1984). Effective and efficient procedures for the transfer
of stimulus control. Topics in Early Childhood Special Education, 4(3), 52-77.
Wood, A. L., Luiselli, J. K., & Harchik, A. E. (2007). Training instructional skills
with paraprofessional service providers at a community-based habilitation
setting. Behavior Modification, 31(6), 847-855.
Yell, M. L. (2006). The Law and Special Education (2nd ed.). Upper
Saddle River, New Jersey: Pearson Prentice Hall.
Yoon, K. S., Duncan, T., Lee, S. W.-Y., Scarloss, B., & Shapley, K. L. (2007).
Reviewing the evidence on how teacher professional development affects
student achievement. Washington, DC: U.S. Department of Education,
Institute of Education Sciences, National Center for Education Evaluation and
Regional Assistance, Regional Educational Laboratory Southwest. Retrieved
from http://ies.ed.gov/ncee/edlabs.
Young, B., Simpson, R. L., Myles, B. S., & Kamps, D. M. (1997). An examination of
paraprofessional involvement in supporting inclusion of students with autism.
Focus on Autism and Other Developmental Disabilities, 12(1), 31-38.
Ysseldyke, J. E., Christenson, S., Thurlow, M., & Skiba, R. (1987). Academic
engagement and active responding of mentally retarded, learning disabled,
emotionally disturbed and nonhandicapped elementary students (Report
No. 4). Minneapolis: University of Minnesota, Instructional Alternatives
Project.
Appendix A
Instructional Assistant Survey
General Background Information
Name:
Age:
Ethnicity:
Last completed year of school:
Job Related Information
School employed at:
Student Information
Student disability:
Student age:
Student gender:
Grade:
Number of students you are responsible for:
Personal Experience
Years/months of experience as an instructional assistant:
Years/months of assignment at this school site:
Years/months of experience working in inclusion:
Years/months of experience supporting students in inclusion:
Years/months of experience supporting current student:
Experience & Training
How many times have you participated in district-provided professional development
(defined as a meeting of half a day or more in which the district has provided people to
train you on skills for your job) in the past year?
What other activities have you participated in to increase your knowledge and skills?
Did you have experience with persons with disabilities prior to this job?
What type of training did you receive prior to starting as an instructional assistant?
Please explain the type and the duration (e.g., workshop; on-the-job training from a
teacher, special education teacher, or inclusion specialist; no training)
Did you get additional or other training when you began working in the general
education setting?
What was the most beneficial training you have ever received? Why?
From which one of these have you gotten the most useful information for your job?
Please circle one
Workshops and formal training
Classes and courses
Discussions and supervision with teachers
Experience with a family member
Mentoring or talking with other instructional assistants
Which one best describes you? Please circle one
I feel I understand my student well enough to take the initiative and act with my
student
I feel it is best to wait for instruction from my supervisor prior to taking action with
my student
Appendix B
Instructional Assistant Vignettes
NAME: __________________________ SCHOOL: _________________________
PLEASE include your name on each vignette just in case they get separated!
Background information: Each of these scenarios was written by a supervisor who is
observing each of the instructional assistants working with their assigned student in the
classroom. Each instructional assistant has been working with the current student for about
two years.
Directions: Please read the following vignettes and indicate (by circling the number) how
effective the instructional assistant was.
Vignette # 1
BACKGROUND:
Student: Matt
Age: 9
Grade: 4th
Instructional assistant: Julie
The teacher passes out the math worksheet; Julie is sitting next to Matt. When the
teacher gives Matt the paper, Julie says, “Matt, put your name on your paper.” Matt puts his
name on the paper and then Julie points to the first problem. Julie says, “Let’s go over the
first couple of problems together so you can see what the steps are.” For the first three
problems, Julie goes over the steps of each problem. Julie says, “Remember, first you are
going to line up the numbers so all of the decimals are in line with one another. Next you are
going to add starting with the numbers on the right. If the number is ten or more, remember to
carry.” Matt completes the first three problems while Julie reminds him of the steps.
During the next four problems, Julie says nothing but stops Matt in the middle of one
of the problems. Matt erases part of the problem. After this, Julie tells Matt, “Try the problem
again.” Matt continues to do the problems by himself with no apparent mistakes.
While Matt is working, Julie sits next to him the entire time until Matt is done. When
Matt finishes his math worksheet, he continues to sit in his chair. Shortly after, Julie says
something to Matt and then Matt puts his paper in a box located near the teacher’s desk.
Based upon your experience, how effective do you think Julie is as an instructional assistant?
Please circle the number that indicates your response.
Extremely Ineffective (1)   Very Ineffective (2)   Somewhat Ineffective (3)
Somewhat Effective (4)   Very Effective (5)   Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor who
is observing each of the instructional assistants working with their assigned student in
the classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignettes and indicate (by circling the number)
how effective the instructional assistant was.
Vignette # 2
BACKGROUND:
Student: Sarah
Age: 5
Grade: K
Instructional assistant: Pat
Scenario:
The teacher gives Sarah a writing task that requires her to trace shapes on a
paper. When Sarah gets the paper she writes her name at the top. Sarah finishes
writing her name and then leans back in her chair and sits at her desk. Pat says to
Sarah, “Pick up your pencil and trace the circle.” Sarah continues to sit at her desk.
After several seconds, Pat says to Sarah, “Start tracing the circle” as she points to the
circle on the paper.
Sarah remains sitting at her desk holding the pencil. Next Pat says to Sarah,
“Trace the circle” as she puts her hand on Sarah’s. With her hand on Sarah’s, Pat
moves Sarah’s hand to the circle and positions Sarah’s hand with the pencil on the
dotted line of the circle. Sarah begins to trace the circle and Pat removes her hand
while continuing to watch Sarah.
After Sarah finishes tracing the circle, Sarah puts her pencil down. Pat tells
Sarah “Good job.” When the teacher comes over, Sarah gives her paper to the
teacher.
Based upon your experience, how effective do you think Pat is as an instructional
assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)   Very Ineffective (2)   Somewhat Ineffective (3)
Somewhat Effective (4)   Very Effective (5)   Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor who
is observing each of the instructional assistants working with their assigned student in
the classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignettes and indicate (by circling the number)
how effective the instructional assistant was.
Vignette # 3
BACKGROUND:
Student: William
Age: 11
Grade: 6th
Instructional assistant: Megan
The teacher gives the directions to the class to read a short story and answer
several questions. The teacher passes out the reading materials and tells the students
to “Get started.” Megan sits next to William while he reads the story out loud to her.
Once William is finished reading, Megan says, “It’s time to start working on the
questions.”
William reads the first question out loud. While William writes his first
answer, Megan says, “Think about the story, why did the little boy jump off the
bridge?” William sits and stares at his paper. Megan says, “Look back in the story
and find the part where it says what the boy did.” William looks at the story and
Megan says, “Remember in paragraph two there is something about the boy and the
bridge.” William continues to look at the paper and Megan says, “See, it’s right here”
and points to the part of the story that talks about the boy jumping off the bridge.
Megan then says, “What are you going to write here?” and points to the first
question. William looks at his paper for a second or two. Megan says, “You want to
write something about how the boy jumped off the bridge because he thought it
would be fun.” She then points to the paper and says, “You’ll want to write that right
here.”
Based upon your experience, how effective do you think Megan is as an instructional
assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)   Very Ineffective (2)   Somewhat Ineffective (3)
Somewhat Effective (4)   Very Effective (5)   Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor who
is observing each of the instructional assistants working with their assigned student in
the classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignettes and indicate (by circling the number)
how effective the instructional assistant was.
Vignette # 4
BACKGROUND:
Student: Kyle
Age: 11
Grade: 6th
Instructional assistant: Shelby
The class sits at their desks waiting for instruction on completing a timeline
for a history assignment. Kyle looks at the teacher while she is explaining the
assignment. Shelby is standing behind Kyle. Once the teacher finishes explaining
how to complete the timeline, she passes out the strips of paper.
While the teacher passes out the strips, Shelby approaches and says to Kyle “I
like how you had your eyes on the teacher and you were listening to her instructions.”
Kyle receives his strip and begins writing his name. After writing his name, he begins
drawing a line across his strip. He erases and redraws the line several times while
saying “I can’t get this line straight!” in a loud voice. Shelby approaches Kyle and
says, “What do you think you should do if you need help?” Kyle raises his hand and
the teacher comes over. Kyle tells her he couldn’t get the line straight. The teacher
gives him a ruler. After the teacher walks away, Shelby whispers to Kyle, “That was a
good choice; you raised your hand to get the teacher’s attention. Nice job!” Kyle
continues working on his timeline at his desk.
Once the timeline is complete, Kyle turns in the assignment to the teacher. On
his way back to his desk, Shelby gives him a high five and tells him, “You completed
the assignment by yourself! That’s great!”
Based upon your experience, how effective do you think Shelby is as an instructional
assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)   Very Ineffective (2)   Somewhat Ineffective (3)
Somewhat Effective (4)   Very Effective (5)   Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor who
is observing each of the instructional assistants working with their assigned student in
the classroom. This instructional assistant has been working with the current student
for a year.
Directions: Please read the following vignettes and indicate (by circling the number)
how effective the instructional assistant was.
Vignette # 5
BACKGROUND:
Student: Sammy
Age: 5
Grade: Kindergarten
Instructional assistant: Kathy
Sammy and his class are in the computer lab. Kathy is sitting in a chair next to
Sammy. Sammy pushes the button to turn the computer on; while the computer is
loading, Kathy says, “Sammy, do you know what program you are doing?” Sammy
says, “Yep, I’m doing the bird program!” Kathy says, “That’s right, Sammy!” and gets
up and walks over to another student.
Sammy continues to click on some keys that appear to move birds across the
computer screen. Kathy walks back by Sammy and says, “Oops, you are using the
keys instead of the mouse, let’s try the mouse instead.” Sammy puts his hand on the
mouse and starts moving the birds. Kathy stands back and watches as Sammy moves
the birds to put them into a pattern. When Sammy is unable to move one of the birds,
Kathy moves behind his chair and says “To move the bird you need to click on it and
then drag it, just like you did for the other ones.” Kathy stands back and watches
while Sammy moves the next two birds. Once the birds are in place, Kathy walks
over to another student. After twenty seconds, Kathy comes back and glances at
Sammy’s screen and then walks away again.
Based upon your experience, how effective do you think Kathy is as an instructional
assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)   Very Ineffective (2)   Somewhat Ineffective (3)
Somewhat Effective (4)   Very Effective (5)   Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor who
is observing each of the instructional assistants working with their assigned student in
the classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignettes and indicate (by circling the number)
how effective the instructional assistant was.
Vignette # 6
BACKGROUND:
Student: Betsy
Age: 9
Grade: 4th
Instructional assistant: Norma
All of the students in the class are broken up into small groups of 5 students to
complete a history lesson about missions. The students move into groups while Betsy gets out
of her seat and walks over to her group. As soon as Betsy arrives at her group and takes a seat
Norma tells her “Good job, Betsy.”
After the students work together for some time, the teacher tells the students to
identify the people who lived in the mission and says that they might need their books.
Betsy remains in her seat. After about 10 seconds, Norma says, “Betsy, go get your history
book out of your desk and come back and sit down.” Betsy gets up and gets her history book.
When she comes back and sits down, Norma says, “Great job, Betsy!”
The students open their books to the right page and begin reading. When the students
are done reading the section, they discuss what they would write about their mission. One of
the students says, “Let’s all write about what the mission looks like.” All of the students get
out their pencils and begin writing. Betsy begins writing “the mission is big.” Norma is
watching Betsy and moves toward her and says “Good job, Betsy!”
Another student says, “We should also write about who lived in the mission.” The
students pick up their pencils and begin writing. Betsy also begins writing. Norma says to
Betsy “Good job, Betsy.” The teacher comes over and looks at Betsy’s writing. The teacher
says, “Betsy, I’m not sure what this says, can you write it neater?” Betsy takes her pencil and
erases the last sentence and begins re-writing it. Norma says to Betsy, “Good job, Betsy.”
The teacher tells the students to put their books away and go to the rug. Betsy and the
other students get up and put their books back in their desks. Betsy walks to the rug and sits
down. Norma looks at Betsy and says, “Good job, Betsy.”
Based upon your experience, how effective do you think Norma is as an instructional
assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)   Very Ineffective (2)   Somewhat Ineffective (3)
Somewhat Effective (4)   Very Effective (5)   Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor who
is observing each of the instructional assistants working with their assigned student in
the classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignettes and indicate (by circling the number)
how effective the instructional assistant was.
Vignette # 7
BACKGROUND:
Student: Eric
Age: 10
Grade: 5th
Instructional assistant: Maggie
The class is given a blank map of the United States with each of the states
outlined. The teacher tells the class “Please fill in each of the states on the map during
the next twenty minutes.” Eric gets his pencil from out of his desk and puts his name
on the paper. He sits there for about 5 seconds when Maggie says, “Eric, start filling
in each of the states, you can start with California.” Eric continues to sit at his desk,
staring at his pencil. Maggie then picks up his hand and puts the pencil in his hand
and puts his hand with the pencil on California and says, “Start filling in each of the
states.” Eric writes “California” in the correct state.
Eric moves to Oregon and puts his pencil on the state while saying “This is
Oregon.” Maggie says, “That’s right!” Eric looks away towards another student’s
desk that is disrupting the class. Maggie says “Eric, write Oregon in the state” while
moving his hand onto his pencil and guiding it to Oregon. With Maggie’s assistance,
Eric writes Oregon.
Next, Maggie moves Eric’s hand to Arizona. With her hand over Eric’s, she
begins writing Arizona on the state. Maggie then says, “Eric, go on to Nevada.” Eric
says, “I don’t know where Nevada is.” Maggie puts her hand on top of Eric’s and
guides his hand to pick up the pencil and then to Nevada. Once on Nevada, with her
hand over Eric’s, she physically assists him to write Nevada on the state.
Based upon your experience, how effective do you think Maggie is as an instructional
assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)    Very Ineffective (2)    Somewhat Ineffective (3)
Somewhat Effective (4)    Very Effective (5)    Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor
observing an instructional assistant working with his or her assigned student in the
classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignette and indicate, by circling the
appropriate number, how effective the instructional assistant was.
Vignette # 8
BACKGROUND:
Student: Ryan
Age: 7
Grade: 2nd
Instructional assistant: Sheri
The teacher places a blank science worksheet on water on each student’s
desk. The teacher instructs the class to begin by putting their names on their
papers and then filling in the parts of the water cycle. The other students in the
class get out their pencils and start writing their names on their papers. Sheri, standing
behind Ryan, looks down at him. Ryan continues to sit at his desk looking out the
window.
After about six seconds, Sheri says to Ryan, “Ryan, put your name on the
paper and start working.” After a few seconds, Ryan picks up his pencil and writes
his name. After he writes his name, he puts his pencil down. Sheri stands behind
Ryan and watches him. After a few seconds, Ryan picks up his pencil and begins
filling in the water cycle.
Ryan fills in the parts of the water cycle, puts his pencil down and sits in his
chair looking around the room for several seconds. Sheri says to Ryan, “Now that you
are done, you can get some crayons and color the water cycle in.” Ryan stands up and
goes to get the crayons.
He gets the crayons; Sheri smiles and nods at him as he returns to his seat.
Ryan does not immediately start coloring. Sheri watches Ryan but
does not say anything to him. After about three seconds Ryan picks out a blue crayon
and begins coloring the sky. Sheri continues to watch as he completes the assignment.
As Ryan completes the assignment, Sheri tells Ryan, “You did a nice job finishing the
assignment on your own.”
Based upon your experience, how effective do you think Sheri is as an instructional
assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)    Very Ineffective (2)    Somewhat Ineffective (3)
Somewhat Effective (4)    Very Effective (5)    Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor
observing an instructional assistant working with his or her assigned student in the
classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignette and indicate, by circling the
appropriate number, how effective the instructional assistant was.
Vignette # 9
BACKGROUND:
Student: Jeremy
Age: 9
Grade: 4th
Instructional assistant: Leah
The teacher just finished teaching a lesson on rounding to the nearest dollar.
The teacher then says, “Now you will get a worksheet with various items with prices
on them. You are to add up the items and round to the nearest dollar.” Jeremy gets his
worksheet and gets out his pencil and places it on his desk. After five seconds, Leah
whispers something to him. Jeremy continues to sit at his desk for about four to five
seconds. Leah stands behind Jeremy and says, “Pick up your pencil and write your
name,” while pointing to his pencil on his desk. Leah continues to watch Jeremy but
does not say anything. Jeremy picks up his pencil after two or three seconds and
begins to write his name. Leah says, “Thank you for getting started.”
Once Jeremy writes his name, he puts his pencil down on his desk again.
Jeremy begins looking around the room. Leah continues to watch Jeremy but does not
say anything. After about two seconds, Jeremy picks up his pencil and starts adding
the first group of numbers. Leah gives Jeremy a pat on the back.
Jeremy finishes the first problem and says something to his neighbor. Leah
says to him, “Jeremy, please keep going.” Jeremy looks at her and continues talking
to the other student. Leah stands next to Jeremy’s desk and says, “Jeremy, please
continue doing the problems on the worksheet,” and points to the worksheet. She
stands next to Jeremy’s desk for a few seconds before Jeremy turns back to his
worksheet and begins the second problem. Jeremy completes two more problems.
Leah says, “I like how you took your time and did those problems carefully!”
Based upon your experience, how effective do you think Leah is as an instructional
assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)    Very Ineffective (2)    Somewhat Ineffective (3)
Somewhat Effective (4)    Very Effective (5)    Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor
observing an instructional assistant working with his or her assigned student in the
classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignette and indicate, by circling the
appropriate number, how effective the instructional assistant was.
Vignette # 10
BACKGROUND:
Student: Jose
Age: 6
Grade: 1st
Instructional assistant: LaQuita
Jose and LaQuita are working at a back table on a worksheet with several
clocks on it with blank spots to write in the time. LaQuita says, “Read the directions
and then start with problem #1.” Jose looks at his paper for a few minutes and then
begins with problem #1. LaQuita continues to watch Jose as he completes several
problems. He then says to LaQuita, “I don’t want to do this anymore.” LaQuita says,
“You have to finish it, keep going.” Jose lets out a grunt and begins working on the
worksheet again.
Jose starts fidgeting in his seat and puts his pencil down. LaQuita says, “Keep
going.” Jose starts to fall out of his chair. LaQuita tells Jose “Sit up and finish this.”
Jose sits up and starts on the next problem. After two more problems he pushes the
paper away. LaQuita says, “If you just finish it you’ll be done.” Jose huffs again and
does four more problems.
Jose takes much longer to complete the last problem. As soon as he finishes,
LaQuita says, “You could have gone to free time earlier if you had worked quicker;
you can go play now.” Jose gets up and leaves the table.
Based upon your experience, how effective do you think LaQuita is as an
instructional assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)    Very Ineffective (2)    Somewhat Ineffective (3)
Somewhat Effective (4)    Very Effective (5)    Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor
observing an instructional assistant working with his or her assigned student in the
classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignette and indicate, by circling the
appropriate number, how effective the instructional assistant was.
Vignette # 11
BACKGROUND:
Student: Carlos
Age: 8
Grade: 3rd
Instructional assistant: Betsy
The teacher is teaching a lesson on plant parts. He has an overhead projector
out and a paper with a plant and lines to fill in each part on it. All of the students have
a copy of the same paper on their desks. The teacher gives the instruction to the
students to write their name on the paper. Carlos sits at his desk playing with a piece
of paper he had torn off the side of the worksheet. Betsy immediately says to
Carlos “Please put your name on the paper.” Carlos looks at her and then back to the
paper. Betsy then says, “Carlos, put your name on the paper,” while putting her hand
over his and physically moving it to the pencil on his desk and then to the paper.
Carlos begins writing his name. Once he has written his name, he puts his pencil down
and looks up at the teacher.
The teacher then instructs the students to fill in the blank where the roots
are. Carlos sits at his desk and does not pick up his pencil. After two seconds, Betsy
says, “Carlos, put the word ‘roots’ on the line like the teacher did on his paper.”
Carlos continues to sit at his desk. Betsy again says, “Carlos, write the word roots on
that bottom line.” Carlos does not move but continues to look around the class. Betsy
then says, “Carlos, write roots on your chart or you are going to be behind everyone
else.” Carlos writes the word in the appropriate blank.
Based upon your experience, how effective do you think Betsy is as an instructional
assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)    Very Ineffective (2)    Somewhat Ineffective (3)
Somewhat Effective (4)    Very Effective (5)    Extremely Effective (6)
NAME: _____________________________
Background information: Each of these scenarios was written by a supervisor
observing an instructional assistant working with his or her assigned student in the
classroom. Each instructional assistant has been working with the current student
for about two years.
Directions: Please read the following vignette and indicate, by circling the
appropriate number, how effective the instructional assistant was.
Vignette # 12
BACKGROUND:
Student: Jacob
Age: 11
Grade: 6th
Instructional assistant: Daniella
The class has just finished writing in their journals. The teacher tells the class
to get out their math assignments and finish converting fractions into decimals or
percents. Daniella stands next to Jacob’s desk and says, “Get out your worksheet and
start on number eight where you left off.” Jacob gets out his worksheet, picks up his
pencil and begins writing. Daniella walks over to another student.
Jacob puts down his pencil and sits at his desk for five minutes playing with
his eraser. Daniella looks over to Jacob from across the room and begins walking
over to him. Once by his desk, Daniella says, “Remember what you need to do to
convert the fractions into decimals. Try the next one.” Jacob picks up his pencil and
writes on his paper. Daniella says, “That’s not correct; try it again.” Jacob erases
what he had written and writes something else. Daniella says, “Nice try; let’s do this
one together.” Daniella says, “First you need to divide the bottom number into the top
number, next you multiply by 100 and that gives you a percent.” Jacob writes down
something and Daniella says, “There you go!” Daniella walks away to another
student’s desk while Jacob puts his pencil down on his desk. Jacob sits at his desk for
the remainder of the activity.
Based upon your experience, how effective do you think Daniella is as an
instructional assistant? Please circle the number that indicates your response.
Extremely Ineffective (1)    Very Ineffective (2)    Somewhat Ineffective (3)
Somewhat Effective (4)    Very Effective (5)    Extremely Effective (6)
Appendix C
Direct Observation Instrument
Appendix D
Final Interview for Instructional Assistants
1) Describe how you are currently supporting your assigned student in the
general education setting.
2) How do you determine which method of support to use?
3) How do you know when to use such methods?
4) Are there any methods you would like to use but don’t? (i.e., things you learned
in the training but weren’t able to implement, or other strategies you have
heard about or seen that you would like to use)
a. If yes, why are you not able to use them?
5) What were the easiest strategies to implement from the training?
6) What were the most difficult, if any, strategies to implement from the
training?
7) Do you feel your student’s performance has been affected since the training?
a. If yes, in what ways?
8) Following the training, describe your communication with the general
education teacher. Has it changed since before the training? If so, in what
ways?
Appendix E
Instructional Assistant Social Validity Questionnaire
Name: _______________________School:__________________________________
Please indicate the extent to which you agree or disagree with the following
statements regarding the training by circling the number that most closely reflects
your opinion.
Strongly Disagree (1)    Disagree Somewhat (2)    Neutral (3)    Agree Somewhat (4)    Strongly Agree (5)
1. I feel the components of this training are critical for supporting students with
disabilities in the general education classroom.
1        2        3        4        5
2. Overall, I feel the training was beneficial.
1        2        3        4        5
3. The training was targeted specifically for my job duties.
1        2        3        4        5
4. I would recommend this training to other instructional assistants.
1        2        3        4        5
5. I will use what I learned in the training while working with my assigned student in
the general education class.
1        2        3        4        5
6. I feel that the skills I learned in the training will be easily implemented in the
classroom.
1        2        3        4        5
7. I feel comfortable/confident using the strategies I learned in the classroom.
1        2        3        4        5
8. I feel this training has improved my ability to perform my job duties.
1        2        3        4        5
9. If I use the skills I learned in the training, I will be a more effective instructional
assistant.
1        2        3        4        5
10. I feel this training will allow me to assist a variety of students with and without
disabilities.
1        2        3        4        5
11. I would like to receive this or similar trainings in the future.
1        2        3        4        5
12. What aspects of the training did you find most and least valuable? Put an X in the
box.

                                 Highly       Moderately    Not
                                 Valuable     Valuable      Valuable
Lecture
Power Point
Examples and Non-Examples
Practice
Feedback
Coaching
Other: (please list)

Additional comments:
Appendix F
Observation Questionnaire
PLEASE FILL THIS OUT IF YOU CONSENT TO PARTICIPATE IN THE STUDY!
Dear Instructional Assistant,
During the study, undergraduate research assistants and I will briefly observe you
working with your student. We would like to observe during academic activities for which your
student typically requires your continual assistance/guidance (this means that the student
requires your support, help, and guidance for the majority of the activity).
IF YOU SUPPORT MORE THAN ONE STUDENT in a class, please fill out one form for EACH
student.
Please list two to three activities that you know your student typically struggles with and
for which he or she requires your support.
Your Name: __________________________________ School: ___________________
Best way to reach you: (Phone and/or email): ________________________________
Academic activity _______________________________________________________
Time of activity ________________________________________________________
Day(s) activity is done in classroom ________________________________________
Academic activity________________________________________________________
Time of activity _________________________________________________________
Day(s) activity is done in classroom _________________________________________
Academic activity _______________________________________________________
Time of activity _________________________________________________________
Day(s) activity is done in classroom _________________________________________
Please send this back attached to the consent form!
Appendix G
1st Coaching Form
1st Coaching Session
Please check off items after discussing with IA
Name of IA:
School:
Date:
Time/Activity:
Name of observer:
1. Initial positive statement:
PERFORMANCE-BASED POSITIVE FEEDBACK
& DESCRIPTIVE PRAISE
2. PROMPTING
Types of prompts used:
Prompts to teacher or natural cue:
Uses a sequence of prompts:
Least to most
Most to least
Graduated guidance
Uses appropriate wait time after prompt/between prompts:
2a. ACADEMIC ENGAGEMENT
Prompts student and allows for independent response:
Allows student to work independently:
Notes:
PERFORMANCE-BASED CORRECTIVE FEEDBACK
3. PROMPTING
Types of prompts used:
Needs to prompt to teacher or natural cue:
Repeats same prompt:
Needs to use a prompt sequence:
Least to most
Most to least
Graduated guidance
Needs to use appropriate wait time after prompt/between prompts:
3a. ACADEMIC ENGAGEMENT
Needs to prompt student & allow for an independent response:
Needs to allow student to work independently:
Notes:
4. Suggestions regarding specific scenario that could be improved relating to:
Prompting:
Student academic engagement:
Notes:
5. Concluding positive statement
6. Questions from IA & answers
Goals: 1) Increase student independence, 2) Prompt less, 3) Prompt to teacher, natural cue or
other students, 4) Give specific praise, & 5) Don’t hover
Appendix H
2nd Coaching Form
Last Coaching Session
Please check off items after discussing with IA
Name of IA:
School:
Date:
Time/Activity:
Name of observer:
1. Initial positive statement:
PERFORMANCE-BASED POSITIVE FEEDBACK &
DESCRIPTIVE PRAISE
2. Prompting
Types of prompts used:
Prompts to teacher or natural cue:
Uses a sequence of prompts:
Least to most
Most to least
Graduated guidance
Uses appropriate wait time after prompt/between prompts:
3. Reinforcement
Type of verbal reinforcement (general or descriptive)
Frequency of reinforcement
4. Proximity
Majority of time spent:
Made attempt to change proximity by:
5. Academic Engagement
Prompts student and allows for independent response:
Allows student to work independently:
Notes:
PERFORMANCE-BASED CORRECTIVE FEEDBACK
6. Prompting
Needs to prompt to teacher or natural cue:
Repeats same prompt:
Needs to use a prompt sequence:
Least to most
Most to least
Graduated guidance
Needs to use appropriate wait time after prompt/between prompts:
7. Reinforcement
Needs to use more descriptive verbal reinforcement
Needs to increase frequency of verbal reinforcement
8. Proximity
Could adjust proximity by:
9. Academic Engagement
Needs to prompt student & allow for an independent response:
Needs to allow student to work independently:
Notes:
10. Suggestions regarding specific scenario that could be improved relating to:
Prompting:
Reinforcement:
Proximity:
Student academic engagement:
Notes:
11. Concluding positive statement
12. Questions from IA & answers
Goals: 1) Increase student independence, 2) Prompt less, 3) Prompt to teacher, natural cue or
other students, 4) Give specific praise, & 5) Don’t hover