Defining “Engagement” Within MTSS
Perhaps no term is more widely used and less understood than “Student Engagement (SE)”.
Christenson et al. (2008) defined SE as including a “commitment to and investment in learning,
identification and belonging at school, and, in terms of participation in the school environment
and initiation of an activity to accomplish an outcome, is associated with desired academic,
social, and emotional learning outcomes.” Christenson et al. (2008) further describe SE as
consisting of four subtypes (academic, behavioral, cognitive and psychological) that include a
wide range of distal indicators from absence and problem behaviors to more proximal indicators
such as interest, positive affect, responsibility, and motivation. In addition, SE is impacted by a
range of contextual factors that include the family, peers and school (school-wide and classroom
factors). The resulting understanding of the construct of SE is further complicated by the “lack of
clarity about what is included under the larger umbrella of student engagement” (Christenson et al., 2008) and a tendency to treat SE as solely the responsibility of the student.
The purpose of this paper is to further clarify the scope of the SE construct, to discuss how to conceptualize, utilize, and evaluate SE within a multi-tiered system of support (MTSS) framework, and to describe how to distribute the responsibility for SE across the community, family, school (administration and teachers), and the student.
Defining Student Engagement
Student Engagement (SE) is defined by the degree to which students participate in all
aspects of the school environment (academic and social) and assume the appropriate level
of responsibility for their own learning and behavior.
Clarifying and Extending the Definition
Student engagement is a high priority for successful schools and is critical to student success.
The absence (or low levels) of student engagement results in poor student performance and
significantly reduces the probability of positive post-secondary outcomes. Student engagement
is influenced by the presence or absence of evidence-based and measurable instructional factors,
school/learning environment factors, and parent/community involvement factors. Thus, it is
critically important to view SE as a reciprocal process between those factors and student factors.
Student factors involve measuring student attitudes, beliefs, and skills that influence student
affiliation, goal setting, performance, and the students’ assumption of responsibility for their own
learning outcomes. Measuring student attitudes, perceptions and beliefs could involve
understanding their perceptions of the school environment, connectedness with peers or staff,
and staff caring or supports for their learning. It could involve students’ self-efficacy for engaging in assigned tasks and accomplishing their goals, or an understanding of students’ thoughts on the purpose of their education for the future.
When the word “skills” is used above, it is important to recognize a continuum of skills that
extends from academic performance skills, to academic behavior skills, to social skills. Academic
performance skills are those skills that students use to demonstrate they have reached the
instructional goals of their grade or classroom. Academic behavior skills are a specific set of
behaviors necessary for students to apply within the instructional environment that enable them
to access the teacher’s instruction and meet learning goals (e.g., participate in activities, regulate
attention to teacher instructions, conform to classroom rules and norms, and self-monitor their performance). Social skills are the verbal and emotional behaviors that students use to
build, improve, and sustain positive relationships with their peers, teachers, and
family/community members. Because these skills form a continuum, educators must consider measurement of each type when addressing barriers that keep students from reaching their learning goals.
Deficits in any of these types of skills should be viewed as targets for instruction and supports for
students.
Instructional factors would include classroom instruction (proximal factors) and the classroom-support roles, responsibilities, procedures, and resources used to support instruction in the classroom (distal factors). An examination of instructional factors might include determining
if evidence-based instructional strategies are being used with fidelity and sufficiency. One might
ask if instruction is being differentiated among students based on current performance needs. Is
there evidence that the rules of the classroom are effectively taught and understood by students?
Are educators using and/or supported to use lesson studies to monitor effects of instructional
strategies on student progress towards grade/course standards? Do instructional lesson plans
consider student academic behaviors needed to fully participate in the activities planned and
also consider the curriculum’s relevance and cultural responsiveness for those students?
School/learning environment factors could involve all school-wide roles, responsibilities, and
resources (both leadership and staff) intended to create a safe, supportive, predictable, and
positive school-learning climate. As with instructional factors, some of these factors have a
proximal influence (e.g., classroom routines/expectations) and others a more distal influence on
student engagement (e.g., school climate). An examination of the presence of these factors might
involve analysis of positive supports available to both students and staff. One could seek to
understand staff beliefs and perceptions about the potential for all students to learn. It might
involve understanding how teachers perceive instances of misbehavior or failure to achieve academic goals (i.e., as a signal that a student needs more support, or as a reason to dismiss the student or otherwise lower expectations for their potential to succeed). The responsiveness to student
concerns and the commitment of staff to work together to respond to student needs thoughtfully
and immediately can help create a positive school/instructional environment where students
feel supported and safe. An analysis of the school/learning environment also could involve
perceptions among staff of how well they are supported in creating positive and effective
learning environments in their classrooms. How responsive is the school leadership to the needs
of the staff when working with students of need? Does leadership communicate a clear vision
and mission that emphasizes urgency for specific successful outcomes for all students? Are there
sufficient options for student affiliation within the school and community setting?
Parent/community involvement factors focus on more than just whether or not parents are
participating in meetings. Schools may involve parent presence in school-related decisions and
planning for school improvement. Schools can involve community partnerships to support and
enhance school climate and learning-experience options for students. It could also involve collaborative communication among parents, the community, and educators focused on student achievement and on partnering to help all students reach their learning goals. Schools and teachers may benefit from parent input on the school’s climate and on the value of the communication parents receive about how to support their child’s learning.
Student attitudes, beliefs and skills (behaviors) can be influenced and taught. Together,
student attitudes, beliefs, and skills influence their affiliation (attendance,
connectedness/participation, social networks, leadership), academic goal setting and
performance (productivity and accuracy) and can either promote or hinder the assumption of
responsibility for learning outcomes (i.e., independence). Students must ultimately embrace accountability for their own engagement. When students lack that accountability, educators can influence and teach them how to be accountable and independent learners. The reciprocal relationship among the four categories of SE factors described above demands that any problem-solving process designed to improve student engagement consider ALL of those factors when developing a plan to create an engaging school environment.
Problem-solving Applications
As described above, SE is a complex construct that serves as an umbrella term for multiple
factors that impact students’ success in reaching their educational goals. It is also a
useful construct to help educators understand the reciprocal relationship between academic
performance and student social, emotional, and behavioral skills. While a focus on ensuring high
levels of effective and school-wide student engagement as a proactive/preventative goal is
always encouraged, the following example will illustrate how student engagement may be
addressed as a hypothesis for why students are not reaching their academic learning goals.
It is important to note that the following example focuses on student progress towards end-of-year grade/course objectives for promotion as a primary concern for educators. If the results of examining student academic performance indicate that students are not sufficiently progressing, then teachers would engage in a problem-solving process to develop a plan of action. It is also
important to preface that the following example is intended to be broad and generic, simply providing a general context for applying the above definition and related factors of student engagement (i.e., it is not meant to be a case study). Additional professional
development opportunities and resources are encouraged beyond this paper.
Tier 1 – Literacy Example
Tier 1 Problem ID: This step recognizes that a significant percentage of students at the aggregate school, grade, classroom, or content-area level are not sufficiently progressing towards their literacy goals. This step of PS should also identify the expected goal, and both the problem statement and goal statement should be written in clear, objective, and measurable terms. A final critical element of this step is to make sure the student data are disaggregated sufficiently to target the appropriate population of students.
Template problem statement: ____ % (i.e., specific % less than 80% of population) of ____ students
(total population, subgroup, etc.) in the ______ (school, grade, classroom, language arts courses, etc.) are
meeting their literacy learning goals as evidenced by the ________ (measurement(s) used to
determine “level/rate of progress”) administered ______ (date/month/semester, etc).
Template goal statement: _____ % (i.e., ambitious and realistically higher than current level) of ____
students (same students identified in the problem statement) in the ______ (same unit of analysis as the
problem statement) will meet their literacy learning goals as evidenced by the ________ (same
measurement(s) used to determine “level/rate of progress” in the problem statement) by _______
(date/month/semester, etc.).
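Because both templates reduce to comparing the percentage of a defined student population meeting its goals against a benchmark, the short Python sketch below illustrates how a team might fill them in from screening results. The field names, subgroup, benchmark, and records are hypothetical and used only for illustration.

# Minimal sketch (hypothetical data, field names, and benchmark) of filling in
# the problem and goal statement templates from screening results.

def percent_meeting_goal(records, group=None):
    """Percentage of students in a group who met their literacy goal."""
    pool = [r for r in records if group is None or r["group"] == group]
    if not pool:
        return 0.0
    met = sum(1 for r in pool if r["met_literacy_goal"])
    return 100.0 * met / len(pool)

# Illustrative screening records only; not real data.
screening = [
    {"student": "A", "group": "grade 3", "met_literacy_goal": True},
    {"student": "B", "group": "grade 3", "met_literacy_goal": False},
    {"student": "C", "group": "grade 3", "met_literacy_goal": False},
    {"student": "D", "group": "grade 3", "met_literacy_goal": False},
    {"student": "E", "group": "grade 3", "met_literacy_goal": True},
]

current = percent_meeting_goal(screening, group="grade 3")  # 40% in this sketch
goal = 80.0  # benchmark consistent with the 80% figure used later in this paper

print(f"Problem: {current:.0f}% of grade 3 students are meeting their literacy "
      f"learning goals as evidenced by the winter screening measure.")
print(f"Goal: {goal:.0f}% of grade 3 students will meet their literacy learning "
      f"goals as evidenced by the same measure by the spring administration.")

Changing the group argument is the disaggregation step described above: the same calculation is simply run over a different slice of the student population.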
Problem Analysis: This step of a structured 4-step problem-solving process involves cross-referencing domains of hypotheses against measurement options. The purpose of this step is to
understand “why” the problem as stated above is occurring. A comprehensive and valid
understanding of “why” the problem is occurring is critical to inform action plan development.
The RIOT/ICEL approach to problem analysis provides the methodology needed to consider the reciprocal relationship among the various SE factors and allows teams to weigh data that are currently available against data that need to be collected and data that have limited usefulness or that measure factors with limited alterability. The following table provides an example of how hypotheses about SE factors might be considered and further exemplifies how teams could weigh the costs and benefits of data-collection methods and data-use value to determine the validity of hypotheses developed by the team.
RIOT/ICEL Hypothesis Development and Analysis
(Rows list sample questions to be answered about Student Engagement factors, organized by ICEL domain; columns list the RIOT data sources: Review Records, Interview, Observe, Test.)

Instruction Hypotheses:
1. (Instructional) Is the classroom instruction provided with fidelity?
2. (Instructional) Is the classroom instruction differentiated to match student needs?

Curriculum Hypotheses:
1. (School/Learn Env.) Are there classroom rules for behavior, and are they aligned with school-wide expectations?
2. (School/Learn Env.) Is there an evidence-based behavior curriculum established?

Environment Hypotheses:
1. (School/Learn Env.) To what degree is the learning environment free of disruptions?
2. (Parent Involve) Do parents perceive teachers as supportive?

Learner Hypotheses:
1. (Student) Are these students exhibiting high levels of absences?
2. (Student) Do students perceive teachers as supportive?

For each question, the team considers which RIOT sources (Review Records, Interview, Observe, Test) would best answer it.
Since “student engagement” is a broad construct encompassing the relationships among several
factors, and because each category of factors has many options for measurement targets and
methods, it is essential that a problem-solving team be specific and strategic in their use of
resources when analyzing student engagement factors as hypotheses for consideration at Tier 1.
Teams should prioritize use of existing/currently available data that measures the most
observable and alterable factors for each of the four types of factors. For example, it may be
worth examining readily available data on absences as a specific hypothesis before considering
student perceptions of school climate if such information is not already available. For some of
the factors listed below there may be multiple assessment options available (i.e., review of
records, interviews, observations, testing). A team should always consider the cost-benefit of
collecting additional data, the measurement method(s) used to collect the information (e.g.,
survey vs. observation), the resources needed to collect that new information, and how that new
information will be used for understanding “why” students may not be sufficiently engaged in
their learning. More importantly, a team should always base its decisions about what data to select, collect, and use on a specific question or set of questions to be answered.
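As one way to make this cost-benefit weighing explicit, the sketch below records each RIOT/ICEL hypothesis along with whether usable data already exist, how alterable the factor is, and the relative cost of new collection, then sorts the hypotheses accordingly. The scoring scheme and example entries are hypothetical and are not part of the RIOT/ICEL methodology itself.

# Minimal sketch (hypothetical scoring scheme) for prioritizing which RIOT/ICEL
# hypotheses to pursue first, favoring factors that are already measured,
# observable, and alterable, and penalizing costly new data collection.

from dataclasses import dataclass
from typing import List

@dataclass
class Hypothesis:
    domain: str            # ICEL: Instruction, Curriculum, Environment, Learner
    question: str          # sample question from the table above
    sources: List[str]     # RIOT: Review records, Interview, Observe, Test
    data_available: bool   # do usable data already exist?
    alterability: int      # 0 (not alterable by the school) .. 3 (highly alterable)
    collection_cost: int   # 0 (already collected) .. 3 (costly new collection)

hypotheses = [
    Hypothesis("Learner", "Are these students exhibiting high levels of absences?",
               ["Review records"], True, 2, 0),
    Hypothesis("Instruction", "Is classroom instruction provided with fidelity?",
               ["Observe", "Interview"], False, 3, 2),
    Hypothesis("Learner", "Do students perceive teachers as supportive?",
               ["Interview", "Test"], False, 2, 3),
]

def priority(h: Hypothesis) -> int:
    """Higher scores indicate better candidates to examine first."""
    return (2 if h.data_available else 0) + h.alterability - h.collection_cost

for h in sorted(hypotheses, key=priority, reverse=True):
    print(f"{priority(h):+d}  [{h.domain}] {h.question}  (sources: {', '.join(h.sources)})")

In this sketch, the readily available absence data rise to the top, mirroring the guidance above to examine existing, observable, and alterable factors before investing in new data collection.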
Plan Development & Implementation:
The development of a Tier 1 improvement plan will likely involve multiple factors identified
from the four domains listed above. The depth of analysis and understanding of the barriers impacting students’ progress achieved during the problem analysis phase can either help or hinder the development of an effective and efficient plan to address significant student
engagement concerns at Tier 1. For example, a school may find that in addition to having
significant absences among its student population, there are high levels of disruptive behaviors
occurring in one or multiple settings and involving high percentages of students. Teachers may
perceive student disruptive behaviors as evidence that parents and students do not care about
student learning, while students and parents perceive teachers are not supportive. There may
also be evidence that instruction is not being provided with fidelity or not being differentiated
within the Tier 1 classroom/grade/course level. And finally, consider if the above scenario also
involved evidence that students and staff poorly understood classroom and school-wide
expectations and rules. While a team may have evidence of all of these factors occurring within
the school, it is critical that a team consider the reciprocal relationship among them before
attempting to implement a plan to improve student progress in literacy as described in the
Problem ID step above. It is less important to determine if the intervention plan should focus on
changing perceptions or changing behaviors first, and more important to consider the reciprocal
relationship among perceptions, practices, and procedures or policies to best drive plan
development and implementation.
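One way to keep that reciprocal relationship visible during plan development is to record, for each targeted factor, the barrier, the strategy, an owner, and a fidelity check across all four SE factor domains. The structure and entries below are hypothetical, sketched from the scenario described above rather than prescribed by any MTSS framework.

# Minimal sketch (hypothetical structure and entries) of a Tier 1 plan that
# pairs each targeted student engagement factor with a strategy, an owner,
# and a fidelity check, spanning all four SE factor domains.

tier1_plan = [
    {"domain": "Instructional",
     "barrier": "Instruction not differentiated",
     "strategy": "Coach-supported differentiation in literacy blocks",
     "owner": "Instructional coach / teachers",
     "fidelity_check": "Walkthrough checklist, twice monthly"},
    {"domain": "School/Learning Environment",
     "barrier": "Expectations poorly understood",
     "strategy": "Re-teach classroom and school-wide expectations",
     "owner": "Administration / all staff",
     "fidelity_check": "Lesson delivery log"},
    {"domain": "Parent/Community Involvement",
     "barrier": "Parents perceive teachers as unsupportive",
     "strategy": "Structured two-way communication routine",
     "owner": "Teachers / family liaison",
     "fidelity_check": "Monthly contact log review"},
    {"domain": "Student",
     "barrier": "High absences",
     "strategy": "Attendance mentoring and re-engagement check-ins",
     "owner": "Student services team",
     "fidelity_check": "Check-in attendance records"},
]

for component in tier1_plan:
    print(f"[{component['domain']}] {component['barrier']} -> {component['strategy']} "
          f"(owner: {component['owner']}; fidelity: {component['fidelity_check']})")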
Evaluate Effectiveness:
There are three broad sources of information that should be considered when evaluating the
effectiveness of a Tier 1 plan designed to address barriers affecting student engagement, which
in turn was hypothesized to be the reason why so many students are not progressing towards
their learning goals in literacy. Those three sources are (a) changes in student performance
towards their literacy goals, (b) degree of fidelity with implementing the Tier 1 plan, and (c)
degree of positive changes in the barriers (i.e., lack of important student engagement factors)
that were targeted in the plan. How a team uses these three sources of information will vary as widely as the possible combinations of outcomes across those sources. As a sample, consider first: is there evidence that the performance of the specific population of students
identified for this particular Tier 1 plan improved sufficiently to reach their literacy goals? There
may be evidence that some students improved sufficiently while others have not. It helps at this step to remain focused on evaluating the degree to which the goal stated in the Problem ID step was reached. If the goal was for at least 80% of students to be progressing
sufficiently towards their literacy goals, and your evidence indicates that the plan resulted in a
change from 40% of students to 60% of students now performing as expected, then a team
would conclude that the plan has had some impact, but not enough to meet the goal.
Understanding whether to continue the plan, change it, or abandon it should rely on the
remaining two sources of information: Fidelity of implementing the plan, and the changes in
factors validated as potential barriers to students being engaged that were targeted in the plan. If
improvements in Tier 1 were insufficient to reach the intended goal, even if the fidelity of
implementing the plan was high, it is important to know whether any improvements were due to positive changes in the barriers (i.e., improvements in the student engagement factors targeted in the plan) that were hypothesized to be preventing students from being successful. What should a
team decide if they have evidence that the plan did not work, while they also have evidence that
the student engagement factors targeted for improvement in the plan did positively change?
Ideally, the team should decide to go back to analyzing the problem further as it appears that the
changes in the factors addressed in the plan were insufficient to reach the plan’s goal(s). Or,
consider a different scenario: What if a team finds the plan did work in reaching the intended
goal(s), but there is no evidence of any changes in the targeted, validated student engagement factors that were considered to be barriers to reaching the goals? That is, student performance improved, but apparently not because of the designed plan. It would be valuable for a team to understand why improvements did occur, even if not due to the designed plan, so that it can address similar problems in the future more effectively and efficiently.
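The decision logic described above can be summarized compactly. The sketch below encodes the three information sources (progress toward the goal, plan fidelity, and change in the targeted barriers) and returns the kind of conclusion discussed in the scenarios above; the wording of the messages is hypothetical.

# Minimal sketch (hypothetical messages) of the three-source evaluation logic
# described above: student outcomes, plan fidelity, and change in the targeted
# student engagement barriers.

def evaluate_tier1_plan(current_pct, goal_pct, fidelity_ok, barriers_improved):
    """Return a suggested next step based on the three information sources."""
    goal_met = current_pct >= goal_pct

    if goal_met and not barriers_improved:
        return ("Goal met, but the targeted SE factors did not change; "
                "investigate why gains occurred before crediting the plan.")
    if goal_met:
        return "Goal met with barrier improvement; continue or maintain the plan."
    if not fidelity_ok:
        return ("Goal not met and fidelity was low; strengthen implementation "
                "before changing the plan itself.")
    if barriers_improved:
        return ("Some impact, but insufficient to meet the goal; "
                "return to problem analysis for additional or different factors.")
    return "Implemented with fidelity, but targeted barriers did not change; revise the plan."

# The worked example above: 60% of students now meeting goals against an 80% goal,
# with high fidelity and positive change in the targeted barriers.
print(evaluate_tier1_plan(60, 80, fidelity_ok=True, barriers_improved=True))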
Summary Statements
As school teams engage in problem-solving around school-wide or individual student issues
targeting the construct of “student engagement”, it is recommended that particular attention be
paid to the following:
1. Perhaps the most effective approach to addressing engagement issues is to implement
school-wide academic and behavioral supports that prevent disengagement. Developing
and implementing effective Tier 1 academic and behavioral supports and expanding the
school’s capacity to deliver a continuum of multi-tiered supports can create a safe,
challenging, and engaging school environment for all students.
2. Understand that the umbrella of SE is wide and that there may be an array of critical data
points to consider when targeting one or a group of indicators of SE or lack of SE. It may
be helpful to clarify as a school the critical data to be measured to determine whether a
school has a significant SE issue and the indicators of that problem (student attitudes,
beliefs and skills). Imprecise identification and measurement of the problem will result in
an ineffective PS process.
3. Problem analysis is also critical to an effective problem-solving process. If school teams
do not consider a full range of variables/factors that are impacting SE and their
alterability, it is likely that the team will identify only factors that they have limited ability
to address, such as those that are reflected in the community or are internal to the student.
The team will then fail to address other critical factors that may be more amenable to
change (curriculum, instruction, school-wide environment, availability and effectiveness
of classroom supports and support personnel, etc.).
4. School teams may want to consider implementing a range of strategies at multiple levels
(community, home, school, classroom, individual student) to have an immediate and
sustained impact on SE at the entire school as well as the individual student levels.
5. The evaluation of students’ response to intervention will need to address 1) the critical
data identified in the problem identification and analyses steps of the PS process to
determine whether the interventions are impacting any or all of the targeted data
indicators and 2) the fidelity of the implementation of the interventions, particularly if
there has been less than a positive response to those interventions.