
Running Head: DELINEATION OF STUDENT RESPONSE SYSTEMS
Delineation of Student Response Systems
Derek Diep
University of Alberta
Department of Psychology
Dr. Sandra Ziolkowski
April 15th, 2015
For Partial Fulfillment of the Requirements to Complete the Internship Program in Psychology
DELINEATION OF STUDENT RESPONSE SYSTEMS
Delineation of Student Response Systems
1. Introduction
The development and adoption of instructional technologies have become increasingly
widespread among educational institutions in recent years. Educational paradigms have been
shifting toward deeper learning, which can be supported by advancing technologies,
powerful mobile devices, learning management systems, and the like. This has been posited to
have great potential in all sectors of academia, but arguably at the post-secondary level
especially, due to generally large class sizes and lower levels of interactivity resultant of the
conventional lecture format. A common critique of educational approaches in use today is that
they often encourage students to memorize large amounts of information, but do not support
them in developing the skills to apply that knowledge meaningfully. Johnson and Meckelborg
(2008) elaborate that lectures are traditionally characterized by passive and
unidirectional communication of information, and that this lack of interaction within the lecture is
what causes Lecturalgia or “painful lecture,” a term coined by McLaughlin and Mandin (2001).
Due to these perceptions, individuals within various levels of the educational system are
beginning to look to the potential of instructional technologies to better support teaching and
learning, especially deeper learning, which is described by Kuh and colleagues as: “(a) attending
to underlying meanings of what they learn; (b) integrating and synthesizing information; (c)
recognizing patterns; (d) applying what they learn; and (e) being able to examine issues from
various perspectives” (as cited in Fortner-Wood et al., 2013). This thinking has led to a flurry of
technological and pedagogical trends within recent years, such as MOOCs, blended learning,
flipped classrooms and the like, and it is of extreme importance to evaluate and determine both
the merits and shortcomings of these instructional technologies and the methodologies that
underlie them in order to refine educational delivery as a whole.
One such technology is the Audience Response System, or within the context of
educational settings, the Student Response System (SRS). While the core of this technology has
existed for several decades, it did not come into prominence until recently. SRS have become a
favourable option amongst some educators as they are relatively time- and cost-efficient, do not
require large institution-wide infrastructural investment, nor do they need to be processed through
the bureaucracy of academia as a large project. As such, the purpose of this review is to inform
instructors, educational institutions, educational designers, and others who are considering
adopting SRS about the characteristics and potential benefits of this technology, and how it can
be used to support teaching and learning. The article will focus on the uses of SRS in the context
of higher education in particular, as this may be where it is most direly needed against the stark
landscape of the conventional lecture, passive learning, and the looming plague of Lecturalgia.
1a. Bloom’s Taxonomy
One viewpoint in which passive learning and lecture can be evaluated and addressed is
through the scope of Bloom’s Taxonomy, a common pedagogical model. In conjunction with
goals of critical thinking and deep learning, one can classify how SRS facilitate specific
educational objectives through the taxonomy’s cognitive domain. In line with the concept of
deep learning, SRS have the potential to build on and extend foundational knowledge by
supporting interactive activities, collaborative discussion and problem-solving amongst peers,
which generally presupposes some foundational knowledge base on which to
build. This provides a valuable opportunity to target the higher levels of the cognitive domain
outlined in the taxonomy, which can be difficult to target within a passive lecture. If instructors
are cognizant of what their learning objectives are and construct suitable questions for students to
work through, this can target the higher order processes such as: “[1] applying: carrying out or
using a procedure through executing or implementing; [2] analyzing: breaking material into
constituent parts, determining how the parts relate to one another and to an overall structure or
purpose through differentiating, organizing and attributing; [3] evaluating: making judgments
based on criteria and standards through checking and critiquing; [4] creating: putting elements
together to form a coherent or functional whole; reorganizing elements into a new pattern or
structure through generating, planning or producing” (Forehand, 2011). SRS can support the
development of these skills by allowing learners to interact with information in different ways by
considering questions and formulating responses.
1b. History and Context
Response systems have existed for several decades, with their use in the classroom reaching
back to the 1960s (Judson & Sawada, 2002). These systems have been made more accessible and
refined over time, with a multitude of variations existing, but the core functionality is largely the
same: they work as voting/response systems for the purpose of presenting questions for an
audience to vote on, to gather feedback, or provide a level of interactivity between a “presenter”
and an “audience.” Responses are collected digitally in real-time and are often displayed visually
such as in the form of a chart or graph. A well-known contemporary example of this is illustrated
by the North American quiz game show, “Who Wants to be a Millionaire?” with the concept of
the lifeline, “Ask the Audience,” where a struggling contestant may pose the question to the
audience and have them vote on the correct answer via electronic remote devices. The responses
are aggregated and displayed for the contestant to see where the audience stands on the question.
There exists a variety of terms for the technology as a whole, such as Audience Response
Systems, Student Response Systems, Personal Response Systems, Classroom Performance
Systems, Classroom Response Systems, Electronic Voting/Polling Systems, Electronic Response
Systems, and Clickers, and this list excludes the names of proprietary versions of the
technology or software created by different companies. Despite the large variety of names, the
meaning of each name is quite self-evident, and the systems are all largely the same,
differing perhaps only in the intended context of usage examined by the research, or at the fancy
of the developing company.
Most modern hardware-based SRS come in the form of small wireless remotes that use
radio or infrared waves, and a small portable receiver terminal for the instructor. The
presentation of questions and responses is facilitated with a computer, a projector, and either a
presentation program such as Microsoft PowerPoint or software specific to the system. The
remotes themselves usually consist of either a number pad or lettered keys for multiple choice
questions. Radio-wave-based systems are the predominant technology, but wireless (web-based)
systems have become popular in recent years as well, and each has different advantages
and disadvantages.
The fundamental merit of SRS technology is that it provides instructors the ability to
quickly conduct formative assessment, and gain insight into where the class immediately stands
on a given concept (Quinn, 2010). The system and instructor are able to provide instantaneous
feedback to students after the polling period, so that students can immediately see how the class
responded, and instructors can identify weak points in understanding. Instructors can then
follow-up and clarify any misconceptions and unclear concepts.
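To make this formative-assessment loop concrete, the aggregation step a typical SRS performs before displaying a results chart can be sketched in a few lines of Python. This is a hypothetical illustration only; the function name, option labels, and vote data are invented for the example and do not reflect any particular commercial system.

```python
from collections import Counter

def tally_responses(responses, options):
    """Aggregate raw multiple-choice responses into counts and percentages,
    roughly as an SRS would before displaying a results chart to the class."""
    counts = Counter(responses)
    total = len(responses)
    return {
        opt: {"count": counts.get(opt, 0),
              "percent": round(100 * counts.get(opt, 0) / total, 1)}
        for opt in options
    }

# Hypothetical poll with options A-D; each element is one student's vote.
votes = ["A", "C", "C", "B", "C", "D", "C", "A", "C", "C"]
results = tally_responses(votes, ["A", "B", "C", "D"])
for opt, r in results.items():
    # The instructor's display would render these as a bar chart.
    print(f"{opt}: {r['count']:2d} votes ({r['percent']}%)")
```

In this invented example, a cluster of votes on a single distractor would be immediately visible to the instructor, signalling a misconception worth addressing before moving on.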
The ability to assess understanding of concepts and adjust lectures is one of the initial,
obvious, and often presented points in support of SRS, but a number of other benefits of using
SRS have been suggested. These include increased attendance (Preszler et al., 2007), increased
levels of engagement, increased participation, improved memory retention, increased academic
performance, and improvements in the ability to generate assessment and feedback. Studies addressing
these effects, along with some of the underlying pedagogy, challenges surrounding using the
technology, limitations of the current research, and suggestions for future research are reviewed
below. The contexts of the reviewed articles that assess SRS consist of experimental,
quasi-experimental, and correlational studies in higher education, primarily looking at the
undergraduate population across a variety of disciplines.
2. Examples and Use
A variety of use cases exist for SRS, but the literature does not have substantial support
for the types of activities or number of uses that maximize benefits; that likely varies
depending on the structure of the class and the nature of the course material. Ways in which SRS
can be used include attendance checks, review questions (which can vary in nature from basic
retention to problem-solving), pop quizzes, the polling of controversial or thought provoking
questions, and the use of any of the aforementioned to facilitate group discussion (Dallaire,
2011). These types of activities help refresh students and gauge their retention of material from
previous lectures, check the understanding of current lecture material intermittently, provide a
more active “break” activity interspersed between sections of passive lecture, stimulate
discussion, and gather feedback on student understanding to see what concepts need to be
reviewed and what concepts students already understand well, all of which assists with efficient
use of lecture time.
SRS can be categorized into four types depending on the nature of the system: infrared
systems, radio frequency systems, wireless systems, and SMS systems.
Infrared systems, like most SRS, require a transmitter device and a receiver, but are the
most rudimentary and most unstable. They require an unobstructed line of sight between the
transmitter and receiver, and can only collect a limited number of responses depending on the
receiver. As such, due to the lack of reliability in infrared systems, they are generally not
suggested as they possess no real benefits over other systems.
Radio frequency systems are the most common type of SRS, and use radio transmissions
via a transmitter and receiver, but are more stable, don’t require direct line-of-sight, and can
collect more simultaneous responses.
Wireless or WiFi based systems utilize an internet connection or wireless network via
wireless devices, such as cellphones, laptops, tablets, etc. Such systems generally utilize a web
browser to send in responses, and the system is managed by the instructor from a web-browser as
well. The merit of this is that, assuming wireless network infrastructure is well developed, there
is no additional cost for set-up or hardware, aside from some wireless SRS services that have
subscription fees. Nonetheless, there are completely free alternatives such as Socrative and
Kahoot, both of which can be accessed through a web browser, or by using their official apps,
which are currently available in the app stores for both iOS and Android. While wireless systems
can allow for open-ended responses, they also have disadvantages that can make their widespread
adoption difficult, such as the requirement of a web-enabled device for each student versus a
lower-cost proprietary remote. Studies and data collection regarding student ownership of
web-enabled devices are needed to assess their viability in the classroom.
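The behavioural contrast with clicker hardware can be illustrated with a minimal sketch of what a browser-based system does server-side: it keeps one current answer per student, lets a student revise that answer before the poll closes, and aggregates on demand. This is a hypothetical model; the class and method names are invented, and real services such as Socrative or Kahoot implement this with their own server infrastructure.

```python
class WebPollSession:
    """Minimal model of a browser-based SRS poll: one current answer per
    student, last submission wins, aggregated on demand."""

    def __init__(self, question, options):
        self.question = question
        self.options = set(options)
        self._answers = {}  # student_id -> chosen option

    def submit(self, student_id, option):
        if option not in self.options:
            raise ValueError(f"unknown option: {option}")
        # Re-submitting overwrites the earlier choice, mirroring how
        # web-based systems let students revise a vote before the poll closes.
        self._answers[student_id] = option

    def results(self):
        # Tally every option, including those with zero votes.
        tally = {opt: 0 for opt in sorted(self.options)}
        for choice in self._answers.values():
            tally[choice] += 1
        return tally

poll = WebPollSession("Is attendance correlated with grades?", ["Yes", "No"])
poll.submit("s001", "No")
poll.submit("s002", "Yes")
poll.submit("s001", "Yes")   # student s001 revises their answer
print(poll.results())        # {'No': 0, 'Yes': 2}
```

One design point worth noting is that keying answers by student identifier is what enables the grade components and attendance monitoring discussed later, whereas anonymous display to the class only requires the aggregated tally.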
Lastly, a less commonly used option is the SMS polling system, which has students
use their phones to send a text message to a specific number, set up by the instructor, generally
using a subscription-based service such as Poll Everywhere, to respond to a presented question
and have their responses collected. A consideration for this option is the cost of text messaging
compared to a free wireless system. The system in this case, however, retains simplicity of use,
which is one of its primary benefits, although more research needs to be done to look at the
merits of this system over alternatives.
3. Context and Population
The articles referenced in this review utilized samples composed entirely of students in
post-secondary education. Studies were predominantly composed of undergraduate student
participants, with fewer studies examining graduate students exclusively. This generally reflects
the state of the literature in that studies in post-secondary education have looked primarily at
undergraduate students, rather than graduate students, as they fit more readily into the large
lecture context.
The studies cover a variety of disciplines including: science, biology, chemistry,
neuroscience, research methods, nursing, social work, astronomy, sociology, business,
architecture, education, and psychology. As this paper is still in its working stages, the
intention is to examine the field more exhaustively. Additional studies that examine a diversity of
other subjects do exist in the contemporary literature, and will be added to allow for a more
comprehensive examination of SRS use, and the differences that might exist between them. At
this point, the representation within higher education is primarily within the domain of education
and psychology, although this may be largely due to the theoretical and conceptual orientations
of the two. Education and psychology have been a primary scope under which instructional
technologies such as SRS have been examined possibly because pedagogical and psychological
(cognitive, social, emotional) processes tie in closely as underlying mechanisms of the SRS
methodology. In any case, a more exhaustive search of the literature is needed to better assess its
merit and adoptability across educational institutions.
4. Effects
4a. Attendance
A key objective of instructors is to maximize the attendance rate of their classes, and a
number of studies have found class attendance to be positively correlated with academic
performance (Aden, Yahye & Dahirm, 2013; Chen & Lin, 2008). Attendance rates are indeed a
metric that post-secondary institutions often monitor, both out of concern for students who
often fail to attend class and because attendance serves as an indicator of the quality of the
course.
If students find merit in attending class, and feel engaged enough to do so, then they are certainly
more likely to attend. Conversely, if students feel that they are not engaged in class, are not able
to pay attention, or are not missing out on anything by skipping, they will feel more inclined to
skip. Research findings support that SRS can be an effective tool in increasing attendance, perhaps
so. Research findings support that SRS can be an effective tool in increasing attendance, perhaps
either by alleviating some of the environmental factors that lead to skipping in the first place, or
by making attendance a necessary component via extrinsic motivation by tying a grade
component to SRS use as a means to monitor attendance and review concepts.
While attendance grade components can increase attendance rates, a more important
question is whether they are actively engaged and attending to the course material within the
class, as those are important components for learning. For example, in a study employing SRS in
a college architectural research course, researchers found a significant increase in attendance
rates compared to a retrospective control group (Bachman & Bachman, 2011), despite not having
a grade component. While this example doesn’t eliminate all factors of extrinsic motivation, it
does eliminate a major one, suggesting that students found merit in attending class for their
learning beyond having a grade tied directly to attendance. According to coded responses
to open-ended questions at the end of the term of SRS use, “the majority of students, even those
with negative views regarding clickers recognized that they felt compelled to read the book and
attend class regularly” (Bachman & Bachman, 2011). In addition to this, students expressed how
SRS assisted them in being more engaged and attentive in class. Furthermore, scores on a
standardized thirty-item final exam were also improved compared to a retrospective control group,
suggesting that SRS use in lecture is positively correlated with both attendance and academic
scores.
4b. Engagement
According to Appleton and colleagues (2006), engagement is a multi-dimensional
construct that consists of four subtypes: academic, behavioural, cognitive, and psychological.
Within the context of the classroom, engagement is often closely aligned with the cognitive
subtype, which will be the primary scope of this section. Here we define engagement as cognitive
investment in the conceptual information presented and in the learning process, such that it
supports the internalization of knowledge. Engagement supposes that the student holds some level of
motivation and interest in concepts discussed. Based on student questionnaires following SRS
employment in a variety of studies, students generally agree that SRS
have helped them be more engaged and focused in class (Bachman & Bachman, 2011; Hoekstra,
2008; Stowell & Nelson, 2011). By using SRS, students engage in active learning, which is
“anything course-related that all students in a class session are called upon to do other than
watching, listening and taking notes” (Felder & Brent, 2009). Rather than passively receiving
information, they are actively participating, and so an increased level of engagement naturally
follows. Students instead attend more to concepts in the present time, and the “lecture feels more
active because they are using the knowledge they are learning each day” (Hoekstra, 2008). While
conventional large lectures create an atmosphere of passivity, and distance, “clickers make the
learning environment feel more active because students see and hear more activity” (Hoekstra,
2008) when using SRS. Furthermore, SRS assist students in attributing meaning to the
information they learn in class by presenting opportunities for them to actively apply it,
contributing to intrinsic motivation.
4c. Participation
Participation in the classroom, while not a necessary component in learning and
achieving favourable academic outcomes, is thought to be positively correlated with deep
learning and the higher processes of Bloom’s Taxonomy. Participation necessitates some level of
involvement as students become active contributors to the learning process; that is, they take part
in constructing their framework of understanding. According to Appleton and colleagues’ (2006)
model of engagement, participation is an example of behavioural engagement, whereby students
make an overt behavioural contribution such as sharing in discussion. Participation contributes to
the learning process through the sharing of perspectives, the formal articulation of a student’s own
viewpoints, the summarizing of presented or learned content, and the provision of feedback to
instructors about a student’s current level of understanding of a particular concept.
As such, student participation is a component of learning, which is often lacking due to
the nature of passive lecture, especially in large classes (Hoekstra, 2008; Johnson & Meckelborg,
2008). In these larger class sections, it becomes increasingly difficult for instructors to structure
opportunities for participation, let alone facilitate it in an efficient and inclusive manner.
Furthermore, the ability to collect feedback can be lacking as well, depending on the
thoroughness of responses and on whether the instructor is able to document or record
them to look back at.
The traditional and long-standing method to facilitate participation in the classroom is
hand-raising. While hand-raising, with the liberty it affords students to speak freely in their verbal
responses, may provide some qualitative feedback and merit, it generally lends itself to a slow
process of choosing one student at a time to speak. In addition, when trying to poll the class at
large, hand-raising yields merely a rough visual estimate of numbers. As such, this method is
often not particularly conducive to a strong, engaging learning environment, especially in large
classroom settings.
Furthermore, the consideration of student attitudes and their reluctance to participate for a
host of reasons (embarrassment, anxiety towards public speaking, hesitance in answering due to
lack of understanding of particular concepts, etc.) makes this method even more unproductive.
As such, the introduction of an instructional technology such as SRS has been conjectured to
mitigate some of these issues due to the nature of the system. To begin with, the system allows
for all students with a remote device to respond, and have their responses quickly collected in
real-time and subsequently displayed for the feedback of both students and the instructor. This
feedback can stimulate more participation and engagement, even easing students into verbal
participation once they can see the class’ responses and realize they aren’t alone in struggling
with certain concepts. Also, the benefit of feedback can be more long-standing and specific for
instructors, as class statistics and responses can be saved and looked at again to identify weak
points in student understanding.
While difficulties aren’t uncommon when new technology is in the process of being
adapted, once learnt, an SRS can allow for student participation to be more consistent,
commonplace, and efficient for the instructor to facilitate and encourage. A key beneficial factor
of SRS, is the anonymity that using a remote device (to respond) affords to students. In post-test
questionnaires, students have expressed the value of anonymity, such as how it alleviates the fear
of being singled out and humiliated if they respond incorrectly (Heaslip, Donovan & Cullen,
2014; Stowell, Oldham & Bennett, 2010), which some students worry would accompany a more
explicit and vocal participation such as with hand-raising, being called upon, and being the focus
of everyone’s attention and gaze.
For example, in one quasi-experimental study of SRS with education students (n = 108),
Johnson and Meckelborg (2008) found a mean rating of 4.17 on a five-point Likert scale,
ranging from (1) Strongly Disagree to (5) Strongly Agree, for the item “I liked the fact that
my responses were completely anonymous.” In addition, they found frequent mention of
anonymity in open-ended questions, which they coded as “anonymity encouraged participation,”
suggesting that students perceived anonymity as a significant benefit of SRS. In another study of
SRS use with 40 students in a human behaviour in the social environment course, surveying their
likes and dislikes of technology used in the course, Quinn (2010) found major themes related to the use of
clickers such as “facilitating discussion,” “clickers allowed for anonymous participation,” and
“clickers increased participation.” Overall, students have expressed great value in the anonymity
of the system in reducing barriers to participation.
4d. Academic Performance
A commonly investigated variable in studies of instructional technology is student
academic performance, and studies of SRS are no exception. Because of the varying quality and
structure of assessment, academic performance is not wholly indicative of a student’s learning.
However, it is still the common tool used by instructors to assess student learning. Assessments,
and students’ performance on those assessments, are a reflection of their understanding of course
concepts and their ability to apply that information at varying levels as identified and evaluated
by the instructor. As such, academic performance is a correlate of student learning and
understanding of the course material. A variety of studies have shown support for the idea that
SRS use does indeed have a positive effect on academic performance, such as exam scores
(Bachman & Bachman, 2011; Yourstone, Kraye & Albaum, 2008).
When looking for an explanation of increased grades, one might postulate that rather than
the SRS mediating any sort of qualitative improvement in learning, it merely cues students to
which concepts are particularly important and likely to be test items. For example, a conjectured
benefit of SRS use is improved memory retention, as students are made to actively apply and
reflect on course concepts.
A quasi-experimental study by Shapiro and Gordon (2012) looked at performance on
factual psychology items targeted in class by clicker questions, relative to control non-targeted
factual items, on four non-cumulative exams. Students in the control group answered an
average of 61.4% of the targeted exam questions correctly, while students in the experimental
group answered an average of 69.8% of the targeted questions correctly. This result supports the
hypothesis that the use of SRS enhanced student memory on targeted items. However, Shapiro
and Gordon (2012) found that the explanation of attention cueing cannot completely account for
these findings, although the role of cueing has not been ruled out. In their study, an
attention-grabbing condition, where key points were highlighted on lecture slides and students
were explicitly told that they would be important and included on the test, still resulted in a lower
percentage of correctly answered target items compared to the non-explicitly targeted SRS
condition, a difference that was statistically significant.
One possible explanation for the effects on performance is the merit of review and
feedback that SRS provide. The way in which students actively learn by attending to and
manipulating information may have a role in their retention and understanding, as well as guide
their studying. According to Hoekstra (2008), “course material becomes more meaningful to
[students] because they are consistently seeing how it might appear in actual problems.”
Interviewees in her SRS study also explained that the technology helped them to “more
effectively determine which concepts to review in greater detail before exams.” Examples of
varying types of problems can be presented which have merit, whether they show up on exams
or not. They bring opportunities to practice and train students to draw from their knowledge base
and solve problems such as by operating through their frameworks of understanding, which they
continue to develop through this process. This creates a meaningful link between the
understanding of information, and how those concepts can be manipulated, transformed, and
reconstructed in different situations, serving to internalize them in the minds of students.
4e. Assessment and Feedback
A strong merit of SRS exists in their ability to conduct quick assessments and provide
instantaneous feedback, without the need for physical or online submissions through an automated
system (e.g., online quizzes on learning management systems), or for instructors and teaching
assistants to go over them and provide written feedback. Instructors need only create questions
ahead of time, present them via the SRS in class to all students at once, then wait briefly
for responses to be submitted. Immediately following this, collected responses can be
displayed to the class while the concept is still fresh in their minds. Due to this efficiency and
immediacy, instructors can gain quick insight into the level of understanding students have of
specific concepts, and structure lecture time accordingly. If all students understand a concept, the
instructor can move on, and if students are struggling with another area, the instructor may
clarify and spend additional time reviewing, allowing for a more adaptive and efficient lecture
based on contingent teaching. In addition, students are able to gain feedback on whether they
understand concepts, and can gain insight into the correct answer, as well as reflect on their
thinking process and evaluate the ways in which it led to an incorrect answer. A consistent
process of assessment and feedback in a low stakes environment also reduces pressure on
students, as they can practice and build confidence with problems, rather than having to rely on
lecture notes and spend additional time outside of class preparing for exams. There is a more
scaffolded transition and development facilitated by the SRS.
5. Challenges
With the adoption of any technology, there exist shortcomings and challenges in both
the technology itself and its deployment. These challenges can be grouped into four categories
based on the nature of the issues and their associated solutions: shortcomings of
the technology itself as a tool; issues of an infrastructural or institutional nature; challenges
centred around the instructor, who must incorporate SRS into the classroom and use it
effectively; and challenges for the students, the learners to whom the teaching is directed and
who sit on the other side of the relationship from the instructor.
5a. Technology
Response systems have come a considerable way since their initial development
approximately 50 years ago, when they were larger, more rudimentary, and clunkier in operation. In
the past, systems might have been wired, fixed to a specialized classroom, and were subject to
more interference or instability when sending or receiving responses, and did not have the ability
to easily present aggregated responses. Receivers and remotes have become more portable and
affordable, with improved interfaces. Additionally, many systems now use radio waves
rather than infrared waves, allowing for less interference, greater stability in collecting responses
(fewer responses lost), and the ability to collect a greater number of responses simultaneously.
Furthermore, SRS activities are better facilitated with improvements in how responses are
aggregated in real-time and easily displayed with a computer and projector.
However, issues still exist despite the aforementioned improvements. For example, in
exchange for the simplicity of the system and the remote or keypad, many systems are
limited primarily to multiple choice or numeric responses, and lack support for the open-ended
responses that serve stronger pedagogical purposes. Additionally, the systems are often
proprietary and specifically made by different companies, which creates the necessity for the
audience to have a compatible remote in order to respond, as there is no built-in
cross-compatibility between different brands of SRS; without a compatible remote, students are
not able to participate. Because of this, instructors must be conscious of the financial burden
created for students if they each require a different clicker for each class, and institutions should
determine which option is generally best for their classrooms, deviating to a specific alternative only if
adequate justification is provided.
Furthermore, unless SRS use is required as a grade component, officially
registered, tied to students, and monitored, instructors may not get the full level of
engagement and participation they are after, despite an instructor’s encouragement. Unless checks
of accountability exist, some students may slip through the cracks with little to no participation.
As such, the literature needs to better explore the relationship between a grade component and
17
DELINEATION OF STUDENT RESPONSE SYSTEMS
student participation compared to the lack of a grade component tied to SRS, and what methods
can work best to maximize participation if an instructor wishes to use it merely for formative
purposes.
While SRS technology has advanced and become more reliable, no technology is without occasional issues or failures, and instructors must be aware and adaptive in these situations. In such cases, limited class time may be wasted re-conducting a poll, presuming the malfunction was momentary, or the instructor may have to take time to troubleshoot the problem.
Although wireless, software-based web alternatives are becoming increasingly prominent and attempt to address some of the shortcomings of traditional SRS, they are not without drawbacks either. Instructors and institutions must evaluate which product best serves their intended use and their classrooms, while working creatively and collaboratively to refine their methods and overcome identified challenges.
5b. Institution and Infrastructure
With the adoption of any technology, any attempt to change classroom structures, or any revamping of educational delivery, challenges exist at the institutional level. Due to the bureaucratic nature of post-secondary education, changes are often slow to be evaluated and approved. Furthermore, as universities and colleges are accountable for providing quality education, any fresh methodologies must be carefully examined before they can be endorsed and permeate through the organization across multiple levels. As such, this review aims to address this challenge in part by appraising contemporary research and suggesting best practices to allow institutions to become better informed. Resources such as time and money must also be taken into account when considering the adoption of new technologies, as the cost of purchasing equipment is substantial, and renovations or installations may be needed for the necessary hardware. The time and resources invested by faculty and staff in learning to use SRS effectively must also be considered, as must resources for troubleshooting on behalf of either instructors or students.
Another infrastructural challenge arises when considering web-based or wireless systems as alternatives. As previously stated, their additional functionality can alleviate some issues that radio-based hardware systems have, and can also ease some of the financial burden on students, but there are trade-offs that might prove challenging in some cases. These alternatives require wireless devices and internet access, and therein lies the need for stable and consistent infrastructural support for Wi-Fi. If the overwhelming majority of students in multiple large lecture classes connect to the same access point, overall speed and stability may suffer. Therefore, testing of current infrastructure is needed, and investment in ensuring that Wi-Fi access remains stable and consistent for a large number of simultaneous devices may be necessary for this alternative to be viable.
5c. Instructor
5.c.1. Learning to use the Technology
As with any new tool, a challenge exists in learning to use the tool and navigating the way in which the system is set up and operates. While students often report that SRS are relatively easy and quick to learn to use (Dallaire, 2011), there is a lack of data on the perspectives of instructors, who arguably have the more difficult task of learning the tool as the lecturer managing the system. Time and effort are required for first-time users to become acquainted with the system, and this would likely involve instructional resources provided by either the manufacturer or the institution. These initial investments may make some time-constrained instructors reluctant to use the tool at all, let alone effectively.
Instructors need to know how to operate the base or receiver that collects the responses, the clicker itself (to understand the students' perspective), and any associated software that runs on their computer as they open questions and display responses. The process may be tedious for those who are less tech-savvy, and unless practiced, the initial uses in a classroom might not run smoothly, which can undermine student receptiveness in addition to wasting class time.
5.c.2. Restructuring the Classroom and Increase in Workload
While instructional technologies such as SRS are intended to improve the quality of teaching and learning, instructors must exert greater effort to use the technology effectively and to remain keen in contingent teaching. Firstly, large lectures are generally conducted through passive lecturing the majority of the time, with instructors using pre-constructed PowerPoint slides to move through the course material and answering any questions along the way. However, SRS, if used effectively, require contingent teaching beyond the effort of creating effective questions that accurately gauge student understanding or tap into critical thinking. Instructors must also learn to adapt to the responses and feedback they observe from SRS use, and to adjust their lecture time and material to the needs of the students accordingly, which can be daunting at first, as instructors are required to deviate from their more comfortable and structured lecture.
Furthermore, depending on the methods with which instructors use SRS, such as quizzes, review sessions, and discussions, more effort will be required if instructors wish to maximize feedback from previous SRS sessions, create questions that cater to the strengths and weaknesses of the current class specifically, and facilitate engaging discussion as well. This is even more the case if instructors are interested in taking their pedagogical practice one step further by devoting class time primarily to active learning activities, which would necessitate blended learning practices and shifting the structure of their course toward a more flipped nature. To allow class time to consist primarily of active learning, which takes advantage of opportunities for engaging face-to-face interaction and taps into the higher levels of Bloom's Taxonomy, other resources and materials must be made available online and outside of the classroom for students to encounter independently first, engaging the lower levels of the Taxonomy such as remembering and understanding (Forehand, 2011). In addition, these approaches require instructors to take on a slightly different role and become more active facilitators of learning within the classroom. This can be difficult, however, as instructors must manage activities for large class sizes, an issue the directed nature of regular lectures does not face; an adjustment period, and preferably access to instructional support, may be needed to allow instructors to become comfortable and better versed in these new pedagogical practices.
5d. Student
5.d.1. Receptiveness to Technology
Student attitudes toward the implementation of technology in the classroom are an additional challenge that SRS use might face. While questionnaires across a variety of studies report that the majority of students have a positive attitude toward SRS (Hoekstra, 2008) and perceive benefits in their use, a minority of students express otherwise. Therein lies the question of how instructors address less receptive students, who may feel this way for a variety of reasons and may perceive the tool to be a distraction rather than an aid. These students may be skeptical of SRS in particular and hold a negative impression (e.g., based on intuition or on negative previous experiences), or they may simply be unreceptive to a change in the conventional structure of lectures, and thus unwelcoming of the new practice that follows. Such unreceptive attitudes can be aggravated further if students must pay out of pocket to purchase a device for the sake of a class, on top of already staggering tuition costs and disdained textbook prices. For example, in a survey by Johnson and Meckelborg (2008), while students generally expressed positive attitudes toward SRS following its use in a quasi-experiment, one of the lowest scores on their scales was for the item "I would be willing to spend $35 to use an iClicker in a future class," with a mean of 2.41. Another study by Dallaire (2011) found that 65% of respondents considered SRS to be cost prohibitive. The cost of an SRS is a salient barrier that presents itself at the outset, and a factor that weighs heavily on the minds of students.
It is the responsibility of the instructor to be sensitive to these potential attitudes, and to have some insight into the scientific research behind SRS and the merits that justify their adoption, allowing instructors to candidly explain how the tool serves to benefit students' education, answer questions, and quell any doubts. Allowing students insight into the methodology, rather than thrusting it upon them with a price tag attached, may serve to improve acceptance, generate more positive attitudes and rapport, and elicit more reflection by the student, rather than leaving the impression of another exploitative money grab. It should also be noted that any concerns and feedback expressed by students during this period can be an important asset in understanding student attitudes toward instructional changes in general, and can help refine the methodology behind their use.
Another option that can alleviate some of this skepticism is the use of a free web-based SRS instead, which eliminates the monetary cost for students who have a web-enabled device, along with the need for a proprietary clicker. Students may instead use various web-enabled devices such as laptops, phones, and tablets, which aligns more closely with their general web experience in contemporary society. The challenge to this option's viability lies in the number of students who have, or regularly carry, web-enabled devices with them to class, as using such a wireless system may exclude students who do not possess one. While a 2010 survey of faculty, staff, undergraduate, and graduate students at the University of Florida reported that 87.2% of respondents possessed a mobile device that could access the internet, even if they did not use it for that purpose (Johnson, Means, & Khey, 2013), conscious exclusion of any number of students in the classroom is not acceptable, and could undermine the learning of those students compared to their peers. An interesting area of research, however, may be whether students who lack clickers or web-enabled devices experience similar, reduced, or no benefits despite not being able to participate in quite the same way as their peers who have devices.
While the survey by Johnson and colleagues (2013) does give some insight into mobile device ownership, it has several limitations. Firstly, the survey polled faculty and staff in addition to students, but only reported ownership numbers on a yes/no basis. Our target demographic is students, so this does not give an accurate number with which to gauge how viable web-based SRS are in the classroom. Secondly, the survey only examined mobile devices in particular, while web-based SRS such as Socrative and Kahoot can be accessed across a variety of platforms as long as they are web-enabled (i.e., able to access the internet and connect to the necessary web-based application), meaning not only cellphones, but also laptops, tablets, and iPod touches. Therefore, the number of students who possess a device that could be used for web-based SRS may be higher than the survey suggests. Lastly, the survey was conducted in 2010, and we can reasonably assume that the number of web-device owners has at least stayed the same, if not increased. These considerations suggest that the viability of web-based SRS is increasing, although institutions may wish to survey their own populations to be more accurate.
6. Limitations/Suggestions
One prominent criticism of the current research is the inconsistent terminology and naming conventions for SRS, which contribute to the difficulty of forming one unified and coherent body of literature. A dozen names make it difficult to thoroughly search large bodies of literature for relevant studies. Differing terminology might be warranted if it reflected specific categorical differences in the technology of the system, but even in such a case, the variants might better be categorized as subtypes within one broad umbrella term. Another potential justification for different terminology may be the specific contexts in which response systems are used, such as Student Response Systems for classrooms versus Audience Response Systems for presentations or conferences. Nonetheless, careful scrutiny of the necessity for unique terminology is advised when interpreting findings and recommendations.
While a variety of studies exist investigating both qualitative (e.g., quality of learning, student attitudes) and quantitative (e.g., student participation, attendance rates, academic performance) variables, there are still inconsistencies in the research and areas in which research is lacking. The current literature is also not particularly comprehensive with regard to the contexts in which SRS are incorporated, so we cannot draw strong conclusions about their use in different learning environments, or about whether and to what degree SRS effects permeate across the board. For example, fields that involve more conceptual and theoretical knowledge that can be applied formulaically in problem solving, such as the STEM fields (science, technology, engineering, and mathematics), may benefit more from SRS use, as such material lends itself well to application. Therefore, there is a need to identify the situations in which SRS show clear benefits and to develop insight into the underlying reasons why this might be. More information and replication in different areas are needed, and future studies should examine new techniques that make SRS viable for supporting learning in areas that are lacking.
In addition, close inspection of best practices and follow-up studies is needed to refine our understanding of which methods work best to maximize benefits, and how previously identified shortcomings, whether expressed by students or revealed in quantitative data, can be addressed. The current state of the research is still largely focused on delineating the effects of SRS on outcomes such as grades, attendance, participation, student attitudes, and quality of learning. The next logical step is to explore how factors such as learning styles, instructor attitudes, course structure, and course material interact with each other and affect outcomes. These, among other factors, may provide insight into methodological questions such as how often SRS should be used per lecture, how many questions should be asked, how much time should be given for discussion, how large discussion groups should be, and whether SRS should account for a grade component, all of which are important areas to explore to better facilitate this novel way of learning for students.
Another question is whether the SRS technology specifically brings about the benefits described in this paper, or whether they result from the methods of instruction alone. Yourstone and colleagues (2008) posit that outcomes such as academic performance may result from how the system facilitates engagement and interactivity in the classroom rather than being particular to the SRS. More exploration is needed to investigate whether it is the nature of the technology itself that matters, or whether the supported benefits are the result of a particular method of instruction that may not be unique to SRS, or even to technology in general.
Furthermore, while research has laid out the basics of active learning and engagement as facilitated by SRS, the underlying pedagogy and psychology remain unclear. While some studies exhibit increased academic performance, and students report that classes are more engaging and that they learn more when SRS are used, does this actually represent achievement of the course's learning outcomes? In addition, how closely are these outcomes linked to the perhaps most important goals of deep learning and higher-order thinking, and what is the relationship between them, if one exists? Studies must carefully define these outcomes and measure them in some way. Other important factors to examine are student characteristics and learning styles and the roles they play in learning outcomes.
Research needs to delve deeper into which mechanisms, techniques, and specific parts of the SRS method correspond most directly to specific benefits. The end goal is to ensure consistent benefits for the majority of students and to reduce the number of students who may slip through the cracks; in other words, to refine the SRS method so that the maximum number of students benefit, by focusing more on adaptive instruction. Questions that should be explored include: Why do some students benefit more from SRS use while others benefit less, if at all? Are students who receive more gains simply better students to begin with? Are they more open to technology and more willing to try new learning methods? Are they more flexible and adaptive? Do students with fewer gains simply need more experience to adjust? Are there ways in which these student differences can be identified and accounted for when using SRS? For example, students taking a course within their major may benefit differently from SRS use than students with less intrinsic interest and less background knowledge and skill (e.g., those taking a course merely as an elective or a prerequisite). These student characteristics need to be taken into consideration when using SRS, as they can affect how students approach certain activities or deal with different types of questions, which could have a subsequent effect on their outcomes. Questions like these will ultimately need to be explored in the near future to improve best practices for SRS and other instructional technology in general; students do not all fit into one mold, and it is the duty of educators to be adaptive to the needs of different learners and to sharpen their pedagogical practice accordingly.
References
Aden, A. A., Yahye, Z. A., & Dahirm, A. M. (2013). The Effect of Student’s Attendance on
Academic Performance: A Case Study at Simad University Mogadishu. Academic
Research International, 4(6), 409-417.
Appleton, J. J., Christenson, S. L, Kim, D., & Reschly, A. L. (2006). Measuring cognitive and
psychological engagement: Validation of the Student Engagement Instrument. Journal of
School Psychology, 44(5), 427-445.
Bachman, L., & Bachman, C. (2011). A study of classroom response system clickers: Increasing
student engagement and performance in a large undergraduate lecture on architectural
research. Journal of Interactive Learning Research, 22(1), 5-21.
Chen, J. & Lin, T. (2008). Class Attendance and Exam Performance: A Randomized
Experiment. The Journal of Economic Education, 39(3), 213-227.
Dallaire, D. H. (2011). Effective Use of Personal Response “Clicker” Systems in Psychology
Courses. Teaching of Psychology, 38(3), 199-204.
Felder, R. M., & Brent, R. (2009). Active Learning: An Introduction. Retrieved from:
http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Papers/ALpaper%28ASQ%29.p
df
Forehand, M. (2011). Bloom’s Taxonomy: From Emerging Perspectives on Learning, Teaching
and Technology. Retrieved from http://www.kjakalski.d41teachers.org/enews/think_tank
_articles/articles/BloomsTaxonomy.pdf
Fortner-Wood, C., Armistead, L., Marchand, A., & Morris, F. B. (2013). The effects of student
response systems on student learning and attitudes in undergraduate psychology courses.
Teaching of Psychology, 40(1), 26-30.
Heaslip, G., Donovan, P., & Cullen, J. G. (2014). Student response systems and learner
engagement in large classes. Active Learning in Higher Education, 15(1), 11-24.
Hoekstra, A. (2008). Vibrant student voices: exploring effects of the use of clickers in large
college courses. Learning, Media and Technology, 33(4), 329-341.
Johnson, D., Means, T., & Khey, D. N. (2013). A State of Flux: Results of a Mobile Device
Survey at the University of Florida. Retrieved from http://www.educause.edu/ero/article/
state-flux-results-mobile-device-survey-university-florida
Johnson, T. & Meckelborg, A. (2008). Student Response Systems: A Cure for Lecturalgia?. In J.
Luca & E. Weippl (Eds.), Proceedings of World Conference on Educational Multimedia,
Hypermedia and Telecommunications 2008 (pp. 4709-4717). Chesapeake, VA: AACE.
Judson, E., & Sawada, D. (2002). Learning from Past and Present: Electronic Response Systems
in College Lecture Halls. Journal of Computers in Mathematics and Science Teaching,
21(2), 167-181.
McLaughlin, K., & Mandin, H. (2001). A schematic approach to diagnosing and resolving
lecturalgia. Medical Education, 35(12), 1135-1142.
Preszler, R. W., Dawe, A., Shuster, C. B., & Shuster, M. (2007). Assessment of the Effects of
Student Response Systems on Student Learning and Attitudes over a Broad Range of
Biology Courses. Life Sciences Education, 6, 29-41.
Quinn, A. (2010). An Exploratory Study of Opinions on Clickers and Class Participation From
Students of Human Behaviour in the Social Environment. Journal of Human Behaviour
in the Social Environment, 20(6), 721-731.
Shapiro, A. M., & Gordon, L. T. (2012). A Controlled Study of Clicker-Assisted Memory
Enhancement in College Classrooms. Applied Cognitive Psychology, 26, 635-643.
Stowell, J. R., Oldham, T., & Bennett, D. (2010). Using Student Response Systems (“Clickers”)
to Combat Conformity and Shyness. Teaching of Psychology, 37, 135-140.
Yourstone, S. A., Kraye, H. S., & Albaum, G. (2008). Classroom Questioning with Immediate
Electronic Response: Do Clickers Improve Learning? Decision Sciences Journal of
Innovative Education, 6(1), 75-88.
_________________________________
Signature of Student

_________________________________
Date

____________________________________
Signature of Supervisor

____________________________________
Date