© 2009 by The International Union of Biochemistry and Molecular Biology
BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION
Vol. 37, No. 2, pp. 84–91, 2009
Articles
Using Clickers to Improve Student Engagement and Performance in
an Introductory Biochemistry Class
Received for publication, September 15, 2008, and in revised form, December 12, 2008
Stephen Addison‡, Adrienne Wright§, and Rachel Milner§*
From the ‡Office of the Vice Provost (Information Technology), University of Alberta, Edmonton, Alberta, Canada,
and §Department of Biochemistry, University of Alberta, Edmonton, Alberta, Canada
As part of ongoing efforts to enhance teaching practices in a large-class introductory biochemistry
course, we have recently tested the effects of using a student response system (clickers) on student
exam performances and engagement with the course material. We found no measurable difference in
class mean composite examination score between students taught with clickers and those taught in traditional lectures. However, there were significantly more students in the highest achievement category
(91–100%) in the section that incorporated clickers than in any other section over five academic terms.
Overall, students gave high approval ratings for the use of the clickers, particularly in increasing their
participation and engagement in lectures. However, students who reported their performance to be in
the lowest performance categories gave a lower level of approval for the use of the clickers than those
who reported their performance to be in the higher performance categories. The implications for using
clickers to improve teaching in biochemistry are discussed.
Keywords: Student response system, clickers, biochemistry, active learning, examination performance.
Recently, we have been attempting to improve teaching practices in the Department of Biochemistry. These
efforts began in response to consistent feedback from
students that our courses were ‘‘feared and hated.’’ Both
students and instructors felt that the courses contained
too much information and were highly inconsistent. We
decided to address this problem by (i) introducing a new
teaching program which specifically reduced course content and (ii) incorporating active learning strategies in the
classroom. We hoped that the reduced content would
lead to a better understanding of basic concepts and to
better attitudes toward the subject [1] and that the active
learning techniques applied in a large-classroom setting
would have significant benefits for students in terms of
engagement and learning [2].
The first changes we made included developing a one-term introductory biochemistry course (BIOCH 200) for
which we prepared detailed teaching materials and
standardized learning tools. These materials were
intended to clearly define and limit the syllabus; to minimize the impact of variations in the quality or experience
of instructors; to support a variety of student learning
styles; and to encourage instructors to adopt active
learning strategies. Overall, student feedback regarding
BIOCH 200 has shown that students enjoy the new course
and feel motivated to learn more biochemistry (data not
shown).
Despite this improvement, we have found it difficult to
effectively introduce active learning strategies in the
classroom. We have consistently tried to engage students in one or two brief and varied activities during
class time, but we have found that the students tend not
to like the activities and are resistant to in-class participation. We have received feedback indicating that we
should ‘‘give up trying to solicit class participation.’’ We
have discussed this problem with colleagues who teach
other introductory science courses and it appears that
this is a common phenomenon.
Nevertheless, as student engagement has been shown
to be highly correlated with learning, we still wished to
improve student participation in class. To do this, we
decided to test a student response system, in which students use remote, hand-held devices called clickers. A
student response system (clickers) enables the instructor
to ask a question as part of the digital presentation; each
student uses their clicker to submit an answer anonymously, and the instructor displays a graph of the students’ responses and is able to discuss the results with
the whole class. Many studies have reported that the use
of clickers positively transforms the large-classroom lecture experience and leads to more responsive instruction
and better student understanding. Students tend to give
very positive approval ratings (70% and above) for the
use of clickers and generally perceive the classes to be
more interactive, engaging and enjoyable [3, 4].
* To whom correspondence should be addressed. 5-81 Medical Sciences Building, Edmonton, Alberta, Canada T6G 2H7.
Tel.: (780) 492-5550 E-mail: rmilner@ualberta.ca.
DOI 10.1002/bmb.20264
This paper is available on line at http://www.bambed.org
We introduced clickers in BIOCH 200 with the intention
of improving active learning in the classroom. We were also interested in determining the possible
effects of clickers on the students’ performance in
examinations, on their perception of their learning, and
on their engagement and participation in the class.
METHODS
This study involved students in one particular section of an
introductory biochemistry class (BIOCH 200) in the Fall term of
2007. The study was conducted over a 12-week period in the
Fall, 2007 (one academic semester). Three distinct sections of
this course were offered in this term (A1, A2, and A3) and each
section was a large-classroom, lecture format with approximately 150–190 students. Students enrolled in section A1 of
this course were given the opportunity to use clickers in class
for the duration of the term. Of 189 students in this section who
completed the course, 174 chose to participate in the study
and were loaned a remote.
Students in section A1 Fall 2007 of BIOCH 200 (the clicker
section) were the experimental group in this study. Our comparison groups were the two other sections of BIOCH 200 taught
in Fall 2007 (A2 and A3) and all other sections of BIOCH 200
taught since its implementation: Fall 2005 (4 sections), Winter
2006 (3 sections), Fall 2006 (3 sections), and Winter 2007 (3
sections).
As far as is possible in a multi-section, multi-instructor, large-class lecture format at a large university, all students in BIOCH
200 have an equal learning opportunity regardless of their section. All have access to identical learning materials, including
recommended readings from the text, instructors’ slide files,
highly-specific learning objectives and practice questions. All
instructors also work from the same extensive and detailed set
of instructors’ notes. All sections meet either for 50 minutes
three times a week or for 80 minutes twice a week, and are
taught at approximately the same time of day, starting between
11:00 A.M. and 1:00 P.M.
As far as is possible, all students in BIOCH 200 also have an
equal opportunity for performance evaluation regardless of their
section. In each term, students in all sections write identical examination papers at the same time (consolidated) and so their
performances can be compared directly. Although the performances of students in different terms cannot be compared
directly, examination questions in each term are drawn from a
large multiple-choice question database, and all questions in
the database have difficulty and reliability scores associated
with them. This means that examinations for all sections of
BIOCH 200 have been highly consistent. Indeed, the range of
mean scores for the 13 sections of BIOCH 200 completed
before this study was only 68.76–75.80%.
The most significant difference between sections in this study
was that in one section only, Fall 2007 A1, the instructors used
the iClicker student response system extensively throughout the
term. Each student was given the opportunity to borrow a
hand-held student remote (clicker) and, in every lecture, to answer questions similar in style to the exam questions.
The students’ electronic responses were anonymous.
The clickers were used to pose questions about 4 or 5 times
in each 50-minute lecture. The students were given up to
2 minutes to respond, depending on the complexity of the
question. The instructors were able to gauge the time required
for each question by monitoring the rate at which responses
were collected. In some cases responses were collected in less
than 30 seconds. A variety of questions was used in each class
to achieve different goals. For example, specific questions were
designed: to introduce topics; to assess students’ background
knowledge of a subject; to illustrate known misconceptions of a
particular point; to determine whether new material had been
understood and could be applied in a problem; to review material covered in the previous class; or to enable practice of a calculation that the students were required to apply in different situations. In some cases students answered questions alone and
in other cases they were encouraged to discuss a question with
their peers before voting. In cases where a significant proportion of the class answered incorrectly students were encouraged to discuss their answers and then re-vote. Once a final
vote had been recorded, the instructors reviewed the responses
with the class and clarified why the incorrect responses were
incorrect. Overall, activity associated with the clicker questions
accounted for up to 10 minutes (20%) of each 50-minute
lecture.
Section A1 Fall 2007 was taught by two instructors whose
main purpose for incorporating the clickers was to increase
opportunities for active learning through student participation in
the lecture. All other sections of BIOCH 200 had neither clickers
nor the extra questions. No other feature of the course or its
delivery was altered for any of the sections.
Students’ Performance on Examinations
In this study, one of our objectives was to investigate the
effects of using clickers in the classroom on student examination scores; that is to compare the performance of students
taught by traditional lectures with those taught using the clickers. To do this, we compared the performance in consolidated
examinations of students in the clicker section (A1 Fall 2007)
with students in the other two sections (A2 and A3 Fall 2007).
We also compared the performance of students in the clicker
section (A1 Fall 2007) with that of students in all other sections
of BIOCH 200.
Students in BIOCH 200 are evaluated in three multiple-choice
examinations, two midterms and one final. The midterm examinations contain 50 questions which are answered in 1 hour and
the final examination contains 80 questions which are answered
in 2 hours. In this study, we compared mean composite scores
among the sections of BIOCH 200. Each student’s composite
score (%) is calculated from their three separate examination
scores; each midterm is worth 25% of the composite mark and
the final examination is worth 50% of the composite mark. The
composite score of an individual student indicates his or her
knowledge and understanding of the material in BIOCH 200 relative to other students in that section. The mean composite
score for each section was determined from the individual
scores of all students in that section who wrote three examinations.
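The composite-score weighting described above can be sketched in a few lines; this is a minimal illustration, and the function names are ours rather than anything used in the course:

```python
# Composite score as described in the text: each midterm is worth 25%
# and the final examination 50% of the composite mark.

def composite_score(midterm1, midterm2, final):
    """Return a student's composite score (%) from three exam scores (%)."""
    return 0.25 * midterm1 + 0.25 * midterm2 + 0.5 * final

def section_mean(scores):
    """Mean composite score for a section, over students who wrote all three exams."""
    composites = [composite_score(m1, m2, f) for (m1, m2, f) in scores]
    return sum(composites) / len(composites)

# A student scoring 70 and 74 on the midterms and 80 on the final:
# 0.25*70 + 0.25*74 + 0.5*80 = 76.0
print(composite_score(70, 74, 80))  # 76.0
```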
Students’ Self-Reported Perception of Learning
The second outcome measure in this study was an attitudinal
and informational student survey. A survey with both Likert-type
and non-Likert type questions was administered to students in
section A1 to evaluate their perception of the clickers and their
effectiveness in the classroom. A total of 152 students voluntarily completed the written survey at the end of term; four of
these had not used the clickers. A total of 174 students had
borrowed a clicker for the study.
The informational part of the survey consisted of questions in
which students indicated the importance to them of various
course features using a scale of 1–5 (1 = did not use, 3 = neutral, 5 = essential) and reported their attendance at lectures (%)
and level of achievement in the midterm examinations (<50%;
51–60%; 61–70%; 71–80%; 81–90%; and 91–100%). The attitudinal part of the survey consisted of 12 questions in which students responded using a Likert scale (1, strongly disagree; 2,
disagree; 3, neutral; 4, agree; 5, strongly agree). Questions 1–4
dealt with the perceived effect of the clickers on learning,
understanding and exam performance. Questions 5–6 dealt with
FIG. 1. The distributions of students’ actual and self-reported marks. In the written survey administered at the end of the term, students were asked to indicate their current approximate mark in the course by selecting a category: <50%; 51–60%; 61–70%; 71–80%; 81–90%; or 91–100%. Self-reporting: the percentage of students in each category was calculated from the survey data (N = 152). Actual: the percentage of students actually in each category at the time the survey was administered was calculated from the examination data spreadsheet (N = 189).
perceptions related to the use of the clickers during class time.
Questions 7–12 dealt with the perceived effect of the clickers
on attention, engagement, participation and enjoyment. Most
questions were phrased so that Strongly Agree represented a
favorable reaction to the use of clickers in lectures. Questions 5
and 6 were phrased so that Strongly Agree represented an
unfavorable reaction. Three additional, open-ended questions
gave students the opportunity to identify and describe relevant
issues that were not anticipated in the survey questions.
In the final written survey, we asked students to indicate their
approximate mark in the course at that point (two midterm
examinations had been completed, each worth 25% of the total
overall mark). Because the surveys were anonymous, we cannot guarantee that the students accurately indicated their mark.
However, when we compare the distribution of the students’
self-reported marks after two midterms with the distribution of
actual marks after two midterms, we find that they are very similar (Fig. 1). A slightly greater percentage of students appear in
the upper categories in the survey (self-reported) with a slightly
lower percentage in the lower categories. This small difference
could be related to students ‘‘rounding up’’ exam scores, putting themselves in the 81–90% category if they scored 80.5–
80.9%. It could also be related to a greater proportion of
higher-scoring students than lower-scoring students completing
the questionnaire.
RESULTS
Students’ Performance on Examinations
Students in the clicker section (Fall 2007, A1) achieved a slightly greater mean composite score for the three consolidated examinations than students in the other two sections (Fall 2007, A2 and A3) (Fig. 2). However, we cannot conclude that this difference resulted from the use of the clickers because there have consistently been differences of this magnitude in mean composite score among sections in previous terms (Fig. 2). Further, the mean composite scores for all three sections in Fall 2007 (A1 = 74.6%; A2 = 72.1%; A3 = 69.8%) were within the range of mean composite scores that we have seen previously (68.8–75.8%) for similar examinations. Overall, there was no apparent difference in the mean composite examination score for students taught with clickers compared with those taught by traditional lectures.
Despite the similarity in mean composite scores among sections of BIOCH 200, we observed that a greater than usual number of students in the clicker section (Fall 2007, A1) had obtained a grade of A or A+ (composite score ≥87%). Given this, we looked at the distribution of scores in each section of BIOCH 200. In the student survey, we had asked students to indicate their approximate mark in the course at that point by selecting one of the following score categories: <50%; 51–60%; 61–70%; 71–80%; 81–90%; 91–100%. Therefore, for each section, we calculated the percentage of students who obtained a final composite mark within each of these categories (defined specifically as ≤50.9%; 51–60.9%; 61–70.9%; 71–80.9%; 81–90.9%; 91–100%). We also combined the lower three categories into a single group, ≤70.9%, because the value of N was low for each when considered as a distinct category. We found that a greater proportion of students in the clicker section (Fall 2007, A1) obtained a composite score in the range of 91–100% than ever before (Fig. 3A), whereas the proportion of students in the clicker section who obtained scores in the ranges 81–90.9%, 71–80.9%, and ≤70.9% was within the range of values obtained for all other sections (Figs. 3B–3D, respectively). Specifically, 13.23% of students in the clicker section (A1, Fall 2007) are in the 91–100% category (Table I). Using Peirce’s criterion [5], we find that this value is significantly outside the normal range for all sections of BIOCH 200 (Table I).
FIG. 2. Comparison of mean composite examination scores for all sections. The bars show mean composite examination scores for each section of BIOCH 200 from Fall 2005 to Fall 2007. The composite examination score for each student was calculated from scores for two midterms and one final. Mean scores for each section were calculated from the individual scores of students who completed all three examinations. Examinations for all classes in a given semester are identical (consolidated). Examinations used from semester to semester are similar, and questions on all exams are drawn from the same question databank. Standard error bars are shown. The Fall 2007 A1 section (light grey bar) used the clickers.
FIG. 3. The fraction of students in each section obtaining composite exam scores in specific categories. The score categories were: (a) 91–100%; (b) 81–90.9%; (c) 71–80.9%; (d) ≤70.9%; these categories matched those that the students used to report their performance on the attitudinal survey. The Fall 2007 A1 section (light grey bar) used the clickers. *Outlier; exceeds the maximum allowable difference from the mean (xi − xm) using Peirce’s criterion.
TABLE I
The proportion of students in each section of BIOCH 200 obtaining a composite score of 91–100%

Section                  % of section scoring ≥91%    Difference from the mean (xi − xm)
Winter 2007 B2            2.029                        −4.155
Fall 2005 A4              2.286                        −3.899
Fall 2006 A1              4.018                        −2.166
Winter 2007 B3            4.326                        −1.858
Fall 2005 A2              5.023                        −1.162
Winter 2006 B3            5.181                        −1.003
Fall 2005 A1              5.213                        −0.971
Winter 2006 B2            5.291                        −0.893
Fall 2007 A3              5.334                        −0.851
Winter 2007 B1            5.522                        −0.663
Fall 2006 A4              7.514                         1.330
Fall 2007 A2              7.602                         1.418
Fall 2006 A2              8.680                         2.496
Fall 2005 A3              8.718                         2.533
Winter 2006 B1            8.987                         2.803
Fall 2007 A1 (a)         13.228                         7.043 (b)

Mean                      6.184
Std. dev.                 2.826
Peirce’s R                2.106
Max. allowable xi − xm    5.951

xi − xm is the difference of a value from the mean. The data are skewed but meet criteria for being normal.
(a) Clicker section (Fall 2007, A1).
(b) Exceeds the maximum allowable difference from the mean.
(The maximum allowable difference from the mean (xi − xm) for this data set is ±5.951; for the clicker section (A1, Fall 2007) the difference from the mean is 7.043; no other section has a value that falls outside the maximum allowable difference from the mean.) This means that high-achieving students in the clicker section performed significantly better in three examinations than high-achieving students in all other sections of BIOCH 200.
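As a sketch of how the Table I outlier test works: the percentages below are the values reported in Table I, and R = 2.106 is the Peirce's-criterion value the paper reports for this data set (we take it as given rather than recomputing it from Peirce's table):

```python
# Peirce's criterion flags x_i as an outlier when |x_i - mean| exceeds
# R * s, where s is the sample standard deviation and R is read from
# Peirce's table for the sample size and number of suspect values.

import statistics

# % of each section scoring >= 91%; the last value is the clicker
# section (Fall 2007 A1).
pct = [2.029, 2.286, 4.018, 4.326, 5.023, 5.181, 5.213, 5.291,
       5.334, 5.522, 7.514, 7.602, 8.680, 8.718, 8.987, 13.228]

mean = statistics.mean(pct)
s = statistics.stdev(pct)        # sample standard deviation
R = 2.106                        # Peirce's R, as reported in Table I
max_allowable = R * s            # maximum allowable |x_i - mean| (~5.951)

outliers = [x for x in pct if abs(x - mean) > max_allowable]
print(outliers)  # [13.228] -- only the clicker section is flagged
```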
Students’ Self-Reported Perception of Learning
The results of the attitudinal survey for BIOCH 200
section A1 are presented in Table II. Students indicated
their response to each question using a Likert scale: 1,
strongly disagree; 2 disagree; 3, neutral; 4, agree; 5,
strongly agree. The 12 questions are listed and the mean
and standard deviation of the participants’ responses to
each question are given. The larger the mean value, the
greater the students’ agreement with a question. The table also expresses the students’ responses as % agree
and % disagree. % agree is calculated from the sum of
students who responded agree or strongly agree. % disagree is calculated from the sum of students who responded disagree or strongly disagree.
TABLE II
Attitudinal survey completed by students in the clicker section (A1, Fall 2007)

Question                                                                      Mean ± SD    %Agree  %Disagree
1. The iClicker questions helped me to improve my learning.                   4.30 ± 0.85   86.1     3.9
2. The iClicker questions helped me to build a solid understanding
   of core concepts.                                                          4.13 ± 0.81   84.9     3.9
3. The iClicker questions were designed well for enhancing my learning
   of this subject.                                                           3.99 ± 1.01   73.0     7.2
4. I feel that this section of BIOCH 200 had an advantage in the
   consolidated examinations because of the use of iClicker questions.        4.13 ± 0.82   82.9     3.3
5. I often wrote down the iClicker questions instead of answering them
   in class.                                                                  2.67 ± 1.37   29.6    50.0
6. I think that the iClicker questions took time that would be better
   used for presenting information.                                           2.41 ± 1.12   16.4    58.6
7. The iClicker questions helped me to focus and pay more attention
   in lectures.                                                               4.18 ± 0.97   80.3     5.9
8. The iClicker questions allowed me to engage directly with the content
   being presented.                                                           4.32 ± 0.88   87.5     4.6
9. The iClicker questions encouraged me to attend lectures more regularly.    3.14 ± 1.26   38.8    26.3
10. I enjoyed the lectures more because of the iClicker questions.            2.63 ± 1.15   26.3    57.3
11. I liked knowing how my classmates responded to the iClicker questions.    3.86 ± 1.02   69.1    10.5
12. The use of iClicker questions in this course should be continued
    in the future.                                                            4.36 ± 0.87   84.2     2.0

Students indicated their response to each question using a Likert scale: 1, strongly disagree; 2, disagree; 3, neutral; 4, agree; 5, strongly agree. Mean ± SD is the average response value ± standard deviation; n = 152 (the number of students that participated in the survey). % Agree is calculated from the number of students who responded agree or strongly agree. % Disagree is calculated from the number of students who responded disagree or strongly disagree.
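The per-question summaries in Table II can be reproduced from raw Likert responses along the following lines; the responses below are hypothetical, since the per-student data are not published:

```python
# For one Likert question (coded 1-5): mean, sample SD, % agree
# (responses of 4 or 5) and % disagree (responses of 1 or 2).

import statistics

def summarize(responses):
    """Return (mean, sd, pct_agree, pct_disagree) for one question."""
    n = len(responses)
    pct_agree = 100 * sum(1 for r in responses if r >= 4) / n
    pct_disagree = 100 * sum(1 for r in responses if r <= 2) / n
    return (statistics.mean(responses), statistics.stdev(responses),
            pct_agree, pct_disagree)

# Hypothetical responses from ten students to one question:
mean, sd, agree, disagree = summarize([5, 4, 4, 5, 3, 4, 2, 5, 4, 4])
print(mean, agree, disagree)
```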
The survey data (Table II) show that the majority of students agreed that the clicker questions used in class
helped them to improve their learning (86%), to build a
solid understanding of core concepts (85%), to focus
and pay more attention in lectures (80%), and to engage
directly with the content being presented (87%). The majority also agreed that they felt they had an advantage in
the examinations over the other two sections of the
course because of the clicker questions (83%), that the
questions were well-designed for learning this subject
(73%), and that the use of clickers should be continued
in this course in the future (84%). The majority also
agreed that they liked knowing how their classmates
responded to the questions (69%).
In contrast, the survey data (Table II) show that only a
few students agreed that the questions took time that
would have been better spent presenting information in
the lecture (16%). In addition, only a minority agreed that
the use of clicker questions encouraged them to attend
lectures more regularly (39%) or that they enjoyed the
lectures more because of the clicker questions (26%).
Interestingly, approximately one-third of the students
agreed that they often wrote down the questions instead
of answering them in class (30%).
To further investigate student attitude towards the
clickers, we grouped the attitudinal surveys according to
the students’ self-reported performance in the course at
the time the survey was administered (≤70%, 71–80%,
81–90%, or 91–100% average score on the two midterms). Two students had not indicated a performance
category, so 150 of the 152 surveys were used. For each
of the 12 questions on the survey, we calculated the %
agreement reported by students in that category (Table
III). (% agreement was calculated from the sum of students who responded agree or strongly agree.) Interestingly, we found that students who reported their performance to be in the lowest performance categories (≤70%)
were less likely to be in agreement to many of the questions than students who reported their performance to
be in the higher performance categories (71–80%, 81–
90%, or 91–100%). Specifically, they were less likely to
agree that they liked knowing how their classmates
responded to the questions (56%) or that they felt they
had an advantage in the consolidated examinations
(70%). They were also less likely to agree that the clicker
questions helped them to improve their learning (65%),
to build a solid understanding of core concepts (70%), or
to focus and pay more attention in lectures (65%). Students who reported their performance to be in the lowest
categories did agree, however, that the use of clickers
should be continued in the future (83%).
DISCUSSION
The main purpose for incorporating clickers in our introductory biochemistry classes was to increase student
engagement and participation. At the same time, we
were interested in determining whether there would be a
measurable change in the students’ performance on
examinations relative to students taught by traditional
lectures.
It is difficult to conduct research into the effects of
technology on classroom learning because of the inherent complexity of the many variables involved in teaching
practice and in assessment of learning. We realize that, in comparing the standard (traditional) teaching environment of BIOCH 200 with the more interactive environment provided through the use of clickers, we were not able to control for effects such as an instructor’s
motivation and enthusiasm for one teaching method versus another. We were also not able to control for other
indirect effects of using clickers, such as alterations in
the students’ attendance at lectures. Finally, we were not
able to control for the Hawthorne effect [6], where the
students in section A1 knew they were part of a research
TABLE III
Student attitude toward the clickers versus self-reported level of achievement

                                                                      % Agree by self-reported level of achievement
Question                                                              ≤70    71–80   81–90   91–100
1. The iClicker questions helped me to improve my learning.           65.2   100     87.9    84.0
2. The iClicker questions helped me to build a solid understanding
   of core concepts.                                                  69.6    93.2    84.5    92.0
3. The iClicker questions were designed well for enhancing my
   learning of this subject.                                          69.6    79.5    75.9    64.0
4. I feel that this section of BIOCH 200 had an advantage in the
   consolidated examinations because of the use of iClicker
   questions.                                                         69.6    84.1    87.9    92.0
5. I often try to write the clicker questions down instead of
   answering them in class.                                           39.1    22.7    32.8    28.0
6. I think that the iClicker questions took time that would be
   better used presenting information.                                13.0    11.4    17.2    24.0
7. The iClicker questions helped me to focus and pay more
   attention in lectures.                                             65.2    90.9    77.6    88.0
8. The iClicker questions allowed me to engage directly with the
   content being presented.                                           78.3    95.5    86.2    92.0
9. The iClicker questions encouraged me to attend lectures more
   regularly.                                                         43.5    38.6    37.9    40.0
10. I enjoyed the lectures more because of the iClicker questions.    30.4    20.5    20.7    44.0
11. I like knowing how my classmates respond to the iClicker
    questions.                                                        56.5    70.5    75.9    68.0
12. The use of iClicker questions in this course should be
    continued in the future.                                          82.6    88.6    84.5    88.0

Students indicated their response to each question using a Likert scale: 1, strongly disagree; 2, disagree; 3, neutral; 4, agree; 5, strongly agree. % Agree was calculated for each achievement category from the number of students in that category who responded agree or strongly agree. The performance category was self-reported by students on the survey. N = 23, 44, 58, and 25 for the achievement categories ≤70%, 71–80%, 81–90%, and 91–100%, respectively.
study which might have influenced both their survey
responses and their learning behavior.
Despite these limitations, we believe we have shown
that using clickers in an introductory biochemistry class
is beneficial to the students. This finding is of significance for us because of the difficulty we have experienced in introducing interactive learning techniques in
our lectures. Students in BIOCH 200 have tended to dislike the in-class, active learning strategies we have used,
and they have generally been resistant to in-class participation. In contrast, with clickers the majority of students
participated more in lectures and reported that they perceived an improvement in learning, understanding and
their performance on exams.
Effects of Clickers on Perception of Learning and
Performance in Examinations
To assess the students’ performance in BIOCH 200,
we calculate their composite score for three examinations. We then calculate a mean composite score for
each section. Overall, we have found that the range of
mean composite scores achieved by different sections of
BIOCH 200 is small, and we believe that this is related to
the examination format (multiple choice), the consistency
and specificity of the learning tools that are available to
the students, the repeated use of pre-tested questions
from a large question database that we have created,
and the use of consolidated examinations for all sections
in each term. The small differences that occur presumably result from a variety of the often intangible variables
that are important in teaching and learning.
In this study, students in the section that used clickers
(section A1, Fall 2007) obtained a slightly greater mean
composite score on the three consolidated examinations
than students in the other two sections (A2 and A3, Fall
2007). However, we cannot conclude that this resulted
from the use of the clickers because we have seen differences of this magnitude among sections of BIOCH 200
in previous terms (Fall 2005, Winter 2006, Fall 2006, and
Winter 2007).
Overall, use of clickers had no measurable effect on
the students’ mean composite examination score compared with that of sections taught by traditional lectures.
However, a significantly greater proportion of students
scored 91–100% in the section that incorporated the
clickers than in any other section of BIOCH 200 over the
previous five academic terms. This observation suggests
that use of the clickers benefited the high-achieving students most, specifically facilitating their performance in
the consolidated examinations and enabling relatively
more of them to obtain composite scores in the 91–
100% category. The use of clicker questions in class
enabled students to practice questions of a similar format to those that appeared on the three examinations.
We hypothesize that high-achieving students benefited
most from this practice because they were more able to
interpret and answer the clicker questions in the time
allowed than were the lower-achieving students.
Obviously, we cannot discount the fact that students
using the clickers had different instructors than the other
sections of BIOCH 200. However, at least 11 distinct
instructors have been involved in teaching BIOCH 200
since Fall term 2005, and both instructors of the clicker
section (Fall 2007, A1) have instructed several previous
sections. Despite the variety of instructors used, the
mean composite scores of all sections are remarkably
consistent and the mean composite score obtained by
the clicker section is within the range of scores that we
have seen previously. We believe that this consistency in
scores supports our suggestion that the use of the clickers led to the effects we are reporting here for high-achieving students. We acknowledge that these results
might be more convincing if the investigation were
repeated. However, we were unable to conduct a comparable study in the subsequent semester because we
implemented an online game system for students in all
sections of BIOCH 200 in addition to the clickers.
Our findings in this study are similar to those of Knight
and Wood [2]. These authors found that an interactive
environment specifically benefited higher-achieving cell
biology students. They found that both grade A and B
students had higher learning gains in an interactive environment than in a traditional environment, whereas grade
C students made relatively low learning gains in both
environments. More recently, Reay, Li, and Bao [7]
reported that, overall, physics students who used clickers achieved higher scores in multiple-choice conceptual
examination questions than those students who did not
use clickers. We are aware that formalized testing is not
necessarily the most effective way to assess learning,
and we accept that the improved examination performance of high-achieving students in this study does not
measure long-term retention of information or changes in
behavior. True measurement of learning would be very informative, but requires a longitudinal investigation
beyond the scope of this study. Here, we have measured
the effect of clickers only on student performance in
examinations; we rely on the students’ self-reported perceptions of the clickers’ effect on learning.
The survey data in this study are consistent with the examination data. Overall, the majority of students perceived a positive effect of the clickers on their learning,
understanding and performance in examinations. A few
students specifically commented in the survey that they
would have done less well on the exams without the use
of the clickers. However, students who reported their
performance to be in the lowest performance categories
(<70%) were less likely to agree that the clicker questions had a positive effect on their learning, understanding and performance in examinations. Interestingly, the proportion of students who indicated that
they ‘‘had an advantage on the consolidated examinations because of the use of iClickers’’ decreased in parallel with their reported achievement in examinations
(Table III). This feedback seems particularly compelling,
because it matches our finding that the performance of
only the highest achievers was improved in the examinations.
Although the majority of students responded positively
to the use of the clickers, a significant minority of students, especially those in the lowest-achievement category (<70%), did not feel that the clickers helped their
learning or gave them an advantage in the examinations.
We believe these findings have important implications for
the introduction of technology into a learning environment: a specific technology or activity will not necessarily benefit all students, or even those who need it the most.
This observation is in keeping with that of Metz, who
recently reported that lower-division students were more
likely than upper-division students to perceive weekly online
quizzes as stressful, and less likely to describe the online
quizzes as helpful to their learning [8]. Certainly, we will
take these findings into consideration when planning use of
clickers in our courses in the future.
Effects of the Clickers on Student Engagement
and Participation
The results of the survey clearly demonstrate that the
majority of students perceived a positive effect of the
clickers on their attention, engagement, and participation
in the class. In particular, 84% were in agreement that
the use of clickers should be continued in this course in
the future. This level of agreement constitutes a strong endorsement from the students. These attitudinal results
are consistent with trends in similar studies conducted in
other disciplines; students typically give high levels of
approval (70% and above) for the use of clickers.
Despite this high level of approval, the survey data
also show that students who reported their performance
to be in the lowest performance categories (<70%) were
least likely to agree that the clicker questions had a positive effect on attention, engagement, and participation.
Specifically, they were least likely to agree that the clickers helped them to focus and pay attention in lectures
and to engage directly with the content being presented.
They were also least likely to agree that they liked knowing how their classmates responded to questions. These
findings have important implications for improving teaching in biochemistry classes. Although a particular teaching tool may benefit a significant proportion of a student
group, we must recognize that, in a large lecture, one
technology may not meet every student’s needs. We
recognize the need to find ways to more effectively
include those students who reported their performance
to be in the lowest performance categories (<70%). For
example, we could try the question sequence method of
Reay, Li, and Bao [7]. Rather than asking one question
per concept, these authors use a sequence of questions,
each covering the same concept in a different context.
We also recognize that effective teaching methods and
question design are critical factors in the successful use
of this technology.
Overall, based on this study, we believe that it is worth
continuing to use clickers in teaching biochemistry in
large lecture classes. This belief is based on our survey,
which clearly shows that the students in BIOCH 200 generally perceived significant benefits to their educational
experience. Our findings concerning the examination
scores of high-achieving students in the clicker section
are also interesting, but we recognize their limitations, and
they did not play a part in our decision to implement the
use of clickers in all future sections of our introductory
course.
CONCLUSIONS
In this study, the majority of students indicated strongly
that use of clickers enhanced their learning experience.
However, students in the lowest achievement categories
were less likely to agree that the clickers helped their
learning or performance in examinations. In keeping with
this, our examination data suggest that the in-class use
of clickers improved the performance on examinations of
only the highest-achieving students.
Acknowledgment— The authors thank Ralph Wright and Dr.
Colleen Norris for help with the statistical analysis of the data in
this article. The instructors would like to thank Cristina Arias for her
invaluable practical support with the iClicker system during class
and Dr. Jonathan Parrish for help with production of the figures.
REFERENCES
[1] M. D. Sundberg, M. L. Dini, E. Li (1994) Improving student comprehension and attitudes in freshman biology by decreasing course content, J. Res. Sci. Teach. 31, 679–693.
[2] J. K. Knight, W. B. Wood (2005) Teaching more by lecturing less, Cell Biol. Educ. 4, 298–310.
[3] C. Fies, J. Marshall (2006) Classroom response systems: A review of the literature, J. Sci. Educ. Technol. 15, 101–109.
[4] J. E. Caldwell (2007) Clickers in the large classroom: Current research and best-practice tips, CBE Life Sci. Educ. 6, 9–20.
[5] S. M. Ross (2003) Peirce’s criterion for the elimination of suspect experimental data, J. Eng. Technol. 20, 38–41.
[6] E. Mayo (1977) The Human Problems of an Industrial Civilization, Arno Press, New York, pp. 55–98.
[7] N. W. Reay, P. Li, L. Bao (2008) Testing a new voting machine question methodology, Am. J. Phys. 76, 171–178.
[8] A. M. Metz (2008) The effect of access time on online quiz performance in large biology lecture courses, Biochem. Mol. Biol. Educ. 36, 196–202.