Fighting the Fade: Student Engagement in Large Peer-Instruction Classes.
Abstract: Recent studies of interactive-engagement classroom techniques show encouraging
results for improving students' conceptual understanding in large-lecture environments, but
there has been little investigation into the limits of such methods with regard to student
participation. We categorize a variety of student attitudes, and generate hypotheses regarding
student disengagement, as a guide toward improving large-class peer instruction methods.
INTRODUCTION: Physics Education Research (PER) [e.g. Arons97, Hake98, Hestenes92,
McDermott91, 93] argues compellingly that traditional large lectures can fail at various
levels - in student attitudes about physics, classroom engagement, and student performance in
tests of conceptual underpinnings. A promising approach advocated by Eric Mazur's group at
Harvard, among others, involves "Peer Instruction" [Mazur97, Fagen02, Meltzer02,
Wyckoff01]: frequent in-lecture multiple choice conceptual questions, with paired ("peer")
discussion and votes. The voting can be done by hand, by raising cards, or through
electronic feedback devices. This classroom approach is stimulating and engaging - the buzz
in the classroom during discussions is invigorating to students and teacher alike. Mazur's
research concentrates on the resulting gains in conceptual understanding. However, many
colleagues and I have observed a noticeable and discouraging "fade" phenomenon, in which
student participation peters out with time. The focus of our investigation is to ask: why is this
happening, and what might be done to change it?
There are deeper issues involved as well. We want to clarify the limitations and boundaries
of Mazur's approach, to help extend and improve its function. Student participation, per se, is
not our primary goal: improving student understanding of physics is. Beyond that, we want to
develop their ability to think critically, i.e. to understand the process of getting and
evaluating an answer to a physics problem, as opposed to rote memorization of facts and
formulas. We would also like to improve students' attitudes towards physics: help them enjoy
the material, appreciate its relevance, and stimulate their desire to learn more. In the process
of inquiring into participation fade we are also trying to better understand how Concept Tests
in large lectures function to further these pedagogical goals. Our approach is largely survey-based - we rely heavily on students' self-reporting. This provides many rich and detailed student comments, but may not be easily generalized. Our results are meant to be suggestive for further intervention and investigation.
CLASS SETTING: During this study I taught two introductory physics courses: an algebra-based Electricity and Magnetism course (Fa'01, 136 students) aimed at health science majors, and a calculus-based course (Sp'02, 307 students) aimed at engineers. We used a "colored
card" method of student voting: multiple choice answers are colors, and students raise an
index card to vote. This provides a quick visual survey of the class, with minimal expense or
effort. Using electronic feedback can have a positive effect on student participation, but adds
significant costs to both students and departments. Since many schools do not yet have this
option, we considered it important to investigate student participation without the benefit of
expensive technology.
DATA AND RESULTS: We developed an end-of-semester survey with Likert-scale
questions about various aspects of the course. Students in my fall course also filled out a Student Assessment of Learning Gains (SALG) survey (http://www.wcer.wisc.edu/salgains/instructor) as a consistency check. Fig 1 shows several
questions directly relating to Concept Tests. The results are consistently and overwhelmingly
positive. Thus, although the faculty perceive a significant participation fade as time goes by,
it is not clear the students agree. One must ask why this apparent disconnect exists.
Homework in my class has a small (20%) purely conceptual component, while exams are
even more heavily oriented to concept questions. It may be that students recognize the
importance of practicing and becoming skilled at conceptual questions, but still do not
appreciate the value of active engagement with them during lectures.
As a quantitative assessment of conceptual learning, a subset of the CSEM (Conceptual Survey of Electricity and Magnetism) [Maloney01] was included on final exams. The chosen problems represented a broad spectrum of the CSEM concepts.
How do you feel about Concept Tests?
  Love them, very positive:  Fa'01: 44%   Sp'02: 57%
  Mildly positive:           Fa: 45%      Sp: 34%
  Neutral:                   Fa: 6%       Sp: 7%
  Mildly negative:           Fa: 3%       Sp: 2%
  Hate them, very negative:  Fa: 2%       Sp: 1%

Irrespective of how you FEEL about them, do you believe they are beneficial in terms of teaching physics?
  Very beneficial:           Fa: 66%      Sp: 66%
  Mildly beneficial:         Fa: 26%      Sp: 29%
  Neutral:                   Fa: 6%       Sp: 5%
  Not very beneficial:       Fa: 2%       Sp: 1%
  Very bad system:           Fa: 0%       Sp: 0%

SALG online survey: How much did concept tests during class help your learning? (Fa '01 only)
  Very much help: 46%   Much help: 30%   Moderate help: 14%   A little help: 8%   No help: 2%

Fig. 1: Anonymous end-of-semester surveys, for two semesters.
Our class scores were higher than the survey reference scores on every question asked. For the calculus-based class, our overall average was 73% (compared to 50% from the survey, for this set of questions). Our algebra-based average was 73% (compared to 41% from the CSEM survey). For our purposes we may at least conclude that the "fade" effect is apparently not indicative of poorer exam performance or conceptual learning.
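As an illustration of this comparison only, the short sketch below tabulates per-question class scores against published reference values; the question labels and numbers are hypothetical placeholders, not the actual data or analysis code from this study.

```python
# Sketch: compare per-question class scores on a CSEM subset to published
# reference values. Question labels and numbers are hypothetical placeholders,
# not the actual data from this study.
csem_items = {
    # question_id: (class fraction correct, CSEM reference fraction correct)
    "Q3":  (0.78, 0.55),
    "Q12": (0.70, 0.48),
    "Q17": (0.71, 0.47),
}

for qid, (ours, ref) in csem_items.items():
    higher = "higher" if ours > ref else "not higher"
    print(f"{qid}: class {ours:.0%} vs. reference {ref:.0%} ({higher})")

class_avg = sum(ours for ours, _ in csem_items.values()) / len(csem_items)
ref_avg = sum(ref for _, ref in csem_items.values()) / len(csem_items)
print(f"Overall: class {class_avg:.0%} vs. reference {ref_avg:.0%}")
```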
Through the Project Galileo web forum [Crouch02], we received feedback from faculty using
peer-instruction methods nationwide. I also interviewed faculty at my institution who have
tried this classroom approach. Of those faculty responding to the web forum request, only
those using electronic feedback consistently claimed they were not likely to suffer from the
"fade" phenomenon. All our faculty using concept tests independently observed and are
bothered by "fade". We have just switched to an electronic voting system, and the
improvement in student participation is considerable. Nonetheless, this option will not be
available to everyone, and the question of how to improve participation without such expense
remains important.
Online participation: We introduced a regular online open-ended survey requiring a written response (nine each semester). These were submitted and processed online with a web script based on Just in Time Teaching [Novak99]. This turned out to be an invaluable data collection tool. To encourage responses, we graded for participation, not content.
Participation always exceeded 70%. Some questions relating to in-class participation are
shown in Figure 2. It should be noted that in the spring, I emphasized the importance of
participation more heavily during lectures, and asked an extra survey question on this topic.
Although some fade still occurred in this class, it was noticeably less significant than I had
ever experienced before.
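To illustrate the participation-only grading policy, here is a minimal sketch of how such credit could be tallied. The CSV layout and file name are assumptions for illustration; the actual collection used a web script based on the Just in Time Teaching tools, and the point values follow those described later (2 pts per survey, nine surveys).

```python
# Sketch: award participation-only credit for the online written surveys.
# Assumes a hypothetical CSV with columns student_id, survey_number, response_text;
# the actual collection used a web script based on Just in Time Teaching.
import csv
from collections import defaultdict

POINTS_PER_SURVEY = 2   # credit per submitted survey (see Fig. 4 caption)
NUM_SURVEYS = 9         # nine open-ended surveys per semester

def participation_scores(path):
    submitted = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["response_text"].strip():   # any non-empty response earns credit
                submitted[row["student_id"]].add(row["survey_number"])
    # Score = points per survey actually submitted, capped at the number offered.
    return {sid: POINTS_PER_SURVEY * min(len(surveys), NUM_SURVEYS)
            for sid, surveys in submitted.items()}

if __name__ == "__main__":
    for sid, pts in participation_scores("online_surveys.csv").items():  # hypothetical file
        print(sid, f"{pts}/{POINTS_PER_SURVEY * NUM_SURVEYS}")
```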
There was a broad spectrum of responses both semesters, but many were initially somewhat
perfunctory. However, the last Sp02 question was far more robust, generating thoughtful
comments and long replies.
Week 3: Tell me about the "discussions" you had with your neighbors. Did you talk or keep quiet? Did you
or any of your neighbors change your mind? Was the class time spent useful to your learning? Then, look a
little deeper. Tell me WHY - why did you participate (or not), why was it useful (or not), why did you change
your mind (or not)?
Week 8: I'm curious - has your class participation changed significantly since the beginning of the semester?
Has your attendance pattern changed (how, why?), do you raise colored cards more or less often than a
month ago, do you talk more or less with your neighbors during concept tests? Have you thought more about
why we do concept tests in lecture? Try to reflect and articulate a little more than just basic "yes or no"
answers - why do you think your answers are what they are?
Week 14: Throughout the semester, I have asked several participation questions regarding Concept Tests,
because I am interested in finding ways to improve their use and value for this class. I'd like you, in this final
question, to tell me the following. First, briefly characterize your level of participation (do you raise cards?
How frequently? Do you talk with your neighbors? How frequently? Very important - has this changed over
the semester? How?) Next, tell me what you think could have improved the value of these concept questions
for YOU. Don't tell me what would be good for other people. (Come up with as many ideas as you have time
and energy for, your thoughts on this could be very helpful for improving this course!)
Figure 2: Sample online questions relating to in-class discussion and participation, Phys
1120 (Sp '02)
To summarize and categorize, I developed a coding scheme based on student comments, dividing responses into various categories. Results for the self-reported frequency of participation are shown in Table 1. These numbers show a reasonably high level of participation, consistent with my observation that fading was not so strong this semester.
Category of response             In-class talking # (%)    Voting (raising cards) # (%)
Always                           32 (21%)                  66 (36%)
Often                            55 (36%)                  82 (45%)
Occasionally                     33 (22%)                  11 (6%)
Rarely or never                  19 (13%)                  20 (11%)
More, as semester progressed     31 (21%)                  14 (8%)
Same throughout                  59 (39%)                  66 (36%)
Less, as semester progressed     7 (5%)                    18 (10%)

Table 1: Sp02 summary of student self-classification of level of in-class talking and voting. (Percentages don't add to 100 because students often respond in multiple categories.)
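A minimal sketch of the kind of tally behind Table 1 is shown below; the coded responses are invented examples, not actual student data. Because one student's comment can carry several category labels, the counts and percentages need not sum to the class total.

```python
# Sketch: tally coded self-reports into (possibly overlapping) categories,
# as in Table 1. The coded responses below are invented placeholders.
from collections import Counter

# Each student's response may carry more than one category label.
coded_responses = [
    {"Often", "Same throughout"},
    {"Always", "More, as semester progressed"},
    {"Rarely or never"},
    {"Occasionally", "Same throughout"},
]

counts = Counter()
for labels in coded_responses:
    counts.update(labels)

n_students = len(coded_responses)
for category, n in counts.most_common():
    # Because categories overlap, the percentages need not sum to 100%.
    print(f"{category}: {n} ({100 * n / n_students:.0f}%)")
```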
Recurring themes regarding reasons or explanations for participation were coded and sorted as summarized in Table 2 (see discussion below). Explicit student suggestions about in-class discussion and concept tests are summarized in Table 3.
Positive reasons for participation:
  Helps student learn concepts: 40
  Friends/social aspects: 31
  Alternate perspectives help: 18
  Preps us for exam/homework: 14
  Check student understanding: 12
  Engages/stimulates: 11
  Other: 17

Negative comments re participation:
  Not enough time: 48
  Clueless, nothing to say: 31
  Missing cards: 27
  Social issues: 19
  Fatigue: 12
  Neighbor not helpful: 6
  Other: 21
Table 2: Summary of students' reasons for or against in-class talking. 153 students made at
least one such comment, many made multiple comments. Tabulated #'s show how many
students' comments fit in the given categories.
Give more time for discussion: 35
Need more discussion after/about concept test: 18
Want solutions to all concept tests posted on web: 14
Want more concept tests in each lecture: 13
Swayed by dominant vote visible around them: 12
Improve "aesthetics" of the concept test overheads: 7
Assorted other: 42

Table 3: Specific suggestions for improvements in the course which might improve in-class participation. 106 students had at least one suggestion.
Based on these self-reports, we evaluated the level of in-class talking (for all students who answered at least one relevant question and reported their participation level) on a subjective numerical scale from 0 to 2, defined as follows: 2.0 = clear statements that they always talk to their neighbors; 1.5 = minor qualification ("mostly" talk) or inconsistency between the different surveys; 1.0 = moderate qualification ("sometimes" talk) or stronger inconsistencies; 0.5 = stronger qualification ("occasionally" talk) or serious inconsistency; 0 = clear statement that they never talk. I was able to rank 276 students (90% of the class); the average score over all ranked students was 1.3. For the average student, it is difficult to see any significant connection between course grade and participation level. (The course GPA of "below average talkers" was 2.47 +/- .09; for "above average talkers" it was 2.71 +/- .07.) However, for more extreme students, some trends are visible to the eye, as shown in Figure 3.
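As a sketch of how that rubric can be applied mechanically, the snippet below maps sample statements onto the 0-2 scale; the keyword rules and sample reports are illustrative assumptions only, since the actual coding was done by hand from students' written responses.

```python
# Sketch: map self-reported talking statements onto the 0-2 rubric described above.
# The keyword rules and the sample reports are illustrative assumptions only;
# the actual coding was done by hand from students' written survey responses.
RUBRIC = [
    ("always",       2.0),  # clear statement of always talking
    ("mostly",       1.5),  # minor qualification
    ("sometimes",    1.0),  # moderate qualification
    ("occasionally", 0.5),  # stronger qualification
    ("never",        0.0),  # clear statement of never talking
]

def talking_level(report: str) -> float:
    text = report.lower()
    for keyword, score in RUBRIC:
        if keyword in text:
            return score
    return 1.0  # default when the report is ambiguous

sample_reports = [
    "I always talk with my neighbors about the question.",
    "I mostly talk, unless I'm completely lost.",
    "I occasionally say something if I know the answer.",
]

levels = [talking_level(r) for r in sample_reports]
print("levels:", levels, "average:", sum(levels) / len(levels))
```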
[Figure 3: two histograms ("D/F students only" and "A students only") showing the number of students at each self-reported in-class talking level (0-2) in Phys 1120 (Sp '02), for low-performing (overall course grade D+ or lower) and high-performing (overall grade A- or higher) students.]
Fig 3 shows that a few very low-performing students claim to be actively engaged in classroom
discussions, but a larger number admit to being disconnected. There is no reason to argue this
is a causal connection; one cannot argue from this data alone that student participation
improves course grade. Many weaker students indicated they simply have nothing to say, or
are embarrassed to expose a lack of understanding to neighbors. A small number of better
students stated that they never talk in class, preferring to spend that time in quiet
contemplation. A majority of top students, however, report active participation in class
discussions.
[Figure 4: two histograms ("D/F students only" and "A students only") of online survey participation scores (18 max) in Phys 1120 (Sp '02), for students with the lowest and highest overall course grades. Credit was strictly for participation (2 pts/survey).]
For written participation, the story is more pronounced. Fig 4 shows that stronger students
submit replies more consistently, even though they are worth little credit (3% of course
grade). A small number of failing students also do so, but are equally likely to submit very
few. (The average participation score for D/F students was 50%, for A students it was 83%.)
Correlations between self-reported talking level and course grade (correlation coefficient, CC
= +.16), or participation level (the number of on-line survey questions they answered, which
has nothing directly to do with in-class behavior) and course grade (CC = +.43), are weak
but positive. Apparently, self-reported in-class talking levels are not a particularly strong
predictor of course grade. However, the positive correlation between on-line participation
and course grade implies there is a slight selection bias involved when drawing conclusions
about in-class participation based on students' written responses, since those responses preselect for students with slightly higher GPAs.
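The quoted CCs are ordinary (Pearson) correlation coefficients; the short sketch below shows the computation on invented placeholder arrays rather than the study's records.

```python
# Sketch: the quoted CCs are ordinary Pearson correlation coefficients.
# The arrays below are invented placeholder data, not the study's records.
import numpy as np

talking_level = np.array([2.0, 1.5, 0.5, 1.0, 2.0, 0.0, 1.5, 1.0])  # 0-2 rubric
online_score  = np.array([18,  16,   6,  10,  14,   4,  18,  12])   # out of 18
course_grade  = np.array([3.7, 3.0, 1.7, 2.3, 3.3, 1.0, 4.0, 2.7])  # GPA points

cc_talking = np.corrcoef(talking_level, course_grade)[0, 1]
cc_online  = np.corrcoef(online_score, course_grade)[0, 1]
print(f"talking level vs. grade:        CC = {cc_talking:+.2f}")
print(f"online participation vs. grade: CC = {cc_online:+.2f}")
```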
HYPOTHESES: Based on our collected responses, we have constructed a set of hypotheses
regarding the fade. Each provides a target for future course improvements or investigation.
1) Fatigue: Students are increasingly tired as the semester wears on, and the novelty of the
Concept Tests wears off with time.
2) Social issues:
A) Peer pressure. For many students, it is not "cool" to raise cards and participate actively.
Many students also fear appearing wrong or foolish.
B) High level of confusion. A confused student has little to say, and may also fear hearing
incorrect or misleading discussion from their peers.
C) Attendance in our large lectures tends to drop with time. Peer instruction is most effective
if students are close to each other. The noise level in the room is lower when there are fewer
people, adding to the faculty's sense that participation is fading.
3) Procedural/logistical aspects:
A) Lack of grade incentive. With hundreds of students in a lecture hall, using a low-tech
voting method means there is no easy means to grade classroom participation, hence
individuals are anonymous.
B) Lack of time: students who feel cut off too quickly may become discouraged. Students
can also learn that if they simply wait, they will still get the answer.
4) Student Epistemology: The idea that the process is important is not emphasized by the
very nature of multiple-choice questions. Why should a student bother with discussion if the
answer is all that matters?
5) Pedagogical issues: Problems (and the physics content itself) get harder as the semester goes on. Questions may be more involved, less basic, and perhaps less amenable to conceptual discussion.
ANALYSIS AND RECOMMENDATIONS: Re hypothesis #1 (“fatigue”): this a priori
obvious candidate is the prime assumption of faculty. Surprisingly, it was mentioned
infrequently in online surveys, and only a little more commonly in student focus groups (we
held two after the semester was over). Still, one might want to spice up concept tests as the
semester goes by, adding variety to maintain interest and enthusiasm.
Re #2 (“social issues”): A significant strength of Concept Tests lies in their social aspect; the
motivation of helping and learning from peers is compelling, and student comments indicated
social issues are significant. The fewer the students who participate, the more socially
awkward it becomes. Hence, the fade can feed on itself. Many students said they would not
participate if they weren't sitting with friends. Based on student feedback, in the second semester I provided, for the first time, an early exercise to introduce students to one another. Several comments indicated this had a positive effect.
Fear of embarrassment is also a factor in the fade. Many students commented that they were
unwilling to commit to an answer or discuss with a neighbor if they felt lost. As the semester
wore on, students stated their class preparation was weaker, and/or the material was getting
more challenging. After being wrong enough times, students tend to become "card-shy".
Combating this might require more strongly encouraging and supporting reading of assignments before the lecture, asking more frequent confidence-building questions, and adapting the style of concept tests as the semester progresses.
The lecturer's attitude will also play a key role in student participation. I observed a colleague
who took over a class where concept tests were successfully incorporated. He was slightly
uncomfortable with their use, and unconvinced of their value. Despite no other significant
changes in class structure (exams and homework continued to be conceptually focused), class
participation dropped nearly to zero within a few weeks.
Re #3 (“Procedural/logistical”): These issues are very serious due to our large class sizes and
limited resources. Without grade incentives, active explanation of classroom methodology on
the part of the teacher (which I call, with slight tongue in cheek, "class propaganda") may
help motivate students by making explicit a variety of reasons why participation helps
grades. Patterning exam questions after concept tests can help, although students access and
study questions outside of class too, so this might still not encourage in-class discussion.
Designing assessment that rewards process over answer is challenging in large classes, and
worth further study. Questions that require students to talk (e.g. demanding that students
"vote in blocks", so that they must first agree with their neighbors) can help.
The "insufficient time for discussion " issue is a significant results of the surveys. It plays a
larger role in my classroom than I had suspected: a quarter of all student comments
mentioned it without prompting. Very few complained that I was spending too much time on
discussion. An electronic system helps here, as the number of students answering appears on
the screen in real time. But there is an unfortunate overhead: you can't do "quickies" with the
electronic system.
Re #4 (“Student Epistemology”): A student who thinks that only the answer matters does
not yet understand what learning physics means. Developing "how do you know" questions
rather than "what is the answer" could help. Awareness of this issue might encourage the
professor to cajole, encourage, and demonstrate why participation is acceptable, useful, and
necessary. Asking students to explain the value of in-class discussions yielded many
insightful answers in the spring semester, and participation was not as "faded" as in the past.
Encouraging discussion, and teaching what “arguing” means to a physicist, are necessary
steps for the students to engage appropriately. Making questions open-ended would be
useful, although getting feedback is then more difficult. Not telling them the answer every
time might help too, although this can be challenging for class morale. Still, it fights the
tendency of students to wait out a discussion period to simply hear "the answer". Testing on
such questions may encourage participation as a way for them to save on outside study time.
Re #5 (“Pedagogical issues”) Students may not know how to discuss a problem if they are
still confused about a concept, and need to be guided. Breaking a complex question into parts
can help here. But, what kinds of questions best encourage student discussions? What is the
character of a "good" concept question? The question of identifying and classifying particular
concept questions as to their effect on student engagement is an extension of this work well
worth pursuing.
Conclusions: The use of concept tests is compelling; I can no longer imagine not using them. It is a shame that our students begin to feel a little dispirited near the end of a course, hence this investigation into the fade. Ultimately, it should be useful to learn how students feel about
concept tests and classroom engagement, in order to deepen their usefulness in the
classroom. It is unlikely such research will lead to a "trick" or simple solution. But learning
more about what induces students to participate actively in a large classroom setting is
invaluable. One of the most significant aspects of this study has been the introduction of
online written participation questions regarding pedagogical issues. This practice has broad uses, is
popular (extra credit), and yields valuable insight into student attitudes. In several cases, it
even allowed early "intervention" with potential problem students.
Another key result of this study has been the addition of what I call "propagandizing" in my
classroom. By this I mean making explicit the reasons behind classroom methodology. This
is emphasized by Mazur and others using Peer Instruction [Mazur97, Meltzer02]. I explained
why class participation is important for me and for them, and pushed participation actively.
This appears to have had some positive effect: participation levels were not as markedly
reduced at the end as in the past. Thus, the act of investigating this problem helped mitigate
the problem! Not much else has changed dramatically in my classroom, except the
"propaganda", slightly more frequent discussion of student's reasoning after concept tests,
and the questions I ask of them online, which implicitly indicate the importance I attribute to
Concept Tests. Making visible those aspects of learning which I want them to develop,
through active propaganda, and by asking students to think and write about the purpose and
value, may have improved their understanding and consequently their engagement level.
Many students indicated they were thrilled that I cared enough to ask and respond. This is a
case of "making a large class feel smaller", as are the Concept Tests themselves, surely a
central issue in student engagement, worth pursuing, understanding, and improving.
Acknowledgment: This work was undertaken as part of my participation in the Carnegie
Academy for the Scholarship of Teaching and Learning.
References:
Arons, A. 1997, "Teaching Introductory Physics", Wiley
Crouch, C., Mazur, E. 2001, American Journal of Physics 69, 970-977
Crouch, C. 2002, private communication
Fagen, A.P., Crouch, C., Mazur, E. 2002, Physics Teacher 40, 206-209
Hake, R. 1998, American Journal of Physics 66, 64-74
Hestenes, D., Wells, M., Swackhamer, G. 1992, Physics Teacher 30, 141-158
Maloney, D., O'Kuma, T., Hieggelke, C., Van Heuvelen, A. 2001, Phys. Educ. Res., American Journal of Physics Supplement 69, S12-23
Mazur, E. 1997, "Peer Instruction: A User's Manual", Prentice Hall. See also http://galileo.harvard.edu/
McDermott, L.C. 1991, American Journal of Physics 59, 301-315
McDermott, L.C. 1993, American Journal of Physics 61, 295-298
Meltzer, D., Manivannan, K. 2002, American Journal of Physics 70, 639-656
Novak, G., Patterson, E., Gavrin, A., Christian, W. 1999, "Just in Time Teaching", Prentice Hall. http://webphysics.iupui.edu/jitt/jitt.html
Wyckoff, S. 2001, Journal of College Science Teaching 30(5), 306-312