Innovations in Studio Physics at Rensselaer
Karen Cummings and Jeffrey Marx a)
Rensselaer Polytechnic Institute, Troy, NY 12180
Ronald Thornton and Dennis Kuhl
Center for Science and Mathematics Teaching
Tufts University, Medford, MA 02155
In 1993, Rensselaer introduced the first Studio Physics course. Two years later, the Force Concept
Inventory (FCI) was used to measure the conceptual learning gain <g> in the course. This was found to
be a disappointing 0.22, indicating that Studio Physics was no more effective at teaching basic Newtonian
concepts than a traditional course. Our study verified that result, <gFCI,98> = 0.18 ± 0.12 (s.d.), and
thereby provides a baseline measurement of conceptual learning gains in Studio Physics I for Engineers.
These low gains are especially disturbing because the studio classroom appears to be interactive and
instructors strive to incorporate modern pedagogies. The goal of our investigation was to determine if
incorporation of research-based activities into Studio Physics would have a significant effect on
conceptual learning gains. To measure gains, we utilized the Force Concept Inventory and the Force and
Motion Conceptual Evaluation (FMCE). In the process of pursuing this goal, we verified the
effectiveness of Interactive Lecture Demonstrations (<gFCI> = 0.35 ± 0.06 (s.d.) and <gFMCE> = 0.45 ± 0.03 (s.d.)) and Cooperative Group Problem Solving (<gFCI> = 0.36 and <gFMCE> = 0.36), and examined
the feasibility of using these techniques in the studio classroom. Further, we have assessed conceptual
learning in the standard Studio Physics course (<gFCI,98> = 0.18 ± 0.12 (s.d.) and <gFMCE,98> = 0.21 ± 0.05 (s.d.)). In this paper, we will clarify the issues noted above. We will also discuss difficulties in
implementing these techniques for first-time users, and implications for the future directions of the Studio
Physics courses at Rensselaer.
I. INTRODUCTION
Discussion of Studio Physics at Rensselaer
Introductory physics at Rensselaer
Polytechnic Institute is taught in a "studio"
format with nearly 1000 students per term
enrolling in Physics I or Physics II.1 The
defining characteristics of Studio Physics are
integrated lecture/laboratory sessions, small
classes of 30 to 45 students, extensive use of
computers, collaborative group work, and a high
level of faculty-student interaction. Each section
of the course is led by a professor or
experienced instructor, with help from one or
two teaching assistants. The teaching assistants' role is to circulate throughout the classroom while the students are engaged in group work.
There is currently no explicit training of
teaching assistants. As a result, there is great
variation in their effectiveness. Introductory
Studio Physics is a calculus-based, two semester
sequence equivalent to the standard physics for
engineers and scientists. Classes meet twice a
week for sessions lasting 110 minutes each. The
studio model has reduced the number of contact
hours – from 6.5 hours per week to less than 4
hours – without significantly reducing course
content. An expectation of some independent
learning on the part of the students has become
the norm.
The studio format was first introduced at
Rensselaer in 1993. During the Fall 1995 semester,
Cooper used the Force Concept Inventory (FCI)
to measure conceptual learning gains.2,3 The
fractional gain in conceptual learning, <g>, was
found to be a disappointing 0.22. This indicated
that the studio format was no more effective
than a traditional course structure in regard to
teaching for conceptual learning. The fractional
gain, <g>, is defined as follows:

<g> = (% correct post-instruction - % correct pre-instruction) / (100 - % correct pre-instruction)
This expression is often referred to in the
literature as the "g" or "Hake" factor, and is the
ratio of the actual gain to the maximum possible
gain.4 The low gain in student understanding in
Studio Physics classes was puzzling because the
studio classroom appeared to be interactive.
One noticeable difference between Studio
Physics and other interactive curricula is that
the activities used in the studio classroom are
predominately "traditional" activities adapted to
fit the studio environment and incorporate the
use of computers. Standard laboratory exercises
were simply modified to allow the data to be
collected or analyzed via the computer, without
making any real changes to the nature of the
exercise. For example, a traditional laboratory
activity which uses a spark timer or photogates
to measure the acceleration of an object in free
fall has been replaced with a video analysis
activity in which students measure the
acceleration of an object in free fall. In general,
the activities used are not based on the findings
of physics education research in that they do not
attempt to directly address known student
misconceptions and employ neither cognitive
conflict5 nor bridging techniques6.
Since the introduction of the studio format in
1993, a standard approach to instruction in these
courses has evolved. Although the Physics I and
Physics II courses are broken into a total of
approximately 20 class sections, taught by 10
different instructors, all students enrolled in a
given course follow the same syllabus, do the
same homework assignments, and take common exams as a single group, both during the semester and at finals. A standard course design
including daily lectures, in-class activities and
solutions, homework assignments and solutions,
and reading assignments is provided by a course
supervisor for use by all instructors. The course
supervisor is also responsible for exam
development. The motivation for this approach
is two-fold. First, it reduces redundancy in class
preparation. Second, it provides for consistency
in the material covered by the various
instructors. Nearly all instructors adhere to this
standard course design. Nevertheless, inherent
to the Studio Physics model is a certain
flexibility that enables motivated instructors to
include diverse approaches.
Methods of Inquiry
The physics education community
assiduously develops creative techniques for
engaging students in their own learning process.
Two of the authors7 were Studio Physics I
instructors during the Spring semester of 1998,
and merged two such techniques, Interactive
Lecture Demonstrations (ILD) 8 and Cooperative
Group Problem Solving (CGPS)9,10 , with the
standard set of activities in the Studio
classrooms at Rensselaer. These approaches are
discussed in more detail below. Our intent was
to ascertain the effectiveness of these techniques
while concurrently establishing the feasibility of
incorporating these methods into Studio
Physics.
As part of our study, we divided the Studio
Physics I sections into two broad categories –
experimental and standard. Standard classes
were taught by instructors who delivered the
"standard" studio instruction that was described
above. Experimental classes were taught by
instructors who modified the course design to
include either Interactive Lecture
Demonstrations, Cooperative Group Problem
Solving or both. There were seven standard
classes and five experimental classes. Two of
the experimental sections were taught by one of
the authors and the other three experimental
sections were taught by another author.7 The
seven standard classes were taught by four
different instructors, one of whom was the
course supervisor. There was no overlap
between the instructors for experimental and
standard sections. To offer some objective
measure of the conceptual learning gains in the
two groups, the authors administered two
diagnostic exams, the Force Concept Inventory3
and the Force and Motion Conceptual
Evaluation (FMCE)11, to every student both pre- and post-instruction.
II. REVIEW OF CURRICULA
As noted above, two techniques developed
by the physics education community were
incorporated into the standard studio model of
instruction on an experimental basis.
Interactive Lecture Demonstrations
Interactive Lecture Demonstrations (ILDs)
are an instructional technique developed by R.
K. Thornton and D. R. Sokoloff.8 They were
developed and refined based on the findings of
education research, and exploit the real-time
data acquisition and display powers of
microcomputer based laboratory (MBL) tools12.
Interactive Lecture Demonstrations were
designed to create an active learning
environment in large lecture classes, or any
class where providing equipment for student use
is an obstacle.
The steps currently prescribed by Thornton
and Sokoloff for use during all Interactive
Lecture Demonstrations are as follows:
1. The instructor describes and performs the
demonstration without the use of MBL
tools.
2. Students record their names and individual
prediction on a Prediction Sheet that is to
be collected at the end of the demonstration
period.
3. Students engage in small group discussion
about these predictions.
4. Students make any desired changes to their
predictions on the prediction sheet.
5. The instructor elicits common student
predictions from the class.
6. The instructor performs the demonstration
with MBL tools which allow the data to be
displayed to the entire class. Attention is
called to the most important features of the
graphs, and how they relate to the physical
situation.
7. Students record and discuss the results.
8. The instructor discusses analogous physical
situations.
Interactive Lecture Demonstrations have
been developed for a wide range of topics. The
Interactive Lecture Demonstration packet which
was commercially available in 1998 was used in
this study.13 The purchased package provided a
teacher's guide, computer files for the Mac or
PC, student Prediction Sheets and Result Sheets,
and teacher's presentation notes for each of four
Interactive Lecture Demonstration sequences.
These sequences cover kinematics (through
units on Human Motion and Motion with Carts)
and dynamics (through units on Newton's First
and Second Laws, and Newton's Third Law).
Each sequence takes between forty and fifty
minutes to perform. The commercially available
package was implemented "right out of the box"
at Rensselaer with little preparation by Cummings and Marx, who were motivated but had no prior experience in performing Interactive Lecture Demonstrations.
Deviations from the prescribed
implementation of the Interactive Lecture
Demonstrations occurred and were a
consequence of two overlapping areas of
difficulty. The first area of difficulty was the
instructors' inexperience performing the
Interactive Lecture Demonstrations which was
compounded by the use of new software and
hardware. Consequently, they paid less attention
than desired to following the pedagogical
suggestions set forth in the teacher's notes.
Especially notable were consistent failures to
make analogies to similar situations and to have
students discuss the result. Furthermore,
students were routinely allowed too much time
to make their predictions. This was primarily a
result of the instructors' desire to have every
student complete the prediction before
continuing. The additional prediction time was
counterproductive because some students lost
interest and moved on to other things, rather
than staying focused on the demonstration.
The second area of difficulty encountered
was performing Interactive Lecture
Demonstrations in physically small studio
classrooms with a student population
accustomed to interacting with one another and
their instructors. An important hindrance was
that the classrooms used have level floors, as
opposed to the raised tiers of seats present in
most lecture halls. Consequently, students in the
back of the room had trouble seeing the
demonstration. Additionally, since student-faculty interaction is the norm, instructors answered questions on an individual basis that would have been more constructive had they been asked and answered publicly.
Furthermore, students in the studio courses at
Rensselaer have begun to expect interaction
with their peers, and hence tended to share
predictions too soon, or even to make initial
predictions as small groups rather than
individually. As a result, some students never
made a personal "intellectual commitment" to
their predictions. It was also routinely difficult
for instructors to elicit incorrect answers from
the class. This made discussion of common
misconceptions awkward.
Cooperative Group Problem Solving
Cooperative Group Problem Solving (CGPS)
is a strategy for peer instruction in quantitative
problem solving which was developed by P.
Heller et al.9,10 This instructional approach
involves the following:
1. Formation of well-functioning cooperative
groups
2. Presentation, modeling and reinforced use
of an explicit and effective problem solving
strategy
3. Use of "context-rich" problems for group
practice
4. Rewarding cooperative learning through
grading
In order to facilitate learning, instructors
organize teams of three students in which high-,
average-, and under-achievers are represented.
(Students are not informed of this grouping
procedure.) Further, students are informed about
the roles and responsibilities taken on by
members of well-functioning cooperative
groups and are encouraged to assign these roles
to members of their group. In this study, a
student's initial achievement level was based on
the results of conceptual diagnostic tests
(FMCE and FCI), and groups were rearranged
every three to five weeks following each course
exam.
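
To make the grouping procedure concrete, the sketch below forms mixed-achievement triads from pre-test scores. It is our illustration of the idea, not the authors' actual procedure; the student names and scores are invented.

```python
# A minimal sketch of mixed-achievement grouping (our illustration, not the
# authors' procedure). Students are ranked by diagnostic pre-test score and
# split into thirds; each group draws one member from each third.

def form_groups(scores):
    """scores: dict mapping student name -> pre-test score (%)."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    third = len(ranked) // 3
    high, middle, low = ranked[:third], ranked[third:2 * third], ranked[2 * third:]
    # Pair off one student from each achievement band; with a class size not
    # divisible by three, the leftovers would need hand placement.
    return list(zip(high, middle, low))

scores = {"Ana": 85, "Ben": 72, "Cai": 64, "Dee": 55, "Eli": 47, "Fay": 30}
print(form_groups(scores))  # [('Ana', 'Cai', 'Eli'), ('Ben', 'Dee', 'Fay')]
```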
A key tenet of Cooperative Group Problem
Solving is that students, as opposed to expert
problem solvers, have not yet developed a
generalized strategy for quantitative problem
solving. Hence, students are explicitly taught
such a strategy during the first days of the
course. The strategy adopted for use in the
experimental group was the same as that used at
the University of Minnesota14 and involves a
five-step process described as follows:
1. Comprehend the problem by drawing a
picture and noting the given information.
2. Represent the problem in formal terms by
translating the picture into a diagram or
graph.
3. Decide on the general approach and write
down equations that you believe
appropriate.
4. Combine equations in such a way as to
solve for the target quantity before
substituting in the given information.
Check to make sure that units are
consistent. If units are consistent, substitute
in the given information and solve for a
numerical answer.
5. Check to make sure that your numerical
answer is reasonable, that you have
answered the correct question and that units
are correct.
The problem-solving strategy outlined above is
modeled by the instructor during lectures and
then practiced by students in their group.
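
As a brief illustration of the five steps (our example, not one of the Minnesota context-rich problems), consider a ball dropped from rest from a height of 20 m, with the speed just before impact as the target quantity:

```latex
% A worked pass through the five-step strategy for a simple free-fall problem.
\begin{enumerate}
  \item Picture: a ball falling; given $h = 20\,\mathrm{m}$, $v_0 = 0$,
        $g = 9.8\,\mathrm{m/s^2}$.
  \item Diagram: take $+y$ downward with the origin at the release point.
  \item Approach: constant-acceleration kinematics, $v^2 = v_0^2 + 2gh$.
  \item Solve symbolically first: $v = \sqrt{2gh}$; the units
        $\sqrt{(\mathrm{m/s^2})(\mathrm{m})} = \mathrm{m/s}$ are consistent,
        so substitute: $v = \sqrt{2(9.8)(20)} \approx 19.8\,\mathrm{m/s}$.
  \item Check: roughly $20\,\mathrm{m/s}$ for a five-story drop is reasonable,
        the speed (not the time) was asked for, and the units are correct.
\end{enumerate}
```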
Deviations from the prescribed approach
discussed above resulted predominately from
testing and grading constraints within the Studio
Physics course structure. As previously
mentioned, students in all sections took
common exams throughout the semester. The
course supervisor frequently included material
based on the standard activities on these exams.
Hence, the investigators felt that it was
imprudent to completely displace these
activities. Instead, they opted to spend one class period per week on Cooperative Group Problem Solving using recitation-style context-rich
problems available from the University of
Minnesota14, and one class period per week
doing the same activity that students in the
standard sections did. Further, context-rich
problems were not included on exams, although
part of the students’ class activity grade was
based on their cooperative group problem
solving work.
The investigators encountered three other
major difficulties. The first was that due to time
constraints, the instructor in the Cooperative
Group Problem Solving sections did not model
the problem solving technique as often as
desired. The two other difficulties we
encountered appear to be more general in
nature. The first was student resistance to
assignment of "roles" within the group. Several
students expressed that they were very
uncomfortable with this process. As a whole,
the students could not be encouraged to adopt
roles within the groups, and hence this aspect of
the technique soon died out. The second general difficulty we encountered was that the problem-solving procedure outlined above is typically unnecessary when solving textbook-style
homework problems like those assigned in the
Studio Physics course at Rensselaer. Students
were required to use the problem-solving
procedure on all their homework assignments
and resented having to use this procedure if they
could easily have solved the problem without it.
III. ASSESSMENT AND EVALUATION
As mentioned above, the authors
administered two diagnostic examinations, the
Force Concept Inventory and Force and Motion
Conceptual Evaluation (FMCE). These two
exams were labeled as "Part A" and "Part B" of
a single exam packet, and both exams were
given pre- and post-instruction. Pre-instructional testing was done during the first
class session. The authors allotted 60 minutes
for the students to finish the two exams – 25
minutes for the Force Concept Inventory and 35
minutes for the Force and Motion Conceptual
Evaluation. The time allotted for each exam was
determined by dividing the total exam time (60 minutes) by the total number of questions (77) and then multiplying by the number of questions on the particular exam (47
for the Force and Motion Conceptual Evaluation
and 30 for the Force Concept Inventory). The
latest version of the Force Concept Inventory
was used. The version of Force and Motion
Conceptual Evaluation used had 43 questions
on force and motion topics and 4 questions on
the conservation of energy. This yielded an
allotted time which was less than that suggested
by the exams' authors. However, most students
finished both exams and all were given the same
amount of time to work on the exams during
pre- and post-testing periods. One of the authors
was present for every administration of the
exam. Post-instructional testing was done after
all of the relevant material had been covered in
the course. This was approximately two-thirds
of the way through the semester, or about ten
weeks after pre-instructional testing.
In the spring 1998 semester, studio classes
were divided into the categories discussed
below based on the nature of the instruction
they received. The division of students into
these categories was essentially a random
assignment. Students chose to enroll in a
particular section of the course based on
scheduling issues and before any information as
to which professor would be assigned to teach
the class became available. Division of class
sections into these categories was based on the
section instructor’s willingness to experiment
with new methods and materials.
A. Standard Group
Seven class sections comprised the Standard
Group. These students were taught the standard
studio course. In this model, the first 30 minutes
of class was devoted to answering questions and
working problems on the board. The next 10-20 minutes were devoted to a brief lecture,
complete with derivations and examples. The
remainder of class time was used by the
students to work on an in-class assignment
based on the day's material. The scope of these
assignments ranged from pen and paper
exercises to spreadsheet exercises to computer-based laboratories. For the most part, students
were able to complete the in-class assignments
before the class hour was through. Some
instructors found activities to occupy the
students' time; others simply let students leave
early.
B. Experimental Group
Five sections were taught by either
Cummings or Marx and comprised the
Experimental Group. This group of students had
an instructional experience which was
predominately the same as that of the standard
group. However, these groups were also
exposed to Interactive Lecture Demonstrations,
Cooperative Group Problem Solving or both.
Students in sections 4 and 11 were given all
four Interactive Lecture Demonstration
sequences. Students in sections 6 and 8 did a
simplified version of the "human motion"
Interactive Lecture Demonstrations in small
groups as an in-class activity. They were then given the last three Interactive Lecture Demonstration sequences. A Cooperative Group Problem Solving model was implemented in sections 4 and 9. Sections 6 and 8 were given context-rich problems as extra, in-class activities on three occasions throughout the semester. Table I summarizes the breakdown of the experimental groups.

Table I: Instructional techniques used in experimental sections. Sections referred to as "experimental" are those in which either Interactive Lecture Demonstrations (ILDs), Cooperative Group Problem Solving (CGPS), or both techniques were used.

Section   Full ILD Sequence   Incomplete ILD Sequence   CGPS
   4              X                                       X
   6                                     X
   7                                     X
   8                                     X
   9                                                      X
  11              X
Primarily, experimental activities were done
in place of the work performed by the standard
group. On the occasions that time allowed,
they were done in addition to that work. Hence,
we estimate that the inclusion of these activities
resulted in an increase in instruction time of
about 1% for Interactive Lecture
Demonstrations and about 5% for Cooperative
Group Problem Solving. The additional time
came about because the experimental sections
did not leave class early, while the standard
sections occasionally did. The topics covered by
the experimental and standard groups were
identical; the two groups remained
synchronized throughout the semester and took
common exams.
C. Intermediate Group
Inclusion of a group of students who could
act as an indicator of instructor influence,
separating the effects of the instructional
techniques and curricular materials from the
influence of the instructor (an instructor control
group), was not part of this study's design.
Nevertheless, such a group serendipitously
formed. Section 7 began the semester as a
standard section, taught by a professor other
than one of the authors. However, due to low
enrollment in this class and in section 8 (which
ran concurrently in another room), section 7 was
merged with section 8 approximately three
weeks into the semester. From this point on,
these students were taught by one of the
authors. Aside from having missed the first three of the four Interactive Lecture Demonstration sequences, these students had an educational experience identical to that of the students who began the semester in section 8. Hence, we do not consider section 7 to be part of the Standard or Experimental Group; rather, it serves as a weak control for the influence of the authors on the outcome. We will refer to section 7 as the
Intermediate Group.
IV. RESULTS AND DISCUSSION

Figure 1 provides an overview of the results of this investigation. We have assessed conceptual learning in the standard Studio Physics course during the Spring 1998 semester to be <gFCI> = 0.18 ± 0.12 (s.d.) and <gFMCE> = 0.21 ± 0.05 (s.d.). In the studio sections in which Interactive Lecture Demonstrations were performed we found <gFCI> = 0.35 ± 0.06 (s.d.) and <gFMCE> = 0.45 ± 0.03 (s.d.). In studio sections in which Cooperative Group Problem Solving was used we found <gFCI> = 0.36 and <gFMCE> = 0.36. The fractional gains discussed here, represented by the height of each bar in Figure 1, are the <g> factor discussed in the introduction. In this analysis, we considered only students for whom there were both pre- and post-test scores (matched samples).

Figure 1: Fractional gain on the Force and Motion Conceptual Evaluation (FMCE) and the Force Concept Inventory (FCI) for groups of students having had various instructional experiences (Standard, Intermediate, ILDs, CGPS, and CGPS+ILDs).
The fractional gain for a group of students
was calculated using the average of post-test
scores for the group and the average of pre-test
scores for the group. (This is referred to as
"calculating the gain on averages".) For two
reasons, we chose to calculate average gain in
this manner, rather than to average the
individual student gains (referred to as
“calculating the average of the gains”). First, it
allowed us to keep students who achieved a
100% correct score on the pretest in the study.
Individual gains cannot be calculated for such
students, and so they cannot be included in the
investigation if one chooses to calculate the
average of individual gains. Second, calculating
the average in this way reduces the skewing
which occurs when students who pre-test with
quite high scores then post-test with somewhat
lower scores. The error bars shown in Figure 1
represent the standard deviation of the averages
of class sections, with each section weighted
equally.
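
The distinction between the two averaging methods can be made concrete with a short sketch. The code below is our illustration with invented scores, not the authors' analysis code; it shows how a student who pre-tests high and slips slightly drags down the average of individual gains while barely affecting the gain on averages.

```python
# A minimal sketch contrasting the two ways of averaging the gain <g>
# discussed above. Scores are percentages; the data are made up.

def gain_on_averages(pre, post):
    """<g> computed from class-average pre- and post-test scores."""
    pre_avg = sum(pre) / len(pre)
    post_avg = sum(post) / len(post)
    return (post_avg - pre_avg) / (100.0 - pre_avg)

def average_of_gains(pre, post):
    """Average of individual gains; undefined for a 100% pre-test score."""
    gains = [(b - a) / (100.0 - a) for a, b in zip(pre, post) if a < 100.0]
    return sum(gains) / len(gains)

pre  = [40.0, 55.0, 90.0]
post = [70.0, 75.0, 85.0]   # note the high pre-tester who slipped slightly

print(gain_on_averages(pre, post))   # ~0.39: barely affected by the slip
print(average_of_gains(pre, post))   # ~0.15: dragged down by the negative gain
```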
Every question on the Force Concept
Inventory was considered, and equally
weighted, in calculation of pre-test and post-test
scores on this exam. In contrast, several
questions on the Force and Motion Conceptual
Evaluation were disregarded in score
calculations. The disregarded questions are
those which have been found to be important for
identifying students’ conceptual models.
Hence, they remain a part of the assessment.
However, these questions are not appropriate
for inclusion in an overall measurement of the
level of student understanding. Additionally,
several groups of closely related questions on
the Force and Motion Conceptual Evaluation
are considered as a unit when calculating a total
score on the exam. 11,15 This method of
calculating a single number score on the Force
and Motion Conceptual Evaluation was done on
the advice of the exam’s developers and is
discussed in detail in ref. 15.
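
To illustrate the flavor of this scoring scheme, the sketch below scores a set of FMCE responses with some questions excluded and some clusters of related questions counted as single units. The particular question numbers and clusters used here are hypothetical placeholders, not the published scheme, which is specified in ref. 15.

```python
# A minimal sketch of scoring with excluded questions and grouped questions.
# EXCLUDED and GROUPS below are hypothetical placeholders, not the actual
# FMCE scheme from ref. 15.

EXCLUDED = {5, 15, 33}                 # hypothetical: diagnostic-only items
GROUPS = [{8, 9, 10}, {11, 12, 13}]    # hypothetical: clusters scored as units

def fmce_score(correct):
    """Percentage score given the set of question numbers answered correctly."""
    grouped = set().union(*GROUPS)
    units = earned = 0
    # Each cluster counts as one unit, earned only if every question in it
    # is answered correctly.
    for group in GROUPS:
        units += 1
        if group <= correct:
            earned += 1
    # Remaining questions count individually, except the excluded ones.
    for q in range(1, 48):             # the FMCE version used had 47 questions
        if q in EXCLUDED or q in grouped:
            continue
        units += 1
        if q in correct:
            earned += 1
    return 100.0 * earned / units

print(fmce_score({1, 2, 8, 9, 10, 20}))  # 10.0 with these placeholder sets
```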
Average fractional gain for each section on
the Force and Motion Conceptual Evaluation
and Force Concept Inventory is shown in
Figures 2a and 2b respectively. Standard
sections are on the left-hand side while
experimental sections are on the right-hand side;
between them is the intermediate section
(section 7). Standard and experimental group
averages for the Force and Motion Conceptual
Evaluation and the Force Concept Inventory are
indicated by a thin, horizontal line spanning
their respective groups. On both exams the
average gain for the experimental group was
approximately twice that of the standard group.
Moreover, the lowest-scoring experimental section was at least one standard deviation away from the average of the standard group. Table II contains the section-by-section data.
Figure 2A: Fractional gain on the FMCE by class section number. Standard sections are on the left; experimental
sections are on the right. The average gain with standard deviation is indicated for each of the two categories.
Figure 2B: Fractional gains on the FCI by class section number. Standard sections are on the left; experimental
sections are on the right. The average gain with standard deviation is indicated for each of the two categories.
Table II: Section-by-section data for the Force and Motion Conceptual Evaluation (FMCE) and the Force Concept Inventory (FCI). Pre- and post-test averages are percentages; <g> is the average fractional gain.

Group          Section   FMCE N   FMCE Pre   FMCE Post   FMCE <g>   FCI N   FCI Pre   FCI Post   FCI <g>
Standard          1        27       37.3       55.1        0.28       28      49.6      63.9       0.28
Standard          2        17       25.4       44.1        0.25       18      43.0      51.7       0.15
Standard          3        27       25.7       39.9        0.19       28      42.1      54.8       0.22
Standard          5        28       43.2       51.7        0.15       29      61.0      59.0      -0.05
Standard         10        30       31.8       43.0        0.17       31      42.0      56.6       0.25
Standard         12        23       43.6       55.8        0.22       24      57.5      63.9       0.15
Standard         13        32       40.9       54.9        0.24       33      53.7      66.2       0.27
Intermediate      7        13       34.8       54.3        0.30       13      56.7      66.7       0.23
ILD               6        39       37.3       63.7        0.42       40      53.3      69.8       0.35
ILD               8        22       33.0       65.1        0.48       23      52.2      66.4       0.30
ILD              11        40       29.6       60.7        0.44       41      46.3      68.2       0.41
ILD+CGPS          4        30       32.2       67.6        0.52       28      50.1      66.8       0.33
CGPS              9        32       40.3       61.6        0.36       33      49.9      68.0       0.36
Standard Sections (N = 184)
  Increased 20% or more:           33.7%
  Increased from 10% to 20%:       18.5%
  No change, to increased by 10%:  31.0%
  Decreased up to 10%:             14.7%
Figure 3A: Post- versus pre-test score on the Force and Motion Conceptual Evaluation (FMCE) for students in
standard sections. The size of the bubble indicates the number of students represented by the point. The lines
shown are lines of constant gain. The lowest of the four lines shown corresponds to a gain of –0.20, the line which
passes through the origin corresponds to a gain of zero, and the highest line corresponds to a gain of + 0.40. The
associated table indicates the percentage of students who increased their exam score by the percentage shown.
Experimental Sections (N = 163)
  Increased 20% or more:           63.8%
  Increased from 10% to 20%:       18.4%
  No change, to increased by 10%:  14.7%
  Decreased up to 10%:              2.5%
Figure 3B: Post- versus pre-test score on the Force and Motion Conceptual Evaluation (FMCE) for
students in experimental sections. The size of the bubble indicates the number of students represented by
the point. The lines shown are lines of constant gain. The lowest of the four lines shown corresponds to a
gain of –0.20, the line which passes through the origin corresponds to a gain of zero, and the highest line
corresponds to a gain of + 0.40. The associated table indicates the percentage of students who increased
their exam score by the percentage shown.
It is interesting to note results for the
intermediate group and compare them to those
for Section 8. Despite the fact that these two
groups had the same instructor, and were
students in the same class for most of the
semester, the intermediate group had conceptual
11
learning gains which were more in line with the
standard studio sections than with Section 8 or
other experimental sections. Recall that Section
8 had a complete sequence of Interactive
Lecture Demonstrations, while the intermediate
group did not.
Figure 3 shows scatter plots of students' post-test score versus their pre-test score on the
Force and Motion Conceptual Evaluation.
Figure 3a shows this result for the standard
sections and Figure 3b shows this result for the
experimental sections. The size of the bubble
indicates how many students fell on the same
coordinate. The smallest bubble indicates that
one student had that set of scores, the next
largest bubble means there were two students on
that coordinate, and so on. Since there was a
smaller number of students in the experimental
group, the tables below the graphs indicate the
percentage of students that raised (or lowered)
their grade by the amount indicated from pre- to
post-test. The diagonal lines shown are lines of
constant gain. It is apparent from these graphs
that students in the experimental group did
better in terms of absolute gains on the Force
and Motion Conceptual Evaluation. A strikingly
similar trend was seen for the Force Concept
Inventory results. Furthermore, when we plotted
these data for either exam, we found that there
were fewer experimental students around or
below the zero gain line. There were also many
more students in the upper left-hand region (low
pre-test scores and gains of over 50%) in the
experimental group than in the standard group.
Upon viewing the data in Figure 3, we note that weaker students (i.e., students with low pre-test scores) benefited from being in the
experimental sections. What about the strongest
students (i.e. those with high pre-test scores)?
Figure 4 is a graph of the average <g> for
students divided into groups based on whether
their pre-test scores were in the upper, middle or
lower third of the entire pool for their group
(experimental or standard). Figure 4a
represents this result on the Force and Motion
Conceptual Evaluation and Figure 4b is for the
Force Concept Inventory. These figures clearly
show that all students, whether they pre-tested
high, middle or low, benefited from these
experimental teaching techniques. This result
was consistent on both exams. The correlation
coefficient between pretest score and gain, for
the entire group of students (experimental and
standard taken together, N = 347), is -0.06 for
the Force Concept Inventory and + 0.16 for the
Force and Motion Conceptual Evaluation.
Somewhat stronger correlations seem to exist
for subsets of the population.
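
For reference, the correlation quoted above is the ordinary Pearson coefficient between each student's pre-test percentage and his or her individual gain. The sketch below is our illustration with invented data, not the authors' analysis code.

```python
# A minimal Pearson correlation between pre-test score and individual gain.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: pre-test percentages and the matching student gains.
pre   = [30.0, 45.0, 50.0, 62.0, 80.0]
gains = [0.55, 0.20, 0.35, 0.25, 0.40]
print(round(pearson_r(pre, gains), 2))  # -0.29 for this made-up data
```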
V. CONCLUSION
Overall this study implies that the standard
studio format used for introductory physics
instruction at Rensselaer is no more successful
at teaching fundamental concepts of Newtonian
physics than traditional instruction. The average
<g> on the Force Concept Inventory reported
here for standard studio sections falls within the
range of earlier reported values for traditionally
taught courses.4 This result is disappointing in
light of the fact that Rensselaer has expended
the effort and resources necessary to break up
large (500+ student) classes into small (35-45
student) sections. Rensselaer has introduced
group work and computer use as components of
in-class instruction. Furthermore, lecture time
has been reduced. In general, the Studio Physics
classrooms appear to be interactive and students
seem to be engaged in their own learning.
Nevertheless, use of the studio format alone
does not produce improvement in conceptual
learning scores as compared to those measured
on average in a traditionally structured course.
The implication of this study, that ostensibly interactive classrooms do not necessarily result in above-average levels of conceptual learning, corroborates the work of others. For example,
Redish, Saul and Steinberg found that even
lectures “with much student interaction and
discussion” had very little impact on student
learning.16 After lengthy investigations, Kraus
reported:
In many of our efforts to improve student
understanding of important concepts, we have
been able to create an environment in which
students are mentally engaged during lecture.
While we have found this to be a necessary
condition for an instructional intervention to be
successful, it has not proved sufficient. Of
equal importance is the nature of the specific
questions and situations that students are asked
to think about and discuss.17
Figure 4A: Average <g> on the Force and Motion Conceptual Evaluation (FMCE) for
students divided into groups based on whether their pre-test scores were in the upper,
middle or lower third of the entire pool for their group (experimental or standard).
Figure 4B: Average <g> on the Force Concept Inventory (FCI) for students divided
into groups based on whether their pre-test scores were in the upper, middle or lower
third of the entire pool for their group (experimental or standard).
However, introduction of research-based
techniques and activities does have clear
beneficial effects. Interactive Lecture
Demonstrations generated significant gains in
conceptual understanding with remarkably little
instructional time. Cooperative Group Problem
Solving resulted in similar conceptual learning
gains and seemed also to provide a mechanism that fostered improved quantitative problem-solving skills.
Students in Cooperative Group Problem
Solving sections not only had significant gains
on the Force and Motion Conceptual Evaluation
and Force Concept Inventory but also
performed better on the problem solving section
of the last course exam. Nevertheless,
implementing Cooperative Group Problem
Solving required a semester-long commitment
on the part of the instructor.
As a result of our investigations, we are
optimistic about the future of the Studio Physics
program at Rensselaer. The entire infrastructure
necessary for true interactivity in the classroom
is in place; we feel we need only to adopt
research-based student activities.
a) Current address: Department of Physics, University of Oregon, Eugene, Oregon.

1. J. Wilson, "The CUPLE Physics Studio," Phys. Teach. 32 (12), 518-522 (1994).
2. M. A. Cooper, An Evaluation of the Implementation of an Integrated Learning System for Introductory College Physics, Ph.D. thesis, Rutgers, The State University of NJ, 1993.
3. D. Hestenes, M. Wells, and G. Swackhamer, "Force concept inventory," Phys. Teach. 30 (3), 141-158 (1992).
4. R. Hake, "Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses," Am. J. Phys. 66 (1), 64-74 (1998).
5. P. W. Hewson and M. G. A'Beckett-Hewson, "The role of conceptual conflict in conceptual change and the design of science instruction," Instr. Sci. 13, 1-13 (1984).
6. J. Clement, "Using bridging analogies and anchoring intuitions to deal with students' preconceptions in physics," J. Res. Sci. Teach. 30 (10), 1241-1257 (1993).
7. K. Cummings and J. Marx were instructors in Studio Physics I for Engineers during the Spring 1998 semester, and experimented with the use of research-based activities in the Studio classroom. Cummings taught sections 4, 9, and 11. Marx taught sections 6 and 8 as well as a weak control group, section 7.
8. See, for example, D. Sokoloff and R. Thornton, "Using Interactive Lecture Demonstrations to Create an Active Learning Environment," Phys. Teach. 35 (10), 340-347 (1997).
9. P. Heller, R. Keith, and S. Anderson, "Teaching problem solving through cooperative grouping. Part 1: Group versus individual problem solving," Am. J. Phys. 60 (7), 627-636 (1992).
10. P. Heller and M. Hollabaugh, "Teaching problem solving through cooperative grouping. Part 2: Designing problems and structuring groups," Am. J. Phys. 60 (7), 637-644 (1992).
11. R. Thornton and D. Sokoloff, "Assessing student learning of Newton's laws: The Force and Motion Conceptual Evaluation and the evaluation of active learning laboratory and lecture curricula," Am. J. Phys. 66 (4), 338-352 (1998).
12. R. K. Thornton and D. R. Sokoloff, "Learning motion concepts using real-time microcomputer-based laboratory tools," Am. J. Phys. 58 (9), 858-867 (1990).
13. Mechanics Interactive Lecture Demonstration Package (ILD), Vernier Software, 8565 S.W. Beaverton-Hillsdale Hwy., Portland, Oregon 97225-2429, 503-297-5317.
14. "Instructor's Handbook," identified as "TA Orientation," School of Physics and Astronomy, Fall 1997. See also http://www.physics.umn.edu/groups/physed/
15. K. Cummings, D. Kuhl, J. Marx, and R. Thornton, "Comparing the Force Concept Inventory and the Force and Motion Conceptual Evaluation," to be submitted, Am. J. Phys. (1999).
16. E. F. Redish, J. M. Saul, and R. N. Steinberg, "On the effectiveness of active-engagement microcomputer-based laboratories," Am. J. Phys. 65, 45-54 (1997).
17. P. A. Kraus, Promoting Active Learning in Lecture-Based Courses: Demonstrations, Tutorials, and Interactive Tutorial Lectures, Ph.D. dissertation, University of Washington, 1997. University Microfilms, UMI Number 9736313.