
Why should we change how we teach physics?
Derek Muller & Manju Sharma
Sydney University Physics Education Research (SUPER)
June 16, 2006
Conceptual Inventories
• Tests to evaluate conceptual physics understandings
– Force Concept Inventory
– Force and Motion Conceptual Evaluation
– Mechanics Baseline
• Developed based on interviews/surveys
• Used before and after courses to assess effectiveness
Force Concept Inventory
• Do the questions ask about important conceptual topics?
• Is the wording appropriate?
• Is the test easier/more difficult than a standard mechanics test or the HSC?
• Do you think a student could understand Newton’s laws well and score poorly?
• Do you think a student without a good conceptual understanding could score well?
How would your students do?
• How would year 11s fare on this test before instruction?
• After instruction?
• How would year 12s go?
• First year Uni? Fundamentals, Regular, Advanced?
[Response scale on slide: 20%, 40%, 60%, 80%, 100%]
The facts
• Before instruction, students average 20-30%
• After traditional lecturing or instruction, most students gain 10-20%, with a maximum of 30%
Sydney University 2006
Confidence?
Questions
• Why do students do so poorly?
• Why do they think that they’re going well?
diSessa (1996) Early Interviews
diSessa (1996) cont.
diSessa (1996) Final interviews
Misconceptions
• Long history in physics education research (many documented)
• Strategies devised for changing misconceptions
– Tutorials
– Studio physics
– Peer Instruction
– Interactive Lecture Demonstrations
– Interactive Engagement Lectures
What makes these methods more effective?
• Students paying more attention?
• Actual tangible contexts?
• Discussion with other students?
• Behavioral activity?
• Misconceptions raised?
• Slower pace?
• Less math?
• Teaching to the test?
Research Questions
• Do students learn more by watching other students discuss misconceptions? (no activity required)
• Do students learn more by just hearing common misconceptions raised and refuted?
• Are students confused when misconceptions are raised in instruction?
• Will addressing misconceptions increase the effectiveness of a multimedia segment?
• Does student prior knowledge matter?
Experiment Design
• Four treatments created to explain Newton’s First and Second Laws
• Administered through a website (QuickTime videos) with pre-post testing
• All first year students (~800) asked to participate for one assignment mark (Fundamental, Regular, Advanced)
Four Treatments
Treatment           | Number of speakers | Length | Addresses misconceptions
Exposition          | 1                  | 7:02   | No
Extended Exposition | 1                  | 11:22  | No
Refutation          | 1                  | 9:33   | Yes
Dialogue            | 2                  | 11:22  | Yes
Results
Filter Criteria                                   | Removed | Remaining Sample
All data                                          | –       | 678
Failed to complete the post-test                  | 116     | 562
Watched more than one treatment                   | 75      | 487
Failed to watch entire treatment                  | 30      | 457
Completed pre or post test in under four minutes  | 57      | 400
Failed to answer all questions                    | 6       | 394
Scored higher than 95% on the pre-test            | 30      | 364

• Create measure of gain (Gain = Post-test – Pre-test)
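A minimal sketch of this exclusion pipeline, assuming the responses sit in a pandas DataFrame. Every column name below (completed_post, treatments_watched, watched_full, pre_minutes, post_minutes, answered_all, pretest) is a hypothetical placeholder, not a variable from the study:

```python
import pandas as pd

def apply_exclusions(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the exclusion criteria from the Results slide, in order.
    Column names are illustrative assumptions only."""
    criteria = [
        ("Failed to complete the post-test",          df["completed_post"]),
        ("Watched more than one treatment",           df["treatments_watched"] <= 1),
        ("Failed to watch entire treatment",          df["watched_full"]),
        ("Pre or post test completed in under 4 min", (df["pre_minutes"] >= 4) & (df["post_minutes"] >= 4)),
        ("Failed to answer all questions",            df["answered_all"]),
        ("Scored higher than 95% on the pre-test",    df["pretest"] <= 0.95 * 26),
    ]
    keep = pd.Series(True, index=df.index)
    print(f"All data: {len(df)}")
    for label, mask in criteria:
        keep &= mask
        print(f"After removing '{label}': {int(keep.sum())} remaining")
    return df[keep].copy()
```

Applied in the same order as the table, this kind of pipeline would reproduce the 678-to-364 sample trail shown above.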
To see the video treatments
• The videos are available in QuickTime and Windows Media video formats through the following web links:
– Exposition
– Extended Exposition
– Refutation
– Dialogue
• Keep in mind these are research tools produced in a very short time frame
Data analysis
• Simple measure of improvement for each student: Gain = Posttest – Pretest
• All values are actual numbers of questions correct out of 26
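As a worked illustration of this measure (a hypothetical student, not data from the study): moving from 10/26 on the pre-test to 15/26 on the post-test gives a gain of 5 questions.

```python
def gain(pretest: int, posttest: int, max_score: int = 26) -> int:
    """Raw gain: Gain = Posttest - Pretest, with both scores counted out of 26."""
    assert 0 <= pretest <= max_score and 0 <= posttest <= max_score
    return posttest - pretest

print(gain(10, 15))  # 5
```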
Gain for Fundamental Students
Gain for Regular Students
Gain for Advanced students
Do the treatments have different effects?
ANOVA
Treatment           | Sample Size (n) | Mean Gain | Standard Deviation | K-S Z | K-S p
Dialogue            | 92              | 4.77*     | 4.59               | 1.057 | .214
Refutation          | 86              | 4.41*     | 4.01               | 0.914 | .373
Extended exposition | 95              | 2.41      | 3.72               | 1.300 | .068
Exposition          | 91              | 1.77      | 2.65               | 1.075 | .198
Effect size = (difference in means)/(standard deviation) = 0.83 and 0.79 for Dialogue and Refutation respectively.
The K-S statistics indicate that the gain distributions are not significantly different from normal, so ANOVA is a reliable analysis tool in this case.
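A hedged sketch of how the comparison on this slide could be run with SciPy, given per-student gain arrays for each treatment group. The group names and variables are assumptions; the K-S test here fits a normal distribution to each group's own mean and SD, and the effect size uses the pooled-SD (Cohen's d) convention, since the slide does not say exactly which SD was used:

```python
import numpy as np
from scipy import stats

def analyse_gains(groups: dict) -> None:
    """One-way ANOVA across treatment groups plus a per-group normality check,
    mirroring the table above. `groups` maps treatment name -> array of gains."""
    for name, g in groups.items():
        g = np.asarray(g, dtype=float)
        # Kolmogorov-Smirnov test against a normal fitted to this group's mean/SD
        stat, p = stats.kstest(g, "norm", args=(g.mean(), g.std(ddof=1)))
        print(f"{name}: n={len(g)}, mean={g.mean():.2f}, SD={g.std(ddof=1):.2f}, K-S p={p:.3f}")
    f_stat, p = stats.f_oneway(*[np.asarray(g, dtype=float) for g in groups.values()])
    print(f"One-way ANOVA: F = {f_stat:.2f}, p = {p:.4f}")

def cohens_d(a, b) -> float:
    """Effect size: (difference in means) / (pooled standard deviation)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    pooled_var = ((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)) / (len(a) + len(b) - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)
```

With the study's data this would be called as, for example, analyse_gains({"Dialogue": dialogue_gains, "Refutation": refutation_gains, ...}) and cohens_d(dialogue_gains, exposition_gains), where the gain arrays are loaded from the filtered sample.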
Does Gain Depend on Prior Knowledge?
[Chart: mean gain (0-6) by treatment (Dialogue, Refutation, Extended Exposition, Exposition), grouped by course (Fundamental, Regular, Advanced)]
What about Confidence Gain?
Future Investigation
• Is there a difference between Dialogue and Refutation methods?
• Interviews to gauge student perceptions of videos
• Applications of ‘vicarious learning’ in classrooms
• Comparison with other online methods, collaborative learning
If you want to try the FCI
• Use a different name (Mechanics Concepts, etc.)
• Make sure copies don’t get passed around among students
• Find data and research papers on its use with thousands of students
• Normalized gain, Gain/(Maximum possible gain), is usually ~0.23 for typical courses and ~0.48 for ‘reform method’ teaching practices
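For reference, the normalized gain mentioned above divides the raw gain by the maximum gain still available, g = (post - pre)/(max - pre). A minimal sketch, assuming raw scores out of 26 as in this study (the example student is hypothetical):

```python
def normalized_gain(pretest: float, posttest: float, max_score: float = 26.0) -> float:
    """Normalized gain: fraction of the available improvement actually achieved."""
    return (posttest - pretest) / (max_score - pretest)

# A pre-test of 10/26 and a post-test of 15/26 gives g = 5/16 ≈ 0.31,
# between the ~0.23 (typical course) and ~0.48 (reform methods) benchmarks above.
print(round(normalized_gain(10, 15), 2))  # 0.31
```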
Should we change how we teach physics?
• Many researchers believe physics lectures/classes need to be significantly altered
• Teaching physics at a slower pace, with more hands-on activities and more discussion
• Implemented in schools and universities internationally (e.g. Curtin University in Australia)
• But is it sustainable?
• At the very least, students seem to need explicit exposure to misconceptions