Presentation Slides - NCSU Office of Faculty Development

Office of Faculty Development
Clicker Pedagogy in Larger Classes: How Using Clickers Can Improve Learning
David McConnell
Marine, Earth, and Atmospheric Sciences and Office of Faculty Development
NCSU
Seven Principles of Good Practice in Undergraduate Education
1. Encourages student-faculty contact
2. Develops cooperation among students
3. Encourages active learning
4. Provides prompt feedback
5. Emphasizes time on task
6. Communicates high expectations
7. Respects diverse talents and ways of learning
Can clickers promote these practices?
Chickering & Gamson, AAHE Bulletin, 1987, p. 3-7
When you ask a question in class, what proportion of
your students will raise their hand or otherwise
indicate that they have an answer?
A. More than 50%
B. About 30%
C. About 10%
D. 0-10%
Understanding Student Learning
Learning assessment systems span a range from more to less instructor understanding of learning:
• On-going assessment through student dialog in small classes (more instructor understanding of learning)
• Instructor grading of short answer and essay questions
• Computer grading of multiple choice questions using bubble sheets (less instructor understanding of learning)
Clickers in Effective Pedagogy
Clicker Pedagogy: Peer Instruction & Conceptests
It’s the message, not the medium
Harvard students in a traditional calculus-based
introductory physics class scored an average of 70%
on the pre-test. Predict the average post-test score.
A. 72%
B. 78%
C. 84%
D. 90%
Crouch, C.H., Mazur, E., 2001, American Journal of Physics, v. 69, #9, p.970-977
What we know about student learning
1. Students learn key concepts better when they
have opportunities to actively monitor their
understanding.
2. Knowledge is socially constructed and people
learn best in supportive social settings when
working with peers.
3. Students become better learners when we
challenge them to answer questions that
require the use of higher order thinking skills.
Pedagogy for Feedback Devices
Peer instruction (& Conceptests)
Technique developed by Eric Mazur, Harvard
• Short lecture (10-20 minutes)
• Conceptest – conceptual multiple choice question
• Individual students signal answers
• Student groups may discuss answers (peer instruction)
• Explanation of correct answer
U. of Massachusetts model skips individual student answers
Mazur, E., 1997, Peer instruction: A user’s manual: Prentice Hall, 253p.
Example: Earth Science Conceptest
Examine the map and answer the question that
follows. How many plates are present?
a. 3 (26%; 0%)
b. 4 (19%; 18%)
c. 5 (44%; 75%)
d. 6 (11%; 7%)
(Percentages: individual responses; post-discussion responses)
Results when using physical models: (56%; 84%)
Geology conceptest database: http://serc.carleton.edu/introgeo/interactive/conctest.html
McConnell, D.A., et al., 2006, Journal of Geoscience Education, v. 54, #1, p.61-68.
Student Performance on Conceptests
• About a third of questions in an Earth Science class were asked twice. Five questions had fewer correct answers after peer instruction.
• An average of 45% of students responded correctly on the first attempt and 63% answered correctly after peer instruction.
Student Performance on Conceptests
Range of student scores for a semester of conceptests:
• Nobody has ever averaged above the mid-80s for a semester
• Most students miss at least a third of the questions asked
Impact of Alternative Pedagogy
[Figure: Mazur's results – FCI pretest scores and post-test score gains (60%-100%) for a traditional class (1990) vs. peer instruction classes (1991-1997). Annotations: 1 = began PI, 2 = refined conceptests, 3 = changed text, 4 = open-ended reading questions. n = 117-216]
Crouch, C.H., Mazur, E., 2001, American Journal of Physics, v. 69, #9, p.970-977
Impact of Alternative Pedagogy
Poulis et al. (1998)
• Results from 5,000+ physics students – increase in pass rates (55% → 80%) in classes that used audience-paced feedback (clickers + student discussion)
Poulis, C., Massen, C., Robens, E., & Gilbert, M., 1998, American Journal of Physics, v. 66, #5, p. 439-441.
Dori & Belcher (2004)
• Compared pre- and post-test scores for traditional and technology-enabled physics classes – learning gains were greater (27% vs. 52%) for the technology-enhanced class
Dori, Y.J., & Belcher, J., 2004, Journal of the Learning Sciences, v. 14, #2.
Clickers in Effective Pedagogy
Clicker Pedagogy: Why this works
It’s the message, not the medium
Students completed a short reading assignment.
Population A studied the passage twice (7 minutes
each time). Population B studied the passage once
and then took a recall test. Two days later, both
groups were tested on their recall of information.
Predict the result.
A. Population A scored higher on the test.
B. Population B scored higher on the test.
C. There was no difference in test score.
Roediger & Karpicke, 2006, Perspectives on Psychological Science, v. 1, p. 181-210.
Test Enhanced Learning
120 students completed a reading assignment (~250 words)
• Population A studied the passage twice (7 minutes each time)
• Population B studied the passage once and then took a test
• Both populations were then tested at 5-minute, 2-day, and 1-week intervals
• Population B retained more knowledge after 2 days and 1 week
Roediger & Karpicke, 2006, Perspectives on Psychological Science, v. 1, p. 181-210.
Peer Learning Assessment
U. of Colorado Genetics Course Conceptest Responses
[Figure: conceptest posed → individual responses → peer instruction responses → follow-up individual responses, with students categorized as all correct or correct after peer instruction]
Smith et al., 2009, Science, v. 323, January 2, p.122-124.
The Value of Peer Instruction
Experimental Group: Students took a physics test individually, then again as a pair.
Control Group: Students took the test individually.
Proportion of pairs of students who both got the question wrong on the first test but correct on the “paired” test: 29%
Students in both groups answered similar
questions on a second exam two weeks later.
Mean score on second exam for experimental group: 74%
Mean score on second exam for control group: 64%
Singh, C., 2005. American Journal of Physics, v.73 #5, p. 446-451.
The Value of Peer Instruction
Students taught key concepts using one of four methods.
Student learning assessed by proportion of correct answers
to open ended questions on same concepts on final exam
Teaching method (% correct answers):
• No demonstration: 61
• Observation of demonstration w/ explanation: 70*
• Prediction prior to demo with a conceptest: 77*
• Prediction prior to demonstration using discussion & a later conceptest: 82*
n = 158-297; * = statistically significant result vs. no demonstration
Crouch, C.H., Fagen, A.P., Callan, J.P., & Mazur, E., 2004. American Journal of Physics, v.72 #6, p. 835-838.
Importance of Student Reflection
• The weakest students often do not realize that they do not understand key concepts
• Doubly cursed: students who can't answer questions correctly can't self-diagnose their lack of ability
Dunning et al., 2003, Current Directions in Psychological Science, v. 12, #3, p. 83-87
The Value of Peer Reflection
Experimental Group: Three 2-minute pauses per
lecture, student discussion of lecture content with peer.
Control Group: No pauses for discussion in lecture.
All students completed a free recall exercise at end of
lecture and delayed multiple choice test 12 days later.
Exp. Group – mean number of facts recalled: 22.97*
Cont. Group – mean number of facts recalled: 16.63
Exp. Group – MC test average score: 84.39*
Cont. Group – MC test average score: 76.28
* statistically significant gain
Ruhl, Hughes, and Schloss, 1987, Teacher Education and Special Education, v. 10, #1, p. 14-18
Value of Attendance
Science classes at University of Minnesota:
• Emphasized attendance, showed data graph weekly: average attendance 70%, average grade 73%
• Verbal encouragement to attend (less emphasis): average attendance 59%, average grade 64%
Moore et al., 2003, American Biology Teacher, v. 5, p.325-329.
Clickers in Effective Pedagogy
Clicker Pedagogy: ...and it makes me feel good
It’s the message, not the medium
Instructor Satisfaction Survey
Scale: 1 = Strongly Agree, 2 = Agree, 3 = Neutral, 4 = Disagree, 5 = Strongly Disagree (n = 35)
• Helped determine students' understanding
• More contact, communication with students
• More cooperation among students
• Less lecture, more student discussion
• Prompt feedback on student learning
• Easier to emphasize critical concepts
• Emphasize high expectations
• More opportunity for diverse skill sets
• Class was more enjoyable than previous
• Would recommend CPS for other classes
Student Satisfaction vs. Class Level
Student Satisfaction Survey Scores by class level: Lower (<200), Upper (>200), and Graduate (n = 1597)
Average score: 1 = Strongly Agree, 5 = Strongly Disagree
• Helped gauge level of understanding
• Reinforced important concepts
• Increased desire to come to class
• Increased my interaction with other students
• Improved my performance in class
• Increased my willingness to ask questions
• Would recommend use at UA
• Made class more enjoyable
Impact on Students
Student Comment Matrix (total students surveyed: 1327)

What do you like best about the use of conceptests and clickers?
  # of responses: Pedagogy 950, Technology 93, Other 52 (total 1095)
  % of responses: Pedagogy 86.8%, Technology 8.5%, Other 4.7% (total 100%)

What do you like least about the use of conceptests and clickers?
  # of responses: Pedagogy 162, Technology 621, Other 64 (total 847)
  % of responses: Pedagogy 19.1%, Technology 73.3%, Other 7.6% (total 100%)
Benefits of Technology & Pedagogy
What did you like best about the use of the conceptests and CPS?
“That it really helped me participate more in class.” Natural Science Biology
“CPS forced me to review class materials. This helped to reinforce my memory/knowledge.” Microbiology
“I was able to gage my knowledge level to others in class.” Emergency Management
“It makes you want to learn more and enjoy the class more.” Basic Mathematics II
“I knew what I had to study.” Human Diversity
“I actually had to figure out problems to answer them, so I understood it better.” Principles of Chemistry
“Not having to talk to participate.” Government and Politics
Drawbacks of Technology & Pedagogy
What did you like least about the use of the conceptests and CPS?
“We got into discussion groups, but didn’t discuss.” Human Diversity
“Questions sometimes are a little difficult to understand.” Criminal Case Management
“The stress of missing questions.” Statistics
“It takes time during class that we could be using to take notes.” Human Diversity
“Some questions didn’t give enough time.” Macroeconomics
“Attendance and having to come to class all of the time.” Government and Politics
“Don’t make questions all at the end of lecture. Throw some in the middle of the lecture.” Macroeconomics
Suggestions for Using Clickers
Some suggestions from our experience:
• Use the clickers every day to ensure students bring them to class and value their use
• Make questions sufficiently challenging to be worth asking – aim for correct response rates of 50-70%
• Avoid grading headaches – use low-stakes assessment and consider awarding participation points
• Bigger is best – use in lower-level, gen. ed. classes
• Less value if the class already uses active pedagogy strategies or if students already participate fully
Seven Principles of Good Practice
1. Encourages student-faculty contact
2. Develops cooperation among students
3. Encourages active learning
4. Provides prompt feedback
5. Emphasizes time on task
6. Communicates high expectations
7. Respects diverse talents and ways of learning
Chickering & Gamson, AAHE Bulletin, 1987, p. 3-7
Clickers in Effective Pedagogy
Clicker Pedagogy: Writing Good Questions
It’s the message, not the medium
Assessment with Clickers
Teaching and learning goals can be ordered using Bloom's Taxonomy:
• Knowledge – memorization and recall
• Comprehension – understanding
• Application – using knowledge
• Analysis – taking apart information
• Synthesis – reorganizing information
• Evaluation – making judgements
Question formats range from right/wrong answers (knowledge) to degrees of correctness (evaluation); more complex questions call for more sophisticated guides and responses.
Introductory Exercise
Examine the six questions on page 2 of the
handout. Assume you are a student in classes
where these questions would be appropriate.
Rank the questions from the easiest to most
challenging based on the character of the
question and nature of knowledge needed to
answer it correctly.
Classify using Bloom’s Taxonomy
A. Which one of the following values best approximates the volume of a sphere with radius 5 m? (a worked check follows this list)
   a) 2000 m³  b) 1000 m³  c) 500 m³  d) 250 m³  e) 125 m³
B. How successful were recent income tax cuts in spurring economic growth?
C. What is the capital of Maine?
D. How would you restructure the school day to reflect children's developmental needs?
E. Contrast the floor of the Atlantic Ocean with the shape of a bathtub.
F. Which statements in the President's State of the Union address were based on facts and which were based on assumptions?
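One way to check question A is the standard sphere-volume formula:

V = (4/3)πr³ = (4/3)π(5 m)³ ≈ 524 m³

so option c) 500 m³ is the closest of the listed values.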
Which is a synthesis question?
A.    B.    C.    D.    E.    F.
Which is an application question?
A.    B.    C.    D.    E.    F.
Bloom’s Taxonomy
Knowledge
Which one of the following persons is the author of "Das Kapital"?
a) Mannheim b) Marx c) Weber d) Engels e) Michels
Comprehension
Fill in the blank to complete the analogy. The yolk is to the egg
as the ____________ is to Earth.
a) crust  b) mantle  c) core  d) asthenosphere
Bloom’s Taxonomy
Comprehension
In the landscape below, how would the amount of rainfall
change at location X if the mountain eroded down to the
dashed line?
a. Rainfall would increase
b. Rainfall would decrease
c. Rainfall would stay the same
Bloom’s Taxonomy
Analysis
Read carefully through the paragraph below, and decide which of
the options a, b, c, or d is correct.
Rising saturated air undergoes: i) adiabatic cooling as air
contracts due to decreasing pressure with increasing altitude;
and, ii) warming due to the latent heat of condensation as water
vapor is converted to liquid water droplets.
a. The word “contracts” should be replaced by “expands”.
b. The word “condensation” should be replaced by
“evaporation”.
c. The word “warming” should be replaced by “cooling”.
d. The word “altitude” should be replaced by “elevation”.
Bloom’s Taxonomy
Evaluation
Judge the sentence in italics according to the criteria given below: "The United States took part in the Gulf War against Iraq BECAUSE of the lack of civil liberties imposed on the Kurds by Saddam Hussein's regime."
a) The assertion and the reason are both correct, and the
reason is valid.
b) The assertion and the reason are both correct, but the
reason is invalid.
c) The assertion is correct but the reason is incorrect.
d) The assertion is incorrect but the reason is correct.
e) Both the assertion and the reason are incorrect.
Clickers and Effective Pedagogy
Any Questions?