Peer Assessment of Oral Presentations
Kevin Yee
Faculty Center for Teaching & Learning, University of Central Florida
Research Question
For oral presentations, does the introduction of revised
peer-evaluation rubrics with increased focus and
greater detail improve student awareness of
presentation issues, and thus improve their perceived
performance?
Knowledge Base
• Students learn more from interacting with other students than from listening passively to teachers (Wilbert J. McKeachie, McKeachie's Teaching Tips, New York: Houghton Mifflin, 2002). Also, "When assessment criteria are firmly set, peer-feedback enables students to judge the performance of their peers in a manner comparable to those of the teachers. However, the same is not found to be true with self-assessment" (Mrudula Patri, "The Influence of Peer Feedback on Self- and Peer-Assessment of Oral Skills," Language Testing 19:2, 2002, p. 109). Thus, peer evaluation is the only means of assessment that approaches the utility and practicality of teacher evaluation.
• The potential benefit of collaborative work to enhance learning is strong enough to consider it one of the elite principles of good practice. But "when poorly managed, collaborative assignments can decrease students' sense of control and increase their anxiety and anger" (Barbara Walvoord and Virginia Anderson, Effective Grading, San Francisco: Jossey-Bass, 1998). Thus, peer-edit rubrics must be properly constructed, or else the exercise risks diminishing learning rather than enhancing it.
Context
- Two groups of graduate students took the same non-credit course for a Teaching Certificate. G1 was five students in Fall 2004, and G2 was 12 students in Spring 2005.
- Both groups gave weekly oral presentations. G1 was given a less-focused and less-detailed rubric for peer evaluation than G2.
- At the midterm, both groups were asked to fill out non-anonymous self-evaluations of their oral presentation skills. The questions allowed for open-form, prose answers.
Methods
G1 (Fall 2004) was given a less-focused, more loosely structured peer-review rubric for oral presentations throughout the semester. G2 (Spring 2005) performed the same presentations and was given a more detailed rubric that concentrated attention on just two categories (voice and performance). In both cases, student presentations were followed by a class-wide discussion of specific techniques, problems, and model behaviors from the performances. For G2 only, the anonymous peer evaluations were also handed back to each student after every session.
After six weeks, both groups completed a five-minute, non-anonymous midterm self-evaluation, answering these questions with bullet points: What do you do well in presentations? What needs improvement? What have you learned about your presentation style? No further instructions were given, and responses were intentionally free-form.
Results of the self-evaluations were tabulated by group, and the frequency of each type of reply was determined, enabling several patterns to be discerned from the data.
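To illustrate the tabulation step, a minimal sketch in Python: each self-evaluation answer is coded into a category and the categories are counted per group. The labels and responses below are invented for illustration, not the actual study data.

```python
from collections import Counter

# Hypothetical coded responses for one group's "needs improvement" question;
# each string is the category a rater assigned to one student's answer.
g2_needs_improvement = [
    "organization", "time management", "organization", "eye contact",
    "time management", "organization", "volume",
]

counts = Counter(g2_needs_improvement)

# Report each category's frequency, most common first, so within-group
# agreement (or the lack of it) is easy to spot.
for category, n in counts.most_common():
    print(f"{category}: {n} of {len(g2_needs_improvement)} responses")
```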
Findings
G2 focused on issues of voice and performance rather than content and organization. G2, which considered time management part of "organization," listed that item most often as needing improvement (6 of 12 respondents). Condensing the categories of G2's answers under the topic "Things I Do Well," the majority concerned voice and performance (32 of 45 responses). For G1 there was less agreement (only 5 of 13 responses concerned voice or performance).
Members of G2 gave answers similar to one another, and the words they chose mimic the language of the rubric. Examples in "What I Learned" include eye contact, monotone, accent, and confidence. G1 chose words that did not mimic their simpler, broader rubric, and seemed to focus on idiosyncrasies rather than on categories common to the rubric. The focused rubric used by G2 therefore seems to direct student attention to these particular concepts.
Items at the top of G2's rubric were the ones most often listed under "Things I Do Well," and those at the bottom of the rubric were mentioned the least. The top item, volume, was mentioned by 5 of 12 respondents, the strongest correspondence on the "Things I Do Well" list. Eye contact, at the top of the "performance" category, was mentioned by 4 of 12 respondents. In G1 there was no meaningful correlation between the things students believed they did well and their rubric.
G1 demonstrated little agreement within the group on any of the questions. Except for organization, listed as "needing improvement," virtually all of G1's responses were individual answers that did not match others in the class. This suggests the class was not focused on any particular set of topics, and students pursued their own interests.
The student responses suggest ways in which the rubrics should be altered. Based on the feedback from the midterm evaluation instrument, I will implement a follow-up rubric highlighting the two topics G2 most often flagged as needing improvement: organization (6 of 12 respondents) and time management (7 of 12 respondents).
Conclusion
- Adding extra details to peer-edit sheets does enhance student learning about the specifics being highlighted, but with some loss of holistic, "big picture" performance. Details on the peer-edit sheets should be chosen with care.
- Peer-edit rubrics should not only be as detailed as grading rubrics, but should also be open to revision based on evaluations.
- This study suggests that when detailed rubrics CAN be used as peer-edit sheets, they SHOULD be integrated into the curriculum to focus attention on a few key concepts, and to deepen and broaden student learning.
Transferability
- Multiple disciplines can use grading rubrics to evaluate student work whenever a sliding scale is appropriate, including student oral presentations in other fields, formal essays, written lab reports, or math/computer problems where the process is as important as the outcome.