Using Peer Assessment & Technology to Support Learning & Reflection
Fang Lou, Steve Bennett, Trevor Barker
School of Life Sciences / Computer Science
Introduction
• Overview of the LTI project
• Courses in two schools (COM and LFS)
  – Level 4 BSc CS and E-Media Design
  – Level 4 BSc Bio Sci (HP) and Sport (FOHP)
• Winning over difficult cohorts
• Developing HOT (higher-order thinking) skills
• Supporting reflective learning
• Improving performance
The technology
• Electronic Voting Systems (EVS)
  – CS: used for peer marking of the previous cohort's work
  – LFS: workshop to gather students' opinions
• Google Spreadsheets
  – Highly detailed feedback: 220 students × 3 mark sheets, each containing 15 items (CS; sketched below)
• Optical mark sheets
  – Saved time inputting marks (LFS)
• Online data collection
  – Reflection and feedback from students (LFS)
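To give a sense of what the Google Spreadsheets feedback step involves at this scale (220 students, three 15-item mark sheets each), here is a minimal sketch in Python. It is an illustration only: the file names and CSV layout are assumptions, and the original workflow used Google Spreadsheets rather than local CSV files.

```python
import csv
from collections import defaultdict
from statistics import mean

# Hypothetical file names: one CSV per mark sheet, each row holding a
# student ID followed by 15 criterion scores (layout is an assumption).
SHEETS = ["marksheet1.csv", "marksheet2.csv", "marksheet3.csv"]
NUM_ITEMS = 15

def load_sheet(path):
    """Read one mark sheet into {student_id: [15 criterion scores]}."""
    marks = {}
    with open(path, newline="") as f:
        for row in csv.reader(f):
            marks[row[0]] = [float(x) for x in row[1:1 + NUM_ITEMS]]
    return marks

# Collect every sheet's scores per student, then average item by item
# to produce one 15-item feedback summary per student.
per_student = defaultdict(list)   # id -> list of 15-item score lists
for sheet in SHEETS:
    for student_id, scores in load_sheet(sheet).items():
        per_student[student_id].append(scores)

for student_id, sheets in sorted(per_student.items()):
    item_means = [mean(items) for items in zip(*sheets)]
    print(student_id, [round(m, 1) for m in item_means])
```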
Peer Assessment
• A large body of research reports beneficial effects of peer assessment on student motivation and the ability to self-assess
• Three comprehensive meta-studies:
  – Topping 1998
  – Falchikov and Goldfinch 2000 (a worked illustration follows)
  – van Zundert et al. 2010
• Significant JISC projects: REAP and PEER
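Falchikov and Goldfinch (2000), in particular, summarise peer-assessment validity as the correlation between peer-awarded and tutor-awarded marks. As a worked illustration only (not taken from the slides, and using invented marks), that statistic is a one-liner in Python 3.10+:

```python
from statistics import correlation  # requires Python 3.10+

# Hypothetical marks for the same five assignments (illustrative data only).
peer_marks  = [62, 55, 71, 48, 66]
tutor_marks = [60, 58, 74, 45, 69]

# Peer-tutor mark agreement, the validity measure used in the meta-study.
r = correlation(peer_marks, tutor_marks)
print(f"peer-tutor correlation r = {r:.2f}")
```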
The E-Media Design Course
• A Level 4 BSc course on the theory and practice of digital media
• Students did not generally apply the design theory to their creative artefacts
• Peer evaluation of the previous cohort's work was introduced
• This produced an increase in student attainment but caused some student hostility
The Problem (or aftertaste)
2009 (before using peer assessment):
• Assessment: 25% MCQ + 25% MCQ + 50% Flash CV
• Results: 59% (N=290); 58% (N=277)
2010 (with peer assessment):
• 25% MCQ: 56% (N=215)
• 25% EVS peer marking exercise
• 50% Flash CV: 64.5% (N=218)
Student reactions:
Learner 1: "... for example the marking criteria, it's all over the place, how can we be tested on someone's opinion?? So who knows."
Learner 2: "Maybe we will just guess what they are thinking. I'm confused, how can a test be solely based on someone else's opinion? This means even if you can justify why you've put a certain option it doesn't matter, because your opinion doesn't mean anything."
2011 Version: The measures
• Better marking proportions: 30% MCQ, 10% peer marking event, 60% Flash CV
• Better selection of exemplars
  – The six pieces of previous-cohort work with the lowest variance between markers (see the sketch below)
• Rewriting of criteria
  – More detailed and graduated attainment descriptors
• Using a live EVS session instead of QMP
• Extremely detailed feedback on student marking
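Choosing the six lowest-variance pieces is easy to automate. A minimal sketch, assuming each piece of previous-cohort work already carries a list of marks from several markers (the identifiers and marks below are hypothetical):

```python
from statistics import pvariance

# Hypothetical marks awarded by several markers to each piece of
# previous-cohort work (piece id -> list of marks).
marks_by_piece = {
    "piece_a": [62, 64, 61, 63],
    "piece_b": [40, 70, 55, 48],
    "piece_c": [71, 73, 70, 72],
    # ... one entry per piece of work
}

# Rank pieces by between-marker variance and keep the six most agreed-upon
# as exemplars (low variance = markers broadly agree on the standard).
ranked = sorted(marks_by_piece, key=lambda p: pvariance(marks_by_piece[p]))
exemplars = ranked[:6]
print(exemplars)
```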
Result
[Three slides of results charts; the chart images were not captured in this transcript]
Fundamental Issue
• Does using peer assessment help with the internalisation of assignment criteria and expected standards?
• It seems so
• But some students potentially merely regard it as being asked to second-guess the tutors
• There was far less controversy with the revised approach
• Students seemed to be buying in
Some Issues
• This result was the culmination of three years of research. We are not convinced that simply adopting the approach, without a great deal of additional work, would necessarily be as effective.
• Setting up the sessions, writing and revising rubrics, selling the system to students and moderators, producing support lectures and materials etc. was not easy and took a great deal of time.
Issues
• Student concerns had to be dealt with.
• The EVS is often used in formative contexts – quiz rather than assessment. (Does this devalue it?)
• It is not absolutely reliable
• It requires experience to:
  – Manage large EVS sessions (200+ students)
  – Write and reflect on rubrics and presentations
  – Collect and configure good samples for the sessions
  – Collect data
  – Set up the PPTs, software and hardware
  – Relate handset numbers to students' names (not that easy; see the sketch below)
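The last point, relating handset numbers to students, is essentially a lookup-table problem; the fiddly part is unregistered or swapped handsets. A minimal sketch of the idea, with hypothetical identifiers (the slides do not specify how this was actually done):

```python
# Hypothetical register mapping EVS handset IDs to student names;
# in practice this would come from a sign-out sheet or an EVS export.
handset_register = {
    "H-0042": "A. Student",
    "H-0117": "B. Student",
}

# Votes as (handset_id, answer) pairs, e.g. exported from the EVS session.
votes = [("H-0042", "B"), ("H-0117", "C"), ("H-0999", "A")]

for handset_id, answer in votes:
    name = handset_register.get(handset_id)
    if name is None:
        # Unregistered handsets are the usual headache: log them for
        # manual follow-up rather than silently losing the vote.
        print(f"unregistered handset {handset_id}: answer {answer}")
    else:
        print(f"{name}: {answer}")
```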
Peer Assessment of a Full Lab Report – differences in two cohorts
• The Level 4 BioScience (Bio) programme has been doing it for five years (Human Physiology module) – positive
• Level 4 Sports Science and Sports Therapy (Sports) introduced it last year (Foundations of Human Physiology module) – quite negative
• The disparity between Bio and Sports
• Challenge: winning over the SES/SPT cohort
Sequence of events
1. Laboratory study
2. Workshop (briefing) after all students had completed the laboratory class
3. Submission of the report
4. Peer assessment (marking session) with clear marking criteria
5. Appeal/reflection/feedback: face-to-face? Email? Online?
EVS question – What do you think about peer assessment?
1. Glad to have a go
2. Curious to find out how it works
3. Would prefer lecturers to mark my work
4. Not comfortable with the responsibility
[Bar chart comparing Bio and Sports responses; a tallying sketch follows]
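Behind a chart like this is a simple tally of handset votes into per-cohort percentages. A minimal sketch, using invented responses:

```python
from collections import Counter

OPTIONS = ["Glad to have a go",
           "Curious to find out how it works",
           "Would prefer lecturers to mark my work",
           "Not comfortable with the responsibility"]

# Hypothetical per-cohort votes, each recorded as an option number (1-4).
responses = {"Bio": [1, 1, 2, 1, 3, 1], "Sports": [3, 3, 4, 2, 3, 4]}

for cohort, votes in responses.items():
    counts = Counter(votes)
    total = len(votes)
    for i, label in enumerate(OPTIONS, start=1):
        pct = 100 * counts.get(i, 0) / total
        print(f"{cohort}: {label}: {pct:.0f}%")
```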
The problem
• Attitude – students did not see the point of doing it
• No incentive – students did not care whether they marked well or not
• Disappointment – on receiving a carelessly marked report
• Results – a lower level of satisfaction, a huge number of complaints, and much staff moderation
The measures
• Link to professionalism – selling the idea of peer assessment right from Induction week; stressed again in the workshop
• Reduce the peer mark allocation from 20% to 10%
• Allocate 5% for the quality of marking (the revised weighting is sketched below)
• Provide an example report in the marking session
• Moderate reports before releasing marks
• Introduce the online feedback system (WATS) – 5% in the Sports module
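The revised incentives amount to a small weighting change: the peer-awarded mark carries 10% of the module and the quality of a student's own marking carries 5%. A worked sketch with hypothetical marks (the 85% remainder stands in for the module's other components, which the slides do not enumerate):

```python
# Hypothetical component marks for one student (percentages).
peer_awarded_mark = 64    # mark the student's report received from peers
marking_quality   = 80    # how well the student marked a peer's report
other_components  = 58    # everything else in the module (assumption)

# Revised weighting: 10% peer mark + 5% marking quality + 85% other.
module_mark = (0.10 * peer_awarded_mark
               + 0.05 * marking_quality
               + 0.85 * other_components)
print(f"module mark: {module_mark:.1f}%")
```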
Findings
• More positive attitude – EVS results
• High engagement – many students wrote a full page of comments
• Raised satisfaction – reflection and feedback results
[EVS results chart for the four options from the earlier EVS question; visible data labels: 13%, 48%, 35%, 4%]
Return rates

                           FOHP (Sports)   HP (Bio)
Number of students         81              273
Reports submitted          79              255
Attended marking session   80              240
Online reflection          49              103
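Converted to rates, these counts show how strong engagement was in both cohorts. A minimal sketch deriving the percentages from the table above (the percentages themselves are not stated on the slides):

```python
cohorts = {
    "FOHP (Sports)": {"students": 81,  "submitted": 79,
                      "marked": 80,    "reflected": 49},
    "HP (Bio)":      {"students": 273, "submitted": 255,
                      "marked": 240,   "reflected": 103},
}

for name, c in cohorts.items():
    n = c["students"]  # denominator: cohort size
    print(name,
          f"submitted {100 * c['submitted'] / n:.0f}%,",
          f"attended marking {100 * c['marked'] / n:.0f}%,",
          f"reflected online {100 * c['reflected'] / n:.0f}%")
```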
What do the students think?
[Two charts comparing Sports (FOHP) and Bio (HP) responses on a five-point scale (SA / A / NAND / D / SD):
• "As a consequence of the peer assessment I feel better prepared for my next lab report" – visible data labels: 59.2, 22.4, 16.3, 2.0, 0.0
• "I benefited from being engaged with the marking criteria prior to writing up the report"]
Student comments:
• "The discussions it raised while marking the lab reports amongst peers also aided the learning process as you get another perspective."
• "I didn't find it useful as the report I was marking hugely lacked in effort."
• "It has made me reflect more deeply than normal."
• "I was surprised that it would help me with my lab report and in the future."
• "It is beneficial to do, however I do not think it needs to be done all the time. Once or twice is enough to get a general idea of how the marking works and how to improve your work."
Key points for success
• Organisation of sessions
  – Making sure the technology works
• Marking criteria
• Choosing exemplars
• Continuous improvement based on reflection
• Selling the idea to students, including briefing
• Encouraging students: reflection and feedback
  – Technology can help
Acknowledgements
• LTI Support – Enhancement Awards 2011-12
• Fang's colleagues (Mike Roberts and Chris Benham)
• References
  – Barker, T. & Bennett, S. (2010). Marking Complex Assignments Using Peer Assessment with an Electronic Voting System and an Automated Feedback Tool. Proceedings of International Computer Assisted Assessment (CAA 2010), 20-21 July 2010, Southampton, UK.
  – Barefoot, H., Lou, F. & Russell, M. (2011). Peer Assessment: Educationally Effective and Resource Efficient. Blended Learning in Practice, May 2011.
  – Bennett, S. & Barker, T. (2011a). Using Electronic Voting and Feedback: Developing HOT Skills in Learners. Presented at SOLSTICE 2011, 8-9 June 2011, Edge Hill University, UK.
  – Bennett, S. & Barker, T. (2011b). The Use of Electronic Voting to Encourage the Development of Higher Order Thinking Skills in Learners. Proceedings of International Computer Assisted Assessment (CAA 2011), July 2011, Southampton, UK.
  – Lou, F., Barefoot, H., Bygate, D. & Russell, M. (2010). Using Technology to Make an Existing Assessment More Efficient. Poster at the International Blended Learning Conference, June 2010, University of Hertfordshire, UK.