Taylor, P. (2013)

The use of peer assessment/review
in distance teaching via the Moodle
VLE.
Peter Taylor
17th January 2013
The origins of the project
• The refreshing assessment project – reviewed the use of
assessment tools across the University and our internal
expertise in assessment, and identified areas for
development.
• Highlighted a number of issues
Issues highlighted
• Poor dissemination of good practice
• Lack of links to learning outcomes – both at module
level and, particularly, at award level
• Lack of coherent assessment strategy across the
award
• Lack of coherent development of assessment tools
• The key area absent was peer assessment/review
which is used commonly across the sector
The use of peer
assessment/review in distance
teaching via the Moodle VLE
• At present there is very little use of peer
assessment/review across the Open University, despite
widespread use across the sector.
• We have very little institutional knowledge of:
– the advantages and disadvantages of peer
assessment/review;
– how it can be applied to distance learning;
– automated systems for delivery.
Initial Activities
• Review of the literature on the use of peer assessment
in Distance teaching
• Review the use of peer review across the Open
University
• Review the use of automated systems for peer review
at other universities
• Carry out a series of pilot studies on a number of
courses (using e-mail) across the University on the use
and advantages of peer assessment/review in distance
learning.
Outcomes so far - by others
• Frances Chetwynd, Christine Gardner and Helen
Jefferis carried out the literature review
• Janet Dyke evaluated the use of peer review in the OU
• Manish Malik reviewed external use of automated peer
review systems
• Richard Pederick – S104B
• Sue Nieland – ED209
• Chris Middup – T320
• Krushil Watene – A850
Definition of peer learning (Topping 2005):
• Peer learning can be defined as the acquisition of
knowledge and skill through active helping and
supporting among status equals or matched
companions.
• It involves people from similar social groupings who are
not professional teachers helping each other to learn
and learning themselves by so doing.
Peer assessment/review
An arrangement in which individuals consider the
amount, level, value, worth, quality, or success of the
products or outcomes of learning of peers of similar
status
(Topping, 1998)
Good practice in peer review
When implementing peer review in a module the first step
must be to decide on the goals that are hoped to be
achieved by introducing this:
• an assessment tool
• a learning tool (for and of learning)
• a learning-how-to-assess tool
• an active participation tool
(from Gielen et al, 2011)
Good practice in peer review
Issues for students:
effectiveness;
fairness;
reliability;
validity;
trust.
The quality of the feedback given by peers can be variable
and consequently the trust factor between students is key
Good practice in peer review
Decisions for teachers:
anonymity
consistency
fairness
nature of feedback
Literature work conclusions
• One benefit noted for the teacher is a possible time
saving.
• From the student perspective one possible benefit of
peer assessment is that it may increase a sense of
community and so reduce isolation.
• Other benefits that have been found in the literature
include the possibility that peer review encourages
students to be more critical of their work; it improves
motivation and it potentially gives the students a better
understanding of the learning process.
Literature review continued
• Suggested drawbacks for students are that they may
lack the skills needed to give effective feedback, and
they may fear giving negative feedback.
• There is also an issue with a lack of time for students
and this may be exacerbated if they need to learn new
systems.
• If anybody is interested in seeing the review I can make
it available.
Use in the OU
• Most uses of peer assessment involved photography or
design, where critical appraisal of each other's work
was important.
• There was little peer review of written answers.
Activity
• Without worrying about how we might do it, or the
drawbacks of distance education:
• How might peer review/assessment be used in your
module and/or award?
• What do you think are the benefits for your students?
• What do you think are the drawbacks/pitfalls?
First pilot study
• This involved a tutor asking for volunteers from within
their tutor group.
• The students were e-mailed a task and sent their
answers back via e-mail by a set date.
• The answers were anonymised, and each volunteer was
sent three other students' answers together with the
mark scheme.
• The completed reviews were then sent back to the tutor,
who forwarded them on to the original student.
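The anonymise-and-distribute step of the pilot can be sketched as a short script. This is a minimal illustration only (the pilot itself used plain e-mail, and all names and structures here are hypothetical), assuming each volunteer reviews three anonymised peer answers and never their own:

```python
import random

def allocate_reviews(submissions, reviews_per_student=3):
    """Anonymise submissions and allocate peer answers to each volunteer.

    submissions: dict mapping student name -> answer text.
    Returns a dict mapping each student to a list of (anonymous id, answer)
    pairs drawn from the other volunteers' submissions.
    """
    names = list(submissions)
    # Replace names with anonymous ids, as the tutor did before forwarding.
    anon_id = {name: f"script-{i + 1}" for i, name in enumerate(names)}

    allocations = {}
    for reviewer in names:
        # Never allocate a student their own answer.
        others = [n for n in names if n != reviewer]
        chosen = random.sample(others, min(reviews_per_student, len(others)))
        allocations[reviewer] = [(anon_id[n], submissions[n]) for n in chosen]
    return allocations
```

In a real run the mark scheme would be circulated alongside the scripts, and the completed reviews would pass back through the tutor so that anonymity is preserved.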
Findings from the Pilot studies
• Under a third of the students approached volunteered to take part
• Not all of those who submitted work carried out the assessment
phase (~75% did)
• Time constraints within a busy course were a major factor in
volunteers dropping out of the project
• Marking was overall consistent with that of the tutor
• Overall experiences of the pilot project were very positive: all
students who provided feedback said they would take part in a similar
exercise, or study courses in which peer assessment/feedback was
integrated, "if time allowed".
Benefits
• “It was interesting to see the differing styles of answering the
questions”
• “it forced me to grasp the concept so that I could relay it back to
another student”
• “I felt like a teacher and enjoyed it.”
• “I like being assessed and gaining someone else’s point of view on
my work.”
• “I’m fine with honest criticism, anonymous or not. I could tell that the
negative feedback I received was honest and objective, which was
totally fine”
• “This was a good learning experience which, for me, highlighted how
difficult it is to take a purely objective approach to assessing a
submitted piece of work.”
Issues
• “If I was doing this more and it actually went towards some final mark
or assessment, I would need to see exemplar material, I think, so I
know what an excellent, good, average and weak essay actually
looked like.”
• “I’d be worried if I was doing this and it actually had some influence
on the student’s final mark. It is very subjective and there’s a lot riding
on me getting it right. But it’s good for getting the hang of
assessment.”
• “I think it was quite scary doing it but knowing that the student didn’t
know it was me was better. I can see how hard it is to be a tutor now
as everyone knows who marked their work.”
• “…a bit hard [to provide feedback] as I don't like to criticize
people…”.
• “a fairly negative tone on the whole and just lacked ‘warmth’ so it
wasn’t very comfortable to read”
Comments from ALs
• “Students are likely to view the process as beneficial if the process is
able to fit in with the way in which students organize their time and
the way in which students write their essays.” – link to real TMAs
• “As such, and while flexibility is important, peer assessment can
contribute to encouraging good time management for students.”
• “ ….. achieve this by stressing the need to highlight positive aspects
of the submissions – but this did not prevent negative reviews in the
end. Clarifying the marking criteria – and possibly doing a dummy
assessment run would help here.”
• “Students did point out, however, that they should be able to
respond to reviews”
S104 experience
• The students were each sent a question sheet drawn
from previous S104 EMAs and covering a range of
learning outcomes
• Marking guidance including correct and exemplar
answers were provided, alongside ‘Learning Outcome
Grids’ as supplied to S104 tutors.
Plot of student marking against tutor marking
[Scatter plot: student (peer) grade on the x-axis against tutor grade on the
y-axis, with linear fit y = 0.851x + 6.3451, R² = 0.9013]

Tutor mark (%)   Mean peer mark (%)
87               90
81               87
64               70
12               14
57               54
41               40
58               57
Only six of the fifteen marks awarded by the peer group deviated by 8% or
more from the mark awarded by the tutor.
Student marking was not far from tutor marking – if anything, the students
were slightly more generous.
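The fitted line on the slide (y = 0.851x + 6.3451, R² = 0.9013) was computed over all fifteen individual marks, which are not reproduced here. As an illustration of the same arithmetic, an ordinary least-squares fit over just the seven tutor/mean-mark pairs in the table gives a similar picture (slope near 1, high R²):

```python
def least_squares(xs, ys):
    """Ordinary least-squares fit y = slope * x + intercept, plus R^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    intercept = my - slope * mx
    r_squared = sxy ** 2 / (sxx * syy)
    return slope, intercept, r_squared

# Tutor marks and mean peer marks for the seven questions in the table.
tutor = [87, 81, 64, 12, 57, 41, 58]
peer_mean = [90, 87, 70, 14, 54, 40, 57]
slope, intercept, r_squared = least_squares(peer_mean, tutor)
```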
Were you surprised by the
results?
Moodle has an application called
workshop for carrying out peer
review/assessment
• It was released for modules to use in April 2012 (soft
launch).
• It provides random distribution of scripts either across
the whole cohort or within groups
• Distributed assignments are anonymous
• Assessment grades can be moderated before
publication
• It links into the gradebook, so grades can be carried
into the OU assessment system
The Workshop application
• Setup phase
– Upload questions, exemplar answers and markscheme
– Set up grading/feedback strategy
– Set cut off dates for submission and assessment activities
• Submission phase
– Release question to students – submit online before cut off date
• Assessment phase
– Students receive mark scheme
– Students may receive exemplars for marking and moderation
exercise.
– Students receive randomly allocated scripts (1-20) and using the
mark scheme give feedback and grades
The Workshop application
• Evaluation phase
– Workshop calculates final grade for submission and assessment
activity
– Tutor can override grades
– Tutor can provide feedback on assessment process
• Closed
– Final grades can be pushed into the gradebook and thus the OU
assessment system (if required)
Student receives
• Grades from other students
• Feedback from other students
• Grading of how consistent their marking was.
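A plausible way to produce that last item, a grade for the quality of a student's own marking, is to score each reviewer's mark by its distance from the consensus on the same script. The sketch below is an assumption for illustration; the Workshop tool's real assessment-grade calculation differs in detail:

```python
def marking_consistency(my_mark, all_marks, tolerance=10.0):
    """Score a reviewer's mark (0-100) by its deviation from the median
    of all marks given to the same script. A mark at the median scores
    100; a mark `tolerance` points away (or more) scores 0.
    Illustrative only -- not Moodle Workshop's actual formula.
    """
    ordered = sorted(all_marks)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2
              else (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    deviation = abs(my_mark - median)
    return max(0.0, 100.0 * (1.0 - deviation / tolerance))
```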
S390 example
• S390 is a project module – 5000 word dissertation
• Students were asked to produce an abstract for peer
assessment
• Results
S366 example
• S366 is a standard module with an exam
• Students were asked to answer questions from a
previous exam paper at the start of their revision
process
• Results
Test sample
• Result
Comments from online pilot
• The VLE Workshop tool was very easy to find on the
website, the icon stood out clearly, noticeable as soon
as the website was accessed.
• The page layout in the Workshop was easy to read, a
good size font.
• Feedback from the six students in my group also
indicated that the Workshop tool was easy to find,
though one or two suggested that improvements could
be made for reading the abstract while applying the
feedback and marks (a similar problem to the one tutors
face when provided with all work and marking guides
onscreen, solved only by printing out).
S366 experience
• The marking was reasonably consistent although one
marker was consistently severe.
• A couple of students interpreted the mark scheme a
little more broadly.
• The tutor only moderated a couple of the marks given
by peers.
• Feedback was variable in length but overall very good,
with most students providing constructive and helpful
feedback on every answer.
• Only one peer marker provided very little or weak
feedback.
Further comments
• The concern is whether, worthwhile though the exercise
is, enough students would embrace and participate in it.
• Care is perhaps needed over the timing and alignment of
the exercise, for example with revision, and in avoiding
clashes with summative assessment.
Table 1. Individual and average assessment scores
awarded by peer markers and amount of feedback

[Grid of individual marks (%) awarded by peer markers A–H to students A–H;
the cell layout did not survive conversion]

Average mark (%) awarded by each peer marker:
A 74.3, B 66.7, C 67.1, D 58.6, E 67.7 (70.0), F 66.8 (70.0), G 57.9, H 72.4

Feedback: most markers commented on all scripts and on all (or nearly all)
answers; one marker's feedback was weak, covering only one or two answers
per script.
Feedback
• In terms of feedback most peer markers wrote some encouraging
and helpful but short comments on every answer.
• Comments were quite helpful and positive, explaining why a mark had or
had not been given.
• Only one peer marker actually provided the conversion percentage,
and explained this in relation to the University scale.
• Overall, the level of feedback was good considering the lack of
experience and training of those involved. However, one peer
marker, E, was pretty weak on feedback on all scripts, only
commenting on one or two answers.
• Only one student did not get round to marking or providing feedback
on their peers' work, and this student gave no explanation.
• When the peer group were asked if they would do anything differently
about half said that they would probably provide a bit more feedback.
Student comment
• All students said that they found answering the
questions very useful to their study of the module
• “Yes, as it helped to consolidate my revision. It threw
light on my areas of weakness”.
• “Absolutely, answering questions makes me think
about the links between things as well as making me
review material”.
• Likewise they all said reviewing other students' work
was very useful to their studying of the module.
• “Yes. It gave me a chance to compare how I had
answered the questions with how others studying the
same material had presented their answers. I think
there was some learning from this which should
hopefully help me improve my exam effort”.
Student comments
• “I now realise how difficult marking is. I have a greater
empathy for my Tutors now”!!
• Some of the students felt a little uncomfortable or
unqualified to provide feedback but on balance more
were happy to respond in this way as they had a mark
scheme and it was anonymous feedback.
• “The fact that we were given a marking scheme helped
tremendously, although at times I felt I might be
applying it too rigidly, and did not feel to have any
scope for discretion. I was comfortable giving feedback
knowing that this was a trial but would have felt a little
under qualified had it been a live situation”.
Anonymity?
• About half felt that this would not make any difference
and they would respond in the same way, whilst the
other half said they would have felt more pressured and
might not have participated.
• “Probably would not have participated if not
anonymous”.
• “Personally no, although I can imagine some people
would prefer it to remain anonymous. In knowing who
the other participants were in a general discussion
forum could take place to mull over the results”.
How do you feel about being
assessed by another student?
• “I thought it was useful. I had used the marking scheme
to self-assess and the results from my peers were
similar to my own marking of my work. The feedback
was useful as it will make me think harder about how I
word my answers – I lost a couple of points for missing
out key words which I had considered self-evident”.
Giving feedback
• “most people are more comfortable giving anonymous
feedback so the remarks I received would be a more
accurate representation than it otherwise would have
been”
• “I didn’t give much feedback – only on the first question
which I felt harsh in down-marking because Mendel’s
name wasn’t mentioned. I would probably give more
feedback in future”.
• “No, I am quite comfortable. I would enjoy doing this
again in the future – we do something similar on the
forum but without an answer mark scheme to help us”.
Workshop tool
• All students said it was generally ‘easy’ to find the
workshop application and the majority found it
‘straightforward’ to use the workshop application on the
VLE.
• “In the main, straightforward. It did help to print out a
hard copy of the book answers when it came to the
marking phase as having to scroll up each time was a
little cumbersome”.
More feedback on the
workshop tool
• Half the students typed their answers directly on the
VLE, and the other half typed them first and then
pasted them in.
• In terms of how easy the students found the workshop
application for assessing other people’s work, they
generally reported that it was easy, except perhaps if
they had had to assess more than three scripts.
• “As stated in Q10, the referral back to the book
answers was a little awkward so I printed out a hard
copy of these. I think if you had more than three
papers to assess then the workshop application might
have got a little unwieldy, but for this amount of work it
was fine”.
To quote one student “I thought it
was a brilliant idea and hope it is
used in my last two modules of my
degree”
Next Phase
• I’m looking for Modules interested in trying out the
Workshop application on Moodle
• I would like to develop a handbook for the use of the
Workshop application.
Is anybody interested in trying
out the workshop tool?
• Any questions?