Presentation to invited symposium at 4th EARLI/Northumbria Assessment Conference, Potsdam, August 2008

E-assessment for learning?
Short-answer free-text questions with
tailored feedback
Sally Jordan
ENAC Potsdam August 2008
Centre for Open Learning of Mathematics, Science,
Computing and Technology (COLMSCT)
The UK Open University
• Supported distance learning;
• 150,000 students, mostly studying part-time;
• Undergraduate courses are completely open entry, so
students have a wide range of previous qualifications;
• Normal age range from 18 to ??
• 10,000 of our students have declared a disability of
some sort;
• 25,000 of our students live outside the UK.
Implications for assessment
• Within the Open University context, learners are
geographically separated and we cannot assume that
they will meet their tutor in order to receive feedback.
• We are seeking to provide students with feedback on
assessment tasks which is personalised and received in
time to be used in future learning.
• We are using small regular assessment tasks to help
students to pace their study.
• We are also using assessment tasks to encourage
students to reflect on their learning and to enter into
informed discussion with their tutor.
• So perhaps e-assessment can offer benefits for
learning…?
The OpenMark system
• Uses a range of question types, going far beyond what
is possible with multiple choice;
• Question types include: numerical input, text input, drag
and drop, hotspot;
• Students are allowed three attempts with an increasing
amount of teaching guidance, wherever possible tailored
to the student’s previous incorrect answer (see the
sketch below);
• Different students receive variants of each question, so
each has a unique assignment;
• OpenMark has been incorporated into Moodle, the open
source virtual learning environment being used by the
Open University.
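As a way of picturing the three-attempt pattern described above, here is a minimal Python sketch (an assumption about the structure, not OpenMark code): each wrong attempt earns fuller guidance, and where possible the comment is matched to the particular wrong answer given. The question, hints and targeted comments are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Question:
    prompt: str
    correct: str                                   # canonical answer (illustrative)
    hints: list = field(default_factory=list)      # one hint per failed attempt
    targeted: dict = field(default_factory=dict)   # wrong answer -> tailored comment

    def mark(self, answer, attempt):
        """Return (is_correct, feedback) for the given attempt number (1-3)."""
        if answer.strip().lower() == self.correct.lower():
            return True, "Correct."
        # Prefer feedback tailored to this particular wrong answer.
        feedback = self.targeted.get(answer.strip().lower())
        if feedback is None and attempt - 1 < len(self.hints):
            feedback = self.hints[attempt - 1]
        return False, feedback or "Not quite; have another look at the course text."

q = Question(
    prompt="In which direction does heat flow spontaneously?",
    correct="from hot to cold",
    hints=["Think about what happens to a hot drink left to stand.",
           "Heat flows from the hotter body to the colder one."],
    targeted={"from cold to hot": "That is the reverse of what is observed."},
)

for attempt, answer in enumerate(["from cold to hot", "from hot to cold"], start=1):
    print(attempt, *q.mark(answer, attempt))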
Pushing the boundaries…
• We wanted to be able to ask questions requiring free-text
answers of a phrase or sentence in length;
• This requires us to mark many different answers as
correct…
• …and many different answers as incorrect.
• We are working with a commercially provided authoring
tool (from Intelligent Assessment Technologies Ltd.);
• The system copes well with poor spelling and, usually,
with poor grammar;
• It can handle answers in which word order is significant
and it accurately marks negated forms of a correct
answer (see the sketch below).
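A crude Python sketch of two of the behaviours described above, tolerating a misspelling and rejecting a negated answer. The real IAT engine is far more sophisticated (it also deals with grammar and word order), and the keyword lists here are invented for illustration.

import difflib
import re

REQUIRED = ["heat", "hot", "cold"]            # concepts the answer must mention
NEGATIONS = {"not", "never", "no", "doesn't", "cannot"}

def matches(answer):
    """Very rough check: all required concepts present, none of them negated."""
    words = re.findall(r"[a-z']+", answer.lower())
    if any(w in NEGATIONS for w in words):    # reject negated forms outright
        return False
    # Tolerate minor misspellings of each required concept.
    return all(difflib.get_close_matches(c, words, n=1, cutoff=0.7)
               for c in REQUIRED)

print(matches("heet flows from hot to cold"))           # True: misspelling tolerated
print(matches("heat does not flow from hot to cold"))   # False: negated answer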
Novel features
• The IAT questions sit within OpenMark and students are
offered three attempts with increasing feedback;
• We provide tailored feedback on both incorrect and
incomplete responses;
• We write the answer matching ourselves using a
template-based authoring tool (the idea is sketched below);
• We use student responses to developmental versions of
the questions, themselves delivered online, to improve
the answer matching.
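A minimal sketch of the idea behind a template-based mark scheme with tailored feedback, distinguishing an incomplete answer from an incorrect one. The structure and wording are our illustration, not the IAT authoring format.

# Each entry names a point a full answer should make, the phrases that count
# as making it, and the comment to return if that point is missing.
MARK_SCHEME = [
    ("direction", ["hot to cold", "hotter to colder"],
     "You have not said in which direction the heat flows."),
    ("spontaneity", ["spontaneous", "of its own accord", "naturally"],
     "Does this happen by itself, or is outside help needed?"),
]

def mark(answer):
    answer = answer.lower()
    missing = [comment for _, phrases, comment in MARK_SCHEME
               if not any(p in answer for p in phrases)]
    if not missing:
        return "correct", "Well done."
    if len(missing) < len(MARK_SCHEME):
        return "incomplete", missing[0]   # nudge towards the point that is missing
    return "incorrect", "Have another look at the relevant part of the course text."

print(mark("heat flows spontaneously from hot to cold"))   # correct
print(mark("heat flows from hot to cold"))                 # incomplete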
Evaluation 1:
User lab observations
• Six students were observed in June 2007;
• They reacted to the questions in interesting ways; most
gave their answers as phrases or in note form, even when
it had been suggested that answers should be given as a
complete sentence. Why?
• One student said ‘I’m going to give my answers in the
same way as I would for a tutor marked assignment’ – and
he did exactly that, composing his answers carefully as
grammatically correct sentences.
Evaluation 1
User lab observations cont.
• Students’ use of feedback was also variable;
• Some students read the feedback carefully, scrolling
across the text, nodding, making appropriate comments,
referring to the course text;
• These students were able to successfully amend their
previous answer, i.e. to learn from the feedback
provided;
• But some students did not make use of the feedback
provided, especially when the system had wrongly told
them that an incorrect answer was correct.
Evaluation 2:
Human-computer marking comparison
• The computer marking was compared with that of six
human markers (a simple version of this comparison is
sketched below);
• For most questions the computer’s marking was
indistinguishable from that of the human markers;
• For all questions, the computer’s marking was closer to
that of the question author than that of some of the human
markers;
• The computer was not always ‘right’, but neither were the
human markers.
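One simple way to make such a comparison is percentage agreement between the computer’s marks and each human marker’s marks on the same set of responses, as in the Python sketch below. The marks shown are placeholders for illustration, not the study data.

computer = [1, 0, 1, 1, 0, 1]            # placeholder marks for six responses
humans = {
    "marker A": [1, 0, 1, 1, 0, 1],
    "marker B": [1, 0, 0, 1, 0, 1],
}

for name, marks in humans.items():
    agreement = sum(c == h for c, h in zip(computer, marks)) / len(computer)
    print(f"{name}: {agreement:.0%} agreement with the computer")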
Almost final thoughts
• The computer’s answer matching is excellent, probably
because we have used real student responses to
develop it;
• This matters even in purely formative use – we want
students to receive the correct message;
• A few of these questions have been incorporated into
summative (low stakes) assessments. Do students trust
the computer to mark their work?
• So far, we have no evidence that they don’t (they are
aware that there is always a human arbitrator).
Final thoughts
• Most students are intrigued and impressed by the
technology;
• A few would prefer multiple choice questions: ‘you know
the answer is always there somewhere’;
• How useful are e-assessment questions of this type in
supporting student learning?
• What sort of feedback is most useful to students?
• How far is it appropriate to go?
Acknowledgments
• The Centre for Open Learning of Mathematics, Science,
Computing and Technology (COLMSCT) especially
Barbara Brockbank, Phil Butcher and Laura Hills;
• Tom Mitchell of Intelligent Assessment Technologies
Ltd.
Our demonstration questions are at
https://students.open.ac.uk/openmark/omdemo.iat/
Or if you want more…
https://students.open.ac.uk/openmark/s103-07b.block3world/
https://students.open.ac.uk/openmark/s103-07b.blocks45world/
https://students.open.ac.uk/openmark/s103-07b.block7world/
https://students.open.ac.uk/openmark/s103-07b.block8world/
https://students.open.ac.uk/openmark/s103-07b.block9world/
https://students.open.ac.uk/openmark/s103-07b.block10world/
https://students.open.ac.uk/openmark/s103-07b.block11world/
For more about OpenMark
http://www.open.ac.uk/openmarkexamples/index.shtml
For more about Intelligent Assessment Technologies Ltd.
http://intelligentassessment.com/
Sally Jordan
Centre for Open Learning of Mathematics, Science,
Computing and Technology (COLMSCT)
The Open University
Walton Hall
Milton Keynes
MK7 6AA
s.e.jordan@open.ac.uk
http://www.open.ac.uk/colmsct/projects/sallyjordan