Exam Question-setting Good Practice

EXAM QUESTION-SETTING
Guidance on writing exam questions
1. Rather than being included in this handbook, the School’s main guidance on ‘Writing Good Exam Questions’ is available as a self-study workbook on the website.
• This provides guidance on preparing questions and marking criteria at an appropriate Master’s level, along with examples and exercises.
• It is primarily intended for staff who are new to writing exam questions, although more experienced colleagues may still find it a useful reference point.
2. The following additional guidance provides a summary of good practice to apply.
Overall:
• Prepare the exam question, the model answer and the marking scheme together. You need all three to judge whether a question is good.
Questions:
• Use the intended learning outcomes of the module or MSc to decide what to assess.
• Work out what each draft question is really assessing (repeating a definition, applying a method, etc.).
• Don’t assess or measure the same things repeatedly.
• Include data or information in questions, to reduce the emphasis on memory and increase the emphasis on critical thinking.
• Keep sentences short, using precise and unambiguous language.
• Make the question layout easy to follow.
• Check the standard is appropriate, i.e. at Master’s level.
• Proof-read carefully.
Model answers:
• Write a model answer for the question.
• Check that the question clearly states what you expect, as reflected in the model answer – e.g. “discuss the result, including the interpretation of the OR and of the CI”, rather than just “discuss the results”.
• Allocate a time to each section of the question, estimating how long a student should take to complete each part.
Marking scheme and criteria:
• Write a marking scheme for the question which will be clear to co-markers.
• The marking scheme should address format (e.g. structure, analytical level, originality, etc.) as well as content (which need not be absolutely prescribed for interpretative, discussion-based types of question).
• Decide what the criteria for assessment are.
• Indicate the weight to be given to different sections of a question, e.g. as a rough fraction or percentage (though without being too prescriptive – the School does not use percentage marking as standard).
Evaluating the question
• When you have the question, model answer and marking scheme, ask someone else to write an answer to the question (do not give them the model answer), timing each part of the question. Compare with your model answer and timings. Modify the question, timings and marking scheme based on any potential misunderstandings identified.
Specific guidance on content of LSHTM exam papers
3. Question-setters for formal summer exam papers should review exam questions from past academic years, for the relevant course and possibly from other related courses run at the School, to get ideas on the types and styles of questions which have been used as well as the content covered.
• F2F past papers are available via the Teaching Support Office.
• DL exam past papers for each academic year are compiled into two “Examiners’ reports” for each course (one covering core modules and the other advanced modules), along with commentary about how the questions should have been answered or were answered by students. These provide a useful resource for ideas on the types and styles of questions used, and how different approaches worked.
4. Further general guidance about what summer exams are expected to address is given in the Assessment handbook, but specific requirements on what elements of the curriculum to assess will always be determined at course level, in line with the intended learning outcomes (ILOs) set out in the Programme Specification. Not all ILOs will necessarily be assessed through the exams, as some may be assessed separately, e.g. through the project or practical exams.
Checklist for reviewing draft exam questions
5. The following checklist can be used to help evaluate and improve draft questions. Working through this with a small group of fellow-examiners (e.g. putting questions up on overheads and discussing them) can be valuable.
a) What is the question intended to measure? E.g. factual recall, data
processing/analysis skills, problem-solving skills, policy analysis skills, critical
analysis skills.
b) What else does it actually measure? E.g. does it rely too much on factual recall?
c) How well does the question relate to intended learning outcomes of the module or
MSc?
d) Is the language simple, clear, unambiguous and straightforward?
e) What are the key words/verbs describing the task, and are they sufficiently clear?
E.g. ‘distinguish’, ‘estimate’, ‘propose’.
f) Is the language used easy to understand, including by candidates for whom
English is not their first language (e.g. does it use colloquial phrases)?
g) Check punctuation and grammar as this can markedly change the meaning of
sentences – for example the well-known “Panda eats, shoots and leaves”.
h) Does the question give an advantage or disadvantage to those candidates with
particular professional backgrounds, e.g. medics?
i) How reliably and consistently can the answers be marked?
j) If the question is in sections, is the proportional split of marks between sections appropriate? Are there consequences for later sections if a candidate makes an error in an early section? If yes, how will the marking cope with this possibility?
k) Can the question be completed in the time available (including reading, thinking and reviewing time), including by those for whom English is not their first language?
l) Does the question lead to answers which will distinguish between weak and strong candidates, i.e. are there opportunities for candidates to demonstrate distinction-level skills and knowledge?