AMEP Assessment Task Bank
Professional Development Kit
Question Items for Assessment of Receptive Macroskills
(Reading and Listening)
Notes to accompany the PowerPoint for the AMEP Assessment Task Bank Professional
Development Kit. Developed by Marian Hargreaves for NEAS, 2013.
Slide 1: Front page
Slide 2: Relevant issues
• Provide a variety of questions, but not so many as to be confusing.
• Ensure that the students are familiar with the type of question and understand what they are supposed to do.
• Ensure that students know what is expected of them (eg short answers rather than complete sentences).
• Start with some easy questions as a lead-in.
• Ensure that questions follow the text, ie the answer to Q1 comes before the answer to Q2, and summary questions come at the end.
Slide 3: Question types – the list (on screen)
Slide 4: Multiple choice questions (MCQs)
Advantages (on screen)
Disadvantages (on screen)
MCQs are useful as they reduce the amount of reading and writing required to answer
the item. However, they are difficult to write, and you may need to rewrite the text so that
you have enough information for the distracters. Ideally you should have three or four
distracters, which should be plausible but incorrect. Note that dichotomous (eg true/false)
items, though popular, are academically weak, as learners have a 50% chance of guessing
the correct answer.
Slide 5: Example Cert II E1: Demonstrate understanding of a telephone message (on
screen)
Slide 6: Summary cloze
This is a good item type for listening as it requires minimal writing. It is also good for
Learning Outcomes which require the recognition of explicit information in the text. One hazard
with summary cloze is that there may be more than one possible answer, not just the
obvious or intended one. Another is that learners may simply copy from the text.
Slide 7: Example Cert I D1: Demonstrate understanding of a spoken information text (on
screen)
Slide 8: Sentence completion
Sentence completion is similar to summary cloze and is one of the easiest item types to
produce. It gives the opportunity for longer phrases in the answer, which is an advantage when assessing
reading, but not so good for listening assessments. Other disadvantages are the possibility
of alternative, equally valid answers, and the temptation for learners to write too much.
A structured response format can help restrict the amount of writing and indicate exactly how
many words are expected.
Slide 9: Structured Example Cert I D1: Demonstrate understanding of a spoken
information text (on screen)
As opposed to an unstructured response format such as:
Slide 10: Unstructured Example (on screen)
which gives the learner scope to write quite a long answer.
Slide 11: Short answer questions
Short answer questions are another type that is relatively easy to write, as they can be very specific:
Example (on screen)
Note, however, that they have the same potential problem as sentence completion: learners
may feel they have to write longer answers than necessary, so guidance on answer length
should be included in the instructions.
Slide 12: Example of short answer questions (on screen)
Slide 14: Example 2 (on screen)
This question for Cert III F1: Demonstrate understanding of a spoken discussion had to be
discarded because the answers learners produced were too varied and too hard to mark.
Slide 14: Grid completion
This is another question type that requires minimal writing and is, in theory, simple to
answer. To use a grid for answers, however, you will probably need to write the text so
that it is structured specifically for that answer format. A grid's structure should have a
clear relationship to the text, and not just be a box surrounding other item types (such as
summary cloze): the columns and rows should correspond to the way the information is
structured in the text.
Another major disadvantage is that learners may not understand what they have to do and
waste time working it out.
Slide 15: Example (grid on screen)
For this example, you would have to ensure that the text gave the relevant information in the
order indicated, ie the first answer was for the advantage of looking in a daily newspaper, the
second answer was for the disadvantage of the daily, etc.
Learners would also need to know the difference intended by the daily (or national)
newspaper (eg The Australian) and the local newspaper (eg the True Local/Armidale
Express/Katherine Times).
Given the amount of writing required by this particular grid, it is better used for a reading
assessment. For a listening task you would also need to write a considerable amount of
redundancy into the script, which can actually make the whole question harder.
Slide 16: Matching
Matching items are often used in teaching tasks as they can be very specific, and are good
for low levels, where pictures can be used. These items generally have two lists of information which
need to be matched together. They are more reliable if the two lists do not have the same
number of items.
However, given the amount of reading involved, they should be used for reading assessment
and not for listening.
Slide 17: Example (on screen)
Slide 18: Ordering
Ordering involves numbering or sequencing parts of a text or events in a story. This is
commonly used to check understanding of narrative structure and is good for testing
understanding of sequencing and time. However, both matching and ordering can be
problematic. In ordering, if one of the items is wrongly numbered, the whole list is wrong,
so the whole item should be counted as one question. In a matching item, answering can
become a process of elimination (not the skill being tested!).
Although little writing may be involved in the answer, this is another question type better
suited to a reading assessment.
Slide 19: Example (on screen)
In this example, for a Certificate I task, the first event has been given.
Slide 20: Information transfer
Finally, information transfer involves labelling a diagram or picture with information
from the text. Advantages include the ability to use pictures and ease of marking; however,
learners may not know what they have to do.
Slide 21: Example
In this example a text is given for a reading assessment. If the question were to be used for a
listening task, the relevant vocabulary should be given in a clearly marked box.
Slide 22: Rubrics
Instructions need to be simple and clear, in language at a level below the level being tested.
You don’t want the learners to get stuck because they don’t understand the instructions!
They should also be consistent for all tasks in that Learning Outcome.
Slide 23: Answer Key
Ensure that there are only a few possible correct answers.
Check whether the responses correspond to what you were expecting and moderate the
items and/or the text if there are more than three possible responses.
Always recheck the answer key after any modifications have been made to the text or the
questions.
Slide 24: General guidelines for designing question items
To recap:
• Be aware of test method effect.
This refers to the difficulty of certain item types and the familiarity that learners have with
different types. For example, if learners have not done a matching activity before, they may
not know how to do it and either waste time working out what they have to do or get it wrong
because they did not know what was required. Test preparation courses usually include a lot
of practice with the types of items that will be met in the test. Make sure that students are
familiar with the type of question and understand what they are supposed to do.
• Use a range of item formats.
In order to minimise test method effect, it is a good idea to use a range of item types in the
task. In this way, learners who find one type difficult won't be overly disadvantaged.
However, you need to be careful that the task isn't too cognitively demanding because there
are too many different item types – balance is important.
• Rubrics should be clear, simple and consistent across tasks.
• Questions should be well spaced, especially for listening tasks.
• Questions should follow the presentation of information.
o Start with some easy questions as a lead-in.
o Summary questions, eg What is the main topic of this text?, should usually come at the end of the test.
• Check that the amount of lexical overlap with the text is appropriate for the level of
difficulty required (too much overlap will make the item too easy).
• For a listening assessment, limit speakers to two, with clearly different voices, eg
male/female.
• Avoid responses that are too wordy.
• Avoid 'What' and 'How' open questions, which require lengthy responses and are
generally open to interpretation, making them difficult to mark.
• Check against other tasks for the same LO to ensure consistency.
• Review the answer key.
Question types overview handout.
Suggestion: a suitable activity here would be to take a text and develop questions for
it.
Activities 1.2, 1.3 & 1.4 from the Reading Activities would work well.
© NEAS Ltd 2014