Administering, Analyzing, and Improving the Written Test

Assembling the test
• You have already
– Written objectives
– Written test items
• Now you have to
– Package test
– Reproduce test
Packaging the test
• Group together all items of similar format
• Arrange test items from easy to hard
• Space the items for easy reading
• Keep items and options on the same page
• Position illustrations near descriptions
Packaging cont.
• Check your answer key
• Determine how students will record answers
• Provide space for name (and date)
• Check test directions
• Proofread the test
Reproducing the Test
• Know the machinery
• Make extra copies (2–3)
• Specify copying instructions (if giving the test to someone else to copy)
• Avoid
– Fine print
– Finely detailed maps or drawings
– Barely legible masters or originals
• File the original test
Test Assembly Checklist
1. Are items of similar format grouped together?  YES _____  NO _____
2. Are items arranged from easy to hard levels of difficulty?  YES _____  NO _____
3. Are items properly spaced?  YES _____  NO _____
4. Are items and options on the same page?  YES _____  NO _____
5. Are diagrams, maps, and supporting material placed above the designated items and on the same page as those items?  YES _____  NO _____
6. Are answers random?  YES _____  NO _____
7. Have you decided whether an answer sheet will be used?  YES _____  NO _____
8. Are blanks for name (and date) included?  YES _____  NO _____
9. Have the directions been checked for clarity?  YES _____  NO _____
10. Has the test been proofread for errors?  YES _____  NO _____
Administering the Test
• Maintain a positive attitude
• Maximize achievement motivation
• Equalize advantages
• Avoid surprises
• Clarify rules
• Rotate distribution
Administering cont.
• Remind students to check copies
• Monitor students
• Minimize distractions
• Give time warnings
• Collect tests uniformly
Scoring the Test
• Prepare an answer key (ahead of time)
• Check your answer key
• Score blindly
• Check machine-scored answer sheets
• Check scoring
• Record scores
Analyzing the Test
• Quantitative item analysis
• Qualitative item analysis
• Key
• Distracter
• Difficulty index (p)
• Discrimination index (D)
– Positive
– Negative
– Zero
(p) and (D)
p = (number of students selecting the correct answer) / (total number of students taking the test)

D = (number correct in the upper group − number correct in the lower group) / (number of students in one half of the class; if the class size is odd, use the larger group)
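Since both indices are simple ratios over an item's answer counts, they are easy to compute. The sketch below is an illustrative helper rather than part of the original material; the function name and data layout are assumptions, and it takes the upper and lower groups to be the top and bottom halves of the class, as in the examples that follow.

```python
def item_stats(upper_counts, lower_counts, key):
    """Difficulty (p) and discrimination (D) for one test item.

    upper_counts / lower_counts: option -> number of students in the
    upper / lower half of the class who chose that option.
    key: the correct option.
    (Illustrative helper; not from the slides.)
    """
    group_size = sum(upper_counts.values())               # students per half
    class_size = group_size + sum(lower_counts.values())  # whole class
    correct_upper = upper_counts.get(key, 0)
    correct_lower = lower_counts.get(key, 0)

    p = (correct_upper + correct_lower) / class_size      # proportion correct
    d = (correct_upper - correct_lower) / group_size      # upper minus lower
    return p, d

# Item Y from the next slide: correct answer A, class size 28
p, d = item_stats({"A": 4, "B": 1, "C": 5, "D": 4},
                  {"A": 1, "B": 7, "C": 3, "D": 3},
                  key="A")
print(round(p, 2), round(d, 2))   # 0.18 0.21
```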
Example
Example for item Y (class size = 28)

Option   A*   B   C   D
Upper     4   1   5   4
Lower     1   7   3   3

p = ?   D = ?
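Working through the formulas above (upper and lower groups of 14 students each): p = (4 + 1) / 28 ≈ 0.18 and D = (4 − 1) / 14 ≈ 0.21, so item Y is quite difficult but discriminates in the expected direction, if only modestly.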
Example
Example for item Z (class size = 30)

Option   A   B*   C   D
Upper    3    4   3   5
Lower    0   10   2   3

p = ?   D = ?
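With groups of 15 students each, the same formulas give p = (4 + 10) / 30 ≈ 0.47 and D = (4 − 10) / 15 = −0.40: item Z is of moderate difficulty but discriminates negatively (the lower group outperformed the upper group), so the item or its key should be examined rather than simply thrown out.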
Pre–Post Test Results
• If this is done, look at
– The percentage answering each item correctly on each test
– The percentage of items answered in the expected direction for the entire test
• Limitations
– Time
– Contamination
Debriefing Guidelines: Before Handing Back Tests
• Discuss problem items
• Listen to student reactions
• Avoid on-the-spot decisions
• Be equitable in changes
• Ask students to double-check
• Ask students to identify problems
Identifying problems
• The test can be improved – you are human
• Focus on the test – it is not about you
• Sincerely examine whether each question is good or not – the discrimination index really helps here – do not just throw out a question
• Scores may change – rank order typically does not
• The objective of the debrief is to make a better test!