Automated Assessment and
Question Banks
Nidhi Parikh
March 12, 2012
References
• Joint Information Systems Committee (JISC), Effective Practice with e-Assessment, 2007.
• Moodle Docs, http://docs.moodle.org
• Using GIFT Format to Create Multiple Questions at Once, http://classes.sscnet.ucla.edu/docs/Using_GIFT_format_to_create_multiple_questions_at_once
• North Carolina A&T State University, Blackboard Support, Standard Respondus Format for Microsoft Word Questions.
Outline
• Automated Assessment
• What is e-Assessment?
• Forms of Assessment
• Motivation
• Important Considerations
• Assessment and Learning
• Case Study 1
• Case Study 2
• Case Study 3
• Challenges
• Question banks
• Requirements
• Standard Question Formats
• GIFT: Moodle Format Examples
• Case Study 4
• Conclusion
What is e-Assessment?
• Broad definition:
• Any assessment activity that uses digital technology, including:
• Design and delivery of assessment
• Marking by computers or by scanners
• Reporting, storing and transferring of assessment data
• JISC definition:
• e-Assessment is the end-to-end electronic assessment process in which
ICT is used for the presentation of assessment activity and the
recording of responses.
Forms of Assessment
• Classified by the stage of the learner’s activity at which it is taken:
• Diagnostic
• Assessment of a learner’s knowledge and skills at the outset of a
course.
• Formative
• Assessment that provides developmental feedback to a learner
on their current understanding and skills.
• Also known as assessment for learning.
• Summative
• The final assessment of a learner’s achievement, usually leading
to a formal qualification or certification of a skill.
• Also known as assessment of learning.
Motivation
• Can lead to more effective learning
• Perhaps the best way to identify the support needs of learners.
• Provide good quality, timely feedback and appropriate
resources.
• Stimulating and challenging ways to demonstrate understanding.
• Interactive and engaging tests could help people with
disabilities.
• Supports Personalization
• On-demand assessment helps learners progress at their own pace.
• Students can be fast tracked.
Motivation
• Ability to capture aspects of learning previously
impossible to capture.
• Reduced workload.
• Timely feedback/evaluations.
• Improved quality of feedback/evaluations.
• Multiple tests are possible.
Important Considerations
• Closer alignment of assessment with the pedagogic
approach used.
• Timeliness of assessment and quality of feedback are
important.
• Interactive and engaging assessment.
Assessment and Learning
[Figure: The relationship between assessment and effective learning (Effective Practice with e-Assessment, JISC 2007)]
The TRIADS: Case Study 1
• Background:
• Tripartite Interactive Assessment Delivery System (TRIADS).
• Developed at the University of Derby in 1992 and further
developed in partnership with the University of Liverpool and
The Open University.
• In 2005-2006, it delivered assessments to 10,000 students at
Derby.
The TRIADS: Case Study 1
• System:
• Flexible design structure.
• Capability to build in simulation and multimedia.
• Screen layout developed over many years in line with current
usability and accessibility guidelines.
• Code templates for generic question styles.
• Code templates can be used in combination.
• Supports randomized selection and sequencing of questions.
• Contains full error-trapping routines.
• Facilitates partial credit for method, for stages in the answer, and
for the final answer.
• Detailed candidate performance reporting helps pinpoint atypical
results.
• Delivered over the web, LAN, or CD.
The TRIADS: Case Study 1
• Outcome:
• Despite the sophistication offered, 70% of use is in the 1st year,
20% in the 2nd year, and 10% in the 3rd year.
• This may be due to staff workload, so a centralized support unit was set up.
• The frequency and timing of access to tests reveal that students
retake them.
• This suggests that students are more likely to test themselves with e-assessment than with pen-and-paper tests.
Summative Assessment: Case Study 2
• Background:
• At the School of Pharmacy and Pharmaceutical Science,
University of Manchester.
• A subject with an emphasis on accuracy.
• Had experience with diagnostic and formative tests before
trying summative tests.
• 240 students, some with disabilities, taking a first-year module in
cell biology and biochemistry.
Summative Assessment: Case Study 2
• System:
• Tests delivered on WebCT.
• Questions written in Microsoft Word, imported into WebCT
using Respondus (format sketched below).
• Student identity authenticated by university username and
password, plus a photo ID check.
• Timed release of exam.
• Students could enter responses online or on optical mark reader sheets.
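The slides cite the standard Respondus format for Word (see References). Roughly, it is a plain numbered document in which an asterisk marks the correct answer; a minimal sketch, with question content that is illustrative rather than from the study:

1. Where is UCLA located?
*a. Westwood
b. San Diego
c. Downtown Los Angeles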
Summative Assessment: Case Study 2
• Challenges:
• Problems importing questions containing decimal points.
• Editing had to be carried out in the VLE.
• WebCT’s inability to work around spelling mistakes:
• A list of subject-specific keywords was provided so students
could paste correctly spelled terms.
• This reduced discrepancies between human and computer-based
marking to less than 1%.
Summative Assessment: Case Study 2
• Outcome:
• Comparison with a pilot study indicates that computer-based tests
are more accurate than slower manual marking.
• Cascading style sheets allowed the display to be customized, which
helped students with dyslexia.
Formative Assessment: Case Study 3
• Developed for an online BTEC Intermediate and Advanced
Award in IT skills for learners in small-scale businesses,
accredited by Edexcel.
• Delivered via the Internet with online tutor support.
• Assessment is ongoing and incorporated into course structure
and content.
• True or false questions.
• Wrong answers advise learners to revisit the activity, as in the
sketch after this list.
• Learners progress at their own pace.
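The slides do not show the course’s question format, but the pattern of wrong answers routing learners back to an activity maps naturally onto GIFT’s per-answer feedback, where text after # is shown when that answer is chosen. A sketch with illustrative content, not taken from the course:

::Activity 2.1 check::A spreadsheet cell can contain a formula.
{=True#Correct. ~False#Revisit Activity 2.1 before continuing.}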
Challenges
• Provide customized views.
• Need staff for technical support.
• Increased effort before the test instead of after.
• Interoperability.
• Preparing question banks that can be shared between
departments and schools.
• Need for standard question formats.
• To make sure all students get tests of equal measure.
• Ability to mark free text answers.
Question Banks
• Question format requirements:
• Question types
• Standardization
• Shared and personalized question banks (see the GIFT sketch after this list)
• Question versioning
• Import/Export to update existing questions
• Question Interface Requirements:
• Easy but powerful searching
• Rendering questions of equal measure
• Easy organization
• Customized Views
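GIFT itself can support the organization and sharing requirements above: a $CATEGORY: line files the questions that follow into a named category on import, and // comment lines (ignored on import) can carry version notes. A sketch with illustrative category and question names:

// v2, revised 2012-03-01
$CATEGORY: shared/cellbio/week3
::CB-W3-Q1::Mitochondria are the main site of ATP synthesis. {T}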
Standard Question Formats
Question Type /        Respondus    Moodle (GIFT)   Moodle (XML)   Aiken (simple)
File Format            doc, rtf,    text            xml            text
                       text
Multiple choice        √            √               √              √
Multiple answer        √            √               √
True/False             √            √               √
Essay                  √            √               √
Fill in the blanks     √            √               √
Short answer                        √               √
Match                  √            √               √
Number match                        √               √
Multi question                      √               √
Cloze                                               √
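Aiken, the simplest format in the table, handles only multiple choice questions in plain text; the final line names the correct letter. A minimal example, with content mirroring the GIFT examples that follow:

Where is UCLA located?
A. Westwood
B. San Diego
C. Downtown Los Angeles
ANSWER: A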
GIFT: Moodle Format Example 1
• Multiple Choice Question
::Question 1 - Multiple Choice::Where is UCLA located?
{=Westwood ~San Diego ~Downtown Los Angeles}
(= marks the correct answer; ~ marks distractors.)
GIFT: Moodle Format Example 2
• Multiple Answer Question
::Question 3 - Multiple Choice with multiple correct answers::Identify two former UCLA basketball players.
{~%50%Kareem Abdul-Jabbar ~%50%Bill Walton ~Kobe Bryant ~LeBron James}
(%50% awards half credit for each correct choice; wrong choices carry no weight.)
GIFT: Moodle Format Example 3
• Matching Question
::Question 8 - Matching::Match the following countries with their capitals.
{=United States -> Washington DC =Japan -> Tokyo =Germany -> Berlin =Venezuela -> Caracas}
(Each "=left -> right" pair defines one match.)
GIFT: Moodle Format Example 4
• Number Range Question
::Question 9 - Number range::What year was UCLA founded in?
{#=1919:0 =%50%1919:2}
(Answer:tolerance pairs: full credit for exactly 1919; 50% credit within ±2.)
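Two further types from the table have no GIFT example above; sketches for both, with illustrative content:

• True/False Question
::Question - True/False::UCLA is located in Westwood. {T}

• Short Answer Question
::Question - Short answer::Which city is the capital of France? {=Paris}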
The COLA Project: Case Study 4
• Background:
• COLEG Online Assessment project (COLA) is run by
Scotland's College Open Learning Exchange Group (COLEG).
• The aim is to create a bank of online formative assessments
across a wide range of courses and levels in support of SQA
qualifications.
• Assessments are objective tests containing up to 20 questions,
available in 17 subject areas.
The COLA Project: Case Study 4
• Challenges:
• Developed assessments for four Virtual Learning Environments:
• Blackboard
• Moodle
• Serco (Teknical) Virtual Campus
• WebCT
• The final question bank was delivered in a form compliant with the
IMS Question and Test Interoperability (QTI) standard.
• None of the VLEs complied with the IMS QTI standard.
• The solution was for the question bank to hold only IMS QTI format
questions.
• Conversion tools were developed for different platforms.
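The conversion tools themselves are not described in the slides, but the need for them is visible by comparing formats. Here is the multiple-choice question from GIFT Example 1 above, rendered instead in Moodle XML; a sketch of the element structure per http://docs.moodle.org, not the COLA tools’ actual output:

<quiz>
  <question type="multichoice">
    <name><text>Question 1 - Multiple Choice</text></name>
    <questiontext format="html">
      <text>Where is UCLA located?</text>
    </questiontext>
    <answer fraction="100"><text>Westwood</text></answer>
    <answer fraction="0"><text>San Diego</text></answer>
    <answer fraction="0"><text>Downtown Los Angeles</text></answer>
    <single>true</single>
  </question>
</quiz>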
The COLA Project: Case Study 4
• Challenges:
• Question types:
• Multiple choice
• Multiple response
• True or false
• Match
• Fill in the blanks
• Fill-in-the-blank questions were later removed from the question
bank due to difficulties with them.
• Problems with user tracking and scoring in some VLEs.
• Accessibility of questions is also affected by the way VLEs render
them.
• This influenced some colleges to upgrade or change their VLE.
The COLA Project: Case Study 4
• Outcome:
• Around 500 formative tests were written between 2005 and
2007.
• Practitioners and learners have found them useful.
• Raised the profile of e-learning in Scotland.
• Highlighted the need for innovation in the interoperability of systems.
Conclusion
• Studies show the following benefits of automated assessment:
• More interactive and engaging environment.
• Quality feedback.
• Immediate feedback.
• Supports effective learning.
• Helps students with disabilities.
• Reduces workload on academic staff.
• But there are some challenges:
• Need better ways to incorporate it into learning.
• Need more sophisticated question formats.
• Need better converters for question formats.
• Interoperability.
Thank you!