What's the point of the quiz?

Questionmark: multiple choice and beyond.
2/12/2016
What’s the point of a quiz?
Determine the learning objectives first! Has the learner met your objectives?
Don’t simply test how good the learner is at guessing!
Quizzes can be learning tools as well as diagnostic ones: learners can learn material
through a well-designed quiz.
Reports:
Measurement of both the learner and the test:
 Testing the learner's achievement:
o Participant report, coaching report
o Use pre-tests and post-tests
 Reports for compliance:
o List report
 Testing the quality of the questions (continuous improvement):
o Question analysis report
 Surveys and survey reports
http://ummcelrncontent.mcit.med.umich.edu/q/open.dll?session=9630580410100117
Types of questions:
question types
http://ummcelrncontent.mcit.med.umich.edu/q/open.dll?session=9091776733972741
Questionmark’s tryout example
http://tryout.questionmark.com/us/open.dll?name=tryitout&session=4515659428325277
http://ummcelrncontent.mcit.med.umich.edu/q/open.dll?session=4198685306869791
Drag-and-Drop: the participant clicks and drags up to ten images into position. The
feedback and score depend upon the final position of the images.
Try a Drag-and-drop question

Essay question: the participant answers by typing up to 30,000 characters of text.
Perception’s Scoring Tool enables grading essay questions within assessments by
using customized rubrics. You may define what is right or wrong in advance by entering a
list of acceptable answers, or print out a report of the responses for manual grading. The
logic can also allow scoring based on the presence or absence of keywords or key
phrases. This question type is also used to solicit opinions or suggestions on a particular
subject.

Explanation screens: insert text or graphics for the participant to view prior to answering
a series of questions.

File Upload: participants are often required to complete an assignment which requires
them to create a document in the form of a computer file. Question authors can use File
Upload questions to enable participants to upload their document files.

Fill-in-the-blank: the participant is presented with a statement where one or more words
are missing and fills in the missing words. The score can be determined by checking
each blank against a list of acceptable words, and responses can be checked for
misspelled words.
Try a fill-in-the-blank question

Hotspot: a participant clicks on a picture to indicate their choice. Depending upon their
choice, certain feedback and grades will be assigned. A graphics editor is provided to
simplify specifying the choice areas.
Try a Hotspot question

Knowledge Matrix: this question type presents several multiple-choice questions
together where the participant selects one choice for each statement or question
presented. This question type is used to cross-relate responses from a single item.
Try a Knowledge Matrix Question

Survey Matrix: This question type enables you to include multiple rows of Likert
questions within a table with column headers included.
Try a Survey Matrix Question

Likert scale: the participant selects one of several options such as "strongly agree"
through "strongly disagree" that are weighted with numbers to aid analysis of the results.
Try a Likert Scale question

Matching: two series of statements/words are presented and the participant must match
items from one list to items within the other list.
Try a matching question

Multiple choice: the participant selects one choice from up to 40 possible answers.
There is no limit to the length of each answer.
Try a multiple choice question

Multiple response: similar to multiple choice except the participant is not limited to
choosing one response; he/she can select none, one or more of the choices offered.

Numeric questions: a participant is prompted to enter a numeric value, and this may be
scored as one value for an exact answer and another score if the response is within a
range.
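The two-tier scoring described above (one score for an exact answer, another for a response inside a range) can be sketched as follows. This is only an illustrative model, not Perception's actual API; the function name and score values are assumptions.

```python
def score_numeric(response, exact, lo, hi, exact_score=2, range_score=1):
    """Score a numeric response: full credit for the exact answer,
    partial credit if the response falls within [lo, hi]."""
    if response == exact:
        return exact_score
    if lo <= response <= hi:
        return range_score
    return 0

# Example: exact answer 98.6, acceptable range 98.0-99.0
print(score_numeric(98.6, 98.6, 98.0, 99.0))   # 2
print(score_numeric(98.2, 98.6, 98.0, 99.0))   # 1
print(score_numeric(101.0, 98.6, 98.0, 99.0))  # 0
```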

Pull-Down List (selection question): a series of statements is presented and the
participant matches each statement with an entry from a pull-down list.
Try a selection question using a Pull-down list

Ranking (Rank in Order): a list of choices must be ranked numerically, with duplicate
ranks not allowed.

Select-a-blank: the participant is presented with a statement where a word is missing;
the participant selects the missing word from a pull-down list to indicate their answer.

True/False: the participant selects "true" or "false" in response to the question.

Word response (text match): the participant types in a single word or a few words to
indicate their answer. You define right or wrong words or phrases in advance by entering
a list of acceptable answers. The grading logic can also allow scoring based on the
presence or absence of keywords or key phrases and check for misspellings.
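The text-match logic described above (a list of acceptable answers plus keyword-based scoring) could be modeled roughly like this. The function and its parameters are illustrative assumptions, not Perception's grading engine:

```python
def score_text_match(answer, acceptable, keywords=None,
                     points_per_keyword=1, full_score=3):
    """Full credit for any answer on the acceptable list
    (case-insensitive); otherwise award points for each
    keyword or key phrase present in the response."""
    normalized = answer.strip().lower()
    if normalized in (a.lower() for a in acceptable):
        return full_score
    if keywords:
        return sum(points_per_keyword
                   for kw in keywords if kw.lower() in normalized)
    return 0

print(score_text_match("Photosynthesis", ["photosynthesis"]))          # 3
print(score_text_match("uses light and chlorophyll",
                       ["photosynthesis"],
                       keywords=["light", "chlorophyll"]))             # 2
```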

Yes/No: the participant selects "Yes" or "No" in response to the question.

Adobe Flash: Perception supports an interface with Adobe Flash that allows
programmers to program customized items using Flash and have the results recorded
within the answer database.
Try a sample Adobe Flash question
Passive vs. active participation

Adobe Captivate Simulations: Perception supports an interface with Adobe Captivate
that allows subject matter experts to create simulations that can provide scoring
information for multiple interactions and have the results recorded within the answer
database.
Try a Question using an Adobe Captivate Simulation

Spoken Response: the Horizon Wimba Connector lets you record a participant's
voice as the answer to a question. Scores for spoken responses can be processed along
with other test scores using Perception's reporting tools.
See How Spoken Response Questions Work

Java: Perception supports an interface with Java that allows programmers to program
customized items using Java and have the results recorded within the answer database.
Question and Quiz structure
Question Text
Can be HTML, Flash, animation, video, sound, etc.
Question Choices
This depends on the type of question - can be HTML, animated, hotspot,
draggable, etc.
Question Scoring
all or nothing
based on rules: choices A+B but not C = 10 points, A+B+C=5 points, etc.
negative numbers can be used
accumulate points: A=3, B=3, C=3, D=0, E=0
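The scoring options above (all-or-nothing rules like "A+B but not C = 10 points", and per-choice accumulation with zero or negative values) could be modeled like this. The rule format and function names are illustrative assumptions, not how Perception stores its scoring logic:

```python
def score_by_rules(selected, rules, default=0):
    """Return the score of the first rule whose required choices are all
    selected and whose forbidden choices are all absent.
    Each rule is (required set, forbidden set, score)."""
    chosen = set(selected)
    for required, forbidden, score in rules:
        if required <= chosen and not (forbidden & chosen):
            return score
    return default

def score_accumulate(selected, points):
    """Per-choice accumulation, e.g. A=3, B=3, C=3, D=0 (negatives allowed)."""
    return sum(points.get(c, 0) for c in selected)

rules = [({"A", "B"}, {"C"}, 10),      # A+B but not C = 10 points
         ({"A", "B", "C"}, set(), 5)]  # A+B+C = 5 points
print(score_by_rules(["A", "B"], rules))       # 10
print(score_by_rules(["A", "B", "C"], rules))  # 5
print(score_accumulate(["A", "B", "D"], {"A": 3, "B": 3, "C": 3, "D": 0}))  # 6
```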
Block Score
Assessment Score
Outcomes
At the assessment level these are determined by the score bands you
specify. Traditionally UMHS quizzes use 80%, but anything is possible.
You can have more score bands than simply “passed” and “failed”, but this
must be planned in advance since our current template assumes pass/fail.
An outcome at the assessment level delivers feedback to the learner and
sends information about the score to MLearning.
Jumps are determined by block or question-level outcomes, and will be
discussed under branching.
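The score-band logic above can be sketched as a simple lookup; the 80% default mirrors the traditional UMHS pass line mentioned above, but the function and band names are illustrative assumptions:

```python
def assessment_outcome(percent, bands=None):
    """Map an assessment percentage onto an outcome band.
    Bands are (minimum percent, outcome) pairs, checked highest first."""
    if bands is None:
        bands = [(80, "passed"), (0, "failed")]  # traditional pass/fail at 80%
    for minimum, outcome in sorted(bands, reverse=True):
        if percent >= minimum:
            return outcome
    return "failed"

print(assessment_outcome(85))  # passed
print(assessment_outcome(62))  # failed
# More bands than pass/fail are possible but must be planned in advance:
bands = [(90, "mastery"), (80, "passed"), (0, "needs review")]
print(assessment_outcome(83, bands))  # passed
```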
Feedback:
Question level:
 Can be individualized to each choice.
 Can provide immediate feedback for each question if using the
Question by Question format
Topic level (can be used for each learning objective)
 diagnose knowledge and skill deficiencies
 correct misconceptions
 prescribe additional items.
Assessment level - you passed, you failed, and what to do next
All in one page vs. Question by Question format
We have typically used all-in-one page format, but QxQ may be more appropriate
for some types of assessments.
Branching
You can jump back and forth, or from block to block or to the end of the
assessment based on criteria that you set up within the question blocks:
 based on a particular choice: “I prefer X or Y”, “I am an RN”
 based on a block score: below 50%, you get more questions from bank A;
at 50% or above, more from bank B
 based on the total quiz score
Uses:
 a block must be retaken until it is passed
 sort people by level of skill
 sort people by job classification
 games
 scenarios: let the learner choose a path through a complicated scenario
(diagnose or triage a patient, decision trees, etc.); this takes PLANNING.
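The block-score branching described above (below 50%, more questions from bank A; otherwise bank B; or retaking a block until it is passed) could be modeled as follows. Block, bank, and function names are illustrative assumptions:

```python
def next_block(block_score_percent, threshold=50,
               below="bank_A", at_or_above="bank_B"):
    """Decide which question bank the participant jumps to next,
    based on the score for the block just completed."""
    return below if block_score_percent < threshold else at_or_above

def attempts_until_passed(attempt_scores, pass_mark=50):
    """A block must be retaken until it is passed: count attempts
    up to and including the first passing score."""
    for i, score in enumerate(attempt_scores, start=1):
        if score >= pass_mark:
            return i
    return None  # never passed

print(next_block(42))  # bank_A
print(next_block(75))  # bank_B
print(attempts_until_passed([30, 45, 60]))  # 3
```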
Special features:
 Time limits
 Limiting the number of tries on an assessment
Be sure to see us to plan these out.
Text-based and browser-based authoring
If you have a lot of multiple choice questions, you may deliver them to us in ASCII
format. We can import them (this only takes a minute), then you can edit them yourself
through the web. Many authors prefer having this level of control over their content, and
it goes much faster.
We also have a Word template that can be used by authors.
Complicated question types must be planned out and done by our team.
Reference:
A trivia game built with Questionmark Perception
Whitepapers and PowerPoint files:
http://mlearning.med.umich.edu/home/downloads/qmark/
 Assessments through the Learning Process
 Creating Fair and Accurate Tests
 Item Writing: best practices
 Learning Benefits of Questions
 Level 1 assessments
 Measuring Learning Results
 Perception Question types
 Planning Certification programs
 Producing the exam
 Training evaluation
 Writing Test Blueprints and Test Items
 Writing Good Test Questions
Try out some question types