Test Taker Experiences and Beliefs: A Case Study of Recent Examinees

Krista Breithaupt, MCC
Greg Pope, Yardstick Inc.
Bruno D. Zumbo, University of British Columbia
TOPICS
• Snapshot: experiences and beliefs from our test takers, February 2011.
• Evolving notions of social and psychological variables in assessment.
Questionmark Survey
• Case Study: MCC test takers were invited to respond to a
survey in the first two weeks of February 2011.
• These responses are the most current snapshot of MCC
test taker experiences and beliefs.
– MCC sent out an email invitation to 24,000 test takers in our
repository who had registered for exams in the last five years.
• We obtained a fantastic response: 3296 respondents
completed the survey, most in just a few days!
• We used Questionmark Perception to author and
administer the web-based surveys.
– French and English versions of the survey were selectable.
• What follows is a summary of results.
Demographics
The MCC is currently examining our past exam results to confirm
that no gender bias is evident in candidate performance.
Most respondents took the exam within the past five years.
Beliefs of Test Takers – Fees
“I believe that exam
fees are reasonable for
candidates.”
The MCC exams are
comparable to other
certification exams in cost.
However, finding ways to
operate more cost-effectively
will benefit our candidates.
Reviewing existing processes
for operational efficiency can
reduce our costs in the future.
78% Disagree
Beliefs of Test Takers – Adaptive Designs and Score Appeals
“I believe that computerized adaptive tests
(where different sets of questions are
administered to each candidate
depending on their answers to previous
questions on the exam) are fair.”
71% Strongly Agree
or Agree
71% Strongly Agree
or Agree
“I believe that if I need to appeal my
exam score that there is a fair and
accessible method for me to do so.”
Beliefs of Test Takers – Bias and Security
“I believe that the questions on the
certification and licensing exams are not
biased against one gender or a specific
racial demographic group.”
91% Strongly Agree
or Agree
35% Strongly Agree or
Agree
“I believe that cheating on certification
and licensing exams is a significant
problem.”
Beliefs of Test Takers – Security…
“I believe that it is easy to obtain leaked
actual exam questions on the internet if
I wanted to find them.”
Perhaps the biggest threat to
test security is collusion among
test takers in high-stakes
examinations: sharing what
they recall after taking the test.
More and more testing programs
are taking proactive steps to
identify violations involving
copyright-protected test content
and to act when they occur.
19% Strongly Agree or
Agree
Beliefs of Test Takers – Security…
“I believe that using automatic
statistical methods to detect people
cheating on tests should be done more
often.”
62% Strongly Agree or Agree
“I believe that biometric
authentication (e.g., fingerprint
analysis) is a good way to
identify candidates.”
64% Strongly Agree
or Agree
Candidates see increased value in their
credential if we’ve been aggressive about
security.
Beliefs of Test Takers – Security…
“I believe that it is acceptable to be video and/or
audio recorded during an exam.”
63% Strongly Agree or Agree
36% Strongly Agree or
Agree
“I believe that remote exam proctoring
using web cameras and audio
listening devices is as effective as
in-person exam proctoring methods.”
At MCC, we administer our exams outside of secure test centers. This has been a
convenient way to hold exams in accessible locations (medical schools and
hospitals) across Canada. It may be time to closely examine the impact on fees,
and the potential security risks, that this convenience creates.
Beliefs of Test Takers – Fairness and CBT
63% Strongly Agree or Agree
“I believe that scores that I obtain on
the exam are an accurate reflection of
what I know and can do.”
“I prefer taking computer-based exams
versus paper-and-pencil exams.”
Only 35% Strongly Agree or Agree
While only 65% of our test takers see the
exams as a strong predictor of practice,
this may be appropriate and true! We
need to expand our measurement to
broader competencies.
Clinical Skills Exams (OSCEs)
“The exam where candidates attend a
series of clinical situations with a
standardized patient is an accurate measure
of what a physician would do in practice.”
Only 54% of candidates see
the OSCE as a valid
representation of their practice
skills. What is contributing to
their belief that these are not
representative performances?
Beliefs of Test Takers – Meaning of Scores
“I believe that the exam scores I obtain
accurately predict how well I will do in my
profession; e.g., higher exam scores = higher
pay or greater promotions.”
56% Strongly Agree or Agree
“I believe that if I took the same
certification or licensing exam again, I
would obtain a similar score.”
90% Disagree!
Candidates recognize that success in the profession
depends on a broad range of skills and expertise
outside the domain of the exams. However,
many don’t trust the reliability of scores. What is the
basis of their view that the exam may lack relevance?
Beliefs of Test Takers – Score Reports and Practice
Tests
93% Strongly Agree or Agree
“I believe that the reports I obtain
regarding my performance on the exam
are clear and concise.”
65% Strongly Agree or Agree
“I believe that taking practice tests
before I take the actual certification or
licensing exam reduced my stress and
anxiety when taking the actual exam.”
MCC could explore how to communicate more clearly what scores mean. Focus
groups are often useful in determining how the reports are in fact used, and where
misinterpretations may occur. Additional practice tools would be well received, too!
WHAT DOES ALL THIS MEAN FOR MCC AND THE TESTING INDUSTRY?
An Evolving Notion:
…a social psychology of assessment & testing is
developing that forces us to consider the
experiential and contextual factors in our exams.
Experiential Factors
• We are beginning to form a more formal
understanding of the social psychological and
experiential variables that may be relevant to
assessment and testing.
– Social psychological (experiential and contextual)
variables may have us asking familiar questions about:
• The role (effects) of attitudes toward testing.
• The role (effects) of emotional factors like stress.
• The role (effects) of test taker characteristics and the
context they live in (e.g., gender).
Experiential Factors
• Let us remind ourselves of our core
responsibilities in test development:
– Providing test scores (and test data) that foster
valid inferences and decisions.
– Zumbo (2007, 2009) has presented a view of test
validity that shines a light on these contextual
and social psychological (experiential) variables,
drawing them out of the background and bringing
them to the forefront of the test validation process.
Experiential Factors
• But what are our responsibilities as test
developers and providers for the experiential
factors?
– With our responsibilities in mind, we can begin to
imagine that some experiential variables pose
challenges to valid score interpretations, whereas
other factors are, at worst, nuisances that don’t
muddy the construct assessed.
Experiential Factors
• In terms of our responsibilities, we can work
from the advice of Messick and Cronbach.
– We can work from the idea of tracing the source
of experiential factors to construct-relevant
versus construct-irrelevant factors.
– When making valid inferences from the test
scores, we must distinguish between:
• Deterrents (construct-relevant variance)
• Nuisances, but not deterrents (construct-irrelevant
variance)
Let’s take as an example an
experiential variable that may
require more attention as a
potential deterrent:
Examinee Stress
Experiential Factors
• The matter of stress now comes down to
tracing the “source” of the stress to either
construct-irrelevant or construct-relevant
variance, and this requires input from the test
design and from how test scores are used in
the profession.
– We can make this matter more complex by looking
at the moderating effect of test taker gender.
“When I am taking a certification or licensing exam I
feel a great deal of stress and anxiety.”
Even in a convenience sample
of responders, we see
important differences by
gender. Validation research is
needed to understand this
factor.
Agree/Strongly Agree – Females: 91%, Males: 81%
In the total sample, 86% strongly agree or agree.
χ² = 98.2, df = 3, p < .0001
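The gender comparison above is a Pearson chi-square test of independence on a gender-by-response contingency table (df = 3 implies two genders by four response categories). A minimal sketch in plain Python, using hypothetical cell counts chosen only to roughly match the reported 91%/81% agreement rates, not the actual MCC survey data:

```python
# Hypothetical counts for a 2 x 4 table: gender (rows) by Likert response
# (Strongly Agree, Agree, Disagree, Strongly Disagree). The deck reports
# only percentages, so these cells are illustrative.
observed = [
    [900, 737, 120, 43],   # females (~91% agree overall)
    [600, 614, 210, 72],   # males   (~81% agree overall)
]

def chi_square(table):
    """Pearson chi-square statistic and degrees of freedom for a two-way table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

stat, df = chi_square(observed)
print(f"chi-square = {stat:.1f}, df = {df}")  # df = 3 for a 2 x 4 table
```

With real cell counts, the same computation reproduces the reported statistic; the p-value would then come from the chi-square distribution with 3 degrees of freedom (e.g., `scipy.stats.chi2.sf`).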
An example of a variable that may
be a nuisance, but not necessarily a
deterrent to valid score inferences:
Test Fees
Age Range Analysis: “I believe that exam fees
are reasonable for candidates.”
We expect new
professionals to object
more strongly to fees,
compared with those
more advanced in their
practice.
This may be useful information
for costing programs, but it is
neither statistically significant
nor a validity threat.
Rating Concordance – Security Web/Audio by
OSCE Validity
• Test takers who believe the
OSCEs are valid are supportive
of greater security, as are those
who see cheating as a
problem.
• Younger test takers are more
likely to see cheating as a
problem.
• These are statistically
significant associations, but
what does this mean for how
we design future
assessments?
Future Considerations
• In order to drive program and service improvement, it
is important to reach out to our test takers and
understand their experiences.
• A survey or focus group approach is useful in
determining and validating program strategy.
• These trends are similar to those in other licensure
examinations.
• What are our responsibilities for test taker attitudes
and beliefs?
– How can we meet the challenge of better service and
improved communication in evolving our programs?
Krista Breithaupt
kbreithaupt@mcc.ca
Greg Pope
gregp@getyardstick.com
Bruno D. Zumbo
bruno.zumbo@ubc.ca