
Introduction
The PNAT-PN was designed to measure the competencies required for success in
a program leading to licensure as a practical/vocational nurse. Therefore, it is intended to
be a predictor of success in an LPN/LVN educational program.
The test was developed systematically by first surveying approximately 500
schools preparing LPN/LVNs to determine the competencies expected or required for
admission to the programs. Although the survey was anonymous, postmarks indicated
that returns were received from all 50 states, the Virgin Islands, and Puerto Rico. From
the results of the survey, the test blueprint was developed. The results of this survey are
described in the next section of this manual.
From the beginning, every attempt was made to match the content of the test as
closely as possible to the skills needed by students in an LPN/LVN program. That is,
reading passages were written about content similar to content in LPN/LVN textbooks,
math items considered the numerical skills required in LPN/LVN nursing practice, and
vocabulary items contained nontechnical terms that are commonly used in nursing.
Items were written to correspond to the blueprint and then were experimentally
tested on a large sample of students (N = 3,408) who had recently been admitted to
LPN/LVN programs. Following statistical analysis, items were selected for the final test.
A sufficient number of items were pretested to develop two different, parallel forms of
the final Pre-Nursing Assessment Test-PN.
At the time of the experimental testing, students were asked to sign release forms
permitting C-NET to obtain their final grade point averages and NCLEX-PN results.
Survey
Reading ability. Over half (53.7%) of the 492 respondents expected a reading
level at 12th grade or higher for admission. Another 38.7% expected a 10th or 11th grade
reading level. Only 7.5% expected a reading level at 8th or 9th grade. The generally high
reading level expected/required is consistent with the level of reading needed to
comprehend the written material in standard textbooks for practical/vocational nursing
students.
Numerical ability. Similarly, there was strong agreement about the level of
mathematics achievement expected for admission to the LPN/LVN program. Most
schools expected competence in basic functions (addition, subtraction, multiplication, and
division), using whole numbers, fractions, decimals, and percents. Over 98% also
expected the ability to convert fractions to decimals or percents. Somewhat fewer
programs (84%) expected knowledge of ratio and proportion, and only 10% expected
competence in plane geometry.
Other competencies. The majority of programs expected competence in general
science (70%), general health (68%), and biology (50%). Only 21% expected
competence in chemistry. Many schools indicated that they expected students to have the
ability to understand science concepts as they are applied in nursing, rather than having
specific knowledge of current scientific facts. Similarly, schools often cited the need for
such competencies as reading comprehension and critical thinking or a related skill, e.g.,
“judgment,” “reasoning,” “problem-solving,” “decision-making.” Other areas mentioned
were vocabulary and spelling skills needed to communicate effectively in nursing.
Development of the Blueprint (Test Specifications)
Using the information from the survey results, the following three major areas of
the test were identified for the test blueprint for the Pre-Nursing Assessment Test-PN:
A. Reading Comprehension/Reasoning Ability – 40 questions (1 hour).
B. Numerical Ability – 50 questions (1 hour).
C. Language Ability – 60 questions (1 hour).
The total number of items was established as 150. Each of the major areas was assigned
one hour of test time, for a total of three hours of testing.
Reading Comprehension/Reasoning Ability. The reading passages in the test
were designed to be similar in content and reading level to the reading required in PN
textbooks. Topics of passages include nutrition, infection, safety, child abuse, exercise,
etc. The questions in the Reading Comprehension/Reasoning Ability area measure the
following abilities:
A. Reading comprehension – 20 items.
   1. Identify main ideas.
   2. Understand/comprehend details.
B. Reasoning ability – 20 items.
   1. Recognize unstated assumptions.
   2. Distinguish facts from opinions.
   3. Determine strong and weak points of an argument.
   4. Evaluate the importance of arguments and ideas.
   5. Draw conclusions/inferences.
   6. Apply information to another context or new situation.
   7. Interpret (explain, tell meaning, restate).
Numerical Ability. The numerical ability section focuses on the types of
operations needed to succeed in a program preparing practical/vocational nurses. The
questions in the Numerical Ability area measure the following abilities:
A. Basic operations with whole numbers (add, subtract, multiply, divide) – 10 items.
B. Fractions, percents, decimals (basic operations plus conversions of fractions to
   decimals & vice versa) – 20 items.
C. Using skills in applied situations, e.g., ratio and proportion, formulas, and
   conversions (Fahrenheit to Centigrade and vice versa, IV drip rates, pounds to
   kilograms, converting household to metric system and vice versa). Formulas and
   conversions are provided for students who are asked to apply them in a
   situation – 20 items. (Illustrative versions of these conversions appear in the
   sketch after this list.)
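As a concrete illustration of the applied skills named in item C above, the sketch below
shows standard textbook versions of three of the conversions in Python. These are
general-purpose formulas offered only for illustration; they are not the specific formulas
or conversion factors printed in the test booklet.

    # Illustrative nursing-math conversions of the kind the applied items describe.
    def fahrenheit_to_celsius(temp_f: float) -> float:
        """Convert Fahrenheit to Centigrade (Celsius): C = (F - 32) * 5/9."""
        return (temp_f - 32) * 5 / 9

    def pounds_to_kilograms(pounds: float) -> float:
        """Convert pounds to kilograms (1 kg is approximately 2.2 lb)."""
        return pounds / 2.2

    def iv_drip_rate(volume_ml: float, minutes: float, drop_factor_gtt_per_ml: float) -> float:
        """Drops per minute = (volume in mL x drop factor in gtt/mL) / time in minutes."""
        return volume_ml * drop_factor_gtt_per_ml / minutes

    print(round(fahrenheit_to_celsius(98.6), 1))   # 37.0
    print(round(pounds_to_kilograms(154), 1))      # 70.0
    print(round(iv_drip_rate(1000, 8 * 60, 15)))   # 31 gtt/min (1,000 mL over 8 hours at 15 gtt/mL)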
Language Ability. The questions in the Language Ability section involve
common, non-medical terms that are used frequently in nursing practice. The questions
in the Language Ability area measure the following abilities:
A. Antonyms and synonyms (synonyms in context) – 20 items.
B. Spelling – 20 items.
C. Grammar and usage (sentence correction) – 20 items.
Item Writing and Experimental Testing
Multiple-choice test questions for the Pre-Nursing Assessment Test-PN were
written by C-NET staff with the assistance of experienced secondary school teachers. All
items were reviewed, revised, and edited prior to assembly in test booklets for
experimental testing.
Experimental testing was done to develop a norming sample and to statistically
test the items. In order to experimentally test a large number of items, four experimental
test booklets were prepared. To keep the testing time to two hours, each booklet
consisted of 100 items. Each test corresponded to the test blueprint, with 25 items on
Reading/Reasoning Ability, 35 items on Numerical Ability, and 40 items on Language
Ability. Identical “anchor” items were included in each of the four tests in order to
eventually place all of the items on the same scale, using item response theory (IRT)
equating methods.
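The manual does not identify the specific IRT equating procedure that was applied. Purely
as an illustration of how identical anchor items allow item parameters from different
booklets to be expressed on one scale, the Python sketch below performs mean/sigma
linking of hypothetical anchor-item difficulty estimates; it is one simple common-item
method, not necessarily the one C-NET used.

    from statistics import mean, stdev

    def mean_sigma_link(anchor_b_new, anchor_b_ref):
        """Return (A, B) such that b_ref is approximately A * b_new + B for the anchor items."""
        A = stdev(anchor_b_ref) / stdev(anchor_b_new)
        B = mean(anchor_b_ref) - A * mean(anchor_b_new)
        return A, B

    def rescale(b_values, A, B):
        """Place item difficulties estimated from a new booklet on the reference scale."""
        return [A * b + B for b in b_values]

    # Hypothetical difficulty estimates for the same anchor items in two booklets.
    anchor_ref = [-1.2, -0.4, 0.3, 1.1]   # booklet 1 (reference scale)
    anchor_new = [-1.0, -0.2, 0.5, 1.3]   # booklet 2 (its own scale)
    A, B = mean_sigma_link(anchor_new, anchor_ref)
    print(rescale([-0.8, 0.0, 0.9], A, B))  # booklet-2 items placed on the booklet-1 scale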
A total of 102 schools in 32 states participated in the experimental testing, which was
administered to 3,407 students who had recently been admitted to practical/vocational
nursing programs.
In selecting items for the final test, items that were too easy or that had flaws
revealed by item analysis were eliminated. Therefore, the final test form is more
difficult than any of the experimental forms; i.e., the average difficulty of the final form
is 73%, compared to 77% or 78% for the experimental forms.
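In classical item analysis, an item's difficulty index is simply the proportion of
examinees who answer it correctly, so a lower average (73% versus 77% or 78%) means the
final form is harder. The Python sketch below computes these p-values for a small set of
hypothetical scored responses and flags very easy items; the 0.90 cutoff is arbitrary and
is not the criterion used in assembling the final form.

    from statistics import mean

    def item_p_values(responses):
        """responses: one list of 0/1 scored answers per examinee; returns per-item difficulty."""
        n_items = len(responses[0])
        n_examinees = len(responses)
        return [sum(person[i] for person in responses) / n_examinees
                for i in range(n_items)]

    def flag_too_easy(p_values, cutoff=0.90):
        """Indices of items answered correctly by at least `cutoff` of examinees."""
        return [i for i, p in enumerate(p_values) if p >= cutoff]

    scored = [               # hypothetical responses: 4 examinees x 5 items
        [1, 1, 1, 0, 1],
        [1, 1, 0, 1, 1],
        [1, 0, 1, 0, 1],
        [1, 1, 1, 1, 1],
    ]
    p = item_p_values(scored)
    print([round(x, 2) for x in p], "average difficulty:", round(mean(p), 2))
    print("too-easy items:", flag_too_easy(p))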
After items were selected for the final form, percentiles were calculated by
determining the ability (theta) levels of the students in the experimental sample and their
corresponding scores on the experimental forms of the test. These scores were then
linearly transformed to determine corresponding percentiles on the final, real test. From
these data, additional classical statistics were derived.
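The exact transformation is not spelled out in the manual. As an illustration of how a
norming sample yields a percentile table, the sketch below applies the conventional
percentile-rank rule (the percentage of the sample scoring below a value plus half the
percentage scoring at it) to a small set of hypothetical total scores.

    def percentile_ranks(norm_scores):
        """Map each observed score to its percentile rank within the norming sample."""
        n = len(norm_scores)
        table = {}
        for s in sorted(set(norm_scores)):
            below = sum(1 for x in norm_scores if x < s)
            at = sum(1 for x in norm_scores if x == s)
            table[s] = 100 * (below + 0.5 * at) / n
        return table

    norm = [82, 95, 110, 110, 118, 123, 130, 135, 141, 146]  # hypothetical totals
    print(percentile_ranks(norm)[110])  # 30.0, i.e., the 30th percentile in this tiny sample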
Summary of Validity Study
Data Collection. At the time of the experimental testing, students were asked to
sign release forms permitting C-NET to obtain their final grade point averages (GPAs)
and NCLEX-PN pass/fail result.
Usable returns were received from 48 LPN/LVN programs located in 26 states.
The number of students in the final sample for the validity study was 1,333. Data were
analyzed using the SPSS statistical package.
Of the 1,333 students in the sample, 1,038 graduated, 175 did not graduate for
academic reasons, and 120 did not graduate for personal reasons. Analysis of variance
was used to determine differences in performance on the PNAT-PN between those who
graduated and those who did not graduate for academic reasons. The graduate group
earned significantly higher total and subarea scores on the PNAT-PN.
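The published analysis was run with SPSS. Purely as an illustration of the comparison
being described, the Python sketch below performs the same kind of one-way analysis of
variance on hypothetical PNAT-PN total scores for graduates and for students who did not
graduate for academic reasons; the score values and group sizes are invented.

    from scipy import stats

    graduates = [121, 118, 130, 112, 125, 119, 127, 133]   # hypothetical total scores
    academic_nongrads = [104, 99, 111, 96, 108, 102]       # hypothetical total scores

    f_stat, p_value = stats.f_oneway(graduates, academic_nongrads)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")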
NCLEX-PN pass/fail results were available for 892 of the students in the sample;
of these, 832 passed and 32 failed NCLEX-PN. Analysis of variance was used to
determine significant differences between passers and failers in GPA, as well as in
performance on the total and subarea scores of the PNAT-PN. The passers performed significantly
higher on all PNAT-PN scores.