CoA Medical Coding

2013-2014 Assessment Plan Report
Date Submitted: February 2015
School: Engelstad School of Health Sciences
Program: Medical Coding
Person(s) responsible for the design and implementation of the assessment plan and writing the report: Peggy Perkins-Arnot, RHIA, CCS-P
1. Project Overview and Assessment Goals
A. List the program Learning Outcome(s) that the plan is assessing.
Outcome 2: Evaluate diagnostic/procedural medical codes and groupings for inpatient,
outpatient, and physician records according to current guidelines and regulations.
Outcome 4: Validate accuracy of computer assisted coding assignment with encoder software
and other electronic software technology in coding and billing processes.
B. Provide a brief description of the plan and the assessment question(s) being addressed.
• Outcome 2 (question): Were students able both to apply and to evaluate diagnostic/procedural medical codes for inpatient scenarios, assigning MS-DRG groupings and POA designations appropriately according to current guidelines as understood by the American Health Information Management Association (AHIMA)?
This outcome was addressed within an assessment instrument delivered to students
through an online activity in the American Health Information Management Association
(AHIMA) Virtual Lab titled “Assigning MS-DRGs and POA Designations.” In this instrument,
students are required to complete the assessment with 100% accuracy using the Nuance Quantim Encoder. Multiple tries are allowed; however, those tries are recorded by the system, giving further data to assess.
• Outcome 4 (question): Were students able to analyze current regulations in clinical classification systems for compliance with ethical coding concerns well enough to apply those concerns appropriately to the physician query process as understood and promoted by the American Health Information Management Association (AHIMA)?
This outcome was addressed within an assessment instrument delivered to students
through an online activity in the American Health Information Management Association
(AHIMA) Virtual Lab titled “Quantim Physician Query.” In this instrument, students are
required to complete the assessment with 100% accuracy using the Nuance Quantim Encoder. Multiple tries are allowed; however, those tries are recorded by the system, giving further data to assess.
C. If this report does not correspond to the most recent program assessment plan sent to the school assessment coordinator, please submit a copy of the revised plan along with this report.
The most current plan submitted describes the following outcome assessment method:
“The final assessment will be delivered to students at the end of their course studies in HIT 210
Coding Practice Experience. Effectiveness of learning will be demonstrated by student scores
on competencies of at least 73%."
D. Indicate when the outcome(s) were assessed and what conclusions were reached from the
previous assessment, and what “closing the loop” changes, if any, were made based on those
conclusions. Also indicate whether the assessment method used previously is the same as the
one described in this report. If not, what has changed?
• Outcome 2: Evaluate diagnostic/procedural medical codes and groupings for inpatient, outpatient, and physician records according to current guidelines and regulations.
• Outcome 4: Analyze current regulations in clinical classification systems for compliance with ethical coding and privacy and security concerns.
Both of the above outcomes were addressed at the same time within the Spring 2015 HIT 210
Coding Practicum. They were not previously assessed.
The next report will assess the entire range of outcomes using a mock exam patterned after
the AHIMA exam.
2. Project Design and Coherence
A. Identify the student product(s) used for direct assessment of the Learning Outcome(s) that
you listed in 1.A. Explain the context for this product [course name(s) and number(s), place in
curriculum, instructor(s), and so forth]. NOTE: If your project depends on anonymity, report
only contextual information that doesn’t need to be anonymous. If the project focused on a
single or common assignment, please attach the assignment handout or explanation in the
syllabus as an appendix.
Both of the assessment instruments used for this project were produced, delivered, and scored
by the American Health Information Management Association (AHIMA) Virtual Lab. They were
delivered within the culminating course for the Medical Coding Certificate, HIT 210. Faculty:
Peggy Perkins-Arnot.
B. Explain how the student product was scored and by whom [for example, objective scoring by
machine; course instructor using a rubric; judging panel using a rubric]. If the project used a
rubric, please explain who created the rubric and attach it as an appendix.
The exam was electronically delivered within HIT 210 during Spring 2015. It was electronically
scored by the AHIMA Virtual Lab. To pass, students had to complete the assessment event with a
score of 100%. Students were allowed multiple attempts to succeed; however, the scores and
attempts were registered and noted by the AHIMA Virtual Lab. This assessment is delivered
across the nation to many coding students and has been vetted by professionals for validity and
integrity.
C. Explain the “fit” or “match” between the program Learning Outcome(s) being assessed and the
student product used as a direct measure. In other words, how fully does the quality of the
product reveal achievement of the learning objective? [Sometimes there may be a one-to-one
correspondence between the learning objective and the product. At other times, only some
features of the product are relevant to the learning objective. In such cases, the learning
objective might be assessed only by one or two rows of a rubric or by a few selected questions
on an exam.]
Since these outcomes involve higher-level cognitive skills, any single assessment product can
employ only some of the skills necessary to achieve the outcome at one time. That is what these
products are designed to accomplish.
D. Explain how program faculty defined achievement terms (example: minimally competent,
proficient, aspiring, satisfactory etc.) for the learning objective and how they distinguished
between the levels of achievement (criteria). If the project used a rubric, does the rubric
clearly indicate these categories and specify the corresponding criteria? If not, explain how
rubric scores correspond to these categories. If program faculty haven’t yet defined terms and
criteria for achievement of the learning objective(s), how and when do you plan to do so?
The rubric for the learning outcomes was determined by setting passing scores at the thresholds
generally accepted for earning a "C" in a course:
Score range   Achievement level       Action
0-64%         Unprepared              Requires immediate action
65-74%        Unprepared              Remedial work required
75-89%        Prepared (Acceptable)   No action required
90-100%       Prepared (Exceptional)  No action required
E. If your project used a rubric, did program faculty try to establish inter-rater reliability in the
use of the rubric? If so, explain how. If not, explain why.
Inter-rater reliability is established by using products written by professionals in the field, both
academic and industry. The instrument is reviewed and/or updated on an annual basis. Students
take these exams through a website, and they are graded electronically.
3. Project Methods
See 2.B.
4. Project Results
Report your results as a table or chart showing the number of student products evaluated and the
distribution of performances across the quality categories shown in your rubric or across the continuum of
objective scores. Particularly highlight the percentage of performances meeting your program’s
aspirational goals and the percentage failing to meet minimal standards.
[Chart: Outcome 2 results]

[Chart: Outcome 4 results]
5. Discussion of Results
This section of your report is central to the assessment process and therefore essential to the assessment
report. Explain what program faculty learned through their analysis of the results. Discuss results from the
perspective of both summative and formative assessment. Summative assessment: To what extent are
program faculty members satisfied with these results? Why or why not? Formative assessment: what
problem areas or patterns of weakness were uncovered? How might these problem areas be addressed
through changes in pedagogy, course or assignment design, or sequencing of instruction? If the outcome(s)
have been previously assessed, what changes in student performance have been revealed?
The results show no problems with the Outcome 2 assessment, nor any significant problems with the final
Outcome 4 results. However, students' initial attempts at Outcome 4 fell well below the 74% competency
level (remedial action needed) and close to the range the program established as requiring immediate
follow-up. This indicates that the students were not fully prepared, and a thorough review of the courses
leading to the practicum is recommended.
6. "Closing the loop" actions
What follow-up actions do program faculty plan to take next as a result of this project? You might decide
to institute changes, decide that no changes are needed, or gather more data. Typically, your action plan
will fall into one of the following categories. Choose the most appropriate.
A. If your assessment data suggest changes that might improve student learning, what changes will
you make? (Changes might include new assignments, shifts in pedagogy, closing of gaps in the
curriculum, improved scaffolding, more effective sequencing, and so forth.) How and when will
you try to assess whether the changes were helpful?
Coding courses will be reassessed at the program level to ensure that the appropriate content is
addressed at an advanced level.
7. Project Ownership
To what extent were all program faculty involved in this project’s discussion and analysis? When, how,
and by whom were assessment findings discussed and decisions made about appropriate actions to take
in relation to these findings?
Documentation of faculty input in assessment process 2013-2014.
Date                    Faculty involved                                  Meeting subject
09/22/2013              Cassie Gentry, Peggy Perkins-Arnot, Rhonda Faul   HIT Assessment Committee training delivered via email by Peggy Perkins-Arnot
09/23/2013              Cassie Gentry, Peggy Perkins-Arnot, Rhonda Faul   Assessment Committee Report presented to committee via email
01/14/2014              Cassie Gentry, Peggy Perkins-Arnot, Rhonda Faul   Email meeting to committee regarding request for review of both HIT and Medical Coding Assessment plans
03/07/2014              Cassie Gentry, Peggy Perkins-Arnot, Rhonda Faul   Email meeting regarding decision not to include a Medical Transcription Assessment plan in addition to the HIT and Medical Coding plans
04/22/2014              Peggy Perkins-Arnot                               Delivery of draft assessment plans to Program Director
08/25/2014              Cassie Gentry, Peggy Perkins-Arnot                Email meeting regarding draft assessment plans
08/27/2014              Cassie Gentry, Peggy Perkins-Arnot                Email meeting regarding finished assessment plans
09/18/2014              Cassie Gentry, Peggy Perkins-Arnot                Email meeting discussing Curriculum Map completed by Peggy Perkins-Arnot and competency assessments within HIT program
09/23/2014              Cassie Gentry, Peggy Perkins-Arnot                Assessment Committee Meeting
11/04/2014-11/05/2014   Cassie Gentry, Peggy Perkins-Arnot                Email discussion of Program Annual Assessment Plans
02/18/2015              Cassie Gentry, Peggy Perkins-Arnot, Rhonda Faul   Assessment Program Report follow-up for 2014 presented to committee via email
02/25/2015-02/26/2015   Cassie Gentry, Peggy Perkins-Arnot, Rhonda Faul   Email meeting regarding request for 2013-2014 Assessment Plan Report to ESHS