Rasmussen's Performance-based Actions: Errors in Radiotherapy

Errors in Radiotherapy
Rasmussen's Performance-based Actions
Bruce Thomadsen
Shi-Woei Lin
University of Wisconsin-Madison
Slides © Bruce Thomadsen
One Example of Error Analysis in Radiotherapy
Errors
• Systematic Errors:
– Usually one mistake tucked into the procedure
– Affects all, or a large class of, patients
– Often found in a process audit
– Must be rooted out
• Random Errors:
– Happen on a per-patient basis
– May be caught through QM
– Will never be eliminated (because of creativity)
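To make the distinction concrete, here is a minimal, hypothetical sketch (not from the study; every number is invented for illustration) of how one systematic mistake, such as a biased calibration factor, shifts every patient's treatment, while random slips scatter from patient to patient:

```python
import random

NOMINAL_DOSE = 100.0        # prescribed dose, arbitrary units
CALIBRATION_FACTOR = 1.05   # should be 1.00; the 5% bias is the systematic error

def delivered_dose(patient_seed: int) -> float:
    """Dose actually delivered to one patient (toy model only)."""
    rng = random.Random(patient_seed)
    systematic = NOMINAL_DOSE * CALIBRATION_FACTOR  # affects every patient alike
    random_slip = rng.gauss(0.0, 1.0)               # per-patient random variation
    return systematic + random_slip

doses = [delivered_dose(i) for i in range(1000)]
mean_dose = sum(doses) / len(doses)
# The mean sits near 105: the 5% bias that a process audit would expose.
# The scatter about the mean is the random component that per-patient QM
# checks have to intercept, and it never fully disappears.
print(f"mean delivered dose: {mean_dose:.1f}")
```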
Analysis
• We did a study of brachytherapy errors based on all misadministrations reported to the NRC.
• We performed several analyses of the events.
Caveat
• The NRC does NOT keep any records of physician errors in diagnosing or prescribing; that, they say, would be dictating medicine.
• The only data are on deviations from prescriptions.
• We constructed a process tree for the procedure.
• We constructed a fault tree for the procedure.
• For each event, we:
– contacted the principals and got the facts,
– constructed a root-cause analysis tree,
– marked the position of the failure on the fault and process trees, and
– classified the events using three taxonomies.

Example Root-cause Analysis Tree

[Figure: root-cause analysis tree for an example HDR event in which the step size (parameter) was wrong and the error was not identified. Contributing branches include: the arrow key rotated through different sizes in the step field, and the step size stayed at the wrong value when the physicist moved it with the mouse; the computer would not transfer the file at that size, requiring a manual entry; the physicist was interrupted and distracted by another person and did not notice the error; the dosimetrist who normally checks the plan had left with an emergency; the covering dosimetrist was not familiar enough with the program; and the physician simply missed the error. The branches carry classifications such as (R) manual variability, (R) spontaneous human variability, (R) excessive demand on knowledge/training, (R) information not seen or sought, (S) external interference, (S) distraction, (S) incomplete rule, (S) inadequate search, and (S) bounded rationality.]
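A minimal sketch of how such a tree could be encoded for later tallying, assuming a simple node structure invented here for illustration (the study itself published no code):

```python
from __future__ import annotations

from dataclasses import dataclass, field

@dataclass
class CauseNode:
    """One node of a root-cause analysis tree."""
    finding: str                       # what was observed, in plain language
    category: str | None = None        # e.g. "(R) Manual variability"
    children: list[CauseNode] = field(default_factory=list)

# Hypothetical encoding of the step-size example above.
step_size_event = CauseNode(
    "Treatment used the wrong step size",
    children=[
        CauseNode("Step size stayed at the wrong value when moved with the mouse",
                  "(R) Manual variability"),
        CauseNode("Manual entry required because the file would not transfer",
                  "(S) External interference",
                  children=[CauseNode("Covering dosimetrist unfamiliar with the program",
                                      "(R) Excessive demand on knowledge/training")]),
        CauseNode("Plan check failed to identify the error", "(S) Inadequate search"),
    ],
)

def categories(node: CauseNode) -> list[str]:
    """Collect every classification in the tree, for tallying across events."""
    found = [node.category] if node.category else []
    for child in node.children:
        found.extend(categories(child))
    return found

print(categories(step_size_event))
```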
LDR Brachytherapy Process Tree 1: Placement Followed by Dosimetry

[Figure: process tree of the procedures leading to an LDR patient treatment, with the number of reported events failing at each step marked on the branches. Major branches cover the prescription (clinical stage, protocol, dose and other treatment information); applicator selection, placement, fixation, and check; patient identification; targeting, localization, and imaging (fiducials, dummies, image quality, correct films, interpretation, anatomic information); source selection, calibration (well-chamber measurement and reading), preparation, strength calculation, and insertion in the carrier; dosimetry and dose/time calculation (entering data in the computer, optimization, limitations of algorithms, calibration factors); planning quality assurance and reconstruction geometry; source loading, fixation, and verification; recording and time recording; monitoring of the executed treatment duration; treatment termination, removal preparation, removal-time verification, and source removal; and post-treatment monitoring, source count, radiation survey, and emergency response.]
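The process-tree analysis reduces to counting, for each step of the procedure, how many reported events failed at that step. A hedged sketch of that bookkeeping, with abbreviated step names and fabricated events purely for illustration:

```python
from collections import Counter

# Each reported event is reduced to the process-tree step(s) at which it failed.
# This event list is fabricated; the real study tallied the NRC reports.
events_failed_steps = [
    ["Source selection"],
    ["Source loading", "Applicator fixation"],
    ["Entering data in computer"],
    ["Source selection", "Source loading"],
]

step_failures = Counter(step for event in events_failed_steps for step in event)

# Ranking the steps by failure count is what identifies the most hazardous
# steps quoted in the conclusions that follow.
for step, count in step_failures.most_common():
    print(f"{step}: {count} event(s)")
```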
Conclusions from Process Tree Analysis 1
• For HDR:
– By far the most common step with failure was entering the treatment distance, usually not changing the default value.
– Almost all steps in treatment-unit programming or delivery had some errors.
– Dose specification accounted for several errors.
– The only problems with source calibration were in entering the calibration data into the treatment planning computer.
Conclusions from Process Tree Analysis 2
• For LDR (placement followed by dosimetry):
– Errors in four steps accounted for most of the events:
» selection of the sources,
» loading of sources into the applicator,
» using the required units when entering data into the computer, and
» fixing the sources in the applicator, or the applicator in the patient.
– Most steps in "Source loading," "Dose/time calculation," and "Treatment termination" had errors.
Conclusions from Process Tree Analysis 3
• For LDR (dosimetry followed by placement), errors occurred only in:
– source preparation (usually ordering), and
– source delivery (usually a failure to monitor).

HDR Fault Tree

[Figure: HDR fault tree, drawn over several pages linked by "Go to Page n" connectors. The top event, deviation from adequate treatment (44/44 events), branches through OR and AND gates into wrong patient treated, fractionation failure, wrong applicator used, treatment planning failure, and treatment implementation failure; the implementation branch subdivides into treatment programming failures, applicator positioning and connection errors, treatment delivery errors, and premature termination, each paired with a verification failure. Lower levels cover prescription errors, dose calculation and specification errors, data entry and transfer errors, wrong or incompatible units and data formats (US/European), erroneous source strength data, failure to enter or alter the unit default, inconsistent step size, wrong dwell positions or times, inappropriate or wrongly positioned markers, software, algorithm, and file-corruption problems, use of the wrong patient's data or chart, and QM failures. Branch labels give the number of the 44 HDR events reaching each gate; some parts of the tree are deep and some are broad.]
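The tabulation that follows simply counts how many of the 44 HDR events propagate through each branch (for example, a branch marked 13/44 accounts for roughly 30% of the events). A toy sketch of that idea, with a two-level tree whose structure is simplified and whose event identifiers are invented:

```python
from __future__ import annotations

from dataclasses import dataclass, field

TOTAL_EVENTS = 44  # HDR events in the study

@dataclass
class Gate:
    """A fault-tree node: a named failure plus the events observed at it."""
    name: str
    events: set[int] = field(default_factory=set)  # IDs of events seen at this node
    kind: str = "or"                                # "or" or "and" gate
    children: list[Gate] = field(default_factory=list)

    def reached(self) -> set[int]:
        """Events propagating to this node from its children or observed directly."""
        if not self.children:
            return self.events
        sets = [child.reached() for child in self.children]
        combined = set.union(*sets) if self.kind == "or" else set.intersection(*sets)
        return combined | self.events

# Toy subtree: three top-level branches with invented event IDs.
top = Gate("Deviation from adequate treatment", kind="or", children=[
    Gate("Treatment planning failure", events=set(range(13))),            # 13 events
    Gate("Treatment implementation failure", events=set(range(13, 43))),  # 30 events
    Gate("Wrong patient treated", events={43}),                           # 1 event
])

for branch in top.children:
    fraction = len(branch.reached()) / TOTAL_EVENTS
    print(f"{branch.name}: {len(branch.reached())}/{TOTAL_EVENTS} = {fraction:.0%}")
```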
Summary from HDR Fault Tree Tabulation
• 2/3 of errors in delivery
– 40% of errors due to the default value for distance not being changed.
– Applicator shifting in the patient was the only other frequent problem in delivery.
• 1/3 of errors in treatment planning
– 16% of errors in calculation, but of various types.
– 7% of errors due to incorrect source strength entry.
• Almost all events had failures in verification.

Summary from LDR Fault Tree Tabulation
• 3/4 of errors in delivery
– 11% because the patient removed the sources and the staff didn't notice or correct it.
– 12% because the sources were never placed in the applicator correctly.
– 14% because the wrong source strengths were used.
– 8% because the physician placed the applicator incorrectly.

Summary from LDR Fault Tree Tabulation 2
• 1/4 of errors in treatment planning
– 15% of errors in calculation,
» 11% due to incompatible units.
– 8% of errors due to incorrect source strength entry.
• Again, almost all events had failures in verification.

Analysis Based on Taxonomies
• Taxonomies, as you have heard, are ordered and organized classifications.
• They often can give insight into the nature of the errors occurring.
• While we looked at several, and developed our own, we will just present two today.

[Figure: "Rasmussen Human Error Model" bar charts tallying the HDR and LDR events along Rasmussen's "what happened," "how it happened," and "why it happened" pathways. The "what happened" categories include detection, identification, goal, target, task, procedure, and execution; the "how it happened" categories include manual variability, topographic disorientation, stereotype takeover and fixation, familiar association shortcut, familiar pattern not recognized, forgetting an isolated act, mistaken alternatives, other slips of memory, side effects not adequately considered, and information not seen, assumed, or misinterpreted; the "why it happened" categories include spontaneous human variability, excessive demand on knowledge, interfering task, distraction from another person or from the system, excessive physical demand, operator incapacitated, and incorrect instruction.]
Conclusions from Taxonometric Analysis 1
From Rasmussen's Model: "How"
• For both HDR and LDR, the single most common failure mode was "manual variability."
– This is probably an artifact of the model, which expects a human reaction to a plant problem.
– It reflects that the initiating event in medicine is usually some person's action.
• Grouped, the stereotype responses come close.
• Information not seen, assumed, or misinterpreted was also significant; for HDR these formed the dominant failure modes.
Conclusions from Taxonometric Analysis 2
From Rasmussen's Model: "What"
• For both HDR and LDR, noticing the problem was the most significant variable.
– For HDR, mostly the problem was identifying the problem using the verification procedures in place (either they were not adequate or they were not performed).
– For LDR, mostly there were no procedures in place to look for problems.
• For both HDR and LDR, the next-ranking failure was in the execution of procedures, followed by the procedures being wrong.
Conclusions from Taxonometric Analysis 3
From Rasmussen's Model: "Why"
• These categories were not codable for many events.
• The most common classification was the catch-all "Spontaneous human variability."
• "Excessive demand on knowledge" was significant, particularly for HDR, which is more technical.
• Interfering tasks were also important in HDR, which is more intensive at a given time.
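A rough illustration of how events might be tallied along the three Rasmussen pathways for each modality (the records, labels, and counts below are fabricated; only the pathway structure mirrors the charts):

```python
from collections import Counter

# One label per Rasmussen pathway for each event; these records are invented.
events = [
    {"modality": "HDR", "what": "Detection", "how": "Information not seen",
     "why": "Excessive demand on knowledge"},
    {"modality": "HDR", "what": "Execution", "how": "Manual variability",
     "why": "Spontaneous human variability"},
    {"modality": "LDR", "what": "Procedure", "how": "Manual variability",
     "why": "Spontaneous human variability"},
]

for pathway in ("what", "how", "why"):
    for modality in ("HDR", "LDR"):
        tally = Counter(e[pathway] for e in events if e["modality"] == modality)
        # These per-category counts are what the bar charts plot.
        print(modality, pathway, dict(tally))
```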
SMART Human Error Model (Pinball Method), van der Schaaf et al.

[Figure: bar chart of the percentage of HDR and LDR events falling into each category of the SMART human error model, which maps classifications to suggested actions. Categories include external causes (technical, organizational, human behavior), design, construction, materials, protocols, knowledge transfer, management priorities, culture, qualifications, coordination, verification, intervention, monitoring, knowledge-based failures, slips, tripping, patient-related failures, and unclassifiable events.]

Conclusions from Taxonometric Analysis 4
From the SMART model: The results are very similar for both LDR and HDR.
• By far, the dominant failure mode was "Verification failure," followed by "Intervention" (which was scored if someone just goofed).
• Inadequate "Protocols" (i.e., procedures) were important, particularly in LDR. Also in LDR, lack of "Monitoring" was a common problem.
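Because the HDR and LDR groups contain different numbers of events, the SMART chart compares categories as percentages of each modality rather than raw counts. A small, hypothetical sketch of that normalization (all counts invented):

```python
# Hypothetical per-category counts for each modality (not the study's data).
counts = {
    "HDR": {"Verification": 30, "Intervention": 20, "Protocols": 8},
    "LDR": {"Verification": 33, "Intervention": 21, "Protocols": 15},
}

for modality, tally in counts.items():
    total_events = sum(tally.values())
    for category, n in tally.items():
        # Percentages within one modality, as plotted on the chart's y-axis.
        print(f"{modality} {category}: {n / total_events:.0%}")
```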
Conclusions from Taxonometric Analysis 5
From the SMART model (continued):
• Of about equal importance, "Knowledge transfer" (training), "Management priorities" (lack of staffing), and "Culture" (disregard for safety procedures) each showed up as important.
• Design was a common problem, as was noted by all the other analyses.

Conclusions from Taxonometric Analysis 6
• Very few of the events involved knowledge-based errors.
• While the taxonomies tested did give useful information, they obviously did not match the medical setting well.

Overall Conclusions 1
1. Evaluation of a medical procedure using risk analysis provides insights.
2. Failure to consider human performance in the design of equipment led to a large fraction of the events reviewed.
• While the equipment per se did not fail, the design facilitated operator mistakes that resulted in the erroneous treatments.
• Of particular danger were those situations where equipment malfunctions forced operators to perform functions usually executed automatically by machines.
• Entry of data in units other than those expected by a computer system also accounted for several events.
Overall Conclusions 2
3. HDR brachytherapy events tended to happen most with actions having the least time available.
4. In LDR brachytherapy, the most hazardous steps in the procedure entailed:
– selecting the correct sources to place in the patient, and
– setting the sources in place properly in the patient and keeping them in place.
These events mostly resulted from lack of attention at critical times.

Overall Conclusions 3
5. Lack of training (to the point that the persons involved understand the principles), and
6. Lack of procedures covering unusual conditions likely to arise (and sometimes, just routine procedures), frequently contributed to events.

Overall Conclusions 4
7. New procedures, or new persons joining a case in the middle, also present a hazard:
– 7/46 evaluable events in LDR,
– 12/38 evaluable events in HDR.
8. Many events followed the failure of the persons involved to detect that the situation was abnormal, often even though many indications pointed to that fact.
9. Once identified, the response often included actions appropriate for normal conditions, but inappropriate for the conditions of the event.

Overall Conclusions 5
10. Most of the events suffered from ineffectual verification procedures, a failure noted by all three taxonomies. For the most part, improved quality management would serve to interrupt the propagation of errors by individuals into patient events.
Observations on Common Causes of Events
• Failures in medicine parallel those in industry.
• Errors don't just happen from a single cause, but are surrounded by complicating situations:
• Distraction (due to pressures and other assignments)
• Rushing (due to pressures and lack of staffing)
• Lack of communication (between parties)
Analysis of External-beam Events
The events fall clearly into categories:
• Random errors in a patient treatment
– Few calculation errors (where much of QA falls)
– Frequent errors when treatments are odd (e.g., odd angles used in the wrong direction)
– Not uncommon following a change in prescription mid-course
– Not checking patient set-up after a pause or interruption

Analysis of External-beam Events (continued)
• Systematic errors
– Errors in commissioning or calibration (note: the errors themselves are random, but they propagate as systematic)
– Errors in formulae
– Errors in data entry or use of incorrect units
– Usually there has been no verification or check of the data (strange, given that we now always check a single patient's calculation, sometimes several times)
Commonality in Most Events
The persons involved often fall into traps set by the practice environment, and respond like human beings.