Is Your Organization an HRO (High Reliability Organization)?
How can you tell? If not, why not?
David Eibling, University of Pittsburgh / VA Pittsburgh
CRNA Conference, April 11, 2014

What is a "High Reliability Organization"?
Work groups that function in high-stress environments:
– Highly complex
– Tightly coupled
– High levels of uncertainty
– High production pressure
And – they have fewer adverse events than expected
Seemingly exempt from "normal accidents"

"Normal" Accidents
Classic research by Perrow, Sagan, and others
Studied accidents that occurred during "normal" operations
– Nuclear power, petrochemical plants
Accident rate and impact are modified by numerous factors
"Accidents are inevitable in complex and tightly coupled systems" – Sagan, 1993

Examples of HROs
– Navy carrier operations
– Space Shuttle flights – despite two catastrophic losses
– Commercial aviation
There is a science!

Deciphering the "R" in HROs
Research dates back to the 1980s
Organizational theory researchers
– LaPorte, Rochlin, Roberts, Weick, Schulman
– Why do organizations do what they do?
Extensive literature
– The academics tend to be in schools of business and public policy
The science is just starting to be recognized in medicine

Characteristics of High Reliability Organizations
– Preoccupation with failure – What could happen?
– Reluctance to simplify – Always more complex than it seems
– Sensitivity to operations – What are we doing?
– Commitment to resilience – What will stop the chain of error?
– Deference to expertise – Not always apparent who has it

Where is Healthcare?
Medical error is the 8th most common cause of death in the US
– A recent paper suggests it is the 3rd most common*
Chances of an ADE range from 2 to 7 per 100
Everyone has a story
Doesn't seem very reliable
*James, J Patient Saf, 2013

Let's go back 40 years, to 1973
– George Foreman knocks out Joe Frazier; Howard Cosell shouts "Down goes Frazier! Down goes Frazier! Down goes Frazier!"
– Yom Kippur War; OPEC cuts off oil

1973
– Henry Kissinger wins the Nobel Peace Prize
– Watergate hearings begin; Rose Mary Woods accidentally erases the tape

1973
– Pioneer 10 sends back the first close-up pictures of Jupiter
– Monica Lewinsky is born

1973
– Emergency rooms are just rooms
– Eibling begins his internship at Wilford Hall, San Antonio, TX

A tale of multiple errors
18 y/o man falls/jumps from a 3rd-floor barracks
– Chest trauma
Transported to Wilford Hall USAF Medical Center
On-call surgeon (Eibling) paged STAT – mid-July 1973

A tale of multiple errors
– Patient combative, pale, tachypneic
– Unable to obtain vital signs
– Obvious contusion over the lateral thorax
– Reduced breath sounds
– Paged thoracic surgery STAT
– #14 angiocath placed in the hand

A tale of multiple errors
– IV lost immediately
– Chief of cardiac surgery arrives
– Multiple attempts to restart the IV – saphenous cut-down attempted
– Patient codes; patient dies
– Autopsy demonstrates lung laceration and hemothorax – no liver/spleen laceration

A tale of multiple errors
Morbidity and Mortality Conference one week later
Focus on Eibling's actions, or lack thereof:
– Why didn't you restrain the patient?
– Why didn't you place an antecubital line?
– Why didn't you place a chest tube?
– Why did you wait so long to intubate?
– Why didn't you call for help?

We couldn't imagine that . . .
– The system could be improved
– Dedicated emergency medicine physicians would improve outcomes
– Trauma teams should take group call
– Rapid response teams should train together
– Resuscitation training and ATLS would save lives
– Fixing the intern wouldn't solve the problem
– Our system was not "highly reliable"

20 Years Later: "Error in Medicine," JAMA 1994

Error in Medicine – Lucian Leape, JAMA 1994
Landmark paper tying Reason's concepts of human error to medical error
Amazingly pertinent even today
Emphasized the extent of the problem
– Harvard Medical Practice Study, 1991
Quoted Schimmel's 1964 report
– Prospective analysis of 1,014 medicine patients at Yale-New Haven Hospital
Emphasized the value of voluntary reporting "at the bedside by the caregivers themselves"

To Err is Human – Institute of Medicine, 1999
Emphasized the role of human error in poor outcomes
Estimated that medical error results in 44,000–98,000 deaths yearly in the US (actual figures much greater)
Emphasized the necessity of studying errors
The title tells it all . . . humans are imperfect – we must design systems that take such imperfections into account

Are we there yet?
Consensus is no substantial improvement since To Err is Human in 1999
Progress has been made – but it has been incremental, not transformational*
– Pre-procedure checklists
– Bar coding
– Time outs
– CPRS alerts
– Site marking
– Simplification
– Medication safety
– Standardization
– Learning from mistakes
– Avoiding reliance on memory
– Root Cause Analysis
– Hand hygiene focus
– Using checklists
– Patient Safety Goals
– Team huddles
*Anesthesia may be the exception to the rule

"There is nothing new under the sun"
"Human error in medicine, and the adverse events that may follow, are problems of psychology and engineering, not of medicine" – John Senders, Chapter 9, Human Error in Medicine

Maybe this story will help explain it . . .

Who is to Blame? The Patient – 2013
– 60 y/o smoker with a 2 cm pleomorphic adenoma
– On VA disability for PTSD, tinnitus, hearing loss, diabetes (HbA1c 9.9)
– Additional comorbidities: hypertension, hyperlipidemia, prior gastric bypass for morbid obesity, prior CABG, known OSA, known ETOH abuse history
– Multiple medications managed by a non-VA primary care doctor ("shared care")
– Patient not aware of his medications/doses: "my wife manages my medications"
– Preop eval by the IMPACT clinic: med list in CPRS reviewed; some meds from the VA, some from an outside pharmacy; wife not present for IMPACT; no information from the non-VA PCP

Who is to Blame? The Case
Uneventful parotidectomy
– Post-op hypertension to systolic >200
Urgent medicine consult:
"HTN likely multifactorial given anxiety w/o SSRI, pain, ?OSA, CKD, and likely under-treated HTN at baseline, with goal BP ~130/80. On metoprolol currently as outpatient only, which is less than ideal. Allergy to ACE/ARB documented, and with GFR ~30, HCTZ likely to be less effective. Would recommend starting 2.5 mg of amlodipine now, restarting his SSRI at home dose, continuing metoprolol, and treating pain PRN. PRN hydralazine or clonidine as needed for SBP >180. Would recheck Chem8 in AM."
Small hematoma opened prior to discharge
– Discharged on prior medication regimen
– New BP med missed in the discharge orders (communication failure? slip?) – it was in the dictated discharge summary, but not on the nursing discharge note
Seen in the ER 2 days later; admitted 6 days post-op for additional management of uncontrolled hypertension
– The medicine consultant discovered the prior (non-VA PCP) dosing of metoprolol, as well as a missing ACEI/diuretic combo, reflected in no available med list

Who is to Blame?
Context: "Shared Care"
– Care coordinated between the VA and a non-VA PCP
– The exception rather than the rule (the most frequent example is anticoagulation)
– Extensive templated notes 1 year and 6 months previously – "medications reconciled"
– No data from the outside PCP in most cases
– VA med co-pay is $9.00 per month per med (NSC, non-service-connected); generic meds at Walmart are $4.00 per month or $10.00 for 3 months – roughly $108/yr vs $40/yr, a savings of about $68 per med per year. What would YOU do?
– The system relies on a human to enter/update non-VA meds

Medication Reconciliation? Assigning Blame
Medication reconciliation is a known problem
– Failure to "reconcile" at discharge is a well-known issue
– There is no single, time-linked display of medications across the continuum of care
Previously reported to the internal system – 3 work groups have addressed it:
– A pharmacy work group developed a single combined list of all meds (multiple problems, such as duplicates)
– An engineering group
– A formal study instituted by the Patient Safety group concluded that, within the constraints of the information system, the best solution is to assign a dedicated pharmacist to inpatient med rec
Level 3 peer review assigned to the attending for all medication reconciliation errors
What do you think has happened?

"It will be evident to anyone who has read the foregoing pages, that the history of the problem of error does not bear witness to a steady and well defined progress, from initial perplexity, through stages of ever increasing light, up to a final and triumphant solution. Perhaps it was hardly to be expected in the case of a question so baffling in itself, so open to evasions, and so dependent on others of positive interest. The same difficulties keep coming back under slightly different forms, the same postulates and general distinctions, the same ambiguities and incoherences; till one begins to wonder whether after all it is possible to give a rational and philosophic account of this irrational product of the mind."
– Keller, The Problem of Error from Plato to Kant, 1934

And just this Monday . . .
Finding the med list from "spoke" hospitals (they don't appear in the CPRS Meds tab):
– The Meds tab lists only meds from the VA pharmacy; here's how you find the rest of them
– Open "Health Summary" (near the bottom of the list)
– Click to open the Health Summary for the specific site (Clarksburg, Erie, Pittsburgh)
– The list opens in Health Summary (partial list – too long for the slide)
– Click on the specific site's health summary "Med Rec" (Medication Reconciliation)

Non-VA meds NOT in the Pharmacy tab: aspirin, atorvastatin, budesonide, celecoxib, dutasteride (twice), lansoprazole, latanoprost (twice), levothyroxine, metformin, mometasone, Lodrane D (herbal??), Patanase, olopatadine, pioglitazone, ramipril, terazosin
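The "single combined list" the pharmacy work group attempted is harder than it sounds. As a rough illustration only – not the VA's actual implementation, and with an entirely hypothetical data model – a minimal sketch of merging med lists from several sources and flagging duplicates and dose disagreements might look like this:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Med:
    name: str    # generic name, e.g. "metoprolol"
    dose: str    # e.g. "50 mg BID"
    source: str  # e.g. "VA pharmacy", "Clarksburg spoke", "non-VA PCP"

def merge_med_lists(*lists: list[Med]) -> dict[str, list[Med]]:
    """Group every entry from every source under one generic name."""
    merged: dict[str, list[Med]] = defaultdict(list)
    for med_list in lists:
        for med in med_list:
            merged[med.name.lower()].append(med)
    return merged

def flag_problems(merged: dict[str, list[Med]]) -> None:
    """Report duplicates within a source and dose disagreements across sources."""
    for name, entries in sorted(merged.items()):
        sources = {m.source for m in entries}
        doses = {m.dose for m in entries}
        if len(entries) > 1 and len(sources) == 1:
            print(f"DUPLICATE within one source: {name}")
        if len(doses) > 1:
            print(f"DOSE MISMATCH for {name}: {doses}")

va = [Med("metoprolol", "25 mg BID", "VA pharmacy")]
outside = [Med("metoprolol", "100 mg BID", "non-VA PCP"),
           Med("ramipril", "10 mg daily", "non-VA PCP")]
flag_problems(merge_med_lists(va, outside))
# e.g. -> DOSE MISMATCH for metoprolol: {'25 mg BID', '100 mg BID'}
```

Even this toy has to decide what counts as "the same med" (brand versus generic names, combination products, the "Lodrane D" problem above) and which source wins when doses disagree – exactly the ambiguities that produced duplicates in the real combined list.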
What would an HRO do?
– Preoccupation with failure
– Reluctance to simplify
– Sensitivity to operations
– Commitment to resilience
– Deference to expertise
Can we use an event as a "biopsy"?

What would an HRO do? Preoccupation with failure
– Constantly asking "why do we have so many med rec errors?"
– Med rec failures would demand high-level attention
– Leadership would feel responsible and insist on a solution

What would an HRO do? Reluctance to simplify
– How does the system work, anyway?
– What are the areas of linkage that contribute to failure?
– What are the "little failures" that combine to cause catastrophe?
– What is the context we work in – i.e., the larger systems such as medication labeling, cost issues, etc.?
– Why don't we understand all of the components and links?

What would an HRO do? Sensitivity to operations
– What is really happening?
– Who at the front line is using work-arounds?
– What are these work-arounds, and why are they necessary?
– Who knows what is really happening and is ready to talk about it?

What would an HRO do? Commitment to resilience
– Where is the resilience in our system?
– Where is resilience missing?
– Are there areas of tight linkage that impair resilience?
– How can we help our front-line people stop the chain of error?

Humans are the source of system resiliency and adaptability
How many times each day do you, your fellow practitioners, your colleagues in other specialties, or your OR nurses use a "work-around" to solve some problem?
Studying work-arounds is recognized as key to understanding human-system incompatibilities
Fix the system, not the human

What would an HRO do? Deference to expertise (internal)
– Who knows what is going on? Is it the nurse? The resident? The pharmacist? The patient?
– Who might have ideas on how to reduce the likelihood of failure?
– Will we heed their observations and recommendations?
– Are they willing to speak up?
"The greatest obstacle to discovery was not ignorance – it was the illusion of knowledge" – Daniel Boorstin

How do we find out what "sharp end" practitioners know?
"Knowledge is more than information"
The challenge is to capture that knowledge
A theme of the "Just Culture" movement
Overall, healthcare has done poorly
– 2012 Safety Attitudes Survey – 40% not talking
It involves more than merely "reporting"

What would an HRO do? Deference to expertise (external)
– Are we the first to encounter this failure?
– Has this been studied before, and where are the reports?
– What are others doing?
– Are we willing to invest the time and resources to attend meetings and study the literature when it exists?
"Education is learning that you didn't even know what you didn't know" – Daniel Boorstin

Science of Error
Not a new topic – cognitive psychologists have studied it for decades
Human Error – James Reason, Cambridge University Press, 1990
– Precipitated by the major accidents of the '70s
Attempted to answer the question: why do we do what we do?
Leape tied medical error to Reason's work

The famous Swiss-cheese illustration (adapted from Reason 1990)
[Diagram: a stream of error triggers meets successive slices of defenses; most are deflected, but when the holes line up, an error penetrates every layer and an accident results]
An HRO knows where the holes are – and worries about the ones it doesn't know about
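To put rough numbers on the cheese: an error causes harm only when it slips through every layer of defense. The layers and probabilities below are invented for illustration (and real layers are rarely independent), but the arithmetic shows why an unnoticed hole matters so much:

```python
from math import prod

# Illustrative, made-up probabilities that each defense fails to catch an error.
layers = {
    "order entry check": 0.10,
    "pharmacist review": 0.05,
    "barcode scan at bedside": 0.02,
    "nurse double-check": 0.20,
}

# Assuming independent layers, harm requires slipping through all of them.
print(f"P(error reaches patient) = {prod(layers.values()):.6f}")  # 0.000020

# A hole the organization doesn't know about: one layer silently stops working.
layers["barcode scan at bedside"] = 1.0  # scanner down, step skipped
print(f"with one layer silently gone = {prod(layers.values()):.6f}")  # 0.001000
```

Four layers at these rates stop all but 2 errors in 100,000; lose the barcode step and it becomes 1 in 1,000 – a fifty-fold collapse from a single unnoticed hole.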
Slip versus Mistake (after Reason)
A slip is an error of execution
– 1 qt of oil in the radiator
– Occurs at the "sharp end" of a system
A mistake is a fundamental error in judgment
– Often occurs at the "blunt end" of a system
Slips are often due to mistakes in system design

Human Error
A "natural consequence" of human adaptation to environmental stimulation:
– Focusing attention
– Recognizing patterns
– "Filling in the blanks"
– Sequencing events
The same strategies we use to manage information overload!
"Knowledge and error flow from the same mental sources, only success can tell the one from the other." – Ernst Mach, 1905

Human Error – the Scapegoat
Human error serves a valuable role for organizations
Blaming the human "absolves" the organization from blame
– Reduces the work required to understand the event
– Eliminates the need to either seek or alter the underlying source(s)
The concept is integrated into the culture of medicine
"Any RCA that concludes 'human error' was the cause has fundamentally failed" (Richard Cook, Christopher Nemeth)
"If we design our way into difficulty we can design our way out." (John Thakara)
AEs are nearly always more complex than they appear initially
Organizations are often restricted by regulatory forces, competing national goals, etc.

"Fish can't see water"
The VA examples are legendary – software issues: medication reconciliation, the patient photo in the record, BCMA

An example of complexity
[Concept map from the slide – the factors that interact in bar-code medication administration:]
– Patient: location, out of bed, pain needs, in X-ray, family visiting, disease process
– Medication ordering workflow: pharmacy, error checking
– Armband: correct armband? labeling, how to print, location
– Information system: usability, log-in tasks, compatibility with the EHR, system reliability
– Nurse: workload, interruptions, competing tasks that take the nurse away
– Doctor: competing tasks
– Physical environment: ward lay-out, does the equipment fit in the room?

A Constant Theme
"The judgment that this was human error simply produces too many institutional benefits"
"By attributing my colleague's accident to his inattention or stupidity, though, I make it possible to believe that the accident has no relevance for me" – Dekker

A Tale of Two Stories*
The front-line story versus the investigation
– Focus on individual actions
– Focus on retraining
– Backward- versus forward-looking
"Hindsight bias"
First- and second-order problem solving**
*Cook & Woods, 1997  **Tucker & Edmondson

First-Order Problem Solving
The worker compensates for system deficiencies
– Classic "work-arounds"
– The "spackle resident"
– The system is never changed
Failure is recognized as human failure
– Unreliable, inattentive, etc.
The solution is to change the human, or the human's role in the process

Second-Order Problem Solving
Assume human actions are the result of something (or many things)
Begin by assuming the assessments and actions of humans are predictable
– Seek to understand the roles of context and competing goals as decision architects
– Role of cognitive psychology
Much more challenging
– Incongruent with prominent themes of medicine

Hindsight Bias
¾ of all AEs are attributed to "human error"
Attribution is easy when the outcome is known
Causal attribution relies on social/psychological constructs
– Previous learning
– Context
– Decision architects
Difficulties in "tracing back"
It may be impossible to understand the decision-making processes that led to the AE
– David Woods

4 Reasons to Blame the Individual
1. People are available to blame
2. People were there (maybe even lots of people)
3. Human performance in complex systems is usually very good (humans compensating for the system), hence AEs are rare
4. Knowledge of outcomes when tracing backwards leads to incorrect assumptions regarding cognitive processes (outcome drives diagnosis)

Finding Out What Happened – who does it right? The aviation industry. Why?
The incentive for the US Aviation Safety Reporting System – why the emphasis on preventing aircraft accidents?
– Public visibility of aircraft accidents
– Costs – the economic cost of a single event, the lives lost per event
– The pilot is first at the scene!
– Our congressmen fly too

The Aviation Safety Reporting System – how does it work?
No-fault reporting of errors
– No repercussions for the pilot if reported within 10 days
De-identified after verification of the facts
Reviewed by a panel of retired pilots
Specific recommendations for system changes
– Changes mandated by regulation
– Disseminated to the entire industry
Follow-up to verify compliance
Establishes a culture of safety

Aviation Safety Reporting System: www.asrs.arc.nasa.gov
Note – NASA, not the FAA
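The order of those steps carries the design: identity is held just long enough to verify the facts, then stripped before anyone reviews the case. A toy sketch of that pipeline – hypothetical types and names throughout; ASRS's real intake software is not public – makes the sequencing explicit:

```python
from dataclasses import dataclass, replace
from typing import Optional

@dataclass
class Report:
    reporter: Optional[str]  # pilot's identity, held only until verification
    narrative: str           # what happened, in the reporter's own words
    verified: bool = False

def verify_facts(r: Report) -> Report:
    """Analysts may contact the reporter to confirm or clarify the facts."""
    return replace(r, verified=True)

def deidentify(r: Report) -> Report:
    """After verification, identity is stripped BEFORE panel review."""
    assert r.verified, "never de-identify before the facts are confirmed"
    return replace(r, reporter=None)

def review(r: Report) -> str:
    """The panel of retired pilots sees only the de-identified narrative."""
    assert r.reporter is None, "the panel must never see who reported"
    return f"recommendation based on: {r.narrative!r}"

raw = Report(reporter="Capt. X", narrative="altitude deviation on approach")
print(review(deidentify(verify_facts(raw))))
```

The two asserts encode the promises that make no-fault reporting credible: facts are confirmed while the reporter can still be reached, and the panel never learns who reported.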
How does it work? The aviation industry:
– Accepts that errors happen
– Works to understand why, when, and where – "even the little ones!"
– Begins by reporting – no fault – "everyone has expertise"
– Looks for root cause(s) – a defective system, not defective people
– Fixes the system, not the individual
– Follows up to confirm the fix has been implemented

History of Human Factors
The early years of aviation were checkered
Aircraft reliability improved during WWII
The post-war introduction of jets did not go well
The realization grew that the cause was not the airframe – but the pilot

Human Error
"If we design our way into difficulty we can design our way out." (John Thakara)

Human Factors Engineering – a "new" applied science
The military began to realize that aircraft had become too difficult to fly
– G-forces, dehydration; new technology: pressure suits, etc.
– Cockpit controls: too much to remember – wheels-up landings
– Instrumentation: autopilot programming – AA 965, December 20, 1995, Cali, Colombia
– Teamwork: Eastern Flight 401
Human factors research in medicine: Estock et al., 2014

Some actions ARE blameworthy
Just culture in the aviation industry: "a no-blame culture is neither feasible nor desirable" – Reason, 1997
There are some rules you would never break
How do we define the line between acceptable and unacceptable behavior?
Global Aviation Information Network, "Roadmap to a Just Culture," available from http://204.108.6.79/products/documents/roadmap%20to%20a%20just%20culture.pdf

"Just Culture"
There is a "line in the sand" between driving 75 in a 65 mph zone and driving 75 down Fifth Avenue
"A just culture recognizes that competent professionals make mistakes and acknowledges that even competent professionals will develop unhealthy norms (shortcuts, routine rule violations), but has zero tolerance for reckless behavior." (AHRQ website)
"Who draws the line is the most critical question" – Dekker, 2012

Culture of Safety – what is it we are talking about?
– Patient safety is the first priority (controversial)
– Leadership leads safety initiatives
– Employees believe safety is a top priority for leadership
– Employees are empowered to speak up
– Employees expect that changes will be made to correct threats to safety
Measured by the Patient Safety Attitudes Survey, distributed and collated by AHRQ; data published in February
Overall, poor results: most healthcare workers in the US work in systems that address mistakes with name-shame-blame

Leadership is Key
"The most important question in establishing a Just Culture is who gets to decide what is acceptable" – Dekker

So, What is Our Responsibility?
– Recognize that most adverse events are due to latent errors
– Highlight these latent errors in our hospitals, clinics, and offices
– Educate leadership in the fundamentals of a culture of safety
– Accept responsibility for the systems of care in which we care for our patients

We can help our organizations become HROs
– Begin by reporting
– Recognize that as front-line staff YOU have expertise
– Speak up and encourage others to report
– Participate in analyses of failures – look for the "second story"
– Don't accept the status quo

High Reliability Organizations consistently demonstrate:
– Preoccupation with failure – What could happen?
– Reluctance to simplify – Always more complex than it seems
– Sensitivity to operations – What are we doing?
– Commitment to resilience – What will stop the chain of error?
– Deference to expertise – Not always apparent who has it

We Can Show the Way