Thought Bias: The Hidden Pipeline Integrity Threat
Michael Rosenfeld and Joel Anderson
RSI Pipeline Solutions LLC
Pipeline Pigging and Integrity Management Conference
February 6-10, 2023
Organized by Clarion Technical Conferences
Proceedings of the 2023 Pipeline Pigging and Integrity Management Conference.
Copyright ©2023 by Clarion Technical Conferences and the author(s).
All rights reserved. This document may not be reproduced in any form without permission from the copyright owners.
Abstract
Pipeline regulations, industry standards, and technical research set forth extensive guidance for
managing threats to pipeline integrity through a formal integrity management (IM) plan. Such plans
rely on a rigorous procedural approach to identify threats and mitigate risk on a prioritized basis in
a systematic and repeatable process. One subtle threat that is not just overlooked but is almost invisible to many integrity management personnel is that of biased thought processes. Because they typically go unrecognized, such biases can seriously undermine the effectiveness of IM programs in a variety of ways that lead to poor decisions. Such biases may also affect routine pipeline construction and maintenance projects outside of IM work, in ways that may lead to harmful long-term IM implications. Even when information to the contrary exists prior to the decision, people can become
anchored to a fallacy, unwilling to move from it. The various forms of bias, examples of their
potential adverse effects on pipeline integrity, warning signs, and potential avoidance methods are
discussed.
Introduction
Pipeline integrity management standards and regulations list the integrity threats to be managed,
mitigated, or prevented by integrity management plans. Almost all recognized threats are
metallurgical or mechanical in nature, for example corrosion, seam fatigue, earth movement, or
equipment failure. They can be managed with specifications, inspections, tests, or other engineering
barriers against failure. The exception is “operator error”, which occurs when an operator responds
incorrectly when presented with information about a potential problem and the resulting decision,
action, or inaction is directly responsible for the failure. Operator error can be managed with
administrative barriers such as procedures or worker training.1
The authors have observed that many pipeline incidents are more complex than what is implied by
the simple direct cause. Complexity arises in the form of interactions of physical factors in about 7%
of reportable incidents.2 An "interaction" is a combination of factors that results in a more severe condition or higher probability of failure than the individual factors considered separately.
Operators often fail to address interactions because they overlook the possibility of their occurrence
and therefore fail to collect the necessary data to reveal the potential for interaction. One or more
types of thought bias may lead to such a sequence. Sometimes operators experience multiple
incidents or near-miss events that are seemingly unrelated, but such patterns are indicative of an
underlying thought bias that prevents the operator from connecting the events.
It is the authors’ opinion that thought bias can negate the effectiveness of a formal IM plan no matter
how detailed or well thought out, lead to unsafe practices in routine work, or counteract best
intentions in pipeline projects. Moreover, such bias appears to be pervasive within the industry. What thought bias is, how it can manifest itself to adversely affect pipeline integrity management, and strategies to discover and mitigate it are discussed in this paper.

1. One could take the reasonable position that all failures are due to underlying operator error. Consider a failure due to external corrosion, a naturally occurring physical process. The direct cause is a double failure of the coating system and the cathodic protection (CP) system to perform adequately, but the underlying causes likely originated with people. Was the coating applied properly? Was the pipeline carefully lowered into the ditch and inspected diligently? Was the CP system adequately specified and its performance monitored? Was the in-line inspection properly interpreted and acted on? An answer to "why" for any of those factors is the essence of a root cause. They originated with people; physics is not at fault.
2. Munoz, E. and Rosenfeld, M.J., "Improving Models to Consider Complex Loadings, Operational Considerations, and Interactive Threats", Task III.B.3 Final Report, US DOT Contract DTPH56-14-H00004, Dec. 30, 2016.
What is Thought Bias?
Heuristic thought is a strategy for problem solving or decision making using mental shortcuts
developed from previous experience. Heuristic thought arises unconsciously when people are
confronted with complex or rapidly changing situations because attention, or time to reflect and
analyze, are finite resources. Such thought processes are believed to have evolved from human origins
as hunter-gatherers,3 so they apparently are ingrained.
Thought biases, also known as cognitive biases, are examples of such mental shortcuts. Because they
are based on prior successful experience, those mental shortcuts often result in accurate thinking and
appropriate responses more efficiently than the “analyze everything” approach. In the pipeline
industry, engineers, integrity managers, and other workers often have multiple responsibilities, work
long hours under incredible time pressures, or must make decisions while subjected to information
overload or uncertainty overload. In those circumstances, mental shortcuts made automatically are
not just useful but necessary. But those shortcuts may also lead to errors that can adversely impact
decisions4 including those that occur when managing the integrity of a pipeline or when performing
other pipeline work. When information is incomplete or uncertain and the consequences of an
incorrect decision are large, it becomes even more important to defend against these mental
shortcuts.
Numerous categories of thought bias have been described. A Wikipedia article5 lists no fewer than
220 named varieties under 21 categories, many with intriguing names such as the Google Effect,6 the
Curse of Knowledge,7 the Hindsight Bias,8 the Hot Hand Fallacy,9 or the Women are Wonderful
Bias.10,11 Almost all of the numerous biases, effects, or syndromes listed in the literature could
influence decision making somewhere in a large organization, but it is more useful to focus on a few
broad categories that occur more often with respect to managing the integrity of pipelines. These
include:

• Cultural bias
• Availability bias
• Confirmation bias
• Representativeness bias
• The "law" of small numbers
• Sunk cost bias
• Groupthink

3. "16 Cognitive Biases that can Kill Your Decision Making", www.boardofinnovation.com.
4. Sunstein, C.R., "Moral Heuristics", Behavioral and Brain Sciences, Cambridge University Press, 2004.
5. https://en.wikipedia.org/wiki/List_of_cognitive_biases
6. Sparrow, B., Liu, J., and Wegner, D.M., "Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips", Science, Vol. 333, August 5, 2011.
7. Kennedy, J., "Debiasing the Curse of Knowledge in Audit Judgment", The Accounting Review, v. 70, no. 2, April 1995.
8. "'I Knew It All Along…Didn't I?' – Understanding Hindsight Bias", APS Research News, Association for Psychological Science, Sept. 6, 2016.
9. Green, B., and Zwiebel, J., "The Hot Hand Fallacy: Cognitive Mistakes or Equilibrium Adjustments? Evidence from Baseball", Stanford Graduate School of Business, April 2014.
10. Eagly, A.H., and Mladinic, A., "Are people prejudiced against women? Some answers from research on attitudes, gender stereotypes, and judgments of competence", European Review of Social Psychology, Mar. 4, 1994.
11. The authors acknowledge possible susceptibility to this bias.
Cultural bias
Cultural bias is the interpreting or judging of matters by standards inherent to one’s group (in this
case the pipeline company or a contractor). Cultural biases may discourage critical evaluation of long-held practices ("this is how we do it here") and foster hostility to ideas or information originating outside the culture (the "not invented here" syndrome). Most of these biases appear valid based on
experience (“Contractor/supplier X has always done good work”). But they are sometimes held
without good evidence or even with contrary evidence, or they may be based on habits (“we always
use that contractor/supplier” or “this has been our SOP for years”) that no longer stand the test of
time as knowledge or technology changes. In the IM context, they may interfere with continual
improvement, lead to a failure to recognize new hazards or risks, or directly lead to a poor decision.
Availability bias
Availability bias is a mental shortcut that gives undue weight to the information most easily recalled, often the most recent. The ease of recollection can lead to a false assumption that an event occurs more frequently than it does, causing one to focus on the wrong risk. Availability bias could arise from specific problems on the most recent pipeline construction project, or from the cause of the most recent incident, leading to intense focus on avoiding those issues rather than on a more general threat. This sometimes leads to
company standards, technical procedures, or risk modeling covering a particular subject at a
peculiarly high level of detail that is out of proportion to the coverage of other subjects. In the worst
case it may lead to a complete distraction from another, potentially greater risk.
Representativeness bias
Representativeness bias occurs when one thinks a current situation is like another that may already
exist in one's mind. For example, "this is just like a situation we saw before" or "we have done this many times in the past and it always worked fine"; more generally, it is judging the probability that item A belongs to group B by how much A resembles B. While this seems like a logical way to judge
things, it may lead to overlooking the unique aspects of the problem, underestimating the probability
of a particular threat, or overestimating the effectiveness of a particular solution to a problem. A
problem that arises from using this heuristic is the insensitivity to prior probability. In this case,
people will judge the probability of an event purely based on how well something resembles one
group or another regardless of the percentage of the population that truly belongs to that class. For
instance, if a person is trying to decide if something belongs to Group 1 or Group 2 and decides that
the description fits 80% of the population of Group 1, they will assume that the probability that it
belongs to that group is 80%. Even if they are told that Group 1 only makes up 1% of the population,
they will not change their initial probability judgement (the correct answer, accounting for the prior probability, is about 4%, far below the assumed 80%).12 These failures to thoroughly
think through the matter have led to choosing incorrect mitigations or other actions.
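The base-rate correction behind the "about 4%" figure is a direct application of Bayes' theorem. The short Python sketch below reproduces the arithmetic; the value of 20% for how well the description fits Group 2 is our own assumption, added only to make the calculation concrete, since the text does not state it.

# Minimal sketch of the base-rate calculation behind the "about 4%" figure.
# Assumed for illustration: the description also fits 20% of Group 2.
p_g1 = 0.01                      # prior: Group 1 is 1% of the population
p_g2 = 1.0 - p_g1                # prior: Group 2 is the remaining 99%
p_desc_given_g1 = 0.80           # the description fits 80% of Group 1
p_desc_given_g2 = 0.20           # assumption: it also fits 20% of Group 2

posterior_g1 = (p_desc_given_g1 * p_g1) / (
    p_desc_given_g1 * p_g1 + p_desc_given_g2 * p_g2)
print(f"P(Group 1 | description) = {posterior_g1:.1%}")   # prints about 3.9%, not 80%

Unless the prior (the 1% base rate) enters the calculation explicitly, the intuitive answer of 80% overstates the probability by a factor of about twenty.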
12. Anderson, J., "Optimizing Risk Decisions with Imperfect Data", Paper #36, PPIM 2023, Houston, February 2023.
The “law” of small numbers
The reason for putting the word “law” in quotation marks is that it’s not a mathematical law but a
fallacy of faulty reasoning from insufficient evidence due to small sample size. People have an
exaggerated faith in small samples because they want to assign causality. Yet the only “cause” might
have simply been chance. For instance, suppose 6 confirmation digs are performed for an ILI run and 5 of them are within specification, leading the engineer to conclude that the tool met the requirement of 80% within specification. After all, 5/6 = 83%, right? However, because the sample is so small, even if the tool were performing to only 50% within specification, that same result (5 of 6 in-spec) would be expected by chance almost 10% of the time, not an insignificant probability that can be ignored. This goes hand in hand with the "flaw of averages", which states that "decisions based on averages are wrong on average".13 The tool may perform to 80% within specification on average, but that does not mean it performed to 80% within specification on this run.
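The "almost 10%" figure can be checked with a short binomial calculation. The Python sketch below uses only the numbers quoted above and the standard library; it is an illustration of the small-sample problem, not a tool-qualification procedure.

# Minimal sketch of the small-sample check described above.
from math import comb

n, k = 6, 5                                   # 6 confirmation digs, 5 within specification
for true_rate in (0.80, 0.50):                # candidate in-spec rates for the tool
    p_exactly_5 = comb(n, k) * true_rate**k * (1 - true_rate)**(n - k)
    print(f"true rate {true_rate:.0%}: P(5 of 6 digs in-spec) = {p_exactly_5:.1%}")
# A tool performing at only 50% yields 5-of-6 about 9.4% of the time, so six digs
# cannot distinguish an 80% tool from a 50% tool with any real confidence.

Six digs simply cannot support the 80% conclusion; a much larger verification sample, or a formal statistical acceptance approach such as those in API 1163, is needed before declaring the tool within specification.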
Confirmation bias
Confirmation bias is selectively paying attention to information that agrees with what one believes
to be true. It may lead to failure to recognize other important information. One example we observed
involved an integrity engineer who was enthused about a recent in-line inspection (ILI) that had
identified some significant hook cracks. But he had disregarded the dig data showing a large
proportion of hook cracks observed in the ditch were not indicated by the tool and they were no less
severe than the ones indicated by the tool. This suggested a high likelihood of other cracks existing
undetected in unexamined joints of pipe. Can that ILI really be considered a success?
Sunk cost fallacy
The sunk cost fallacy occurs when one is reluctant to abandon a course of action because of having
already heavily invested time or resources in it, even after it becomes clear that a different course of
action would be more beneficial going forward. For example, we have observed this bias to cause
operators to doggedly continue with a “pig and dig” approach to managing the integrity of a pipeline
at great cost, although analysis of the results showed that the only effective way to lower risk was to replace segments of the pipeline.
Groupthink
Groupthink occurs within a group of people in which the desire for harmony or conformity, usually
driven by adherence to a defined process, results in dysfunctional decisions. It can cause a group to
minimize conflict and reach a consensus without critical evaluation of the data or alternative actions,
suppressing dissenting viewpoints (by punishing “rocking the boat”). It may also lead to an insular
approach uninformed by outside ideas or experiences that could be effective or valuable.
Several of the thought biases described above, and others, are noted in Section 7.3, "Cognitive limitations in decision making – Heuristics and biases", in CSA EXP248.14
13. Savage, S.L., "The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty", Wiley, 2009.
14. Canadian Standards Association (CSA), "Pipeline Human Factors", EXP248:2015.
Some Real-Life Examples
As stated earlier, bias is the making of unwarranted assumptions, or of assumptions that are not based on demonstrated experience or accurate data. Bias can be introduced in decisions within the IMP or in executing the IMP, as well as in routine work or in new construction activity. It causes us to fail to recognize threats or to respond appropriately. A few illustrative cases are described
below.
Bellingham, WA (1999):
The Bellingham, WA incident occurred at mechanical damage caused during installation of water treatment plant piping over the pipeline. The ILI tool had identified the anomaly that eventually ruptured, but because the site was difficult to access, and after examining "several" other anomalies, the person in charge concluded that the tool was over-reporting the actual depth. So, a decision was made that the anomalies near the water treatment plant would not be excavated and examined.15
The probability of a 4% dent being an integrity threat given that the tool over-called other
deformations is low. However, the probability of a 4% dent being an integrity threat given that it
was not called on the previous run and that the location coincided with where a large water line had
been installed over the pipeline since the last run is very high.
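The contrast between those two conditional probabilities can be sketched with a simple Bayesian update. Every number in the Python sketch below is a hypothetical assumption chosen to show the structure of the reasoning; none of the values come from the NTSB record.

# Illustrative only: how the evidence selected changes the conclusion.
prior = 0.05                    # assumed baseline chance a reported 4% dent is injurious

def update(prior_prob, likelihood_ratio):
    # Bayesian update expressed in odds form with a likelihood ratio.
    odds = prior_prob / (1 - prior_prob)
    post_odds = odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Evidence relied on in 1999: the tool over-called depth at other sites (assumed LR = 0.5).
p_given_overcalls = update(prior, 0.5)
# Evidence available but not weighed: a new indication, coincident with recent
# third-party construction over the line (assumed LR = 50).
p_given_new_call_at_worksite = update(prior, 50)

print(f"P(threat | tool over-called elsewhere)    ~ {p_given_overcalls:.0%}")
print(f"P(threat | new call at construction site) ~ {p_given_new_call_at_worksite:.0%}")

The point is not the particular numbers but that conditioning on the evidence most relevant to this specific anomaly, rather than on the evidence that was easiest to generalize from, reverses the conclusion.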
The investigation also revealed that despite many meetings with the contractor working at the water
treatment plant and site visits by the operator, the operator did not have full awareness of work at
the site. Last-minute design changes were not communicated, and a witness reported that the excavator chose not to report hitting the line.

Compounding these conditions, the operator had been updating its SCADA system without validating the changes offline. These changes interfered with the ability of controllers to respond effectively to the pipeline failure.
The biases that contributed to this incident included:

• Cultural bias – Overconfidence in weak administrative barriers against excavator damage, inadequate management of change for the software update
• Representativeness bias – The unexamined ILI feature would be like those that were examined
• Confirmation bias – Disregard of the fact that the unexamined dent was a new indication at a recent work site
• Law of small numbers fallacy – Overreliance on conclusions from a small number of observations.
Carlsbad, NM (2000):
The pipeline failed due to internal corrosion. The operator assumed that an existing drip would
catch any liquids in the line and that warning signs and the remoteness of the site would keep people
out of the area even though the affected site was on property not under the operator’s control.
15. NTSB, "Pipeline Rupture and Subsequent Fire in Bellingham, Washington, June 10, 1999", NTSB/PAR-02/02, PB2002-916502, https://www.ntsb.gov/investigations/AccidentReports/Reports/PAR0202.pdf.
However, per the NTSB report,16 the drip siphon had become plugged. When the collection leg filled up, liquids were pushed downstream to the next low point. When checked, the drip was no longer producing liquids, which seemed to confirm a reasonable but incorrect assumption that there were no liquids in the line. The pipeline operator routinely collected liquid samples from inlet scrubbers
and maintenance pigging and had them analyzed for environmentally hazardous substances but never
tested them to determine if they were potentially corrosive, even though an internal corrosion failure
had previously occurred elsewhere in the system and the segment that failed was incapable of being
pigged.
The biases that contributed to this incident included:

• Cultural bias – Failure to recognize an internal corrosion problem, not collecting the data needed to manage the condition, and overconfidence in weak barriers against failure (the drip) or consequences (warning signs)
• Confirmation bias – The drip was not producing liquids, so they must not be present
• Representativeness bias – What they had been doing was working so far, so it was adequate.
A Series of Unfortunate Events17
The two incidents cited above were selected specifically because they led to integrity management
planning as specified in US pipeline safety regulations. Thought bias clearly played prominent roles.
But integrity management regulations and industry practices have not eliminated the role of thought
bias in incidents. The authors believe that almost any incident or mishap can be analyzed in terms
of unwarranted assumptions stemming from some type of bias. Some further examples follow.

• RSI personnel have performed root cause failure analyses (RCFAs) of several incidents and
near misses in a pipeline operator’s systems. The operator was technically proficient, had
detailed procedures in place, and the incidents seemed unique and completely unrelated.
But a common thread running through all of them was cultural bias that promoted
overconfidence in the effectiveness of barriers against failure including threat identification,
material specifications, field practices and procedures, construction inspections, in-line
inspection, and organizational communication, among other things.
• An NTSB study18 from 2005 found that a SCADA issue played a role in 10 of 13 hazardous
liquid pipeline incidents. In 7 of those 10 cases, an inability by controllers to distinguish
between alarms triggered by spurious or normal event conditions and those triggered by real
emergencies contributed to delayed or incorrect actions that caused or increased the severity
of the events. The controllers often made incorrect assumptions about the causes of the
alarms based on many prior observations of false alarms associated with other common
operating situations, which were indicative of classic representativeness bias. Although these
incidents preceded control room management requirements in pipeline safety regulations, RSI's RCFA team has observed false-alarm overload in contemporary events.
16. NTSB, "Natural Gas Pipeline Rupture and Fire Near Carlsbad, New Mexico, August 19, 2000", NTSB/PAR-03/01, PB2003-916501, https://www.ntsb.gov/investigations/AccidentReports/Reports/PAR0301.pdf.
17. This is not a reference to the similarly titled children's book written by Daniel Handler under the pseudonym Lemony Snicket.
18. NTSB, "Supervisory Control and Data Acquisition (SCADA) in Liquid Pipelines", Safety Study NTSB/SS-05/02, PB2005-917005, Notation 7505A, November 29, 2005.

• RSI personnel have observed numerous incidents involving defective new pipeline
construction owing to unjustified confidence in the competence or good intentions of
material suppliers, construction contractors, fabricators, welding inspectors, site inspectors,
or other parties. Positive or negative change in a contractor’s business, or change in
ownership, has been observed to adversely affect performance in some cases. The unjustified
confidence led to inadequate supervision and control of construction quality at differing
stages in many projects. Most of these led to damage-only outcomes, but a couple of events led to injury or fatality. Those that did not have such serious consequences had the potential for worse; that they did not was a matter of luck.19
With respect to this last item, the authors recognize the need to trust that people can and will do the
job they are tasked with and do it well. In fact, most people intend to. But time and again, operators
are too quick to conclude that once a supplier, contractor, or worker is qualified, it is no longer necessary to think about them, or that once specifications and procedures are written, it is no longer necessary to think about their effectiveness. We have often observed that the engineers in the glass towers
downtown have no idea what is happening in the field.
The Cloak of Invisibility
Pipeline safety and integrity management regulations and industry guidance documents are
insensitive to patterns of thought bias that may undermine their goals. They may even promote the
making of unjustified assumptions primarily by error of omission. Some examples follow:

• The "TTO5" baseline seam assessment decision chart20 applies to pressure-cycle fatigue of low-frequency ERW seams. Some operators have incorrectly assumed that since high-frequency ERW21 and DSAW seams are omitted, they are not susceptible to fatigue.
• ASME B31.8S22 and Part 192, Subpart O both state that the manufacturing defect integrity threat is mitigated in a natural gas pipeline if it has sustained a hydrostatic pressure test to at least 1.25 times the MAOP. However, studies have shown that a test pressure ratio of 1.25 is effective only for pipelines operating at Class 1 stress levels and is inadequate for pipelines operating at lower stress levels.
• ASME B31.8S states that pipelines that operate at stress levels of 60% of SMYS or greater are susceptible to stress-corrosion cracking (SCC). Many operators have then incorrectly inferred that pipelines operating below that stress level are not susceptible.23
• The pipeline regulations specify minimum criteria for CP without limiting overprotection. To comply, some operators overprotect their pipelines in an attempt to meet the minimum requirement over all parts of a segment, potentially disbonding coatings or inducing a risk of hydrogen cracking, instead of taking steps necessary to manage CP to more uniform levels that are not excessive.

19. They were what are sometimes referred to as "near-miss" events.
20. Michael Baker Jr., Inc., Kiefner and Associates, Inc., and CorrMet Engineering Services, PC, "Low Frequency ERW and Lap Welded Longitudinal Seam Evaluation", Report to US DOT, RSPA, PHMSA, Delivery Order DTRS56-02-D-70036, Final Report, Rev. 3, April 2004.
21. Kiefner, J.F., and Kolovich, K.M., "ERW and Flash Weld Seam Failures", Subtask 1.4, US DOT, DTPH56-11-T-00003, September 24, 2012, compiled failures in ERW and flash welded seams due to various causes. Five of the 37 cases reported as due to fatigue crack growth occurred in high-frequency ERW seams. The RSI RCFA team is aware of others since the publication of that study.
22. ASME, "Managing System Integrity of Natural Gas Pipelines, Supplement to B31.8", B31.8S-2020.
23. This is like stating that driving 90 mph could be considered speeding, which is true, but driving well below 90 mph could also be considered speeding in many circumstances.
The criteria or guidance in these examples are correct as far as they go, but they are incomplete
statements of the concern through error by omission. On the other hand, it is impossible to write
enough rules to prevent all errors or to account for all variations of complex problems. Operators
are responsible for correctly understanding the intent of the provisions, and for applying appropriate
technical knowledge to their interpretation; many operators have done so successfully. Those that
made unjustified assumptions or convenient interpretations have sometimes discovered their errors
through unhappy experiences.
For the most part, the regulations and guidance documents are not written in a way that promotes discovering erroneous assumptions, though exceptions exist. For example, ASME B31.8S requires annual review of risk model assessment assumptions. It also requires performing evaluations of time-dependent conditions using appropriate defect growth rates to assure that a failure will not occur before the next integrity assessment. The guidance for the "check" portion of the Deming Cycle described in API 1160 recommends activities having the potential for discovering and correcting bias.24 Both Parts 192 and 195 require that ILI be performed in accordance with API 1163, which provides methods for validating the ILI tool performance.25 The regulations and guidance documents all require evaluating IMP effectiveness, though that does not assure discovering built-in bias if the performance metrics are biased. API 1160 provides recommendations for self-review that can be helpful in that regard. However, procedures that are written as compliance documents will not inherently reveal bias written into them or in their implementation, nor will internal audits that are box-checking compliance exercises.
Recognizing and Preventing Bias
Assumptions by themselves are not deleterious to the decision process. They become a fallacy when
the decision maker fixates on an assumption and refuses to move from it. A rigorous decision process
allows for the data to change the opinion even if it goes against the prior assumptions. The decision
process needs to assess all the information that is available, not just the parts that fit the initial
assumptions.
Question your beliefs
Awareness of bias in your own thought process is a place to start.26,27 As you consider your decisions
about threat identification, ILI tool selection, contractor selection, or almost anything else, challenge
your own perception of the problem. Ask yourself questions such as:
• Why do I think this? Is this always true?
• Am I jumping to conclusions? What am I missing?
• Is this what the data shows? Is my data reliable?
• Do I believe this inspection report? Is this ILI feature what it is reported to be?

Your questioning can extend to the beliefs you hold as an organization:

• Why do we do it this way? Is it still the best way to do it? What do other operators do?
• We have procedures for everything, but are they being followed? We get all these reports from the field, but what are they telling us? Do we even look at them?
• We have been using this contractor/supplier/consultant for years, but are they still the best choice? Have we recently evaluated their performance?

24. American Petroleum Institute, "Managing System Integrity for Hazardous Liquid Pipelines", Recommended Practice 1160, 3rd Ed., February 2019.
25. American Petroleum Institute, "In-line Inspection System Qualification", Standard 1163, 2nd Ed., 2013, 2018.
26. We refrain from saying "knowing is half the battle" as that is a classic biased response known as the "G.I. Joe Fallacy", named after a 1980s TV show in which every episode ended with a short object lesson and the adage "Now you know. And knowing is half the battle". To the contrary, it has been shown (Kristal and Santos) that certain biases that are embedded emotionally cannot be overcome by awareness and conscious reflection alone. In those cases, knowing is much less than half the battle.
27. Kristal, A.S. and Santos, L.R., "G.I. Joe Phenomena: Understanding the Limits of Metacognitive Awareness on Debiasing", Harvard Business School, Working Paper 21-084, 2021.
The answers may be unsettling. If that is the case, you may have encountered a personal view or an
organizational behavior that has been influenced by an unproductive bias.
Behavior of a high-performance organization
When driving a vehicle in challenging conditions (heavy or fast-moving traffic, bad weather, winding
or poorly lit roads), the driver must continuously perform at a high level of attention and skill. A
momentary lapse in attention, or a deficit in skill or judgment, may cause the driver to fail to respond
to a situation with the near-instantaneous correct judgment and coordinated response needed, which
can be catastrophic.28 Vehicles are routinely operated in unforgiving environments that present the
potential for error and risk to others. Pipelines also operate in unforgiving environments (physical,
social, and political) and present the potential for error and risk to the public.29 Managing the risks
on the road requires the driver to perform at a high level; managing the risks of a pipeline requires
the operator’s organization to perform at a high level.
Business consultancies, educational institutes, and publications that cater to the business market often try to distill the distinguishing characteristics of a high-performance organization (HPO) to a Top 5 list, such as: communication (and buy-in) of values, reinforcement of positive behavior, open communication, management trust of employees, and collecting feedback.30 Others list safety performance and skill development31 or accountability and flat organizational structures.32 Any list of 5 characteristics is inadequate. Some characteristics at the staff and management level of an HPO might include:33
• They do what they say they will do
• They have the courage to do what is right
• They hold themselves and others accountable
• They understand what they are doing and why
• They anticipate problems and are alert for unusual conditions
• They ask questions, consult, and verify, rather than act on assumptions
• They value collective input and the views of others
• They speak up when problems are identified
• They communicate information in a disciplined way
• They seek staff development training
• They are willing to listen to contrary opinions

28. According to the National Safety Council, in 2020 there were 42,338 motor vehicle fatalities.
29. Most Americans are exposed to hazards from automobiles either as drivers, passengers, or pedestrians. Based on 2021 data from various sources, the authors estimate that approximately 50% of the US population is exposed to the risk from natural gas distribution, 15% from natural gas transmission, and 11% from hazardous liquid pipelines. With 10-year average annual fatality rates of 2.1, 1.4, and 7.7, respectively, if there were enough pipelines to expose 100% of the US population to the risk, the projected number of fatalities would be around 42/yr. Thus, automobiles pose an average 42,000/42 = 1,000 times greater net societal risk than do pipelines, but pipelines do pose a larger potential hazard on an incidental basis.
30. https://www.ottawa.edu/online-and-evening/blog/november-2020/5-key-elements-to-creating-a-highperformance-comp
31. https://www.industryweek.com/leadership/article/21146834/what-makes-a-highperformance-organization
32. https://www.bcg.com/publications/2011/high-performance-organizations-secrets-of-success
33. Discussions with Dr. Alan Murray, Pipeline Consultant.
It is more difficult for bias to thrive and persist in such a culture. The authors have observed
organizations in which management holds the above values to be important, but either that message
is somehow not communicated convincingly to staff, or the intended behaviors become overwhelmed
by time pressures or attention deficits that cause individuals to go into a “survival mode” in which
they fall back on heuristic patterns in the moment of decision.
On the other hand, there are many cultural behaviors that clearly will promote thought bias and
other unproductive behaviors in many forms. Some of those include:

• Management by intimidation
• Starving integrity budgets
• Overworked staff
• Impossible deadlines
• Management inability or unwillingness to either motivate or get rid of continually poor performing employees, which demotivates high performing workers
• Intracompany competition for resources
• Subject matter expert arrogance or dismissiveness
• Overconfidence in procedures, knowledge, and barriers against failures
• Denial or trivialization of errors
• Information hoarding
• Failure to hand off records – Construction to O&M, O&M to Integrity
• Siloing – a lack of communication or information sharing between groups
• Lack of constructive or relevant training or professional development
• Echo chamber – low awareness of problems, developments, or trends in the industry
• Unwillingness to look at new ways of doing something ("not invented here")
• Reliance on rules of thumb
• Procedures and integrity plans written to comply with the letter of regulations
We are not alone
Civil engineers have attempted to classify the causes of structural failures much as the pipeline
industry has done. In one study,34 the main factors affecting "proneness to structural accidents" were:
1. New or unusual materials
2. New or unusual methods of construction
3. New or unusual types of structure
4. Experience and organization of design and construction teams
5. R&D background
6. Financial climate
7. Industrial climate
8. Political climate
The fourth factor was related to organizational issues. Another study35 that focused on failures of steel bridges concluded that 10 of 24 cases were the result of human error in the design or construction phase. A Canadian study36 acknowledged that the construction industry (at that time) lacked an understanding of the basic factors involved in human error in the design and construction of structures and suggested turning to other disciplines, such as behavioral science, for guidance.
The Chernobyl disaster in 1986 shocked the international nuclear power industry into developing
the concept of the “safety culture” to explain the organizational dynamics that led to the accident.37
A definition of safety culture was developed by the British Advisory Committee on the Safety of Nuclear Installations (ACSNI),38 which was generalized to all hazardous industries:
“Safety culture is the product of individual and group values, attitudes,
competencies, and patterns of behavior that determine the commitment to, and
the style and proficiency of an organization’s health and safety programs.
Organizations with a positive safety culture are characterized by communications
founded on mutual trust, by shared perceptions of the importance of safety and by
confidence in the efficacy of preventive measures.”
This was 10 or more years after the Canadian structural industry contemplated delving into
behavioral science to explain structural failures. The pipeline industry has moved tentatively in that
direction also with the concept of a pipeline safety management system (PSMS).39 The PSMS sets
forth requirements for leadership, middle management, and employees that, as it turns out, mirror many of the favorable behaviors listed earlier for an HPO but directed toward safety. Thus, developing and maturing a PSMS may eventually contribute to reducing the risk of thought bias interfering with the effectiveness of an operator's integrity management plan. Though a PSMS, even if carried out correctly, will not debias any individual, it does have the potential to make the organization less reliant on bias.

34. Pugsley, A.G., "The Prediction of Proneness to Structural Accidents", The Structural Engineer, 51(6), June 1973.
35. Blockley, D.I., "Analysis of Structural Failures", Proc. Inst. Civil Engineers, January 1977.
36. Allen, D.E., "Structural Failures Due to Human Error: What Research to Do?", National Research Council Canada, Proc. Symposium on Structural Technology and Risk, July 1983.
37. Wilpert, B. and Fahlbruch, B., "Safety Culture: Analysis and Intervention", Conference Proceedings, Probabilistic Safety Assessment and Management, 2004.
38. Advisory Committee on the Safety of Nuclear Installations, Study Group on Human Factors, Third Report: Organizing for Safety, London: HMSO, 1993.
39. American Petroleum Institute, "Pipeline Safety Management Systems", Recommended Practice 1173, July 2015.
Acknowledgements
The authors wish to thank RSI’s RCFA team colleagues Ms. Stephanie Flamberg and Ms. Cara
Macrory for their insightful observations in many complex RCFA investigations. The authors also
wish to thank Dr. Alan Murray for his astute comments over many discussions about the role of bias
in incidents.