The Risks of Risk Registers

A Challenge to “MIS and the Illusion of Control”
1 Drummond's Propositions
Drummond eloquently illustrates the Janus head of MIS through a vivisection of risk
registers, noting that, although risk registers are commonly used, they often fail.
Drummond finds two root causes for their failure: (1) the fog of risk, and (2) the surplus
reality of risk registers.
Drummond first notes that risk registers compile information about possible futures of
the world and the impacts those futures may have on the company. As such, risk
registers are predisposed to fail because of managerial myopia and the indeterminism of
the future.
Secondly, Drummond argues, risk registers fail because they constrain managerial
thinking: they are limited to measurable and explicable information, and thus
discard the value of intuition, gut feeling, and emotion.
Because language frames managerial thinking, Drummond suggests novel metaphors to
reinvent risk management, i.e., reclassifying risks as ghosts and shadows, viruses, imps, or
choices.
The challenge put to Drummond (this issue) argues that (1) risk registers should not
be viewed as a metonymy; (2) the failure of risk registers is rooted not only in
individual fallacies but also in organizational nature, e.g., organizational
self-service and distrust of experts; and (3) a way out of the predicament
needs to go beyond semantics.
2 Risk Registers are more than a Metonymy
Risk registers are more than a rhetorical device. By framing the issue as a rhetorical
question, Drummond narrows our search for solutions. To open our thinking, one could
apply Alfred North Whitehead's notion of misplaced concreteness to risk registers.
Whitehead (1979:7) juxtaposes philosophy's biggest achievement, rationality, with its
biggest error: the fallacy of overstatement. Although in philosophy the
objective of generalizing is justified, philosophy overshoots its target: "the estimate of
success is exaggerated" (ibid.). A very similar claim can be made for risk
registers. In fact, Drummond illustrates how, although the aim of generalizing the risks of
an organization is justified, the assumed success is often overstated. Whitehead (1979)
identifies two root causes of overstatement: the fallacy of misplaced concreteness and the
fallacy of false certainty.
Drummond points out the fallacy of false certainty; the notion of misplaced
concreteness, however, is omitted. Misplaced concreteness can be argued to occur at two
different levels: (1) in compiling risk registers and (2) in managing risk registers.
While compiling risk registers, managers not only make the "accidental error of mistaking
the abstract for the concrete" (Whitehead, 1925:59); they also neglect the degree of
abstraction by using actual examples to exemplify larger categories of thought. The latter
fallacy is easy to spot in the exemplary risk register Drummond inadvertently
provides: "Risk that product failure... will result in adverse publicity that damages our
reputation...". This example demonstrates how listing a very concrete risk under the
broader category of customer care might lead to forgetting the other risks within that
broader category.
Likewise, the fallacy of misplaced concreteness occurs when managing risk registers. As
Drummond's observations show, the risk register itself is treated as something concrete
that replaces the abstractness of all future risks. Moreover, it is misplaced in a
management ritual in which managing the risk register becomes more important than
managing the risks within the organization. What Drummond labels 'mistaking the
map for the territory' and 'ignoring the surplus reality' has been observed in other
management-by-model approaches; Rayner (2010), for example, observed the fallacy of
misplaced concreteness in conservation modeling.
3 Root Causes are Organizational
While Drummond explores root causes on the individual level (the self-loathing effect,
the illusion of control), the article neglects organizational root causes.
Ritualistic management of risk registers, as described by Drummond, resembles
escalating behavior. Staw & Ross (1989) explain how the escalation of commitment is
shaped by psychological, social, and organizational determinants. While Drummond
eloquently explains the psychological and social root causes of failure, such as the need
for self-consistency (Heider, 1958), the article omits the organizational determinants.
Staw & Ross (1989) hypothesize that, similar to individuals, organizations suffer from
inertia, loose coupling between organizational goals and actions, and an imperfect
sensory system. Moreover, Staw & Ross (1989) point out that politics (Pfeffer &
Salancik, 1974) play an important role in organizational decision-making. Whyte (1989)
argues that groupthink is caused by uniformity pressures and group polarization, in
addition to framing effects and risk seeking in the face of losses. Another institutional
argument is put forward by Seo & Creed (2002), who point out that institutions can be
caught up in contradictions: a contradiction occurs when a legitimate practice, such as
managing risks by risk registers, undermines functional efficiency. In those cases
institutions ritualistically conform to rationalized myths, for example the myth that risk
registers are the best method to successfully manage risks.
4 Solutions need to go beyond Semantics
According to Seo & Creed (2002), changing organizational rituals requires three steps: (1)
a reflective shift in consciousness, (2) mobilizing actors, and (3) emerging collective
action. Drummond's article covers the initial stepping stone, yet additional processes
need to be undertaken to give substance to the required change.
Communication research in public health tells us that the reflective shift in consciousness
can only break bad habits when a solution is offered at the same time (Rogers, 1975).
What might be a solution to the predicament? This challenge proposes two facets of a
solution: (1) it needs to go beyond semantics; and (2) it needs to go back to the roots of
the problem.
Firstly, merely relabeling risks will not change the discourse or break the paradigm that
frames our current decisions. Although the suggested re-framings and re-imaginings might
help identify and focus on relevant risks, and although it might be more adequate to
stress the emergent nature of risks, new rhetoric can only be the starting point of the
discussion.
This challenge therefore asks: Why did we start using risk registers in the
first place? Going back to the roots, one finds that risk registers perform two important
functions in organizations: (1) they are boundary objects and (2) they
prioritize.
Firstly, Drummond argues that the instrumental reason for having risk registers is
questionable. The example, however, shows one important function of risk registers: they
act as boundary objects in organizations. As Star & Griesemer (1989) explain,
boundary objects are "concrete and abstract, specific and general, conventionalized and
customized" at the same time. Risk registers are created to bring together different worlds
of the organization, actors with different purposes. Risk registers enable communication
between diverse groups of people, a communicative process which Star & Griesemer call
"translations of each others' perspective" (1989:412), for instance the engineers'
translation of the marketing perspective or middle management's translation of the top
management perspective. However, as Drummond points out, the discourse around risk
registers can become ritualistic and might destroy rather than create value. Habermas
(1971) raised a similar concern, one that has since been adopted to evaluate technical
risk assessments of technologies (Webler, Rakel & Ross, 1992). Habermas posed the
question: What are the biggest threats to the necessary conversation we should have about
risk? Habermas (for a clear and concise summary see Flyvbjerg, 2001, chapters 5-7)
argues that discourse needs to be free in order to establish communicative rationality.
Communicative rationality helps organizations establish an understanding of the issue
at the center of the debate and of which course of action is best to take. Even though
communicative rationality might be unobtainable in practice, Habermas (1971) outlines
five procedural requirements for free discourse, to which Flyvbjerg (2001) adds a
sixth:
“1. No party affected by what is being discussed should be excluded from the discourse
(the requirement of generality)
2. All participants should have equal possibility to present and criticize validity claims in
the process of discourse (autonomy)
3. Participants must be willing and able to empathize with each other's validity claims
(ideal role taking)
4. Existing power differences between participants must be neutralized such that these
differences have no effect on the creation of consensus (power neutrality)
5. Participants must openly explain their goals and intentions and in this connection desist
from strategic action (transparence)...
6. Unlimited time.” (Flyvbjerg, 2001:91).
Although unobtainable in practice, these criteria can help risk practitioners
assess how far an organization is from a truly rational discourse about risk.
Secondly, risk registers help to focus and prioritize resources, attention, and effort on the
root of the problem. Davis (2003) describes a novel approach to planning for resilience:
building a network of autonomous cells whose capabilities, objectives, and interfaces to
other cells are defined, but not their tasks and schedules. Is quantification of risks
obsolete? No; a second possible pathway to resilience is to buy insurance not only against
external events but also against potentially ruinous large capital investments and change
projects executed within the company. This challenge argues that quantitative risk
estimation should focus on planning contingencies (e.g., Reference Class Forecasting as
outlined in Flyvbjerg et al., 2006); stretching the concept of probabilistic quantification
into task-level planning is again falling for the fallacy of misplaced concreteness and the
fallacy of false certainty.
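The contingency-level use of quantification argued for here can be made concrete with a
minimal sketch of the core idea behind Reference Class Forecasting: take the empirical
distribution of cost overruns in a class of comparable past projects and read off the
budget uplift needed to keep the residual risk of an overrun at an acceptable level. The
sketch below is an illustration only, not the full procedure of Flyvbjerg et al. (2006),
and the overrun figures are hypothetical.

```python
# Minimal sketch of Reference Class Forecasting for contingency planning.
# All figures below are hypothetical, not taken from Flyvbjerg et al. (2006).

def required_uplift(overruns, acceptable_risk=0.2):
    """Budget uplift (fraction of the base estimate) needed so that the
    chance of exceeding the budget is at most `acceptable_risk`, based on
    the empirical distribution of overruns in the reference class."""
    ranked = sorted(overruns)
    # Index of the (1 - acceptable_risk) empirical quantile.
    k = min(len(ranked) - 1, int(round((1 - acceptable_risk) * len(ranked))))
    return max(ranked[k], 0.0)

# Hypothetical reference class: cost overruns of ten comparable past
# projects, as fractions of the original estimate (0.3 = 30% over budget).
reference_class = [0.05, 0.10, 0.15, 0.20, 0.30, 0.35, 0.45, 0.60, 0.80, 1.10]

uplift = required_uplift(reference_class, acceptable_risk=0.2)
budget = 1_000_000 * (1 + uplift)  # contingency applied to a 1.0m base estimate
```

For this hypothetical class, accepting a 20% chance of overrun requires an uplift of 80%
of the base estimate; demanding more certainty pushes the uplift toward the worst observed
outcomes. The point of the sketch is that the quantification sets an overall contingency,
rather than being stretched into task-level planning.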
5 The Risks of Risk Registers
To summarize, this challenge to Drummond's paper argues, first, that the shortcomings of
managing by risk registers are better described by the fallacy of misplaced concreteness
than by organizational rhetoric. Secondly, while the root causes are social and cultural,
they are also organizational. Finally, a solution needs to go beyond rhetoric: it needs to
be a critical assessment of the quality of the risk discourse, a discourse in which risk
registers are a boundary object and also a reflection of planning approaches for
organizational resilience.
References
Davis, Paul K. (2003): Uncertainty Sensitivity Planning; in: Johnson, Stuart; Libicki,
Martin; Treverton, Gregory F. (Eds.): New Challenges - New Tools for Defense Decision
Making, 2003, pp. 131-155; ISBN 0-8330-3289-5.
Flyvbjerg, Bent (2001): Making Social Science Matter - Why Social Inquiry Fails and How
It Can Succeed Again; Cambridge University Press, Cambridge, 2001.
Flyvbjerg, Bent; Bruzelius, Nils; Rothengatter, Werner (2006): Megaprojects and Risk -
An Anatomy of Ambition; Cambridge University Press, Cambridge, 2006.
Heider, Fritz (1958): The Psychology of Interpersonal Relations; Lawrence Erlbaum
Associates, Hillsdale, NJ, 1958.
Pfeffer, Jeffrey & Salancik, Gerald R. (1974): The Bases and Use of Power in
Organizational Decision Making: The Case of a University; in: Administrative Science
Quarterly, Vol. 19, No. 4 (Dec., 1974), pp. 453-473.
Rayner, Steve (2010): Handbags and Goats Entrails; Conference Presentation at CRASSH
- Challenging Models in the Face of Uncertainty, Cambridge University, 28-30
September 2010.
Rogers, Ronald W. (1975): A Protection Motivation Theory of Fear Appeals and Attitude
Change; in: Journal of Psychology, Vol. 91, Issue 1 (Sep., 1975), pp. 93-115.
Seo, Myeong-Gu & Creed, Douglas W.E. (2002): Institutional Contradictions, Praxis, and
Institutional Change: A Dialectical Perspective; in: The Academy of Management Review,
Vol. 27, No. 2 (Apr., 2002), pp. 222-247.
Star, Susan Leigh & Griesemer, James R. (1989): Institutional Ecology, 'Translations' and
Boundary Objects - Amateurs and Professionals in Berkeley's Museum of Vertebrate
Zoology, 1907-39; in: Social Studies of Science, Vol. 19, No. 3 (Aug., 1989), pp. 387-420.
Staw, Barry M. & Ross, Jerry (1989): Understanding Behavior in Escalation Situations;
in: Science, Vol. 246, No. 4927 (13 October 1989), pp. 216-220.
Webler, Thomas; Rakel, Horst & Ross, Robert J.S. (1992): A Critical Theoretic Look at
Technical Risk Analysis; in: Organization & Environment, Vol. 6, No. 1 (March 1992),
pp. 23-38.
Whitehead, Alfred North (1925): Science and the Modern World - 1925 Lowell Lectures;
Macmillan, published 1962.
Whitehead, Alfred North (1979): Process and Reality; Simon and Schuster, 1979.
Whyte, Glen (1989): Groupthink Reconsidered; in: The Academy of Management Review,
Vol. 14, No. 1 (Jan., 1989), pp. 40-56.