Learning to Solve the Right Problems:
The Case of Nuclear Power in America

Jonathan B. King
Successful problem solving requires finding the right solution to the right problem. We fail more often because we solve the wrong problem than because we get the wrong solution to the right problem.
Russell Ackoff, 1974
But then, you may agree that it
becomes morally objectionable for
the planner to treat a wicked
problem as though it were a tame
one, or to tame a wicked problem
prematurely, or to refuse to
recognize the inherent wickedness
of social problems.
Rittel and Webber, 1973
Just how safe are nuclear power plants?
While the construction of new plants is on hold
in America and while Sweden plans to phase
hers out over the next decade or so, France and
Japan continue to build them.
The recent Chernobyl disaster has clarified
little. Most realize that sooner or later the
secrecy endemic to a closed society comes
back to haunt it and that operating graphite core
reactors with no containment structures is
playing with fire. However, we don't have
graphite core reactors in America; we have
neither the (former) USSR's "administrative-command" structure nor its KGB; our press is
open, perhaps to a fault. Yet, both proponents
and opponents of nuclear power can claim, and have claimed, to find support for their positions in what
happened at Chernobyl.
One reason for continuing controversies is
the inherent uncertainties in assessing the
safety of complex systems such as nuclear
power plants. For example, while the
Rasmussen report, nearly a foot thick, reassures
us that the chances of a catastrophic accident
are vanishingly small, the Union of Concerned
Scientists and the Clamshell Alliance vehemently disagree. A second reason is that safety issues are unavoidably embedded in controversies over future energy needs. While
some view the Greenhouse Effect as a mandate
for nuclear power, others propose that we
pursue conservation with a vengeance. A third
reason is that the issues of both safety and
future energy needs are, in turn, embedded in
still larger controversies over what constitutes
the good life, over the kind of world we want for
our children and why. For example, while some
view the Clamshell Alliance as a bunch of
Luddites, others see the pro-nuclear folks as
naively utopian.
Given these wildly different views, who is
right? Perhaps nobody is right in the sense that
we may be trying to solve the wrong problems.
Instead of being a "tame problem," nuclear
power is decidedly a "mess" if not also a "wicked
problem."1
There are compelling reasons for learning to
solve the right problems. First, strategies for
solving tame problems differ qualitatively from
strategies appropriate for messes. Messes are
puzzles; rather than "solving" them, we sort out
their complexities. In turn, solving and sorting
both differ qualitatively from strategies for
dissolving the barriers to consensus implicit in
wicked problems.2
A second and more compelling reason is not
so much that solving the wrong problems fails to
solve the right problems. Rather, the greater
danger is that by solving the wrong problems,
we unwittingly undermine what it takes for us to
solve the right problems. The danger is not so
much that we fail to build our bridges across the
right rivers. Rather, the greater danger is that we
destroy the materials we need to build our
bridges across the right rivers.
A third reason for learning to solve the right
problems is that controversies over nuclear
power in America may be paradigmatic of things
to come. Other more powerful technologies are
being rapidly developed which give every
indication of generating messes if not wicked
problems.
Unfortunately, we face a number of obstacles
to solving the right problems. Developing our
capacity to frame problems as messes--learning how to sort through complexity and uncertainty--constitutes a major challenge in our turbulent
times. In turn, developing our capacity to frame
problems as wicked problems--learning how to
deal with those sorts of problems for which there
are no "solutions"--constitutes an even greater
challenge in our increasingly pluralistic times.
The alternatives to solving the right problems
are potentially catastrophic. Continuing to try to
"tame" a world increasingly filled with messes,
let alone wicked problems, makes it a
dangerously unstable place.
TAME PROBLEMS AND MESSES
For every complex problem there is
a simple solution. And it is wrong.
H.L. Mencken.
Discovering a vaccine for smallpox, analyzing
the chemical components of air pollution, and
lowering the prime interest rate are tame
problems. Tame problems can be solved in
relative isolation from other problems.
We solve tame problems through analytical
methods--breaking things down into parts, fixing
components, assessing the probability of known
sequences of failures leading to a nuclear
meltdown. We organize ourselves to solve tame
problems through specialization--the division of
labor, departmentalization, teaching a course in
nuclear engineering over here and a course in
group dynamics over there and yet another
course in international terrorism somewhere
else.
Culturally, tame problems enjoy consensus: everybody pretty well agrees why something needs to be done and the right way to go about doing it.
There are countless examples of tame
problems, the type of problems that Warren
Weaver termed problems of "organized
simplicity."3 Solving them has been the great
forte of science for several hundred years. Due
in large part to such successes, they remain the
ideal for many social scientists as well as
managers and administrators.
However, things have become messier. We
are increasingly faced with problems of
"organized complexity," clusters of interrelated
or interdependent problems, or systems of
problems. "English does not contain a suitable
word for 'system of problems.' Therefore, I have
had to coin one. I choose to call such a system
a mess" (Ackoff, p. 21).
Problems which cannot be solved in relative
isolation from one another form messes. We
sort out messes through "systems" methods,
through focusing on "processes" and through
"interdisciplinary" approaches. Rather than
simply breaking things down into parts and fixing components, we examine patterns of interactions among parts. We look for patterns such as vicious and virtuous circles, self-fulfilling and self-defeating prophecies, and deviation-amplifying feedback loops. We organize
ourselves to sort out messes through such
things as cross-functional groups, redundant
training, and so-called "learning organizations"
(Senge, 1990). Culturally, messes entail the
widespread consensus that "if you become
obsessed with interdependence and causal
loops, then lots of issues take on a new look"
(Weick, p. 86). Messes demand a commitment
to understanding how things going on here-and-now interact with other things going on there-and-later.
Many examples illustrate the concept of
messes. AIDS is messier than smallpox; dealing
with water pollution is more puzzling than
building sewage systems; automobile
congestion isn't solved by simply building more
freeways; macro-economic policies are a whole
lot messier in a global economy.
A primary danger in mistaking a mess for a
tame problem is that it becomes even more
difficult to deal with the mess.4 The simplest of
examples illustrates this key point. Asking which
of your teenage kids started the argument
mistakes a mess for a tame problem. Trying to
tame the problem by blaming one of them
usually makes things worse.
Are nuclear power plants different from
teenagers' arguments in the sense that blaming
"operator error" for a mishap mistakes a mess
for a tame problem? Why, for example, did The
President's Commission to Investigate the
Accident at Three Mile Island primarily blame the
operators, and why did the builders of the plant's
equipment blame only the operators (Perrow,
p.7)?
Of course, it is often politically expedient to
blame operators rather than the "system," for
managers and administrators are primarily
responsible for the system. However, consider
the implications of the following argument made
in one of Britain's most prestigious journals
merely two years ago:
A point has been reached in the
development of technology where the
greatest dangers stem not so much from
the breakdown of a major component or
from isolated operator errors, as from the
insidious accumulation of delayed-action
human failures occurring primarily within
the organizational and managerial
sectors...[which] emerge from a complex
and as yet little understood interaction
between the technical and social aspects
of the system. (Reason, p.476)
Perhaps blaming operator error is not merely
politically expedient. Perhaps it is because
managers and administrators also do not know
how to think in terms of messes; they have not
learned how to sort through complex socio-technical systems. Over two decades ago, Karl
Weick noted that "[m]ost managers get into
trouble because they forget to think in circles.
Managerial problems persist because managers
continue to believe that there are such things as
unilateral causation, independent and dependent variables, origins, and terminations.
Examples are everywhere" (Weick, p. 86). Over
twenty years later, Peter Senge drives home the
same point in his acclaimed book, The Fifth
Discipline.
A widespread failure to think in terms of
"circles" is all the more sobering when you add
things like nuclear power plants to the
equation. It is still more sobering when you
consider that the gap between our
understandings of complex systems and those
we are creating may be growing.
But the most disturbing implication is a
continuing reluctance on the part of social
scientists, managers, administrators, and
educators to ask the kinds of questions
germane to messes. Indeed, it sometimes
seems many don't know that they don't know.
There are a number of reasons why news
travels too slowly.
The ways we talk about things matter. For
example, talking about nuclear power plants as
if they are "power plants" is a fundamentally
misleading analogy, a point repeatedly emphasized by Medvedev in The Truth About
Chernobyl. More generally, messes offend our
sense of linear logic, the linear syntax of our
language, and our continuing belief in prediction
(an issue to which we shall return).5
We also remain grossly ignorant of the
dynamics of too many messes; we are only now
developing some of the tools we need. In
particular, we are still predominantly organized
to solve tame problems: our business
organizations and our institutions of higher
education remain largely strangers to
interdisciplinary or cross-functional groups and
integrative or synthetic studies.
Politically, messes require top and middle
managers to relinquish traditional authority and
forms of control, something most are loath to do.
More disturbing, in turbulent times people often
feel insecure and threatened, turning to those
who offer reassuring but simplistic answers.
These obstacles themselves constitute a
mess. At least we appear to be moving toward a
consensus that the ways we talk, our very
methods of inquiry, and the ways we organize
ourselves predetermine most of what we are
able to know.
NUCLEAR POWER IS A MESS
New methods of risk assessment and risk
management are needed if we are to
achieve any significant improvements in the
safety of complex, well-defended, socio-technical systems.
J. Reason, 1990
Compelling arguments and evidence suggest
that the methods of probability risk assessments
(PRAs) and risk management limit our thinking,
for we start by assuming that we face tame
problems. Thus, the reason we need new
methods of risk assessment is to enable us to
see things we otherwise would overlook.6
Consider the facts
The primary argument that we are solving the
wrong problems is offered by Charles Perrow in
Normal Accidents: Living With High-Risk
Technologies (1984) with corroborating insights
from other general system theorists. Perrow's
and others' arguments can easily be
summarized (Table 1).
TABLE 1
Type I:   Known outcomes + fixed sequences       = deterministic
Type II:  Known outcomes + known probabilities   = stochastic
Type III: Known outcomes + unknown probabilities = uncertainty
Type IV:  Unknown outcomes + moot issue          = emergence
Perrow essentially argues that conventional
methods of risk assessment and management
presuppose Type I and Type II problems from
the outset. The methods used by a number of
"authoritative" risk assessments on the safety of
nuclear power plants assign probabilities to
known sequences of failures leading to one of
several known disasters.7 This approach
presumably answers the question, "How safe
are they?" Applying the same approach to
alternative energy sources then allows us to
calculate optimum risk-benefit options based on
alternative discount rates.
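The arithmetic such assessments rest on can be sketched in a few lines. The failure events and probabilities below are hypothetical illustrations of the method, not figures from any actual PRA:

```python
# Toy sketch of a conventional probabilistic risk assessment (PRA):
# multiply (assumed independent) per-step failure probabilities along
# one *known* sequence of failures leading to one *known* disaster.
# All event names and numbers here are hypothetical.
from math import prod

failure_sequence = {
    "pipe_rupture":      1e-4,  # chance per reactor-year (hypothetical)
    "backup_pump_fails": 1e-2,  # chance given the rupture (hypothetical)
    "operator_misreads": 5e-2,  # chance given both prior failures (hypothetical)
}

p_disaster = prod(failure_sequence.values())
print(f"P(disaster via this one sequence) = {p_disaster:.1e}")  # 5.0e-08
```

Put in these terms, Perrow's objection is that the product is only as complete as the enumerated sequences: unanticipated interactions among components, the Type III and Type IV cases, appear nowhere in the calculation.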
The major shortcoming of this approach is that
it does not address unknown sequences of
failures--it does not "measure" unanticipated
interactions among components which may
interactively escalate into a systems collapse.
To take such surprises into account, we need
measures of our ignorance.
By reconceptualizing systems such as nuclear
power plants as Type III messes, Perrow
derives two measures of a system's capacity to
surprise us. These measures of the degree of
our ignorance are "interactive complexity" and
"coupling."
"Interactive complexity" is a measure of the
degree to which we cannot foresee all the ways
things can go wrong. This may be because
there are simply too many interactions to keep
track of. More likely, it is because our various
theories are simply not up to the task of
modeling socio-technical interactions.
"Coupling" is a measure of the degree to
which we cannot stop an impending disaster
once it starts. This may be because we don't
have enough time, because it is physically
impossible, or because we don't know how.
The greater the degree of interactive
complexity, the less our capacity to prevent
surprises. The greater the degree of coupling,
the less our capacity to cure surprises. The
greater the degree of interactive complexity and
coupling, the greater the likelihood that a system
is an accident waiting to happen.
In such systems, "operator errors" merely
serve as triggers. Trying to find, let alone blame,
the particular straw that broke the camel's back
is therefore an exercise in futility--a "fundamental attribution error."
Worse, assuming that we are dealing with
tame problems leads to fixing components. Yet
adding active safety features may, in fact,
increase the system's overall complexity,
increase its degree of tight coupling, or both.
Similarly, reducing the role of operators to
passively monitoring a system may backfire by
effectively de-skilling them and, in the longer
run, by boring them to death.8
Strategies for dealing with Type III messes are
therefore quite different from those appropriate
for tame problems. Strategies logically follow
from the ways problems are conceptualized.
Thus, increasing our capacity to prevent
unanticipated interactions among components
entails simplifying systems (KISS); increasing
our capacity to cure them entails de-coupling
major components (e.g., build in longer times-to-respond).
Nor do conventional approaches and the
standard literature address Type IV problems.
These are problems where unknown or
unimagined outcomes emerge as a result, say,
of operating nuclear power plants. For example,
who would have imagined in the heyday of
nuclear power that less than two decades later
Saddam Hussein would buy a reactor from
France with the all-too-probable aim of
blackmailing--if not taking out--Tel Aviv with a
nuclear weapon?
Strategies appropriate for Type IV messes
essentially insure us against real surprises,
namely, the emergence of unanticipated
outcomes. Thus, increasing our resilience when
confronted with undesirable outcomes entails
fall-back positions. We need to build in diversity,
reversibility, or both, in systems that indicate the
potential for "emergent" or unknown outcomes.
In sum, Type III and Type IV strategies are
essentially insurance policies. However, it is
difficult to convince people to pay for such
insurance if we continue to mistake messes for
tame problems.
So, to what degree do some, many, or most of
our nuclear power plants qualify as complex,
tightly coupled systems? Since the way we
frame problems in the first place determines
what we can know about them, what do we find
when we look at degrees of interactive
complexity and coupling?
In fact, there is quite a history of unanticipated
near misses. For example, in the early sixties a
Nobel Laureate physicist claimed a core
meltdown was impossible at the Fermi sodium
cooled breeder reactor, and another expert
claimed that, even were the impossible to
happen, automatic safety devices would shut
the reactor down. But then read the subsequent
"hair-raising decisions" and "terrifying thoughts"
as "We Almost Lost Detroit," when parts of the
core did melt and the automatic safety devices
did not shut down the reactor. So what? Read a
classified report by the Atomic Energy Commission before the near-catastrophe
estimating that a severe accident coupled with
unfavorable wind conditions would result in
around 70,000 quick deaths plus another couple
hundred thousand intensely radiated with
serious-to-deadly effects (Perrow, pp. 50-52).
Perrow points out with graphic examples that
in the years subsequent to the Fermi plant's
near-disaster, the "incidents" documented in the
Nuclear Regulatory Commission's (NRC)
regularly published Nuclear Safety provide
"endless, numbing fascination as they describe
all the things that can go wrong in these
awesome plants" (Perrow, p. 46). In The Truth
About Chernobyl, Grigori Medvedev makes a
similar claim which he explicitly argues is not
confined to the USSR's graphite core reactors
sans containment shelters: "Unfortunately no
instructions and regulations can encompass the
enormous variety of conceivable combinations
of regimes and mishaps that may occur"
(Medvedev, p. 258).
Corroborating evidence is offered by Harding
(1990). Of the two dozen-odd studies done on
specific American reactors in the last decade
(there are at least seven major reactor and
containment designs, not to mention specific
site differences--e.g., located in an earthquake-prone area), there is no evidence that Chernobyl
was a "unique" occurrence. Even granting the potential significance of such unusual configurations, the NRC only began to order "Individual
Plant Evaluations" in 1989.
Moreover, such "external" factors as fires and
earthquakes obviously complicate probabilistic
estimates still more. For example, earthquake-induced problems such as power outages and
electric relay chatter are viewed by some as
potentially more significant than structural flaws
in nuclear power plants. So are terrorist attacks,
but how do we even begin to guess the probability of a terrorist (whoever that might include) attack (whatever form and timing this might take) on large, sitting targets (which one)?
Clearly one of the untidy issues in dealing with
messes is where you draw the boundaries of a
particular system. Pointing out that everything is
ultimately related to everything else isn't very
helpful. What we need are methods for sorting
things out, for boundaries are rarely self-evident.
As Rittel and Webber pointed out nearly twenty
years ago,
[The systems-approach of the "second
generation"] should be based on a model
of planning as an argumentative process
in the course of which an image of the
problem and of the solution emerges
gradually among the participants, as a
product of incessant judgment, subjected
to critical argument. (Rittel and Webber,
p. 162)
Thus, the major obstacle may be less a matter
of actually drawing boundaries and more a
matter of investing the time and effort in
boundary drawing processes. In the words of a
leading authority on total quality management:
"The challenge, actually, is to not jump to
conclusions too soon" (Berry, p. 67).9
While drawing boundaries is a crucial
strategy for sorting out messes, allowing an
"image" of the problem to "gradually emerge" is
a very different process from testing hypotheses
in science, from management by objectives in
business, or from the adversarial process of
courts of law. Instead, boundary drawing in
science corresponds to the still mysterious
process of coming up with good hypotheses in
the first place; in business firms, it corresponds
to the continuous improvement of processes;
and in government regulatory policy, it entails a
strategic shift away from our overreliance on an
"adversarial legal system that makes what
regulations we do have much more difficult to
implement" (Thurow, p. 122).
In sum, by redrawing the boundaries of
nuclear power plants to include complex socio-technical systems, we can more effectively
reconceptualize the problem in terms of
messes. Once we have sorted out this
dimension of the nuclear power controversy,
remedies are pretty evident.
One remedy which has gained even the
reluctant endorsement of the Union of
Concerned Scientists is designing what are
termed "inherently safe reactors." Not
surprisingly, these are systems designed to be
both radically simpler and more resilient. A
second complementary strategy is to study and
apply the characteristics of what Reason terms
"high-reliability organizations" or what Senge
terms "learning organizations." This entails
developing indicators of "latent failures" built into
complex socio-technical systems or isolating
various organizational "learning dysfunctions,"
respectively.
However, this still leaves us with scores upon
scores of currently operating nuclear reactors,
not to mention related problems of nuclear
waste disposal and the proliferation of weapons-grade materials. Therefore, the boundaries of
the nuclear power mess necessarily expand to
include assessments of future energy needs,
especially as plant after plant starts coming up
for extension of its operating life.
Consider the "what ifs"
A sensible argument is that we need
to extend the operating lives of our currently
operating nuclear power plants because they
are essential to help meet our future energy
demands (not to mention those of the
developing countries). Not only do we need a
growing supply of cheap energy, but given the
various risks associated with each major source
of energy, a balanced policy makes sense. So,
our energy "portfolio" should spread risk by
including some coal, some natural gas, some
oil, some hydroelectric, some windmills, some
solar, and some nuclear power. Plus a dash of
conservation.
Other people, however, can and do arrive at
very different solutions based on equally
plausible scenarios of the future. What if U.S.
policy pursues conservation with a vengeance?
What if our automobiles average 75 miles per
gallon by the year 2000? What if those variable-load devices we read about somewhere were
retrofitted to our millions of electric motors
resulting in major energy savings? What if we
had light bulbs which were cheap, which lasted
for years, and which consumed less energy?
What if superconductors come on strong?
What if the energy needs of our so-called
"third wave" information economy are radically
less than those of our "second wave" industrial
economy? What if future birthrates fall
significantly in our and various other countries?
What if the Greenhouse Effect increasingly
looks to be benign?
Then there are the "what ifs" we cannot even
ask because we don't have a clue as to what
they might be. These are Type IV problems
noted in Table 1. For example, there is no
reason not to suppose that some fundamental
surprises may emerge over the next decade.
Perhaps rapidly advancing technologies such as
macro-molecular "engines of creation" (Drexler,
1986) and genetically engineered organisms will
shift current demand and supply projections of
future energy needs off the map. Who knows?
The basic problem is that we cannot predict
future energy needs because we cannot predict
the future. We cannot even predict possible
futures. At best, we can only predict probable
aspects of the future. There are several reasons
for this messy state of affairs.
As Donald Michael points out, "All we have are
endless fragments of theory that 'account' for
bits and pieces of individual, organizational, and
economic behaviour. But we have no
overarching or truly interconnecting theories,
especially none that accounts for human
behaviour in turbulent times" (Michael, p. 95).
Second, our world is indeed becoming more
turbulent. Not only are things happening much
faster these days, but more wild cards are
showing up--emerging--in the deck. For these
reasons alone, there may be no way to know the answer to many questions any faster than by watching what is going on.
Third, there are no grounds to suspect things
could be better in principle. There are no sound
reasons to claim that the social sciences are
going to "mature" or "evolve" to the point that they
achieve the predictive power of those sciences
which deal with things that don't need to talk to
each other. More generally, "chaos theory" poses
fundamental challenges to longstanding and
dominant conceptualizations of predictability.
These realities are going to be hard for a lot of people to swallow. For example,
The expert's claim to status and reward
is fatally undermined when we recognize
that he possesses no sound stock of law-like generalizations and when we realize
how weak the predictive power available
to him is...I do not of course mean that
the activities of purported experts do not
have effects and that we do not suffer
from those effects and suffer gravely.
(MacIntyre, p. 107)
While we cannot predict the future, what we
choose to do now certainly affects the future.
Giving up illusions of what is variously termed
"machine-age thinking," "a mechanistic epistemology," or the "Newtonian worldview"
does not mean everything turns to mush. Our
choices have consequences even if we cannot
pretend to know what they may turn out to be.
So, what are appropriate strategies given this
added dimension of messiness? In The New
Realities (1989), Peter Drucker draws on the
analogy of maintaining the climate versus
predicting the weather:
The new mathematics of complexity
raises an even more disturbing question:
Can there be an economic policy at all?
Or is the attempt to control the "weather"
of the economy, such as recessions and
other cyclical fluctuations, foredoomed to
failure?
Economics rose to its eminence in this
century precisely because it promised to
be able to control the "weather"...are we
going to shift from government as the
economic activist of the last sixty years to
emphasis on government responsibility to
maintain the right "climate"? (Drucker, pp.
167-8)
Put another way, one of W. Edwards Deming's
fundamental tenets is to shift from strategies
which focus on results, outcomes, or objectives,
to strategies which focus on continuously
improving processes. This strategic shift
requires a mindset change of almost heroic
proportions for many managers, administrators,
and other "experts."
Nevertheless, the notion of maintaining the
right climate increasingly makes strategic sense
to an increasing number of people. Key climate
terms are becoming familiar: diversity, flexibility,
adaptability, and rapid response times. So, also,
such notions as learning how to learn, thinking
about how we think, and the learning
organization.
So far, so good. At least we know the scope
and methods that we need to employ in order to
sort through messes. We now know that
effectively sorting through messes entails some
fundamental changes in what we think and how
we think, in what we teach and how we teach,
and ultimately in the ways that we organize
ourselves.
MESSES AND WICKED PROBLEMS
Our point is that diverse values are held
by different groups of individuals -- that
what satisfies one may be abhorrent to
another, that what comprises problem-solution for one is problem-generation
for another. Under such circumstances,
and in the absence of an overriding
social theory or an overriding social
ethic, there is no determining which
group is right and which should have its
ends served.
Rittel and Webber, 1973
What if we choose to continue to build an
economic -- and political -- infrastructure
predicated on the belief we will need more
energy? What if we instead pursue conservation
with the vengeance some claim it deserves?
Which provides us with more flexibility, greater
adaptability? These clearly are policy decisions
that will surely determine significant aspects of
our futures.
The criteria for jointly evaluating such choices, however, are more complicated: they demand social decision-making, not merely individual adaptability or flexibility. Maintaining a
favorable climate is fine until you get specific, for
your version of a sunny climate may strike me
as a stormy one. For example, nuclear power
plants entail very different configurations of
power than do solar heated and lighted
buildings; a landscape dotted with concentrations of power is not everybody's idea of a
sunny climate.
In short, strategies for dealing with messes are
fine as long as most of us share an overriding
social theory or overriding social ethic. If we
don't, we face wicked problems.
Wicked problems are what E.F. Schumacher
termed "divergent" as opposed to "convergent"
problems. A convergent problem promises a
solution. The more it is studied, the more
the various answers sooner or later converge.
Tame problems are convergent by definition.
Messes are convergent if we agree on what
overlaps, on appropriate strategies, and on the
kind of "climate" we wish to maintain. A
divergent problem does not promise a solution.
The more it is studied, the more people of
integrity and intellect inevitably come to different
solutions.
As with messes, there are very real dangers in
"solving the wrong problem." Mistaking or
misrepresenting wicked problems for messes,
let alone tame problems, almost inevitably leads
one to conclude that those with different
answers lack integrity, intellect, or both. The
great danger is that such conclusions undermine
trust, and trust is a fundamental strategy for
collectively coping with wicked problems.
If wicked problems are becoming more
common in our modern era, and there is
compelling evidence they are, we face a
strategic choice. We can continue to
misrepresent them as messes or tame
problems, hoping they will not degenerate into
culture wars, class warfare, or revolution. This
seems increasingly risky in our increasingly
pluralistic society if for no other reason than this
strategy may itself be further exacerbating the
dark side of pluralism.
On the other hand, we can acknowledge
wicked problems for what they are and try to
stabilize them as "conditions." This is not going
to be easy because wicked problems offend our
sense of logic and our common beliefs even
more than messes. In our modern times, it is
pretty hard to accept that such-and-such a
problem has no solution. This seems
tantamount to giving up, leaving the field to
one's adversaries.
Irving Kristol offers an instructive example of a
wicked problem and a thoroughly modern
reason why we fail to recognize wicked problems for what they are.
One of the wisest things ever said
about the study of human affairs was
uttered by an Israeli statesman...who,
being sharply examined about Israeli
foreign policy...and the future of East
Jerusalem (the Old City), an area
sacred to all three Western religions,
said "East Jerusalem? That's no
problem at all...In politics, if you don't
have a solution, you don't have a
problem. What you have is a condition,
in the medical sense of the term."
With those words, he was affirming a
traditional political way of looking at
human affairs, rather than the more
modern "social-scientific" way, with its
"problem-solution" dichotomy. This
traditional way has its own fund of
wisdom to draw upon, based on
generations of experience and finding
formulation in something called
"common sense." (1978, p. 15)
"Common sense" means common ground.
Establishing common ground is arguably
becoming a strategic necessity in our turbulent
times and not merely in issues of nuclear power.10
NUCLEAR POWER IS A WICKED PROBLEM
It is a loss of orientation that most
directly gives rise to ideological activity,
an inability, for lack of usable models, to
comprehend the universe of civic rights
and responsibilities in which one finds
oneself located.
Clifford Geertz, 1973
Wicked problems are synonymous with what
Geertz terms "a loss of orientation" or what
Rittel and Webber term the absence of an
"overriding social theory or an overriding social
ethic." Thus, wicked problems are evidenced by
the ideological controversies that result when
the boundaries of messes expand to include
socio-political and moral-spiritual issues.
Some will argue that those who expand the
boundaries of nuclear power to include, say,
"deep ecology" are themselves the problem.
This notion is naive. So-called empirical studies
and the social sciences are necessarily shot
through with implicit and explicit value
assumptions and ideological considerations.
Moreover, our values and ideological
considerations are "objectified" or
"institutionalized" as our prevailing ways of
talking, as power structures, as tools, and as
patterns of interactions. Thus, those who
support the status quo are no more or less
"ideological" than those who oppose it; Scientific
American articles assessing the riskiness of
nuclear power plants are as shot through with
value assumptions and ideological considerations as articles appearing in Mother Jones or
The Whole Earth Review.
"Wickedness" occurs when people confer
immutability on value assumptions and
ideological considerations. Thus, the strategic
issue is whether we choose to allow wicked
problems to degenerate into tyranny or chaos,
whether we choose to stabilize them as
"conditions," or, more radically, whether we
choose to try to dissolve them together.
Consider the following constructed debate.
This set of familiar and often passionately stated
positions not only illustrates the absence of an
overriding social theory or social ethic, but also
the inescapable and ideological dimensions of
the nuclear power problem.
Me: Let's take the portfolio approach to
our energy policy. It is economically less
risky than phasing out our nuclear power
plants.
You: No, the conservation approach is
actually less economically risky.
Me: No, the conservation approach is
actually more risky. It will require significant
government intervention in the economy
and this is politically and economically risky
because it leads to concentrations of state
power.
You: Relying on the market system is
economically risky for it is notoriously shortsighted and ignores all manner of externalities. Worse, it leads to concentrations of
private power.
Let's dig deeper. Like it or not, the nuclear
power controversy inevitably involves basic
assumptions concerning the role of modern
technology in our lives.
Me: Modern technology, properly
managed, will relieve us from the miseries
which have plagued us from time
immemorial and will help us solve the
messes in which we find ourselves today.
Nuclear power is one of those technologies. So, too, genetic engineering,
nanotechnology, the neurochemistry of
the brain, and advanced generation computers. The doomsayers have been proven wrong again and again. Moreover, we
can't roll back the clock even if we wished
to. The glass is only half-full!
You: Modern technology is out of control! Using technology to "cure"
problems caused by technology is a fool's
promise. Your promising technologies are
Pandora's boxes. In our haste to decode
the human genome, what are we going to
do if and when we discover how to arrest
the body's aging process? What are we
going to do if and when we succeed in
building artificial intelligences that surpass
us mere human beings in nearly anything
we can do? We are like a bunch of little
kids who are playing with toys whose
power they can barely imagine. In fact,
many doomsayers were right: it was only
because enough people took them seriously
that things didn't turn out as they prophesied.
And the we-can't-roll-back-the-clock bit is
a red herring; it merely sanctifies the
status quo. The hourglass is half-empty
and time is running out!
Let's dig even deeper, expanding the scope of
the debate to include our relationships with
Nature. For like it or not, some believe -- more
accurately, they illustrate -- that there is necessarily a "spiritual" aspect to ecological issues.
Nuclear power is inextricably involved in ecological issues.
Me: The kind of environmentalism that
likes to consider itself spiritual is nothing
more than sentimental. Thus, the basic
principle is the same whether we are
dealing with atmospheric ozone, the
spotted owl, or pollutants from coal- and
oil-fired generators: "protect the environment -- because it is man's environment.
And when man has to choose between
his well-being and that of nature, nature
will have to accommodate."
You: Matters are not so simple. You
see, we have chosen to see nature in
economic terms--the land as thing. However, "if we do not retrieve and nurture, I
think, some more gracious relationship
with the land, we will find our sanctuaries,
in the end, have become nothing more
than commodities. They will not be the
inviolate and healing places we yearn for,
but landscapes related to no one."11
Those holding these polarized views have
been characterized as "cornucopians" and
"catastrophists."12 However, it is a mistake to
dismiss them as extremists. Lots of us become
extremists when the particulars of inescapable
policy decisions become clear, especially if our
individual thinking opposes the prevailing social
norm. Moreover, extremists of every stripe have
ample opportunities to intervene in political and
especially legal processes.
Julian Simon and Ivan Illich are good representatives of such polarized views, for both are
persons of integrity and intellect.13 So, what
would happen if we were to lock them in a room
together until they had hammered out a strategy
for dealing with nuclear power? This is more
than an interesting exercise.
THE BOTTOM LINE
The point is to live in ways that make
what is problematic disappear.
Wittgenstein
Simon and Illich would each soon concede that the
other was a person of intellect and, soon after
that, of integrity. This essentially means that
both would realize they were confronted by a
mess if not, perhaps, an inherently wicked
problem. The fact that they are both locked in
the same room together might also change the
game plan. After all, neither Julian nor Ivan has
the option of riding off into the sunset.
Sooner or later they would start exploring
alternatives and compromises. Perhaps they
would even come up with something radically
different as they explored divergent and convergent aspects of their views. At this point, they
would be engaging in what David Bohm calls
"dialogue" or what Edward de Bono terms
"mapmaking." Rather than arguing their respective positions, rather than trying to persuade the
other, they would start rethinking, together,
some of their basic assumptions. This might be
a bit dicey for a while because each would
probably feel a bit vulnerable. Were they to
persist, however, they would gradually come to
recognize the significance of that ancient
injunction, "Know Thyself."
As they continued to talk together, Julian and
Ivan would doubtless range across economic,
political, social, and cultural considerations.
They would talk about the shorter and the longer
run. Perhaps they would even talk about their
children, that is, about their most profound
hopes and fears. Perhaps this would break new
ground. It has before.
Sometime during this process--perhaps quite
soon--it would most likely occur to them
because they share a common room, they also
share common ground. This essentially means
both would realize that their differences are less
significant and profound than what they share in
common, and that this common sense represents the beginnings of wisdom.
At this point, we could unlock the door and ask
them two simple but crucial questions. "Do you
trust each other?" to which they would undoubtedly reply, "Yes." We would then ask, "Are
you ready to leave?"
Would it be any surprise if they answered,
"No"?
This fanciful scenario bears little resemblance to
the controversies over nuclear power in America
today or lots of other controversies. That is
unfortunate, for this scenario illustrates some
strategic principles for keeping our world from
becoming an increasingly unstable place.
Real listening--dialogue--is essential to
mapping the boundaries and learning to
recognize patterns of interactions which are the crux
of sorting out messes. Real listening is also
essential in establishing trust and trust is the
sine qua non of effectively working together.
More significant, mistrust is the dark heart of
wicked problems.
The strategic principles for establishing trust
include compassion, for compassion is grounded in the realization that what we share in
common is far more significant and profound
than our differences. Compassion is the crux of
the Golden Rule; as Rabbi Hillel pointed out
long ago, "This is the whole law, all else is mere
commentary."
The strategic principles for establishing trust
also include "Know thyself." Terms expressing
the significance of this quest include integrity,
moral excellence, leading from the inside-out,
the hero's journey. By contrast, those who live
from the "outside-in" are not worthy of our
trust.14
In sum, these strategic principles are essential
to sorting out messes together. More significant,
in our increasingly complex and interdependent
times, these principles guide us to live in ways
that help us dissolve what is problematic.
If the above scenario and its principles sound
a bit old-fashioned, they are. If they appear
simplistic, it is only because they are profoundly
simple. If they seem idealistic, they are not.
They are as realistic as we can get.
FOOTNOTES
1. The terms "tame" and "wicked" are Rittel and
Webber's (1973); "mess" is Ackoff's (1974).
2. See Lakoff and Johnson (1980) on the
significance of metaphor in general and on the
differences between military, puzzle, and
chemical metaphors in particular. I am implying
that the military metaphor is appropriate for
"solving" tame problems (see Weick (1979) on
the significance of the military metaphor in
business). I am using the puzzle metaphor
("sorting out" the relationships between various
components of a system) and the chemical
metaphor ("dissolving" the tensions of wicked
problems) as ways of expanding and
restructuring our understandings of problems.
3. Relevant excerpts from Warren Weaver's
essay in the 1958 Annual Report of the
Rockefeller Foundation are cited in Jane Jacobs
(1961, pp. 429-433).
4. On this specific point, see Senge (1990) for a
brief elaboration of nine of around a dozen
"systems archetypes" identified by researchers
to date.
5. In particular, see Gregory Bateson (1979, pp.
58-61) and Peter Senge (1990, pp. 73-79) on
limitations of our "linear" logic and syntax.
6. Chaos theory is an especially dramatic case
of looking at the "same" things in new ways (see
King, 1989).
7. For example, see Lewis' "event-tree" and
"fault-tree" approach to assessing "The Safety
of Fission Reactors," Scientific American (March
1980) 242: 53-65.
8. For example, see Kenyon B. De Greene's
succinct analysis (1990) of the state of the art in
human factors engineering in which he stresses
the strategic importance of actively involving the
operators in the design of equipment in the first
place rather than the typical, American approach
of effectively deskilling them. Also see Taguchi
and Clausing's (1990) similar emphasis on
designing in reliability.
9. See Edward de Bono (1985) and David
Bohm (cited at length in Senge, 1990, pp. 238-249) on the widespread Western (especially
American) tendency to jump to conclusions.
Both essentially argue that "mapmaking" or
"dialogue" is essential to sorting out messes, but
that jumping to conclusions--"argumentation" or
"discussion"--remains the dominant approach to
problems.
10. For example, see James Hunter's Culture
Wars (1991), Robert Bellah et al.'s The Good
Society (1991), and more generally, Alvin
Toffler's Powershift (1990).
11. The two quotes are from a Time essay by
Charles Krauthammer (1991) and from a brief
commentary by Barry Lopez in Life (circa 1987).
12. See Douglas and Wildavsky's (1982)
argument that not only are risks largely in the
eye of the beholder, but that beholders can be
grouped into three classes depending on the
types of organizations they inhabit--market,
hierarchical, or communitarian. "Cornucopians"
correspond to the free market folks, while
"catastrophists" correspond to communitarians.
The significance of this argument is suggested
by Douglas' subsequent and acclaimed book,
How Institutions Think (1986). Succinctly, not
only is most of our knowledge "out there"
instead of being in our heads, but most of it is
organized or institutionalized in non-arbitrary
ways. Ferreting out patterns or classificatory
typologies helps us understand how institutions
think. Cornucopians and catastrophists inhabit
different institutions.
13. For example, see Simon (1981) and Illich
(1974).
14. For example, see Stephen Covey (1989).
Covey is emphatic that getting one's own inner
compass aligned to true North is fast becoming
the sine qua non for business survival in the
global economy.
REFERENCES
Ackoff, Russell L.: 1974, Redesigning the
Future: A Systems Approach to Societal
Problems (John Wiley & Sons, New
York).
Bateson, Gregory: 1979, Mind and Nature: A
Necessary Unity (E.P. Dutton, New York).
Bellah, Robert, Richard Madsen, William M.
Sullivan, Ann Swidler, Steven M. Tipton: 1991,
The Good Society (Alfred A. Knopf, New York).
Berry, Thomas H.: 1991, Managing the Total
Quality Transformation (McGraw-Hill, New
York).
Bono, Edward de: 1985, Six Thinking Hats
(Penguin Books, New York).
King, Jonathan: 1989, "Confronting Chaos,"
Journal of Business Ethics 7, 29-40.
Campbell, Joseph: 1987, "Introduction: The
Hero's Journey," The World of Joseph
Campbell: Transformations of Myth Through
Time (Mythology Ltd.).
Krauthammer, Charles: 1991, "Saving Nature,
But Only for Man," Time (June 17): 82.
Covey, Stephen R.: 1989, The Seven Habits of
Highly Effective People: Restoring the
Character Ethic (Simon & Schuster, New York).
De Greene, Kenyon B.: 1990, "Contextual
Aspects of Human Factors: The Case for
Paradigm Shift," Human Factors Society
Bulletin 33 (September 1990): 1-3.
Douglas, Mary and Aaron Wildavsky: 1982, Risk
and Culture: An Essay on the Selection of
Technical and Environmental Dangers (University of California Press, Berkeley).
Douglas, Mary: 1986, How Institutions Think
(Syracuse University Press, Syracuse).
Drexler, K. Eric: 1986, Engines of Creation: The
Coming Era of Nanotechnology (Doubleday,
New York).
Drucker, Peter: 1989, The New Realities (Harper & Row, New York).
Geertz, Clifford: 1973, The Interpretation of
Cultures (Basic Books, New York).
Harding, Jim: 1990, "Reactor Safety and Risk
Issues," Contemporary Policy Issues (July) 8:
94-105.
Hunter, James Davidson: 1991, Culture Wars:
The Struggle to Define America (Basic Books,
New York).
Illich, Ivan: 1974, Energy and Equity (Harper &
Row, New York).
Jacobs, Jane: 1961, The Death and Life of
Great American Cities (Random House, New
York).
Kristol, Irving: 1978, "Where Have All the
Answers Gone?" Think: 12-14.
Lakoff, George and Mark Johnson: 1980,
Metaphors We Live By (University of Chicago
Press).
MacIntyre, Alasdair: 1981, After Virtue: A Study
in Moral Theory (University of Notre Dame
Press, Notre Dame).
Medvedev, Grigori: 1991, The Truth About
Chernobyl (Trans. Evelyn Rossiter) (Basic
Books, New York).
Michael, Donald N.: 1985, "With Both Feet
Planted Firmly in Mid-Air: Reflections on
Thinking About the Future," Futures (April
1985): 94-103.
Perrow, Charles: 1984, Normal Accidents: Living
With High- Risk Technologies (Basic Books,
New York).
Reason, J.: 1990, "The Contribution of Latent
Human Failures to the Breakdown of Complex
Systems," Philosophical Transactions of the
Royal Society of London (Series B) 327: 475-484.
Rittel, Horst W.J. and Melvin M. Webber: 1973,
"Dilemmas in a General Theory of Planning,"
Policy Sciences 4: 155-169.
Senge, Peter M.: 1990, The Fifth Discipline: The
Art and Practice of The Learning Organization
(Doubleday, New York).
Simon, Julian L.: 1981, The Ultimate Resource
(Princeton University Press, Princeton).
Taguchi, Genichi and Don Clausing: 1990,
"Robust Quality," Harvard Business Review
(January-February): 65-75.
Thurow, Lester: 1980, The Zero-Sum Society:
Distribution and the Possibilities for Economic
Change (Basic Books, New York).
Toffler, Alvin: 1990, Powershift: Knowledge,
Wealth, and Violence at the Edge of the 21st
Century (Bantam Books, New York).
Weick, Karl: 1979, The Social Psychology
of Organizing (Random House, New
York).