Analyses of uncertainties

Sensitivity analysis

A built-in irony:
If alternatives are difficult to choose from, they probably are not too different in expected utility, so it doesn’t matter which alternative we choose. . . ?
Almost by definition, hard decisions are sensitive to our assessments.
An introduction to sensitivity analysis
Sensitivity analysis answers the question “what makes a difference in this decision problem?”
• sensitivity analysis is part of the modelling process
• typical questions include:
  • are we solving the right problem?
  • are any of the alternatives dominated?
  • is the level of detail correct?
  • how important are the numbers?
Sensitivity analysis studies the effect of changes in the different elements of a decision problem on the optimal decision strategy.

Structural sensitivity analysis
Structural sensitivity analysis pertains to the qualitative elements of the decision problem:
• the different alternatives of the decision variable(s);
• the outcomes of chance variables;
• the attributes and their values.
More in particular, a sensitivity-to-value-range analysis serves to distinguish between deterministic and probabilistic outcomes.
Sensitivity to value range
Consider a scenario s in a decision tree, and the value xs of
attribute X associated with the consequence of the scenario.
Suppose there exists uncertainty as to the exact value of X in
the consequence of scenario s:
• the decision tree specifies the base-value xs , however
• the plausible range of values for X in this scenario is x_s^min, . . . , x_s^max.
Does the optimal strategy depend on the value of parameter xs
of X?
(Numerical) sensitivity analysis
Numerical sensitivity analysis pertains to the quantitative
elements of the decision problem:
• for each chance variable, (conditional) probabilities are
specified;
• for each consequence, a utility is specified, often as a
function of the utilities of multiple attributes.
The various parameters are inaccurate. Can these inaccuracies make a difference in the ultimate strategy?
If so, use an additional chance variable. . .
Sensitivity analysis in general
Sensitivity analysis is a technique for studying the effects of
(second order) uncertainties in a decision tree:
• one or more parameters are varied systematically from their
base-value;
• the effect of the variation on the expected utility of a
strategy is studied.
Two types of sensitivity analysis are used:
• in a one-way analysis, a single parameter is varied;
• in a two-way analysis, two parameters are varied
simultaneously.
A more general n-way analysis, n ≥ 3, is hardly ever used in
practice.
The Eagle Airlines problem
Dick Carothers, president of Eagle Airlines, has the option of buying an additional airplane to expand his operation, or of investing his cash in the money market:
• investing in the money market has an expected profit of
$ 4,200
• the profit of buying the airplane depends on different factors,
such as
• operating cost
• insurance costs
• hours flown
• capacity of flights
• ...
Initially, all factors are assumed constant in calculating the expected profit. Dick, however, is rather uncertain as to their values.
The Eagle Airlines problem — cntd.
Reconsider the Eagle Airlines problem, with the decision alternatives “buying an additional airplane” and “investing in the money market”. The effect of varying the single parameter “Hours Flown” on the expected utility of the two decisions can be plotted in a 2D sensitivity graph:
[sensitivity graph: expected utility of both decisions against “Hours Flown”]

Threshold analysis
Consider a decision tree. A one-way analysis of a given parameter results for each strategy in a sensitivity curve:
• the intersection of two sensitivity curves indicates a change in the most preferred strategy;
• establishing the intersection of the two curves is termed a threshold analysis.
Example: The threshold value of “Hours Flown” where the optimal decision changes from investing in the money market to buying the airplane is 664 hours. Note that this value lies within the parameter’s plausible range of values, so it is important to consider the associated uncertainty!
A plausible interval
Sensitivity analysis of a parameter often takes a plausible interval into consideration:
• a plausible interval for the parameter under study is an interval within which the true parameter value lies with high certainty;
• the less certain the parameter’s assessment, the larger the associated plausible interval;
• plausible intervals are established in many different ways:
  • the interval equals the 95%-confidence interval;
  • the interval is based upon the extreme values that have been reported;
  • the interval is assessed by the decision maker, based upon experience;
  • ...

The oesophageal varices problem
For Derek, a 55-year-old man suffering from oesophageal varices, shunt surgery and sclerotherapy are the only treatment alternatives:

  shunt surgery:
    survive (p = 0.95):
      encephal. (p = 0.20):
        severe (p = 0.33): 0.04 qalys
        mild (p = 0.67): 10.5 qalys
      no enc. (p = 0.80): 11.6 qalys
    die (p = 0.05): 0 qalys
  sclerotherapy:
    no rebleed (p = 0.70): 21.0 qalys
    rebleed (p = 0.30):
      die (p = 0.12): 0.75 qalys
      em. shunt (p = 0.88):
        survive (p = 0.75):
          encephal. (p = 0.26):
            severe (p = 0.67): 0.04 qalys
            mild (p = 0.33): 5.70 qalys
          no enc. (p = 0.74): 6.20 qalys
        die (p = 0.25): 0.75 qalys

The various probabilities and utilities have been assessed by the attending physician and are inaccurate.
The sensitivity function
Consider a decision tree in reduced form. A one-way analysis of a given probability parameter yields for each strategy a separate sensitivity function:
• the sensitivity function is of the form
    f(x) = a · x + b
  where x is the parameter being varied, and a and b are constants;
• the probabilities from the same distribution as x are co-varied (proportionally);
• the sensitivity function can always be constructed by
  • substituting the variable x for the parameter to be varied, and
  • computing the expected utility for the strategy.

The oesophageal varices problem — cntd.
Reconsider the part of the decision tree pertaining to shunt surgery:

  survive (p = 0.95):
    encephal. (p = 0.20):
      severe (p = 0.33): 0.04 qalys
      mild (p = 0.67): 10.5 qalys
    no enc. (p = 0.80): 11.6 qalys
  die (p = 0.05): 0 qalys

A one-way analysis of the probability of mild encephalopathy for the shunt surgery decision results in:
[sensitivity graph: qale (between 6 and 12) against the varied probability (0 to 1)]
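The substitute-and-fold-back recipe above can be illustrated with a small Python sketch (not part of the slides), using the probabilities and qaly values from the shunt surgery subtree:

```python
# Sketch: substitute a variable x for P(mild | encephalopathy), co-vary
# P(severe | encephalopathy) = 1 - x, and fold back the shunt surgery
# subtree to obtain the expected utility in qalys.

def eu_surgery(x):
    """Expected utility of shunt surgery with P(mild | enceph.) = x."""
    enceph = x * 10.5 + (1 - x) * 0.04       # mild vs severe encephalopathy
    survive = 0.20 * enceph + 0.80 * 11.6    # encephalopathy vs none
    return 0.95 * survive + 0.05 * 0.0       # survive vs operative death

# The expected utility is linear in x, so two evaluations recover the
# sensitivity function f(x) = a*x + b:
b = eu_surgery(0.0)
a = eu_surgery(1.0) - b
print(round(a, 2), round(b, 2))  # 1.99 8.82
```

Because a single co-varied probability enters the fold-back linearly, the evaluations at x = 0 and x = 1 determine the sensitivity function completely; the recovered coefficients match the function f(x) = 1.99 · x + 8.82 given on the next slide.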
The oesophageal varices problem — cntd.
Reconsider the shunt surgery part of the decision tree, with the probability of mild encephalopathy as parameter x:

  survive (p = 0.95):
    encephal. (p = 0.20):
      severe (1 − x): 0.04 qalys
      mild (x): 10.5 qalys
    no enc. (p = 0.80): 11.6 qalys
  die (p = 0.05): 0 qalys

A one-way analysis of the probability of mild encephalopathy for the shunt surgery decision results in a curve characterised by the sensitivity function
  f(x) = 1.99 · x + 8.82
where x is the probability under study.

The oesophageal varices problem — cntd.
Now reconsider the part of the decision tree pertaining to sclerotherapy, with the probability of rebleeding as parameter x:

  no rebleed (1 − x): 21.0 qalys
  rebleed (x):
    die (p = 0.12): 0.75 qalys
    em. shunt (p = 0.88):
      survive (p = 0.75):
        encephal. (p = 0.26):
          severe (p = 0.67): 0.04 qalys
          mild (p = 0.33): 5.70 qalys
        no enc. (p = 0.74): 6.20 qalys
      die (p = 0.25): 0.75 qalys

A one-way analysis of the probability of rebleeding after the sclerotherapy decision results in a curve characterised by the function
  f(x) = −17.39 · x + 21.0
where x is the probability under study.
Threshold analysis revisited
Reconsider the oesophageal varices problem. A one-way analysis of the probability of rebleeding after sclerotherapy results in:
[sensitivity graph: qale against the varied probability, with curves for sclerotherapy and surgery]
The sensitivity functions associated with the curves are:
  fsurgery(x) = 10.16
  fsclero(x) = −17.39 · x + 21.0
The intersection x = 0.62 of the functions can be determined analytically from
  10.16 = −17.39 · x + 21.0
Is the decision sensitive to variation of the parameter?

Thresholds in the oesophageal varices problem — cntd.
From the threshold analysis we conclude:
• for probabilities less than 0.62, sclerotherapy is the most preferred treatment;
• for probabilities greater than 0.62, shunt surgery is the most preferred treatment.
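The threshold computation above can be sketched in Python (not part of the slides), using the two sensitivity functions stated on the slide:

```python
# Sketch: a threshold analysis is the intersection of two linear
# sensitivity functions, here fsurgery(x) = 10.16 (constant) and
# fsclero(x) = -17.39*x + 21.0.

def f_surgery(x):
    return 10.16

def f_sclero(x):
    return -17.39 * x + 21.0

# Solve 10.16 = -17.39*x + 21.0 for the threshold value of x:
threshold = (21.0 - 10.16) / 17.39
print(round(threshold, 2))  # 0.62
```

Below the threshold the sclerotherapy curve lies above the surgery line, so sclerotherapy is preferred; above it, shunt surgery is preferred.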
The oesophageal varices problem — cntd.
Reconsider the oesophageal varices problem. A one-way analysis of the probability of rebleeding after sclerotherapy results in:
[sensitivity graph: qale against the varied probability, with curves for sclerotherapy and surgery]
Suppose that for the probability under study, the following plausible interval is taken:
  0.09 – 0.45
For probabilities within this interval, sclerotherapy is always the better treatment alternative.

An extension
The technique of one-way sensitivity analysis is extended to decision trees in general form:
• for each decision node, the analysis results in a piece-wise linear sensitivity function;
• the boundaries of the intervals for the function are established by performing a threshold analysis.
The marketing problem revisited
Reconsider the upper part of the decision tree for Colaco’s marketing problem:

  do not market: 120 000
  market:
    national success (x): 420 000
    national failure (1 − x): 20 000

A one-way analysis is performed of the probability of a national success after a market survey has shown local success:
• the sensitivity functions for the decision to market and the decision not to market are
    fmarket(x) = 400 000 · x + 20 000
    fnot-market(x) = 120 000
• the functions intersect at x = 0.25.

The marketing problem — cntd.
Now reconsider the part of Colaco’s decision tree that captures the initial decision to perform a local market survey:

  survey:
    local success (p = 0.60):
      do not market: 120 000
      market:
        national success (x): 420 000
        national failure (1 − x): 20 000
    local failure (p = 0.40):
      do not market: 120 000
      market:
        national success (p = 0.10): 420 000
        national failure (p = 0.90): 20 000

The sensitivity function for this decision is:
  fsurvey(x) = 0.60 · fM(x) + 0.40 · 120 000
where
  fM(x) = fnot-market(x) if x < 0.25
        = fmarket(x) if x ≥ 0.25
Multiple one-way analyses
The results of multiple one-way sensitivity analyses can be depicted simultaneously in a tornado diagram, capturing the variation in expected utility of a single strategy upon the parameter variations.

The marketing problem — cntd.
For the decision to perform a market survey, we thus have:
  fsurvey(x) = 120 000 if x < 0.25
             = 240 000 · x + 60 000 if x ≥ 0.25
[sensitivity graph: asset (0 – 500 000) against the varied probability]
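The piece-wise form arises because the embedded decision node takes the maximum over the two strategies; a small Python sketch (not part of the slides) makes this explicit:

```python
# Sketch: fold back the survey subtree, where the embedded decision node
# is a max over the market / do-not-market strategies, yielding the
# piece-wise linear sensitivity function from the slide.

def f_market(x):
    return 400_000 * x + 20_000

def f_not_market(x):
    return 120_000

def f_survey(x):
    # local success (p = 0.60) leads to the embedded decision;
    # local failure (p = 0.40) leads to not marketing, worth 120 000
    return 0.60 * max(f_market(x), f_not_market(x)) + 0.40 * 120_000

print(round(f_survey(0.10)))  # 120000  (x < 0.25: constant piece)
print(round(f_survey(0.85)))  # 264000  (x >= 0.25: 240 000*x + 60 000)
```

For x ≥ 0.25 the maximum is fmarket(x), and 0.60 · (400 000 · x + 20 000) + 48 000 simplifies to 240 000 · x + 60 000, as stated on the slide.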
The sensitivity function revisited
Consider a decision tree in reduced form. A two-way analysis of two probability parameters yields for each strategy a separate sensitivity function:
• the sensitivity function is of the form
    f(x, y) = a · x · y + b · x + c · y + d
  where x and y are the parameters being varied, and a, b, c and d are constants;
• the probabilities from the same distributions as x and y, respectively, are co-varied (proportionally);
• the sensitivity function can always be constructed by
  • substituting the variables x and y for the parameters to be varied, and
  • computing the expected utility for the strategy.

The oesophageal varices problem revisited
Reconsider the part of the decision tree pertaining to sclerotherapy:

  no rebleed (p = 0.70): 21.0 qalys
  rebleed (p = 0.30):
    die (x): 0.75 qalys
    em. shunt (1 − x):
      survive (p = 0.75):
        encephal. (p = 0.26):
          severe (y): 0.04 qalys
          mild (1 − y): 5.70 qalys
        no enc. (p = 0.74): 6.20 qalys
      die (p = 0.25): 0.75 qalys

A two-way analysis is performed of
• the probability of dying as a result of rebleeding after sclerotherapy — x;
• the probability of severe encephalopathy after emergency shunt surgery — y.
The sensitivity function for sclerotherapy is:
  fsclero(x, y) = 0.33 · x · y − 1.20 · x − 0.33 · y + 16.12
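Since a two-way sensitivity function is multilinear in x and y, its four coefficients can be recovered from evaluations at the four corners of the unit square. A Python sketch (not part of the slides), folding back the sclerotherapy subtree above:

```python
# Sketch: fold back the sclerotherapy subtree with x = P(die | rebleed)
# and y = P(severe | enceph. after emergency shunt), co-varied as on the
# slide, then recover a, b, c, d of f(x,y) = a*x*y + b*x + c*y + d.

def eu_sclero(x, y):
    enceph = y * 0.04 + (1 - y) * 5.70
    em_survive = 0.26 * enceph + 0.74 * 6.20
    em_shunt = 0.75 * em_survive + 0.25 * 0.75
    rebleed = x * 0.75 + (1 - x) * em_shunt
    return 0.70 * 21.0 + 0.30 * rebleed

d = eu_sclero(0, 0)
b = eu_sclero(1, 0) - d
c = eu_sclero(0, 1) - d
a = eu_sclero(1, 1) - b - c - d
print([round(v, 2) for v in (a, b, c, d)])  # [0.33, -1.2, -0.33, 16.12]
```

The recovered coefficients match fsclero(x, y) = 0.33 · x · y − 1.20 · x − 0.33 · y + 16.12 from the slide.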
A graphical representation
The 3D sensitivity plane that is described by a two-way sensitivity function is represented in 2D by iso-utility contours:
• an iso-utility contour connects the combinations of parameter values that result in the same expected utility;
• the more distant the contours, the larger the variation of the parameters’ values needed to attain a specific effect on expected utility;
• if the iso-utility contours are equidistant, then the parameters under study do not have any interaction effects on expected utility.

The oesophageal varices problem — cntd.
Reconsider the sclerotherapy part of the decision tree:

  no rebleed (p = 0.70): 21.0 qalys
  rebleed (p = 0.30):
    die (x): 0.75 qalys
    em. shunt (1 − x):
      survive (p = 0.75):
        encephal. (p = 0.26):
          severe (y): 0.04 qalys
          mild (1 − y): 5.70 qalys
        no enc. (p = 0.74): 6.20 qalys
      die (p = 0.25): 0.75 qalys

The two-way sensitivity function fsclero(x, y) describes the sensitivity plane:
[contour plot: probability y against probability x, showing the iso-utility contours f(x, y) = 15.9 and f(x, y) = 15.1]
The oesophageal varices problem revisited
Reconsider the oesophageal varices decision tree, now with parameter x for the probability of operative death at shunt surgery and parameter y for the probability of no rebleed after sclerotherapy:

  shunt surgery:
    survive (1 − x):
      encephal. (p = 0.20):
        severe (p = 0.33): 0.04 qalys
        mild (p = 0.67): 10.5 qalys
      no enc. (p = 0.80): 11.6 qalys
    die (x): 0 qalys
  sclerotherapy:
    no rebleed (y): 21.0 qalys
    rebleed (1 − y):
      die (p = 0.12): 0.75 qalys
      em. shunt (p = 0.88):
        survive (p = 0.75):
          encephal. (p = 0.26):
            severe (p = 0.67): 0.04 qalys
            mild (p = 0.33): 5.70 qalys
          no enc. (p = 0.74): 6.20 qalys
        die (p = 0.25): 0.75 qalys

The two-way analysis results in the sensitivity functions:
  fsurgery(x, y) = −10.69 · x + 10.69
  fsclero(x, y) = 17.39 · y + 3.61
Their intersection is the line y = −0.61 · x + 0.41.

The oesophageal varices problem — cntd.
The intersection of the two functions fsurgery(x, y) and fsclero(x, y) is the line y = −0.61 · x + 0.41:
[graph: probability y against probability x, with the threshold line]
From the threshold analysis, we conclude:
• for value combinations below the line, shunt surgery is the most preferred treatment;
• for value combinations above the line, sclerotherapy is the most preferred treatment.
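The threshold line follows from equating the two sensitivity functions; a two-line Python sketch (not part of the slides):

```python
# Sketch: solve -10.69*x + 10.69 = 17.39*y + 3.61 for y, giving the
# threshold line y = slope*x + intercept between the two strategies.

slope = -10.69 / 17.39
intercept = (10.69 - 3.61) / 17.39
print(round(slope, 2), round(intercept, 2))  # -0.61 0.41
```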
The Eagle Airlines problem revisited
Reconsider the Eagle Airlines problem and a two-way analysis of value-range for the parameters “Capacity” and “Operating Cost” after buying the airplane. The threshold is given by:
[two-way sensitivity graph with the threshold line for “Capacity” and “Operating Cost”]

An introduction to the analysis of utilities
A one-way sensitivity analysis of a utility amounts to varying this utility:
[the oesophageal varices decision tree, as before]
If the consequences involve multiple attributes, however, varying a single utility is not realistic: the utilities of the attributes need to be studied.
The oesophageal varices problem revisited
Reconsider the oesophageal varices problem. The utilities of the various consequences have been computed as follows:

  consequence                                    utility   le      qale
  status quo after sclerotherapy                 1         21      21
  status quo after elective shunt                0.99      11.7    11.6
  status quo after emergency shunt               0.99       6.3     6.2
  mild encephalopathy after elective shunt       0.90      11.7    10.5
  mild encephalopathy after emergency shunt      0.90       6.3     5.7
  severe encephalopathy                          0.10       0.4     0.04
  s.q. after sclero, then death after rebleed    1          0.75    0.75
  operative death                                0          0       0

(le denotes life expectancy; qale is the quality-adjusted life expectancy, utility × le.)

The sensitivity function
Consider a decision tree in reduced form. Let u(Y) be the utility function defined over the set Y of attributes. A one-way analysis of a given utility parameter yields for each strategy a separate sensitivity function:
• the sensitivity function is of the form
    f(x) = a · u(Y)(x) + b
  where x is the parameter being varied, and a and b are constants;
• if the utility function is linear in x, then the sensitivity function is of the form
    f(x) = c · x + d
  where c and d are constants;
• the sensitivity function can again be constructed as before.
The oesophageal varices problem — cntd.
Reconsider the problem’s decision tree, with the utility of mild encephalopathy varied via a multiplier x:

  shunt surgery:
    survive (p = 0.95):
      encephal. (p = 0.20):
        severe (p = 0.33): 0.04 qalys
        mild (p = 0.67): x · 11.7 qalys
      no enc. (p = 0.80): 11.6 qalys
    die (p = 0.05): 0 qalys
  sclerotherapy:
    no rebleed (p = 0.70): 21.0 qalys
    rebleed (p = 0.30):
      die (p = 0.12): 0.75 qalys
      em. shunt (p = 0.88):
        survive (p = 0.75):
          encephal. (p = 0.26):
            severe (p = 0.67): 0.04 qalys
            mild (p = 0.33): x · 6.3 qalys
          no enc. (p = 0.74): 6.20 qalys
        die (p = 0.25): 0.75 qalys

A one-way analysis of the utility of mild encephalopathy results in the following sensitivity functions:
  fsurgery(x) = 1.49 · x + 8.82
  fsclero(x) = 0.11 · x + 15.69

Why sensitivity analysis?
Sensitivity analysis is a technique for studying the robustness of the optimal decision computed from a decision tree:
• sensitivity analysis points to the parameters that are the most crucial to the decision being optimal;
• sensitivity analysis reveals how much the computed decision can be relied upon to be an optimal one;
• sensitivity analysis shows whether or not the decision tree can be used for other decision makers.
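The utility analysis can be sketched in Python as well (not part of the slides), assuming, as on the slide, that the mild-encephalopathy consequences are worth x · 11.7 and x · 6.3 qalys:

```python
# Sketch: vary the utility multiplier x of mild encephalopathy and fold
# back both strategies; each expected utility is linear in x.

def f_surgery(x):
    enceph = 0.33 * 0.04 + 0.67 * (x * 11.7)
    return 0.95 * (0.20 * enceph + 0.80 * 11.6)

def f_sclero(x):
    enceph = 0.67 * 0.04 + 0.33 * (x * 6.3)
    em_survive = 0.26 * enceph + 0.74 * 6.20
    em_shunt = 0.75 * em_survive + 0.25 * 0.75
    rebleed = 0.12 * 0.75 + 0.88 * em_shunt
    return 0.70 * 21.0 + 0.30 * rebleed

print(round(f_surgery(1.0) - f_surgery(0.0), 2))  # 1.49  (slope)
print(round(f_surgery(0.0), 2))                   # 8.82  (intercept)
print(round(f_sclero(1.0) - f_sclero(0.0), 2))    # 0.11
print(round(f_sclero(0.0), 2))                    # 15.69
```

At the base utility x = 0.90 both functions reproduce expected utilities of about 10.2 and 15.8 qalys, consistent with the base-case analyses.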
The use of information
A decision maker can decide to gather information in order to reduce uncertainty. Information gathering includes
• hiring and consulting an expert;
• conducting a survey;
• performing statistical analyses of data;
• reading literature;
• ...

The value of information
The marketing problem revisited
Colaco has some assets and considers putting a new soda on the national market. It has a choice of
• marketing the new soda;
• not marketing the new soda.
The following decision tree organises the elements of Colaco’s core decision problem:

  do not market: 150 000
  market:
    national success (p = 0.55): 450 000
    national failure (p = 0.45): 50 000

The best decision is to market, which has an expected reward of 270 000 euro.

The expected value of information
Consider a given decision problem and a costless source of additional information. We consider
• the expected reward computed from the original decision problem;
• the expected reward computed from the problem extended with the decision to gather the additional information.
The difference between the two expected rewards is termed the expected value of information (EVI).
The marketing problem — cntd.
Colaco can decide to first perform a market survey to gain additional information and thereby reduce its uncertainty about the national success of its soda:
• the prior probability distribution over N, having an entropy of 0.69, is
    Pr(n) = 0.55    Pr(n̄) = 0.45
• the posterior probability distribution over N, having a conditional entropy of 0.42, given a local success is
    Pr(n | l) = 0.85    Pr(n̄ | l) = 0.15
• the posterior probability distribution over N, having a conditional entropy of 0.33, given a local failure is
    Pr(n | l̄) = 0.10    Pr(n̄ | l̄) = 0.90

Studying expected value of information
Consider a given decision problem and a costless source of additional information. The expected value of the information is computed from a new decision tree with two branches emanating from the root:
• following the decision not to gather the information is the original decision problem;
• following the decision to gather the information is a chance node capturing the possible information from your source, and following each value of the chance node is the more informed problem: original structure, updated probabilities.
The marketing problem — cntd.
To study the expected value of a local market survey, the following decision tree is constructed:

  survey:
    local success (p = 0.60):
      do not market: 150 000
      market:
        national success (p = 0.85): 450 000
        national failure (p = 0.15): 50 000
    local failure (p = 0.40):
      do not market: 150 000
      market:
        national success (p = 0.10): 450 000
        national failure (p = 0.90): 50 000
  do not survey:
    do not market: 150 000
    market:
      national success (p = 0.55): 450 000
      national failure (p = 0.45): 50 000

The marketing problem — cntd.
Reconsider Colaco’s marketing problem:
• the expected reward computed from the original decision problem is 270 000 euro;
• the expected reward computed from the problem extended with the decision to gather information from the survey is 294 000 euro.
The expected value of the survey information thus is 24 000 euro: Colaco should not pay over 24 000 euro for the survey!
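The fold-back of the survey tree can be sketched in Python (not part of the slides), using the probabilities and rewards from the slides above:

```python
# Sketch of the EVI computation for Colaco's survey: fold back the
# original problem and the survey-extended problem, then subtract.

def market_eu(p_success):
    """Expected reward of marketing given P(national success)."""
    return p_success * 450_000 + (1 - p_success) * 50_000

no_info = max(market_eu(0.55), 150_000)                 # original problem
with_survey = (0.60 * max(market_eu(0.85), 150_000)     # local success
               + 0.40 * max(market_eu(0.10), 150_000))  # local failure

print(round(no_info), round(with_survey),
      round(with_survey - no_info))  # 270000 294000 24000
```

Note that under a local failure, marketing is worth only 90 000, so the embedded decision switches to not marketing (150 000); this switch is exactly what makes the information valuable.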
The expected value of perfect information
Consider a given decision problem and a costless clairvoyant providing perfect information. We consider
• the expected reward computed from the original decision problem;
• the expected reward computed from the problem extended with the decision to consult the clairvoyant.
The difference between the two expected rewards is termed the expected value of perfect information (EVPI).

Perfect information
A distinction is made between
• information that is always correct, called perfect information;
• information that may be incorrect, called imperfect information.
While imperfect information reduces the uncertainty about a chance variable, perfect information removes it.

Studying the value of perfect information
Consider a given decision problem and a costless clairvoyant. The expected value of the clairvoyant’s information is computed from a new decision tree with two branches emanating from the root:
• following the decision not to consult the clairvoyant is the original decision problem;
• following the decision to consult the clairvoyant is the inverted problem.

The marketing problem — cntd.
To study the expected value of perfect information, the following decision tree is constructed:

  consult clairvoyant:
    national success (p = 0.55):
      market: 450 000
      do not market: 150 000
    national failure (p = 0.45):
      market: 50 000
      do not market: 150 000
  do not consult:
    do not market: 150 000
    market:
      national success (p = 0.55): 450 000
      national failure (p = 0.45): 50 000
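Folding back the inverted tree above can be sketched in Python (not part of the slides): with a clairvoyant, the chance node is resolved before the decision, so the best action is taken per outcome.

```python
# Sketch of the EVPI computation for Colaco's problem: compare the
# original fold-back with the inverted (clairvoyant-first) fold-back.

p_success = 0.55
no_clairvoyant = max(p_success * 450_000
                     + (1 - p_success) * 50_000, 150_000)      # market
with_clairvoyant = (p_success * max(450_000, 150_000)          # "success"
                    + (1 - p_success) * max(50_000, 150_000))  # "failure"

print(round(no_clairvoyant), round(with_clairvoyant),
      round(with_clairvoyant - no_clairvoyant))  # 270000 315000 45000
```

The difference of 45 000 euro is the EVPI, an upper bound on what any information, perfect or imperfect, can be worth.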
The marketing problem — cntd.
Reconsider Colaco’s marketing problem:
• the expected reward computed from the original decision problem is 270 000 euro;
• the expected reward computed from the problem extended with the decision to consult the clairvoyant is 315 000 euro.
The expected value of perfect information thus is 45 000 euro: Colaco should never pay over 45 000 euro for any information!

M.C. Airport: solution
A foldback analysis resulted in the following top ten alternatives:

  Alternative   1975            1985          1995          Expected utility
                Z (new) T (old) Z       T     Z       T     (× 100)
  1             D       IMG     ID      MG    ID      MG    91.23
  2             IDMG    –       IDMG    –     IDMG    –     90.90
  3             I       DMG     ID      MG    ID      MG    90.79
  4             ID      MG      ID      MG    ID      MG    89.30
  5             ID      MG      IDMG    –     IDMG    –     88.10
  6             ID      MG      ID      MG    IDMG    –     86.75
  7             I       DMG     IDMG    –     IDMG    –     86.55
  8             IG      DM      IDMG    –     IDMG    –     86.19
  9             DG      IM      IDMG    –     IDMG    –     86.17
  10            D       IMG     IDMG    –     IDMG    –     85.60
M.C. Airport: further analyses and conclusion
Several analyses of the model parameters and assumptions were performed:
• sensitivity to probability assessments;
• sensitivity to scaling constants;
• effects of dependencies among chance variables;
• discount rate for C1.
Finally, an informal dynamic analysis was performed, taking into account political and prestige effects.
Strategy advised: “phased development at Zumpango”:
• buy land for a major airport at Zumpango;
• construct one major runway and modest passenger and access facilities at Zumpango;
• extend one runway at Texcoco;
• construct freight and parking facilities, and a new control tower, at Texcoco.

M.C. Airport: implementation
August 2003: The ’Benito Juárez International Airport’ of Mexico City is still located at its original spot (expected utility: 5.20). What happened?
The discussion is ongoing. . .
“No other site will have the technical operating advantage that Texcoco would have had, but they are not necessarily bad. [. . . ]
We move from the ideal to the convenient, to what is viable, to what is possible. [. . . ]
Some alternatives that in the past were rejected are now worth reconsidering.”
What happened to the Zumpango alternative???
M.C. Airport: situation in 2008
• commercial airport;
• 2 runways;
• served 26 million passengers in 2007;
• Terminal 2 opened in November 2007.

M.C. Airport: implementation
January 2008: Approved plans for a new airport are presented by the federal ministry of transportation and communications:
• at least 8 km NE of the current MC airport;
• 3 or 4 runways, able to handle new-generation aircraft;
• 60 million passengers per year;
• first operations scheduled to begin by late 2012;
• the current airport will be fully disabled.
Multiattribute decision making under certainty: The Analytic Hierarchy Process

The Job example
Suppose you have the choice between four job offers (alternatives):
  A: Acme Manufacturing
  B: Bankers Bank
  C: Creative Consulting
  D: Dynamic Decision Making
Your choice between alternatives is to be based on how well they meet the following objectives for several attributes:
  L: a nice Location
  P: good (long-term) Prospects
  R: large amount of Risk Analysis (which you like)
  S: high Salary
You know the location, prospects, amount of risk analysis and salary for each of the jobs. How do you make your decision?
Value functions
DEFINITION
Let X = {X1, . . . , Xn} be a set of attributes and let ⪰ be a preference order on consequences x involving these attributes. Then a real function v on x is termed a value function iff
  for all x, x′, we have that x ⪰ x′ iff v(x) ≥ v(x′)
PROPOSITION
If the attributes X are mutually preferentially independent, i.e. Y PI Y̅ for all Y ⊂ X, then the value function v(X) has an additive form:
  v(X) = Σ_{i=1}^{n} vi(Xi)
for some value functions vi(Xi).

Analytic Hierarchy Process
Saaty’s Analytic Hierarchy Process (AHP) is a powerful tool for multi-objective decision making under certainty. Consider such a decision problem with objectives (factors) Oi, i = 1, . . . , n ≥ 2, and alternatives (choices) Aj, j = 1, . . . , m ≥ 2. AHP now basically amounts to establishing:
1. an importance rank order on the objectives Oi;
2. a preference rank order on the alternatives Aj in the context of each objective Oi;
3. a score over all objectives for each alternative Aj.
AHP – Pairwise comparisons
Consider two items Ii and Ij. The relative importance of these items can be scored on a 9-point interval-valued scale using the following table:

  value       interpretation
  1           Ii and Ij are of equal importance
  3           Ii is weakly more important than Ij
  5           experience and judgment indicate that Ii is strongly more important than Ij
  7           Ii is very strongly or demonstrably more important than Ij
  9           Ii is absolutely more important than Ij
  2, 4, 6, 8  intermediate values

Fractions similarly capture less importance. The table can also be used to capture degree of preference.

An example
Exercise: Consider the previous Job example describing a choice between four jobs. How would you compare the following objectives? (Use the table above.)
• nice Location,
• good (long-term) Prospects,
• large amount of Risk Analysis,
• high Salary.
The pairwise comparison matrix
DEFINITION
Let I1, . . . , In, n ≥ 2, be items. A pairwise comparison matrix is an n × n matrix X with elements xij, indicating the value of item Ii relative to item Ij.

Consistency
DEFINITION
An n × n pairwise comparison matrix X is consistent iff for all i, j, k ∈ {1, . . . , n}:
• xii = 1;
• xij = 1/xji;
• xik = xij · xjk, where i ≠ j ≠ k.
Example: in a consistent matrix, every diagonal element equals 1, each entry xji is the reciprocal of its mirror entry xij, and every entry xik can be obtained as the product xij · xjk via any intermediate item Ij.
Which pairs to compare?
A consistent n × n pairwise comparison matrix can be constructed from only n − 1 comparisons: given a single column of comparisons x1j, . . . , xnj of all items against one item Ij, all remaining entries follow from consistency. Note that proportions are preserved: xik = (xij / xjj) · xjk = xij · xjk.

Comparing objectives: an example
Suppose that for the Job example, we have the following result from 3 pairwise comparisons of objectives, each objective compared against Location:

  Location 1, Prospects 2, Risk An. 3, Salary 5

⇓ consistency

       L     P     R     S
  L    1    1/2   1/3   1/5
  P    2     1    2/3   2/5
  R    3    3/2    1    3/5
  S    5    5/2   5/3    1
Rank ordering: weights
Consider an n × n pairwise comparison matrix X, where wi > 0, i = 1, . . . , n, denotes the weight of item Ii. If X is consistent, then X is of the following form, i.e. xij = wi / wj:

      [ w1/w1  w1/w2  . . .  w1/wn ]
  X = [ w2/w1  w2/w2  . . .  w2/wn ]
      [   .      .    . . .    .   ]
      [ wn/w1  wn/w2  . . .  wn/wn ]

A weight vector w = [w1, . . . , wn] can be recovered from X by finding a (non-trivial) solution v to a set of n equations with n unknowns:
  X · wT = v · wT
If X is consistent, then v = n gives a unique non-trivial solution:
  X · wT = [ w1/w1 · w1 + w1/w2 · w2 + · · · + w1/wn · wn,
             . . . ,
             wn/w1 · w1 + wn/w2 · w2 + · · · + wn/wn · wn ]T
         = [ n · w1, . . . , n · wn ]T = n · wT
For convenience, weights are taken to sum to 1.
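The identity X · wT = n · wT can be checked with a small Python sketch (not part of the slides), using the Job example weights w = [1, 2, 3, 5]/11:

```python
# Sketch: for a consistent matrix X[i][j] = w[i]/w[j], X.w equals n*w,
# and the weights can be read off any normalised column of X.

w = [1 / 11, 2 / 11, 3 / 11, 5 / 11]
n = len(w)
X = [[wi / wj for wj in w] for wi in w]   # consistent comparison matrix

Xw = [sum(X[i][j] * w[j] for j in range(n)) for i in range(n)]
assert all(abs(Xw[i] - n * w[i]) < 1e-12 for i in range(n))  # X.wT = n.wT

col0_sum = sum(X[i][0] for i in range(n))
recovered = [X[i][0] / col0_sum for i in range(n)]  # normalise 1st column
print([round(v, 4) for v in recovered])  # [0.0909, 0.1818, 0.2727, 0.4545]
```

Normalising a single column already recovers w exactly, which is why a consistent matrix needs only n − 1 elicited comparisons.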
Rank-ordering objectives: an example
Suppose that for the Job example, we have assessed the following consistent pairwise comparison matrix of objectives:

       L     P     R     S
  L    1    1/2   1/3   1/5
  P    2     1    2/3   2/5
  R    3    3/2    1    3/5
  S    5    5/2   5/3    1

The only non-trivial weight vector with weights summing to one is
  w = [1/11, 2/11, 3/11, 5/11]

Comparing objectives: the example revisited
Suppose however, that for the Job example we use (4 − 1)! = 6 pairwise comparisons of objectives to construct the following matrix O:

       L     P     R     S
  L    1    1/2   1/3   1/5
  P    2     1    1/3   1/4
  R    3     3     1    1/2
  S    5     4     2     1

The matrix displays slight inconsistencies:
• the amount of Risk Analysis is about three times as important as good Prospects;
• good Prospects are twice as important as a nice Location;
• yet the amount of Risk Analysis is only three times as important as the Location (consistency would require 3 · 2 = 6).
Weights revisited

Approximation of the weight vector
Let X be an inconsistent n × n pairwise comparison matrix; then
the weight vector w no longer follows from

    X · wT = n · wT

We can, however, approximate the weight vector:
1  normalise each column j in X such that Σi x′ij = 1
   ⇒ call the resulting matrix X′;
2  for each row i in X′, compute the average value
   x̄i = (1/n) · Σj x′ij
   ⇒ w̃i = x̄i is the approximated weight of item Ii.

384 / 401

Consider normalising the columns of a consistent n × n pairwise
comparison matrix X, resulting in matrix X′:

         w1/w1  w1/w2  ···  w1/wn
         w2/w1  w2/w2  ···  w2/wn
    X =    .      .     .     .
           .      .      .    .
         wn/w1  wn/w2  ···  wn/wn

After normalisation every column of X′ equals the weight vector
normalised to sum to one, so the weights wi are row-averages:
wi = (1/n) · Σj x′ij.
Can we use this?

385 / 401

Rank-ordering objectives: the example revisited

Reconsider the inconsistent pairwise comparison matrix O for
the objectives in the Job example.
We normalise its columns and average the rows of the resulting
matrix:

           L      P      R      S     Avg.
    L    0.091  0.059  0.091  0.103  0.086
    P    0.182  0.118  0.091  0.128  0.130
    R    0.273  0.353  0.273  0.256  0.288
    S    0.455  0.471  0.545  0.513  0.496

The (approximate) weight vector capturing a rank order on the
objectives L, P, R and S, is:

    w = [0.086, 0.130, 0.288, 0.496]
         (wL)   (wP)   (wR)   (wS)

386 / 401

Checking for Consistency
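The two-step approximation can be sketched in Python (not part of the slides; matrix O is the inconsistent comparison matrix from the example):

```python
# A minimal sketch: normalise each column of the inconsistent matrix O,
# then average each row of the normalised matrix.
O = [
    [1,   1/2, 1/3, 1/5],
    [2,   1,   1/3, 1/4],
    [3,   3,   1,   1/2],
    [5,   4,   2,   1  ],
]
n = len(O)

# Step 1: normalise every column so that it sums to one.
col_sums = [sum(O[i][j] for i in range(n)) for j in range(n)]
O_norm = [[O[i][j] / col_sums[j] for j in range(n)] for i in range(n)]

# Step 2: the approximated weights are the row averages.
w = [sum(row) / n for row in O_norm]
print([round(wi, 3) for wi in w])
# matches the slide's [0.086, 0.130, 0.288, 0.496] up to rounding
```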
Consider an n × n pairwise comparison matrix X with xii = 1
and xij = 1/xji for all 1 ≤ i, j ≤ n, and consider its approximated
weight vector w̃.
Consistency of X can be checked using the following procedure:
1  Compute X · w̃T
2  Compute ñ = (1/n) · Σi=1..n (ith entry in X · w̃T) / (ith entry in w̃T)
3  Compute the consistency index CI = (ñ − n) / (n − 1) ≥ 0
4  If CI = 0 then X is consistent;
   If CI/RIn ≤ 0.10 then X is consistent enough;
   If CI/RIn > 0.10 then X is seriously inconsistent.
where random index RIn is the average CI for random X:

    n     2    3     4     5     6     7    ...
    RIn   0  0.58  0.90  1.12  1.24  1.32   ...
387 / 401
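Steps 1–3 above can be sketched in Python (not part of the slides; the function name and the sanity check are illustrative):

```python
# Random index RIn for n = 2..7, as tabulated on the slide.
RI = {2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def consistency_index(X, w):
    """CI for comparison matrix X and its approximated weight vector w."""
    n = len(X)
    # Step 1: compute X · w^T.
    Xw = [sum(X[i][j] * w[j] for j in range(n)) for i in range(n)]
    # Step 2: average the entry-wise ratios of X · w^T and w^T.
    n_tilde = sum(Xw[i] / w[i] for i in range(n)) / n
    # Step 3: the consistency index.
    return (n_tilde - n) / (n - 1)

# Sanity check: a consistent matrix (x_ij = w_i / w_j) has CI = 0.
w = [1/11, 2/11, 3/11, 5/11]
X = [[wi / wj for wj in w] for wi in w]
print(abs(consistency_index(X, w)) < 1e-9)  # True
```

A matrix X is then accepted as "consistent enough" when `consistency_index(X, w) / RI[len(X)]` is at most 0.10.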
Consistency checking: an example

Reconsider the Job example with the pairwise comparisons
matrix of objectives O and the weight vector w computed from
O. We check the matrix O for consistency:

1.          1   1/2  1/3  1/5     0.086     0.346
            2    1   1/3  1/4     0.130     0.522
   O · wT = 3    3    1   1/2  ·  0.288  =  1.184
            5    4    2    1      0.496     2.022

2. ñ = (1/n) · Σi=1..n [O · wT]i / [wT]i
     = (1/4) · (0.346/0.086 + 0.522/0.130 + 1.184/0.288 + 2.022/0.496)
     = 4.057

388 / 401

3. CI = (ñ − n) / (n − 1) = (4.057 − 4) / 3 = 0.019

4. For n = 4 we have that RI4 = 0.90, and so

   CI/RI4 = 0.019 / 0.90 = 0.021 ≤ 0.10

We conclude that matrix O is consistent enough.
w = [0.086, 0.130, 0.288, 0.496] can therefore be considered a
good enough approximation of the weight vector for O.

389 / 401
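The four steps of this example can be reproduced directly (not part of the slides; the procedure is inlined rather than wrapped in a function):

```python
# Reproducing the consistency check of matrix O from the example.
O = [
    [1,   1/2, 1/3, 1/5],
    [2,   1,   1/3, 1/4],
    [3,   3,   1,   1/2],
    [5,   4,   2,   1  ],
]
w = [0.086, 0.130, 0.288, 0.496]
n = len(O)

# Steps 1-2: O · w^T, then the averaged entry-wise ratio n-tilde.
Ow = [sum(O[i][j] * w[j] for j in range(n)) for i in range(n)]
n_tilde = sum(Ow[i] / w[i] for i in range(n)) / n

# Steps 3-4: consistency index and comparison against RI4 = 0.90.
ci = (n_tilde - n) / (n - 1)
print(round(n_tilde, 3), round(ci, 3))  # 4.057 0.019
print(ci / 0.90 <= 0.10)                # True: O is consistent enough
```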
Scoring alternatives on objectives

Recall that weight vector w captures an importance rank order
on the n objectives O1, . . . , On.
We in addition require preference rank orders on the m
alternatives A1, . . . , Am, in the context of each objective.
The next step in the AHP is therefore to determine how the m
alternatives score on the n objectives:
1  assess an m × m pairwise comparison matrix Ai of
   alternatives, for each objective Oi;
2  approximate the weight vector for each Ai.
We will denote the weight vector for matrix Ai by si = [si1, . . . , sim].
This vector represents the relative weights (scores) sij for each
alternative Aj with respect to objective Oi.

390 / 401

Example: rank ordering jobs given their location

For the different jobs (A, B, C, and D), we assess the following
pairwise comparison matrix for the objective L (nice Location):

         A    B    C    D
    A    1   1/2   5   1/3
    B    2    1    7   1/2
    C   1/5  1/7   1   1/9
    D    3    2    9    1

We then approximate the weight vector.
The scores (relative weights) of the alternatives A, B, C and D
on objective L are given by:

    sL = [sLA = 0.174, sLB = 0.293, sLC = 0.044, sLD = 0.489]

391 / 401
Example: rank ordering jobs given their prospects

For the different jobs (A, B, C, and D), we assess the following
pairwise comparison matrix for the objective P (good Prospects):

         A    B    C    D
    A    1    9    5    2
    B   1/9   1   1/9  1/9
    C   1/5   9    1   1/2
    D   1/2   9    2    1

We then approximate the weight vector.
The scores of the alternatives A, B, C and D on objective P are:

    sPA = 0.511; sPB = 0.035; sPC = 0.173; sPD = 0.280

392 / 401

Example: rank ordering jobs given nice work

For the different jobs (A, B, C, and D), we assess the following
pairwise comparison matrix for the objective R (amount of Risk
Analysis):

         A    B    C    D
    A    1    6   1/2  1/2
    B   1/6   1   1/6  1/8
    C    2    6    1    2
    D    2    8   1/2   1

We then approximate the weight vector.
The scores of the alternatives A, B, C and D on objective R are:

    sRA = 0.212; sRB = 0.048; sRC = 0.422; sRD = 0.319

393 / 401
Example: rank ordering jobs given their pay

For the different jobs (A, B, C, and D), we assess the following
pairwise comparison matrix for the objective S (high Salary):

         A    B    C    D
    A    1   1/9  1/4  1/6
    B    9    1    2    1
    C    4   1/2   1   1/2
    D    6    1    2    1

We then approximate the weight vector.
The scores of the alternatives A, B, C and D on objective S are:

    sSA = 0.051; sSB = 0.397; sSC = 0.192; sSD = 0.360

394 / 401

Making a decision

What is the best alternative in an AHP?
Given
• 1 rank order on the n objectives in terms of weight vector w, and
• n rank orders on the m alternatives per objective Oi, in
  terms of score vectors si,
we construct a scoring matrix S with elements sij, i = 1, . . . , n,
j = 1, . . . , m, such that sij equals the jth entry of vector si.
We then compute the overall score vector s, containing for each
alternative Aj a summarising score over all objectives, from

    sT = S · wT

Now select the alternative with the highest overall score.

395 / 401
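The aggregation step sT = S · wT can be sketched in Python (not part of the slides; the matrix rows collect the score vectors sL, sP, sR, sS computed in the preceding examples):

```python
# Scoring matrix S: rows = jobs A, B, C, D; columns = objectives L, P, R, S.
S = [
    [0.174, 0.511, 0.212, 0.051],  # job A
    [0.293, 0.035, 0.048, 0.397],  # job B
    [0.044, 0.173, 0.422, 0.192],  # job C
    [0.489, 0.280, 0.319, 0.360],  # job D
]
w = [0.086, 0.130, 0.288, 0.496]   # weights of the objectives

# Overall score of each job: the w-weighted sum of its scores.
s = [sum(Sj[i] * w[i] for i in range(len(w))) for Sj in S]
best = max(range(len(s)), key=lambda j: s[j])
print([round(sj, 3) for sj in s], "ABCD"[best])
# [0.168, 0.24, 0.243, 0.349] D
```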
Example: deciding on a job

From the computed scores of the alternatives on all objectives,
we construct the following scoring matrix and compute:

                 L      P      R      S
        A      0.174  0.511  0.212  0.051     0.086
S · wT= B      0.293  0.035  0.048  0.397  ·  0.130
        C      0.044  0.173  0.422  0.192     0.288
        D      0.489  0.280  0.319  0.360     0.496

396 / 401

The overall score for each of the jobs now is:

    sA = 0.174 · 0.086 + 0.511 · 0.130 + 0.212 · 0.288 + 0.051 · 0.496
       = 0.168
    sB = 0.240
    sC = 0.243
    sD = 0.349

Dynamic Decision Making (job D) it is!

397 / 401

Concluding observations
Summary I

The general ingredients of decision problems are:
◦ decision alternatives           →  decision variables
◦ uncertain events                →  chance variables and their probabilities
◦ consequences                    →  attribute tuples
◦ objectives                      →  attributes and preference orders on their values
◦ preferences and risk attitudes  →  (decomposed) (utility) function(s)
They each have their own formal representation.

398 / 401

Summary II

The relevance and importance of the different ingredients of a
decision problem can be analysed:
• deterministic and stochastic dominance of decision
  alternatives (to reduce the number of strategies);
• value of information analysis (to reduce uncertainty);
• sensitivity analysis (to reduce second-order uncertainty).

399 / 401
Representation and evaluation
We used decision trees as a representation for structuring and
evaluating decision problems.
Optimisation is done using
• a foldback analysis to compute expected utility/reward of
the different strategies.
Other formalisms for structuring and evaluating decisions are:
(see Clemen, Ch. 3 + references)
• influence diagrams (Howard & Matheson, 1976);
• valuation networks (Shenoy, 1992);
• sequential decision diagrams (Covaliu & Oliver, 1995);
• ...
Each formalism has its own advantages, drawbacks and
optimisation algorithms.
400 / 401
Methods for decision making:
what we have and have not seen

                   1 attribute                 n ≥ 2 attributes
    certainty      simple optimisation        - AHP
                                              - dominance
                                              - value functions
    uncertainty    - maximin                  id.
    (unspecified)  - maximax
                   - minimax regret
                   - deterministic dominance
    risk           - Bayes criterion          - multi-attribute utilities
    (specified)    - stochastic dominance
                   - utility theory
Other subjects we have not touched upon:
• preferences over time;
• infinite outcome space;
• negotiating;
• game theory;
• group decision making;
• behavioural decision making.
401 / 401