Decision Analysis Lecture Notes

Decision Analysis

A method for determining optimal strategies
when faced with several decision alternatives
and an uncertain pattern of future events.
The Decision Analysis Approach

- Identify the decision alternatives, d_i.
- Identify the possible future events (states of nature), s_j:
  - mutually exclusive: only one state can occur
  - exhaustive: one of the states must occur
- Determine the payoff associated with each decision and each state of nature, V_ij (a small payoff-table sketch follows this list).
- Apply a decision criterion.
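
As a concrete anchor for the notation above, here is a minimal Python sketch of a decision problem; the alternatives, states, and payoff values are invented for illustration.

```python
# Hypothetical decision problem: three plant sizes (d_i) and two demand states (s_j).
decisions = ["small plant", "medium plant", "large plant"]
states = ["low demand", "high demand"]

# payoff[i][j] = V_ij, the profit (in $000s) for decision i if state j occurs.
payoff = [
    [150, 200],   # small plant
    [100, 300],   # medium plant
    [-50, 450],   # large plant
]
```

The criteria that follow are simply different rules for picking a row of this table.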

Types of Decision Making Situations

Decision making under certainty
- the state of nature is known
- the decision is to choose the alternative with the best payoff
Types of Decision Making Situations

Decision making under uncertainty
- the decision maker is unable or unwilling to estimate probabilities
- apply a common-sense criterion
Decision Making Under Uncertainty

Maximin Criterion (for profits) - pessimistic
- list the minimum payoff for each alternative
- choose the alternative with the largest minimum payoff (see the sketch below)
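
A minimal Python sketch of the maximin rule, using the same invented payoff table:

```python
# payoff[i][j] = profit for alternative i under state j (hypothetical values)
payoff = [
    [150, 200],
    [100, 300],
    [-50, 450],
]

# Maximin: record each alternative's worst payoff, then choose the largest of those.
row_minima = [min(row) for row in payoff]
best = max(range(len(payoff)), key=lambda i: row_minima[i])
print("maximin choice:", best, "guaranteed payoff:", row_minima[best])
```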
Decision Making Under Uncertainty

Maximax Criterion (for profits) - optimistic
- list the maximum payoff for each alternative
- choose the alternative with the largest maximum payoff (see the sketch below)
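
The optimistic counterpart, again with the same made-up numbers:

```python
payoff = [
    [150, 200],
    [100, 300],
    [-50, 450],
]

# Maximax: record each alternative's best payoff, then choose the largest of those.
row_maxima = [max(row) for row in payoff]
best = max(range(len(payoff)), key=lambda i: row_maxima[i])
print("maximax choice:", best, "best-case payoff:", row_maxima[best])
```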
Decision Making Under Uncertainty

Minimax Regret Criterion
- calculate the regret for each alternative and each state
- list the maximum regret for each alternative
- choose the alternative with the smallest maximum regret
Decision Making Under Uncertainty

Minimax Regret Criterion
- Regret: the amount of loss due to making an incorrect decision (an opportunity cost)

R_{ij} = | V_j^* - V_{ij} |

where V_j^* is the best payoff attainable under state s_j.
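
A Python sketch of the regret table and the minimax regret choice; the payoffs are the same invented values, and since they are profits the absolute value reduces to V_j^* - V_ij:

```python
payoff = [
    [150, 200],
    [100, 300],
    [-50, 450],
]

# Best attainable payoff under each state: V*_j = max over i of V_ij
best_per_state = [max(col) for col in zip(*payoff)]

# Regret table: R_ij = V*_j - V_ij
regret = [[best_per_state[j] - v for j, v in enumerate(row)] for row in payoff]

# Minimax regret: choose the alternative whose largest regret is smallest.
max_regret = [max(row) for row in regret]
best = min(range(len(regret)), key=lambda i: max_regret[i])
print("minimax regret choice:", best, "maximum regret:", max_regret[best])
```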
Types of Decision Making Situations

Decision making under risk
- Expected Value Criterion
  - compute the expected value for each decision alternative
  - select the alternative with the "best" expected value
Computing Expected Value

Let:
- P(s_j) = the probability of occurrence for state s_j
- N = the total number of states
Computing Expected Value

Since the states are mutually exclusive and exhaustive,

\sum_{j=1}^{N} P(s_j) = P(s_1) + P(s_2) + \cdots + P(s_N) = 1

and

P(s_j) \ge 0 for all j.
Types of Decision Making Situations

Then the expected value of any decision d_i is

EV(d_i) = \sum_{j=1}^{N} P(s_j) V_{ij}
Decision Trees

A graphical representation of a decision situation
- most useful for sequential decisions

Decision Making Under Risk: Another Criterion

Expected Regret Criterion
- compute the regret table
- compute the expected regret for each alternative
- choose the alternative with the smallest expected regret (a sketch follows below)

The expected regret criterion will always yield the same decision as the
expected value criterion.
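
A sketch of the expected regret criterion on the same invented example; note that the minimizer here is the same alternative that maximizes expected value:

```python
payoff = [
    [150, 200],
    [100, 300],
    [-50, 450],
]
prob = [0.6, 0.4]

# Regret table built from the best payoff per state.
best_per_state = [max(col) for col in zip(*payoff)]
regret = [[best_per_state[j] - v for j, v in enumerate(row)] for row in payoff]

# Expected regret for each alternative; choose the smallest.
exp_regret = [sum(p * r for p, r in zip(prob, row)) for row in regret]
best = min(range(len(exp_regret)), key=lambda i: exp_regret[i])
print("expected regrets:", exp_regret, "-> choose alternative", best)
```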
Expected Regret Criterion

The expected regret for the preferred decision is equal to the Expected Value
of Perfect Information (EVPI).
- EVPI is the expected value of knowing which state will occur.
EVPI – Alternative to Expected Regret

- EVPI – Expected Value of Perfect Information
- EVwPI – Expected Value with Perfect Information about the States of Nature
- EVwoPI – Expected Value without Perfect Information about the States of Nature
- EVPI = |EVwPI - EVwoPI|
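
A short sketch computing EVPI as |EVwPI - EVwoPI|, which here equals the expected regret of the preferred decision (same invented numbers):

```python
payoff = [
    [150, 200],
    [100, 300],
    [-50, 450],
]
prob = [0.6, 0.4]

# EVwoPI: the best expected value attainable without perfect information.
ev = [sum(p * v for p, v in zip(prob, row)) for row in payoff]
ev_wo_pi = max(ev)

# EVwPI: expected payoff if the state were always known before deciding.
ev_w_pi = sum(p * max(col) for p, col in zip(prob, zip(*payoff)))

evpi = abs(ev_w_pi - ev_wo_pi)
print("EVwPI =", ev_w_pi, "EVwoPI =", ev_wo_pi, "EVPI =", evpi)   # EVPI = 90 here
```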

Bayes Law
P(B|A) = \frac{P(A|B) P(B)}{P(A|B) P(B) + P(A|\bar{B}) P(\bar{B})}

where

P(A) = P(A|B) P(B) + P(A|\bar{B}) P(\bar{B})

In this equation, P(B) is called the prior probability of B and
P(B|A) is called the posterior, or sometimes the revised
probability of B. The idea here is that we have some initial
estimate of the probability of B, we get some additional
information about whether A happens or not, and then we
use Bayes Law to compute this revised probability of B.
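
A minimal numeric sketch of this update; the prior and the two likelihoods are arbitrary values chosen for illustration:

```python
# Prior probability of B and the likelihoods of observing A under B and under not-B.
p_b = 0.3              # P(B), the prior
p_a_given_b = 0.8      # P(A | B)
p_a_given_not_b = 0.2  # P(A | not B)

# Total probability of A, then the posterior P(B | A) from Bayes Law.
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)
p_b_given_a = p_a_given_b * p_b / p_a
print("P(A) =", p_a, "P(B|A) =", round(p_b_given_a, 3))   # posterior is about 0.632
```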
Expected Value of Sample Information – EVSI

- EVSI – Expected Value of Sample Information
- EVwSI – Expected Value with Sample Information about the States of Nature
- EVwoSI – Expected Value without Sample Information about the States of Nature
- EVSI = |EVwSI - EVwoSI|
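
The arithmetic mirrors EVPI; the two expected values below are hypothetical (EVwSI would normally come from a decision tree that uses the posterior probabilities):

```python
ev_w_si = 195.0    # EVwSI: hypothetical expected value using the sample information
ev_wo_si = 180.0   # EVwoSI: hypothetical expected value using only prior probabilities

evsi = abs(ev_w_si - ev_wo_si)
print("EVSI =", evsi)
```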

Efficiency of Sample Information – E

Perfect information has an efficiency rating of 100%. The efficiency rating E
for sample information is computed as follows:

E = \frac{EVSI}{EVPI} \times 100

Note: low efficiency ratings for sample information might lead the decision
maker to look for other types of information.
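
Continuing the hypothetical values above (EVSI = 15, EVPI = 90):

```python
evsi = 15.0
evpi = 90.0

# Efficiency of the sample information relative to perfect information, in percent.
efficiency = evsi / evpi * 100
print("E =", round(efficiency, 1), "%")   # about 16.7%
```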
Accounting for Risk in Decision Analysis

Mean-Variance

Var_i = \sum_{j=1}^{m} (r_{ij} - ER_i)^2 p_j

where r_{ij} is the payoff for decision i under state j, ER_i is the expected
return of decision i, and p_j is the probability of state j.
Accounting for Risk in Decision Analysis

Utility Theory
- replacing the payoffs with a unitless scale that accounts for both the value
  of the payoff and the decision maker's risk attitude

Risk Aversion
- A decision maker is risk averse if he/she would prefer a certain x dollars
  to a risky alternative with ER = x dollars.
Accounting for Risk in Decision Analysis

Direct assessment of utility
- Utility functions

For example,

U(x) = 1 - e^{-x/r}

where x is the amount of the payoff and r is the risk aversion parameter.
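
A short sketch of this exponential utility applied to payoffs; the value of r is an arbitrary choice for illustration:

```python
import math

def utility(x, r=100.0):
    """Exponential utility U(x) = 1 - exp(-x / r); r is the risk aversion parameter."""
    return 1 - math.exp(-x / r)

# Replace the payoffs with utilities, then apply the expected value criterion to them.
payoffs = [150, 200, -50]
print([round(utility(v), 3) for v in payoffs])
```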