6.6 Matrices Applied to Probability
- In some situations, the probability of an outcome depends on the outcome of the
previous trial. This pattern often appears in the stock market, weather patterns, and
consumer habits.
- Dependent probabilities can be calculated using Markov chains, where the
outcome of any trial depends directly on the outcome of the previous trial.
Terminology:
i) Initial probability vector, S0: represents the probabilities of the states at the
start. It must be a row matrix whose entries sum to 1.
ii) Transition matrix, P: represents the probabilities of moving from any state
to any other state in one trial. It must be a square (stochastic) matrix in which
each row sums to 1.
iii) First-step probability vector, S1: calculated as S1 = S0P.
iv) nth-step probability vector, Sn: calculated by repeatedly multiplying the
probability vector by the transition matrix P, so Sn = Sn-1 P.
- Each entry in a probability vector or a transition matrix is a probability and must be
between 0 and 1.
- The possible states in a Markov chain are mutually exclusive and cover every
possibility, so the entries in each probability vector (and in each row of P) must
sum to 1.
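The definitions above can be sketched in plain Python. The two-state vector and matrix below are made-up placeholder numbers, not from any example in these notes:

```python
# Sketch of one Markov chain step: S1 = S0 * P.
# S0 is a 1xN row vector; P is an NxN transition matrix whose rows sum to 1.
# The numbers here are illustrative placeholders only.

def step(S, P):
    """Multiply a probability vector S by a transition matrix P."""
    n = len(S)
    return [sum(S[i] * P[i][j] for i in range(n)) for j in range(n)]

S0 = [0.5, 0.5]            # initial probability vector (entries sum to 1)
P = [[0.9, 0.1],           # row i holds the probabilities of leaving state i
     [0.2, 0.8]]           # each row sums to 1

S1 = step(S0, P)
print(S1)                  # first-step probability vector, sums to 1
```

Note that the result is again a probability vector: its entries stay between 0 and 1 and still sum to 1.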
Ex/ Bob and Jim are extremely competitive tennis players. Over the years, a pattern
has developed. Whenever Bob wins, his confidence increases and the probability of
his winning the next week is 0.7. However, when he loses, the probability of his
winning the next week is only 0.4.
Initial Probability Vector:
Transition Matrix:
- In general, the nth-step probability vector, Sn, is given by
Sn = S0 P^n
Ex/ A marketing research firm has tracked the sales of three brands of hockey sticks.
Each year on average,
- Player-one keeps 70% of its customers, but loses 20% to Slapshot and 10% to
Extreme Styx
- Slapshot keeps 65% of its customers, but loses 10% to Extreme Styx and 25% to
Player-one
- Extreme Styx keeps 55% of its customers, but loses 30% to Player-one and 15%
to Slapshot
a) Assuming an equal share of the market to begin, determine the initial state vector.
b) What is the transition matrix?
c) Determine the market share after two and three years.
d) Determine the market share after 10 years. (This is the long-term probability)
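A sketch of parts a) to d) in plain Python. The Slapshot row is taken as (0.25, 0.65, 0.10) so that its probabilities sum to 1; that adjustment is an assumption:

```python
# Sketch for the hockey-stick example. States, in order:
# Player-one, Slapshot, Extreme Styx.
# Assumption: Slapshot loses 25% to Player-one (so its row sums to 1).

def step(S, P):
    n = len(S)
    return [sum(S[i] * P[i][j] for i in range(n)) for j in range(n)]

S = [1/3, 1/3, 1/3]        # a) equal initial market shares
P = [[0.70, 0.20, 0.10],   # b) Player-one  -> (P-one, Slapshot, E. Styx)
     [0.25, 0.65, 0.10],   #    Slapshot (25% to Player-one assumed)
     [0.30, 0.15, 0.55]]   #    Extreme Styx

shares = {}
for year in range(1, 11):
    S = step(S, P)
    if year in (2, 3, 10):  # c) two and three years; d) long term
        shares[year] = S

print(shares[2], shares[3], shares[10])
```

By year 10 the vector has essentially stopped changing, which is the long-term (steady-state) market share.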
- If a probability vector remains unchanged after being multiplied by the transition
matrix, it has reached a 'steady state'. That is: SP = S.
- Regular Markov chains always reach a steady state regardless of the initial
probability vector. A Markov chain is regular if the transition matrix P, or some power
of P, has no zero entries (every entry is strictly positive).
Ex/ Determine if the Markov chain with the following transition matrix will be regular.
P = [ 0    1   ]        P^2 = [ 0.5   0.5  ]
    [ 0.5  0.5 ]              [ 0.25  0.75 ]

Every entry of P^2 is strictly positive, so the chain is regular: YES
- The steady state of a regular Markov chain can also be determined analytically.
Ex/ The weather near a certain ski hill follows this pattern: If it is a sunny day, there
is a 60% chance that the next day will be sunny and a 40% chance that it will be
snowy. If it is a snowy day, the chances are 50/50 that the next day will also be
snowy. Determine the long-term probability for the weather at the ski hill.
Transition Matrix (states: sunny, snowy):
P = [ 0.6  0.4 ]
    [ 0.5  0.5 ]
Every entry of P is strictly positive, so the Markov chain is regular and a steady
state [p  q] with [p  q]P = [p  q] exists.
Now, [p  q]P = [p  q] and p + q = 1 give:
p = 0.6p + 0.5q
0.4p = 0.5q
q = 0.8p
Substituting into p + q = 1:
p + 0.8p = 1
1.8p = 1
p ≈ 0.556
q ≈ 0.444
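As a numerical check, iterating Sn+1 = Sn P from any starting vector should converge to the same steady state; a sketch in plain Python:

```python
# Sketch: verifying the ski-hill steady state by iteration. For a regular
# chain, S_n converges to the steady state from any initial vector;
# the limit should match p = 1/1.8 and q = 0.8/1.8.

def step(S, P):
    n = len(S)
    return [sum(S[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.6, 0.4],   # sunny -> (sunny, snowy)
     [0.5, 0.5]]   # snowy -> (sunny, snowy)

S = [1.0, 0.0]     # any initial probability vector works
for _ in range(100):
    S = step(S, P)
print(S)           # approximately [0.556, 0.444]
```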
There is a 55.6% chance of sunny weather, and a 44.4% chance of snowy weather,
long term.
Homefun: pp. 353 - 354 # 1 - 5, 7, 8, 13