Math 166 – Spring 2016
Ch M – Markov Process
Markov Process: A Markov process is a sequence of experiments (or trials), performed at regular
intervals, such that
a) Each trial has the same set of possible outcomes. The outcomes are known as states. The outcome
of the current experiment is known as the current state.
b) The probability of each outcome depends only on the outcome of the preceding trial, and not on the
earlier history of the process.
A Transition Diagram shows the states of a Markov process and the probabilities of moving from one
state to another.
A Transition (or stochastic) matrix is a square matrix satisfying the following:
a) All entries are between 0 and 1.
b) The entries in each column sum to 1.
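The two conditions above are easy to check programmatically. The sketch below (not part of the course material) represents a matrix as a list of rows and tests both conditions:

```python
def is_stochastic(T, tol=1e-9):
    """Return True if T (a list of rows) is a stochastic matrix:
    square, all entries in [0, 1], and every column sums to 1."""
    n = len(T)
    if any(len(row) != n for row in T):
        return False                      # not square
    if any(not (0 <= x <= 1) for row in T for x in row):
        return False                      # entry outside [0, 1]
    for j in range(n):
        if abs(sum(T[i][j] for i in range(n)) - 1) > tol:
            return False                  # column j does not sum to 1
    return True

print(is_stochastic([[0.3, 0.8], [0.7, 0.2]]))   # True
print(is_stochastic([[0.5, 0.5], [0.6, 0.4]]))   # False: first column sums to 1.1
```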
Example 1. A survey of a certain group indicated that 85% of the sons of fathers who attended college
also attended college, and 65% of the sons of fathers who did not attend college also did not attend
college. If 30% of the initial group of fathers were college educated,
a) Draw the probability tree, the transition diagram, and the transition matrix.
b) Find the probability that a son in this group is college educated.
c) Find the probability that a son in this group is not college educated.
d) If the initial generation of fathers is Generation 0, find the probability that a son in the 3rd generation
is college educated.
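Parts (b) and (d) can be spot-checked numerically. The sketch below uses the column convention of these notes (distributions as column vectors; next distribution = T times the current one):

```python
# Example 1 check: columns of T correspond to "father college" / "father not".
T = [[0.85, 0.35],   # P(son college | father college), P(son college | father not)
     [0.15, 0.65]]   # P(son not     | father college), P(son not     | father not)

def step(T, x):
    """One generation: multiply the distribution x by the transition matrix T."""
    return [sum(T[i][j] * x[j] for j in range(len(x))) for i in range(len(T))]

x = [0.30, 0.70]             # Generation 0: 30% of fathers college educated
x = step(T, x)
print(x[0])                  # part (b): P(son college educated) = 0.5
print(1 - x[0])              # part (c): 0.5
x = step(T, step(T, x))      # two more generations
print(x[0])                  # part (d): Generation 3 probability = 0.65
```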
Example 2. Every Valentine's Day, Eugene gives his girlfriend either roses or tulips. If he gives roses
one year, 30% of the time he will give roses again the next year. If he gives tulips one year, 80% of the
time he will give roses the next year.
a) Write the Transition matrix, and draw the transition diagram.
b) If he gave her roses this year, what is the probability that he will give roses again 2 years from now?
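Part (b) is a two-step computation, sketched here with the same column convention (distribution as a column vector, next distribution = T times the current one):

```python
# Example 2 check: columns are "roses this year" and "tulips this year".
T = [[0.3, 0.8],   # P(roses next  | roses), P(roses next  | tulips)
     [0.7, 0.2]]   # P(tulips next | roses), P(tulips next | tulips)

def step(T, x):
    return [sum(T[i][j] * x[j] for j in range(len(x))) for i in range(len(T))]

x = [1.0, 0.0]           # he gave roses this year
x = step(T, step(T, x))  # two years from now
print(x[0])              # P(roses again) = 0.65
```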
Example 3. A house cat tries to catch a mouse every night in one of three neighboring fields.
• If she visits the first field one night, then she has an 80% chance of returning to the first field
the next night, and an equal chance of going to each of the other two fields.
• If she visits the second field, she has a 90% chance of returning to the second field the next
night, and a 10% chance of going to the first.
• If she visits the third field, she has a 70% chance of returning to the third field the next night,
and a 30% chance of going to the second.
• If she initially has the same chance of visiting one field as any other, what is the chance she will
be in each of the fields two nights later?
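The bullets above translate directly into a 3×3 transition matrix; a quick numeric check (columns = current field, distribution as a column vector):

```python
# Example 3 check: columns are fields 1, 2, 3 (current night).
T = [[0.8, 0.1, 0.0],   # to field 1
     [0.1, 0.9, 0.3],   # to field 2
     [0.1, 0.0, 0.7]]   # to field 3

def step(T, x):
    return [sum(T[i][j] * x[j] for j in range(len(x))) for i in range(len(T))]

x = [1/3, 1/3, 1/3]      # equally likely on the first night
x = step(T, step(T, x))  # two nights later
print([round(p, 4) for p in x])   # ≈ [0.2833, 0.5, 0.2167]
```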
Regular Matrix: A Transition (or stochastic) matrix T is regular if some power of T has all positive
entries. A Markov process with a regular Transition matrix is called a regular Markov process.
Example 2: Are the following Transition matrices regular?
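The definition suggests a brute-force check (a sketch, not part of the course material): compute successive powers of T and look for one with all positive entries. For an n-state chain it suffices to check powers up to (n − 1)² + 1, a known bound for primitive matrices.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def is_regular(T):
    """Return True if some power of T has all positive entries."""
    n = len(T)
    P = T
    for _ in range((n - 1) ** 2 + 1):
        if all(x > 0 for row in P for x in row):
            return True
        P = matmul(P, T)
    return False

print(is_regular([[0.0, 0.5], [1.0, 0.5]]))  # True: T^2 has all positive entries
print(is_regular([[1.0, 0.0], [0.0, 1.0]]))  # False: the identity never mixes
```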
Steady-State: If the state distribution of a Markov process does not change from one stage to the next,
we say the process has reached a steady state (or equilibrium state).
That is, TX = X, where X is the steady-state distribution, and L is the steady-state matrix, whose
columns are all identical to X.
Example 4. Find the steady-state distribution and the steady-state matrix for the given Transition
matrix.
Example 5. Find the steady-state distribution and the steady-state matrix for examples 1, 2, and 3.
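As a spot check for Example 2's matrix, repeated multiplication converges to the steady state. This power-iteration sketch is a numeric shortcut, not a replacement for solving TX = X by hand:

```python
# Steady state of Example 2's matrix by power iteration: keep applying T
# until the distribution stops changing. Solving TX = X directly gives
# X = [8/15, 7/15].
T = [[0.3, 0.8],
     [0.7, 0.2]]

x = [1.0, 0.0]                 # any starting distribution works here
for _ in range(200):
    x = [sum(T[i][j] * x[j] for j in range(2)) for i in range(2)]

print(x)   # ≈ [0.5333, 0.4667], i.e. [8/15, 7/15]
```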
M.3: Absorbing Markov Process
A state is called an absorbing state if there is no chance of leaving the state, once entered.
Examples:
A Markov process is absorbing if
a) there is at least one absorbing state, and
b) it is possible to move from any non-absorbing state to some absorbing state in a finite number
of stages.
The transition matrix for an absorbing Markov process is an absorbing stochastic matrix.
Example.
Theorem 1. In an absorbing Markov process, the long-term probability of going from any
non-absorbing state to some absorbing state is one.
An absorbing matrix is said to be in standard form if the absorbing states are listed before the
non-absorbing states.
Theorem 2 (Part I). Let T be the transition matrix of an absorbing Markov process with a absorbing
states and b non-absorbing states. Write the matrix in standard form as follows:

      [ I   A ]
T =   [ 0   B ]

where I is the a × a identity matrix, 0 is the b × a zero matrix, A is a × b, and B is b × b.
Then the limiting matrix can be found by

      [ I   A(I – B)^(–1) ]
L =   [ 0         0       ]

The entry in the ith row and jth column of the sub-matrix A(I – B)^(–1) gives the probability that the
system will end up in the ith absorbing state when initially in the jth non-absorbing state.
Example 6. Find the limiting matrix (or the long term behavior) of the given absorbing transition
matrix.
T =
        X     Y     Z
   X    1     0     0.1
   Y    0     1     0.2
   Z    0     0     0.7
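Since X and Y are absorbing and Z is the only non-absorbing state, Theorem 2 reduces to a one-line computation. A sketch of the arithmetic (the sub-matrix B here is the single entry 0.7):

```python
# Example 6 via Theorem 2: absorbing states X, Y; non-absorbing state Z.
# A = [[0.1], [0.2]] (the Z column, absorbing rows) and B = [[0.7]].
A = [0.1, 0.2]          # probabilities Z -> X and Z -> Y in one step
inv = 1 / (1 - 0.7)     # (I - B)^(-1) for the 1x1 block B = [0.7]
limit = [a * inv for a in A]
print(limit)            # [1/3, 2/3]: from Z, absorb into X w.p. 1/3, Y w.p. 2/3
```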
Example 7. For the given absorbing transition matrix,
T =
        X     Y     Z     U
   X    1     0     0.1   0.25
   Y    0     1     0.3   0.5
   Z    0     0     0.2   0.15
   U    0     0     0.7   0.10
a) Find the limiting matrix (or the long term behavior)
b) In the long term, what percentage of the time will you end up in State Y if you started in State X? In
State U? In State Z?
Example 8. A person plays a game in which the probability of winning $1 is 0.4, and the probability of
losing $1 is 0.6. If he goes broke or reaches $3, he quits. Find the long term behavior if he starts with
$1 or $2.
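This is a gambler's-ruin chain with absorbing states $0 and $3. As a numeric sketch (state order $0, $1, $2, $3 assumed; columns sum to 1), iterating T many times reads off the long-term behavior:

```python
# Example 8: states $0, $1, $2, $3; $0 and $3 are absorbing.
T = [[1.0, 0.6, 0.0, 0.0],   # to $0
     [0.0, 0.0, 0.6, 0.0],   # to $1
     [0.0, 0.4, 0.0, 0.0],   # to $2
     [0.0, 0.0, 0.4, 1.0]]   # to $3

def step(T, x):
    return [sum(T[i][j] * x[j] for j in range(4)) for i in range(4)]

for start in (1, 2):
    x = [0.0] * 4
    x[start] = 1.0               # start with $1, then with $2
    for _ in range(200):
        x = step(T, x)
    print(f"start ${start}: broke {x[0]:.4f}, reach $3 {x[3]:.4f}")
# start $1: broke 15/19 ≈ 0.7895, reach $3  4/19 ≈ 0.2105
# start $2: broke  9/19 ≈ 0.4737, reach $3 10/19 ≈ 0.5263
```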
Example 9. A certain manufacturing process consists of 2 manufacturing states and a completion state,
together with a fourth state in which the item is scrapped if improperly manufactured. At the end of
each of the 2 manufacturing states, each item is inspected. At each inspection, there is a probability of
2/3 that the item will be passed on to the next state, probability of 1/6 that it will be sent back to the
same state for reworking, and a probability of 1/6 that the item will be scrapped. An item that is
complete stays complete and an item that is scrapped stays scrapped.
a) Write the Transition Matrix for this Markov Process.
b) Is this an absorbing Markov process? If yes, which states are absorbing and which are
non-absorbing?
c) Determine the long-term behavior.
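The long-term behavior in part (c) can be spot-checked numerically. The sketch below assumes the state order [Stage 1, Stage 2, Complete, Scrapped], with Complete and Scrapped absorbing; solving part (c) by Theorem 2 gives 16/25 complete and 9/25 scrapped for an item entering Stage 1:

```python
# Example 9: columns sum to 1; Complete and Scrapped are absorbing.
T = [[1/6, 0.0, 0.0, 0.0],   # to Stage 1 (reworked at Stage 1)
     [2/3, 1/6, 0.0, 0.0],   # to Stage 2 (passed on, or reworked at Stage 2)
     [0.0, 2/3, 1.0, 0.0],   # to Complete
     [1/6, 1/6, 0.0, 1.0]]   # to Scrapped

def step(T, x):
    return [sum(T[i][j] * x[j] for j in range(4)) for i in range(4)]

x = [1.0, 0.0, 0.0, 0.0]     # a new item enters Stage 1
for _ in range(200):
    x = step(T, x)
print(f"complete {x[2]:.4f}, scrapped {x[3]:.4f}")   # 16/25 = 0.64, 9/25 = 0.36
```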