Regular Markov chains

Markov Chains
Part 5
Are You Regular or Ergodic?
• Absorbing state: a state in a Markov chain that you cannot leave, i.e.
p_ii = 1.
• Absorbing Markov chain: a chain that has at least one absorbing state and
in which an absorbing state can be reached from every other state (not
necessarily in one step).
• Transient state: in an absorbing Markov chain, any state that is not
absorbing.
• Ergodic Markov chain: a chain in which it is possible to go from every
state to every state (not necessarily in one move). Also called irreducible.
• Regular Markov chain: a chain for which some power of the transition
matrix has only positive entries. That means that for that power n it is
possible to go from any state to any state in exactly n steps (see the
sketch below).
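
The slides contain no code, but a minimal Python/NumPy sketch of the
regularity test may help: it checks whether some power P^n of the transition
matrix has only positive entries. The matrix P and the cutoff max_power
below are hypothetical choices for illustration.

import numpy as np

def is_regular(P, max_power=100):
    """Return (True, n) if some power P^n (n <= max_power) has only
    positive entries, i.e. the chain is regular; otherwise (False, None)."""
    Pk = np.array(P, dtype=float)
    for n in range(1, max_power + 1):
        if np.all(Pk > 0):
            return True, n        # every entry of P^n is strictly positive
        Pk = Pk @ P
    return False, None

# Hypothetical 2-state chain: state 0 always moves to state 1.
P = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_regular(P))              # (True, 2): P^2 already has all positive entries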
Special Matrices
The following form of the transition matrix (obtained by rearranging the
rows/columns so that the transient states come first) is called canonical:

    P = | Q  R |
        | 0  I |

The properties of an absorbing Markov chain are described by the transition
matrix P as well as the matrices Q, R, N, t, and B, where:
• The matrix N = (I − Q)^(-1) is called the fundamental matrix; the entry
n_ij of N gives the expected number of times that the process is in the
transient state s_j if it started in the transient state s_i.
• The vector t = Nc, where c = <1, 1, …, 1> is a column of ones, is called
the time to absorption; the entry t_i gives the expected number of steps
before absorption when starting in the transient state s_i.
• The matrix B = NR has entries b_ij that give the probability that an
absorbing chain will be absorbed in the absorbing state s_j if it starts in
the transient state s_i (see the numerical sketch below).
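
As a numerical illustration of N, t, and B, here is a minimal Python/NumPy
sketch for a hypothetical absorbing chain with two transient and two
absorbing states; the entries of Q and R below are made up for the example.

import numpy as np

# Canonical form P = [[Q, R], [0, I]]: transient states listed first.
Q = np.array([[0.2, 0.3],
              [0.4, 0.1]])        # transient -> transient probabilities
R = np.array([[0.5, 0.0],
              [0.2, 0.3]])        # transient -> absorbing probabilities

I = np.eye(Q.shape[0])
N = np.linalg.inv(I - Q)          # fundamental matrix
t = N @ np.ones(Q.shape[0])       # expected number of steps before absorption
B = N @ R                         # absorption probabilities (rows sum to one)

print(N)   # expected visits to each transient state
print(t)   # time to absorption from each transient state
print(B)   # probability of ending in each absorbing state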
Misc Questions
• If the transition matrix contains an entry p_ij = 1, is that state
automatically absorbing?
• Can a matrix with a 1 entry still be ergodic? How about regular?
About Regular Chains
• Theorem: Let P be the transition matrix for a regular chain. Then,
as n goes to infinity, the powers P^n approach a limiting matrix W
with all rows equal to the same vector w (whose components are all
positive and sum to one).
• Example: Find the limiting matrix W for our Land of Oz chain (see the
sketch below).
• Examples 1 and 2 on page 38
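
A minimal NumPy sketch of the theorem, assuming the standard Land of Oz
transition matrix from Grinstead & Snell (states ordered Rain, Nice, Snow):
raising P to a large power already shows every row converging to the same
vector w.

import numpy as np

# Land of Oz weather chain (states: Rain, Nice, Snow).
P = np.array([[0.50, 0.25, 0.25],
              [0.50, 0.00, 0.50],
              [0.25, 0.25, 0.50]])

W = np.linalg.matrix_power(P, 50)  # P^n for large n approximates the limit W
print(W)
# Every row is approximately w = (0.4, 0.2, 0.4): all components are
# positive and they sum to one.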