Section M.2
MATH 166:503
April 23, 2015
Topics from last notes: finite stochastic processes, Markov processes, transition matrix, initial
probability matrix
1 MARKOV CHAINS
1.2 Regular Markov Processes
ex. Sofia goes to Chipotle, Subway, or Antonio’s Pizza for lunch every day. If she
goes to Chipotle, she will return 2% of the time and will go to Antonio’s 15% of the time for lunch
tomorrow. If she goes to Subway, she will return 2% of the time and will go to Chipotle 36% of the
time for lunch tomorrow. If she goes to Antonio’s, she will return 73% of the time and will go to
Subway 10% of the time for lunch tomorrow. If she went to Subway today, what is the probability
distribution for 10 days from today? 30 days from today?
T^2 =
T^4 =
T^16 =
T^256 =
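These powers can be filled in numerically. Below is a minimal Python sketch, assuming the column-stochastic convention used in this course (Xn+1 = T Xn, columns sum to 1), the state order (Chipotle, Subway, Antonio’s), and the percentages not stated in the problem inferred so that each column sums to 1.

```python
import numpy as np

# States ordered (Chipotle, Subway, Antonio's); column j holds the
# probabilities for tomorrow's lunch spot given today's spot j.
# Entries not stated in the problem are inferred so each column sums to 1.
T = np.array([
    [0.02, 0.36, 0.17],   # to Chipotle
    [0.83, 0.02, 0.10],   # to Subway
    [0.15, 0.62, 0.73],   # to Antonio's
])

X0 = np.array([0.0, 1.0, 0.0])   # she went to Subway today

# The requested powers of T, computed directly.
for n in (2, 4, 16, 256):
    print(f"T^{n} =\n{np.linalg.matrix_power(T, n)}\n")

# Distribution 10 and 30 days from today: X_n = T^n X_0
print("X_10 =", np.linalg.matrix_power(T, 10) @ X0)
print("X_30 =", np.linalg.matrix_power(T, 30) @ X0)
```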
Now assume that we are given an arbitrary initial probability matrix X0. What is Xn for large
n?
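Continuing the sketch above, one way to see what happens for large n: the columns of a high power of T become (nearly) identical, so T^n X0 lands on essentially the same vector no matter which probability vector X0 we start from.

```python
# Continuing the sketch above: the columns of T^256 agree to many decimal
# places, so T^256 @ X0 is essentially the same for every starting X0.
Tbig = np.linalg.matrix_power(T, 256)
for start in (np.array([1.0, 0.0, 0.0]),
              np.array([0.0, 0.0, 1.0]),
              np.array([1/3, 1/3, 1/3])):
    print(Tbig @ start)
```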
Markov processes in which the limiting distribution is always the same regardless of the initial conditions
are called regular. The associated transition matrix is called a regular stochastic matrix.
Checking if a matrix is regular: A stochastic matrix T is regular if some power of T has all
positive entries.
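A hedged sketch of carrying out this check mechanically is below. The helper name is_regular is my own, and the default cutoff on how many powers to try uses the classical bound that checking powers up to (n-1)^2 + 1 is enough for an n x n nonnegative matrix.

```python
import numpy as np

def is_regular(T, max_power=None):
    """Return True if some power of the square stochastic matrix T is all positive.

    The default cutoff (n-1)**2 + 1 comes from a classical bound on how far
    one ever needs to look; max_power can override it.
    """
    n = T.shape[0]
    if max_power is None:
        max_power = (n - 1) ** 2 + 1
    P = np.array(T, dtype=float)
    for _ in range(max_power):       # P runs through T, T^2, ..., T^max_power
        if np.all(P > 0):
            return True
        P = P @ T
    return False
```

For instance, applying it to the lunch matrix assumed in the sketch above returns True immediately, since that matrix already has all positive entries.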
ex. Is
[ 0    .1    1 ]
[ 1    .8    0 ]
[ 0    .1    0 ]
a regular stochastic matrix?
ex. Is
[ .25   0 ]
[ .75   1 ]
a regular stochastic matrix?
If T is a regular stochastic matrix, then the powers T^n approach a limiting matrix L as n gets large, every column of L equals the same probability vector X, and Xn = T^n X0 approaches X for any initial probability matrix X0. This X, the steady-state distribution, is the unique probability vector satisfying TX = X.
ex. A survey indicates that in a certain area, people take their summer vacations at the
beach, the lake, or the mountains. Of the people who went to the beach one summer, 15% will return
to the beach next summer and 50% will visit the lake. Of the people who went to the lake, 20%
will return and 45% will visit the beach. Of the people who went to the mountains, 80% will return
and 20% will visit the lake. Last year the lake was contaminated, so no one visited it. Half of the
people went to the beach and half went to the mountains. Does this describe a regular Markov
process? If so, find its steady state.
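A minimal sketch of the numerical side, under the same assumptions as before (column-stochastic convention, states ordered beach, lake, mountains, and the unstated percentages inferred so each column sums to 1): it checks regularity and then finds the steady state by solving TX = X together with the requirement that the entries of X sum to 1.

```python
import numpy as np

# States ordered (beach, lake, mountains); column j gives next summer's
# probabilities for people who went to destination j this summer.
# Percentages not stated in the problem are inferred so each column sums to 1.
T = np.array([
    [0.15, 0.45, 0.00],   # to the beach
    [0.50, 0.20, 0.20],   # to the lake
    [0.35, 0.35, 0.80],   # to the mountains
])

X0 = np.array([0.5, 0.0, 0.5])   # last year: half beach, half mountains, no lake

# Regularity: T has one zero entry, but T squared is already all positive.
print(np.all(np.linalg.matrix_power(T, 2) > 0))

# Steady state: solve T X = X with the entries of X summing to 1.
A = np.vstack([T - np.eye(3), np.ones((1, 3))])
b = np.array([0.0, 0.0, 0.0, 1.0])
X, *_ = np.linalg.lstsq(A, b, rcond=None)
print(X)   # roughly [0.126, 0.238, 0.636] with the matrix assumed above
```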