PS1

INDR 343 Problem Session 1
09.10.2014
http://home.ku.edu.tr/~indr343/
Consider the second version of the stock market
model presented as an example. Whether the
stock goes up tomorrow depends upon
whether it increased today and yesterday.
 If the stock increased today and yesterday, it
will increase tomorrow with probability α1.
 If the stock increased today and decreased
yesterday, it will increase tomorrow with
probability α2.
 If the stock decreased today and increased
yesterday, it will increase tomorrow with
probability α3.
 If the stock decreased today and yesterday, it
will increase tomorrow with probability α4.


(a) Construct the (one-step) transition matrix
of the Markov chain.
(b) Explain why the states used for this
Markov chain cause the mathematical
definition of the Markovian property to hold
even though what happens in the future
(tomorrow) depends upon what happened in
the past (yesterday) as well as the present
(today).
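For part (a), one natural labeling of the states is: state 0 = the stock increased today and yesterday, state 1 = increased today but decreased yesterday, state 2 = decreased today but increased yesterday, state 3 = decreased today and yesterday. The sketch below (Python/NumPy) builds the resulting transition matrix; the numerical values chosen for α1-α4 are placeholders, since the problem leaves them symbolic.

    import numpy as np

    # Placeholder values for alpha_1, ..., alpha_4 (the problem leaves them symbolic).
    a1, a2, a3, a4 = 0.9, 0.6, 0.5, 0.3

    # States (today's move, yesterday's move):
    #   0: increased today and yesterday         1: increased today, decreased yesterday
    #   2: decreased today, increased yesterday  3: decreased today and yesterday
    # Tomorrow's state records (tomorrow's move, today's move).
    P = np.array([
        [a1, 0.0, 1 - a1, 0.0],   # from 0: up w.p. a1 -> state 0, down -> state 2
        [a2, 0.0, 1 - a2, 0.0],   # from 1: up w.p. a2 -> state 0, down -> state 2
        [0.0, a3, 0.0, 1 - a3],   # from 2: up w.p. a3 -> state 1, down -> state 3
        [0.0, a4, 0.0, 1 - a4],   # from 3: up w.p. a4 -> state 1, down -> state 3
    ])

    assert np.allclose(P.sum(axis=1), 1.0)   # each row is a probability distribution
    print(P)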



Suppose that a communications network
transmits binary digits, 0 or 1, where each digit
is transmitted 10 times in succession.
During each transmission, the probability is 0.99
that the digit entered will be transmitted
accurately. In other words, the probability is 0.01
that the digit being transmitted will be recorded
with the opposite value at the end of the
transmission.
For each transmission after the first one, the digit
entered for transmission is the one that was
recorded at the end of the preceding
transmission. If X0 denotes the binary digit
entering the system, X1 the binary digit recorded
after the first transmission, X2 the binary digit
recorded after the second transmission, . . . ,
then {Xn} is a Markov chain.
(a) Construct the (one-step) transition matrix.
(b) Use your OR Courseware to find the 10-step transition matrix P(10). Use this result to
identify the probability that a digit entering the
network will be recorded accurately after the
last transmission.
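If the OR Courseware is not at hand, P(10) can also be obtained by raising the one-step matrix to the 10th power; a minimal sketch in Python/NumPy, with states 0 and 1 denoting the two possible recorded digits:

    import numpy as np

    # One-step transition matrix: the recorded digit is passed on unchanged
    # w.p. 0.99 and flipped w.p. 0.01 on each transmission.
    P = np.array([[0.99, 0.01],
                  [0.01, 0.99]])

    P10 = np.linalg.matrix_power(P, 10)   # the 10-step transition matrix P(10)
    print(P10)

    # Probability that the digit recorded after the last (10th) transmission
    # equals the digit that entered the network (the same for a 0 or a 1):
    print(P10[0, 0])   # equals 0.5 * (1 + 0.98**10), about 0.9085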

An urn always contains 2 balls. Ball
colors are red and blue. At each stage
a ball is randomly chosen and then
replaced by a new ball, which with
probability 0.8 is the same color, and
with probability 0.2 is the opposite
color, as the ball it replaces. If initially
both balls are red, find the probability
that the fifth ball selected is red.
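One way to set this up is to let the state be the number of red balls currently in the urn (0, 1, or 2). The fifth ball selected is drawn after four replacements have been made, so the answer follows from the four-step distribution; a minimal sketch in Python/NumPy:

    import numpy as np

    # State = number of red balls in the urn.  A randomly chosen ball is replaced
    # by one of the same color w.p. 0.8 and of the opposite color w.p. 0.2.
    P = np.array([
        [0.8, 0.2, 0.0],   # 0 red: the blue ball drawn is replaced by a red w.p. 0.2
        [0.1, 0.8, 0.1],   # 1 red: lose the red w.p. 0.5*0.2, gain a red w.p. 0.5*0.2
        [0.0, 0.2, 0.8],   # 2 red: the red ball drawn is replaced by a blue w.p. 0.2
    ])

    start = np.array([0.0, 0.0, 1.0])                   # both balls red initially (state 2)
    before5th = start @ np.linalg.matrix_power(P, 4)    # urn composition before the 5th draw

    # P(5th ball selected is red) = sum over states of P(state) * (red balls in state)/2
    print(before5th @ np.array([0.0, 0.5, 1.0]))        # about 0.7048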
Accessible: Possible to go from state i to state j (path exists in
the network from i to j).
[Transition diagram: states 0, 1, 2, 3, 4, … with transitions labeled ai and di.]
Two states communicate if each is accessible from
the other. A system is irreducible if all of its states
communicate.
State i is recurrent if, upon leaving it, the system is
certain to return to it at some time in the future.
If a state is not recurrent, it is transient.
A state is periodic if it can return to itself only in a
number of transitions that is a multiple of some
fixed integer greater than 1.
A state that is not periodic is aperiodic.
[Diagrams: (a) a chain in which each state is visited every 3 iterations; (b) a chain in which each state is visited in multiples of 3 iterations.]
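These definitions can be checked mechanically for a small chain: communicating classes are the sets of mutually accessible states, a class of a finite chain is recurrent exactly when no transition leaves it, and the period of a class is the gcd of the lengths of its closed walks. The sketch below (Python/NumPy) does this; the helper names and the small example matrix are made up for the illustration.

    import numpy as np
    from math import gcd
    from itertools import product

    def communicating_classes(P):
        """Group the states into communicating classes (mutually accessible states)."""
        n = len(P)
        reach = np.asarray(P) > 0
        for k, i, j in product(range(n), repeat=3):   # transitive closure, k outermost
            if reach[i, k] and reach[k, j]:
                reach[i, j] = True
        classes, seen = [], set()
        for i in range(n):
            if i in seen:
                continue
            cls = [j for j in range(n) if j == i or (reach[i, j] and reach[j, i])]
            classes.append(cls)
            seen.update(cls)
        return classes

    def class_period(P, cls):
        """gcd of the lengths of all closed walks that stay inside the class."""
        inside = set(cls)
        level, frontier = {cls[0]: 0}, [cls[0]]
        while frontier:                                # BFS levels within the class
            nxt = []
            for u in frontier:
                for v in inside:
                    if P[u][v] > 0 and v not in level:
                        level[v] = level[u] + 1
                        nxt.append(v)
            frontier = nxt
        g = 0
        for u in cls:                                  # gcd of (level[u] + 1 - level[v])
            for v in cls:
                if P[u][v] > 0:
                    g = gcd(g, level[u] + 1 - level[v])
        return g if g > 0 else None                    # None: no closed walk in the class

    # Small made-up chain on states 0-3: {0, 1} is a closed (recurrent) class with a
    # self-loop, {2, 3} is a transient class whose only cycle has length 2.
    P = [[0.5, 0.5, 0.0, 0.0],
         [1.0, 0.0, 0.0, 0.0],
         [0.2, 0.0, 0.0, 0.8],
         [0.0, 0.0, 1.0, 0.0]]

    n = len(P)
    for cls in communicating_classes(P):
        closed = all(P[u][v] == 0 for u in cls for v in range(n) if v not in cls)
        print(cls, "recurrent" if closed else "transient", "period:", class_period(P, cls))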
An absorbing state is one that the system never leaves once it enters.
[Transition diagram: states 0 through 4 with transitions labeled ai and di.]
This diagram might represent the wealth of a gambler who
begins with $2 and makes a series of wagers for $1 each.
Let ai be the event of winning in state i and di the event of
losing in state i.
There are two absorbing states: 0 and 4.
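A minimal numeric sketch of this gambler's-ruin chain (Python/NumPy); the win probability p is an assumed placeholder, since the slide leaves the ai and di unspecified:

    import numpy as np

    p = 0.5          # assumed probability of winning each $1 wager (placeholder)

    # Wealth levels 0, 1, 2, 3, 4; states 0 and 4 are absorbing.
    P = np.zeros((5, 5))
    P[0, 0] = 1.0
    P[4, 4] = 1.0
    for i in range(1, 4):
        P[i, i + 1] = p        # win the wager: wealth rises by $1
        P[i, i - 1] = 1 - p    # lose the wager: wealth falls by $1

    start = np.zeros(5)
    start[2] = 1.0                                  # the gambler begins with $2
    print((start @ np.linalg.matrix_power(P, 200)).round(4))
    # After many wagers essentially all probability sits on the absorbing states 0 and 4.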
Class: set of states that communicate with each other.
A class is either all recurrent or all transient.
State i is ergodic if it is recurrent and aperiodic.
A Markov chain is ergodic if all of its states are ergodic.
[Transition diagram over states 0 through 6.]
Example 1

State   0   1   2   3
    0   0   X   X   0
    1   X   0   0   0
    2   0   0   0   X
    3   X   0   0   X

(An X marks a positive one-step transition probability.)

[Transition diagram for states 0-3.]
Every pair of states communicates, forming a single recurrent
class; moreover, the states are not periodic.
Thus the stochastic process is aperiodic and irreducible.
Example 2

State   0   1   2   3   4
    0   X   X   0   0   0
    1   X   X   0   0   0
    2   0   0   X   0   0
    3   0   0   X   X   0
    4   X   0   0   0   0

[Transition diagram for states 0-4.]
States 0 and 1 communicate and form a
recurrent class.
States 3 and 4 form separate transient classes.
State 2 is an absorbing state and forms a
recurrent class.
Example 3

State   0   1   2   3
    0   0   X   X   0
    1   0   0   0   X
    2   0   0   0   X
    3   X   0   0   0

[Transition diagram for states 0-3.]
Every state communicates with every other state, so we
have an irreducible stochastic process.
Periodic? Yes: every return to a state takes a multiple of 3
transitions, so the Markov chain is irreducible and
periodic (with period 3).
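To check the period claim numerically, one can look at which powers of P allow a return to a given state; the sketch below (Python/NumPy) uses arbitrary positive probabilities for the transitions marked X in Example 3, since the slide only specifies which entries are positive.

    import numpy as np
    from functools import reduce
    from math import gcd

    # Transition pattern of Example 3 with arbitrary positive probabilities.
    P = np.array([
        [0.0, 0.5, 0.5, 0.0],   # 0 -> 1 or 2
        [0.0, 0.0, 0.0, 1.0],   # 1 -> 3
        [0.0, 0.0, 0.0, 1.0],   # 2 -> 3
        [1.0, 0.0, 0.0, 0.0],   # 3 -> 0
    ])

    # Step counts n (up to 12) at which a return to state 0 is possible.
    Pn, returns = np.eye(4), []
    for n in range(1, 13):
        Pn = Pn @ P
        if Pn[0, 0] > 0:
            returns.append(n)

    print(returns)                                      # [3, 6, 9, 12]
    print("period of state 0:", reduce(gcd, returns))   # 3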
Example

[One-step transition matrix P and transition diagram for a five-state Markov chain (states 1 through 5).]
Given each of the following (one-step)
transition matrices of a Markov chain,
determine the classes of the Markov chain
and whether they are recurrent.
Consider the Markov chain that has the
following (one-step) transition matrix.
(a) Determine the classes of this Markov chain
and, for each class, determine whether it is
recurrent or transient.
(b) Determine the period of each state.