Chapter 5
Continuous Time Markov Chains
Learning objectives:
• Introduce continuous time Markov chains
• Model manufacturing systems using Markov chains
• Be able to evaluate steady-state performance
Textbook :
C. Cassandras and S. Lafortune, Introduction to Discrete
Event Systems, Springer, 2007
Plan
• Basic definitions of continuous time Markov Chains
• Characteristics of CTMC
• Performance analysis of CTMC
• Poisson process
• Approximation of general distributions by phase-type distributions
Basic definitions of continuous time Markov Chains
Continuous Time Markov Chain (CTMC)
(Classification of stochastic processes: continuous-event vs. discrete-event, discrete-time vs. continuous-time, and memoryless processes.)
A CTMC is a continuous-time, memoryless, discrete-event stochastic process.
Continuous Time Markov Chain (CTMC)
Definition : a stochastic process with discrete state space and
continuous time {X(t), t > 0} is a continuous time Markov Chain
(CTMC) iff
P[X(t+s) = j | X(u), 0 ≤ u ≤ s] = P[X(t+s) = j | X(s)], ∀t, ∀s, ∀j
Memoryless property:
In a CTMC, the past history affects the future evolution of the system only through the current state of the system.
Continuous Time Markov Chain (CTMC)
(Example: a queue with Poisson customer arrivals and exponential service times; N(t) = number of customers at time t is a CTMC. Customers arrive and depart.)
Homogeneous CTMC
Definition : A CTMC {X(t), t > 0} is homogeneous iff
P[X(t+s) = j | X(t) = i] = P[X(s) = j | X(0) = i] = pij(s), ∀t
Homogeneous + memoryless:
In reliability terms, "a machine that has not failed by age t is as good as new".
Only homogeneous CTMC will be considered in this chapter.
Characteristics of CTMC
Behavior of a CTMC
(Sample path of X(t): the process stays in a state for a random sojourn time, then jumps to another state.)
Two major components:
• Ti = sojourn time in state i (random variable)
• pij = probability of moving to state j when leaving state i
Sojourn time in a state
• Let Ti be the random variable corresponding to the time spent
in state i
• The memoryless property of the homogeneous CTMC implies
P[Ti > t + x | Ti > t] = P[Ti > x], ∀t, ∀x
• The exponential distribution is the only continuous probability distribution having this property.
In a CTMC, the sojourn time in any state is exponentially distributed.
Exponential distribution
• Let T be a continuous random variable with an exponential distribution of parameter λ
• Distribution function: FT(t) = P{T ≤ t} = 1 − e^(−λt) for t ≥ 0, and 0 for t < 0
• Probability density function: fT(t) = dFT(t)/dt = λ e^(−λt) for t ≥ 0, and 0 for t < 0
• Mean: E[T] = 1/λ
• Standard deviation: σ[T] = 1/λ
• Coefficient of variation: Cv(T) = σ[T]/E[T] = 1
• The parameter λ often corresponds to some event rate (failure rate, repair rate, production rate, ...)
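As a quick numerical illustration (not from the textbook), the following Python sketch draws exponential samples and checks that the mean, standard deviation and coefficient of variation match 1/λ, 1/λ and 1; the value λ = 0.5 is arbitrary and numpy is assumed to be available.

```python
import numpy as np

# Minimal check of the exponential distribution's moments (lambda = 0.5 chosen arbitrarily).
lam = 0.5
rng = np.random.default_rng(0)
samples = rng.exponential(scale=1.0 / lam, size=200_000)  # scale = 1/lambda

mean = samples.mean()
std = samples.std()
print(f"E[T]  ~ {mean:.3f}  (theory: {1/lam:.3f})")
print(f"s[T]  ~ {std:.3f}  (theory: {1/lam:.3f})")
print(f"Cv(T) ~ {std/mean:.3f}  (theory: 1)")
```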
Exponential distribution
• Memoryless property:
P{T ≤ t + s | T > t} = P{t < T ≤ t + s} / P{T > t} = (e^(−λt) − e^(−λ(t+s))) / e^(−λt) = 1 − e^(−λs) = P{T ≤ s}
• For a machine with exponentially distributed lifetime, we say that it is "as good as new" as long as it has not failed.
• The remaining lifetime of a used but UP machine has the same distribution as that of a new machine.
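A minimal simulation sketch of the memoryless property (rate λ and the values t and s are illustrative): the conditional probability P{T ≤ t+s | T > t} estimated from samples should match 1 − e^(−λs) = P{T ≤ s}.

```python
import numpy as np

# Empirical check of the memoryless property: P{T <= t+s | T > t} = P{T <= s}.
lam, t, s = 1.0, 2.0, 0.7          # illustrative values
rng = np.random.default_rng(1)
T = rng.exponential(scale=1.0 / lam, size=1_000_000)

cond = T[T > t]                     # samples that "survived" past age t
lhs = np.mean(cond <= t + s)        # P{T <= t+s | T > t}
rhs = 1.0 - np.exp(-lam * s)        # P{T <= s}
print(f"conditional: {lhs:.4f}   unconditional: {rhs:.4f}")
```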
Transition probability
When a CTMC leaves state i, it jumps to state j with probability pij. This probability is:
• independent of time, as the CTMC is homogeneous
• independent of the sojourn time Ti, as the process is Markovian (memoryless)
1st characterization of a CTMC
A CTMC is fully characterized by the following parameters:
• {μi}, i ∈ E, where μi is the parameter of the exponential distribution of the sojourn time Ti
• {pij}, i ≠ j, where pij is the transition probability from i to j when leaving state i
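This first characterization suggests a direct way to simulate a CTMC: draw an EXP(μi) sojourn time, then pick the next state according to pij. The sketch below uses an arbitrary 3-state example (rates and probabilities are illustrative, not from the slides).

```python
import numpy as np

# Simulate a CTMC from its first characterization: sojourn rates mu_i and jump probabilities p_ij.
# The 3-state chain below is an arbitrary illustration.
mu = np.array([1.0, 2.0, 0.5])            # sojourn-time parameters mu_i
P = np.array([[0.0, 0.7, 0.3],            # p_ij: jump probabilities of the embedded DTMC
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])

rng = np.random.default_rng(2)
state, t = 0, 0.0
for _ in range(10):
    sojourn = rng.exponential(scale=1.0 / mu[state])   # T_i ~ EXP(mu_i)
    t += sojourn
    nxt = rng.choice(3, p=P[state])                    # jump to j with probability p_ij
    print(f"t = {t:7.3f}: {state} -> {nxt}")
    state = nxt
```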
Classification of a CTMC
With each CTMC is associated an underlying DTMC, obtained by neglecting the sojourn times.
A state i of a CTMC is said to be transient (resp. recurrent, absorbing) if it is transient (resp. recurrent, absorbing) in the underlying DTMC.
A CTMC is irreducible if its underlying DTMC is irreducible.
Remark: the concept of periodicity is not relevant for CTMC.
2nd characterization of a CTMC
Each state activates several potential events leading to different transitions.
The CTMC moves from state i to state j after a time Tij, an exponentially distributed random variable with parameter μij.
μij is called the transition rate from i to j.
Equivalence of the two representations
Let
• Ti = MINj{Tij}
• pij = P{Tij = Ti}
Result to prove: Ti = EXP(Σj μij), pij = μij / Σk μik, and the choice of the next state is independent of Ti.
Tool: the moment generating function MX(u) = E[exp(uX)].
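The result can also be checked numerically: with illustrative rates μij, the minimum of the Tij should be exponential with rate Σj μij, and transition (i, j) should "win" with probability μij / Σk μik. A small simulation sketch:

```python
import numpy as np

# Check the equivalence of the two representations for one state i with
# illustrative transition rates mu_ij to three other states.
mu_ij = np.array([0.5, 1.5, 2.0])
rng = np.random.default_rng(3)

T = rng.exponential(scale=1.0 / mu_ij, size=(500_000, 3))  # T_ij ~ EXP(mu_ij)
Ti = T.min(axis=1)                                         # sojourn time T_i = min_j T_ij
winner = T.argmin(axis=1)                                  # which transition fires first

print("E[T_i] ~", Ti.mean(), " theory:", 1.0 / mu_ij.sum())          # EXP(sum mu_ij)
print("p_ij   ~", np.bincount(winner) / len(winner),
      " theory:", mu_ij / mu_ij.sum())
```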
Performance analysis of CTMC
Probability distribution
• State probability:
πi(t) = P{X(t) = i}
• State probability vector, also called the probability distribution:
π(t) = (π1(t), π2(t), ...)
Transient analysis
By conditioning on X(t),
πj(t + s) = Σi πi(t) pij(s)
with pij(s) = P[X(t+s) = j | X(t) = i].
Transient analysis
It can be shown that
πj(t + dt) = πj(t) (1 − Σk≠j μjk dt) + Σi≠j πi(t) μij dt + o(dt).
Letting dt go to 0,
dπj(t)/dt = −πj(t) Σk≠j μjk + Σi≠j πi(t) μij.
Infinitesimal generator
• Let qij = μij for i ≠ j, and qii = −Σj≠i μij
• The matrix Q = [qij] is called the infinitesimal generator of the CTMC
• As a result,
dπ(t)/dt = π(t) Q
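Since dπ(t)/dt = π(t) Q is a linear ODE, the transient distribution can be computed as π(t) = π(0) exp(Qt) with a matrix exponential. The sketch below uses scipy.linalg.expm on an arbitrary 2-state generator (the rates are illustrative).

```python
import numpy as np
from scipy.linalg import expm

# Transient analysis: dpi/dt = pi Q  =>  pi(t) = pi(0) expm(Q t).
# Illustrative 2-state generator (rates are arbitrary).
Q = np.array([[-0.1,  0.1],
              [ 0.5, -0.5]])
pi0 = np.array([1.0, 0.0])      # start in state 0

for t in (1.0, 5.0, 50.0):
    pit = pi0 @ expm(Q * t)
    print(f"pi({t:5.1f}) = {pit}")
```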
Steady state distribution of a CTMC
Theorem: For an irreducible CTMC with positive recurrent states, the probability distribution converges to a vector of stationary probabilities (π1, π2, ...) that is independent of the initial distribution π(0). Further, it is the unique solution of the following equation system:
Σi πi = 1  (normalization equation)
Σj≠i πj μji = πi Σj≠i μij, ∀i  (flow balance equations)
or, equivalently,
π Q = 0  (equilibrium equations)
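Numerically, the stationary distribution can be obtained by solving π Q = 0 together with the normalization equation, for instance by replacing one balance equation with Σi πi = 1. A sketch with an arbitrary 2-state generator:

```python
import numpy as np

# Steady state: pi Q = 0 and sum(pi) = 1.
# Replace one balance equation by the normalization equation and solve the linear system.
Q = np.array([[-0.1,  0.1],       # illustrative generator (not from the slides)
              [ 0.5, -0.5]])

A = Q.T.copy()
A[-1, :] = 1.0                    # last equation becomes sum_i pi_i = 1
b = np.zeros(Q.shape[0]); b[-1] = 1.0
pi = np.linalg.solve(A, b)
print("pi =", pi)                 # here: [5/6, 1/6]
```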
Flow balance equation
• The balance equation for state i is equivalent to: Σj≠i πj μji = Σj≠i πi μij
• Associate to each transition (i, j) a probability flow: πi μij
• Σj≠i πj μji : total flow into state i
• Σj≠i πi μij : total flow out of state i
• Interpretation: total flow in = total flow out
Flow balance equation of set of states
• Let E1 be a subset of states
• Flow balance equation :
Total flow into E1 = Total flow out of E1
A manufacturing system
• Consider a machine which can be either UP or DOWN.
• The state of the machine is checked continuously.
• The average time to failure of an UP machine is 10 days.
• The average time for repair of a DOWN machine is 1.5 days.
Questions:
• Determine the conditions for the state of the machine {X(t)} to be a Markov chain.
• Draw the Markov chain model.
• Find the transient distribution starting from state UP and from state DOWN.
• Check whether the Markov chain is recurrent.
• Determine the steady-state distribution.
• Determine the availability of the machine.
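One way to answer the numerical questions, assuming the UP and DOWN times are exponentially distributed (failure rate λ = 1/10 per day, repair rate μ = 1/1.5 per day), is sketched below; the steady-state availability is μ/(λ+μ) = 10/11.5 ≈ 0.87.

```python
import numpy as np
from scipy.linalg import expm

# Two-state machine (UP = 0, DOWN = 1), assuming exponential UP and DOWN times.
lam = 1.0 / 10.0      # failure rate (mean time to failure = 10 days)
mu = 1.0 / 1.5        # repair rate  (mean time to repair  = 1.5 days)
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])

# Transient distribution starting from UP.
for t in (1.0, 5.0, 30.0):
    print(f"pi({t:4.1f}) =", np.array([1.0, 0.0]) @ expm(Q * t))

# Steady state and availability.
A = Q.T.copy(); A[-1, :] = 1.0
pi = np.linalg.solve(A, np.array([0.0, 1.0]))
print("steady state:", pi, " availability:", pi[0])   # mu/(lam+mu) = 10/11.5 ~ 0.8696
```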
Poisson process
Poisson process
A Poisson process is a stochastic process N(t) such that
• N(0) = 0
• N(t) increments by +1 after a random time T distributed according to an exponential distribution of parameter λ.
An arrival process is said to be Poisson if the inter-arrival times are independent and exponentially distributed.
Properties of Poisson process
A Poisson process is a CTMC (a pure birth process).
N(t) has a Poisson distribution with parameter λt:
P{N(t) = n} = e^(−λt) (λt)^n / n!,  n = 0, 1, 2, ...
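A simulation sketch of this property (the rate λ and horizon t are arbitrary): generating exponential inter-arrival times and counting arrivals up to time t should reproduce the Poisson(λt) probabilities.

```python
import numpy as np
from math import exp, factorial

# Check that N(t) ~ Poisson(lambda * t) by simulating exponential inter-arrival times.
lam, t = 2.0, 3.0                  # illustrative rate and horizon
rng = np.random.default_rng(4)

counts = []
for _ in range(100_000):
    s, n = 0.0, 0
    while True:
        s += rng.exponential(scale=1.0 / lam)   # next inter-arrival time
        if s > t:
            break
        n += 1
    counts.append(n)

counts = np.array(counts)
for k in range(4, 9):
    emp = np.mean(counts == k)
    theo = exp(-lam * t) * (lam * t) ** k / factorial(k)
    print(f"P{{N(t)={k}}}: simulated {emp:.4f}, Poisson {theo:.4f}")
```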
Properties of Poisson process
A Poisson process is a CTMC (a pure birth process):
• Probability of exactly one arrival in dt:
P{N(t+dt) = k+1 | N(t) = k} = λ dt + o(dt)
• Probability of no arrival in dt:
P{N(t+dt) = k | N(t) = k} = 1 − λ dt + o(dt)
• Probability of more than one arrival in dt:
P{N(t+dt) > k+1 | N(t) = k} = o(dt)
Properties of Poisson process
The superposition of n Poisson processes with parameters λi is a Poisson process with parameter Σ λi.
Assume that a Poisson process of parameter λ is split into n processes with probabilities pi. These n processes are independent Poisson processes with parameters λ pi.
Birth-Death process
Definition
• Consider a population of individuals.
• Let N(t) be the size of the population, with N(t) = 0, 1, 2, ...
• When N(t) = n, births occur according to a Poisson process with birth rate λn > 0.
• Deaths also occur according to a Poisson process with death rate μn > 0.
Key issues
• Graphical representation of the Markov chain
• Relation with the Poisson process (also called pure birth process)
• Condition for existence of a steady-state distribution:
S = Σn≥1 (λ0 λ1 ... λn−1) / (μ1 μ2 ... μn) < ∞
• Sufficient condition (death rates eventually dominate birth rates):
λn / μn+1 ≤ ρ < 1, ∀n ≥ n* (for some ρ < 1 and some n*)
• Steady-state distribution πn
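For reference, the standard solution is πn = π0 (λ0 λ1 ... λn−1)/(μ1 μ2 ... μn) with π0 = 1/(1 + S). The sketch below evaluates it for constant rates λn = λ and μn = μ (an M/M/1-type chain), where it reduces to the geometric distribution πn = (1 − ρ) ρ^n with ρ = λ/μ; the truncation level N is only for the numerical computation.

```python
import numpy as np

# Birth-death steady state: pi_n = pi_0 * (lambda_0...lambda_{n-1}) / (mu_1...mu_n).
# Illustration with constant rates lambda_n = 1, mu_n = 2 (so pi_n = (1 - rho) rho^n, rho = 1/2);
# the state space is truncated at N for the numerical computation.
N = 50
lam = np.full(N, 1.0)             # birth rates lambda_0 .. lambda_{N-1}
mu = np.full(N, 2.0)              # death rates  mu_1 .. mu_N

ratios = np.concatenate(([1.0], np.cumprod(lam / mu)))   # products for n = 0..N
pi = ratios / ratios.sum()

rho = 1.0 / 2.0
print("pi_0..pi_4 :", pi[:5])
print("geometric  :", [(1 - rho) * rho**n for n in range(5)])
```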
Approximation of general distributions by phase-type distributions
Phase-type distribution
A probability distribution that results from a system of one or more inter-related Poisson processes occurring in sequence, or phases. The sequence in which the phases occur may itself be a stochastic process.
Phase-type distribution = distribution of the time until absorption of a CTMC with one absorbing state. Each of the states of the Markov process represents one of the phases.
Phase-type distributions can be used to approximate any positive-valued distribution.
Definition
• A CTMC with m+1 states, where m ≥ 1, such that states 1, ..., m are transient and state m+1 is absorbing.
• An initial probability of starting in any of the m+1 phases, given by the probability vector (α, αm+1).
The continuous phase-type distribution is the distribution of the time from the start of the above process until absorption in the absorbing state.
This process can be written in the form of a transition rate matrix
Q = | S   S0 |
    | 0    0 |
where S is an m×m matrix and S0 = −S 1, with 1 an m×1 vector whose elements are all equal to 1.
Characterization
The time X until absorption is phase-type distributed, denoted PH(α, S).
The distribution function of X is given by
F(x) = 1 − α exp(Sx) 1,
and the density function by
f(x) = α exp(Sx) S0,
for all x > 0, where exp(Sx) denotes the matrix exponential.
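These formulas can be evaluated directly with a matrix exponential. The sketch below uses scipy.linalg.expm on an arbitrary 2-phase example (α and S are illustrative, not taken from the slides).

```python
import numpy as np
from scipy.linalg import expm

# Evaluate the PH(alpha, S) distribution and density functions for an illustrative
# 2-phase example (alpha and S are arbitrary).
alpha = np.array([0.7, 0.3])
S = np.array([[-3.0,  1.0],
              [ 0.0, -2.0]])
ones = np.ones(2)
S0 = -S @ ones                     # exit-rate vector S0 = -S 1

for x in (0.5, 1.0, 2.0):
    E = expm(S * x)
    F = 1.0 - alpha @ E @ ones     # F(x) = 1 - alpha exp(Sx) 1
    f = alpha @ E @ S0             # f(x) = alpha exp(Sx) S0
    print(f"x = {x}: F(x) = {F:.4f}, f(x) = {f:.4f}")
```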
Erlang distribution
Ek : k-stage Erlang distribution with parameter μ
X = sum of k independent random variables with exponential distribution of parameter μ
E[X] = k/μ
Var[X] = k/μ²
CX = σX / E[X] = 1/√k
(Diagram: k exponential phases with rate μ in series: μ → μ → ··· → μ)
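A quick sampling check of these moments (k and μ chosen arbitrarily): summing k independent EXP(μ) variables should give mean k/μ, variance k/μ² and coefficient of variation 1/√k.

```python
import numpy as np

# Erlang E_k as a sum of k independent EXP(mu) phases (k and mu chosen for illustration).
k, mu = 4, 2.0
rng = np.random.default_rng(5)
X = rng.exponential(scale=1.0 / mu, size=(200_000, k)).sum(axis=1)

print("E[X]  ~", X.mean(), " theory:", k / mu)
print("Var[X]~", X.var(),  " theory:", k / mu**2)
print("CV    ~", X.std() / X.mean(), " theory:", 1 / np.sqrt(k))
```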
Hyper-exponential or mixture of exponential distribution
X = Xi with probability ai (a mixture, often written X = a1X1 + a2X2 + ... + anXn)
where
• a1 + a2 + ... + an = 1,
• Xi = EXP(μi)
E[X] = a1/μ1 + a2/μ2 + ... + an/μn
Var[X] = 2(a1/μ1² + a2/μ2² + ... + an/μn²) − (E[X])²
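Since the hyper-exponential is a probabilistic mixture, its second moment is Σ 2ai/μi², which gives the variance formula above. The sketch below samples the mixture with arbitrary parameters and compares the empirical mean and variance with these expressions.

```python
import numpy as np

# Hyper-exponential: X = X_i with probability a_i, X_i ~ EXP(mu_i) (illustrative parameters).
a = np.array([0.3, 0.7])
mu = np.array([0.5, 3.0])
rng = np.random.default_rng(6)

idx = rng.choice(len(a), p=a, size=500_000)          # pick a branch with probability a_i
X = rng.exponential(scale=1.0 / mu[idx])             # then draw EXP(mu_i)

mean = (a / mu).sum()
var = 2.0 * (a / mu**2).sum() - mean**2
print("E[X]  ~", X.mean(), " theory:", mean)
print("Var[X]~", X.var(),  " theory:", var)
```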
Coxian distribution
The Coxian distribution can be used to approximate any distribution.
(Diagram: n exponential phases in series with rates μ1, μ2, ..., μn; after phase i the process continues to phase i+1 with probability pi and exits with probability 1 − pi; after phase n it exits with probability 1.)
A manufacturing system
• Consider a machine which can be either UP or DOWN.
• The state of the machine is checked continuously.
• The average time to failure of an UP machine is 10 days.
• The average time for repair of a DOWN machine is 1.5 days.
• Assume that the UP time is E2 and the DOWN time is E3 (Erlang distributions).
• Draw the Markov chain model.
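A possible numerical companion to this exercise (assuming the E2 UP time is split into 2 phases of rate 2/10 each, and the E3 DOWN time into 3 phases of rate 3/1.5 = 2 each): the 5-state generator below encodes that phase model, and the availability, i.e. the total probability of the UP phases, again equals 10/11.5 ≈ 0.87 since it depends only on the mean UP and DOWN times.

```python
import numpy as np

# Machine with E2 UP time (mean 10 days) and E3 DOWN time (mean 1.5 days).
# States 0,1 = UP phases (rate 2/10 each); states 2,3,4 = DOWN phases (rate 3/1.5 each).
lu = 2.0 / 10.0     # rate of each UP phase
ld = 3.0 / 1.5      # rate of each DOWN phase
Q = np.array([
    [-lu,  lu,  0.0, 0.0, 0.0],
    [0.0, -lu,  lu,  0.0, 0.0],
    [0.0, 0.0, -ld,  ld,  0.0],
    [0.0, 0.0, 0.0, -ld,  ld ],
    [ ld, 0.0, 0.0, 0.0, -ld ],
])

# Steady state: replace one balance equation by the normalization equation.
A = Q.T.copy(); A[-1, :] = 1.0
b = np.zeros(5); b[-1] = 1.0
pi = np.linalg.solve(A, b)
print("steady state:", pi)
print("availability:", pi[:2].sum())   # ~ 10 / 11.5 ~ 0.8696
```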