Peer to Peer Networks
Lecture notes – 4
Rehearsal and introduction

Discrete Markov chain: p – transition probability. Equilibrium: π = π·P.

Continuous Markov chain: λ – transition rate. Equilibrium: π·Q = 0.

[Figure: two three-state chain diagrams, one for the discrete chain (labeled with transition probabilities p) and one for the continuous chain (labeled with transition rates λ).]
In this course we remain in the Markovian space. From now on we'll focus mostly on continuous Markov chains.
λ is the transition rate, not the transition probability, and p ≈ λ·Δt.
We defined Q to be the transition rate matrix, such that π·Q = 0.
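As a concrete illustration of the equilibrium condition π·Q = 0, here is a minimal sketch that solves it numerically. The rate matrix below is an arbitrary 2-state example, not taken from the notes:

```python
import numpy as np

# Hypothetical 2-state continuous Markov chain: Q is the transition rate
# matrix (rows sum to 0, off-diagonal entries are transition rates).
Q = np.array([[-3.0,  3.0],    # leave state 0 at rate 3
              [ 1.0, -1.0]])   # leave state 1 at rate 1

# Solve pi Q = 0 together with sum(pi) = 1 as one least-squares system.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # equilibrium distribution, ~ [0.25, 0.75]
```

The normalization row is needed because π·Q = 0 alone only determines π up to a scalar.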
Birth-Death Chains:

[Figure: a chain of states 0, 1, 2, …, k, …, where birth rates λ_0, λ_1, …, λ_k take state k to k+1 and death rates µ_1, µ_2, …, µ_{k+1} take state k+1 back to k.]
The assumptions we need for a birth-death process are that it is a homogeneous Markov chain X(t) in which births and deaths are independent, and:

B1: Pr[exactly 1 birth in (t, t+Δt) | k in population] = λ_k·Δt + o(Δt)
D1: Pr[exactly 1 death in (t, t+Δt) | k in population] = µ_k·Δt + o(Δt)
B2: Pr[exactly 0 births in (t, t+Δt) | k in population] = 1 − λ_k·Δt + o(Δt)
D2: Pr[exactly 0 deaths in (t, t+Δt) | k in population] = 1 − µ_k·Δt + o(Δt)

where o(Δt) denotes a term that converges to 0 'faster' than Δt.
Look at the time axis at t+Δt. If we want to be at E_k at time t+Δt, what are all the possible states to be in at time t?
[Figure: states on the vertical axis, time on the horizontal axis. From time t, state E_{k−1} reaches E_k at t+Δt via a birth (B1), state E_{k+1} reaches E_k via a death (D1), and state E_k can simply stay.]
Note: when Δt is infinitesimally small, at most one death/birth can happen in (t, t+Δt).
Denote by P_k(t+Δt) the probability to be at state k at time t+Δt:

P_k(t+Δt) = P_k(t)·p_kk(Δt)           [stay]
          + P_{k−1}(t)·p_{k−1,k}(Δt)  [birth]
          + P_{k+1}(t)·p_{k+1,k}(Δt)  [death]
          + o(Δt)
The above equation is for any k>0. If we expand the terms we get:

P_k(t+Δt) = P_k(t)·[1 − λ_k·Δt − µ_k·Δt + o(Δt)]
          + P_{k−1}(t)·[λ_{k−1}·Δt + o(Δt)]
          + P_{k+1}(t)·[µ_{k+1}·Δt + o(Δt)]
Rearranging and dividing by Δt results in:

[P_k(t+Δt) − P_k(t)] / Δt = −(λ_k + µ_k)·P_k(t) + λ_{k−1}·P_{k−1}(t) + µ_{k+1}·P_{k+1}(t) + o(Δt)/Δt
Taking the limit Δt → 0 on both sides results in:

dP_k(t)/dt = −(λ_k + µ_k)·P_k(t) + λ_{k−1}·P_{k−1}(t) + µ_{k+1}·P_{k+1}(t)
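The forward equations above can be integrated numerically. Here is a minimal Euler sketch (not from the notes): the rates λ=2, µ=3, the truncation level K and the step size are all arbitrary example choices.

```python
# Euler integration of dP_k/dt = -(lam_k + mu_k) P_k
#                               + lam_{k-1} P_{k-1} + mu_{k+1} P_{k+1},
# with constant rates, truncated at K states.
lam, mu, K = 2.0, 3.0, 40      # assumed example rates; chain truncated at K
dt, steps = 1e-3, 2000         # integrate up to t = 2.0
P = [1.0] + [0.0] * (K - 1)    # start in state 0 with probability 1

for _ in range(steps):
    dP = []
    for k in range(K):
        # no birth out of the last (truncation) state, no death out of state 0
        out_rate = (lam if k < K - 1 else 0.0) + (mu if k > 0 else 0.0)
        flow = -out_rate * P[k]
        if k > 0:
            flow += lam * P[k - 1]   # births arriving from state k-1
        if k + 1 < K:
            flow += mu * P[k + 1]    # deaths arriving from state k+1
        dP.append(flow)
    P = [p + dt * d for p, d in zip(P, dP)]

print(sum(P))  # probability is conserved: stays ~1
```

Because every 'outgoing' term at one state is an 'incoming' term at a neighbor, the sum of all dP_k is zero, which is exactly the incoming-minus-outgoing picture described next.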
Note: when k=0, we need to zero the k−1 terms in the above equation (there is no state −1).
The above describes the growth rate of the probability of being at state k.
We see that the growth rate of the probability of being at any state is the sum of all 'incoming' rates minus the sum of all 'outgoing' rates.
We can 'wrap' any number of states, find out what the incoming and outgoing rates are, and retrieve the probability of being at those states.
[Figure: the same birth-death chain, with a 'wrap' drawn around a group of states to illustrate the incoming and outgoing rates across its boundary.]
Birth Chains:

Let's examine a simpler process where ∀k: µ_k = 0, meaning a birth-only process:

dP_k(t)/dt = −λ_k·P_k(t) + λ_{k−1}·P_{k−1}(t)

We can make the process even simpler by assuming ∀k: λ_k = λ, meaning the rates are constant. We now get:

dP_0(t)/dt = −λ·P_0(t)
dP_k(t)/dt = −λ·(P_k(t) − P_{k−1}(t))
P_k(0) = 1 if k = 0, 0 otherwise
P_0(t) = e^{−λt}

which is the simple solution of the first differential equation above.
From that we can deduce P_1(t):

dP_1(t)/dt = −λ·P_1(t) + λ·P_0(t)  ⟹  P_1(t) = λt·e^{−λt}
The general formula for P_k(t) would then be:

P_k(t) = (λt)^k / k! · e^{−λt}

This is the Poisson distribution: the probability of k arrivals during an interval of length t.
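The Poisson formula can be checked against a simulation of the constant-rate birth process. A sketch (not from the notes), with an arbitrary rate λ and interval t, counting arrivals generated by exponential gaps:

```python
import math, random

# Simulate the pure-birth process: exponential interarrival gaps at rate lam,
# count arrivals in (0, t), and compare with P_k(t) = (lam*t)^k/k! * e^{-lam*t}.
random.seed(1)
lam, t, trials = 1.5, 2.0, 200_000   # assumed example parameters
counts = {}
for _ in range(trials):
    s, k = 0.0, 0
    while True:
        s += random.expovariate(lam)   # time until the next birth
        if s > t:
            break
        k += 1
    counts[k] = counts.get(k, 0) + 1

for k in range(5):
    empirical = counts.get(k, 0) / trials
    theory = (lam * t) ** k / math.factorial(k) * math.exp(-lam * t)
    print(k, round(empirical, 4), round(theory, 4))
```

With λt = 3 the empirical frequencies should track the Poisson(3) probabilities closely.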
Mean value E[K]:

Consider the random variable K: the number of arrivals in an interval of length t (what used to be α(t)). K's mean value:

E[K] = Σ_{k=0}^∞ k·(λt)^k/k!·e^{−λt} = λt·Σ_{k=1}^∞ (λt)^{k−1}/(k−1)!·e^{−λt} = λt

Remember that the following holds: Σ_{i=0}^∞ (λt)^i/i!·e^{−λt} = 1. It's easy to understand why if we think of each term i as the probability of exactly i arrivals in the interval: the probabilities over all possible arrival counts must sum to 1.

Therefore, for every t:

E[K] = λt, and the variance: σ_K² = λt
Moment generating function G(z):

Let's examine the z-transform of the Poisson probability function.

Define: g_k = Pr[K = k]

G(z) = Σ_k g_k·z^k = E[z^K] = Σ_k z^k·P_k(t) = Σ_k (λt)^k·z^k/k!·e^{−λt} = e^{−λt}·e^{λtz} = e^{λt(z−1)}

Finally resulting in G(z), P_k's generating function:

G(z) = E[z^K] = e^{λt(z−1)}

By differentiating m times and substituting z = 1, we can calculate the m-th moment of K, specifically its mean and variance.
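A quick numeric check of the closed form (not from the notes; λ, t and z are arbitrary choices): summing the z-transform of the Poisson pmf term by term should match e^{λt(z−1)}, and a numeric derivative G′(1) should recover E[K] = λt.

```python
import math

# Term-by-term z-transform of the Poisson pmf vs. the closed form.
lam, t, z = 2.0, 1.5, 0.7
series, term = 0.0, math.exp(-lam * t)    # term for k = 0
for k in range(100):
    series += term
    term *= lam * t * z / (k + 1)         # ratio of consecutive terms
closed = math.exp(lam * t * (z - 1))
print(series, closed)

# E[K] = G'(1), approximated by a central difference.
h = 1e-6
G = lambda w: math.exp(lam * t * (w - 1))
mean = (G(1 + h) - G(1 - h)) / (2 * h)
print(mean)  # ~ lam * t = 3.0
```

The incremental-term trick avoids computing large factorials directly.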
Interarrival-times distribution

We'd like to break (0,t) into 2k+1 time frames as shown in the figure below:

[Figure: the time axis from 0 to t, divided into alternating segments α_1, β_1, α_2, β_2, …, β_k, α_{k+1}.]
Consider the joint distribution of arrivals when it's known beforehand that exactly k arrivals have occurred during (0,t).
Define: A_k – [exactly 1 arrival in each of the β_i's and no arrivals in any of the α_i's].
We'd like to calculate the probability of A_k given that exactly k arrivals occurred in (0,t):

Pr[A_k | exactly k arrivals in (0,t)] = Pr[A_k and exactly k arrivals in (0,t)] / Pr[exactly k arrivals in (0,t)]
Note that the α and β events are independent, thus the probability can be calculated as a product of the individual probabilities:

Pr[1 arrival in interval of length β_i] = λβ_i·e^{−λβ_i}

and

Pr[no arrival in interval of length α_i] = e^{−λα_i}
Giving:

Pr[A_k | exactly k arrivals in (0,t)]
  = (λβ_1·λβ_2···λβ_k · e^{−λβ_1}·e^{−λβ_2}···e^{−λβ_k})·(e^{−λα_1}·e^{−λα_2}···e^{−λα_{k+1}}) / ([(λt)^k/k!]·e^{−λt})
  =* k!·β_1·β_2···β_k / t^k

SAME AS IF EACH OF THE POINTS WAS PLACED UNIFORMLY ON (0,t)!!!

*The last equality holds since t = α_1 + α_2 + ⋯ + α_{k+1} + β_1 + β_2 + ⋯ + β_k.
Note that it doesn’t matter if the points on the time axis were picked randomly with uniform
distribution, or picked in advance like in the above analysis.
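The uniform-placement property can be checked by simulation. A sketch (not from the notes; λ, t and the run count are arbitrary): conditioned on exactly 2 arrivals in (0,t), the arrival epochs should behave like 2 sorted uniform points, whose order-statistic means are t/3 and 2t/3.

```python
import random

# Simulate a Poisson process on (0, t); keep only runs with exactly 2 arrivals
# and compare the conditional arrival epochs with sorted-uniform statistics.
random.seed(7)
lam, t, runs = 1.0, 3.0, 300_000
firsts, seconds = [], []
for _ in range(runs):
    times, s = [], 0.0
    while True:
        s += random.expovariate(lam)
        if s > t:
            break
        times.append(s)
    if len(times) == 2:
        firsts.append(times[0])
        seconds.append(times[1])

print(sum(firsts) / len(firsts))    # ~ t/3 = 1.0
print(sum(seconds) / len(seconds))  # ~ 2t/3 = 2.0
```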
Furthermore, it is easy to prove that the interval's starting time doesn't matter, only its length:

Pr[X(s, s+t) = k] = (λt)^k / k! · e^{−λt}
Interarrival time t̃ – exponential distribution:

Define t̃ – the time between adjacent arrivals.

[Figure: two adjacent arrivals on the time axis, separated by the interval t̃.]

Accordingly, define the CDF and pdf: A(t) = Pr[t̃ ≤ t] and a(t) = A′(t).

Calculating A(t) and a(t) is easy:

A(t) = 1 − Pr[t̃ > t] = 1 − P_0(t) = 1 − e^{−λt}
a(t) = A′(t) = λ·e^{−λt}

These are the CDF and pdf of the exponential distribution, concluding:

Poisson arrival process implies interarrival times with exponential distribution!
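This implication can be demonstrated the other way around using the uniform-placement property above. A sketch (not from the notes; λ and T are arbitrary): scatter a Poisson number of points uniformly on (0,T) and check that the gaps follow A(t) = 1 − e^{−λt}.

```python
import numpy as np

# Build a Poisson process by uniform placement, then test the gap distribution.
rng = np.random.default_rng(0)
lam, T = 2.0, 100_000.0
n = rng.poisson(lam * T)               # number of arrivals in (0, T)
times = np.sort(rng.uniform(0, T, n))  # uniform placement of the arrivals
gaps = np.diff(times)                  # interarrival times

for x in (0.2, 0.5, 1.0):
    print(x, float(np.mean(gaps <= x)), float(1 - np.exp(-lam * x)))
```

The empirical CDF of the gaps should match the exponential CDF at every checkpoint.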
Specifically, this is a memoryless process, whose properties were discussed at length in lecture 2:

Pr[t̃ ≤ t + t_0 | t̃ > t_0] = Pr[t_0 < t̃ ≤ t + t_0] / Pr[t̃ > t_0]
                            = (Pr[t̃ ≤ t + t_0] − Pr[t̃ ≤ t_0]) / Pr[t̃ > t_0]
                            = (1 − e^{−λ(t+t_0)} − (1 − e^{−λt_0})) / (1 − (1 − e^{−λt_0}))
                            = 1 − e^{−λt}
                            = Pr[t̃ ≤ t]
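A quick numeric check of memorylessness (a sketch, not from the notes; λ, t_0 and t are arbitrary): the conditional survival probability Pr[t̃ > t_0 + t | t̃ > t_0] should equal Pr[t̃ > t] = e^{−λt}.

```python
import math, random

# Sample Exp(lam) and compare the conditional tail with the unconditional one.
random.seed(5)
lam, t0, t, n = 1.0, 0.8, 1.2, 400_000
samples = [random.expovariate(lam) for _ in range(n)]
survivors = [x for x in samples if x > t0]                  # condition on t~ > t0
cond = sum(x > t0 + t for x in survivors) / len(survivors)  # Pr[t~ > t0+t | t~ > t0]
print(cond, math.exp(-lam * t))  # both ~ Pr[t~ > t]
```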
Mean value of t̃:

E[t̃] = ∫_0^∞ t·a(t) dt = ∫_0^∞ t·λe^{−λt} dt = [−t·e^{−λt}]_0^∞ + ∫_0^∞ e^{−λt} dt = 1/λ

E[t̃] = 1/λ, and the variance: σ² = 1/λ²
λ – a constant birth rate:

Let's finally examine the rate in an infinitesimal period Δt. Recall that:

Pr[t̃ ≤ t_0 + Δt | t̃ > t_0] = 1 − e^{−λΔt} = 1 − [1 − λΔt + (λΔt)²/2! − …] = λΔt + o(Δt)

Similarly (using Taylor):

Pr[t̃ > t_0 + Δt | t̃ > t_0] = e^{−λΔt} = 1 − λΔt + o(Δt)

Moreover, the probability of two or more arrivals in (t_0, t_0+Δt) is o(Δt):

Pr[more than 1 arrival in (t_0, t_0+Δt)] = 1 − Pr[0 arrivals in (t_0, t_0+Δt)] − Pr[1 arrival in (t_0, t_0+Δt)]
                                         = 1 − [1 − λΔt + o(Δt)] − [λΔt + o(Δt)]
                                         = o(Δt)

And so:

Exponentially distributed interarrival times imply a constant birth rate λ.
Summary:

We can summarize by concluding the relations between the Poisson arrival process, the exponential interarrival times and the constant birth rate, as depicted in the following figure. Each circle implies the others!

[Figure: three circles – 'Poisson arrival process', 'Exponential interarrival times', 'Constant birth rate' – each implying the others.]
For further usage we'll calculate A(t)'s Laplace transform:

A*(s) = ∫_0^∞ e^{−st}·a(t) dt = λ / (s + λ)
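The transform can be verified by numeric integration. A sketch (not from the notes; λ, s, the step and the truncation point are arbitrary):

```python
import math

# Numerically integrate e^{-s t} * a(t) with a(t) = lam * e^{-lam t}
# (left Riemann sum, truncated at tmax) and compare with lam / (s + lam).
lam, s = 2.0, 1.5
dt, tmax = 1e-4, 30.0
integral = sum(math.exp(-s * (i * dt)) * lam * math.exp(-lam * (i * dt)) * dt
               for i in range(int(tmax / dt)))
print(integral, lam / (s + lam))  # both ~ 0.5714
```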
M/M/1 Birth-Death process:

Consider a model where λ_k = λ and µ_k = µ – a Poisson arrival process and exponentially distributed service time:

A(t) = 1 − e^{−λt}
B(t) = 1 − e^{−µt}

Note that the service rate is the same as the death rate, since when finishing service the customer is gone.

[Figure: the birth-death chain of states 0, 1, 2, …, k, …, with all birth rates equal to λ and all death rates equal to µ.]
From now on, and throughout chapter 3, we're going to talk about the system after reaching equilibrium, where the rate of change is 0. Recall that P_k(t) is the probability to have k 'customers' at time t; the mathematical equivalent is:

dP_k(t)/dt = −(λ + µ)·P_k(t) + λ·P_{k−1}(t) + µ·P_{k+1}(t) = 0

This is the stationary state: t has no meaning, since no changes occur in time anymore. For each state, the 'enter rate' equals the 'exit rate'.
Consider the following diagram, focusing on the dashed cut between states k−1 and k.

[Figure: the M/M/1 chain of states 0, 1, 2, …, k, … with rates λ and µ, and a dashed cut drawn between states k−1 and k.]

Across the cut, the following holds:

P_{k−1}·λ = P_k·µ

and therefore:

P_k = P_{k−1}·(λ/µ)  ⟹  P_k = P_0·(λ/µ)^k

Define ρ = λ/µ, and we get that P_k = P_0·ρ^k.
As always, the following holds:

Σ_{k=0}^∞ P_k = P_0·Σ_{k=0}^∞ ρ^k = 1

Giving:

P_0 = 1 / Σ_{k=0}^∞ ρ^k = 1 − ρ    (for ρ < 1)

Note that P_0 is the probability for the queue to be empty, meaning that 1 − P_0 is the probability for the queue to be busy, which is why ρ is the utilization of the queue/server.

P_k = ρ^k·(1 − ρ)
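A small verification sketch (not from the notes; λ and µ are arbitrary with ρ < 1): the geometric distribution P_k = ρ^k(1 − ρ) satisfies the M/M/1 balance equations and sums to 1.

```python
# Check the M/M/1 stationary distribution P_k = rho^k * (1 - rho).
lam, mu = 2.0, 5.0
rho = lam / mu
P = lambda k: rho ** k * (1 - rho)

# balance equation around state k: (lam + mu) P_k = lam P_{k-1} + mu P_{k+1}
for k in range(1, 10):
    assert abs((lam + mu) * P(k) - (lam * P(k - 1) + mu * P(k + 1))) < 1e-12

total = sum(P(k) for k in range(200))
print(total)  # ~ 1.0 (the geometric tail beyond k=200 is negligible)
```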
Chapter 3: Birth-Death Queueing Systems in Equilibrium

What would change in case λ_k ≠ λ and µ_k ≠ µ?

The equation should be changed to represent the dashed cut between states k−1 and k in the general chain:
P_k = P_{k−1}·(λ_{k−1}/µ_k)

P_k = P_0·∏_{i=0}^{k−1} (λ_i/µ_{i+1})
In some cases, this expression converges to something we can calculate (for example, when special relations between the different λ_i's and µ_i's apply).
Convergence note: the above expression converges iff, starting from some k_0, for every i ≥ k_0: λ_i < µ_{i+1}.
Note: the probabilities converge – the system does not converge to a certain state.
Calculating the queue size's mean value N̄:

N̄ = Σ_{k=0}^∞ k·P_k = Σ_{k=0}^∞ k·ρ^k·(1 − ρ) = (1 − ρ)·ρ·Σ_{k=0}^∞ k·ρ^{k−1}
  = (1 − ρ)·ρ·d/dρ[Σ_{k=0}^∞ ρ^k] = (1 − ρ)·ρ·d/dρ[1/(1 − ρ)] = (1 − ρ)·ρ·1/(1 − ρ)²

N̄ = ρ / (1 − ρ)
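A quick numeric check of the series manipulation (not from the notes; ρ is an arbitrary value below 1):

```python
# The truncated series sum_k k * rho^k * (1 - rho) approaches rho / (1 - rho).
rho = 0.7
N_series = sum(k * rho ** k * (1 - rho) for k in range(2000))
print(N_series, rho / (1 - rho))  # both ~ 2.3333
```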
Recall Little's Law: N̄ = λ·T, so T = N̄/λ:

T = (ρ/λ) / (1 − ρ) = (1/µ) / (1 − ρ)
Note that the average service time is 1/µ and that 1 − ρ is 1 − (utilization); if ρ → 0, the customer stays in the system exactly the average service time, as described in the following graph for M/M/1:

[Figure: T as a function of ρ, starting at 1/µ for ρ = 0 and growing without bound as ρ → 1.]
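The formula T = (1/µ)/(1 − ρ) can be checked with a small M/M/1 simulation. A sketch (not from the notes; λ = 1 and µ = 2, i.e. ρ = 0.5, are arbitrary choices), using the recursion depart_i = max(arrive_i, depart_{i−1}) + service_i:

```python
import random

# Single-server FCFS queue: Poisson arrivals, exponential service.
random.seed(11)
lam, mu, n_jobs = 1.0, 2.0, 200_000

arrive, depart, total = 0.0, 0.0, 0.0
for _ in range(n_jobs):
    arrive += random.expovariate(lam)          # Poisson arrival process
    start = max(arrive, depart)                # wait if the server is busy
    depart = start + random.expovariate(mu)    # exponential service time
    total += depart - arrive                   # time spent in the system

T_sim = total / n_jobs
rho = lam / mu
print(T_sim, (1 / mu) / (1 - rho))  # both ~ 1.0 for rho = 0.5
```

Multiplying T_sim by λ also recovers N̄ = ρ/(1 − ρ), consistent with Little's Law.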
Up to here 3/5/2012