
Stochastics Formula Sheet

Exponential Distribution:
F(x) = P(X ≤ x) = 1 − e^(−λx), x ≥ 0
F̄(x) = P(X > x) = e^(−λx), x ≥ 0
f(x) = λe^(−λx), x ≥ 0
E(X) = 1/λ, E(X^2) = 2/λ^2, Var(X) = 1/λ^2, E(X^n) = n!/λ^n
E(e^(−sX)) = ∫₀^∞ e^(−sx) λe^(−λx) dx = λ/(λ + s), s ≥ 0
Properties of Exponential:
Memoryless: P(X − y > x | X > y) = P(X > x) for x, y ≥ 0; equivalently, P(X > x + y) = P(X > x)P(X > y) for x, y ≥ 0.
For independent X1 ~ exp(λ1) and X2 ~ exp(λ2):
#1: The minimum Z = min{X1, X2} is exponential with rate λ = λ1 + λ2.
#2: P(X1 < X2) = λ1/(λ1 + λ2).
#3: Z = min{X1, X2} is independent of which event is actually the minimum. This implies that E(Z | X1 < X2) = E(Z | X2 < X1) = E(Z) = 1/(λ1 + λ2).
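A quick Monte Carlo sanity check of properties #1–#3 (a minimal sketch, assuming NumPy is available; the rates λ1 = 1, λ2 = 2 and the sample size are illustrative choices, not from the sheet):

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2, n = 1.0, 2.0, 200_000               # illustrative rates and sample size

x1 = rng.exponential(1 / lam1, n)               # X1 ~ exp(lam1); NumPy takes the scale 1/rate
x2 = rng.exponential(1 / lam2, n)               # X2 ~ exp(lam2)
z = np.minimum(x1, x2)                          # Z = min{X1, X2}

print(z.mean(), 1 / (lam1 + lam2))              # #1: E(Z) should be close to 1/(lam1+lam2)
print((x1 < x2).mean(), lam1 / (lam1 + lam2))   # #2: P(X1 < X2)
print(z[x1 < x2].mean(), z[x2 < x1].mean())     # #3: both conditional means are close to E(Z)
```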
Alg for Simulating X ~ exp(λ): Generate U ~ unif(0, 1), then set X = −(1/λ) ln(U).

Alg for Simulating a PP at rate λ up to the nth point: Set t0 = 0 and i = 1. Generate U and set ti = t(i−1) − (1/λ) ln(U). If i < n, set i = i + 1 and generate U again.
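The two algorithms above, written out as a short Python sketch (inverse-transform sampling; the function names are mine, not from the sheet):

```python
import math
import random

def sim_exp(lam: float) -> float:
    """Simulate X ~ exp(lam) via inverse transform: X = -(1/lam) * ln(U)."""
    u = 1.0 - random.random()          # U ~ unif(0, 1], avoids ln(0)
    return -math.log(u) / lam

def sim_pp(lam: float, n: int) -> list:
    """Simulate the first n arrival times of a Poisson process at rate lam."""
    times, t = [], 0.0
    for _ in range(n):
        t += sim_exp(lam)              # t_i = t_{i-1} - (1/lam) * ln(U)
        times.append(t)
    return times

print(sim_pp(lam=2.0, n=5))            # five arrival times of a PP(2)
```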
Series Identities:
Σ_{n=1}^∞ a r^(n−1) = Σ_{n=0}^∞ a r^n = a/(1 − r) if |r| < 1
Σ_{n=0}^∞ a n r^n = ar/(1 − r)^2
Σ_{k=1}^n a r^(k−1) = Σ_{k=0}^{n−1} a r^k = a(1 − r^n)/(1 − r) for r ≠ 1, and = an for r = 1
Poisson distribution PMF: P(X = k) = e^(−λ) λ^k / k!
Poisson process PMF: P(N(t) = k) = e^(−λt) (λt)^k / k!
E[N(t)] = λt, Var[N(t)] = λt
A Poisson process at rate λ is a renewal point process with exp(λ) interarrival times.
Process Properties:
Stationary increments: N(t + s) − N(s) has a distribution that only depends on t (the length of the interval).
Independent increments: for two non-overlapping intervals of time, the increments are independent RVs.
Little's Law: The time average number of customers in a queueing system, ℓ, is equal to the rate at which customers arrive, λ, times the average sojourn time of a customer, w: ℓ = λw.

Partitioning Poisson Processes: If ψ ~ PP(λ) and each arrival is independently type 1 or type 2 with probability p and q = 1 − p, then ψ1 ~ PP(pλ) and ψ2 ~ PP(qλ), and they are independent.

Superposition of Independent Poisson Processes: If ψ1 ~ PP(λ1) and ψ2 ~ PP(λ2), then we can superpose them to obtain ψ = ψ1 + ψ2 at rate λ = λ1 + λ2. The partitioning probability is p = λ1/(λ1 + λ2).
Uniform CDF: For X ~ unif(a, b),
F(x) = P(X ≤ x) = 0 for x < a, (x − a)/(b − a) for a ≤ x ≤ b, 1 for x > b.
F̄(x) = P(X > x) = 1 for x < a, (b − x)/(b − a) for a ≤ x ≤ b, 0 for x > b.

Order Statistics: If we condition on N(t) = n, the joint distribution of the n arrival times t1, ..., tn is the same as the joint distribution of V(1), ..., V(n), the order statistics of n IID unif(0, t) RVs:
f(s1, ..., sn) = n!/t^n, 0 < s1 < s2 < ... < sn < t.
This means P(t1 = s1, ..., tn = sn | N(t) = n) = n!/t^n. This is the probability of a particular configuration of events occurring within an interval of length t given that exactly n events occur.
M|G|∞ Queue: Arrivals are PP(λ) and service times Sn have some general distribution G(x) = P(S ≤ x) with mean 1/μ. X(t) denotes the number of customers in service at time t.
E[X(t)] = α(t) = Var[X(t)], where α(t) = λ ∫₀^t P(S > s) ds and P(X(t) = n) = e^(−α(t)) α(t)^n / n!
As t → ∞, α(t) converges: λ ∫₀^∞ P(S > x) dx = λE[S] = λ/μ.
Limiting Dist: lim_{t→∞} P(X(t) = n) = e^(−ρ) ρ^n / n! = pn, where ρ = λ/μ.
IMPORTANT: If S is on an interval (a, b) with a > 0, then P(S > x) = 1 for x ∈ [0, a). For example, if S ~ unif(1, 3), then α(2) = λ ∫₀^1 1 dx + λ ∫₁^2 (3 − x)/2 dx.
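A small numeric check of the α(2) example above (a sketch, assuming SciPy; λ = 1 is an illustrative value):

```python
from scipy.integrate import quad

lam = 1.0                                   # illustrative arrival rate

def tail(x):
    """P(S > x) for S ~ unif(1, 3)."""
    if x < 1:
        return 1.0
    return (3 - x) / 2 if x <= 3 else 0.0

integral, _ = quad(tail, 0, 2, points=[1])  # integral of P(S > s) over [0, 2]
print(lam * integral)                       # alpha(2) = lam * (1 + 3/4) = 1.75 here
```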
Stopping Time: A RV τ that represents the time at which a process exhibits a specific behavior for the first time. It can use information only up to the present time (e.g., first passage/hitting times):
P(τ = n | {Xk : k ≥ 0}) = P(τ = n | X0, X1, ..., Xn)

Wald's Equation: If τ is a stopping time with respect to an IID sequence Xn, and E[τ] < ∞ and E[|X|] < ∞, then
E[Σ_{n=1}^τ Xn] = E[τ]E[X]
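A Monte Carlo illustration of Wald's equation (a sketch, assuming NumPy; the stopping rule "first n at which the running sum reaches 5" and the exp(1) summands are illustrative choices, not from the sheet):

```python
import numpy as np

rng = np.random.default_rng(1)
target, trials = 5.0, 100_000

sums, taus = [], []
for _ in range(trials):
    s, n = 0.0, 0
    while s < target:                  # tau = first n with X1 + ... + Xn >= target
        s += rng.exponential(1.0)      # X_n ~ exp(1), so E[X] = 1
        n += 1
    sums.append(s)
    taus.append(n)

# Wald: E[sum of the first tau terms] = E[tau] * E[X]
print(np.mean(sums), np.mean(taus) * 1.0)
```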
CTMC Definition:
P(X(s + t) = j | X(s) = i) = Pij(t), where Pij(t) is the probability that the chain will be in state j, t time units from now, given it is in state i now. Each state has a constant rate ai > 0 such that the holding time in state i is Hi ~ exp(ai) with E[Hi] = 1/ai.

Chapman-Kolmogorov for CTMC:
P(t + s) = P(s)P(t) ↔ Pij(t + s) = Σ_{k∈S} Pik(s) Pkj(t)

Birth and Death Processes: CTMCs that can only change state by +1 or −1; Pi,i+1 + Pi,i−1 = 1.

States i and j communicate in continuous time iff they communicate in the embedded discrete-time MC. A CTMC is irreducible iff its embedded chain is irreducible. State i is recurrent/transient iff it is recurrent/transient in the embedded MC. Let Tii be the amount of (continuous) time until the chain re-visits state i.
Positive Recurrent: E[Tii] < ∞. Null Recurrent: E[Tii] = ∞.
IMPORTANT: Positive/null recurrence is not necessarily equivalent to that of the embedded chain.

Limiting Probability: For a positive recurrent CTMC,
Pj = E[Hj]/E[Tjj] = (1/aj)/E[Tjj] > 0, j ∈ S.
This means the long-run proportion of time the chain spends in state j equals the expected amount of time spent in state j during a cycle divided by the expected cycle length. Also Pj = lim_{t→∞} Pij(t), i, j ∈ S.
If the chain is null recurrent or transient, then Pj = 0, j ∈ S.

Find P⃗ using π from the embedded MC: Pj = (πj/aj) / Σ_{i∈S} πi/ai
NOTE: Even if the embedded chain is positive recurrent, if the ai are chosen such that (πi+1/ai+1)/(πi/ai) = πi+1 ai/(πi ai+1) > 1 for all i, then Σ_i πi/ai diverges to ∞ and Pj = 0.

Transition Rate Matrix Q: Qij = ai Pij for i ≠ j and Qii = −ai (e.g., for a gambler's ruin chain on S = {0, 1, 2, 3, 4}).

Solving for P(t): P(t) = e^(Qt) = Σ_{n=0}^∞ (Qt)^n / n!

Find P⃗ with Pij(t): Pj = Σ_{i∈S} Pi Pij(t), j ∈ S, t ≥ 0.
NOTE: Any solution for P⃗ must satisfy P⃗ P(t) = P⃗.

Balance Equations: aj Pj = Σ_{i≠j} ai Pi Pij. For a positive recurrent CTMC, the long-run rate entering state j equals the long-run rate departing state j. An irreducible CTMC with a finite state space is positive recurrent; there is always a unique probability solution to the balance equations.
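A sketch of evaluating P(t) = e^(Qt) numerically (assuming SciPy; the two-state rate matrix with a0 = 1, a1 = 2 below is an illustrative example, not one from the sheet):

```python
import numpy as np
from scipy.linalg import expm

# Two-state CTMC: leave state 0 at rate 1, leave state 1 at rate 2.
Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])

P_t = expm(Q * 0.5)            # P(0.5) = e^{Q * 0.5}; rows sum to 1
print(P_t)

# The limiting distribution here is P = (2/3, 1/3); it satisfies P P(t) = P.
P = np.array([2 / 3, 1 / 3])
print(P @ P_t)                 # stays (approximately) equal to P
```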
M/M/∞ Queue: Its birth and death balance equations are λPj = (j + 1)μP(j+1), j ≥ 0. It is always positive recurrent for any λ, μ > 0, and its stationary distribution is Poisson with mean ρ = λ/μ.
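A short numeric check that the Poisson(ρ) weights satisfy the balance equation λPj = (j + 1)μP(j+1) (a sketch, assuming only the standard library; λ = 3, μ = 2 are illustrative values):

```python
from math import exp, factorial

lam, mu = 3.0, 2.0                       # illustrative rates
rho = lam / mu

P = [exp(-rho) * rho**j / factorial(j) for j in range(10)]   # Poisson(rho) weights

for j in range(5):
    # Both sides of lam * P_j = (j + 1) * mu * P_{j+1} should match.
    print(lam * P[j], (j + 1) * mu * P[j + 1])
```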
PASTA: πj^a = Pj, j ≥ 0, which means the proportion of Poisson arrivals who find j customers in the system is equal to the proportion of time there are j customers in the system. We can use it as long as the LAC is met.

Lack of Anticipation Condition (LAC): For each fixed t > 0, the future increments of the Poisson process after time t, {N(t + s) − N(t); s ≥ 0}, must be independent of the joint past {(N(u), X(u)) : u ≤ t}. That is, any future increment must be independent not only of its own past but of the past of the queueing process as well.
Elementary Renewal Thm: lim_{t→∞} N(t)/t = λ w.p.1, where λ = 1/E[X] is the rate of the renewal process. When λ = 0 the renewal process is null recurrent; when 0 < λ < ∞ it is positive recurrent.

Renewal Reward Theorem: For a positive recurrent renewal process in which a reward Rj is earned during cycle length Xj and such that {(Xj, Rj) : j ≥ 1} is IID with E[|Rj|] < ∞, the long run rate at which rewards are earned is
lim_{t→∞} R(t)/t = E[R]/E[X] = λE[R], where λ = 1/E[X].

Forward Recurrence Time: The time until the next point strictly after time t: A(t) := t_{N(t)+1} − t, t ≥ 0.
lim_{t→∞} (1/t) ∫₀^t A(s) ds = E[X^2]/(2E[X]) w.p.1

Backward Recurrence Time: The time since the last point before or at time t: B(t) := t − t_{N(t)}, t ≥ 0.
lim_{t→∞} (1/t) ∫₀^t B(s) ds = E[X^2]/(2E[X]) w.p.1

Spread: The length of the interarrival time containing time t: S(t) := t_{N(t)+1} − t_{N(t)} = B(t) + A(t), t ≥ 0.
lim_{t→∞} (1/t) ∫₀^t S(s) ds = E[X^2]/E[X] w.p.1

Inspection Paradox: For every fixed t ≥ 0, S(t) is stochastically larger than X: P(S(t) > x) ≥ P(X > x), t, x ≥ 0, and E[X^2]/E[X] > E[X]. (Recall Var(X) = E[X^2] − E[X]^2.)

Equilibrium Distribution of F: Fe(x) := λ ∫₀^x F̄(y) dy, with P(Xe ≤ x) = P(Xb ≤ x) = Fe(x) and fe(x) = (d/dx)Fe(x) = λF̄(x).
F̄s(x) = λxF̄(x) + F̄e(x), where P(Xs ≤ x) = Fs(x) and fs(x) = λxf(x).
NOTE: Xe is always a continuous random variable because the density function fe(x) always exists (even if X is not continuous). However, Xs is not always continuous.
If P(X = c) = 1, that is, X is a constant c, then Fe is the uniform distribution on (0, c), derived as follows: λ = 1/c and F̄(x) = 1 for x ∈ (0, c), so fe(x) = λF̄(x) = 1/c on (0, c).
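A simulation sketch of the forward recurrence time average (assuming NumPy; the unif(0, 2) interarrival times are an illustrative choice, giving E[X] = 1 and E[X^2]/(2E[X]) = 2/3):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0, 2, 500_000)              # interarrival times: E[X] = 1, E[X^2] = 4/3
arrivals = np.cumsum(X)

# Inspect the process at many random times well inside the simulated horizon.
t = rng.uniform(0, arrivals[-1] - 10.0, 200_000)
nxt = np.searchsorted(arrivals, t, side="right")   # index of the next point after t
A = arrivals[nxt] - t                              # forward recurrence times A(t)

print(A.mean(), (4 / 3) / (2 * 1.0))               # both close to E[X^2]/(2 E[X]) = 2/3
```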
EXAMPLES:

Suppose buses arrive exactly every 15 minutes. Independently, taxis arrive according to a Poisson process at rate 20 per hour. You arrive at random. You decide to take whichever arrives first. How long on average must you wait?

SOLUTION: The time until the next bus arrives, denote this by Xe, has the Unif(0, 15) distribution, Fe, the equilibrium distribution for the bus interarrival times. The time until the next taxi arrives, denote this by Te, has an exponential distribution with rate 20 per hour, or 1/3 per minute (Te has the equilibrium distribution for the taxi interarrival times, exponential by the memoryless property). Thus your waiting time is W = min{Xe, Te}, with Xe and Te independent. We compute E[W] by integrating its tail. Since W > x iff both Xe > x and Te > x, and since P(Xe > 15) = 0, we conclude that P(W > x) = 0 for x ≥ 15 and
P(W > x) = P(Xe > x)P(Te > x) = e^(−(1/3)x) (1 − x/15), x ∈ [0, 15),
so E[W] = ∫₀^15 e^(−(1/3)x) (1 − x/15) dx.
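Evaluating that tail integral numerically (a sketch, assuming SciPy):

```python
import numpy as np
from scipy.integrate import quad

# P(W > x) = exp(-x/3) * (1 - x/15) on [0, 15), and 0 afterwards.
tail = lambda x: np.exp(-x / 3) * (1 - x / 15)

EW, _ = quad(tail, 0, 15)      # E[W] = integral of the tail probability
print(EW)                      # roughly 2.4 minutes
```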
There are two car rental shops next to each other in the same building, A and B. A has only 1 car to rent, while B has an infinite number of cars to rent. Car rental times at A or B are iid with an exponential distribution at rate μ (mean 1/μ); e.g., a person rents a car for that length of time and then returns it. People arrive at the building to rent cars according to a Poisson process at rate λ per hour. Assume that at time t = 0 there are no cars out rented.

Suppose that independently, with probability p = 0.5, each arrival to the building goes to A or B. If they go to A and the 1 car is already out rented, then they go away forever. If they go to B, then they immediately rent a car. Compute the long run average number of cars out rented at A, and out rented at B.

SOLUTION: By the iid partitioning of the Poisson process, we get two independent queues: A itself is an M/M/1 loss queue with arrival rate λ/2 and service rate μ; B itself is an M/M/∞ queue with arrival rate λ/2 and service rate μ. For A we can solve the birth and death balance equations (λ/2)P0 = μP1 with P0 + P1 = 1. If we define ρ = λ/μ, then we get
P0 = 1/(1 + ρ/2), P1 = (ρ/2)/(1 + ρ/2).
So the average at A is P1 = (ρ/2)/(1 + ρ/2). (If we instead define ρ = λ/(2μ), then the solution would be written as P0 = 1/(1 + ρ), P1 = ρ/(1 + ρ).)
For B we easily derive that the answer is ρ/2, via either ℓ = λw or the fact that the limiting distribution is Poisson with mean ρ/2. Thus the final answer is (with ρ = λ/μ):
(ρ/2)/(1 + ρ/2) + ρ/2.

Suppose instead that every arrival to the building first goes to Company A hoping to get the 1 car, but if the 1 car is already out rented, then they immediately go to Company B.

SOLUTION: A is now an M/M/1 loss queue but with arrival rate λ, so now the average is P1 = ρ/(1 + ρ) (ρ = λ/μ). To get B we have 2 choices. First, use "ℓ = λw" with arrival rate λP1, where P1 above is, by PASTA, the long-run proportion of arrivals who find A busy and hence go to B, and w = E(S) = 1/μ. So for B, the average is λP1(1/μ) = ρP1 = ρ^2/(1 + ρ). As a second method, observe that together A + B form an M/M/∞ queue with arrival rate λ, hence the mean of A + B is ρ. Subtracting the mean of A, which is P1, yields ρ − P1 = ρ(1 − 1/(1 + ρ)) = ρP1 = ρ^2/(1 + ρ).
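A quick numeric evaluation of the two scenarios (plain Python; λ = 4 and μ = 2 are illustrative values, not from the sheet):

```python
lam, mu = 4.0, 2.0                       # illustrative rates
rho = lam / mu

# Scenario 1: independent 50/50 split between A and B.
avg_A = (rho / 2) / (1 + rho / 2)        # M/M/1 loss queue with arrival rate lam/2
avg_B = rho / 2                          # M/M/infinity queue with arrival rate lam/2
print(avg_A, avg_B)                      # 0.5 and 1.0 when lam = 4, mu = 2

# Scenario 2: everyone tries A first; overflow goes to B.
P1 = rho / (1 + rho)                     # long-run fraction of time A's car is out
avg_B2 = rho**2 / (1 + rho)              # equals rho - P1, since A + B is M/M/infinity
print(P1, avg_B2, P1 + avg_B2)           # 2/3, 4/3, and their sum 2.0 = rho
```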