
Markov Examen

Basic properties:
Probability and Expectation:
• The conditional probability of A given B: P(A | B) = P(A ∩ B) / P(B)
• Expectation for the discrete case: E(h(X)) = ∑x h(x) p(x)
• Expectation for the continuous case: E(h(X)) = ∫ h(x) f(x) dx
• The law of total expectation: E(X) = E(E(X | Y))
• Bayes’ rule: P(A | B) = P(B | A) P(A) / P(B)
• The law of total probability: P(A) = ∑y P(A | Y = y) P(Y = y) (discrete case) or
P(A) = ∫ P(A | Y = y) fY(y) dy (continuous case)
• Variance (law of total variance):
Var(X) = E(Var(X | Y)) + Var(E(X | Y))
Var(X | Y) = E(X² | Y) − (E(X | Y))²
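The law of total variance can be checked exactly on a small hypothetical mixture model (the Bernoulli/normal parameters below are arbitrary illustrative choices):

```python
# Exact check of the law of total variance on a hypothetical two-component
# mixture: Y ~ Bernoulli(p), X | Y=y ~ Normal(mu[y], sig2[y]).
p = 0.3                      # P(Y = 1)
mu = {0: 1.0, 1: 4.0}        # conditional means E(X | Y=y)
sig2 = {0: 2.0, 1: 5.0}      # conditional variances Var(X | Y=y)
w = {0: 1 - p, 1: p}         # P(Y = y)

# Left-hand side: Var(X) computed directly from the mixture moments.
EX = sum(w[y] * mu[y] for y in (0, 1))
EX2 = sum(w[y] * (sig2[y] + mu[y] ** 2) for y in (0, 1))
var_X = EX2 - EX ** 2

# Right-hand side: E(Var(X|Y)) + Var(E(X|Y)).
E_condvar = sum(w[y] * sig2[y] for y in (0, 1))
var_condmean = sum(w[y] * mu[y] ** 2 for y in (0, 1)) - EX ** 2
print(abs(var_X - (E_condvar + var_condmean)) < 1e-9)  # True
```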
Exponential distribution:
• The probability density function (pdf) of an exponential distribution is: f(x) = λe^{−λx} , x ≥ 0.
• The cumulative distribution function of an exponential distribution is: F(x) = 1 − e^{−λx} , x ≥ 0.
• Properties:
▶ E(X) = 1/λ
▶ Var(X) = 1/λ²
▶ MGF: φ(t) = λ/(λ − t) , t < λ
• Memoryless property: P(X > s + t | X > t) = P(X > s) for all s, t ≥ 0.
• Hazard/failure rate is equal to λ (the exponential is the only distribution with a constant hazard rate).
• For independent X1 ~ Exp(λ1) and X2 ~ Exp(λ2): P(X1 < X2) = λ1/(λ1 + λ2).
• Let X1,...,Xn be iid exponential with common mean 1/λ. Then the random variable Y =∑i Xi has a gamma
distribution with parameters n and λ.
• If X1,…,Xn are independent exponentials with rates λ1,…,λn, then min_j Xj has an exponential distribution with rate ∑j λj.
• The random variable min_i Xi and the rank ordering of the Xi (i.e., Xi1 < Xi2 < ··· < Xin) are independent.
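These facts lend themselves to a quick simulation sketch (the rates 2 and 3 below are arbitrary illustrative values):

```python
import numpy as np

# Simulation sketch of two exponential facts: min(X1, X2) is exponential
# with rate lam1 + lam2, and P(X1 < X2) = lam1 / (lam1 + lam2).
rng = np.random.default_rng(0)
lam1, lam2, n = 2.0, 3.0, 200_000

x1 = rng.exponential(1 / lam1, n)   # numpy parametrises by the mean 1/lambda
x2 = rng.exponential(1 / lam2, n)

m = np.minimum(x1, x2)
print(abs(m.mean() - 1 / (lam1 + lam2)) < 0.01)             # mean ~ 1/5
print(abs((x1 < x2).mean() - lam1 / (lam1 + lam2)) < 0.01)  # ~ 2/5
```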
Poisson distribution:
• The probability mass function of a Poisson distribution is: P(X = k) = e^{−λ} λ^k / k! , k = 0, 1, …
• Properties:
▶ E(X) = Var(X) = λ
▶ If Xi ~ Pois(λi) and independent, then ∑_{i=1}^n Xi ~ Pois(∑_{i=1}^n λi)
▶ MGF: φ(t) = e^{λ(e^t − 1)}
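A small simulation sketch of the additivity property (the rates 2 and 3 are illustrative):

```python
import numpy as np

# Simulation sketch: if X1 ~ Pois(2) and X2 ~ Pois(3) are independent,
# then X1 + X2 ~ Pois(5), so its mean and variance should both be near 5.
rng = np.random.default_rng(1)
s = rng.poisson(2.0, 300_000) + rng.poisson(3.0, 300_000)
print(abs(s.mean() - 5.0) < 0.05, abs(s.var() - 5.0) < 0.1)  # True True
```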
Discrete time Markov Chain:
Markov Chain definition:
• A stochastic process is a collection of (infinitely many) random variables:
- Discrete time Stochastic process is of the form {Xn ,n ≥0}
- Continuous time Stochastic process is of the form {X (t),t ≥0}
• A stochastic process {Xn ,n≥0} with state space S is called a discrete time Markov chain if for all states
i ,j ,s0,...,sn−1 ∈ S: P(Xn+1 = j |Xn = i ,Xn−1 = sn−1,...,X0 = s0) = P(Xn+1 = j |Xn = i) (Markov property).
• In time homogeneous Markov Chains we have: P(Xn+1 = j |Xn = i) = P(X1 = j |X0 = i) = Pij.
• A random walk is a Markov chain where, if the chain is in state i, it can only go to i + 1 or i − 1.
Recurrent and Transient States:
• Recall the notation Pij^k: the probability that, starting from state i, the chain is in state j after k steps. Then j is
called accessible from i if Pij^k > 0 for some k ≥ 0.
• States i and j are said to communicate if they are accessible from each other. We denote this by i ↔ j.
• Communicating states form a class. If there is only 1 class, the MC is ‘irreducible’, otherwise it’s ‘reducible’.
• A state is recurrent if fi = 1, and transient if fi < 1 (fi is the probability that, starting in i, you ever
return to i).
▶ State i is recurrent if ∑n Pii^n is infinite.
▶ State i is transient if ∑n Pii^n is finite.
• Recurrence and transience are class properties.
• In a finite MC not all states can be transient, and in a finite irreducible MC all states are recurrent.
• Two types of recurrence (both are class properties): Denote Nj = min{n > 0 : Xn = j}
▶ Positive recurrent if the expected time until the process returns to the same state is finite:
E(Nj | X0 = j) < +∞. In a finite-state MC, all recurrent states are positive recurrent.
▶ Null recurrent if the expected time until the process returns to the same state is infinite:
E(Nj | X0 = j) = +∞
• The period d of state i (a class property) is: d = gcd{n > 0 : Pii^n > 0}, with ‘gcd’ the greatest common divisor:
▶ A state is periodic if d > 1.
▶ A state is aperiodic if d = 1.
• An aperiodic, positive recurrent state is called ergodic.
Long run limit:
• For an irreducible ergodic Markov chain: lim_{n→∞} Pij^n exists and is independent of i.
• Denote πj = lim_{n→∞} Pij^n for j ∈ S and the limiting distribution π = (πj) for each j ∈ S.
• Denote the stationary distribution by w = (wj), j ∈ S, which is the unique solution of the steady-state
equations: wj = ∑i wi Pij for j ∈ S (in matrix form: w = P^T w), together with ∑j wj = 1.
• Once the MC starts from w, we will always have P(Xn = j) = wj.
• For an irreducible ergodic Markov chain, the limiting distribution π coincides with the stationary distribution
w.
• Let {Xn, n ≥ 1} be an irreducible Markov chain with stationary probabilities πj, j ≥ 0, and let r(j) be the reward of
being in state j. Then ∑j r(j) πj is called the average reward per unit time.
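The steady-state equations can be solved numerically; the sketch below uses a hypothetical 3-state chain and a hypothetical reward vector:

```python
import numpy as np

# Sketch: solve w = P^T w, sum(w) = 1 for a small illustrative chain, then
# compute the average reward per unit time sum_j r(j) w_j.
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])

# Replace one (redundant) balance equation by the normalisation sum_j w_j = 1.
A = np.vstack([(P.T - np.eye(3))[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
w = np.linalg.solve(A, b)
print(np.allclose(w, [0.25, 0.5, 0.25]))   # stationary distribution

r = np.array([0.0, 1.0, 3.0])              # hypothetical reward per state
print(r @ w)                               # average reward per unit time: 1.25
```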
Standard questions:
• Denote T = {1,…,t} as the transient states and {t+1,…,s} as the recurrent states.
• Let PT be the t × t matrix with entries Pij, i, j ∈ T (one-step transition probabilities among the transient
states), and let S be the t × t matrix with entries sij (mean times spent, defined below).
Notation:
• fi = probability that, starting in state i, the process will ever re-enter state i.
• sij = expected number of time periods the MC is in j, given that it started in i (mean time spent). Note: i and j
are transient states.
• fij = probability that, starting in state i, the process will ever enter state j. Note: i and j are transient states.
• miR = expected number of steps to enter recurrent class R, given that it started in i (mean time it takes to
entry R). Note: i is a transient state and R is the only recurrent class.
• fiR1 = probability that, starting in state i, the process will ever enter recurrent class R1. Note: i is transient and
there can be multiple recurrent classes.
Solutions:
• sij = 1 + ∑_{k=1}^t Pik skj if i = j, and sij = ∑_{k=1}^t Pik skj if i ≠ j; in matrix form, S = (I − PT)^{−1}.
• fij = Pij + ∑_{k=1, k≠j}^t Pik fkj, or equivalently fij = (sij − 1)/sij if i = j and fij = sij/sjj if i ≠ j.
• miR = 1 + ∑_{j=1}^t Pij mjR, or in matrix form m = S · 1.
• fiR1 = PiR1 + ∑_{j=1}^t Pij fjR1, or in matrix form fR1 = S · PR1, where PR1 is the column vector with entries
∑_{j∈R1} Pij for i = 1,…,t.
Continuous time Markov Chain:
Counting process:
• A stochastic process {N(t),t ≥ 0} is a counting process whenever N(t) denotes the total number of events that
occur by time t. It should satisfy the following:
▶ N(t) ≥ 0.
▶ N(t) is integer valued.
▶ For s < t, N(s) ≤ N(t).
• For s < t: N(t) − N(s) represents the number of events that occur in the interval (s,t].
• A counting process has independent increments whenever the number of events that occur in one time
interval is independent of the number of events that occur in another (disjoint) time interval.
▶ That is, N(s) is independent of N(s + t) − N(s).
• A counting process has stationary increments whenever the number of events that occur in any interval
depends only on the length of the interval.
▶ That is, the number of events in the interval (s, s + t] has the same distribution for all s.
Poisson Process:
First definition:
• The counting process {N(t),t ≥ 0} is a Poisson process with rate λ, λ > 0 when:
1. N(0) = 0.
2. The process has independent increments.
3. The number of events in any interval of length t is Poisson distributed with mean λt. In other words, for all
s,t ≥ 0:
๐‘ƒ(๐‘(๐‘ก + ๐‘ ) − ๐‘(๐‘ ) = ๐‘›) = ๐‘’ −๐œ†๐‘ก
(๐œ†๐‘ก)๐‘›
๐‘›!
, ๐‘› = 0, 1, …
• Note that the last condition implies that a Poisson process:
▶ has stationary increments.
▶ E(N(t)) = λt.
Second Definition:
• The counting process {N(t),t ≥ 0} is a Poisson process with rate λ, λ > 0 when:
1. N(0) = 0.
2. The process has stationary and independent increments.
3. P(N(h) = 1) = λh + o(h), as h → 0
4. P(N(h) ≥ 2) = o(h), as h → 0
• A function g(·) is said to be o(h) if lim_{h→0} g(h)/h = 0, i.e. g(h) goes to zero faster than h.
Third Definition:
• For a Poisson process, let Tn, n ≥ 1, be the nth interarrival time: the time elapsed between the (n − 1)th event
and the nth event. It follows that Ti is exponential with rate λ for every i.
• The arrival time of the nth event, Sn, is also called the waiting time until the nth event. Clearly,
Sn = ∑_{i=1}^n Ti , n ≥ 1.
• Thus, Sn has a gamma distribution with parameters n and λ, yielding
f_Sn(t) = λe^{−λt} (λt)^{n−1}/(n − 1)! , t ≥ 0,   E(Sn) = n/λ,   Var(Sn) = n/λ².
• Note that ๐‘(๐‘ก) ≥ ๐‘› ⇐⇒ ๐‘†๐‘› ≤ ๐‘ก.
• If we denote N(t) by: ๐‘(๐‘ก) ≡ ๐‘š๐‘Ž๐‘ฅ{๐‘› โˆถ ๐‘†๐‘› ≤ ๐‘ก}, with ๐‘†๐‘› = ∑๐‘›๐‘–=1 ๐‘‡๐‘– and Ti i.i.d. exponential random
variables with rate λ. It then follows that {N(t), t ≥ 0} is a Poisson process with rate λ.
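This last characterisation suggests a direct way to simulate a Poisson process, sketched below with illustrative parameters: generate exponential interarrival times, accumulate them into arrival times, and count how many fall in (0, t].

```python
import numpy as np

# Sketch of the third definition: build N(t) from i.i.d. exponential
# interarrival times T_i via S_n = sum T_i and N(t) = max{n : S_n <= t};
# N(t) should then be approximately Poisson(lam * t).
rng = np.random.default_rng(2)
lam, t, reps = 1.5, 4.0, 50_000

S = rng.exponential(1 / lam, size=(reps, 40)).cumsum(axis=1)  # arrival times
counts = (S <= t).sum(axis=1)                                 # N(t) per run
print(abs(counts.mean() - lam * t) < 0.05)   # E N(t) = lam*t = 6
print(abs(counts.var() - lam * t) < 0.15)    # Var N(t) = lam*t = 6
```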
Merging two Poisson Process:
• Suppose that {N1(t),t ≥ 0} and {N2(t),t ≥ 0} are independent Poisson processes with respective rates λ1 and λ2,
where Ni(t) corresponds to type i arrivals. Let N(t) = N1(t) + N2(t), for t ≥ 0. Then the following holds:
โ–ถ The merged process {N(t),t ≥ 0} is a Poisson process with rate λ = λ1 + λ2.
▶ The probability that an arrival in the merged process is of type i is λi/(λ1 + λ2).
Decomposing a Poisson Process:
• Consider a Poisson process {N(t),t ≥ 0} with rate λ. Suppose that each event in this process is classified as
type I with probability p and type II with probability (1 − p) independently of all other events. Let N1(t) and
N2(t) respectively denote the type I and type II events occurring in time (0,t]. Then, the counting processes
{N1(t),t ≥ 0} and {N2(t),t ≥ 0} are two independent Poisson processes with respective rates λp and λ(1 − p).
Conditional arriving process:
• If Y1, …, Yn are i.i.d. with density f, then the joint density of the order statistics Y(1), . . . , Y(n) is:
f(y1, . . . , yn) = n! ∏_{i=1}^n f(yi) ,   y1 ≤ · · · ≤ yn.
• Given that N(t) = n, the n arrival times S1, ... , Sn have the same distribution as the order statistics
corresponding to n independent random variables uniformly distributed on the interval (0,t). So:
(S1, S2, . . . , Sn) =d (U(1), U(2), . . . , U(n))
with U(1) ≤ U(2) ≤ . . . ≤ U(n) the order statistics of n i.i.d. U(0,t) random variables.
• For any (symmetric) function of the arrival times: ∑_{i=1}^n f(Si) =d ∑_{i=1}^n f(U(i)) =d ∑_{i=1}^n f(Ui)
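One classical use of the conditional-uniform property is the expected total waiting time: E(∑_{i≤N(t)} Si) = E(N(t)) · t/2 = λt²/2. A simulation sketch with illustrative parameters:

```python
import numpy as np

# Sketch: given N(t) = n the arrival times behave as n i.i.d. U(0, t)
# points, so the expected total waiting time is
# E( sum_{i <= N(t)} S_i ) = E(N(t)) * t/2 = lam * t^2 / 2.
rng = np.random.default_rng(3)
lam, t, reps = 2.0, 5.0, 40_000

S = rng.exponential(1 / lam, size=(reps, 60)).cumsum(axis=1)  # arrival times
totals = np.where(S <= t, S, 0.0).sum(axis=1)                 # sum of S_i in (0, t]
print(abs(totals.mean() - lam * t**2 / 2) < 0.3)              # lam*t^2/2 = 25
```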
Nonhomogeneous Poisson process:
• The counting process {N(t),t ≥ 0} is said to be a nonhomogeneous Poisson process with intensity function λ(t),
t ≥ 0, if
1. N(0) = 0.
2. {N(t),t ≥ 0} has independent increments.
3. P(N(t + h) − N(t) = 1) = λ(t)h + o(h).
4. P(N(t + h) − N(t) ≥ 2) = o(h)
• For a nonhomogeneous Poisson process N(t) with intensity function λ(t), it holds that N(s + t) − N(t) is a
Poisson random variable with mean m(s + t) − m(t), with:
m(t) = ∫_0^t λ(y) dy
General CTMC:
• โ–ถ Let {X(t),t ≥ 0} be a continuous-time stochastic process taking values in {0, 1, 2, . . .}.
โ–ถ Let {x(t),t ≥ 0} be any deterministic function taking values in {0, 1, 2, . . .}.
• The process {X(t),t ≥ 0} is called a continuous-time Markov chain if:
๐‘ƒ(๐‘‹(๐‘ก + ๐‘ ) = ๐‘— | ๐‘‹(๐‘ ) = ๐‘–, ๐‘‹(๐‘ข) = ๐‘ฅ(๐‘ข), 0 ≤ ๐‘ข < ๐‘ ) = ๐‘ƒ(๐‘‹(๐‘ก + ๐‘ ) = ๐‘— | ๐‘‹(๐‘ ) = ๐‘–)
for all s, t ≥ 0, functions {x(t), t ≥ 0} and i, j = 0, 1, 2, …
• If a continuous-time Markov chain {X(t),t ≥ 0} satisfies:
๐‘ƒ(๐‘‹(๐‘ก + ๐‘ ) = ๐‘— | ๐‘‹(๐‘ ) = ๐‘–) = ๐‘ƒ(๐‘‹(๐‘ก) = ๐‘— | ๐‘‹(0) = ๐‘–) for every s,t ≥ 0, then {X(t),t ≥ 0} is stationary or
time-homogeneous.
• Let Ti denote the time the process {X(t), t ≥ 0} spends in state i before making a transition into a different
state. Then, by the Markov property, Ti must have an exponential distribution with some rate vi.
• Let Pij denote the probability of next entering state j, given that the current state is i, then:
๐‘ƒ๐‘–๐‘– = 0 ๐‘Ž๐‘›๐‘‘ ∑ ๐‘ƒ๐‘–๐‘— = 1 ๐‘“๐‘œ๐‘Ÿ ๐‘’๐‘ฃ๐‘’๐‘Ÿ๐‘ฆ ๐‘– = 0,1,2, …
๐‘—
Birth and death process:
• If only the transition from i to i + 1 is allowed, then the process is called a pure birth process.
• If only transitions from i to i − 1 or i + 1 are allowed, then the process is called a birth and death process.
• A pure birth process starting at zero is a counting process.
• Arrivals occur with rate λi. That is, the time until the next arrival is exponentially distributed with mean 1/λi.
• Departures occur with rate µi. That is, the time until the next departure is exponentially distributed with
mean 1/µi.
• The Poisson process is a pure birth process with all λi equal to common arrival rate λ.
• ๐ธ(๐‘‡๐‘– ) =
1
,
๐œ†๐‘– + µ๐‘–
๐‘ƒ๐‘–,๐‘–−1 = ๐œ†
µ๐‘–
, ๐‘ƒ๐‘–,๐‘–+1
๐‘– + µ๐‘–
=๐œ†
๐œ†๐‘–
๐‘– + µ๐‘–
Transition probabilities:
• The transition probability function of the continuous-time Markov chain is given by
๐‘ƒ๐‘–๐‘— (๐‘ก) = ๐‘ƒ(๐‘‹(๐‘ก + ๐‘ ) = ๐‘— | ๐‘‹(๐‘ ) = ๐‘–).
• The rate of transition from state i into state j is given by ๐‘ž๐‘–๐‘— = ๐‘ฃ๐‘– ๐‘ƒ๐‘–๐‘— .
▶ The qij values are called the instantaneous transition rates.
▶ The vi values are the rates of the time until the next transition given that you are currently in state i.
▶ Note that qii = 0 as a consequence of the fact that Pii = 0.
๐‘ž๐‘–๐‘—
• It follows that: ๐‘ฃ๐‘– = ∑๐‘—≠๐‘– ๐‘ž๐‘–๐‘— and ๐‘ƒ๐‘–๐‘— =
∑๐‘—≠๐‘– ๐‘ž๐‘–๐‘—
• It can be proven that: lim
โ„Ž→0
๐‘ƒ๐‘–๐‘— (โ„Ž)
โ„Ž
= ๐‘ž๐‘–๐‘— , which shows that the instantaneous transition rate qij is the derivative
Pij’(0) of the transition probability Pij(t) with respect to t, evaluated in t = 0.
• It can be proven that: lim
โ„Ž→0
1−๐‘ƒ๐‘–๐‘– (โ„Ž)
โ„Ž
= ๐‘ฃ๐‘– , which shows −vi is the derivative Pii’(0) of the transition probability
Pii(t) with respect to t, evaluated in t = 0.
Kolmogorov equations:
• Chapman-Kolmogorov equations: for all s ≥ 0, t ≥ 0:
Pij(t + s) = ∑_{k=0}^∞ Pik(t) Pkj(s)
• Kolmogorov backward equations:
Pij′(t) = ∑_{k≠i} qik Pkj(t) − vi Pij(t)
• Kolmogorov forward equations:
Pij′(t) = ∑_{k≠j} qkj Pik(t) − vj Pij(t)
Limiting probabilities:
๐‘ฃ๐‘— ๐‘ƒ๐‘— = ∑๐‘˜≠๐‘— ๐‘ž๐‘˜๐‘— ๐‘ƒ๐‘˜ ๐‘ฃ๐‘œ๐‘œ๐‘Ÿ ๐‘Ž๐‘™๐‘™๐‘’ ๐‘—
• Balance equations: {
∑๐‘— ๐‘ƒ๐‘— = 1
• Limiting probabilities exist if and only if all states of MC communicate and the MC is positive recurrent, i.e.
the MC is ergodic. Like in discrete-time, Pj are also called stationary probabilities.
• For a birth and death process it follows that the balance equations are:
λ0 P0 = µ1 P1 for i = 0
(λi + µi) Pi = µi+1 Pi+1 + λi−1 Pi−1 for i > 0
• It follows: P0 = 1/(1 + ∑_{i=1}^∞ (λ0 ⋯ λi−1)/(µ1 ⋯ µi))
Queuing theory:
• Kendall’s notation: Queueing systems are often indicated via two letters followed by one or two numbers:
M/M/1, M/M/2/5.
▶ The first letter indicates the arrival process:
D: Deterministic: clients arrive at equidistant time points.
M: Markovian: clients arrive according to a Poisson process.
G: General: clients arrive according to a general arrival process.
▶ The second letter indicates the type of service times:
D: Deterministic: service times are fixed.
M: Markovian: service times S1, S2, . . . are independent exponential random variables with common rate.
G: General: service times S1, S2, . . . are independent and identically distributed (i.i.d.) random variables.
They may have any distribution.
▶ The first number indicates the number of servers.
▶ The second number indicates the capacity of the system, that is, the maximum number of clients in the
system. The capacity is equal to the number of servers plus the maximum number of waiting clients.
• M/M/1 queue:
▶ Customers arrive at the server according to a Poisson process with rate λ.
▶ The service needs some time to complete.
▶ The successive service times are independent exponential random variables with mean 1/µ.
▶ The number of clients in the system is a birth and death process with common arrival rate λ and common
departure rate µ.
▶ Limiting probabilities are Pi = (λ/µ)^i (1 − λ/µ), provided that λ/µ < 1.
Little’s law:
• N(t) is the number of arrivals up to time t.
• Overall arrival rate into the system: λ = lim_{t→∞} N(t)/t.
• Let Vn denote the sojourn time of client n, that is, the time client n spends in the system.
• Then the average sojourn time W (the average time a client spends in the system) is W = lim_{n→∞} (1/n) ∑_{j=1}^n Vj.
• Let X(t) denote the number of clients in the system at time t. Then L = lim_{t→∞} (1/t) ∫_0^t X(s) ds is the
average number of clients in the system (over time). (Sometimes L = ∑_{j=0}^s j Pj if there are s + 1 states.)
• Little’s law: L = λW
• For the M/M/1 queue: L = λ/(µ − λ) and W = 1/(µ − λ)
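A minimal numerical consistency check of these formulas with illustrative rates: L computed from the limiting distribution Pi = (λ/µ)^i (1 − λ/µ) matches λ/(µ − λ), and Little's law L = λW holds.

```python
# Consistency check of the M/M/1 formulas (rates are illustrative).
lam, mu = 3.0, 5.0
rho = lam / mu                      # utilisation, must be < 1
L = lam / (mu - lam)                # average number in system
W = 1.0 / (mu - lam)                # average sojourn time
L_from_P = sum(i * rho**i * (1 - rho) for i in range(200))  # sum_i i*P_i
print(abs(L_from_P - L) < 1e-9, abs(L - lam * W) < 1e-12)   # True True
```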
PASTA Principle:
• Define the long-run or steady-state probability of exactly n clients in the system by:
Pn = lim_{t→∞} P(X(t) = n); Pn is often also the long-run proportion of time the system contains exactly n clients.
▶ an: the long-run proportion of clients that find n in the system upon arrival.
▶ dn: the long-run proportion of clients that leave n in the system upon departure.
• In systems in which clients arrive and depart one at a time, the two proportions an and dn coincide.
• The PASTA property: Poisson Arrivals See Time Averages. If the arrival process is a Poisson process:
▶ the arrivals occur homogeneously over time,
▶ the averages over time and over clients are the same,
and it holds that Pn = an.
Gaussian Processes:
• A random variable R which satisfies P(R = −1) = P(R = +1) = 1/2 is called a Rademacher random
variable. Properties: E(R) = 0, Var(R) = 1.
Brownian Motion:
First definition:
• 1. Brownian motion starts at zero: W (0) = 0.
2. Brownian motion has stationary and independent increments.
3. Brownian motion evaluated at a fixed time t1 is a normal random variable with mean zero and variance t1.
Second definition:
• 1.Brownian motion starts at zero: W (0) = 0.
2. For t1 ≤ t2, W (t1) and W (t2) have a bivariate normal distribution with mean zero and covariance t1.
Properties:
• W(0) = 0.
• Cov(W(t1),W(t2)) = min(t1,t2).
• For t1 ≤ t2: Cov(W (t1), W(t2) − W(t1)) = 0.
• Brownian motion is the limit of a random walk: W(t) = lim_{n→∞} Wn(t) = lim_{n→∞} (1/√n) ∑_{i=1}^{[nt]} Ri
Reflection Principle:
• General case: if Sn is the sum of n Rademacher variables (with n and k of different parity):
P(max_{j=1,…,n} Sj ≥ k) = 2P(Sn ≥ k)
• Brownian motion case:
๐‘ƒ(๐‘ ๐‘ข๐‘0≤๐‘ก≤๐‘ ๐‘Š(๐‘ก) > y) = 2๐‘ƒ(W(b) > y)
• Let Ta and Tb be two hitting times (the first times BM hits levels a and b, respectively). Then P(Ta < Tb) is:
▶ 0 if a > b > 0
▶ 1 if b > a > 0
▶ −b/(a − b) if a > 0 > b
• Boundary crossing from both sides:
P(sup_{0≤t≤b} |W(t)| > y) = 2 ∑_{j=1}^∞ (−1)^{j+1} P(sup_{0≤t≤b} W(t) > (2j − 1)y)
• Butler test: used to test whether a sample Y1,…,Yn is symmetric around 0.
▶ Rearrange the sample so as to satisfy |Y(1)| ≤ |Y(2)| ≤ . . . ≤ |Y(n)|.
▶ Define random variables R1, R2, . . . , Rn by Ri = 1 if Y(i) > 0 and Ri = −1 if Y(i) < 0.
▶ Butler’s test statistic: Tn = sup_{0≤t≤1} |(1/√n) ∑_{i=1}^{[nt]} Ri|.
▶ Under the null hypothesis, Tn converges in distribution to the absolute supremum of the Brownian motion on
the unit interval. Use critical values to perform the test and reject the null for high values of Tn.
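A sketch implementation of Butler's statistic (the samples below are hypothetical): under symmetry the Ri behave like fair Rademacher steps and Tn stays moderate, while a shifted sample makes almost all Ri equal to +1 and inflates Tn.

```python
import numpy as np

# Sketch implementation of Butler's test statistic T_n for symmetry
# around 0; large T_n is evidence against symmetry.
def butler_statistic(y):
    y = np.asarray(y, dtype=float)
    n = len(y)
    r = np.sign(y[np.argsort(np.abs(y))])   # R_i, sample ordered by |Y|
    walk = np.cumsum(r) / np.sqrt(n)        # scaled partial-sum process
    return np.abs(walk).max()               # sup over the unit interval

rng = np.random.default_rng(5)
sym = butler_statistic(rng.normal(0, 1, 2000))      # symmetric around 0
shifted = butler_statistic(rng.normal(1, 1, 2000))  # not symmetric around 0
print(sym < shifted)   # True: asymmetry inflates the statistic
```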
๐‘Š(๐‘ก)
2
• Linear boundaries (Doob): ๐‘ƒ(๐‘ ๐‘ข๐‘๐‘ก≥0 1+๐‘Ž๐‘ก > ๐‘ฆ) = ๐‘’ −2๐‘Ž๐‘ฆ .
โ–ถ Can be used for Brownian motion with drift: ๐‘‹(๐‘ก) = ๐‘Š(๐‘ก) + ๐œ‡๐‘ก. It follows that ๐‘ƒ(๐‘ ๐‘ข๐‘๐‘ก≥0 ๐‘‹(๐‘ก) > ๐‘ฆ) =
๐‘ƒ(๐‘ ๐‘ข๐‘๐‘ก≥0
๐‘Š(๐‘ก)
๐œ‡
๐‘ฆ
1− ๐‘ก
> ๐‘ฆ) = ๐‘’ 2๐œ‡๐‘ฆ
Brownian Bridge:
Empirical Process:
• Empirical distribution function: Fn(x) = (1/n) ∑_{i=1}^n I{Xi ≤ x}.
• Empirical process: √n(Fn(x) − F(x)).
• The CLT gives, as n goes to infinity: √n(Fn(x0) − F(x0)) →d N(0, F(x0)(1 − F(x0))).
• Uniform empirical process: Bn(u) = √n((1/n) ∑_{i=1}^n I{Ui ≤ u} − u) for 0 ≤ u ≤ 1, with U1,…,Un i.i.d. U[0,1].
• If a random variable X has CDF F(x), then F(X) ~ U[0,1]. From this it follows that:
√n(Fn(x) − F(x)) =d Bn(F(x))
• Property: the uniform empirical process at two points s and t converges to a bivariate normal distribution
with mean zero and: Cov(Bn(t), Bn(s)) = min(s, t) − st
Definition:
• A Brownian bridge is the limiting process of {Bn(u), 0 ≤ u ≤ 1}, and is denoted by {B(u), 0 ≤ u ≤ 1}
• Definition:
1. For every 0 ≤ u ≤ 1, B(u) is a normal random variable with mean zero and variance u(1 − u).
2. For every 0 ≤ u1, u2 ≤ 1, (B(u1), B(u2)) is a bivariate normal random vector with
Cov(B(u1), B(u2)) = min(u1, u2) − u1u2
Asymptotic statistics:
• As the sample size n → ∞, the general empirical process {√n(Fn(x) − F(x)), x ∈ R} converges to a limiting
process {B(F(x)), x ∈ R}.
• A more rigorous way to formulate the convergence is as follows: sup_{x∈R} |√n(Fn(x) − F(x)) − B(F(x))| →P 0
• Delta method (univariate):
If √n(θn − θ) →d N(0, σ²), then √n(g(θn) − g(θ)) →d g′(θ) N(0, σ²) for every differentiable g(θ).
• Delta method (bivariate):
If √n((θn, µn) − (θ, µ)) →d (L1, L2), then √n(g(θn, µn) − g(θ, µ)) →d (∂g/∂x)(θ, µ) L1 + (∂g/∂y)(θ, µ) L2
for every differentiable g(x, y).
Brownian Motion to Brownian Bridge:
• Let {๐‘Š (๐‘ก), 0 ≤ ๐‘ก ≤ 1} be a Brownian motion. The process {๐‘‹(๐‘ก), 0 ≤ ๐‘ก ≤ 1} defined by ๐‘‹(๐‘ก) =
๐‘Š (๐‘ก) − ๐‘ก๐‘Š (1) for 0 ≤ t ≤ 1 is a Brownian bridge on the unit interval.
• By conditioning on the event {๐‘Š (1) = 0}, the Brownian motion {๐‘Š (๐‘ก), ๐‘ก ≥ 0} becomes a Brownian
bridge on the unit interval.
Brownian Bridge to Brownian Motion:
• Let Z be a standard normal random variable, independent of the Brownian bridge {B(t), 0 ≤ t ≤ 1}. Then, the
process {X(t), 0 ≤ t ≤ 1} defined by X(t) = B(t) + tZ, 0 ≤ t ≤ 1, is a Brownian motion on the unit interval.
Kolmogorov-Smirnov test:
• Suppose we have a random sample Y1, Y2, . . . , Yn drawn from an unknown distribution, with the aim of
testing the null hypothesis that the unknown distribution has some given CDF F0(y). The Kolmogorov statistic
๐พ๐‘› = √๐‘› ๐‘ ๐‘ข๐‘๐‘ฆ∈๐‘… |๐น๐‘› (๐‘ฆ) − ๐น0 (๐‘ฆ)| can be used.
• Under null: √๐‘› ๐‘ ๐‘ข๐‘๐‘ฆ∈๐‘… |๐น๐‘› (๐‘ฆ) − ๐น0 (๐‘ฆ)| ๐‘‘→ ๐‘ ๐‘ข๐‘๐‘ฆ∈๐‘… |๐ต(๐น0 (๐‘ฆ))| = ๐‘ ๐‘ข๐‘0≤๐‘ข≤1 |๐ต(๐‘ข)|
• For the Brownian bridge:
P(sup_{0≤u≤1} B(u) > y) = e^{−2y²}  and  P(sup_{0≤u≤1} |B(u)| > y) = 2 ∑_{j=1}^∞ (−1)^{j+1} e^{−2j²y²}
• Now, suppose we have drawn the sample. One way to perform the Kolmogorov-Smirnov test is to draw Fn(y)
first. Determine the maximum allowed distance by multiplying the critical value kα by 1/√n, and draw the two
lines Fn(y) ± kα/√n in red:
▶ If F0(y) falls completely between the red lines, do not reject the null hypothesis.
▶ If F0(y) exceeds one of the red lines, reject the null hypothesis.
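A sketch implementation of the Kolmogorov statistic and the limiting tail probability above (the sample sizes and distributions below are illustrative; the sup is evaluated at the jumps of Fn):

```python
import numpy as np

# Sketch of K_n = sqrt(n) sup |F_n - F0| and the limiting tail
# P(sup|B(u)| > y) used as an approximate p-value.
def ks_statistic(y, F0):
    y = np.sort(np.asarray(y, dtype=float))
    n = len(y)
    cdf = F0(y)
    # the sup is attained at a jump of F_n: compare both sides of each jump
    d = np.maximum(np.arange(1, n + 1) / n - cdf, cdf - np.arange(n) / n).max()
    return np.sqrt(n) * d

def ks_pvalue(k, terms=100):
    j = np.arange(1, terms + 1)
    return 2 * np.sum((-1.0) ** (j + 1) * np.exp(-2 * j**2 * k**2))

print(round(ks_pvalue(1.36), 3))   # ~0.049, matching the usual 5% critical value

rng = np.random.default_rng(6)
F0 = lambda y: np.clip(y, 0.0, 1.0)                  # null CDF: U(0,1)
k_null = ks_statistic(rng.uniform(0, 1, 5000), F0)   # sample from the null
k_alt = ks_statistic(rng.beta(2, 2, 5000), F0)       # sample from Beta(2,2)
print(k_null < k_alt, ks_pvalue(k_alt) < 0.001)      # True True
```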
Other Continuous-Time Continuous-State (CTCS) Processes:
Ornstein Uhlenbeck:
• Let {๐‘Š (๐‘ก), ๐‘ก ≥ 0} be Brownian motion on the interval [0, ∞), and let α ≥ 0. The stochastic process
๐‘Ž๐‘ก
{๐‘‹(๐‘ก), ๐‘ก ≥ 0} defined by ๐‘‹(๐‘ก) = ๐‘’ − 2 ๐‘Š(๐‘’ ๐‘Ž๐‘ก ) is called the Ornstein-Uhlenbeck process.
• The Ornstein-Uhlenbeck process {๐‘‹(๐‘ก), ๐‘ก ≥ 0} is a Gaussian process with zero mean function and
covariance function: ๐ถ๐‘œ๐‘ฃ(๐‘‹(๐‘ก1), ๐‘‹(๐‘ก2)) = ๐‘’๐‘ฅ๐‘{−๐›ผ|๐‘ก1 − ๐‘ก2|/2}.
• Stationarity (a process is stationary if at any given time point the distribution does not depend on the time,
i.e. is the same for every t):
▶ Brownian motion is NOT a stationary process
▶ The Ornstein-Uhlenbeck process is a stationary process
• Increments:
▶ Brownian motion is a process with independent (and stationary) increments
▶ The Ornstein-Uhlenbeck process does not have independent increments, but it has stationary increments
Geometric Brownian Motion:
• Let {๐‘Š (๐‘ก), ๐‘ก ≥ 0} be Brownian motion on the interval [0, ∞). The stochastic process {๐‘‹(๐‘ก), ๐‘ก ≥ 0} defined
by ๐‘‹(๐‘ก) = ๐‘’ µ๐‘ก+๐œŽ๐‘Š(๐‘ก) is called the geometric Brownian motion with drift coefficient µ and variance
parameter ๐œŽ 2 .
• If {๐‘‹(๐‘ก), ๐‘ก ≥ 0} is a geometric Brownian motion with drift coefficient µ and variance parameter ๐œŽ 2 , then
µ
{๐œŽ −1 ๐‘™๐‘› ๐‘‹(๐‘ก), ๐‘ก ≥ 0} is a Brownian motion with drift coefficient ๐œŽ.
• Note that the geometric Brownian motion is not a Gaussian process!
โ–ถ For fixed t, the random variable has a log-normal distribution with parameters µ๐‘ก ๐‘Ž๐‘›๐‘‘ ๐œŽ 2
Tables:
Absolute supremum (Butler Test):
Kolmogorov-Smirnov test:
Standard Normal: