

EE-891
STOCHASTIC SYSTEMS
LEC-6
02 December 2021
Sequence
➢ REVIEW
➢ EXPECTED VALUE OF A RANDOM VARIABLE
➢ VARIANCE OF A RANDOM VARIABLE
➢ CONDITIONAL PROBABILITY MASS FUNCTION
➢ CONDITIONAL EXPECTED VALUE
➢ IMPORTANT DISCRETE RANDOM VARIABLES
➢ Conclusion
REVIEW
➢ A random variable 𝑋 is a function that assigns a real number to each outcome in the sample space of a random experiment
➢ The sample space S is the domain of the random variable, and the set of all values taken on by X, denoted S_X, is the range of the random variable
➢ Let a random experiment have sample space S and event class ℱ
➢ To find the probability of a subset B of the range S_X, e.g., B = {x_k}, we need to find the event A consisting of the outcomes in S that are mapped into B:
A = {ζ ∈ S : X(ζ) ∈ B} ……………………(1)
➢ Thus if event A occurs then X(ζ) ∈ B, so event B occurs; the probability that X is in B is therefore given by:
P[X ∈ B] = P[A] = P[{ζ : X(ζ) ∈ B}] ……………………(2)
➢ Thus A and B are equivalent events
➢ The probability mass function (pmf) of a discrete random variable X is defined as:
p_X(x) = P[X = x] = P[{ζ : X(ζ) = x}], for x a real number
➢ All events involving the random variable X can be expressed as the union of events A_k = {X = x_k}
➢ The expected value or mean of a discrete random variable X is defined by:
E[X] = m_X = Σ_{x∈S_X} x·p_X(x) = Σ_k x_k·p_X(x_k) ……………………(3)
➢ Let X be a discrete random variable, and let Z = g(X) be a function of it
➢ One way to find the expected value of Z is to first find the pmf of Z and then use:
E[Z] = Σ_j z_j·p_Z(z_j) ……………………(4)
➢ Another way is to use the following result:
E[Z] = E[g(X)] = Σ_k g(x_k)·p_X(x_k) ……………………(5)
➢ To show Eq. (5) we group the terms x_k that are mapped to each value z_j:
Σ_k g(x_k)·p_X(x_k) = Σ_j z_j·{ Σ_{k: g(x_k)=z_j} p_X(x_k) } ……………………(6)
➢ The sum inside the braces is the probability of all x_k for which g(x_k) = z_j, which is the probability that Z = z_j, that is, p_Z(z_j)
➢ Example 3.17:
Let X be a noise voltage that is uniformly distributed in S_X = {−3, −1, +1, +3} with p_X(k) = 1/4 for k in S_X. Find E[Z] where Z = X².
➢ Solution:
❖ We first find the pmf of Z:
p_Z(9) = P[X ∈ {−3, +3}] = p_X(−3) + p_X(+3) = 1/2
p_Z(1) = P[X ∈ {−1, +1}] = p_X(−1) + p_X(+1) = 1/2
❖ Thus by Eq. (4): E[Z] = Σ_j z_j·p_Z(z_j) = 9(1/2) + 1(1/2) = 5
❖ Alternatively, by Eq. (5): E[Z] = Σ_k g(x_k)·p_X(x_k) = 9(1/4) + 1(1/4) + 1(1/4) + 9(1/4) = 5
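➢ As a quick numerical check (a sketch in Python; the variable names are ours, not from the lecture), both Eq. (4) and Eq. (5) give E[Z] = 5 for this example:

```python
# Example 3.17 check: E[X^2] computed two ways.
S_X = [-3, -1, 1, 3]
p_X = {x: 0.25 for x in S_X}              # uniform pmf on S_X

# Eq. (4): build the pmf of Z = X^2, then sum z * p_Z(z)
p_Z = {}
for x, p in p_X.items():
    z = x ** 2
    p_Z[z] = p_Z.get(z, 0.0) + p          # group outcomes mapped to the same z
E_Z_eq4 = sum(z * p for z, p in p_Z.items())

# Eq. (5): sum g(x) * p_X(x) directly, without forming p_Z
E_Z_eq5 = sum((x ** 2) * p for x, p in p_X.items())

print(E_Z_eq4, E_Z_eq5)                   # both print 5.0
```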
EXPECTED VALUE OF A RANDOM VARIABLE
➢ Let Z be a function of X defined as:
Z = a·g(X) + b·h(X) + c ……………………(7)
➢ where a, b and c are real numbers; then the expected value of Z is:
E[Z] = E[a·g(X) + b·h(X) + c] ……………………(8)
➢ Thus by using Eq. (5) we have:
E[Z] = a·E[g(X)] + b·E[h(X)] + c ……………………(9)
➢ By setting a = b = 1 and c = 0, we get from Eqs. (7) and (9):
E[g(X) + h(X)] = E[g(X)] + E[h(X)] ……………………(10)
➢ By setting b = c = 0, we get from Eqs. (7) and (9):
E[a·g(X)] = a·E[g(X)] ……………………(11)
➢ By setting a = 1 and b = 0, we get from Eqs. (7) and (9):
E[g(X) + c] = E[g(X)] + c ……………………(12)
➢ By setting a = b = 0, we get from Eqs. (7) and (9):
E[c] = c ……………………(13)
➢ Example 3.18:
Let X be a noise voltage that is uniformly distributed in S_X = {−3, −1, +1, +3} with p_X(k) = 1/4 for k in S_X. The noise voltage X is amplified and shifted to obtain Y = 2X + 10, and then squared to produce Z = Y² = (2X + 10)². Find E[Z].
➢ Solution:
➢ Since: E[X] = Σ_k k·p_X(k) = (1/4)·(−3 − 1 + 1 + 3) = 0
➢ and, from Example 3.17, E[X²] = 5, we have by Eqs. (9)–(12):
E[Z] = E[(2X + 10)²] = E[4X² + 40X + 100] = 4·E[X²] + 40·E[X] + 100 = 4(5) + 0 + 100 = 120
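➢ A one-line check of this result by direct enumeration (our own sketch, not part of the original solution):

```python
# Example 3.18 check: E[(2X + 10)^2] by enumerating the four outcomes.
S_X = [-3, -1, 1, 3]
E_Z = sum(0.25 * (2 * x + 10) ** 2 for x in S_X)
print(E_Z)   # 120.0, matching 4*E[X^2] + 40*E[X] + 100
```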
➢ Example 3.19:
Let X be the number of voice packets containing active speech produced by n = 48 independent speakers in a 10-millisecond period. X is a binomial random variable with parameter n and probability p = 1/3. Suppose a packet multiplexer transmits up to M = 20 active packets every 10 ms, and any excess active packets are discarded. Let Z be the number of packets discarded. Find E[Z].
➢ Solution:
❖ Z = g(X), where g(X) = X − M when X > M and g(X) = 0 otherwise; thus by Eq. (5):
E[Z] = Σ_{k=M+1}^{n} (k − M)·C(n, k)·p^k·(1 − p)^{n−k}, where C(n, k) = n!/(k!(n − k)!)
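➢ The sum above is tedious by hand; a short Python sketch (assuming the setup given, with math.comb from the standard library) evaluates it directly:

```python
# Example 3.19: expected number of discarded packets.
from math import comb

n, p, M = 48, 1/3, 20

def binom_pmf(k):
    """Binomial pmf P[X = k] with parameters n and p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# E[Z] = sum over k > M of (k - M) * P[X = k]
E_Z = sum((k - M) * binom_pmf(k) for k in range(M + 1, n + 1))
print(E_Z)   # a small value (well under 1), versus E[X] = n*p = 16
```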
VARIANCE OF A RANDOM VARIABLE
➢ The variance of the random variable X is defined as the expected value of D(X) = (X − E[X])², given as:
σ_X² = VAR[X] = E[(X − E[X])²] = Σ_{x∈S_X} (x − E[X])²·p_X(x) ……………………(14)
➢ The standard deviation of the random variable X is defined by:
σ_X = STD[X] = (VAR[X])^{1/2} ……………………(15)
➢ An alternative expression for the variance is given as:
VAR[X] = E[X²] − (E[X])² ……………………(16)
➢ In Eq. (16), E[X²] is termed the second moment of X
➢ Similarly, the nth moment of X is defined as E[Xⁿ]
➢ Let Y = X + c; then:
VAR[X + c] = E[(X + c − E[X] − c)²] = E[(X − E[X])²] = VAR[X] ……………………(17)
➢ Thus adding a constant to a random variable does not change its variance
➢ Let Y = cX; then:
VAR[cX] = E[(cX − c·E[X])²] = c²·E[(X − E[X])²] = c²·VAR[X] ……………………(18)
➢ Thus scaling a random variable by c scales the variance by c² and the standard deviation by |c|
➢ Now let X = c, a random variable that is equal to a constant with probability 1; then:
VAR[c] = E[(c − c)²] = 0 ……………………(19)
➢ Thus a constant random variable has zero variance
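➢ Eqs. (17)–(19) can be verified numerically on any small pmf; a minimal sketch (with a pmf and constant of our own choosing):

```python
# Check VAR[X + c] = VAR[X] and VAR[cX] = c^2 VAR[X] on a uniform pmf.
S_X = [-3, -1, 1, 3]
w = 0.25                                   # probability of each value

def mean(vals): return sum(w * v for v in vals)
def var(vals):
    m = mean(vals)
    return sum(w * (v - m) ** 2 for v in vals)

c = 7
print(var(S_X), var([x + c for x in S_X]))         # equal: Eq. (17)
print(var([c * x for x in S_X]), c**2 * var(S_X))  # equal: Eq. (18)
```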
➢ Example 3.20:
Let X be the number of heads in three tosses of a fair coin. Find VAR[X].
➢ Solution:
❖ The sample space for the experiment is: S = {TTT, TTH, THT, HTT, THH, HTH, HHT, HHH}
❖ The number of outcomes with zero heads is 1, so P[X = 0] = 1/8
❖ The number of outcomes with one head is 3, so P[X = 1] = 3/8
❖ The number of outcomes with two heads is 3, so P[X = 2] = 3/8
❖ The number of outcomes with three heads is 1, so P[X = 3] = 1/8
❖ The range of the random variable X (the head count) is S_X = {0, 1, 2, 3}
❖ The mean is found as:
E[X] = Σ_{x∈S_X} x·p_X(x)
➢ Solution (continued)
❖ Thus the expected value of the head count, E[X], is found as:
E[X] = Σ_{x∈S_X} x·p_X(x) = 0·(1/8) + 1·(3/8) + 2·(3/8) + 3·(1/8) = 12/8 = 1.5
❖ The variance of the head count is defined as:
σ_X² = VAR[X] = E[(X − E[X])²] = E[X²] − (E[X])²
❖ Thus we have:
E[X²] = 0²·(1/8) + 1²·(3/8) + 2²·(3/8) + 3²·(1/8) = 24/8 = 3
❖ Therefore we get:
σ_X² = VAR[X] = E[X²] − (E[X])² = 3 − (1.5)² = 3 − 2.25 = 0.75
➢ Example 3.21:
Find the variance of the Bernoulli random variable I_A.
➢ Solution:
❖ The Bernoulli random variable has pmf p_I(0) = 1 − p and p_I(1) = p
❖ Thus: p_I(0) + p_I(1) = (1 − p) + p = 1
❖ The range of the random variable I_A is S_{I_A} = {0, 1}
❖ Thus the expected value of I_A is found as:
E[I_A] = Σ_{x∈S_{I_A}} x·p_{I_A}(x) = 0·(1 − p) + 1·(p) = p
❖ The variance of I_A is defined as:
σ_{I_A}² = VAR[I_A] = E[(I_A − E[I_A])²] = E[I_A²] − (E[I_A])²
❖ Since: E[I_A²] = 0²·(1 − p) + 1²·(p) = p
❖ Thus: VAR[I_A] = p − p² = p(1 − p)
➢ Example 3.22:
Find the variance of the geometric random variable.
➢ Solution:
❖ Let X be the random variable (say, the number of bytes in a message), and suppose that X has a geometric distribution with parameter p
❖ Then X can take on arbitrarily large values: S_X = {1, 2, 3, 4, …}
❖ The expected value E[X] of X is found as:
E[X] = Σ_{k=1}^{∞} k·p·(1 − p)^{k−1} = p·Σ_{k=1}^{∞} k·q^{k−1}, where q = 1 − p
❖ As we have: 1/(1 − x)² = Σ_{k=1}^{∞} k·x^{k−1}, for |x| < 1
❖ Therefore: E[X] = p/(1 − q)² = p/p², thus: E[X] = 1/p
➢ Solution (continued)
❖ The variance of the geometric random variable X is defined as:
σ_X² = VAR[X] = E[(X − E[X])²] = E[X²] − (E[X])²
❖ And:
E[X²] = Σ_{k=1}^{∞} k²·p·(1 − p)^{k−1} = p·Σ_{k=1}^{∞} k²·q^{k−1}
❖ Since: 1/(1 − x)² = Σ_{k=1}^{∞} k·x^{k−1}
❖ Moreover, differentiating once more: 2/(1 − x)³ = Σ_{k=2}^{∞} k(k − 1)·x^{k−2}
❖ Writing k² = k(k − 1) + k gives:
E[X²] = p·[q·2/(1 − q)³ + 1/(1 − q)²] = 2q/p² + 1/p = (1 + q)/p²
➢ Solution (continued)
❖ Thus:
σ_X² = E[X²] − (E[X])² = (1 + q)/p² − 1/p² = q/p²
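➢ A truncated-series check of these formulas (our own sketch; the truncation point is arbitrary since the terms decay geometrically):

```python
# Example 3.22 check: geometric mean 1/p and variance q/p^2.
p = 0.3
q = 1 - p
N = 2000                                   # truncation point

E_X  = sum(k * p * q**(k - 1) for k in range(1, N))
E_X2 = sum(k**2 * p * q**(k - 1) for k in range(1, N))

print(E_X, 1 / p)                          # both ~3.333
print(E_X2 - E_X**2, q / p**2)             # both ~7.778
```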
➢ Example 3.24: Let X be the time required to transmit a message, where X is a uniform random variable with S_X = {1, 2, 3, …, L}. Suppose that a message has already been transmitting for m time units; find the probability that the remaining transmission time is j time units.
➢ Solution:
❖ We condition on the event C = {X > m}
❖ Thus for total times m + j with m + 1 ≤ m + j ≤ L we have:
P[X = m + j | X > m] = P[X = m + j]/P[X > m] = (1/L)/((L − m)/L) = 1/(L − m), for m + 1 ≤ m + j ≤ L
❖ X is equally likely to be any of the remaining L − m possible values
❖ As m increases, 1/(L − m) increases, implying that the end of the message transmission becomes increasingly likely
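➢ A small simulation confirms that the conditional pmf is flat at 1/(L − m) (an illustrative sketch; L, m, and the trial count are our own choices):

```python
# Example 3.24 check: given X > m, the remaining values are equally likely.
import random

L, m, trials = 10, 4, 200_000
counts = {}

for _ in range(trials):
    x = random.randint(1, L)               # X uniform on {1, ..., L}
    if x > m:                              # condition on C = {X > m}
        counts[x] = counts.get(x, 0) + 1

total = sum(counts.values())
for x in sorted(counts):
    print(x, counts[x] / total)            # each ~1/(L - m) = 1/6
```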
➢ Example 3.25: A production line yields two types of devices. Type 1 devices occur with probability α and work for a relatively short time that is geometrically distributed with parameter r. Type 2 devices work much longer, occur with probability 1 − α, and have a lifetime that is geometrically distributed with parameter s. Let X be the lifetime of an arbitrary device. Find the pmf of X.
➢ Solution:
❖ The random experiment that generates X involves selecting a device type and then observing its lifetime
❖ We can partition the set of outcomes in this experiment into event B₁, consisting of those outcomes in which the device is type 1, and B₂, consisting of those outcomes in which the device is type 2
❖ The conditional pmf's of X given the device type are:
p_X(k | B₁) = (1 − r)^{k−1}·r, for k = 1, 2, …
and: p_X(k | B₂) = (1 − s)^{k−1}·s, for k = 1, 2, …
❖ The pmf of X is then given as:
p_X(k) = p_X(k | B₁)·P[B₁] + p_X(k | B₂)·P[B₂]
⇒ p_X(k) = (1 − r)^{k−1}·r·α + (1 − s)^{k−1}·s·(1 − α), for k = 1, 2, …
❖ where we have made use of the theorem on total probability:
p_X(k) = Σ_i p_X(k | B_i)·P[B_i]
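➢ The mixture pmf is easy to tabulate; a sketch with parameter values of our own choosing (the lecture leaves α, r, s symbolic):

```python
# Example 3.25: total-probability mixture of two geometric pmfs.
alpha, r, s = 0.4, 0.5, 0.05

def pmf_X(k):
    return alpha * (1 - r)**(k - 1) * r + (1 - alpha) * (1 - s)**(k - 1) * s

print(sum(pmf_X(k) for k in range(1, 10_000)))   # ~1.0: a valid pmf
print([round(pmf_X(k), 4) for k in range(1, 6)]) # first few probabilities
```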
CONDITIONAL PROBABILITY MASS FUNCTION
➢ In many situations we have partial information about a random variable X or about the outcome of its underlying random experiment
➢ Thus we are interested in how this information changes the probability of events involving the random variable
➢ The conditional probability mass function addresses this question for discrete random variables
➢ Let X be a discrete random variable with pmf p_X(x), and let C be an event that has nonzero probability, P[C] > 0
➢ The conditional probability mass function of X is defined by the conditional probability:
p_X(x | C) = P[X = x | C], for x a real number
➢ By applying the definition of conditional probability we get:
p_X(x | C) = P[{X = x} ∩ C]/P[C]
➢ Thus the conditional probability of the event {X = x_k} is given by the probabilities of the outcomes ζ for which both X(ζ) = x_k and ζ ∈ C, normalized by P[C]
➢ The conditional pmf satisfies the axioms of a pmf: p_X(x_k | C) ≥ 0 and Σ_k p_X(x_k | C) = 1
➢ To see this, note that the set of events A_k = {X = x_k} is a partition of S, therefore we have: C = ⋃_k (A_k ∩ C)
➢ And: P[C] = Σ_k P[A_k ∩ C]
➢ Similarly we may show that: Σ_k p_X(x_k | C) = Σ_k P[A_k ∩ C]/P[C] = P[C]/P[C] = 1
➢ Most of the time the event C is defined in terms of X, for example C = {X > 10} or C = {a ≤ X ≤ b}
➢ For x_k in S_X we have the following general result:
p_X(x_k | C) = P[{X = x_k} ∩ C]/P[C] = p_X(x_k)/P[C] if x_k ∈ C, and p_X(x_k | C) = 0 otherwise
CONDITIONAL EXPECTED VALUE
➢ Let X be a discrete random variable, and suppose that we know that event B has occurred
➢ The conditional expected value of X given B is defined as:
E[X | B] = Σ_{x∈S_X} x·p_X(x | B) = Σ_k x_k·p_X(x_k | B)
➢ The conditional variance of X given B is defined as:
VAR[X | B] = E[(X − E[X | B])² | B] = E[X² | B] − (E[X | B])²
➢ Let B₁, B₂, …, B_n be a partition of S, and let p_X(x | B_i) be the conditional pmf of X given event B_i
➢ E[X] can be calculated from the conditional expected values E[X | B_i] as under:
E[X] = Σ_{i=1}^{n} E[X | B_i]·P[B_i]
➢ Which may be proved as under:
❖ By the theorem on total probability we have:
E[X] = Σ_k x_k·p_X(x_k) = Σ_k x_k·[Σ_{i=1}^{n} p_X(x_k | B_i)·P[B_i]] = Σ_{i=1}^{n} [Σ_k x_k·p_X(x_k | B_i)]·P[B_i] = Σ_{i=1}^{n} E[X | B_i]·P[B_i]
❖ where we first express p_X(x_k) in terms of the conditional pmf's, and then change the order of summation
❖ By using the same approach we can also show that:
E[g(X)] = Σ_{i=1}^{n} E[g(X) | B_i]·P[B_i]
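➢ The result can be checked on the Example 3.25 mixture, where E[X | B₁] = 1/r and E[X | B₂] = 1/s (a sketch with parameter values of our own choosing):

```python
# Check E[X] = sum_i E[X | B_i] P[B_i] for the two-type device mixture.
alpha, r, s = 0.4, 0.5, 0.05

E_cond = alpha * (1 / r) + (1 - alpha) * (1 / s)   # weighted conditional means

E_direct = sum(                                     # direct series computation
    k * (alpha * (1 - r)**(k - 1) * r + (1 - alpha) * (1 - s)**(k - 1) * s)
    for k in range(1, 10_000)
)
print(E_cond, E_direct)                             # both ~12.8
```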
IMPORTANT DISCRETE RANDOM VARIABLES
➢ Certain random variables arise in many diverse, unrelated applications
➢ The pervasiveness of these random variables is due to the fact that they model fundamental mechanisms that underlie random behavior
➢ We present the most important of the discrete random variables and discuss how they arise and how they are interrelated
➢ (The table summarizing the basic properties of these discrete random variables is omitted here)
THE BERNOULLI RANDOM VARIABLE
➢ Let A be an event related to the outcomes of some random experiment
➢ The Bernoulli random variable is the indicator function of A:
I_A = 1 if the event A occurs, and I_A = 0 otherwise
➢ The range of I_A is S_X = {0, 1}
➢ I_A is a discrete random variable, since it assigns a number to each outcome of S
➢ The pmf of I_A is found as:
p_I(1) = p and p_I(0) = 1 − p
➢ The mean of I_A is worked out as:
E[I_A] = 0·p_I(0) + 1·p_I(1) = 0·(1 − p) + 1·(p) = p
➢ The 2nd moment of I_A is worked out as under:
E[I_A²] = 0²·p_I(0) + 1²·p_I(1) = p
➢ Thus the variance of I_A is worked out as under:
σ_I² = VAR[I_A] = E[I_A²] − (E[I_A])² = p − p² = p(1 − p) = pq
➢ The variance is quadratic in p, with value zero at p = 0 and p = 1, and maximum at p = 1/2
➢ This agrees with intuition, since values of p close to 0 or to 1 imply a preponderance of successes or failures, and hence less variability in the observed values
➢ The maximum variability occurs when p = 1/2, which corresponds to the case that is most difficult to predict
➢ Every Bernoulli trial, regardless of the event A, is equivalent to the tossing of a biased coin with probability of heads p
➢ In this sense, coin tossing can be viewed as representative of a fundamental mechanism for generating randomness, and the Bernoulli random variable is the model associated with it
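➢ A two-line check of the variance curve p(1 − p) (our own sketch):

```python
# VAR[I_A] = p(1 - p): zero at p = 0 and p = 1, maximum 0.25 at p = 0.5.
ps = [i / 10 for i in range(11)]
variances = [p * (1 - p) for p in ps]
print(max(variances), ps[variances.index(max(variances))])   # 0.25 0.5
```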
THE BINOMIAL RANDOM VARIABLE
➢ Let a random experiment be repeated n independent times
➢ Let X be the number of times a certain event A occurs in these n trials
➢ Thus X is a random variable with range S_X = {0, 1, 2, 3, …, n}
➢ For example, X could be the number of heads in n tosses of a coin
➢ If we let I_j be the indicator function for the event A in the jth trial, then:
X = I₁ + I₂ + ⋯ + I_n
➢ That is, X is the sum of the Bernoulli random variables associated with each of the n independent trials
➢ X has probabilities that depend on n and p as under:
p_X(k) = P[X = k] = C(n, k)·p^k·(1 − p)^{n−k}, for k = 0, 1, …, n, where C(n, k) = n!/(k!(n − k)!)
➢ X as defined above is called the binomial random variable
➢ (The figure showing the pmf of X for p = 0.2 and p = 0.5 is omitted)
➢ The factorial terms grow large very quickly and cause overflow problems in the calculation of C(n, k)
➢ Instead, we can use the following recursive formula based on the ratio of successive terms in the pmf:
p_X(k + 1) = [(n − k)/(k + 1)]·[p/(1 − p)]·p_X(k), starting from p_X(0) = (1 − p)ⁿ
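➢ A minimal sketch of this recursion in Python (the function name is ours); it never forms a factorial, so there is no overflow even for large n:

```python
# Binomial pmf via the successive-terms recursion.
def binomial_pmf(n, p):
    """Return [P[X=0], ..., P[X=n]] without computing factorials."""
    pmf = [0.0] * (n + 1)
    pmf[0] = (1 - p) ** n                     # P[X = 0]
    ratio = p / (1 - p)
    for k in range(n):
        # P[X = k+1] / P[X = k] = (n - k)/(k + 1) * p/(1 - p)
        pmf[k + 1] = pmf[k] * (n - k) / (k + 1) * ratio
    return pmf

pmf = binomial_pmf(48, 1/3)
print(sum(pmf))                               # ~1.0, as a pmf must
print(sum(k * q for k, q in enumerate(pmf)))  # mean n*p = 16
```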
➢ The binomial random variable arises in applications where there are two types of objects (i.e., heads/tails, correct/erroneous bits, good/defective items, active/silent speakers) and we are interested in the number of type 1 objects in a randomly selected batch of size n, where the type of each object is independent of the types of the other objects in the batch
➢ The mean of the binomial random variable is calculated as:
E[X] = Σ_{k=0}^{n} k·C(n, k)·p^k·(1 − p)^{n−k} = np·Σ_{k=1}^{n} C(n−1, k−1)·p^{k−1}·(1 − p)^{n−k} = np·(p + (1 − p))^{n−1} = np
➢ The second moment of the binomial random variable is calculated as under:
E[X²] = Σ_{k=0}^{n} k²·C(n, k)·p^k·(1 − p)^{n−k} = np·Σ_{k=1}^{n} k·C(n−1, k−1)·p^{k−1}·(1 − p)^{n−k}
= np·Σ_{k′=0}^{n−1} (k′ + 1)·C(n−1, k′)·p^{k′}·(1 − p)^{n−1−k′} = np·[(n − 1)p + 1]
➢ where we remove the k = 0 term in the first step and then substitute k′ = k − 1
➢ In the third expression, the first sum (the k′-weighted part) is the mean of a binomial random variable with parameters n − 1 and p, and hence equals (n − 1)p
➢ The second sum (the +1 part) is the sum of the binomial probabilities and hence equals 1
➢ The variance of the binomial random variable is calculated as under:
σ_X² = VAR[X] = E[X²] − (E[X])² = np·[(n − 1)p + 1] − (np)² = np(1 − p) = npq
➢ We see that the variance of the binomial is n times the variance of a Bernoulli random variable
➢ We observe that values of p close to 0 or to 1 imply a smaller variance, and that the maximum variability occurs when p = 1/2
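➢ A direct check of the mean and variance formulas against the pmf (our own sketch, using math.comb for clarity):

```python
# Binomial mean n*p and variance n*p*(1-p), checked numerically.
from math import comb

n, p = 48, 1/3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * q for k, q in enumerate(pmf))
var  = sum((k - mean)**2 * q for k, q in enumerate(pmf))
print(mean, n * p)                 # 16.0
print(var, n * p * (1 - p))        # ~10.667
```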
THE GEOMETRIC RANDOM VARIABLE
➢ The geometric random variable arises when we count the number M of independent Bernoulli trials until the first occurrence of a success
➢ M is called the geometric random variable, and it takes on values from the set {1, 2, 3, …}
➢ Consider a sequential experiment in which we repeat independent Bernoulli trials until the occurrence of the first success
➢ Let the outcome of this experiment be m, the number of trials carried out until the occurrence of the first success
➢ The sample space for this experiment is the set of positive integers
➢ The probability p(m) that m trials are required is found by noting that this can only happen if the first m − 1 trials result in failures and the mth trial results in success
➢ The probability of this event is:
p(m) = P[A₁ᶜ ∩ A₂ᶜ ∩ … ∩ A_{m−1}ᶜ ∩ A_m] = (1 − p)^{m−1}·p, for m = 1, 2, …
➢ where A_i is the event "success in the ith trial"
➢ The probability assignment specified above is called the geometric probability law
➢ The probabilities described above sum to 1, as can be shown as under:
Σ_{m=1}^{∞} p·q^{m−1} = p·(1/(1 − q)) = p/p = 1
➢ where q = 1 − p and where we have used the formula for the summation of a geometric series
➢ The probability that more than K trials are required before a success occurs has a simple form:
P[M > K] = q^K = (1 − p)^K
➢ (The figure showing the geometric pmf for p = 1/2 is omitted)
➢ It is notable that the pmf decays geometrically with k, and that the ratio of consecutive terms is q = 1 − p
➢ As p increases, the pmf decays more rapidly
➢ The probability that M ≤ k can be written in closed form as:
P[M ≤ k] = 1 − q^k
➢ Sometimes we are interested in M′ = M − 1, the number of failures before a success occurs
➢ We also refer to M′ as a geometric random variable; its pmf is given as:
p_{M′}(k) = P[M′ = k] = (1 − p)^k·p, for k = 0, 1, 2, …
➢ The random variable M can take on arbitrarily large values, since the sample space of this experiment is S_M = {1, 2, …}
➢ The expected value or mean of the geometric random variable is calculated as:
E[M] = Σ_{k=1}^{∞} k·p·q^{k−1} = p·Σ_{k=1}^{∞} k·q^{k−1} = p/(1 − q)² = 1/p
➢ The second moment of the geometric random variable is calculated as under, writing k² = k(k − 1) + k and using the series sums from Example 3.22:
E[M²] = Σ_{k=1}^{∞} k²·p·q^{k−1} = p·[q·(2/(1 − q)³) + 1/(1 − q)²] = 2q/p² + 1/p = (1 + q)/p²
➢ The variance of the geometric random variable is calculated as under:
σ_M² = VAR[M] = E[M²] − (E[M])² = (1 + q)/p² − 1/p² = q/p² = (1 − p)/p²
➢ We see that the mean and variance increase as p, the success probability, decreases
➢ The geometric random variable is the only discrete random variable that satisfies the memoryless property:
P[M ≥ k + j | M > j] = P[M ≥ k], for all j, k ≥ 1
➢ The above expression states that if a success has not occurred in the first j trials, then the probability of having to perform at least k more trials is the same as the probability of initially having to perform at least k trials
➢ Thus, each time a failure occurs, the system "forgets" and begins anew as if it were performing the first trial
➢ The geometric random variable arises in applications where one is interested in the time (i.e., number of trials) that elapses between the occurrences of events in a sequence of independent experiments
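➢ The memoryless property follows from the tail formula P[M > K] = (1 − p)^K, and a short sketch makes it concrete (the p, j, k values are our own choices):

```python
# Memoryless check: P[M >= k+j | M > j] equals P[M >= k].
p, j, k = 0.3, 5, 4
q = 1 - p

tail = lambda K: q ** K                    # P[M > K]
lhs = tail(k + j - 1) / tail(j)            # P[M >= k+j | M > j]
rhs = tail(k - 1)                          # P[M >= k]
print(lhs, rhs)                            # identical
```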
➢ Examples where the modified geometric random variable arises are:
❖ Number of customers awaiting service in a queueing system;
❖ Number of white dots between successive black dots in a scan of a
black-and-white document
THE POISSON RANDOM VARIABLE
➢ In many applications, we are interested in counting the number of
occurrences of an event in a certain time period or in a certain region
in space
➢ The Poisson random variable arises in situations where the events
occur “completely at random” in time or space
➢ For example, the Poisson random variable arises in:
❖ Counts of emissions from radioactive substances,
❖ Counts of demands for telephone connections, and
❖ Counts of defects in a semiconductor chip
➢ The pmf of the Poisson random variable is given by:
P[N = k] = (α^k/k!)·e^{−α}, for k = 0, 1, 2, …
➢ Coherent Light Example:
❖ Coherent light has a constant optical power 𝒫
❖ The corresponding mean photon flux Φ(t) = 𝒫(t)/hν (photons/s) is also constant, but the actual times of registration of the photons are random, as shown in the figure
❖ (Figure omitted: random arrival of photons in a light beam of constant power 𝒫 within intervals of duration T; although the optical power is constant, the number n of photons arriving within each interval is random)
❖ Derivation of the Poisson Distribution
❑ Let us consider a time interval 𝒯 in which, on average, n̄ photons arrive at the detector
❑ Let us divide the time interval 𝒯 into a large number N of subintervals of sufficiently small duration, such that during each subinterval either one photon or no photon is detected; each subinterval then has duration 𝒯/N
❑ Therefore, the probability of detection of a photon in a segment of duration 𝒯/N is:
ℙ = n̄/N ……………………(A)
❑ Thus the probability of detecting no photon during a subinterval of duration 𝒯/N is 1 − ℙ
❑ The probability of finding n independent photons in the N intervals, like the flips of a biased coin, then follows the binomial distribution:
ℙ(n) = [N!/(n!·(N − n)!)]·ℙⁿ·(1 − ℙ)^{N−n} ……………………(B)
❑ By substituting the value of ℙ from Eq. (A) into Eq. (B) we get:
ℙ(n) = [N!/(n!·(N − n)!)]·(n̄/N)ⁿ·(1 − n̄/N)^{N−n} ……………………(C)
❑ For N → ∞ we have N!/(N − n)! → Nⁿ, which may be proved by using the Stirling formula as under:
ln n! ≈ n·ln n − n (Stirling formula)
⇒ ln[N!/(N − n)!] = ln(N!) − ln[(N − n)!]
⇒ ln[N!/(N − n)!] = N·ln N − N − (N − n)·ln(N − n) + N − n
❑ Writing ln(N − n) = ln N + ln(1 − n/N):
ln[N!/(N − n)!] = N·ln N − N − (N − n)·[ln N + ln(1 − n/N)] + N − n
= n·ln N − (N − n)·ln(1 − n/N) − n
❑ Since n ≪ N, we have ln(1 − n/N) ≅ −n/N in terms of the Maclaurin series expansion
⇒ ln[N!/(N − n)!] ≅ n·ln N + (N − n)·(n/N) − n = n·ln N − n²/N ≈ n·ln N, since n²/N ≪ n
⇒ N!/(N − n)! ≅ Nⁿ ……………………(D)
❑ Similarly, for N ≫ n̄ we have (1 − n̄/N)^{N−n} → e^{−n̄}, which may be proved as under:
ln[(1 − n̄/N)^{N−n}] = (N − n)·ln(1 − n̄/N)
≅ (N − n)·(−n̄/N), for n̄ ≪ N
= −n̄ + n·(n̄/N) ≈ −n̄
⇒ (1 − n̄/N)^{N−n} ≈ e^{−n̄} ……………………(E)
❑ By using Eqs. (D) and (E) in Eq. (C) we get:
ℙ(n) ≅ (Nⁿ/n!)·(n̄/N)ⁿ·e^{−n̄}
⇒ ℙ(n) ≡ (n̄ⁿ/n!)·e^{−n̄} ……………………(F)
➢ This is exactly the pmf of the Poisson random variable given earlier, with k = n and α = n̄
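➢ The limit derived above is easy to see numerically: holding n̄ fixed while N grows, the binomial probabilities of Eq. (C) approach the Poisson value of Eq. (F) (a sketch; the chosen n̄, n, and N values are ours):

```python
# Binomial -> Poisson limit: fix n_bar, grow N, compare at n = 5.
from math import comb, exp, factorial

n_bar, n = 4.0, 5
for N in (20, 200, 2000):
    p = n_bar / N                                        # Eq. (A)
    binom = comb(N, n) * p**n * (1 - p)**(N - n)         # Eq. (C)
    poisson = n_bar**n * exp(-n_bar) / factorial(n)      # Eq. (F)
    print(N, binom, poisson)   # binomial column converges to ~0.1563
```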
➢ where α is the average number of event occurrences in a specified time interval or region in space
➢ (The figure showing the Poisson pmf for several values of α is omitted)
➢ For α < 1, P[N = k] is maximum at k = 0
➢ For α > 1, P[N = k] is maximum at k = ⌊α⌋
➢ If α is a positive integer, then P[N = k] is maximum at both k = α and k = α − 1
➢ The pmf of the Poisson random variable sums to one, since:
Σ_{k=0}^{∞} (α^k/k!)·e^{−α} = e^{−α}·Σ_{k=0}^{∞} α^k/k! = e^{−α}·e^{α} = 1
➢ The expected value or mean of the Poisson random variable is calculated as:
E[N] = Σ_{k=0}^{∞} k·(α^k/k!)·e^{−α} = Σ_{k=1}^{∞} (α^k/(k − 1)!)·e^{−α} = α·e^{−α}·Σ_{k=1}^{∞} α^{k−1}/(k − 1)! = α·e^{−α}·e^{α} = α
➢ The second moment of the Poisson random variable is calculated as under:
E[N²] = Σ_{k=0}^{∞} k²·(α^k/k!)·e^{−α} = Σ_{k=1}^{∞} k·(α^k/(k − 1)!)·e^{−α} = α·e^{−α}·Σ_{k=1}^{∞} k·α^{k−1}/(k − 1)!
➢ Writing k = (k − 1) + 1:
E[N²] = α·e^{−α}·[Σ_{k=1}^{∞} (k − 1)·α^{k−1}/(k − 1)! + Σ_{k=1}^{∞} α^{k−1}/(k − 1)!]
= α·e^{−α}·[α·Σ_{k=2}^{∞} α^{k−2}/(k − 2)! + Σ_{k=1}^{∞} α^{k−1}/(k − 1)!] = α·e^{−α}·[α·e^{α} + e^{α}] = α² + α
➢ The variance of the Poisson random variable is calculated as under:
σ_N² = VAR[N] = E[N²] − (E[N])² = α² + α − α² = α
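➢ Both results can be confirmed from a truncated series, using the recursion P[N = k] = P[N = k−1]·α/k to avoid large factorials (our own sketch):

```python
# Poisson check: mean and variance both equal alpha.
from math import exp

alpha, K = 4.0, 200                        # truncate the series at K terms
pmf = [exp(-alpha)]                        # P[N = 0]
for k in range(1, K):
    pmf.append(pmf[-1] * alpha / k)        # P[N=k] = P[N=k-1] * alpha / k

mean   = sum(k * p for k, p in enumerate(pmf))
second = sum(k**2 * p for k, p in enumerate(pmf))
print(mean, second - mean**2)              # both ~4.0 = alpha
```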
Thank You