
ECE342 Course Notes

ECE342 - Probability for Electrical & Computer Engineers
C. Tellambura and M. Ardakani
Copyright ©2013 C. Tellambura and M. Ardakani. All rights reserved.
Winter 2013
Contents

1 Basics of Probability Theory  1
  1.1 Set theory  1
    1.1.1 Basic Set Operations  1
    1.1.2 Algebra of Sets  2
  1.2 Applying Set Theory to Probability  2
  1.3 Probability Axioms  2
  1.4 Some Consequences of Probability Axioms  3
  1.5 Conditional probability  3
  1.6 Independence  4
  1.7 Sequential experiments and tree diagrams  4
  1.8 Counting Methods  4
  1.9 Reliability Problems  5
  1.10 Illustrated Problems  5
  1.11 Solutions for the Illustrated Problems  11
  1.12 Drill Problems  20

2 Discrete Random Variables  29
  2.1 Definitions  29
  2.2 Probability Mass Function  29
  2.3 Cumulative Distribution Function (CDF)  30
  2.4 Families of Discrete RVs  30
  2.5 Averages  31
  2.6 Function of a Random Variable  32
  2.7 Expected Value of a Function of a Random Variable  32
  2.8 Variance and Standard Deviation  33
  2.9 Conditional Probability Mass Function  33
  2.10 Basics of Information Theory  34
  2.11 Illustrated Problems  35
  2.12 Solutions for the Illustrated Problems  39
  2.13 Drill Problems  46

3 Continuous Random Variables  55
  3.1 Cumulative Distribution Function  55
  3.2 Probability Density Function  56
  3.3 Expected Values  56
  3.4 Families of Continuous Random Variables  57
  3.5 Gaussian Random Variables  58
  3.6 Functions of Random Variables  59
  3.7 Conditioning a Continuous RV  60
  3.8 Illustrated Problems  60
  3.9 Solutions for the Illustrated Problems  64
  3.10 Drill Problems  69

4 Pairs of Random Variables  75
  4.1 Joint Probability Mass Function  75
  4.2 Marginal PMFs  75
  4.3 Joint Probability Density Function  76
  4.4 Marginal PDFs  76
  4.5 Functions of Two Random Variables  76
  4.6 Expected Values  77
  4.7 Conditioning by an Event  78
  4.8 Conditioning by an RV  78
  4.9 Independent Random Variables  79
  4.10 Bivariate Gaussian Random Variables  80
  4.11 Illustrated Problems  80
  4.12 Solutions for the Illustrated Problems  83
  4.13 Drill Problems  89

5 Sums of Random Variables  93
  5.1 Summary  93
    5.1.1 PDF of sum of two RV's  93
    5.1.2 Expected values of sums  93
    5.1.3 Moment Generating Function (MGF)  94
  5.2 Illustrated Problems  94
  5.3 Solutions for the Illustrated Problems  96
  5.4 Drill Problems  100

A 2009 Quizzes  103
  A.1 Quiz Number 1  103
  A.2 Quiz Number 2  104
  A.3 Quiz Number 3  105
  A.4 Quiz Number 4  106
  A.5 Quiz Number 5  107
  A.6 Quiz Number 6  108
  A.7 Quiz Number 7  109
  A.8 Quiz Number 8  110

B 2009 Quizzes: Solutions  111
  B.1 Quiz Number 1  111
  B.2 Quiz Number 2  113
  B.3 Quiz Number 3  114
  B.4 Quiz Number 4  116
  B.5 Quiz Number 5  118
  B.6 Quiz Number 6  120
  B.7 Quiz Number 7  122
  B.8 Quiz Number 8  123

C 2010 Quizzes  125
  C.1 Quiz Number 1  125
  C.2 Quiz Number 2  126
  C.3 Quiz Number 3  127
  C.4 Quiz Number 4  128
  C.5 Quiz Number 5  129
  C.6 Quiz Number 6  130
  C.7 Quiz Number 7  131

D 2010 Quizzes: Solutions  133
  D.1 Quiz Number 1  133
  D.2 Quiz Number 2  135
  D.3 Quiz Number 3  136
  D.4 Quiz Number 4  138
  D.5 Quiz Number 5  139
  D.6 Quiz Number 6  140
  D.7 Quiz Number 7  141

E 2011 Quizzes  143
  E.1 Quiz Number 1  143
  E.2 Quiz Number 2  144
  E.3 Quiz Number 3  145
  E.4 Quiz Number 4  146
  E.5 Quiz Number 5  147
  E.6 Quiz Number 6  148

F 2011 Quizzes: Solutions  149
  F.1 Quiz Number 1  149
  F.2 Quiz Number 2  151
  F.3 Quiz Number 3  153
  F.4 Quiz Number 4  155
  F.5 Quiz Number 5  157
  F.6 Quiz Number 6  159
Chapter 1
Basics of Probability Theory
Goals of ECE342
• Introduce the basics of probability theory,
• Apply probability theory to solve engineering problems.
• Develop intuition into how the theory applies to practical situations.
1.1 Set theory
A set can be described by the tabular method or the description method.
Two special sets: (1) The universal set S and (2) The null set φ.
1.1.1 Basic Set Operations
|A|: cardinality of A.
A ∪ B = {x|x ∈ A or x ∈ B}: union - Either A or B occurs or both occur.
A ∩ B = {x|x ∈ A and x ∈ B}: intersection - both A and B occur.
A − B = {x | x ∈ A and x ∉ B}: set difference.
Ac = {x | x ∈ S and x ∉ A}: complement of A.
⋃_{k=1}^{n} Ak = A1 ∪ A2 ∪ . . . ∪ An : union of n ≥ 2 events - one or more of the Ak's occur.
⋂_{k=1}^{n} Ak = A1 ∩ A2 ∩ . . . ∩ An : intersection of n ≥ 2 events - all Ak's occur simultaneously.
Definition 1.1: A and B are disjoint if A ∩ B = φ.
Definition 1.2: A collection of events A1 , A2 , . . . , An (n ≥ 2) is mutually
exclusive if all pairs of Ai and Aj (i ̸= j) are disjoint.
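These operations map directly onto Python's built-in set type; a small sketch with made-up sets (the values are illustrative, not from the text):

```python
# Basic set operations on small finite sets (illustrative values).
A = {1, 2, 5}
B = {2, 4, 6}
S = {1, 2, 3, 4, 5, 6}  # universal set

union = A | B   # A ∪ B: either A or B occurs, or both
inter = A & B   # A ∩ B: both A and B occur
diff = A - B    # A − B: in A but not in B
comp = S - A    # A^c: complement of A relative to S

print(union, inter, diff, comp)
print(len(A))   # |A|, the cardinality of A
```

Disjointness is then just `A & B == set()`.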
1.1.2 Algebra of Sets
1. Union and intersection are commutative.
2. Union and intersection are distributive.
3. (A ∪ B)c = Ac ∩ Bc - De Morgan's law.
4. Duality Principle
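De Morgan's law and its dual can be checked exhaustively over all subset pairs of a small universal set (an illustrative verification, not part of the notes):

```python
from itertools import chain, combinations

S = frozenset(range(6))  # a small universal set

def subsets(s):
    """All subsets of s, as frozensets."""
    s = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

# Check (A ∪ B)^c = A^c ∩ B^c and the dual (A ∩ B)^c = A^c ∪ B^c
for A in subsets(S):
    for B in subsets(S):
        assert S - (A | B) == (S - A) & (S - B)
        assert S - (A & B) == (S - A) | (S - B)
print("De Morgan's laws hold for all subset pairs")
```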
1.2 Applying Set Theory to Probability
Definition 1.3: An experiment consists of a procedure and observations.
Definition 1.4: An outcome is any possible observation of an experiment.
Definition 1.5: The sample space S of an experiment is the finest-grain,
mutually exclusive, collectively exhaustive set of all possible outcomes.
Definition 1.6: An event is a set of outcomes of an experiment.
Definition 1.7: A set of mutually exclusive sets (events) whose union equals
the sample space is an event space of S. Mathematically, Bi ∩ Bj = φ for all
i ̸= j and B1 ∪ B2 ∪ . . . ∪ Bn = S.
Theorem 1.1: For an event space B = {B1, B2, · · · , Bn} and any event A ⊂ S, let Ci = A ∩ Bi, i = 1, 2, · · · , n. For i ≠ j, the events Ci and Cj are mutually exclusive, i.e., Ci ∩ Cj = φ, and A = ⋃_{i=1}^{n} Ci.

1.3 Probability Axioms
Definition 1.8: Axioms of Probability: A probability measure P [·] is a
function that maps events in S to real numbers such that:
Axiom 1. For any event A, P [A] ≥ 0.
Axiom 2. P [S] = 1.
Axiom 3. For any countable collection A1 , A2 , · · · of mutually exclusive events
P [A1 ∪ A2 ∪ · · · ] = P [A1 ] + P [A2 ] + · · ·
Theorem 1.2: If A = A1 ∪ A2 ∪ · · · ∪ Am and Ai ∩ Aj = φ for i ≠ j, then
P[A] = Σ_{i=1}^{m} P[Ai].
Theorem 1.3: The probability of an event B = {s1, s2, · · · , sm} is the sum of
the probabilities of the outcomes in the event, i.e., P[B] = Σ_{i=1}^{m} P[{si}].

1.4 Some Consequences of Probability Axioms
Theorem 1.4: The probability measure P [·] satisfies
1. P [φ] = 0. 2. P [Ac ] = 1 − P [A].
3. For any A and B (not necessarily disjoint), P [A∪B] = P [A]+P [B]−P [A∩B].
4. If A ⊂ B, then P [A] ≤ P [B].
Theorem 1.5: For any event A and event space B = {B1, B2, · · · , Bm},
P[A] = Σ_{i=1}^{m} P[A ∩ Bi].

1.5 Conditional probability
The probability in Section 1.3 is also called a priori probability. If an event has
happened, this information can be used to update the a priori probability.
Definition 1.9: The conditional probability of event A given B is
P[A|B] = P[A ∩ B] / P[B].
To calculate P[A|B], find P[A ∩ B] and P[B] first.
Theorem 1.6 (Law of total probability): For an event space
{B1, B2, · · · , Bm} with P[Bi] > 0 for all i,
P[A] = Σ_{i=1}^{m} P[A|Bi] P[Bi].
Theorem 1.7 (Bayes' Theorem): P[B|A] = P[A|B] P[B] / P[A].
Theorem 1.8 (Bayes' Theorem - Expanded Version):
P[Bi|A] = P[A|Bi] P[Bi] / Σ_{j=1}^{m} P[A|Bj] P[Bj].
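Theorems 1.6 to 1.8 can be checked numerically. The sketch below uses hypothetical values for the priors P[Bi] and the conditionals P[A|Bi]:

```python
# Hypothetical event space {B1, B2, B3} (illustrative numbers, not from the text).
P_B = [0.5, 0.3, 0.2]          # P[Bi]; must sum to 1
P_A_given_B = [0.9, 0.5, 0.1]  # P[A | Bi]

# Law of total probability: P[A] = sum_i P[A|Bi] P[Bi]
P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))

# Bayes' theorem (expanded version): posterior P[Bi | A]
P_B_given_A = [pa * pb / P_A for pa, pb in zip(P_A_given_B, P_B)]

print(P_A)               # total probability of A
print(P_B_given_A)       # posteriors; they sum to 1
```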
1.6 Independence
Definition 1.10: Events A and B are independent if and only if P [A ∩ B] =
P [A]P [B].
Relationship with conditional probability: P [A|B] = P [A], P [B|A] = P [B]
when A and B are independent.
Definition 1.11: Events A, B and C are independent if and only if
P [A ∩ B] = P [A]P [B]
P [B ∩ C] = P [B]P [C]
P [A ∩ C] = P [A]P [C]
P [A ∩ B ∩ C] = P [A]P [B]P [C].
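Pairwise independence alone does not imply the last condition. The classic two-fair-coin counterexample (an illustration, not from the text) can be verified by enumeration:

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin tosses, each outcome with probability 1/4.
outcomes = list(product("HT", repeat=2))
P = lambda event: Fraction(sum(1 for w in outcomes if event(w)), 4)

A = lambda w: w[0] == "H"   # first toss is heads
B = lambda w: w[1] == "H"   # second toss is heads
C = lambda w: w[0] == w[1]  # the two tosses match

both = lambda e1, e2: (lambda w: e1(w) and e2(w))

# Pairwise independence holds ...
assert P(both(A, B)) == P(A) * P(B)
assert P(both(B, C)) == P(B) * P(C)
assert P(both(A, C)) == P(A) * P(C)
# ... but the triple condition fails, so A, B, C are NOT independent.
all3 = lambda w: A(w) and B(w) and C(w)
assert P(all3) != P(A) * P(B) * P(C)
```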
1.7 Sequential experiments and tree diagrams
Many experiments consist of a sequence of trials (subexperiments) and can be viewed as multi-stage experiments. Such experiments are conveniently represented by tree diagrams, and the law of total probability is then used with the tree diagram to compute event probabilities.
1.8 Counting Methods
Definition 1.12: If task A can be done in n ways and B in k ways, then A and
B can be done in nk ways.
Definition 1.13: If task A can be done in n ways and B in k ways, then either
A or B can be done in n + k ways.
Here are some important cases:
• The number of ways to choose k objects out of n distinguishable objects
(with replacement and with ordering) is n^k.
• The number of ways to choose k objects out of n distinguishable objects
(without replacement and with ordering) is n(n − 1) · · · (n − k + 1).
• The number of ways to choose k objects out of n distinguishable objects
(without replacement and without ordering) is C(n, k) = n! / (k!(n − k)!).
• The number of permutations of n objects out of which n1 are alike, n2 are alike,
. . ., nR are alike is n! / (n1! n2! · · · nR!).
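Python's math module computes these counts directly (illustrative n = 5, k = 3):

```python
import math

n, k = 5, 3

with_repl_ordered = n ** k          # n^k
no_repl_ordered = math.perm(n, k)   # n(n-1)...(n-k+1)
no_repl_unordered = math.comb(n, k) # n! / (k!(n-k)!)

# Multiset permutations, e.g. the letters of "toronto": 7 letters
# with 2 t's and 3 o's alike.
multiset_perms = math.factorial(7) // (math.factorial(2) * math.factorial(3))

print(with_repl_ordered, no_repl_ordered, no_repl_unordered, multiset_perms)
```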
1.9 Reliability Problems
For n independent systems in series: P[W] = Π_{i=1}^{n} P[Wi].
For n independent systems in parallel: P[W] = 1 − Π_{i=1}^{n} (1 − P[Wi]).
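Both formulas are one-liners in Python; the component reliabilities P[Wi] below are made up for illustration:

```python
from math import prod

p = [0.9, 0.95, 0.99]  # P[Wi]: probability each component works (illustrative)

series = prod(p)                         # series: all components must work
parallel = 1 - prod(1 - pi for pi in p)  # parallel: at least one must work

print(round(series, 6), round(parallel, 6))
```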
1.10 Illustrated Problems
1. True or False. Explain your answer in one line.
a) If A = {x2 |0 < x < 2, x ∈ R} and B = {2x|0 < x < 2, x ∈ R} then
A = B.
b) If A ⊂ B then A ∪ B = A
c) If A ⊂ B and B ⊂ C then A ⊂ C
d) For any A, B and C, A ∩ B ⊂ A ∪ C
e) There exists a set A for which (A ∩ ∅c)c ∩ S = A (S is the universal set).
f) For a sample space S and two events A and C, define B1 = A ∩ C, B2 =
Ac ∩ C, B3 = A ∩ Cc and B4 = Ac ∩ Cc. Then {B1, B2, B3, B4} is an
event space.
2. Using the algebra of sets, prove
a) A ∩ (B − C) = (A ∩ B) − (A ∩ C),
b) A − (A ∩ B) = A − B.
3. Sketch A − B for
a) A ⊂ B,
b) B ⊂ A,
c) A and B are disjoint.
4. Consider the following subsets of S = {1, 2, 3, 4, 5, 6}: R1 = {1, 2, 5}, R2 =
{3, 4, 5, 6}, R3 = {2, 4, 6}, R4 = {1, 3, 6}, R5 = {1, 3, 5}. Find:
a) R1 ∪ R2 ,
b) R4 ∩ R5 ,
c) R5c ,
d) (R1 ∪ R2 ) ∩ R3 ,
e) R1c ∪ (R4 ∩ R5 ),
f) (R1 ∩ (R2 ∪ R3 ))c ,
g) ((R1 ∪ R2c ) ∩ (R4 ∪ R5c ))c
h) Write down a suitable event space.
5. Express the following sets in R as a single interval:
a) ((−∞, 1) ∪ (4, ∞))c ,
b) [0, 1] ∩ [0.5, 2],
c) [−1, 0] ∪ [0, 1].
6. By drawing a suitable Venn diagram, convince yourself of the following:
a) A ∩ (A ∪ B) = A,
b) A ∪ (A ∩ B) = A.
7. Three telephone lines are monitored. At a given time, each telephone line
can be in one of the following three modes: (1) Voice Mode, i.e., the line is
busy and someone is speaking (2) Data Mode, i.e., the line is busy with a
modem or fax signal and (3) Inactive Mode, i.e., the line is not busy. We
show these three modes with V, D and I respectively. For example if the
first and second lines are in Data Mode and the third line is in Inactive
Mode, the observation is DDI.
a) Write the elements of the event A = {at least two Voice Modes}.
b) Write the elements of B = {number of Data Modes > 1 + number of
Voice Modes}.
8. The data packets that arrive at an Internet switch are buffered to be processed. When the buffer is full, the arriving packet is dropped and the transmission must be repeated. To study this system, at the arrival time of any
new packet, we observe the number of packets that are already stored in the
buffer. Assuming that the switch can buffer a maximum of 5 packets, the
used buffer at any given time is 0, 1, 2, 3, 4 or 5 packets. Thus the sample
space for this experiment is S = {0, 1, 2, 3, 4, 5}.
This experiment is repeated 500 times and the following data is recorded.
Used buffer    Number of times observed
0              112
1              119
2              131
3              85
4              43
5              10
The relative frequency of an event A is defined as nA/n, where nA is the number
of times A occurs and n is the total number of observations.
a) Consider the following three mutually exclusive events: A = {0, 1, 2},
B = {3, 4}, C = {5}. Find the relative frequency of these events.
b) Show that the relative frequency of A ∪ B ∪ C is equal to the sum of
the relative frequencies of A, B and C.
9. Consider an elevator in a building with four stories, 1-4, with 1 being the
ground floor. Three people enter the elevator on floor 1 and push buttons for
their destination floors. Let the outcomes be the possible stopping patterns
for all passengers to leave the elevator on the way up. For example, 2-2-4
means the elevator stops on floors 2 and 4. Therefore,
2-2-4 is an outcome in S.
a) List the sample space, S, with its elements (outcomes).
b) Consider all outcomes equally likely. What is the probability of each
outcome?
c) Let E = {stops only on even floors} and T = {stops only twice}. Find
P[E] and P[T ].
d) Find P [E ∩ T ]
e) Find P [E ∪ T ]
f) Is P[E ∪ T] = P[E] + P[T]? Does this contradict the third axiom of
probability?
10. This problem requires the use of event spaces. Consider a random experiment and four events A, B, C, and D such that A and B form an event
space and also C and D form an event space. Furthermore, P [A ∩ C] = 0.3
and P [B ∩ D] = 0.25.
a) Find P [A ∪ C].
b) If P [D] = 0.58, find P [A].
11. Prove the following inequalities:
a) P [A ∪ B] ≤ P [A] + P [B].
b) P [A ∩ B] ≥ P [A] + P [B] − 1.
12. This problem requires the law of total probability and conditional
probability. A study on the relation between family size and the number
of cars reveals the following probabilities.
                          Number of cars
Family size               0     1     2     More than 2
S: Small (2 or less)      0.04  0.14  0.02  0.00
M: Medium (3, 4 or 5)     0.02  0.33  0.23  0.02
L: Large (more than 5)    0.01  0.03  0.13  0.03
Answer the following questions:
a) What is the probability of a random family having less than 2 cars?
b) Given that a family has more than 2 cars, what is the probability that
this family is large?
c) Given that a family has less than 2 cars, what is the probability that
this family is large?
d) Given that the family size is not medium, what is the probability of
having one car?
13. A communication channel model is shown in Fig. 1.1. The input is either 0 or
1, and the output is 0, 1 or X, where X represents a bit that is lost and does
not arrive at the channel output. Also, due to noise and other imperfections,
the channel may transmit a bit in error. When Input = 0, the correct output
(Output = 0) occurs with a probability of 0.8, the incorrect output (Output
= 1) occurs with a probability of 0.1, and the bit is lost (Output = X)
with a probability of 0.1. When Input = 1, the correct output (Output = 1)
occurs with a probability of 0.7, the wrong output (Output = 0) occurs with
a probability of 0.2, and the bit is lost (Output = X) with a probability of
0.1. Assume that the inputs 0 and 1 are equally likely (i.e. P [0] = P [1]).
a) If Output = 1, what is the probability of Input = 1?
b) If the output is X, what is the probability of Input = 1, and what is
the probability of Input = 0?
c) Repeat part a), but this time assume that the inputs are not equally
likely and P [0] = 3P [1].
14. This problem requires Bayes' theorem. Considering all the other evidence,
Sherlock was 60% certain that Jack is the criminal. This morning, he found
(Figure 1.1: Communication Channel - inputs 0 and 1, outputs 0, 1 and X.)
another piece of evidence proving that the criminal is left handed. Dr.
Watson just called and informed Sherlock that on average 20% of people are
left handed and that Jack is indeed left handed. How certain of Jack's guilt
should Sherlock be after receiving this call?
15. This problem requires Bayes' theorem. Two urns A and B each have 10
balls. Urn A has 3 green, 2 red and 5 white balls and Urn B has 1 green, 6
red and 3 white balls. One urn is chosen at random (equally likely) and one
ball is drawn from it (balls are also chosen equally likely).
a) What is the probability that this ball is red?
b) Given that the drawn ball is red, what is the probability that Urn A was
selected?
c) Suppose the drawn ball is green. Now we return this green ball to the
other urn and draw a ball from it (from the urn that received the green
ball). What is the probability that this ball is red?
16. Two urns, A with 1 blue and 6 red balls and B with 6 blue and 1 red ball,
are present. Flip a coin. If the outcome is H, put one random ball from A
in B, and if the outcome is T, put one random ball from B in A. Now draw
a ball from A. If blue, you win. If not, draw a ball from B; if blue you win,
if red, you lose. What is the probability of winning this game?
17. Two coins are in an urn. One is fair with P [H] = P [T ] = 0.5, and one is
biased with P [H] = 0.25 and P [T ] = 0.75. One coin is chosen at random
(equally likely) and is tossed three times.
a) Given that the biased coin is selected, what is the probability of TTT?
b) Given that the biased coin is selected and that the outcome of the first
three tosses is TTT, what is the probability that the next toss is T?
(Figure 1.2: for question 20 - a grid with corners A and B and an interior point C.)
c) This time, assume that we do not know which coin is selected. We
observe that the first three outcomes are T T T . What is the probability
that the next outcome is T ?
d) Define two events E1: the outcomes of the first three tosses are TTT;
E2: the fourth toss is T. Are E1 and E2 independent?
e) Given that the biased coin is selected, are E1 and E2 independent?
18. Answer the following questions about rearranging the letters of the word
“toronto”
a) How many different orders are there?
b) In how many of them does 'r' appear before 'n'?
c) In how many of them is the middle letter a consonant?
d) How many do not have any pair of consecutive ‘o’s?
19. Consider a class of 14 girls and 16 boys. Also, two of the girls are sisters. A
team of 8 players is selected from this class at random.
a) What is the probability that the team consists of 4 girls and 4 boys?
b) What is the probability that the team is uni-gender (all boys or all
girls)?
c) What is the probability that the number of girls is greater than the
number of boys?
d) What is the probability that both sisters are in the team?
20. In the network (Fig. 1.2), a data packet is sent from A to B. In each step,
the packet can be sent one block either to the right or up. Thus, a total of
9 steps are required to reach B.
a) How many paths are there from A to B?
b) If one of these paths is chosen randomly (equally likely), what is the
probability that it passes through C?
(Figure 1.3: Question 22 - two parallel paths from a to b, each containing two repeaters R in series.)
21. A binary communication system transmits a signal X that is either a +2
voltage signal or a −2 voltage signal. These voltage signals are equally
likely. A malicious channel reduces the magnitude of the received signal by
the number of heads it counts in two tosses of a coin. Let Y be the resulting
signal.
a) Describe the sample space in terms of input-output pairs.
b) Find the set of outcomes corresponding to the event ‘transmitted signal
was definitely +2’.
c) Describe in words the event corresponding to the outcome Y = 0.
d) Use a tree diagram to find the set of possible input-output pairs.
e) Find the probabilities of the input-output pair.
f) Find the probabilities of the output values.
g) Find the probability that the input was X = +2 given that Y = k for
all possible values of k.
22. In a communication system the signal sent from point a to point b arrives
along two paths in parallel (Fig. 1.3). Over each path the signal passes
through two repeaters in series. Each repeater in Path 1 has a 0.05 probability of failing (because of an open circuit). This probability is 0.08 for
each repeater on Path 2. All repeaters fail independently of each other.
a) Find the probability that the signal will not arrive at point b.
1.11 Solutions for the Illustrated Problems
1.
a) True. They both contain all real numbers between 0 and 4.
b) False. A ∪ B = B
c) True. ∀x ∈ A ⇒ x ∈ B ⇒ x ∈ C, therefore: A ⊂ C.
d) True. Because (A ∩ B) ⊂ A and A ⊂ A ∪ C.
e) False. (A ∩ φc)c = (A ∩ S)c = (A)c = Ac. There is no set A such
that A = Ac.
f) True. The Bi's are mutually exclusive and collectively exhaustive.
2.
a) Starting with the left hand side we have:
A ∩ (B − C) = A ∩ (B ∩ Cc) = (A ∩ B) ∩ Cc = (A ∩ B) − C.
For the right hand side we have:
(A ∩ B) − (A ∩ C) = (A ∩ B) ∩ (A ∩ C)c = (A ∩ B) ∩ (Ac ∪ Cc)
= (A ∩ B ∩ Ac) ∪ (A ∩ B ∩ Cc)
We also know that (A ∩ B ∩ Ac) = ((A ∩ Ac) ∩ B) = φ and
(A ∩ B ∩ Cc) = ((A ∩ B) ∩ Cc) = (A ∩ B) − C.
As a result we have: (A ∩ B) − (A ∩ C) = φ ∪ ((A ∩ B) − C) =
(A ∩ B) − C.
Both sides are equal to (A ∩ B) − C, and therefore the equality holds.
3.
b) A − (A ∩ B) = A ∩ (A ∩ B)c = A ∩ (Ac ∪ B c ) = (A ∩ Ac ) ∪ (A ∩ B c )
We know that A ∩ Ac = φ.
Thus we have: A − (A ∩ B) = φ ∪ (A ∩ B c ) = A ∩ B c = A − B
a) null set
b)
A
(A – B)
B
c)
S
A
(A – B)
4.
a) R1 ∪ R2 = {1, 2, 3, 4, 5, 6}
b) R4 ∩ R5 = {1, 3}
c) R5c = {2, 4, 6}
d) (R1 ∪ R2 ) ∩ R3 = {2, 4, 6}
e) R1c ∪ (R4 ∩ R5 ) = {1, 3, 4, 6}
f) (R1 ∩ (R2 ∪ R3 ))c = {1, 3, 4, 6}
g) ((R1 ∪ R2c ) ∩ (R4 ∪ R5c ))c = {3, 4, 5, 6}
h) One solution is {1,2,3} and {4,5,6}, which partition S into two disjoint
sets.
5.
a) ((−∞, 1) ∪ (4, ∞))c = [1, 4]
b) [0, 1] ∩ [0.5, 2] = [0.5, 1]
c) [−1, 0] ∪ [0, 1] = [−1, 1]
6. Try drawing Venn diagrams.
7. A = {VVI, VVD, VVV, VIV, VDV, IVV, DVV}
B = {DDD, DDI, DID, IDD}
8. a) nA/n = (112 + 119 + 131)/500 = 0.724
nB/n = (85 + 43)/500 = 0.256
nC/n = 10/500 = 0.02
b) nA∪B∪C/n = 500/500 = 1 and
nA/n + nB/n + nC/n = 0.724 + 0.256 + 0.02 = 1 = nA∪B∪C/n
9.
a) S = {2 − 2 − 2, 2 − 2 − 3, 2 − 2 − 4, 2 − 3 − 3, 2 − 3 − 4, 2 − 4 − 4,
3 − 3 − 3, 3 − 3 − 4, 3 − 4 − 4, 4 − 4 − 4}
b) There are 10 elements in S, thus the probability of each outcome is
1/10. To be mathematically rigorous, one can define 10 mutually exclusive outcomes: E1 = {2−2−2}, E2 = {2−2−3}, . . ., E10 = {4−4−4}.
These outcomes are also collectively exhaustive.
Thus, using the second and the third axioms of probability, P[E1] +
P[E2] + · · · + P[E10] = P[S] = 1.
Now, since these outcomes are equally likely, each has P [Ei ] = 1/10.
c) E = {2 − 2 − 2, 2 − 2 − 4, 2 − 4 − 4, 4 − 4 − 4},
T = {2 − 2 − 3, 2 − 2 − 4, 2 − 3 − 3, 2 − 4 − 4, 3 − 3 − 4, 3 − 4 − 4}
Thus, P [E] = 4/10, P [T ] = 6/10
d, e) E ∩ T = {2 − 2 − 4, 2 − 4 − 4}
E ∪ T = {2 − 2 − 2, 2 − 2 − 3, 2 − 2 − 4, 2 − 3 − 3, 2 − 4 − 4, 3 − 3 −
4, 3 − 4 − 4, 4 − 4 − 4}
Thus, P [E ∩ T ] = 2/10, P [E ∪ T ] = 8/10.
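The counts in parts (a) to (e) can be reproduced by brute-force enumeration of the multisets of destination floors (a sketch consistent with the solution above):

```python
from itertools import combinations_with_replacement
from fractions import Fraction

# Each outcome is a non-decreasing triple of destination floors in {2, 3, 4}.
S = list(combinations_with_replacement([2, 3, 4], 3))
assert len(S) == 10  # matches part (a)

E = [s for s in S if all(f % 2 == 0 for f in s)]  # stops only on even floors
T = [s for s in S if len(set(s)) == 2]            # stops exactly twice

P_E = Fraction(len(E), len(S))
P_T = Fraction(len(T), len(S))
P_ET = Fraction(len([s for s in E if s in T]), len(S))
print(P_E, P_T, P_ET)
```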
f) It can be seen that P[E ∪ T] ≠ P[E] + P[T]. This does not contradict
the third axiom, because the third axiom applies only to mutually exclusive
(in this case, disjoint) events. E and T are not disjoint.
10.
A = B c and C = Dc
a) P [B ∩ D] = P [C c ∩ Ac ] = P [(A ∪ C)c ] = 1 − P [A ∪ C]
⇒ P [A ∪ C] = 0.75
b) P [A ∪ C] = P [A] + P [C] − P [A ∩ C]
⇒ P [A] = P [A ∪ C] − P [C] + P [A ∩ C]
P [C] = 1 − P [D] = 0.42
⇒ P [A] = 0.75 − 0.42 + 0.3 = 0.63
11. a) P[A ∪ B] = P[A] + P[B] − P[A ∩ B] and P[A ∩ B] ≥ 0
⇒ P[A ∪ B] ≤ P[A] + P[B]
Notice that from a) it can easily be concluded that
P[A ∪ B ∪ C ∪ · · ·] ≤ P[A] + P[B] + P[C] + · · ·
b) P[A ∪ B] = P[A] + P[B] − P[A ∩ B] and P[A ∪ B] ≤ 1
⇒ P[A] + P[B] − P[A ∩ B] ≤ 1 ⇒ P[A ∩ B] ≥ P[A] + P[B] − 1
12.
a) We define A to be the event that a random family has less than two
cars and N to be the number of cars.
P [A ∩ S] = P [N = 0 ∩ S] + P [N = 1 ∩ S] = 0.04 + 0.14 = 0.18
P [A ∩ M ] = P [N = 0 ∩ M ] + P [N = 1 ∩ M ] = 0.02 + 0.33 = 0.35
P [A ∩ L] = P [N = 0 ∩ L] + P [N = 1 ∩ L] = 0.01 + 0.03 = 0.04
P [A] = P [A ∩ S] + P [A ∩ M ] + P [A ∩ L] = 0.18 + 0.35 + 0.04 = 0.57
b) P[L|N > 2] = P[L ∩ (N > 2)]/P[N > 2]
= P[L ∩ (N > 2)]/(P[L ∩ (N > 2)] + P[M ∩ (N > 2)] + P[S ∩ (N > 2)])
⇒ P[L|N > 2] = 0.03/(0.03 + 0.02 + 0) = 0.6
c) P[L|N < 2] = P[L ∩ (N < 2)]/P[N < 2]
= P[L ∩ (N < 2)]/(P[L ∩ (N < 2)] + P[M ∩ (N < 2)] + P[S ∩ (N < 2)])
⇒ P[L|N < 2] = (0.03 + 0.01)/((0.03 + 0.01) + (0.33 + 0.02) + (0.14 + 0.04))
= 0.04/0.57 = 4/57 ≈ 0.07
d) P[N = 1|M̄] = P[M̄ ∩ (N = 1)]/P[M̄] = P[(S ∪ L) ∩ (N = 1)]/P[S ∪ L]
= (P[S ∩ (N = 1)] + P[L ∩ (N = 1)])/(P[S] + P[L])
= (0.14 + 0.03)/((0.04 + 0.14 + 0.02 + 0.00) + (0.01 + 0.03 + 0.13 + 0.03))
= 0.17/0.4 = 17/40 = 0.425
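The computations for problem 12 can be cross-checked with a small dictionary of the joint probabilities transcribed from the table (the key 3 stands for "more than 2 cars"):

```python
# Joint probabilities P[size, cars] from the table in problem 12.
P = {("S", 0): 0.04, ("S", 1): 0.14, ("S", 2): 0.02, ("S", 3): 0.00,
     ("M", 0): 0.02, ("M", 1): 0.33, ("M", 2): 0.23, ("M", 3): 0.02,
     ("L", 0): 0.01, ("L", 1): 0.03, ("L", 2): 0.13, ("L", 3): 0.03}

P_less2 = sum(p for (s, n), p in P.items() if n < 2)       # part (a)
P_L_more2 = P[("L", 3)] / sum(P[(s, 3)] for s in "SML")    # part (b)
P_L_less2 = sum(P[("L", n)] for n in (0, 1)) / P_less2     # part (c)
P_1_notM = (P[("S", 1)] + P[("L", 1)]) / sum(
    p for (s, n), p in P.items() if s != "M")              # part (d)

print(round(P_less2, 2), round(P_L_more2, 2),
      round(P_L_less2, 2), round(P_1_notM, 3))
```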
13. a) P[in = 1|out = 1] = P[out = 1|in = 1]·P[in = 1] / (P[out = 1|in = 1]·P[in = 1] + P[out = 1|in = 0]·P[in = 0])
= (0.7 · 0.5)/(0.7 · 0.5 + 0.1 · 0.5) = 0.875
b) P[in = 1|out = X] = P[out = X|in = 1]·P[in = 1] / (P[out = X|in = 1]·P[in = 1] + P[out = X|in = 0]·P[in = 0])
= (0.1 · 0.5)/(0.1 · 0.5 + 0.1 · 0.5) = 0.5
P[in = 0|out = X] = P[out = X|in = 0]·P[in = 0] / (P[out = X|in = 1]·P[in = 1] + P[out = X|in = 0]·P[in = 0])
= (0.1 · 0.5)/(0.1 · 0.5 + 0.1 · 0.5) = 0.5
or
P[in = 0|out = X] = 1 − P[in = 1|out = X] = 1 − 0.5 = 0.5.
c) P[0] + P[1] = 1 ⇒ 3P[1] + P[1] = 1 ⇒ P[1] = 0.25
P[in = 1|out = 1] = (0.7 · 0.25)/(0.7 · 0.25 + 0.1 · 0.75) = 0.7
14. First we define some events as follows:
C = the event that Jack is the criminal
L = the event that Jack is left handed.
Now we use Bayes' rule and write
P[C|L] = P[L|C]P[C]/P[L] = 1 × P[C]/P[L] = P[C]/P[L]
P[L] = P[L|Cc]P[Cc] + P[L|C]P[C] = (0.2) · (0.4) + (1) · (0.6) = 0.68
P[C|L] = P[C]/P[L] = 0.6/0.68 ≈ 0.88
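The same Bayesian update takes a few lines of Python (numbers from the problem statement):

```python
# Prior and likelihoods for problem 14.
P_C = 0.6             # prior: Jack is the criminal
P_L_given_C = 1.0     # the criminal is certainly left handed
P_L_given_notC = 0.2  # 20% of people are left handed

# Law of total probability, then Bayes' rule.
P_L = P_L_given_C * P_C + P_L_given_notC * (1 - P_C)
P_C_given_L = P_L_given_C * P_C / P_L
print(round(P_C_given_L, 2))  # posterior certainty of Jack's guilt
```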
15. a) P[red] = P[red|A]P[A] + P[red|B]P[B] = (2/10) · (1/2) + (6/10) · (1/2) = 0.4
b) P[A|red] = P[red|A]P[A]/P[red] = (0.2) · (0.5)/0.4 = 0.25
c) Let A & B denote drawing the first ball from urn A & B respectively.
Then
P[A|green] = P[green|A]P[A] / (P[green|A]P[A] + P[green|B]P[B])
= (0.3) · (0.5) / ((0.3) · (0.5) + (0.1) · (0.5)) = 0.75
P[B|green] = P[green|B]P[B] / (P[green|A]P[A] + P[green|B]P[B])
= (0.1) · (0.5) / ((0.3) · (0.5) + (0.1) · (0.5)) = 0.25
P[red|green] = P[red|A, green]·P[A|green] + P[red|B, green]·P[B|green]
= (6/11) · (0.75) + (2/11) · (0.25) = 5/11
16. Let us define the event A↑ to denote drawing a ball from urn A, and
similarly B↑ for urn B. Summing the winning branches of the tree diagram
(Figure 1.4),
P[win] = (1/6 · 6/7 · 1/2) + (6/8 · 5/6 · 6/7 · 1/2) + 0 + (1 · 7/8 · 1/7 · 1/2)
+ (2/8 · 6/7 · 1/2) + (5/6 · 6/8 · 6/7 · 1/2) + (1/8 · 1/7 · 1/2) + (7/8 · 1 · 1/7 · 1/2)
= 0.848
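A Monte Carlo simulation of the game (a sketch, not part of the original solution) agrees with the exact value 0.848:

```python
import random

def play(rng):
    """Simulate one round of the game in problem 16; return True on a win."""
    A = ["b"] * 1 + ["r"] * 6
    B = ["b"] * 6 + ["r"] * 1
    if rng.random() < 0.5:                 # heads: move a random ball A -> B
        B.append(A.pop(rng.randrange(len(A))))
    else:                                  # tails: move a random ball B -> A
        A.append(B.pop(rng.randrange(len(B))))
    if A[rng.randrange(len(A))] == "b":    # draw from A: blue wins
        return True
    return B[rng.randrange(len(B))] == "b" # else draw from B: blue wins

rng = random.Random(0)
n = 200_000
wins = sum(play(rng) for _ in range(n))
print(wins / n)  # close to 0.848
```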
17. a) P[T1T2T3|b] = (0.75)^3 = 0.422
b) P [T4 |b, T1 T2 T3 ] = 0.75
c) P [T4 |T1 T2 T3 ] = P [T4 |b, T1 T2 T3 ]P [b|T1 T2 T3 ]+P [T4 |f, T1 T2 T3 ]P [f |T1 T2 T3 ]
P [T1 T2 T3 ] = P [T1 T2 T3 |b]P [b] + P [T1 T2 T3 |f ]P [f ] = 0.422 × 0.5 + 0.5 ×
(0.5)3 = 0.2735
P[b|T1T2T3] = P[T1T2T3|b]P[b]/P[T1T2T3] = (0.422 × 0.5)/0.2735 = 0.77
P [f |T1 T2 T3 ] = 1 − P [b|T1 T2 T3 ] = 0.23
⇒ P [T4 |T1 T2 T3 ] = 0.75 × 0.77 + 0.5 × 0.23 = 0.69
d) No, because if TTT happens, the probability that the biased coin was
chosen increases, which in turn raises the probability of another T.
e) Yes; given the coin, the tosses are independent.
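The posterior and predictive probabilities for problem 17 can be reproduced numerically:

```python
# Posterior of the biased coin after observing TTT, and the predictive
# probability that the next toss is T (numbers from the solution above).
P_b, P_f = 0.5, 0.5          # prior: biased or fair coin
P_TTT_b = 0.75 ** 3          # P[TTT | biased] = 0.421875
P_TTT_f = 0.5 ** 3           # P[TTT | fair]   = 0.125

P_TTT = P_TTT_b * P_b + P_TTT_f * P_f       # total probability
P_b_TTT = P_TTT_b * P_b / P_TTT             # Bayes: P[biased | TTT]
P_T4 = 0.75 * P_b_TTT + 0.5 * (1 - P_b_TTT) # P[next toss is T | TTT]
print(round(P_b_TTT, 2), round(P_T4, 2))
```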
18. a) (7 choose 3, 2, 1, 1) = 7!/(3! · 2! · 1! · 1!) = 420
b) For every arrangement in which r appears before n, there is a counterpart
where n appears before r (just interchange r and n). Thus in half of the
arrangements r appears before n. The answer, therefore, is 420/2 = 210.
c) The middle letter can be t, r or n. If t, we have 6!/3! = 120 arrangements.
If r (or n), we have 6!/(3! · 2!) = 60 arrangements.
Total = 120 + 60 + 60 = 240.
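All four counts can be confirmed by brute force over the distinct arrangements of "toronto":

```python
from itertools import permutations

# All distinct orderings of the letters of "toronto".
perms = set(permutations("toronto"))
assert len(perms) == 420                                   # part (a)

r_before_n = sum(1 for p in perms if p.index("r") < p.index("n"))
assert r_before_n == 210                                   # part (b)

middle_consonant = sum(1 for p in perms if p[3] in "trn")
assert middle_consonant == 240                             # part (c)

no_double_o = sum(1 for p in perms if "oo" not in "".join(p))
assert no_double_o == 120                                  # part (d)
print("all four counts confirmed")
```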
(Figure 1.4: Tree Diagram for 16.)
d) We can think of it as “_ X _ X _ X _ X _”, where “X” represents
other letters and “_” represents a potential location for “o” (notice
that this way consecutive “o”s are avoided). There are 5 locations for
“o” and we want to pick three of them. Since order does not matter,
the total number of ways is C(5, 3) = 10. The other 4 letters (“trnt”)
have a total of 4!/2! = 12 arrangements among themselves to fill the “X”
locations. So the total will be 12 × 10 = 120.
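All four counts can be confirmed by brute force, assuming the word being permuted is “toronto” (an inference consistent with the letter multiplicities 3, 2, 1, 1 used above):

```python
# Enumerate the distinct arrangements and count each case directly.
from itertools import permutations

arrangements = set(permutations("toronto"))
total = len(arrangements)                                    # part a: 420
r_before_n = sum(1 for a in arrangements
                 if a.index("r") < a.index("n"))             # part b: 210
middle_trn = sum(1 for a in arrangements if a[3] in "trn")   # part c: 240
no_adjacent_o = sum(1 for a in arrangements
                    if "oo" not in "".join(a))               # part d: 120
```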
19.
a) C(14, 4)·C(16, 4) / C(30, 8) = (1001 × 1820)/5852925 = 0.31

b) P[all girl] = C(14, 8)·C(16, 0) / C(30, 8) = (3003 × 1)/5852925 = 0.000513
   P[all boy] = C(14, 0)·C(16, 8) / C(30, 8) = (1 × 12870)/5852925 = 0.0022
   Therefore,
   P[one gender] = 0.0022 + 0.00051 = 0.00271

c) P[g > b] = [C(14, 8)·C(16, 0) + C(14, 7)·C(16, 1) + C(14, 6)·C(16, 2) + C(14, 5)·C(16, 3)] / C(30, 8)
   = (3003 + 54912 + 360360 + 1121120)/5852925 = 0.263

d) C(28, 6)·C(2, 2) / C(30, 8) = 376740/5852925 = 0.064

20.
a) We can look at this question as follows: of the 9 steps, 4 need to be
upward and 5 to the right. Therefore, out of 9 steps we want to pick the
4 upward ones. We get:
Number of paths = C(9, 4) = 126.
b) Number of paths from A to C (similar to part a) is C(4, 2) and the number
of paths from C to B is C(5, 3). Thus the number of all paths from A
to B which pass through C is C(4, 2)·C(5, 3). So the required probability is
P[C] = C(4, 2)·C(5, 3) / C(9, 4) = (6 × 10)/126 = 0.476.

21.
a) If X = +2: the outcome HH gives Y = 0, HT or TH gives Y = +1, and TT gives Y = +2.
   If X = −2: the outcome HH gives Y = 0, HT or TH gives Y = −1, and TT gives Y = −2.
S = {(+2, 0), (+2, +1), (+2, +2), (−2, 0), (−2, −1), (−2, −2)}
b) E = {+1, +2}
c) {Y = 0} ={number of heads tossed was 2}
d)
Outcome      (X, Y)      Probability
HH           (+2, 0)     1/8
HT or TH     (+2, +1)    1/4
TT           (+2, +2)    1/8
HH           (−2, 0)     1/8
HT or TH     (−2, −1)    1/4
TT           (−2, −2)    1/8
e) P[+2, 0] = 1/8    P[+2, +1] = 1/4    P[+2, +2] = 1/8
   P[−2, 0] = 1/8    P[−2, −1] = 1/4    P[−2, −2] = 1/8

f) P[Y = 0] = 1/4    P[Y = +1] = 1/4    P[Y = +2] = 1/8
   P[Y = −1] = 1/4   P[Y = −2] = 1/8
g) P[X = 2|Y = 0] = P[X = 2, Y = 0]/P[Y = 0] = (1/8)/(1/4) = 1/2
Similarly, P [X = +2|Y = +1] = 1, P [X = +2|Y = +2] = 1,
P [X = +2|Y = −1] = P [X = +2|Y = −2] = 0.
22.
a)
P[Path 1 fails] = P[(R1 fails) ∪ (R2 fails)]
                = P[R1 fails] + P[R2 fails] − P[(R1 fails) ∩ (R2 fails)]
                = 0.05 + 0.05 − (0.05)·(0.05) = 0.0975
P[Path 2 fails] = P[R3 fails] + P[R4 fails] − P[(R3 fails) ∩ (R4 fails)]
                = 0.08 + 0.08 − (0.08)·(0.08) = 0.1536
P[fail] = P[(Path 1 fails) ∩ (Path 2 fails)]
        = (0.1536)·(0.0975) = 0.014976
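The series/parallel computation above can be sketched numerically (component failure probabilities as in the solution):

```python
# Each path fails if either of its two series components fails; the
# system fails only if both parallel paths fail.
q1 = q2 = 0.05   # failure probabilities of R1, R2 (path 1)
q3 = q4 = 0.08   # failure probabilities of R3, R4 (path 2)

path1_fails = q1 + q2 - q1 * q2          # P[(R1 fails) U (R2 fails)]
path2_fails = q3 + q4 - q3 * q4
system_fails = path1_fails * path2_fails  # ~0.014976
```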
1.12 Drill Problems
Section 1.1,1.2,1.3 and 1.4 - Set theory and
Probability axioms
1. A 6-sided die is tossed once. Let the event A be defined as A = ‘outcome is a
prime number’.
a) Write down the sample space S.
b) The die is unbiased (i.e. all outcomes are equally likely). What is the
probability P [A] of event A?
c) Suppose that the die is biased such that the outcomes 2 through 6
are equally likely, and the outcome 1 is twice as likely as each of them.
What would the probability P[A] of event A be then?
Ans
a) S = {1, 2, 3, 4, 5, 6}
b) P [A] = 0.5
c) P[A] = 3/7
2. An unbiased 4-sided die is tossed. Let the events A and B be defined as:
A =‘outcome is a prime number’ and B = {4}.
a) Find probabilities P [A] and P [B].
b) What is A ∩ B? Write down P [A ∩ B].
c) What does this imply about A and B?
d) Find P [(A ∩ B)c ]
Ans
a) P [A] = 0.5, P [B] = 0.25
b) A ∩ B = ∅, P[A ∩ B] = 0
c) mutually exclusive
d) P [(A ∩ B)c ] = 1
Section 1.6 - Independence
3. An unbiased 4-sided die is tossed. Let the events A and B be defined as:
A =‘outcome is a prime number’ and B =‘outcome is an even number’.
a) Find probabilities P [A] and P [B].
1.12 Drill Problems
21
b) What is A ∩ B? Write down P [A ∩ B].
c) Are A and B mutually exclusive?
d) Are A and B independent?
Ans
a) P [A] = 0.5, P [B] = 0.5
b) A ∩ B = {2}, P [A ∩ B] = 0.25
c) no
d) yes
4. A pair of unbiased 6-sided dice (X and Y) is tossed simultaneously. Let the
events A and B be defined as
A: ‘X yields 2’
B: ‘Y yields 2’
a) Write down the sample space S.
Note that each outcome is a pair (x, y), where x, y ∈ {1, 2, 3, 4, 5, 6}.
b) Write A and B as sets of outcomes. Find corresponding probabilities
P [A] and P [B].
c) What is A ∩ B? Write down P [A ∩ B].
d) What is P [A ∪ B]?
e) Are A and B mutually exclusive?
f) Are A and B independent?
Ans
a) S = {(i, j)|1 ≤ i, j ≤ 6}
b) A = {(2, j)|1 ≤ j ≤ 6},
B = {(i, 2)|1 ≤ i ≤ 6},
P [A] = 16 , P [B] = 16
c) A ∩ B = {(2, 2)}, P[A ∩ B] = 1/36
d) P[A ∪ B] = 11/36
e) no
f) yes
Section 1.5 - Conditional Probability
5. In a certain experiment, A, B, C, and D are events with probabilities P [A] =
1/4, P [B] = 1/8, P [C] = 5/8, and P [D] = 3/8. A and B are disjoint, while
C and D are independent.
Hint: Venn diagrams are helpful for problems like this.
a) Find P [A ∩ B], P [A ∪ B], P [A ∩ B c ], and P [A ∪ B c ].
b) Are A and B independent?
c) Find P [C ∩ D], P [C ∩ Dc ], and P [C c ∩ Dc ].
d) Are C c and Dc independent?
e) Find P [A|B] and P [B|A].
f) Find P [C|D] and P [D|C].
g) Verify that P [C c |D] = 1 − P [C] for this problem. Can you interpret
its meaning?
Ans
a) P [A ∩ B] = 0, P [A ∪ B] = 0.375,
P [A ∩ B c ] = 0.25, P [A ∪ B c ] = 0.875
b) no
c) P [C ∩ D] = 0.2344, P [C ∩ Dc ] = 0.3906,
P [C c ∩ Dc ] = 0.2344
d) yes
e) P [A|B] = 0, P [B|A] = 0
f) P [C|D] = 0.625, P [D|C] = P [D] = 3/8
6. Let A be an arbitrary event. Events D, E and F form an event space.
P [D] = 0.35 P [A|D] = 0.4
P [E] = 0.55 P [A|E] = 0.2
P [F ] = ?
P [A|F ] = 0.3
a) Find P [F ] and P [A].
b) Find P [A ∩ D].
c) Use Bayes’ rule to compute P [D|A] and P [E|A].
d) Can you compute P [F |A] without using the Bayes’ rule?
e) Compute P [Ac |D], P [Ac |E] and P [Ac |F ]. What is the axiom you had
to use?
f) Use Bayes’ rule to compute P [D|Ac ] and P [E|Ac ].
g) What theorem(s) do you need to compute P[Ac]? Is the value in agreement
with P[A] computed in part a)?
Ans
a) P [F ] = 0.1, P [A] = 0.28
b) P [A ∩ D] = 0.14
c) P [D|A] = 0.5, P [E|A] = 0.3929
d) P [F |A] = 1 − P [D|A] − P [E|A]
e) P [Ac |D] = 0.6, P [Ac |E] = 0.8,
P [Ac |F ] = 0.7
f) P [D|Ac ] = 0.2917, P [E|Ac ] = 0.6111
Section 1.7 - Sequential experiments and tree
diagrams
7. Tabulated below is the number of different electronic components contained
in boxes B1 and B2 .
       capacitors   diodes
B1     3            3
B2     1            5
A box is chosen at random, then a component is selected at random from the
box. The boxes are equally likely to be selected. The selection of electronic
components from the chosen box is also equally likely.
a) Draw a probability tree for the experiment.
b) What is the probability that the component selected is a diode?
c) Find the probability of selecting a capacitor from B1 .
d) Suppose that the component selected is a capacitor. What is the probability that it came from B1 ?
Ans
b) 0.6667
c) 0.25
d) 0.75
8. Consider the following scenario at the quality assurance division of a certain
manufacturing plant.
In each lot of 100 items produced, two items are tested; and the whole lot
is rejected if either of the tested items is found to be defective. Outcome of
each test is independent of the other tests.
Let q be the probability of an item being defective. Suppose A denotes the
event ‘the lot under inspection is accepted’; and k denotes the event ‘the lot
has k defective items’, where k ∈ {0, . . . , 100}.
a) Compute probability P [k] of having k defective items in a lot.
b) Find the probability P [A ∩ k] that a lot with k defective items is accepted.
Note: check whether your result for P [A ∩ k] is intuitive for both k = 0
and k = 99.
c) What is the conditional probability P [A|k] of ‘a lot being accepted’
given it has k defective items?
Ans
a) P[k] = C(100, k)·q^k·(1 − q)^(100−k)

b) P[A ∩ k] = { C(98, k)·q^k·(1 − q)^(100−k),  k ∈ {0, …, 98}
             { 0,                              k ∈ {99, 100}

c) P[A|k] = { (1 − k/100)·(1 − k/99),  k ∈ {0, …, 98}
            { 0,                       k ∈ {99, 100}
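The closed form in part b) is just P[A|k]·P[k]; a short check with `math.comb` confirms the identity (q = 0.02 is an arbitrary illustrative defect probability, not from the problem):

```python
# Verify C(98, k) q^k (1-q)^(100-k) == P[A|k] * P[k] for every k <= 98.
from math import comb

q = 0.02
for k in range(99):
    p_k = comb(100, k) * q**k * (1 - q)**(100 - k)
    p_A_given_k = (1 - k / 100) * (1 - k / 99)
    closed_form = comb(98, k) * q**k * (1 - q)**(100 - k)
    assert abs(p_k * p_A_given_k - closed_form) < 1e-12
```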
9. In a binary digital communication channel the transmitter sends symbols
{0, 1} over a noisy channel to the receiver. Channel-introduced errors may
cause the received symbol to differ from the transmitted one.
Let Si = {the symbol i is sent} and Ri = {the symbol i is received}, where
i ∈ {0, 1}.
Relevant symbol and error probabilities are tabulated below.
i    P[Si]   P[R0|Si]
0    0.6     0.9
1    0.4     0.05
a) Draw corresponding probability tree.
b) Find the probability that a symbol is received in error.
c) Given that a “zero" is received, what is the conditional probability that
a “zero" was sent?
d) Given that a “zero" is received, what is the conditional probability that
a “one" was sent?
Ans
b) 0.08
c) 0.9643
d) 0.0357
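The three answers follow from total probability and Bayes' rule; a numeric sketch using the tabulated values:

```python
# P[S0]=0.6, P[S1]=0.4, P[R0|S0]=0.9, P[R0|S1]=0.05 (from the table).
p_s0, p_s1 = 0.6, 0.4
p_r0_s0, p_r0_s1 = 0.9, 0.05

p_error = p_s0 * (1 - p_r0_s0) + p_s1 * p_r0_s1   # b) ~0.08
p_r0 = p_s0 * p_r0_s0 + p_s1 * p_r0_s1            # total probability of R0
p_s0_r0 = p_s0 * p_r0_s0 / p_r0                   # c) ~0.9643
p_s1_r0 = 1 - p_s0_r0                             # d) ~0.0357
```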
10. In a ternary digital communication channel the transmitter sends symbols
{0, 1, 2} over a noisy channel to the receiver. Channel-introduced errors may
cause the received symbol to differ from the transmitted one.
Let Si = {the symbol i is sent} and Ri = {the symbol i is received}, where
i ∈ {0, 1, 2}.
Relevant symbol and error probabilities are tabulated below.
i    P[Si]   P[R0|Si]   P[R1|Si]
0    0.6     0.9        0.05
1    0.3     0.049      0.95
2    0.1     0.1        0.1
a) Draw corresponding probability tree.
b) Find the probability that a symbol is received in error.
c) Given that a “zero" is received, what is the conditional probability that
a “zero" was sent?
d) Given that a “zero" is received, what is the conditional probability that
a “one" was sent?
e) Given that a “zero" is received, what is the conditional probability that
a “two" was sent?
Ans
b) 0.095
c) 0.9563
d) 0.026
e) 0.0177
11. In a binary digital communication channel the transmitter sends symbols
{0, 1} over a noisy channel to the receiver. Channel-introduced errors may
cause the received symbol to differ from the transmitted one.
Let Si = {the symbol i is sent} and Ri = {the symbol i is received}, where
i ∈ {0, 1}.
A block of two symbols is sent along the channel. Channel errors in
different symbol periods can be deemed independent.
Relevant symbol and error probabilities are tabulated below.
i    P[Si]   P[R0|Si]
0    0.6     0.9
1    0.4     0.05
a) Draw corresponding probability tree (two-stage).
Note: a ‘stage’ corresponds to a single transmitted symbol.
b) Find the probability that the block is received in error.
c) Given that ‘00’ received, what is the conditional probability that a ‘00’
was sent?
d) Given that ‘00’ received, what is the conditional probability that a ‘01’
was sent?
Ans
b) 0.1536
c) 0.9298
d) 0.0344
12. A machine produces photo detectors in pairs. Tests show that the first photo
detector is acceptable with probability 0.6. When the first photo detector is
acceptable, the second photo detector is acceptable with probability 0.85. If
the first photo detector is defective, the second photo detector is acceptable
with probability 0.35.
Let Ai be the event ‘i-th photo detector is acceptable’.
a) Draw a suitable probability tree.
b) Describe the event ‘(Ac1 ∩ A2 ) ∪ (A1 ∩ Ac2 )’ in words. Compute the
corresponding probability.
c) What is the probability P [Ac1 ∩ Ac2 ] that both photo detectors in a pair
are defective?
d) Compute the probability P [A1 |A2 ].
Ans
b) P [(Ac1 ∩ A2 ) ∪ (A1 ∩ Ac2 )] = 0.23
c) P [Ac1 ∩ Ac2 ] = 0.26
d) P [A1 |A2 ] = 0.7846
Section 1.8 - Counting methods
13. A hospital ward contains 15 male and 20 female patients. Five patients
are randomly chosen to receive a special treatment. Find the probability of
choosing:
a) at least one patient of each gender
b) at least two patients of each gender
c) all patients from the same gender
d) a group where certain two male patients (say Tim and Joe) are not
chosen at the same time
Ans
a) 0.943
b) 0.635
c) 0.057
d) 0.9832
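A sketch verifying the four answers with `math.comb` (15 male and 20 female patients, 5 chosen):

```python
from math import comb

total = comb(35, 5)
same_gender = (comb(15, 5) + comb(20, 5)) / total      # c) ~0.057
at_least_one_each = 1 - same_gender                    # a) ~0.943
at_least_two_each = 1 - (comb(20, 5) + comb(15, 1) * comb(20, 4)
                         + comb(15, 5) + comb(20, 1) * comb(15, 4)) / total
not_tim_and_joe = 1 - comb(33, 3) / total              # d) ~0.9832
```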
14. A bridge club has 12 members (six married couples). Four members are
randomly selected to form the club executive. Find the probability that the
executive consists of:
a) two men and two women
b) all men or all women
c) no married couples
d) at least two men
Ans
a) 0.4545
b) 0.0606
c) 0.4545
d) 0.7273
Figure 1.5: a system that includes both series and parallel subsystems
Section 1.9 - Reliability
15. Figure 1.5 shows a system in a reliability study composed of series and parallel subsystems. The subsystems are independent. P [W1 ] = 0.91, P [W2 ] =
0.87, P [W3 ] = 0.50, and P [W4 ] = 0.75. What is the probability that the
system operates successfully?
Ans 0.974
Chapter 2
Discrete Random Variables
2.1 Definitions
Definition 2.1: A random variable (RV) consists of an experiment with a
probability measure P [·] defined on a sample space S and a function that assigns
a real number to each outcome in the sample space of the experiment.
Definition 2.2:
X is a discrete RV if its range is a countable set:
SX = {x1 , x2 , · · · }. Further, X is a finite RV if its range is a finite set:
SX = {x1 , x2 , . . . , xn }.
2.2 Probability Mass Function
Definition 2.3: The probability mass function (PMF) of the discrete RV X
is defined as

PX(a) = P[X = a].
Theorem 2.1: For a discrete RV with PMF PX(x) and range SX,

1. For any x, PX(x) ≥ 0.

2. Σ_{x∈SX} PX(x) = 1.

3. For any event B ⊂ SX, P[B] = Σ_{x∈B} PX(x).
2.3 Cumulative Distribution Function (CDF)
Definition 2.4: The cumulative distribution function (CDF) of a RV X is
FX (r) = P [X ≤ r]
where P [X ≤ r] is the probability that RV X is no larger than r.
Theorem 2.2: For discrete RV X with SX = {x1 , x2 , · · · }, x1 ≤ x2 ≤ · · ·
• FX (−∞) = 0, FX (∞) = 1.
• If xj ≥ xi , FX (xj ) ≥ FX (xi ).
• For a ∈ SX and ϵ > 0, lim_{ϵ→0} [FX(a) − FX(a − ϵ)] = PX(a).
• FX (x) = FX (xi ) for all x such that xi ≤ x < xi+1 .
• For b ≥ a, FX (b) − FX (a) = P [a < X ≤ b]
2.4 Families of Discrete RVs
Definition 2.5: X is a Bernoulli(p) RV if the PMF of X has the form

PX(x) = { 1 − p,  x = 0
        { p,      x = 1
        { 0,      otherwise

with SX = {0, 1}.
Definition 2.6: X is a Geometric(p) RV if the PMF of X has the form

PX(x) = { p(1 − p)^(x−1),  x = 1, 2, …
        { 0,               otherwise
Definition 2.7: X is a Binomial(n, p) RV if the PMF of X has the form

PX(x) = { C(n, x)·p^x·(1 − p)^(n−x),  x = 0, 1, 2, …, n
        { 0,                          otherwise

where 0 < p < 1 and n is an integer with n ≥ 1.
Definition 2.8: X is a Pascal(k, p) RV (also known as a negative binomial RV)
if the PMF of X has the form

PX(x) = { C(x−1, k−1)·p^k·(1 − p)^(x−k),  x = k, k+1, k+2, …
        { 0,                              otherwise

where 0 < p < 1 and k is an integer such that k ≥ 1.
Definition 2.9: X is a Discrete Uniform(k, l) RV if the PMF of X has the
form

PX(x) = { 1/(l − k + 1),  x = k, k+1, k+2, …, l
        { 0,              otherwise

where the parameters k and l are integers such that k < l.
Definition 2.10: X is a Poisson(α) RV if the PMF of X has the form

PX(x) = { (α^x·e^(−α))/x!,  x = 0, 1, 2, …
        { 0,                otherwise

where α > 0.

2.5 Averages
Definition 2.11: A mode of X is a number xmod satisfying PX (xmod ) ≥ PX (x)
for all x.
Definition 2.12: A median of X is a number xmed satisfying P[X < xmed] =
P[X > xmed].
Definition 2.13: The mean (aka expected value or expectation) of X is

E[X] = µX = Σ_{x∈SX} x·PX(x).
Theorem 2.3:
1. If X ∼ Bernoulli(p), then E[X] = p.
2. If X ∼ Geometric(p), then E[X] = 1/p.
3. If X ∼ Poisson(α), then E[X] = α.
4. If X ∼ Binomial(n, p), then E[X] = np.
5. If X ∼ Pascal(k, p), then E[X] = k/p.
6. If X ∼ Discrete Uniform(k, l), then E[X] = (k + l)/2.
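These closed forms can be spot-checked by computing E[X] = Σ x·PX(x) directly from the PMFs (a sketch; the geometric and Poisson sums are truncated, so those two results are only approximate):

```python
from math import comb, exp, factorial

n, p, alpha = 10, 0.3, 2.0
binom_mean = sum(x * comb(n, x) * p**x * (1 - p)**(n - x)
                 for x in range(n + 1))                            # ~np
geom_mean = sum(x * p * (1 - p)**(x - 1) for x in range(1, 500))   # ~1/p
poisson_mean = sum(x * alpha**x * exp(-alpha) / factorial(x)
                   for x in range(100))                            # ~alpha
```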
2.6 Function of a Random Variable
Theorem 2.4: For a discrete RV X, the PMF of Y = g(X) is

PY(y) = P[Y = y] = Σ_{x: g(x)=y} PX(x),

i.e., P[Y = y] is the sum of the probabilities of all the events X = x for which
g(x) = y.
2.7 Expected Value of a Function of a Random Variable
Theorem 2.5: Given X with PMF PX(x) and Y = g(X), the expected value
of Y is

E[Y] = µY = E[g(X)] = Σ_{x∈SX} g(x)·PX(x).
2.8 Variance and Standard Deviation
Definition 2.14: The variance of RV X is VAR[X] = σX² = E[(X − µX)²],

VAR[X] = Σ_{x∈SX} (x − µX)²·PX(x) ≥ 0.
Equivalently, the expected value of Y = (X − µX )2 is VAR [X].
Definition 2.15: The standard deviation of RV X is σX = √(VAR[X]).

Theorem 2.6: VAR[X] = E[X²] − (E[X])²
Theorem 2.7: For any two constants a and b,
VAR[aX + b] = a2 VAR[X]
Theorem 2.8:
1. If X ∼ Bernoulli(p), then VAR[X] = p(1 − p).
2. If X ∼ Geometric(p), then VAR[X] = (1 − p)/p2 .
3. If X ∼ Binomial(n, p), then VAR[X] = np(1 − p).
4. If X ∼ Pascal(k, p), then VAR[X] = k(1 − p)/p².

5. If X ∼ Poisson(α), then VAR[X] = α.

6. If X ∼ Discrete Uniform(k, l), then VAR[X] = (l − k)(l − k + 2)/12.
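Item 3 (together with Theorem 2.6) can be spot-checked numerically for one Binomial case:

```python
# Variance from the PMF via E[X^2] - (E[X])^2 vs the closed form np(1-p).
from math import comb

n, p = 12, 0.4
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
mean = sum(x * px for x, px in enumerate(pmf))
second_moment = sum(x * x * px for x, px in enumerate(pmf))
var = second_moment - mean**2   # should equal np(1-p) = 2.88
```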
Definition 2.16: For RV X,
(a) The n-th moment is E[X n ]
(b) The n-th central moment is E[(X − µX )n ].
2.9 Conditional Probability Mass Function
Definition 2.17: Given the event B, with P [B] > 0, the conditional probability mass function of X is
PX|B (x) = P [X = x|B].
Theorem 2.9: For B ⊂ SX,

PX|B(x) = P[X = x, B]/P[B] = { P[X = x]/P[B],  x ∈ B
                             { 0,              otherwise
Definition 2.18: The conditional expected value of RV X given condition B is

E[X|B] = µX|B = Σ_{x∈B} x·PX|B(x).
Theorem 2.10: The conditional expected value of Y = g(X) given condition
B is

E[Y|B] = µY|B = Σ_{x∈B} g(x)·PX|B(x).
2.10 Basics of Information Theory
Definition 2.19: The information content of any event A is defined as
I(A) = − log2 P [A]
This definition is extended to a Random Variable X.
Definition 2.20: The information content of X is defined as
I(X) = −E[log2 P[X = x]] = −Σ_x PX(x)·log2 PX(x)
I(X) is measured in bits. Suppose X produces symbols s1, s2, …, sn. A binary code
is used to represent the symbols. Let li be the number of bits used to represent si, for i = 1, …, n.
Definition 2.21: The average length of the code is

E[L] = Σ_i pi·li
Definition 2.22: The efficiency of the code is defined as

η = (I(X)/E[L]) × 100%
Theorem 2.11: Huffman’s Algorithm
1. Write symbols in decreasing order with their probabilities.
2. Merge in pairs from the bottom and reorder.
3. Repeat until one symbol is left.
4. Code each branch with "1" or "0".
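The steps above can be sketched with a priority queue; the symbol set and probabilities below are an illustrative example (not from the notes), chosen dyadic so the resulting code achieves 100% efficiency:

```python
# Minimal Huffman sketch: repeatedly merge the two least likely groups,
# prefixing "0"/"1"; then compute E[L], I(X), and the efficiency.
import heapq
from math import log2

probs = {"s1": 0.5, "s2": 0.25, "s3": 0.125, "s4": 0.125}

# Each heap entry: (probability, tiebreak index, {symbol: code-so-far}).
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
while len(heap) > 1:
    p0, _, c0 = heapq.heappop(heap)   # two least likely groups
    p1, i, c1 = heapq.heappop(heap)
    merged = {s: "0" + code for s, code in c0.items()}
    merged.update({s: "1" + code for s, code in c1.items()})
    heapq.heappush(heap, (p0 + p1, i, merged))
code = heap[0][2]

avg_len = sum(probs[s] * len(code[s]) for s in probs)   # E[L]
entropy = -sum(p * log2(p) for p in probs.values())     # I(X)
efficiency = entropy / avg_len * 100                    # eta, in percent
```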
2.11 Illustrated Problems
1. Two transmitters send messages through bursts of radio signals to an antenna. During each time slot each transmitter sends a message with probability 1/2. Simultaneous transmissions result in loss of the messages. Let
X be the number of time slots until the first message gets through. Let Ai
be the event that a message is transmitted successfully during the i-th time
slot.
a) Describe the underlying sample space S of this random experiment (in
terms of Ai and Aci) and specify the probabilities of its outcomes.
b) Show the mapping from S to SX , the range of X.
c) Find the probability mass function of X.
d) Find the cumulative distribution function of X.
2. An experiment consists of tossing a fair coin until either three heads or two
tails have appeared (not necessarily in a row). Let X be the number of
tosses required.
a) Describe the underlying sample space S of this random experiment
using a tree diagram and specify the probabilities of its outcomes.
b) Show the mapping from S to SX , the range of X.
c) Find the probability mass function of X.
d) Find the cumulative distribution function of X.
3. Ten balls numbered from 1 to 10 are in an urn. Four balls are to be chosen
at random (equally likely) and without replacement. We define a random
variable X which is the maximum of the four drawn balls (e.g., if the drawn
balls are numbered 3, 2, 8 and 6, then X = 8).
a) What is the range of X, SX ?
b) Find the PMF of X and plot it.
c) Find the probability that X be greater than or equal to 7.
4. The Oilers and Sharks play a best-of-7 playoff series. The series ends as
soon as one of the teams has won 4 games. Assume that the Sharks (Oilers)
win any game with probability 0.45 (0.55), independently of
any other game played. For n = 4, 5, 6, 7 define events On = {Oilers win the
series in n games} and Sn = {Sharks win the series in n games}.
a) Suppose the total number of games played in the series is N . Describe
the event {N = n} in terms of On and Sn and find the PMF of N .
b) Let W be the number of Oilers wins in the series. Now, express the
events {W = n} for n = 0, 1, . . . , 4 in terms of On and Sn and find the
PMF of W .
5. The random variable X has PMF

PX(x) = { k(cx + 1)/(x² + 1),  x = −2, −1, 0, 1, 2
        { 0,                   otherwise
a) Find the value of k and the range of c for which this is a valid PMF.
b) For c = 0 and k found in part a, compute and plot the CDF of X.
c) Compute the mean and the variance of X.
6. The CDF of a random variable is as follows

FX(a) = { r,    a < 1
        { 0.3,  1 ≤ a < 3
        { s,    3 ≤ a < 4
        { 0.9,  4 ≤ a < 6
        { t,    6 ≤ a
a) What are the values of r and t and the valid range of s?
b) What is P [2 < X ≤ 5]?
c) Knowing that P[X = 3] = P[X = 4], find s and plot the PMF of X.
7. Studies show that 20% of people are left handed. Also, it is known that 15%
of people are allergic to dust.
a) What is the probability that in a class of 40 students, exactly 8 students
be left handed?
b) Assuming that being left handed is independent of being allergic to
dust, what is the probability that in a class of 30 students more than
2 students be both left handed and allergic to dust?
c) To study a new allergy medicine, the goal is to select a group of 10
people that are allergic to dust. Randomly selected people are tested to
check whether or not they are allergic to dust. What is the probability
that after testing exactly 75 people, the needed group of 10 is found?
8. A game is played with probability of win P [W ] = 0.4. If the player wins
10 times (not necessarily consecutive) before failing 3 times (not necessarily
consecutive), a $100 award is given. What is the probability that the award
is won? Hint: Identify all award-winning cases (10W, 10W + 1F, 10W + 2F )
and notice that all award-winning cases finish with a W .
9. Phone calls received on a cell phone are totally random in time. Therefore
(as we proved in class), the number of telephone calls received in a 1 hour
period is a Poisson random variable. If the average number of calls received
during 1 hour is 2 (meaning that α = 2) answer the following questions:
a) What is the probability that exactly 2 calls are received during this one
hour period?
b) The cell phone is turned off for 15 minutes, what is the probability that
no call is missed.
c) What is the probability that exactly 2 calls are received during this one
hour period and both calls are received in the first 30 minutes?
d) Find the standard deviation of the number of calls received in 15 minutes.
10. A stop-and-wait protocol is a simple network data transmission protocol in
which both the sender and the receiver participate. In its simplest form, this
protocol is based on one sender and one receiver. The sender establishes
the connection and sends data in packets. Each data packet is acknowledged by the receiver with an acknowledgement packet. If a negative acknowledgement arrives (i.e., the received packet contains errors), the sender
retransmits the packet.
Now consider the use of this protocol on a network with packet error rate
1/70 (acknowledgement packets are assumed to be received perfectly). Let X be
the number of transmissions necessary to send one packet successfully.
a) Find the probability mass function of X.
b) Find the mean and variance of X.
c) If successful transmission does not take place in 12 attempts, the sender
declares a transmission failure. Find the probability of a transmission
failure.
d) Assume that 100 packets are to be transmitted. Let Y be the number
of transmissions necessary to send all 100 packets. Find the probability
mass function of Y .
e) Find the mean and variance of Y .
11. Find the n-th moment and the n-th central moment of X ∼ Bernoulli(p).
12. The random variable X has PMF

PX(x) = { c/(1 + x²),  x = −3, −2, …, 3
        { 0,           otherwise

a) Compute FX(x).
b) Compute E[X] and VAR [X].
c) Consider the function Y = 2X 2 . Find PY (y).
d) Compute E[Y ] and Var[Y ].
13. Consider a source sending messages through a noisy binary symmetric channel (BSC); for example, a CD player reading from a scratched music CD, or
a wireless cellphone capturing a weak signal from a relay tower that is too
far away.
For simplicity, assume that the message being sent is a sequence of 0’s and
1’s. The BSC parameter is p. That is, when a 0 is sent, the probability that
a 0 is (correctly) received is p and the probability that a 1 is (incorrectly)
received is 1 − p. Likewise, when a 1 is sent, the probability that a 1 is
(correctly) received is p and the probability that a 0 is (incorrectly) received
is 1 − p.
Let p = 0.97 for the BSC. Suppose the all-zero byte (i.e. 8 zeros) is transmitted over this channel. Let X be the number of 1s in the received byte.
a) Find the probability mass function of X.
b) Compute E[X] and Var[X].
c) Suppose that in all transmitted bytes, the eighth bit is reserved for
parity (even parity is set for the whole byte), so that the receiver can
perform error detection. Let E be the event of an undetectable error.
Describe E in terms of X. Find P [E].
14. The random variable X has PMF

PX(x) = { c/(1 + x²),  x = −3, −2, …, 3
        { 0,           otherwise
a) Define event B = {X ≥ 0}. Compute PX|B (x).
b) Compute FX|B (x).
c) Compute E[X|B] and Var[X|B].
15. Let X be a Binomial(8, 0.3) random variable.
a) Find the standard deviation of X.
b) Define B={X is odd}. Find PX|B (x).
c) Find E[X|B].
d) Find Var[X|B].
2.12 Solutions for the Illustrated Problems

1.
a) Ai: exactly one of them sends a message in the i-th time slot, P[Ai] = 1/4 + 1/4 = 1/2.
Aci: both or neither of them sends a message in the i-th time slot, P[Aci] = 1/2.
S = {A1, Ac1 A2, Ac1 Ac2 A3, …, Ac1 Ac2 ··· Acn−1 An, …}
b) The mapping from S to SX:
A1 → 1,  Ac1 A2 → 2,  Ac1 Ac2 A3 → 3,  …,  Ac1 Ac2 ··· Acn−1 An → n,  …
c) PX(t) = { (1/2)^t,  t ∈ {1, 2, …}
           { 0,        otherwise

d) FX(t) = { 0,                                              t < 1
           { 1/2 + 1/4 + ··· + 1/2^(n−1) = 1 − (1/2)^(n−1),  n − 1 ≤ t < n

   or

   FX(t) = { 0,                       t < 1
           { FX(t − 1) + 1/2^(n−1),   n − 1 ≤ t < n

2.

a)
[Tree diagram: each toss is H or T with probability 1/2, continued until three heads or two tails have appeared.]
P [HHH] = 1/8, P [HHT H] = P [HT HH] = P [HT HT ] = 1/16, P [HT T ] =
1/8
P [T HHH] = P [T HHT ] = 1/16, P [T HT ] = 1/8, P [T T ] = 1/4, P [HHT T ] =
1/16
b) The mapping from S to SX:
TT → 2;  HHH, HTT, THT → 3;  HHTH, HTHH, HTHT, HHTT, THHH, THHT → 4.
c) PX(t) = { 1/4,  t = 2
           { 3/8,  t = 3
           { 3/8,  t = 4
           { 0,    otherwise

d) FX(t) = { 0,    t < 2
           { 1/4,  2 ≤ t < 3
           { 5/8,  3 ≤ t < 4
           { 1,    t ≥ 4

3.
a) SX = {4, 5, 6, 7, 8, 9, 10}
b) The event X = n means that one of the four drawn balls is n and the
other three are chosen from the n − 1 balls numbered less than n.

P[X = n] = C(n−1, 3)·C(1, 1) / C(10, 4) = (n − 1)(n − 2)(n − 3)/1260

P[X = 4] = 1/210 = 0.0048
P[X = 5] = (4 × 3 × 2)/1260 = 0.019
P[X = 6] = (5 × 4 × 3)/1260 = 0.048
P[X = 7] = (6 × 5 × 4)/1260 = 0.095
P[X = 8] = (7 × 6 × 5)/1260 = 0.167
P[X = 9] = (8 × 7 × 6)/1260 = 0.267
P[X = 10] = (9 × 8 × 7)/1260 = 0.4

c) P[X ≥ 7] = 0.095 + 0.167 + 0.267 + 0.4 = 0.929
4.
a) {N = n} is the event that the series ends in n games. This means
either Sn or On occurs: {in n − 1 games, Sharks win 3 times (and
Oilers win n − 4 times) and in the nth game, Sharks win} or {in n − 1
games, Oilers win 3 times (and Sharks win n − 4 times) and in the nth
game, Oilers win}.
{N = n} = On ∪ Sn
P[N = n] = C(n−1, 3)·(0.45)^4·(0.55)^(n−4) + C(n−1, 3)·(0.45)^(n−4)·(0.55)^4

b) {W = 0} = S4,  {W = 1} = S5,  {W = 2} = S6,  {W = 3} = S7,
   {W = 4} = O4 ∪ O5 ∪ O6 ∪ O7
   P[W = 0] = (0.45)^4 = 0.041
   P[W = 1] = C(4, 3)·(0.45)^4·(0.55) = 0.09
   P[W = 2] = C(5, 3)·(0.45)^4·(0.55)^2 = 0.124
   P[W = 3] = C(6, 3)·(0.45)^4·(0.55)^3 = 0.136
   P[W = 4] = 0.608
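Part b) can be cross-checked numerically (the Sharks win each game with probability 0.45, as stated in the problem):

```python
# Build the PMF of W (number of Oilers wins): W = m < 4 means the Sharks
# win the series in 4 + m games; W = 4 collects all Oilers-win outcomes.
from math import comb

p = 0.45                                        # single-game Sharks win prob.
w = {0: p**4}                                   # Sharks sweep: W = 0
for m in range(1, 4):
    w[m] = comb(3 + m, 3) * p**4 * (1 - p)**m   # Sharks win in 4 + m games
w[4] = 1 - sum(w.values())                      # Oilers win the series, ~0.608
```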
5.
a) Σ_x PX(x) = 1 ⇒ k·[(1 − 2c)/5 + (1 − c)/2 + 1 + (1 + c)/2 + (1 + 2c)/5] = 1 ⇒ k = 5/12
   PX(−2) ≥ 0 ⇒ 1 − 2c ≥ 0 ⇒ c ≤ 0.5
   PX(2) ≥ 0 ⇒ 1 + 2c ≥ 0 ⇒ c ≥ −0.5
   thus, −0.5 ≤ c ≤ 0.5

b) FX(x) = { 0,      x < −2
           { 1/12,   −2 ≤ x < −1
           { 7/24,   −1 ≤ x < 0
           { 17/24,  0 ≤ x < 1
           { 22/24,  1 ≤ x < 2
           { 1,      x ≥ 2
c) With c = 0 and k = 5/12 we have

PX(x) = { 1/12,  x = −2
        { 5/24,  x = −1
        { 5/12,  x = 0
        { 5/24,  x = 1
        { 1/12,  x = 2
        { 0,     otherwise

Therefore,
E[X] = (1/12)·(−2) + (5/24)·(−1) + (5/12)·(0) + (5/24)·(+1) + (1/12)·(+2) = 0
and
VAR[X] = E[(X − 0)²] = E[X²] = (1/12)·(−2)² + (5/24)·(−1)² + (5/12)·(0) + (5/24)·(+1)² + (1/12)·(+2)² = 13/12
6.
a) r = 0, t = 1, 0.3 ≤ s ≤ 0.9
Recall that FX (−∞) = 0, FX (∞) = 1 and that FX (a) is non-decreasing.
b) P [a < X ≤ b] = FX (b) − FX (a)
P [2 < X ≤ 5] = FX (5) − FX (2) = 0.9 − 0.3 = 0.6
c) P[X = 3] = lim_{ε→0} (FX(3) − FX(3 − ε)) = s − 0.3
   P[X = 4] = lim_{ε→0} (FX(4) − FX(4 − ε)) = 0.9 − s
   s − 0.3 = 0.9 − s ⇒ s = 0.6

   ⇒ FX(a) = { 0,    a < 1
             { 0.3,  1 ≤ a < 3
             { 0.6,  3 ≤ a < 4
             { 0.9,  4 ≤ a < 6
             { 1,    6 ≤ a

   ⇒ PX(t) = { 0.3,  t ∈ {1, 3, 4}
             { 0.1,  t = 6
             { 0,    otherwise

   Notice that Σ_t PX(t) = 1.

7.
a) X ∼ Binomial(40, 0.2) ⇒ P[X = 8] = C(40, 8)·(0.2)^8·(0.8)^32

b) P[both] = 0.2 × 0.15 = 0.03 ⇒ Y ∼ Binomial(30, 0.03)
   P[Y > 2] = 1 − P[Y = 0] − P[Y = 1],
   where P[Y = 0] = C(30, 0)·(0.97)^30·(0.03)^0 and P[Y = 1] = C(30, 1)·(0.97)^29·(0.03)^1

c) The last person tested is allergic (since the group is then complete and no
   more tests are needed) ⇒ Z ∼ Pascal(10, 0.15).
   P[Z = 75] = C(74, 9)·(0.15)^10·(1 − 0.15)^65
8. Award-winning cases all end with W and thus can be modeled with Pascal(10, 0.4).
10W          → X = 10 → C(9, 9)·(0.4)^10·(0.6)^0 = a
10W, 1F      → X = 11 → C(10, 9)·(0.4)^10·(0.6)^1 = b
10W, 2F      → X = 12 → C(11, 9)·(0.4)^10·(0.6)^2 = c

P[$100] = a + b + c

9.
a) P[X = 2] = e^(−2)·2²/2! = 0.27
b) λ = 2/60 (average per minute) ⇒ for 15 minutes α = 15 × λ = 0.5
   P[X = 0] = e^(−0.5)·(0.5)^0/0! = 0.6

c) P[2 in first 30 & 0 in second 30] = P[2 in first 30]·P[0 in second 30]
   = [e^(−30×2/60)·(30 × 2/60)²/2!]·[e^(−30×2/60)·(30 × 2/60)^0/0!] = 0.068
   Notice that for 30 minutes: α = 30 × 2/60.

d) For 15 minutes we saw that α = 0.5. We also know that for a Poisson RV, VAR = α.
   Thus, std = √VAR[X] = √0.5 = 0.71.

10.
a) The probability that X = n is the probability that the first n − 1
   transmissions were unsuccessful and the n-th transmission is successful
   [Geometric RV with probability of success p = 69/70]:
   P[X = n] = (1/70)^(n−1)·(69/70)

b) X is a Geometric RV, ∴ E[X] = 1/p = 70/69 = 1.014 and VAR[X] = (1 − p)/p² = 0.0147.

c) P[failure] = P[X > 12] = 1 − P[X ≤ 12] = 1 − (69/70)·[Σ_{n=1}^{12} (1/70)^(n−1)]
   = 1 − (69/70)·(1 − (1/70)^12)/(1 − 1/70) = (1/70)^12 = 7.2 × 10^(−23)

   Alternative solution:
   P[failure] = P[X > 12] = (69/70)·[Σ_{n=13}^{∞} (1/70)^(n−1)] = (69/70)·[Σ_{m=0}^{∞} (1/70)^(m+12)]
   = (69/70)·(1/70)^12·[Σ_{m=0}^{∞} (1/70)^m] = (69/70)·(1/70)^12·1/(1 − 1/70) = (1/70)^12

   Without a detailed derivation, it could easily be argued that the solution
   is (1/70)^12. How?

d) The probability that Y = n, n ≥ 100, is the probability that in the first
   n − 1 transmissions exactly 99 were successful and that the n-th
   transmission is also successful [in other words, Y is a Pascal(100, 69/70)
   RV]. Therefore:
   P[Y = n] = C(n−1, 99)·(69/70)^100·(1/70)^(n−100)
e) Y is a Pascal random variable. Thus, E[Y] = k/p = 100/(69/70) = 101.45 and
   VAR[Y] = k(1 − p)/p² = 1.47.
11. E[X^n] = 1^n·p + 0^n·q = p
    E[(X − µ)^n] = E[(X − p)^n] = (1 − p)^n·p + (−p)^n·q
12.
a) Σ_{x=−3}^{3} c/(1 + x²) = 1 ⇒ c = 5/13

so,

PX(x) = { 1/26,   x = −3
        { 2/26,   x = −2
        { 5/26,   x = −1
        { 10/26,  x = 0
        { 5/26,   x = 1
        { 2/26,   x = 2
        { 1/26,   x = 3
        { 0,      otherwise

FX(x) = { 0,      x < −3
        { 1/26,   −3 ≤ x < −2
        { 3/26,   −2 ≤ x < −1
        { 4/13,   −1 ≤ x < 0
        { 9/13,   0 ≤ x < 1
        { 23/26,  1 ≤ x < 2
        { 25/26,  2 ≤ x < 3
        { 1,      x ≥ 3

b) E[X] = Σ_{x=−3}^{3} (5/13)·x/(1 + x²) = 0
   VAR[X] = E[X²] − 0 = Σ_{x=−3}^{3} (5/13)·x²/(1 + x²) = 22/13 = 1.6923

c) PY(y) = { 1/13,  y = 18
           { 2/13,  y = 8
           { 5/13,  y ∈ {0, 2}
           { 0,     otherwise

d) E[Y] = 18/13 + 16/13 + 10/13 = 44/13 = 3.3846
   E[Y²] = 18²/13 + (8² × 2)/13 + (2² × 5)/13 = 472/13
   VAR[Y] = E[Y²] − E[Y]² = 24.852
13.

a) PX(x) = { (8 choose x)(0.03)^x (0.97)^(8−x), x = 0, 1, . . . , 8; 0, otherwise }
It is a Binomial distribution with n = 8, p = 0.03.

b) E[X] = Σ_{x=0}^{8} x PX(x) = np = 8 × 0.03 = 0.24
VAR[X] = npq = 8 × 0.03 × 0.97 = 0.2328

c) E = {X is even and X ≠ 0} = {undetectable error}
P[E] = PX(2) + PX(4) + PX(6) + PX(8) = 0.02104
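A short sketch of the part c) computation: an error pattern goes undetected when the number of bit errors is even and nonzero.

```python
from math import comb

n, p = 8, 0.03
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
p_undetectable = sum(pmf[x] for x in (2, 4, 6, 8))
assert abs(sum(pmf) - 1) < 1e-12          # PMF sums to 1
assert abs(p_undetectable - 0.02104) < 1e-4
```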
14.

a) P(B) = (5/13)(1 + 1/2 + 1/5 + 1/10) = 9/13

PX|B(x) = { 5/9, x = 0; 5/18, x = 1; 1/9, x = 2; 1/18, x = 3; 0, otherwise }

b) FX|B(x) = { 0, x < 0; 5/9, 0 ≤ x < 1; 5/6, 1 ≤ x < 2; 17/18, 2 ≤ x < 3; 1, x ≥ 3 }

c) E[X|B] = 5/18 + 2/9 + 3/18 = 2/3
E[X²|B] = 5/18 + 4/9 + 9/18 = 11/9
⇒ VAR[X|B] = 11/9 − (2/3)² = 7/9

15.
a) E[X] = np = 8 × 0.3 = 2.4 (recall: E[Binomial(n, p)] = np)
VAR[X] = np(1 − p) = 8 × 0.3 × 0.7 = 1.68 (recall: VAR[Binomial(n, p)] = np(1 − p))
σX = √VAR[X] = 1.3

b) P[X = k|B] = P[X = k] P[B|X = k] / P[B], which gives
P[X = 0|B] = 0, P[X = 1|B] = 0.396, P[X = 2|B] = 0, P[X = 3|B] = 0.508, P[X = 4|B] = 0,
P[X = 5|B] = 0.094, P[X = 6|B] = 0, P[X = 7|B] = 0.002, P[X = 8|B] = 0
For example, P[X = 1|B] = (0.198 × 1)/(0.198 + 0.254 + 0.047 + 0.001) = 0.396.

c) E[X|B] = Σ_k k P[X = k|B] = 1 × 0.396 + 3 × 0.508 + 5 × 0.094 + 7 × 0.002 = 2.4

d) E[X²|B] = Σ_k k² P[X = k|B] = 1² × 0.396 + 3² × 0.508 + 5² × 0.094 + 7² × 0.002 = 7.42
VAR[X|B] = E[X²|B] − (E[X|B])² = 7.42 − 2.4² = 1.64
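A numeric sketch of this solution; B = {X is odd} is our reading of the event, inferred from the zero entries in part b).

```python
from math import comb

n, p = 8, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
pB = sum(pmf[k] for k in range(1, n + 1, 2))        # assumption: B = {X odd}
cond = [pmf[k] / pB if k % 2 == 1 else 0.0 for k in range(n + 1)]
mean = sum(k * q for k, q in enumerate(cond))
second = sum(k * k * q for k, q in enumerate(cond))
assert abs(mean - 2.4) < 0.02           # matches the rounded 2.4 above
assert abs(second - mean**2 - 1.64) < 0.01
```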
2.13
Drill Problems
Section 2.1, 2.2 and 2.3 - PMFs and CDFs
1. The discrete random variable K has the following PMF.

PK(k) = { b, k = 0; 2b, k = 1; 3b, k = 2; 0, otherwise }

a) What is the value of b?
b) Determine the values of (i) P[K < 2] (ii) P[K ≤ 2] (iii) P[0 < K < 2].
c) Determine the CDF of K.

Ans
a) 1/6
b) (i) 1/2 (ii) 1 (iii) 1/3
c) FK(k) = { 0, k < 0; 1/6, 0 ≤ k < 1; 1/2, 1 ≤ k < 2; 1, k ≥ 2 }

2. The random variable N has PMF,

PN(n) = { c/2^n, n = 0, 1, 2; 0, otherwise }
a) What is the value of the constant c?
b) What is P [N ≤ 1]?
c) Find P [N ≤ 1|N ≤ 2].
d) Compute the CDF.
Ans
a) 4/7 b) 6/7 c) 6/7
d) FN(n) = { 0, n < 0; 4/7, 0 ≤ n < 1; 6/7, 1 ≤ n < 2; 1, n ≥ 2 }

3. The discrete random variable X has PMF,

PX(x) = { c/x, x = 2, 4, 8; 0, otherwise }
a) What is the value of the constant c?
b) What is P [X = 4]?
c) What is P [X < 4]?
d) What is P [3 ≤ X ≤ 9]?
e) Compute the CDF of X.
f) Compute the mean E[X] and the variance VAR[X] of X.
Ans
a) 8/7 b) 2/7 c) 4/7 d) 3/7
e) FX(x) = { 0, x < 2; 4/7, 2 ≤ x < 4; 6/7, 4 ≤ x < 8; 1, x ≥ 8 }
f) E[X] = 24/7, VAR[X] = 208/49
Section 2.4 and 2.5 - Families of Discrete RVs and
Averages
4. A student got a summer job at a bank, and his assignment was to model
the number of customers who arrive at the bank. The student observed that
the number of customers K that arrive over a given hour had the PMF,
PK(k) = { λ^k e^(−λ)/k!, k ∈ {0, 1, . . .}; 0, otherwise }

a) Show that PK(k) is a proper PMF. What is the name of this RV?
b) What is P[K > 1]?
c) What is P[2 ≤ K ≤ 4]?
d) Compute E[K] and VAR[K] of K.

Ans
a) Poisson(λ) b) 1 − e^(−λ) − λe^(−λ) c) (λ²/2 + λ³/6 + λ⁴/24) e^(−λ)
d) E[K] = VAR[K] = λ
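A numeric spot-check of answers b) and c); the sample rate λ = 2 is our choice, not from the problem.

```python
from math import exp, factorial

lam = 2.0
pmf = lambda k: lam**k * exp(-lam) / factorial(k)

b = 1 - pmf(0) - pmf(1)                               # answer b)
c = (lam**2 / 2 + lam**3 / 6 + lam**4 / 24) * exp(-lam)  # answer c)
assert abs(b - (1 - exp(-lam) - lam * exp(-lam))) < 1e-12
assert abs(c - sum(pmf(k) for k in (2, 3, 4))) < 1e-12
```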
5. Let X be the random variable that denotes the number of times we roll a
fair die until the first time the number 5 appears.
a) Derive the PMF of X. Identify this random variable.
b) Obtain the CDF of X.
c) Compute the mean E[X] and the variance VAR[X].
Ans
a) PX(x) = { 5^(x−1)/6^x, x = 1, 2, . . .; 0, otherwise }, i.e., Geometric(1/6)
b) FX(x) = { 1 − (5/6)^⌊x⌋, x ≥ 1; 0, otherwise }
c) E[X] = 6, VAR[X] = 30
6. Let X be the random variable that denotes the number of times we roll a
fair die until the first time the number 3 or 5 appears.
a) Derive the PMF of X. Identify this random variable.
b) Obtain the CDF of X.
c) Compute the mean E[X] and the variance VAR[X].
Ans
a) PX(x) = { 2^(x−1)/3^x, x = 1, 2, . . .; 0, otherwise }, i.e., Geometric(1/3)
b) FX(x) = { 1 − (2/3)^⌊x⌋, x ≥ 1; 0, otherwise }
c) E[X] = 3, VAR[X] = 6
7. A random variable K has the PMF
PK(k) = (5 choose k)(0.1)^k (0.9)^(5−k), k ∈ {0, 1, 2, 3, 4, 5}.

Obtain the values of: (i) P[K = 1] (ii) P[K ≥ 1] (iii) P[K ≥ 4|K ≥ 2].

Ans
i) 0.32805 ii) 0.40951 iii) 5.647 × 10^(−3)
8. The number of N of calls arriving at a switchboard during a period of one
hour is Poisson with λ = 10. In other words,
PN(n) = { 10^n e^(−10)/n!, n ∈ {0, 1, . . .}; 0, otherwise }
a) What is the probability that at least two calls arrive within one hour?
b) What is the probability that at most three calls arrive within one hour?
c) What is the probability that the number of calls that arrive within one
hour is greater than three but less than or equal to six?
Ans
a) 0.9995 b) 0.0103 c) 0.1198
9. Prove that the function P (x) is a legitimate PMF of a discrete random
variable, where P (x) is defined by
P(x) = { (2/3)(1/3)^x, x ∈ {0, 1, . . .}; 0, otherwise }

Calculate the mode, expected value and the variance of this random variable.

Ans
mode = 0, expected value = 1/2, variance = 3/4
10. A recruiter needs to hire 10 chefs. He visits NAIT first and interviews only
10 students; because of high demand he can’t get more students to sign up
for an interview. He knows that the probability of hiring any given NAIT
chef is 0.4. He then goes to SAIT and keeps interviewing until his quota is
filled. At SAIT the probability of success on any given interview is 0.8, and
plenty of students are looking for jobs. Let X be the number of chefs hired
at NAIT, Y the number hired at SAIT, and N = the number of interviews
required to fill his quota.
a) Find PX (x)
b) Find E[X]
c) Find E[Y ]
d) Find E[N ]
Ans
a) B10 (x, 0.4)
b) 4.0
c) 6.0
d) 17.5
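A sketch of how part d) follows: the 10 NAIT interviews always happen; each SAIT hire takes a Geometric(0.8) number of interviews, and E[10 − X] hires are still needed after NAIT.

```python
from math import comb

pX = [comb(10, x) * 0.4**x * 0.6**(10 - x) for x in range(11)]
EX = sum(x * p for x, p in enumerate(pX))   # E[X] = np = 4.0
EN = 10 + (10 - EX) / 0.8                   # total expected interviews
assert abs(EX - 4.0) < 1e-9
assert abs(EN - 17.5) < 1e-9
```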
Section 2.6 - Function of a RV
11. The discrete random variable X has the following PMF.
PX(k) = { b, k = 0; 2b, k = 1; 3b, k = 2; 0, otherwise }
a) What is the value of b?
b) Let Y = X 2 . Determine the PMF of Y . Determine the CDF of Y .
c) Let Z = sin( π2 X). Determine the PMF of Z. Determine the CDF of
Z.
Ans
a) 1/6
b) PY(k) = { 1/6, k = 0; 1/3, k = 1; 1/2, k = 4; 0, otherwise }, and
FY(k) = { 0, k < 0; 1/6, 0 ≤ k < 1; 1/2, 1 ≤ k < 4; 1, k ≥ 4 }
c) PZ(k) = { 2/3, k = 0; 1/3, k = 1; 0, otherwise }, and
FZ(k) = { 0, k < 0; 2/3, 0 ≤ k < 1; 1, k ≥ 1 }
Section 2.7 and 2.8 - Expected value and Standard
deviation of a function of RVs
12. Consider discrete random variable K defined in Problem 1.
a) Compute the mean E[K] and the variance VAR[K].
b) Suppose another discrete random variable N is defined as: N = K − 1.
– Compute its PMF and CDF.
– What is E[N ] and VAR[N ]?
– Compute E[N 3 ] and E[N 4 ]
c) Suppose N is redefined as: N = (K − 1)2 . Repeat the computations
of part b).
Ans
a) E[K] = 4/3, VAR[K] = 5/9
b) PN(n) = { 1/6, n = −1; 1/3, n = 0; 1/2, n = 1; 0, otherwise },
FN(n) = { 0, n < −1; 1/6, −1 ≤ n < 0; 1/2, 0 ≤ n < 1; 1, n ≥ 1 },
E[N] = 1/3, VAR[N] = 5/9, E[N³] = 1/3, E[N⁴] = 2/3.
c) PN(n) = { 1/3, n = 0; 2/3, n = 1; 0, otherwise },
FN(n) = { 0, n < 0; 1/3, 0 ≤ n < 1; 1, n ≥ 1 },
E[N] = 2/3, VAR[N] = 2/9, E[N³] = 2/3, E[N⁴] = 2/3.
13. Consider discrete random variable N defined in Problem 2.
a) Compute the mean E[N ] and the variance VAR[N ].
b) Suppose another discrete random variable K is defined as: K = N 2 +
3N . Compute E[K].
c) Suppose M = K − N . Find E[M ].
Ans
a) E[N ] = 4/7, VAR[N ] = 26/49 b) E[K] = 18/7 c) E[M ] = 2
Section 2.9 - Conditional PMFs
14. The discrete random variable K has the following PMF.
PK(k) = { b, k = 0; 2b, k = 1; 3b, k = 2; 0, otherwise }
a) What is the value of b?
b) Let B = {K < 2}. Determine the values of P [B]
c) Determine the conditional PMF PK|B (k).
d) Determine the conditional mean and variance of K given B.
Ans
a) 1/6 b) 1/2
c) PK|B(k) = { 1/3, k = 0; 2/3, k = 1; 0, otherwise }
d) E[K|B] = 2/3, VAR[K|B] = 2/9
15. Let X be a Geometric(0.5) RV.
a) Find E[X|X > 3]
b) Find VAR[X|X > 3]

Ans
a) 5
b) 2
16. An exam has five problems in it, each worth 20 points. Let N be the number of problems a student answers correctly (no partial credit). The PMF
of N is PN(0) = 0.05, PN(1) = 0.10, PN(2) = 0.35, PN(3) = 0.25, PN(4) = 0.15, PN(5) = 0.1, ZOW.
a) Express the total mark G as a function of N .
b) Find the PMF of G.
c) What is the expected value of G, given that the student answered at
least one question correctly?
d) What is the variance of G, given that the student answered at least
one question correctly?
e) What is the probability that the student’s mark is greater than the
mean plus or minus half the standard deviation, all with the condition
that the student answered at least one question correctly?
Ans
a) G = 20N
b) PG (20x) = PN (x) for x = 0, 1, 2, 3, 4, 5 c) 55.8 d) 530
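A sketch of the parts c) and d) computation: condition G = 20N on {N ≥ 1} and take conditional moments.

```python
pN = {0: 0.05, 1: 0.10, 2: 0.35, 3: 0.25, 4: 0.15, 5: 0.10}
pB = 1 - pN[0]                                              # P[N >= 1] = 0.95
mean = sum(20 * n * p for n, p in pN.items() if n >= 1) / pB
second = sum((20 * n)**2 * p for n, p in pN.items() if n >= 1) / pB
assert abs(mean - 55.8) < 0.1           # conditional mean of G
assert abs(second - mean**2 - 530) < 1  # conditional variance of G
```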
17. You rent a car from the Fly-by-night car rental company. Let M represent
the distance in miles beyond 100 miles that you will be able to drive before
the car breaks down. If the car has a good engine, denoted as event G, then
M is Geometric(0.03). Otherwise it is Geometric(0.1). Assume further that
P [G] = 0.6.
a) What is the PMF of M , given that the engine is bad? What is E[M ]
and VAR[M ] in this case ?
b) What is the PMF of M generally?
c) What is the probability of the successful completion of a trip of 120
miles without the engine failure?
d) What is your expected distance to travel before engine failure?
Ans
a) PM(m) = 0.1 × (0.9)^(m−1), E[M] = 10, VAR[M] = 90
b) PM(m) = 0.04 × (0.9)^(m−1) + 0.018 × (0.97)^(m−1)
c) 0.3904 d) 124 miles
Section 2.10 - Basics of Information Theory
18. A source outputs independent symbols A, B and C with probabilities 16/20, 3/20 and 1/20 respectively; 100 such symbols are output per second. Consider a noiseless binary channel with a capacity of 100 bits per second. Design a Huffman code and find the probabilities of the binary digits produced.
Find the efficiency of the code.
19. Construct a Huffman code for five symbols with probabilities 1/2, 1/4, 1/8, 1/16, 1/16.
Show that the average length is equal to the source information.
Ans
1.875
20. The types and numbers of vehicles passing a point in a road are to be
recorded. A binary code is to be assigned to each type of vehicle and the
appropriate code recorded on the passage of that type. The average numbers of vehicles per hour are as follows:
Cars: 500   Lorries: 200   Vans: 100   Mopeds: 50
Motorcycles: 50   Cycles: 50   Buses: 25   Others: 25
Design a Huffman code. Find its efficiency and compare it with that of
a simple equal-length binary code. Comment on the feasibility and usefulness of this system.
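The Huffman construction used in problems 18-20 can be sketched with a priority queue; this helper only computes the expected codeword length (the tie-breaking counter and function name are ours).

```python
import heapq
from itertools import count

def huffman_avg_length(probs):
    """Expected codeword length (bits/symbol) of a binary Huffman code."""
    tie = count(len(probs))   # unique tie-breakers so equal probabilities compare cleanly
    heap = [(p, i) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        total += p1 + p2      # each merge adds one bit to every leaf beneath it
        heapq.heappush(heap, (p1 + p2, next(tie)))
    return total

# Problem 19's dyadic source: average length equals the source entropy, 1.875 bits.
assert huffman_avg_length([0.5, 0.25, 0.125, 0.0625, 0.0625]) == 1.875
```

Because the probabilities in problem 19 are dyadic, the average length meets the entropy bound exactly, which is what the problem asks you to show.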
Chapter 3
Continuous Random Variables
3.1
Cumulative Distribution Function
Definition 3.1: The cumulative distribution function (CDF) of random variable (RV) X is
FX (x) = P [X ≤ x]
Note: if there is no confusion, the subscript can be dropped - F (x).
Theorem 3.1: For any RV X, the CDF satisfies the following:
1. FX (−∞) = 0
2. FX (∞) = 1
3. if a < b, FX (a) ≤ FX (b)
4. P [a < X ≤ b] = FX (b) − FX (a)
Definition 3.2: X is a continuous RV if the CDF is a continuous function.
Then P [X = x] = 0 for any x.
3.2
Probability Density Function

Definition 3.3: The probability density function (PDF) of a continuous RV X is

fX(x) = d/dx FX(x)

Note: if there is no confusion, the subscript can be dropped - f(x).
Theorem 3.2: For a continuous RV X with PDF fX(x)

1. fX(x) ≥ 0
2. ∫_{−∞}^{∞} fX(x) dx = 1
3. FX(x) = ∫_{−∞}^{x} fX(t) dt
4. P[a < X ≤ b] = ∫_{a}^{b} fX(x) dx

The range of X is defined as SX = {x | fX(x) > 0}.
3.3
Expected Values

Expected values of X and g(X) are important.

Definition 3.4: The expected value of a random variable is defined as

µX = E[X] = ∫_{−∞}^{∞} x fX(x) dx.

Theorem 3.3: The expected value of Y = g(X) is

E[Y] = E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx.
You can use this theorem to calculate the moments and the variance.

Definition 3.5: If Y = g(X) = (X − µX)², then E[Y] measures the spread of X around µX.

VAR[X] = σX² = E[(X − µX)²] = ∫_{−∞}^{∞} (x − E[X])² fX(x) dx.

Theorem 3.4: The variance can alternatively be calculated as

VAR[X] = E[X²] − E²[X] = ∫_{−∞}^{∞} x² fX(x) dx − µX².
Theorem 3.5: For any two constants a and b,
VAR[aX + b] = a2 VAR[X]
3.4
Families of Continuous Random Variables

There are several important practical RVs.

Definition 3.6: X is a Uniform(a, b) RV if the PDF of X is

fX(x) = { 1/(b − a), a ≤ x < b; 0, otherwise }

where b > a.

Theorem 3.6: If X is a Uniform(a, b) RV, then

1. The CDF is FX(x) = { 0, x ≤ a; (x − a)/(b − a), a < x ≤ b; 1, x > b }
2. E[X] = (a + b)/2.
3. VAR[X] = (b − a)²/12.
Definition 3.7: X is an Exponential(λ) RV if the PDF of X is

fX(x) = { λe^(−λx), x ≥ 0; 0, otherwise }

where λ > 0.

Theorem 3.7: If X is an Exponential(λ) RV, then

1. FX(x) = { 1 − e^(−λx), x ≥ 0; 0, otherwise }
2. E[X] = 1/λ, VAR[X] = 1/λ².

Definition 3.8: X is an Erlang(n, λ) RV if its PDF is given by

fX(x) = { λ^n e^(−λx) x^(n−1)/(n − 1)!, x ≥ 0; 0, x < 0 }

The Erlang(n, λ) RV can be viewed as the sum of n independent Exponential(λ) RVs.

Theorem 3.8: If X is an Erlang(n, λ) RV,

1. FX(x) = { 1 − e^(−λx) Σ_{j=0}^{n−1} (λx)^j / j!, x ≥ 0; 0, otherwise }
2. E[X] = n/λ, VAR[X] = n/λ².

3.5
Gaussian Random Variables
Definition 3.9: X is a Gaussian(µ, σ²) or N(µ, σ²) random variable if the PDF of X is

fX(x) = (1/√(2πσ²)) e^(−(x−µ)²/(2σ²))

where the parameter µ can be any real number and σ > 0.

Theorem 3.9: If X is a N(µ, σ²) RV, E[X] = µ and VAR[X] = σ².

Definition 3.10: The standard normal random variable Z is the N(0, 1) RV. Its CDF is

FZ(z) = P[Z ≤ z] = ∫_{−∞}^{z} (1/√(2π)) e^(−u²/2) du = Φ(z).
Theorem 3.10: If X is a N (µ, σ 2 ) RV, then Y = aX + b is N (aµ + b, a2 σ 2 ).
Theorem 3.11: If X is a N(µ, σ²) RV, the CDF of X is

FX(x) = Φ((x − µ)/σ)

The probability that X is in the interval (a, b] is

P[a < X ≤ b] = P[(a − µ)/σ < (X − µ)/σ ≤ (b − µ)/σ] = Φ((b − µ)/σ) − Φ((a − µ)/σ)

The tabled values of Φ(z) are used to find FX(x). Note that Φ(−z) = 1 − Φ(z).
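In place of a table, Φ can be evaluated from the standard-library error function; a minimal sketch (the helper names are ours):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF via math.erf."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def normal_interval(a, b, mu, sigma):
    """P[a < X <= b] for X ~ N(mu, sigma^2), per Theorem 3.11."""
    return phi((b - mu) / sigma) - phi((a - mu) / sigma)

assert abs(phi(1.0) - 0.84134) < 1e-4
assert abs(phi(-1.0) - (1 - phi(1.0))) < 1e-12   # Phi(-z) = 1 - Phi(z)
```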
3.6
Functions of Random Variables
General Idea: If Y = g(X), then find the PDF of Y from the PDF of X.
The CDF method involves two steps:
1. The CDF of Y is obtained by using
FY (y) = P [Y ≤ y] = P [g(X) ≤ y].
2. The PDF of Y is given by

fY(y) = dFY(y)/dy.
The PDF method is as follows:
1. The PDF of Y is obtained by using

fY(y) = [fX(x)/|g′(x)|] evaluated at x = g^(−1)(y)

where g′(x) is the derivative of g(x).

2. If x = g^(−1)(y) is multi-valued, then the sum is over all solutions of y = g(x).

Theorem 3.12: If Y = aX + b, then the PDF of Y is

fY(y) = fX((y − b)/a) · 1/|a|
Theorem 3.13: Let X ∼ Uniform(0, 1) and let F(x) denote a CDF with an inverse F^(−1)(u) defined for 0 < u < 1. The RV Y = F^(−1)(X) has CDF F(y).
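Theorem 3.13 is the basis of inverse-transform sampling; a minimal sketch using an Exponential(2) target, where F(x) = 1 − e^(−2x) gives F^(−1)(u) = −ln(1 − u)/2 (the sample size and seed are our choices):

```python
import random
from math import log

random.seed(1)
samples = [-log(1 - random.random()) / 2 for _ in range(200_000)]
mean = sum(samples) / len(samples)
assert abs(mean - 0.5) < 0.01   # E[X] = 1/lambda = 0.5 for Exponential(2)
```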
3.7
Conditioning a Continuous RV

Definition 3.11: For X with PDF fX(x) and event B ⊂ SX with P[B] > 0, the conditional PDF of X given B is

fX|B(x) = { fX(x)/P[B], x ∈ B; 0, otherwise }
Theorem 3.14: The conditional expected value of X is

µX|B = E[X|B] = ∫_{−∞}^{∞} x fX|B(x) dx.

The conditional expected value of g(X) is

E[g(X)|B] = ∫_{−∞}^{∞} g(x) fX|B(x) dx.

The conditional variance is

VAR[X|B] = E[(X − µX|B)²|B] = E[X²|B] − µ²X|B.

We use this theorem to calculate the conditional moments and the variance.
3.8
Illustrated Problems
1. X is a continuous random variable. Test if each of the following function
could be a valid CDF. Provide brief details on how you tested each.
a) F(x) = { 0, x < −1; x², |x| ≤ 1; 1, 1 < x }

b) F(x) = { 0, x < 0; x²/2, 0 ≤ x ≤ 1; 1, 1 < x }

c) F(x) = { 0, x < 0; sin(x), 0 ≤ x ≤ π/2; 1, π/2 < x }

d) F(x) = { 0, x ≤ −4; 1 − exp(−a(x + 4)), −4 < x }
For the valid CDF’s, derive their PDF’s.
2. The CDF of a random variable Y is
FY(y) = { 0, y < −2; (y + 2)/3, −2 ≤ y ≤ 1; 1, y > 1 }
a) Find the PDF of this random variable.
b) Find P [Y < 0].
c) Find the expected value of Y .
d) Find the variance of Y .
e) Find the median of Y (i.e., find a such that P [Y < a] = P [Y > a]).
3. Consider the following PDF.
fX(x) = { 3 − 4x, 0 < x < a; 0, otherwise }
a) Find a to have a valid PDF.
b) Find E [X 3 ].
4. The random variable X has PDF
fX(x) = { 1/4, 0 < x ≤ 2; cx + 1, 2 < x ≤ 4; 0, otherwise }
a) Find c to have a valid PDF.
b) Find the CDF of X and plot it.
c) Find P [1 < X < 3] (It is a good practice to find this value from both
the PDF and CDF to compare both methods).
5. The waiting time at a bank teller is modeled as an exponential random
variable and the average waiting time is measured to be 2 minutes (in other
words E[T ] = 2).
a) What is the value of λ?
b) What is the probability of waiting more than 5 minutes?
c) What is the probability of waiting more than 8 minutes (in total) given
that you have already waited 5 minutes?
d) Find a such that in 90% of cases the waiting time is less than a. (Hint:
In other words find a such that P [T < a] = 0.9.)
e) Given that the waiting time is less than 5 minutes, what is the probability that it is also less than 3 minutes?
6. Jim has started an internet file sharing website with 3 file servers A, B and C.
The file requests are handled by these 3 servers in the order of A, B, C. Thus
server A will service requests number 1, 4, 7, . . .; server B will service requests
number 2, 5, 8, . . . and server C will service requests 3, 6, 9, . . .. Servicing each
request takes 10ms. The time interval T between requests that are handled
by any given server is an Erlang(3, λ) random variable, where 1/λ = 2ms.
a) Write down and expand the CDF of this Erlang random variable.
b) Find the probability of T > 10ms (meaning that a server is not busy
when it received its next request).
c) Find the expected value and the variance of T .
d) For maintenance, server C is taken out of service and the traffic is
handled by servers A and B. Find the probability of T > 10ms in this
case. [Hint: since only two servers work, T is an Erlang(2, λ) R.V. with
λ as before.]
7. A study shows that the heights of Canadian men are independent Gaussian
random variables with mean 175cm and standard deviation 10cm. Use the
table on page 123 of the textbook to answer the following questions:
a) What percentage of men are at least 165 cm?
b) What is the probability that a randomly chosen man is shorter than
160 cm or taller than 195 cm
c) A similar study shows that the heights of Canadian women are also
independent Gaussian random variables with mean 165 cm. It also
shows that 91% of the women are shorter than 177 cm. What is the
variance of the heights of Canadian women?
d) What percentage of women are at least 165 cm? Compare with part
a).
e) [Just read this part] A random man and a random woman are selected.
What is the probability that the woman is taller than the man? [Think
about this question, and try to realize why we cannot solve it. We learn
how to approach such questions in Chapter 4].
8. The life time in months, X, of light bulbs produced by two manufacturing
plants A and B are exponential with λ = 1/4 and λ = 1/2, respectively.
Plant B produces 3 times as many bulbs as plant A. The bulbs are mixed
together and sold.
a) What is the probability that a light bulb purchased at random will last
at least five months?
b) Given that a light bulb has lasted more than five months, what is the
probability that it is manufactured by plant A?
9. The input voltage X to an analog to digital converter (A/D) is a Gaussian(6, 16)
random variable. The input/output relation of the A/D is given by
Y = { 0, X ≤ 0; 1, 0 < X ≤ 4; 2, 4 < X ≤ 8; 3, 8 < X ≤ 12; 4, X > 12 }
Find and plot the PMF of Y .
10. A study shows that the height of a randomly selected Canadian man is a
Gaussian random variable with mean 175 cm and standard deviation 10 cm.
A random Canadian man is selected. Given that his height is at least 165 cm,
answer the following:
a) What is the probability that his height is at least 175 cm?
b) What is the probability that his height is at most 185 cm?
11. A Laplace(λ) random variable is one with the following PDF.
fX(x) = (λ/2) e^(−λ|x|), −∞ < x < ∞
a) For λ = 1, find the conditional PDF of X given that |X| > 2.
b) For λ = 1, find E[X||X| > 2].
12. X is an Exponential(λ) random variable. Let Z = X 2 .
a) Find the PDF of Y = 3X + 1.
b) Find the PDF of Z.
c) Find E[Z] and VAR[Z].
13. Let X be a uniform RV on [0, 2]. Compute the mean and variance of Y = g(X) where

g(x) = { 0, x < 0; 2x, 0 ≤ x < 1/2; 2 − 2x, 1/2 ≤ x < 1; 0, x ≥ 1 }

Repeat the above if X is exponential with a mean of 0.5.
14. The random variable W has the PDF
fW(w) = { 1 − w, 0 < w < 1; w − 1, 1 ≤ w < 2; 0, otherwise }

Let A = {W > 1} and B = {0.5 < W < 1.5}.

a) Derive fW|A(w) and sketch it.
b) Find FW|A(w) and sketch it.
c) Derive fW|B(w) and sketch it.
d) Find FW|B(w) and sketch it.
3.9
Solutions for the Illustrated Problems

1.
a) Not valid (not monotonically increasing)
b) Not valid (discontinuous at x = 1, so it cannot be the CDF of a continuous RV)
c) A valid CDF.
f(x) = { cos(x), 0 ≤ x ≤ π/2; 0, otherwise }
d) A valid CDF.
f(x) = { a exp(−a(x + 4)), −4 < x; 0, otherwise }
2.
a) fY(y) = dFY(y)/dy = { 1/3, −2 ≤ y ≤ 1; 0, otherwise }
b) P[Y < 0] = F(0) = 2/3 = 0.67
c) E[Y] = ∫_{−∞}^{∞} y f(y) dy = ∫_{−2}^{1} (y/3) dy = −0.5
d) E[Y²] = ∫_{−∞}^{∞} y² f(y) dy = ∫_{−2}^{1} (y²/3) dy = (8 + 1)/9 = 1
VAR[Y] = E[Y²] − E²[Y] = 0.75
e) FY(a) = 1 − FY(a) ⇒ FY(a) = 0.5 ⇒ a = −0.5
3.
a) Solving ∫_0^a (3 − 4x) dx = 1 ⇒ 3a − 2a² = 1, we get a ∈ {0.5, 1}.
The PDF fX(x) is non-negative on (0, a) only if 3 − 4a ≥ 0 ⇒ a ≠ 1.
Therefore, a = 0.5.
b) E[X³] = ∫_0^a x³(3 − 4x) dx = 0.75a⁴ − 0.8a⁵ = 0.0219

4.
a) ∫_{−∞}^{∞} fX(x) dx = 1 ⇒ ∫_2^4 (cx + 1) dx = 0.5 ⇒ 6c + 2 = 0.5 ⇒ c = −0.25
b) FX(x) = ∫_{−∞}^{x} fX(t) dt = { 0, x ≤ 0; x/4, 0 < x ≤ 2; −1 + x − x²/8, 2 < x ≤ 4; 1, x > 4 }
c) P[1 < X < 3] = P[1 < X ≤ 3] = FX(3) − FX(1) = 0.875 − 0.25 = 0.625
5. The PDF of an Exponential(λ) random variable X is given by fX (x) =
λe−λx , x ≥ 0, λ > 0.
a) Waiting time T is given to be an Exponential(λ) random variable,
whose mean E[T ] = λ1 is 2. Thus we get λ = 0.5.
b) Therefore the PDF of T is fT(t) = 0.5e^(−0.5t), t ≥ 0.
P[T > 5] = ∫_5^∞ 0.5e^(−0.5t) dt = e^(−2.5) = 0.0821
c) We know that the exponential distribution is 'memoryless'.
Therefore P[T > 8|T > 5] = P[T > 3] = ∫_3^∞ 0.5e^(−0.5t) dt = e^(−1.5) = 0.2231
Note: You can derive the same result using conditional probability.
d) P[T < a] = 1 − e^(−0.5a) = 0.9 ⇒ a = −ln(0.1)/λ = 2.3026/0.5 = 4.6052 min
e) P[T < 3|T < 5] = P[T < 5|T < 3] P[T < 3]/P[T < 5] = 1 × (1 − 0.2231)/(1 − 0.0821) = 0.8464
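A quick numeric check of parts b)-d):

```python
from math import exp, log

lam = 0.5
tail = lambda t: exp(-lam * t)        # P[T > t] for Exponential(lam)

assert abs(tail(5) - 0.0821) < 1e-4               # part b)
assert abs(tail(8) / tail(5) - tail(3)) < 1e-12   # memorylessness, part c)
a = -log(0.1) / lam                               # part d)
assert abs(a - 4.6052) < 1e-4
```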
6. The CDF of an Erlang(k, λ) random variable X is given by FX(x) = 1 − Σ_{n=0}^{k−1} e^(−λx) (λx)^n / n!.

a) Therefore, the CDF of the time interval T ∼ Erlang(3, λ) is:
FT(t) = 1 − Σ_{n=0}^{2} e^(−0.5t) (0.5t)^n / n! = 1 − e^(−0.5t) (1 + t/2 + t²/8)
Note: all time values are measured in milliseconds.
b) P[T > 10] = 1 − FT(10) = e^(−5) (1 + 5 + 12.5) = 0.1247
c) E[T] = k/λ = 3/0.5 = 6 ms
VAR[T] = k/λ² = 12 ms²
d) Server outage makes T ∼ Erlang(2, λ) and FT(t) = 1 − Σ_{n=0}^{1} e^(−0.5t) (0.5t)^n / n!.
Thus, P[T > 10] = 1 − P[T < 10] = Σ_{n=0}^{1} e^(−5) 5^n / n! = 0.0404
7. Let X be a random variable denoting the height of a Canadian man (in cm) under consideration. Then we know that X ∼ N(175, 100).

a) P[X > 165] = P[(X − 175)/10 > −1] = 1 − Φ(−1) = Φ(1) = 0.84134 = 84.134%

b) P[{X < 160} ∪ {X > 195}] = 1 − P[160 < X < 195]
= 1 − P[−1.5 < (X − 175)/10 < 2]
= 1 − (Φ(2) − Φ(−1.5)) = 1 − (0.97725 − 0.06681) = 0.0896

c) Let Y be a random variable denoting the height of a Canadian woman (in cm) under consideration. Then we know that Y ∼ N(165, σ²), whose variance σ² is to be determined.
Given 91% of women are shorter than 177 cm,
P[Y < 177] = P[(Y − 165)/σ < 12/σ] = Φ(12/σ) = 0.91
Therefore, σ = 12/Φ^(−1)(0.91) = 12/1.34097 = 8.9487 ⇒ VAR[Y] = 80.08

d) P[Y > 165] = 0.5 = 50%. Therefore, when compared to Canadian men, a Canadian woman is less likely to be taller than 165 cm.
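The same parts can be re-derived with `math.erf` in place of the Φ table; a minimal sketch:

```python
from math import erf, sqrt

phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))   # standard normal CDF

assert abs((1 - phi(-1.0)) - 0.84134) < 1e-4              # a) P[X > 165]
assert abs((1 - (phi(2.0) - phi(-1.5))) - 0.0896) < 1e-3  # b) P[X<160 or X>195]
sigma = 12 / 1.34097                                      # c) Phi(12/sigma) = 0.91
assert abs(sigma**2 - 80.08) < 0.05
```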
8. The conditional PDFs fX|A(x) = 0.25e^(−0.25x), x ≥ 0 and fX|B(x) = 0.5e^(−0.5x), x ≥ 0 of the life time X of a bulb are known, given the plant which manufactured it. We also know that P[B] = 3P[A].

a) Solving P[A] + P[B] = 1 (since any bulb is manufactured in one of the plants A and B) and P[B] = 3P[A], we get: P[A] = 0.25, P[B] = 0.75.
Applying the law of total probability,
P[X > 5] = P[X > 5|A]P[A] + P[X > 5|B]P[B] = e^(−5/4)(0.25) + e^(−5/2)(0.75) = 0.1332

b) Using Bayes' rule, P[A|X > 5] = P[X > 5|A]P[A]/P[X > 5] = e^(−5/4)(0.25)/0.1332 = 0.0716/0.1332 = 0.5377
9. Y can only take values in {0, 1, 2, 3, 4}.
P[Y = 0] = P[X ≤ 0] = Φ((0 − 6)/4) = Φ(−1.5) = 1 − Φ(1.5) ≈ 0.067
P[Y = 1] = P[0 < X ≤ 4] = Φ((4 − 6)/4) − Φ((0 − 6)/4) = Φ(1.5) − Φ(0.5) ≈ 0.241
P[Y = 2] = P[4 < X ≤ 8] = Φ((8 − 6)/4) − Φ((4 − 6)/4) = 2Φ(0.5) − 1 ≈ 1.383 − 1 = 0.383
P[Y = 3] = P[8 < X ≤ 12] = Φ((12 − 6)/4) − Φ((8 − 6)/4) = Φ(1.5) − Φ(0.5) ≈ 0.241
P[Y = 4] = P[X > 12] = 1 − Φ((12 − 6)/4) = 1 − Φ(1.5) ≈ 0.067
10.
a) P[X > 175|X > 165] = P[X > 175]/P[X > 165] = 0.5/Φ(1) = 0.5/0.8413 = 0.5943
b) P[X < 185|X > 165] = P[165 < X < 185]/P[X > 165] = (2Φ(1) − 1)/Φ(1) = 0.6826/0.8413 = 0.8114

11.
a) FX(x | |X| > 2) = P[|X| > 2, X ≤ x]/P[|X| > 2]
= { F(x)/P[|X| > 2], x < −2; F(−2)/P[|X| > 2], −2 ≤ x ≤ 2; (F(x) − P[|X| < 2])/P[|X| > 2], x > 2 }
P[|X| > 2] = e^(−2)
Therefore, the PDF is equal to
fX(x | |X| > 2) = { e^(2−|x|)/2, x < −2; 0, −2 ≤ x ≤ 2; e^(2−|x|)/2, x > 2 }
b) E(X | |X| > 2) = ∫_{−∞}^{∞} x fX(x | |X| > 2) dx = 0 (due to the symmetry of the conditional PDF)
12.
a) fX(x) = 0.1e^(−x/10), x > 0
fY(y) = { fX((y − 1)/3)/3 = e^(−(y−1)/30)/30, y > 1; 0, otherwise }
b) fZ(z) = { fX(√z)/(2√z) = e^(−√z/10)/(20√z), z > 0; 0, otherwise }
c) E[Z] = ∫_{−∞}^{∞} x² fX(x) dx = (1/10) ∫_0^∞ x² e^(−x/10) dx = 100 × Γ(3) = 200
E[Z²] = ∫_{−∞}^{∞} x⁴ fX(x) dx = (1/10) ∫_0^∞ x⁴ e^(−x/10) dx = 10000 × Γ(5) = 240000
VAR[Z] = 240000 − 40000 = 200000
13. Given X ∼ Uniform(0, 2),

E[Y] = E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx
= ∫_0^{1/2} 2x (1/2) dx + ∫_{1/2}^{1} (2 − 2x)(1/2) dx = 1/4

E[Y²] = E[g²(X)] = ∫_{−∞}^{∞} (g(x))² fX(x) dx
= ∫_0^{1/2} (2x)² (1/2) dx + ∫_{1/2}^{1} (2 − 2x)² (1/2) dx = 1/6

VAR[Y] = E[Y²] − (E[Y])² = 1/6 − (1/4)² = 0.1042

For the exponential case:
For X ∼ Exponential(λ), we know that E[X] = 1/λ. Therefore, we find λ = 2 and PDF fX(x) = 2e^(−2x), x ≥ 0.

E[Y] = E[g(X)] = ∫_0^{1/2} 2x (2e^(−2x)) dx + ∫_{1/2}^{1} (2 − 2x)(2e^(−2x)) dx = 0.3996

E[Y²] = E[g²(X)] = ∫_0^{1/2} (2x)² (2e^(−2x)) dx + ∫_{1/2}^{1} (2 − 2x)² (2e^(−2x)) dx = 0.2578

VAR[Y] = E[Y²] − (E[Y])² = 0.09815
14.
P[A] = P[W > 1] = ∫_1^∞ fW(w) dw = ∫_1^2 (w − 1) dw = 1/2
P[B] = P[0.5 < W < 1.5] = ∫_{0.5}^{1} (1 − w) dw + ∫_1^{1.5} (w − 1) dw = 1/4

a) Conditioning on A we get,
fW|A(w) = { 2(w − 1), 1 ≤ w ≤ 2; 0, otherwise }

b) FW|A(w) = ∫_{−∞}^{w} fW|A(t) dt = { 0, w < 1; (w − 1)², 1 ≤ w < 2; 1, w ≥ 2 }

c) Conditioning on B we get,
fW|B(w) = { 4(1 − w), 0.5 < w < 1; 4(w − 1), 1 < w < 1.5; 0, otherwise }

d) FW|B(w) = ∫_{−∞}^{w} fW|B(t) dt = { 0, w < 0.5; (w − 1/2)(3 − 2w), 0.5 ≤ w ≤ 1; 0.5 + 2(w − 1)², 1 ≤ w < 1.5; 1, w ≥ 1.5 }

3.10
Drill Problems
Section 3.1 and Section 3.2 - PDF and CDF
1. Test if the following functions are valid PDF's. Provide brief details on how you tested each.

a) f(x) = { 0.75x(2 − x), 0 ≤ x ≤ 2; 0, otherwise }
b) f(x) = { 0.5e^(−x), 0 < x < ∞; 0, otherwise }
c) f(x) = { 2x − 1, 0 ≤ x ≤ 0.5(1 + √5); 0, otherwise }
d) f(x) = { 0.5(x + 1), |x| ≤ 1; 0, otherwise }

For the valid PDF's, derive the corresponding CDF.

Ans
a) Yes; F(x) = { 0, x < 0; x²(3 − x)/4, 0 ≤ x ≤ 2; 1, x > 2 }
b) Not a PDF, since ∫ f(x) dx ≠ 1.
c) Not a PDF, since f(x) < 0 for some x.
d) Yes; F(x) = { 0, x < −1; (x + 1)²/4, −1 ≤ x ≤ 1; 1, x > 1 }
2. The cumulative distribution function of random variable X is
FX(x) = { 0, x < −1; (x + 1)/2, −1 ≤ x < 1; 1, x ≥ 1 }

a) What is P[X > 1/2]?
b) What is P[−1/2 < X ≤ 3/4]?
c) What is P[|X| ≤ 1/2]?
d) What is the value of a such that P[X ≤ a] = 0.8?
e) Find the PDF fX(x) of X.

Ans
a) 1/4 b) 5/8 c) 1/2 d) 0.6
e) fX(x) = { 1/2, −1 ≤ x ≤ 1; 0, otherwise }
3. The random variable X has probability density function
fX(x) = { cx, 0 ≤ x ≤ 2; 0, otherwise }

Use the PDF to find
a) the constant c,
b) P[0 ≤ X ≤ 1],
c) P[−1/2 ≤ X ≤ 1/2],
d) the CDF FX(x).

Ans
a) 1/2 b) 1/4 c) 1/16
d) FX(x) = { 0, x < 0; x²/4, 0 ≤ x ≤ 2; 1, x > 2 }
Section 3.3 - Expected Values
4. Continuous random variable X has PDF
fX(x) = { 1/4, −1 ≤ x ≤ 3; 0, otherwise }

Define the random variable Y by Y = h(X) = X².

a) Find E[X] and VAR[X].
b) Find h(E[X]) and E[h(X)].
c) Find E[Y] and VAR[Y].

Ans
a) E[X] = 1, VAR[X] = 4/3 b) h(E[X]) = 1, E[h(X)] = 7/3
c) E[Y] = 7/3, VAR[Y] = 304/45
5. The cumulative distribution function of random variable U is

FU(u) = { 0, u < −5; (u + 5)/8, −5 ≤ u < −3; 1/4, −3 ≤ u < 3; 1/4 + 3(u − 3)/8, 3 ≤ u < 5; 1, u ≥ 5 }

a) What is E[U]?
b) What is VAR[U]?
c) What is E[2^U]?

Ans
a) 2 b) 37/3 c) 13.001
Section 3.4 - Families of RVs
6. You take a bus to the university from your home. The time X for this trip is uniformly distributed between 40 and 55 minutes. (a) Find E[X] and VAR[X]. (b) Find the probability that the trip takes more than 50 minutes. (c) Find the probability that it takes less than 45 minutes.

Ans
a) E[X] = 47.5 min, VAR[X] = 18.75 min².
b) 1/3 c) 1/3
7. Let X be a uniform RV on [0, 2]. Compute the mean and variance of Y = g(X) where

g(x) = { 0, x < 0; 2x, 0 < x < 1/2; 2 − 2x, 1/2 < x < 1; 0, 1 < x }

Ans
For X ∼ Uniform(0, 2): E[Y] = 1/4, VAR[Y] = 5/48
8. The random variable X, which represents the life time in years of a communication satellite, has the following PDF:
fX(v) = { 0.4e^(−0.4v), v ≥ 0; 0, otherwise }

What family of random variables is this?
Find the expected value, variance and CDF of X. What is the probability that the satellite will last longer than 3 years? If the satellite is 3 years old, what is the probability that it will last for another 3 years or more?

Ans
Exponential; E[X] = 2.5, VAR[X] = 6.25
FX(v) = { 0, v < 0; 1 − e^(−0.4v), v ≥ 0 }
P[X > 3] = 0.3012; P[X > 6|X > 3] = P[X > 3] = 0.3012
9. The life time in months, X, of light bulbs (identical specifications) produced
by two manufacturing plants A and B are exponential with λ = 1/5 and
λ = 1/2, respectively. Plant B produces four times as many bulbs as plant
A. The bulbs are mixed together and sold. What is the probability that a
light bulb purchased at random will last at least (a) two months; (b) five
months; (c) seven months.
Ans
a) 0.4284
b) 0.1392
c) 0.0735
10. X is an Erlang(3, 0.2) RV. Calculate the value of P[1.3 < X < 4.6].
Ans
0.0638
Section 3.5 - Gaussian RVs
11. A Gaussian random variable, X, has a mean of 10 and a variance of 12.
a) Find the probability that X is less than 13.
b) Find P [−1 < X < 1].
c) If Y = 2X + 3, find the mean and variance of Y .
d) Find P [0 < Y ≤ 80].
Ans
a) 0.8068
b) 0.00394
c) 23, 48
d) 0.9995
12. A Gaussian random variable, X, has an unknown mean but a standard
deviation of 4.
a) The random variable is positive on 32% of the trials. What is the mean
value?
b) This random variable is changed to another Gaussian random variable through the linear transformation Y = X/2 + 1. Find the expected value of Y.
c) Find the variance of Y .
d) Find the mean of the square of Y .
Ans
a) -1.871
b) 0.0646
c) 4
d) 4.0042
Section 3.6 - Function of a RV
13. X is a Uniform(0,4) RV. The RV Y is obtained by Y = (X − 2)2 .
a) Derive the PDF and CDF of Y .
b) Find E[Y ].
c) Find VAR [Y ].
Ans
a) fY(y) = { 1/(4√y), 0 < y ≤ 4; 0, otherwise }
FY(y) = { 0, y < 0; √y/2, 0 ≤ y ≤ 4; 1, y > 4 }
b) E[Y] = 4/3 c) VAR[Y] = 64/45
14. The RV X is N(0, 1). Let

Y = { 1, X ≤ 0; 2, X > 0 }

a) Find the PMF and CDF of Y.
b) Find E[Y].
c) Find VAR[Y].

Ans
a) PY(y) = 0.5 for y ∈ {1, 2}, ZOW.
FY(y) = { 0, y < 1; 0.5, 1 ≤ y < 2; 1, y ≥ 2 }
b) E[Y] = 3/2 c) VAR[Y] = 1/4
Section 3.7 - Conditioning
15. The random variable X is uniformly distributed between 0 and 5. The event
B is B = {X > 3.7}. What are fX|B(x), µX|B, and σ²X|B?
Ans
fX|B(x) = 10/13 for 3.7 ≤ x ≤ 5, and 0 otherwise

µX|B = 4.35,  σ²X|B = 0.1408
16. Let X be an exponential random variable with CDF
FX(x) = 1 − e^{−x/3} for x ≥ 0, and 0 for x < 0,
and let B be the event B = {X > 2}. What are fX|B(x), µX|B, and σ²X|B?
Ans
fX|B(x) = (1/3)e^{−(x−2)/3} for x > 2, and 0 otherwise

µX|B = 5,  σ²X|B = 9 (by the memoryless property, X − 2 given B is again Exponential(1/3))
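The exact values µ = 2 + 3 = 5 and σ² = 3² = 9 can be confirmed by numerically integrating the conditional PDF (a sketch):

```python
import math

# f_{X|B}(x) = (1/3) e^{-(x-2)/3} for x > 2; midpoint rule on [2, 62]
# (the tail beyond is of order e^{-20}, negligible here).
n = 200_000
dx = 60 / n
mu = ex2 = 0.0
for k in range(n):
    x = 2 + (k + 0.5) * dx
    f = math.exp(-(x - 2) / 3) / 3
    mu += x * f * dx
    ex2 += x * x * f * dx
var = ex2 - mu ** 2   # -> 9
```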
17. X is Gaussian with a mean of 997 and a standard deviation of 31. What is
the probability of B where B = {X > 1000}? And what is the pdf for X
conditioned by B?
Ans
P[X > 1000] = 0.4615

fX|B(x) = fX(x)/P[X > 1000] for x > 1000, and 0 otherwise
Chapter 4
Pairs of Random Variables
4.1 Joint Probability Mass Function
Definition 4.1: The joint PMF of two discrete RVs X and Y is
PX,Y (a, b) = P [X = a, Y = b].
Theorem 4.1: The joint PMF PX,Y (x, y) has the following properties
1. 0 ≤ PX,Y (x, y) ≤ 1 for all (x, y) ∈ SX,Y .
2. Σ_{(x,y)∈SX,Y} PX,Y(x, y) = 1.
3. For event B, P[B] = Σ_{(x,y)∈B} PX,Y(x, y).

4.2 Marginal PMFs
Theorem 4.2: For discrete RV’s X and Y with joint PMF PX,Y (x, y), the
marginals are
PX(x) = Σ_{y∈SY} PX,Y(x, y)  and  PY(y) = Σ_{x∈SX} PX,Y(x, y).
4.3 Joint Probability Density Function
Definition 4.2: The joint PDF of the continuous RVs (X, Y ) is defined (indirectly) as
∫_{−∞}^{x} ∫_{−∞}^{y} fX,Y(u, v) du dv = FX,Y(x, y).
Theorem 4.3:
fX,Y(x, y) = ∂²FX,Y(x, y) / ∂x∂y
Theorem 4.4: A joint PDF fX,Y (x, y) satisfies the following two properties
1. fX,Y (x, y) ≥ 0 for all (x, y).
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y(x, y) dx dy = 1.
Theorem 4.5: The probability that the continuous random variables (X, Y )
are in B is
P[B] = ∫∫_{(x,y)∈B} fX,Y(x, y) dx dy.
4.4
Marginal PDFs
Theorem 4.6: For an (X, Y) pair with joint PDF fX,Y(x, y), the marginal PDFs are

fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy,   fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx.

4.5 Functions of Two Random Variables
Theorem 4.7: For discrete RV’s X and Y , the derived random variable W =
g(X, Y ) has PMF,
PW(w) = Σ_{(x,y): g(x,y)=w} PX,Y(x, y),
i.e., the sum of all PX,Y(x, y) where x and y are subject to g(x, y) = w.
We use this theorem to calculate probabilities of event {W = w}.
Theorem 4.8: For continuous RV’s X and Y , the CDF of W = g(X, Y ) is
FW(w) = P[W ≤ w] = ∫∫_{g(x,y)≤w} fX,Y(x, y) dx dy.
It is useful to draw a picture in the plane to calculate the double integral.
Theorem 4.9: For continuous RV’s X and Y , the CDF of W = max(X, Y ) is
FW(w) = FX,Y(w, w) = ∫_{−∞}^{w} ∫_{−∞}^{w} fX,Y(x, y) dx dy.

4.6 Expected Values
Theorem 4.10: For RV’s X and Y , the expected value of W = g(X, Y ) is
E[W] = Σ_{x∈SX} Σ_{y∈SY} g(x, y)PX,Y(x, y) (discrete),
E[W] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y)fX,Y(x, y) dx dy (continuous).
Note: the expected value of W can be computed without first finding its PMF or PDF.
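For the discrete case this is a direct sum over the joint PMF; a small sketch with a made-up joint PMF and a made-up g (both hypothetical, for illustration only):

```python
# E[g(X, Y)] computed directly from a joint PMF, without the PMF of W = g(X, Y).
pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.3, (1, 1): 0.2}  # example joint PMF

def g(x, y):
    return x + 2 * y

ew = sum(g(x, y) * p for (x, y), p in pmf.items())
print(round(ew, 6))  # -> 1.5
```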
Theorem 4.11: E[g1 (X, Y ) + · · · + gn (X, Y )] = E[g1 (X, Y )] + · · · + E[gn (X, Y )]
Theorem 4.12: The variance of the sum of two RV’s is
VAR [X + Y ] = VAR [X] + VAR [Y ] + 2E[(X − µX)(Y − µY)].
Definition 4.3: The covariance of two RV’s X and Y is defined as
Cov[X, Y ] = E[(X − µX )(Y − µY )] = E[XY ] − µX µY .
Theorem 4.13:
1. VAR [X + Y ] = VAR [X] + VAR [Y ] + 2Cov[X, Y ].
2. If X = Y , Cov[X, Y ] = VAR [X] = VAR [Y ]
Definition 4.4: If Cov[X, Y ] = 0, RV’s X and Y are said to be uncorrelated.
Definition 4.5: The correlation coefficient of RV’s X and Y is ρX,Y = Cov[X, Y]/(σX σY).
Theorem 4.14: −1 ≤ ρX,Y ≤ 1.
4.7 Conditioning by an Event
Theorem 4.15: For event B, a region in the (X, Y ) plane with P [B] > 0,
PX,Y|B(x, y) = PX,Y(x, y)/P[B] for (x, y) ∈ B, and 0 otherwise.
We use this to calculate the conditional joint PMF, conditioned on an event.
Theorem 4.16: For continuous RV’s X and Y and event B with P [B] > 0,
the conditional joint PDF of X and Y given B is
fX,Y|B(x, y) = fX,Y(x, y)/P[B] for (x, y) ∈ B, and 0 otherwise.
Theorem 4.17: For RV’s X and Y and an event B with P [B] > 0, the
conditional expected value of W = g(X, Y ) given B is
E[W|B] = Σ_{x∈SX} Σ_{y∈SY} g(x, y)PX,Y|B(x, y) (discrete),
E[W|B] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y)fX,Y|B(x, y) dx dy (continuous).

4.8 Conditioning by an RV
Definition 4.6: For event {Y = y} with non-zero probability, the conditional
PMF of X is
PX|Y(x|y) = P[X = x | Y = y] = P[X = x, Y = y]/P[Y = y] = PX,Y(x, y)/PY(y).
Theorem 4.18: For discrete RV’s X and Y with joint PMF PX,Y (x, y) and x
and y such that PX (x) > 0 and PY (y) > 0,
PX,Y (x, y) = PX|Y (x|y)PY (y) = PY |X (y|x)PX (x).
This allows us to derive the joint PMF from conditional joint PMF and marginal
PMF.
Definition 4.7: The conditional PDF of X given {Y = y} is
fX|Y(x|y) = fX,Y(x, y)/fY(y),
where fY(y) > 0. Similarly,
fY|X(y|x) = fX,Y(x, y)/fX(x).
Theorem 4.19: X and Y are discrete RV’s. For any y ∈ SY, the conditional
expected value of g(X, Y) given Y = y is
E[g(X, Y)|Y = y] = Σ_{x∈SX} g(x, y)PX|Y(x|y).
Definition 4.8: For continuous RV’s X and Y , and any y such that fY (y) > 0,
the conditional expected value of g(X, Y ) given Y = y is
E[g(X, Y)|Y = y] = ∫_{−∞}^{∞} g(x, y)fX|Y(x|y) dx.
To calculate conditional moments, we need the conditional PMF or PDF first.
Theorem 4.20: Iterated Expectation
E[E[X|Y ]] = E[X].
4.9 Independent Random Variables
Definition 4.9: RV’s X and Y are independent if and only if
Discrete: PX,Y (x, y) = PX (x)PY (y). Continuous: fX,Y (x, y) = fX (x)fY (y),
for all values of x and y.
Theorem 4.21: For independent RV’s X and Y ,
1. E[g(X)h(Y)] = E[g(X)]E[h(Y)]; in particular E[XY] = E[X]E[Y], thus Cov[X, Y] = 0.
2. VAR [X + Y ] = VAR [X] + VAR [Y ], and E[X|Y = y] = E[X], E[Y |X = x] = E[Y ].
4.10 Bivariate Gaussian Random Variables
Definition 4.10: X and Y are bivariate Gaussian with parameters µ1 , σ1 ,µ2 ,
σ2 , ρ if the joint PDF is
fX,Y(x, y) = 1/(2πσ1σ2√(1 − ρ²)) · exp{ −[((x − µ1)/σ1)² − 2ρ(x − µ1)(y − µ2)/(σ1σ2) + ((y − µ2)/σ2)²] / (2(1 − ρ²)) },
where µ1 , µ2 can be any real numbers, σ1 > 0, σ2 > 0 and −1 < ρ < 1.
Theorem 4.22: If X and Y are bivariate Gaussian RV’s, then X is N(µ1, σ1²) and Y is N(µ2, σ2²). That is,
fX(x) = (1/√(2πσ1²)) e^{−(x−µ1)²/(2σ1²)},   fY(y) = (1/√(2πσ2²)) e^{−(y−µ2)²/(2σ2²)}.
Theorem 4.23: Bivariate Gaussian RV’s X and Y have the correlation coefficient ρX,Y = ρ.
Theorem 4.24: Bivariate Gaussian RV’s X and Y are uncorrelated if and only
if they are independent, i.e., ρ = 0 implies that X and Y are independent.
4.11 Illustrated Problems
1. The joint PMF of two random variables is given in Table 4.1.
a) Find P [X < Y ].
b) Find E[Y ].
c) Find E[X|Y = 2].
      X=0   X=1   X=2   X=3
Y=1  0.03  0.10  0.02  0.02
Y=2  0.05  0.07  0.02  0.05
Y=3  0.02  0.20  0.02  0.05
Y=4  0.05  0.23  0.04  0.03

Table 4.1: for Question 1
2. The joint PDF of X and Y is given as
fX,Y(x, y) = c(3x² + 2y) for 0 < x < 1, 0 < y < 1, and 0 otherwise.
a) Find the value of c to ensure a valid PDF.
b) Find the PDF of X.
3. The joint PDF of X and Y is
fX,Y(x, y) = cx²y for 0 ≤ x < 1, 0 ≤ y ≤ 2, and 0 otherwise.
a) Find the value of constant c.
b) Find P [Y < 1].
c) Find P [Y < X].
d) Find P [Y > X 2 ].
e) Find the PDF of V = min{X, Y }.
f) Find the PDF of U = X/Y .
4. The joint PDF of X and Y is
fX,Y(x, y) = 2.5x² for −1 ≤ x ≤ 1, 0 ≤ y ≤ x², and 0 otherwise.
a) Find the marginal PDF of X.
b) Find the marginal PDF of Y .
c) Find Var[X] and Var[Y ].
d) Find Cov[X, Y ] and ρX,Y (the correlation coefficient of X and Y ).
5. Let X and Y be jointly distributed with
fX,Y(x, y) = ke^{−3x−2y} for x, y ≥ 0, and 0 otherwise.
a) Find k and the PDFs of X and Y .
b) Find means and variances of X and Y .
c) Are X and Y independent? Find ρX,Y .
6. X and Y are jointly distributed with
fX,Y(x, y) = ke^{−3x−2y} for 0 ≤ y ≤ x, and 0 otherwise.
a) Find k and the marginal PDFs and CDFs of X and Y .
b) Find means and variances of X and Y .
c) Are X and Y independent? Find ρX,Y .
7. Let Z = X + Y , where X and Y are jointly distributed with
fX,Y(x, y) = c for x ≥ 0, y ≥ 0, x + y ≤ 1, and 0 otherwise.
a) Find the PDF and CDF of Z.
b) Find the expected value and variance of Z.
8. Two random variables X and Y are jointly distributed with
fX,Y(x, y) = (x + y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2, and 0 otherwise.
Let event A be defined as A = {Y ≤ 0.5}.
a) Find P [A].
b) Find the conditional PDF fX,Y |A (x, y).
c) Find the conditional PDF fX|A (x).
d) Find the conditional PDF fY |A (y).
9. In Question 4, define B = {X > 0}.
a) Find fX,Y |B (x, y).
b) Find VAR[X|B].
c) Find E[XY |B].
d) Find fY |X (y|x).
e) Find E[Y |X = x].
f) Find E[E[Y |X]].
10. Let X and Y be jointly distributed with
fX,Y(x, y) = 2 for 0 ≤ y ≤ x ≤ 1, and 0 otherwise.
a) Find the PDF fY (y).
b) Find the conditional PDF fX|Y (x|y).
c) Find the conditional expected value E[X|Y = y].
11. Let X and Y be jointly distributed with
fX,Y(x, y) = (4x + 2y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
a) Find the PDFs fY (y) and fX (x).
b) Find the conditional PDF fX|Y (x|y).
c) Find the conditional PDF fY |X (y|x).
12. The joint PDF of two RVs is given by
fX,Y(x, y) = |xy|/8 for x² + y² ≤ 1, and 0 otherwise.
Determine if X and Y are independent.
13. X and Y are independent, with X ∼ Gaussian(5, 15) and Y ∼ Uniform(4,6).
a) Find E[XY ].
b) Find E[X 2 Y 2 ].
4.12 Solutions for the Illustrated Problems

1.
a) P[X < Y] = 0.03 + 0.05 + 0.07 + 0.02 + 0.20 + 0.02 + 0.05 + 0.23 + 0.04 + 0.03 = 0.74 (the entries with x < y in the table below).

b) E[Y] = 1(0.17) + 2(0.19) + 3(0.29) + 4(0.35) = 2.82

c) E[X|Y = 2] = [0(0.05) + 1(0.07) + 2(0.02) + 3(0.05)] / 0.19 = 0.26/0.19 = 1.3684 (the row Y = 2 in the table below).

      X=0   X=1   X=2   X=3  | PY(y)
Y=1  0.03  0.10  0.02  0.02  | 0.17
Y=2  0.05  0.07  0.02  0.05  | 0.19
Y=3  0.02  0.20  0.02  0.05  | 0.29
Y=4  0.05  0.23  0.04  0.03  | 0.35

2.
a) 1 = ∫∫ fX,Y(x, y) dx dy = c ∫_{x=0}^{1} ∫_{y=0}^{1} (3x² + 2y) dx dy = 2c ⇒ c = 1/2

b) fX(x) = ∫ fX,Y(x, y) dy = c ∫_{0}^{1} (3x² + 2y) dy = (1 + 3x²)/2 for 0 < x < 1, and 0 otherwise.
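The three numbers in solution 1 can be checked by iterating over the table entries in code (a sketch; dict keys are (x, y)):

```python
# Joint PMF of Table 4.1: columns X = 0..3, rows Y = 1..4.
P = {(0, 1): 0.03, (1, 1): 0.10, (2, 1): 0.02, (3, 1): 0.02,
     (0, 2): 0.05, (1, 2): 0.07, (2, 2): 0.02, (3, 2): 0.05,
     (0, 3): 0.02, (1, 3): 0.20, (2, 3): 0.02, (3, 3): 0.05,
     (0, 4): 0.05, (1, 4): 0.23, (2, 4): 0.04, (3, 4): 0.03}

p_lt = sum(p for (x, y), p in P.items() if x < y)               # a) 0.74
ey = sum(y * p for (x, y), p in P.items())                      # b) 2.82
py2 = sum(p for (x, y), p in P.items() if y == 2)               # P[Y = 2] = 0.19
ex_y2 = sum(x * p for (x, y), p in P.items() if y == 2) / py2   # c) 1.3684
```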
3.
a) ∫∫ fX,Y(x, y) dx dy = c ∫_{x=0}^{1} ∫_{y=0}^{2} x²y dx dy = 2c/3 = 1 ⇒ c = 3/2

b) P[Y < 1] = ∫_{0}^{1} 1.5x² dx ∫_{0}^{1} y dy = (1/2)(1/2) = 1/4

c) P[Y < X] = ∫_{0}^{1} ∫_{0}^{x} 1.5x²y dy dx = ∫_{0}^{1} (3/4)x⁴ dx = 3/20

d) P[Y > X²] = 1 − ∫_{0}^{1} ∫_{0}^{x²} 1.5x²y dy dx = 1 − ∫_{0}^{1} (3/4)x⁶ dx = 1 − 3/28 = 25/28

e) Case v < 0: both X and Y are always greater than v ⇒ FV(v) = 0.
Case v > 1: X at least is always smaller than v ⇒ FV(v) = 1.
Otherwise:
FV(v) = P[V ≤ v] = P[min(X, Y) ≤ v] = 1 − P[X > v, Y > v] = 1 − ∫_{v}^{2} ∫_{v}^{1} (3/2)x²y dx dy
= 0.25v² + v³ − 0.25v⁵
Thus we have: FV(v) = 0 for v < 0;  0.25v² + v³ − 0.25v⁵ for 0 ≤ v ≤ 1;  1 for v > 1.
As a result: fV(v) = 0.5v + 3v² − 1.25v⁴ for 0 < v < 1, and 0 otherwise.

f) FU(u) = P[U < u] = P[X/Y < u].
For u < 0.5: FU(u) = ∫_{y=0}^{2} ∫_{x=0}^{uy} 1.5x²y dx dy = 3.2u³.
For u > 0.5: FU(u) = 1 − ∫_{x=0}^{1} ∫_{y=0}^{x/u} 1.5x²y dy dx = 1 − 3/(20u²).
As a result: fU(u) = 9.6u² for u < 0.5, and 0.3u⁻³ for u > 0.5.
4.
a) fX(x) = ∫ fX,Y(x, y) dy = ∫_{0}^{x²} 2.5x² dy = 2.5x⁴ for −1 ≤ x ≤ 1, and 0 otherwise.

b) fY(y) = ∫ fX,Y(x, y) dx = ∫_{−1}^{−√y} 2.5x² dx + ∫_{√y}^{1} 2.5x² dx = (5/3)(1 − y^{3/2}) for 0 ≤ y ≤ 1, and 0 otherwise.

c) E[X] = ∫ x fX(x) dx = 2.5 ∫_{−1}^{1} x⁵ dx = 0
E[X²] = ∫ x² fX(x) dx = 2.5 ∫_{−1}^{1} x⁶ dx = 5/7
VAR[X] = E[X²] − E²[X] = 5/7 − 0 = 5/7
E[Y] = ∫ y fY(y) dy = (5/3) ∫_{0}^{1} y(1 − y^{3/2}) dy = 5/14
E[Y²] = ∫ y² fY(y) dy = (5/3) ∫_{0}^{1} y²(1 − y^{3/2}) dy = 5/27
VAR[Y] = E[Y²] − E²[Y] = 5/27 − (5/14)² = 0.05763

d) E[XY] = ∫∫ xy fX,Y(x, y) dx dy = ∫_{x=−1}^{1} ∫_{0}^{x²} xy(2.5x²) dy dx = 0
⇒ COV[X, Y] = E[XY] − E[X]E[Y] = 0 and ρX,Y = COV[X, Y]/√(VAR[X]VAR[Y]) = 0
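A midpoint-rule check of these marginal moments (a numerical sketch; dependency-free):

```python
# f(x, y) = 2.5 x^2 on -1 <= x <= 1, 0 <= y <= x^2.
n, m = 400, 50
dx = 2 / n
mass = ex = ey = ey2 = exy = 0.0
for i in range(n):
    x = -1 + (i + 0.5) * dx
    dy = x * x / m                 # y-grid adapted to the strip height x^2
    for j in range(m):
        y = (j + 0.5) * dy
        w = 2.5 * x * x * dx * dy  # probability mass of this cell
        mass += w
        ex += x * w
        ey += y * w
        ey2 += y * y * w
        exy += x * y * w
var_y = ey2 - ey ** 2              # -> 0.05763
```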
5.
a) ∫∫ fX,Y(x, y) dx dy = k ∫_{0}^{∞} ∫_{0}^{∞} e^{−3x−2y} dx dy = k/6 = 1 ⇒ k = 6
fX(x) = ∫ fX,Y(x, y) dy = 6 ∫_{0}^{∞} e^{−3x−2y} dy = 3e^{−3x} for x ≥ 0, and 0 otherwise.
fY(y) = ∫ fX,Y(x, y) dx = 6 ∫_{0}^{∞} e^{−3x−2y} dx = 2e^{−2y} for y ≥ 0, and 0 otherwise.

b) E[X] = 3 ∫_{0}^{∞} xe^{−3x} dx = 1/3
E[X²] = 3 ∫_{0}^{∞} x²e^{−3x} dx = 2/9 ⇒ VAR[X] = 2/9 − (1/3)² = 1/9
E[Y] = 2 ∫_{0}^{∞} ye^{−2y} dy = 1/2
E[Y²] = 2 ∫_{0}^{∞} y²e^{−2y} dy = 1/2 ⇒ VAR[Y] = 1/2 − (1/2)² = 1/4

c) Yes; because fX,Y(x, y) = fX(x)fY(y) for every x and y.
∴ ρX,Y = 0 (because X and Y are uncorrelated)

6.
a) ∫∫ fX,Y(x, y) dx dy = k ∫_{x=0}^{∞} ∫_{y=0}^{x} e^{−3x−2y} dy dx = k/15 = 1 ⇒ k = 15
fX(x) = ∫_{0}^{x} 15e^{−3x−2y} dy = 7.5(e^{−3x} − e^{−5x}) for x ≥ 0, and 0 otherwise.
fY(y) = ∫_{y}^{∞} 15e^{−3x−2y} dx = 5e^{−5y} for y ≥ 0, and 0 otherwise.

b) E[X] = 7.5 ∫_{0}^{∞} x(e^{−3x} − e^{−5x}) dx = 8/15
E[X²] = 7.5 ∫_{0}^{∞} x²(e^{−3x} − e^{−5x}) dx = 98/225 ⇒ VAR[X] = 98/225 − (8/15)² = 34/225
E[Y] = 5 ∫_{0}^{∞} ye^{−5y} dy = 1/5
E[Y²] = 5 ∫_{0}^{∞} y²e^{−5y} dy = 2/25 ⇒ VAR[Y] = 2/25 − (1/5)² = 1/25

c) No; because fX,Y(x, y) ≠ fX(x)fY(y) for some x and y.
E[XY] = ∫∫ xy fX,Y(x, y) dx dy = ∫_{x=0}^{∞} ∫_{y=0}^{x} 15xye^{−3x−2y} dy dx
= 15 ∫_{x=0}^{∞} xe^{−3x} ( ∫_{y=0}^{x} ye^{−2y} dy ) dx
= 15 ∫_{x=0}^{∞} xe^{−3x} · (1 − (1 + 2x)e^{−2x})/4 dx = 11/75
∴ By definition:
ρX,Y = (E[XY] − E[X]E[Y]) / √(VAR[X]VAR[Y]) = (11/75 − (8/15)(1/5)) / √((34/225)(1/25)) = (1/25)/(√34/75) = 3/√34 = 0.5145
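The moment algebra in solution 6 can be verified with exact rational arithmetic, using the facts ∫₀^∞ x e^{−ax} dx = 1/a² and ∫₀^∞ x² e^{−ax} dx = 2/a³ (a sketch):

```python
import math
from fractions import Fraction as F

ex = F(15, 2) * (F(1, 9) - F(1, 25))               # E[X]   = 8/15
ex2 = F(15, 2) * (F(2, 27) - F(2, 125))            # E[X^2] = 98/225
var_x = ex2 - ex ** 2                              # 34/225
ey, ey2 = F(1, 5), F(2, 25)                        # from f_Y(y) = 5 e^{-5y}
var_y = ey2 - ey ** 2                              # 1/25
exy = F(15, 4) * (F(1, 9) - F(1, 25) - F(4, 125))  # E[XY] = 11/75
rho = float(exy - ex * ey) / math.sqrt(float(var_x * var_y))  # 3/sqrt(34)
print(ex, var_x, exy, round(rho, 4))  # -> 8/15 34/225 11/75 0.5145
```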
7.
a) ∫∫ fX,Y(x, y) dx dy = c/2 = 1 ⇒ c = 2
Consider the CDF of Z:
FZ(z) = P[Z ≤ z] = 0 for z < 0;  ∫_{x=0}^{z} ∫_{y=0}^{z−x} 2 dy dx = z² for 0 ≤ z ≤ 1;  1 for z > 1.
fZ(z) = d/dz FZ(z) = 2z for 0 ≤ z ≤ 1, and 0 otherwise.

b) E[Z] = ∫_{0}^{1} 2z² dz = 2/3
E[Z²] = ∫_{0}^{1} 2z³ dz = 1/2 ⇒ VAR[Z] = 1/2 − (2/3)² = 1/18

8.
a) P[A] = ∫_{y=0}^{0.5} ∫_{x=0}^{1} fX,Y(x, y) dx dy = 1/8

b) fX,Y|A(x, y) = fX,Y(x, y)/(1/8) when A is true, and 0 when A is false
= 8(x + y)/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 0.5, and 0 otherwise.

c) fX|A(x) = ∫_{0}^{0.5} fX,Y|A(x, y) dy = (4x + 1)/3 for 0 ≤ x ≤ 1, and 0 otherwise.

d) fY|A(y) = ∫_{0}^{1} fX,Y|A(x, y) dx = 4(1 + 2y)/3 for 0 ≤ y ≤ 0.5, and 0 otherwise.

9.
a) Because of the symmetry around the y axis, it is easy to see that P[B] = 0.5.
Thus:
fX,Y|B(x, y) = 2fX,Y(x, y) = 5x² for 0 < x ≤ 1, 0 ≤ y ≤ x², and 0 otherwise.

b) fX|B(x) = ∫ fX,Y|B(x, y) dy = ∫_{0}^{x²} 5x² dy = 5x⁴ for 0 < x ≤ 1, and 0 otherwise.
As a result:
E[X|B] = ∫_{0}^{1} 5x⁵ dx = 5/6
E[X²|B] = ∫_{0}^{1} 5x⁶ dx = 5/7
VAR[X|B] = 5/7 − (5/6)² = 5/252

c) E[XY|B] = ∫∫ xy fX,Y|B(x, y) dx dy = ∫_{x=0}^{1} ∫_{y=0}^{x²} xy(5x²) dy dx = 2.5 ∫_{0}^{1} x⁷ dx = 5/16

d) fY|X(y|x) = fX,Y(x, y)/fX(x) = 2.5x²/(2.5x⁴) = 1/x² for −1 ≤ x ≤ 1, 0 < y ≤ x², and 0 otherwise.

e) E[Y|X = x] = ∫_{0}^{x²} (1/x²) y dy = 0.5x² for −1 ≤ x ≤ 1, and 0 otherwise.

f) E[E[Y|X]] = ∫_{−1}^{1} 0.5x² fX(x) dx = ∫_{−1}^{1} 0.5x²(2.5x⁴) dx = 5/14
(compare with E[Y] found in question 4).
10.
a) fY(y) = ∫ fX,Y(x, y) dx = ∫_{y}^{1} 2 dx = 2(1 − y) for 0 ≤ y ≤ 1, and 0 otherwise.

b) fX|Y(x|y) = fX,Y(x, y)/fY(y) = 1/(1 − y) for 0 ≤ y ≤ x ≤ 1, and 0 otherwise.

c) E[X|Y = y] = ∫ x fX|Y(x|y) dx = (1/(1 − y)) ∫_{y}^{1} x dx = (y + 1)/2 for 0 ≤ y ≤ 1.
11.
a) fY(y) = ∫ fX,Y(x, y) dx = ∫_{0}^{1} (4x + 2y)/3 dx = 2(y + 1)/3 for 0 ≤ y ≤ 1, and 0 otherwise.
fX(x) = ∫ fX,Y(x, y) dy = ∫_{0}^{1} (4x + 2y)/3 dy = (4x + 1)/3 for 0 ≤ x ≤ 1, and 0 otherwise.

b) fX|Y(x|y) = fX,Y(x, y)/fY(y) = (2x + y)/(1 + y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.

c) fY|X(y|x) = fX,Y(x, y)/fX(x) = (4x + 2y)/(4x + 1) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
12. No. For example, if we know that X equals 1, then Y can only be zero. In
other words, information about X can change the probability of Y .
The other way to show this is to find marginal distributions and see that
the product of the marginal PDFs is not equal to the joint PDF.
13. From the given data: E[X] = 5, VAR[X] = 15, E[Y] = (4 + 6)/2 = 5, VAR[Y] = (6 − 4)²/12 = 1/3.

a) X and Y are independent. Therefore E[XY] = E[X]E[Y] = 5 × 5 = 25.

b) Similarly,
E[X²Y²] = E[X²]E[Y²] = (VAR[X] + (E[X])²)(VAR[Y] + (E[Y])²) = (15 + 25)(1/3 + 25) = 1013.33
4.13 Drill Problems

Section 4.1, 4.2, 4.3, 4.4 and 4.5 - Joint and marginal PDF/PMFs
1. The joint PDF fX,Y (x, y) = c, for (0 < x < 3) and (0 < y < 4), and is 0
otherwise.
a) What is the value of the constant c?
b) Find the marginal PDFs fX (x) and fY (y)?
90
Pairs of Random Variables
Figure 4.1: Figure for drill problem 3.
c) Determine if X and Y are independent.
Ans
a) c = 1/12
b) fX(x) = 1/3 for 0 ≤ x ≤ 3, and 0 otherwise; fY(y) = 1/4 for 0 ≤ y ≤ 4, and 0 otherwise.
c) Yes
2. The joint PMF of discrete random variables X and Y is given by
PX,Y(x, y) = 0.2 for (x, y) = (0, 0);  0.3 for (1, 0);  0.3 for (0, 1);  c for (1, 1).
a) What is the value of the constant c?
b) Find the marginal PMFs PX(x) and PY(y). Are X and Y independent?
Ans
a) c = 0.2
b) PX(x) = 0.5 for x ∈ {0, 1} and ZOW. PY(y) = 0.5 for y ∈ {0, 1} and ZOW. No
3. Fig. 4.1 shows a region in the x-y plane where the bivariate PDF fX,Y (x, y) =
cx2 . Elsewhere, the PDF is 0.
a) Compute the value of c.
b) Find the marginal PDFs fX (x) and fY (y).
Ans
a) c = 3/32
b) fX(x) = 3x²(2 − x)/32 for −2 ≤ x ≤ 2, and 0 otherwise
fY(y) = (y³ + 8)/32 for −2 ≤ y ≤ 2, and 0 otherwise
4.13 Drill Problems
91
Section 4.6 - Functions of 2 RVs
4. Let X ∼ Uniform(0,3) and Y ∼ Uniform(−2,2). They are independent. Find
the PDF of W = X + Y .
Ans
fW(w) = (w + 2)/12 for −2 ≤ w ≤ 1;  1/4 for 1 < w < 2;  (5 − w)/12 for 2 ≤ w ≤ 5;  0 otherwise
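The piecewise answer is the convolution of the two uniform densities; a numeric sketch comparing the convolution integral with the formula at a few points:

```python
def f_x(x):          # Uniform(0, 3) density
    return 1/3 if 0 <= x <= 3 else 0.0

def f_y(y):          # Uniform(-2, 2) density
    return 1/4 if -2 <= y <= 2 else 0.0

def f_w_numeric(w, n=20_000):
    dx = 3 / n       # f_x vanishes outside [0, 3]
    return sum(f_x(x) * f_y(w - x)
               for x in ((k + 0.5) * dx for k in range(n))) * dx

def f_w(w):          # the piecewise answer
    if -2 <= w < 1:
        return (w + 2) / 12
    if 1 <= w < 2:
        return 1/4
    if 2 <= w <= 5:
        return (5 - w) / 12
    return 0.0
```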
Section 4.7,4.8,4.9 and 4.10 - Expected values,
conditioning and independence
5. The joint PMF of discrete random variables X and Y is given by
PX,Y(x, y) = 0.3 for (x, y) = (0, 0);  a for (1, 0);  0.3 for (0, 1);  b for (1, 1),
where a and b are constants. X and Y are known to be independent.
a) Find a and b.
b) Find the conditional PMF PX,Y|A(x, y), where the event A is defined
as {(x, y)|x ≤ y}.
c) Compute PX|A(x) and PY|A(y). Are X and Y still independent (even
when conditioned on A)? Can you explain why?
Ans
a) a = 0.2; b = 0.2
b) PX,Y|A(x, y) = 3/8 for (0, 0);  3/8 for (0, 1);  2/8 for (1, 1);  0 otherwise
c) PX|A(x) = 6/8 for x = 0, 2/8 for x = 1, and 0 otherwise.
PY|A(y) = 3/8 for y = 0, 5/8 for y = 1, and 0 otherwise.
No. Conditioning creates a dependency.
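A few lines of code recompute the conditional PMFs; note that PY|A(0) = 3/8 and PY|A(1) = 5/8, which differ from PX|A, and the product of the conditional marginals no longer matches the conditional joint PMF (a sketch):

```python
# Joint PMF with a = b = 0.2 forced by independence, then condition on A = {x <= y}.
P = {(0, 0): 0.3, (1, 0): 0.2, (0, 1): 0.3, (1, 1): 0.2}
A = [xy for xy in P if xy[0] <= xy[1]]
pa = sum(P[xy] for xy in A)                   # P[A] = 0.8
P_A = {xy: P[xy] / pa for xy in A}            # 3/8, 3/8, 2/8
px_A = {v: sum(p for (x, y), p in P_A.items() if x == v) for v in (0, 1)}
py_A = {v: sum(p for (x, y), p in P_A.items() if y == v) for v in (0, 1)}
# px_A = {0: 6/8, 1: 2/8};  py_A = {0: 3/8, 1: 5/8}  -> no longer independent
```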
6. Reconsider the problem 3. Suppose an event A is defined as {(x, y)|x ≤
0, y ≥ 0}.
a) Compute the conditional joint PDF fX,Y |A (x, y).
92
Pairs of Random Variables
b) Find the marginal PDFs fX|A (x) and fY |A (y).
c) Are X and Y independent?
Ans
a) fX,Y|A(x, y) = 3x²/16 for −2 ≤ x ≤ 0, 0 ≤ y ≤ 2, and 0 otherwise
b) fX|A(x) = 3x²/8 for −2 ≤ x ≤ 0, and 0 otherwise; fY|A(y) = 1/2 for 0 ≤ y ≤ 2, and 0 otherwise
c) Yes
7. E[X] = −2 and VAR [X] = 3. E[Y ] = 3 and VAR [Y ] = 5. The covariance Cov[X, Y ] = −0.8. What are the correlation coefficient ρX,Y and the
correlation E[XY ]?
Ans
ρX,Y = −0.2066 and E[XY ] = −6.8
8. X is a random variable, µX = 4 and σX = 5. Y is a random variable, µY = 6
and σY = 7. The correlation coefficient is −0.2. If U = 3X + 2Y , what are
VAR[U ], Cov[U, X] and Cov[U, Y ]?
Ans
Var[U] = 337; Cov[U, X] = 61; Cov[U, Y] = 77
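These follow from bilinearity of covariance: Cov[3X + 2Y, Z] = 3Cov[X, Z] + 2Cov[Y, Z]. With Cov[X, Y] = ρσXσY = −7 (the value that yields Var[U] = 337), bilinearity gives Cov[U, X] = 61 and Cov[U, Y] = 77; an arithmetic sketch:

```python
var_x, var_y = 5.0 ** 2, 7.0 ** 2
cov_xy = -0.2 * 5.0 * 7.0                            # rho * sigma_x * sigma_y = -7
var_u = 9 * var_x + 4 * var_y + 2 * 3 * 2 * cov_xy   # Var[3X + 2Y] = 337
cov_ux = 3 * var_x + 2 * cov_xy                      # Cov[U, X] = 61
cov_uy = 3 * cov_xy + 2 * var_y                      # Cov[U, Y] = 77
print(var_u, cov_ux, cov_uy)  # -> 337.0 61.0 77.0
```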
Section 4.11 - Bivariate Gaussian RVs
9. Given the joint PDF fX,Y (x, y) of random variables X and Y :
fX,Y(x, y) = (1/(1.42829π)) exp(−(x² + 1.4xy + y²)/1.02),  −∞ < x, y < ∞
a) Find the means (µX, µY) and the variances (σX², σY²) of X and Y .
b) What is the correlation coefficient ρ? Are X and Y independent?
c) What are the marginal PDFs for X and Y ?
Ans
a) µX = µY = 0; σX² = σY² = 1
b) ρ = −0.7; No
c) fX(x) = fY(x) = (1/√(2π)) exp(−x²/2)
Chapter 5
Sums of Random Variables
5.1 Summary

5.1.1 PDF of sum of two RV’s

Continuous case
Theorem 5.1: The PDF of W = X + Y is
fW(w) = ∫_{−∞}^{∞} fX,Y(x, w − x) dx = ∫_{−∞}^{∞} fX,Y(w − y, y) dy.
Theorem 5.2: When X and Y are independent RV’s, the PDF of W = X + Y
is
fW(w) = ∫_{−∞}^{∞} fX(x)fY(w − x) dx = ∫_{−∞}^{∞} fX(w − y)fY(y) dy.
It is actually a convolution of fX (x) and fY (y).
Discrete case
Theorem 5.3: The PMF of W = X + Y is
PW(w) = Σ_x PX,Y(x, w − x).

5.1.2 Expected values of sums
Theorem 5.4: For any set of RV’s X1 , X2 , · · · , XN , the expected value of
SN = X1 + · · · + XN is E[SN ] = E[X1 ] + · · · + E[XN ].
Theorem 5.5: For independent RV’s X1, X2, · · · , XN the variance of SN =
X1 + · · · + XN is VAR[SN] = Σ_{i=1}^{N} VAR[Xi].
5.1.3 Moment Generating Function (MGF)

Definition 5.1: For a RV X, the moment generating function (MGF) of X is
φ(s) = E[e^{sX}] = ∫_{−∞}^{∞} e^{sx} fX(x) dx (continuous), or Σ_{xi∈SX} e^{sxi} PX(xi) (discrete).
Therefore, the MGF is a Laplace Transform of the PDF for a continuous RV
and a Z Transform of the PMF for a discrete RV.
Theorem 5.6: A RV X with MGF φ(s) has nth moment
E[X^n] = d^n φ(s)/ds^n |_{s=0}.
Theorem 5.7: For a set of independent RV’s X1 , X2 , · · · , Xn , the moment
generating function of Sn = X1 + X2 + · · · + Xn is
φSn(s) = φX1(s) · · · φXn(s).
We use this theorem to calculate the PMF or PDF of a sum of independent RV’s.
Theorem 5.8 central limit theorem (CLT): Let Sn = X1 + X2 + · · · + Xn
be a sum of n i.i.d. RV’s with E[Xi] = µ and VAR[Xi] = σ². Then E[Sn] = nµ
and VAR[Sn] = nσ². The following holds:
(Sn − nµ)/√(nσ²) ∼ N(0, 1) as n → ∞.
We use this theorem to approximate the PMF or PDF of Sn when the PMFs or
PDFs of the Xi are unknown but their means and variances are known to be identical.
5.2 Illustrated Problems
1. Random variables X and Y have joint PDF
fX,Y(x, y) = e^{−(x+y)} for 0 ≤ x, y < ∞, and 0 otherwise.
What is the PDF of W = X + Y ?
2. The joint PDF of two random variables X and Y is
fX,Y(x, y) = 1 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
a) Find the marginal distribution of X and Y .
b) Show that X and Y are independent.
c) Find E[X + Y ].
d) Find VAR[X + Y ].
e) Find the PDF of W = X + Y .
3. Consider X ∼ N (0,2).
a) Find E[X 4 ]. (Hint: use a MGF table to answer this question.)
b) Find E[X 5 ]. (Hint: while you can use MGF, it is better to use symmetry here.)
4. The MGF of W = X + Y + Z is equal to e^{s²/2} (e^s − e^{−s}) / (2s(1 − s)). If X is N(0,1) and Y
is Exponential(1), and X, Y and Z are independent, find the MGF of Z.
What is the PDF of Z?
5. Random variable Y has MGF φY (s) = 1/(1 − s). X has MGF φX (s) =
1/(1 − 2s)2 . X and Y are independent. Let W = X + Y .
a) Find E[Y ], E[Y 2 ], E[X] and E[X 2 ].
b) Find the variance of W .
6. Telephone calls handled by a certain phone company can be either voice (V)
or data (D). The company estimates that P [V ] = 0.8 and P [D] = 0.2. All
telephone calls are independent of one another. Let X be the number of
voice calls in a collection of 100 telephone calls.
a) What is E[X]?
b) What is VAR[X]?
c) Use the CLT to estimate P [X ≥ 18].
d) Use the CLT to estimate P [16 ≤ X ≤ 24].
7. The duration of a cellular telephone call is an exponential random variable
with average length of 4 minutes. A subscriber is charged $30/month for the
first 300 minutes of airtime and $0.25 for each extra minute. A subscriber
has received 90 calls during the past month. Use the central limit theorem
to answer the following questions:
a) What is the probability that this month’s bill is $30 (meaning that the
total airtime is less than or equal to 300 minutes)?
b) What is the probability that this month’s bill is more than $35?
8. A random walk in two dimensions is the following process: flip a fair coin
and move one unit in the +x direction if heads and one unit in the −x
direction if tails; flip another fair coin and move one unit in the +y direction
if heads and one unit in the −y direction if tails. This is one cycle. Repeat
the cycle 200 times. Let X and Y be the final position. Let Xk ∈ {+1, −1}
be the movement along the x axis during the k-th cycle. Let Yk ∈ {+1, −1}
be the movement along the y axis during the k-th cycle.
a) Give the exact PMF of X and find the exact probability that X exceeds
10 at the end of the process. Find the same probability using the CLT.
b) Find the exact probability that both X and Y exceed 10 at the end
of the process. Find the same probability using the CLT.
c) Find the probability that the final position lies outside the circle centered on the origin and passing through (+10, +10).
5.3 Solutions for the Illustrated Problems
1. Given fX,Y(x, y) = e^{−(x+y)} for 0 ≤ x, y < ∞ and 0 otherwise, we get that
fX,Y(x, w − x) = e^{−w} for 0 ≤ x ≤ w, and 0 otherwise.
By definition fW(w) = ∫_{−∞}^{∞} fX,Y(x, w − x) dx. Therefore
fW(w) = we^{−w} for 0 ≤ w < ∞, and 0 otherwise.

2.
a) fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy = 1 for 0 ≤ x ≤ 1, and 0 otherwise.
fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx = 1 for 0 ≤ y ≤ 1, and 0 otherwise.

b) fX(x)fY(y) = 1 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1 (0 otherwise) = fX,Y(x, y).
Therefore, X and Y are independent.
c) Since X, Y ∼ Uniform(0, 1) we know that E[X] = E[Y ] = 0.5.
E[X + Y ] = E[X] + E[Y ] = 0.5 + 0.5 = 1
d) Since X, Y ∼ Uniform(0, 1) we know that VAR[X] = VAR[Y] = 1/12.
Moreover, X and Y are independent (⇒ uncorrelated). Thus we have
VAR[X + Y] = VAR[X] + VAR[Y] = 1/12 + 1/12 = 1/6
e) X and Y are independent random variables. Therefore, the PDF of
W = X + Y is the convolution of their PDFs:
fW(w) = fX(w) ⊗ fY(w) = w for 0 ≤ w < 1;  2 − w for 1 ≤ w < 2;  0 otherwise.
3. Given: X ∼ N (0, 2).
a) The moment generating function of X is φX(s) = exp(sµ + s²σ²/2). Differentiating it 4 times with respect to s, we get:
d⁴/ds⁴ φX(s) = exp(sµ + s²σ²/2) × (3σ⁴ + 6(µ + σ²s)²σ² + (µ + σ²s)⁴)
Thus we have E[X⁴] = d⁴/ds⁴ φX(s) |_{s=0} = 3σ⁴ = 3(2)² = 12.

b) By definition, E[X⁵] = ∫_{−∞}^{∞} x⁵ fX(x) dx. Since fX(x) is an even function of x, x⁵fX(x) is an odd function of x, which makes the integral evaluate to zero. Therefore, E[X⁵] = 0.
4. Since X, Y and Z are independent, the MGF of W = X + Y + Z is given by:
φW(s) = φX(s) · φY(s) · φZ(s)    (5.1)
Substituting φW(s) = e^{s²/2}(e^s − e^{−s})/(2s(1 − s)), φX(s) = e^{s²/2} and φY(s) = 1/(1 − s) in Eqn. (5.1), we get:
φZ(s) = (e^s − e^{−s})/(2s)
This MGF can be recognized as that of Z ∼ Uniform(−1, 1).
5. Given: φX (s) = 1/(1 − 2s)2 and φY (s) = 1/(1 − s).
a) Using the series expansions:
φX(s) = 1 + 4s + 12s² + . . . = 1 + E[X]s + (1/2)E[X²]s² + . . .
φY(s) = 1 + s + s² + . . . = 1 + E[Y]s + (1/2)E[Y²]s² + . . .
Equating the coefficients, we get:
E[X] = 4; E[X²] = 24; E[Y] = 1; E[Y²] = 2.
b) Since X and Y are independent,
VAR[W ] = VAR[X + Y ] = VAR[X] + VAR[Y ] = (E[X 2 ] − E 2 [X]) +
(E[Y 2 ] − E 2 [Y ])
= (24 − 16) + (2 − 1) = 9
6. Let X = Σ_{k=1}^{100} Xk, where Xk is an indicator random variable which is 1
whenever the k-th call is a voice call and 0 whenever the k-th call is a data call.
The PMF of Xk, k ∈ {1, . . . , 100} is: PXk(x) = 0.8 for x = 1, and 0.2 for x = 0.
Thus, we have E[Xk] = E[Xk²] = 0.8, and VAR[Xk] = E[Xk²] − E[Xk]² = 0.8 − 0.8² = 0.16.

a) Since the Xk are independent, E[X] = Σ_{k=1}^{100} E[Xk] = 80.

b) Likewise, VAR[X] = Σ_{k=1}^{100} VAR[Xk] = 16.

c) Therefore, the CLT can be used to approximate the distribution of X with N(80, 16).
∴ P[X ≥ 18] ≈ 1 − Φ((18 − 80)/√16) = Φ(15.5) ≈ 1.

d) Similarly, P[16 ≤ X ≤ 24] ≈ Φ((24 − 80)/√16) − Φ((16 − 80)/√16) = Φ(−14) − Φ(−16) ≈ 0
7. Since the duration (airtime) Xk, k ∈ {1, . . . , 90} of each call is given to have
the distribution Xk ∼ Exponential(λ) such that E[Xk] = 1/λ = 4 (⇒ λ = 0.25),
we know that σXk = 1/λ = 4.
Let X denote the total airtime (in minutes). Then we have X = Σ_{k=1}^{90} Xk.
Therefore, E[X] = Σ_{k=1}^{90} E[Xk] = 90(4) = 360 minutes.
Assuming the durations Xk of the calls to be independent,
σX² = Σ_{k=1}^{90} σXk², so σX = 4√90 = 37.95 minutes.

a) The distribution of X can be approximated (using the CLT) with Gaussian(360, 1440).
∴ P[X ≤ 300] ≈ Φ((300 − 360)/37.95) = 1 − Φ(1.58) = 0.0571

b) Similarly, the probability that the monthly bill is more than $35 (i.e., more than 300 + 5/0.25 = 320 total minutes) is
P[X > 320] = 1 − Φ((320 − 360)/37.95) = Φ(40/37.95) = 0.8531

8.
a) Consider the movement in the ±x direction. For Xk, the displacement that
takes place in the k-th cycle, we get:
E[Xk] = (+1)(1/2) + (−1)(1/2) = 0.
E[Xk²] = (+1)²(1/2) + (−1)²(1/2) = 1 ⇒ VAR[Xk] = 1.
Using the exact distribution of X:
Let X̂k = (Xk + 1)/2, which is Bernoulli(0.5).
Thus, X̂ = Σ_{k=1}^{200} X̂k = (X/2 + 100) ∼ Binomial(200, 0.5).
Therefore, P[X > 10] = P[X̂ > 105] = 0.5²⁰⁰ Σ_{k=106}^{200} C(200, k) = 0.21838
Hint: you may use the following MATLAB code to compute this value.
p = 0;
for k = 106:200
    p = p + nchoosek(200,k);
end
p = (0.5^200) * p
Using the CLT approximation:
Since the initial displacement is zero, X = Σ_{k=1}^{200} Xk. We also know that
the Xk’s are independent.
∴ E[X] = Σ_{k=1}^{200} E[Xk] = 0,  VAR[X] = Σ_{k=1}^{200} VAR[Xk] = 200.
Thus, the distribution of X can be approximated using the CLT as Gaussian(0, 200).
Therefore, P[X > 10] ≈ 1 − Φ((11 − 0)/(10√2)) = 1 − Φ(0.77782) = 0.21834
Note: ‘11’ is used in this approximation because the final position must
be an even number, and 11 is halfway between 10 and 12.
b) Since X and Y are independent and identically distributed,
P[X > 10, Y > 10] = (P[X > 10])² = 0.04768 (exact), or 0.04767 (using the CLT).
c) Using the exact distributions:
Since X̂ and Ŷ are independent, their joint PMF is given by
PX̂,Ŷ(j, k) = 0.5⁴⁰⁰ C(200, j) C(200, k).
We are required to find P[√(X² + Y²) > 10√2], or equivalently
P[(2X̂ − 200)² + (2Ŷ − 200)² > 200]. This is equivalent to finding the
sum of probability masses lying outside the circle.
Therefore, the desired probability is Σ_{(j,k)∈ℵ} PX̂,Ŷ(j, k),
where ℵ = {(j, k) | j, k ∈ {0, . . . , 200}, (2j − 200)² + (2k − 200)² > 200}.
Hint: you may use the following MATLAB code to compute this value
to be 0.59978.
p = 0;
for j = 0:200
    for k = 0:200
        if (2*j-200)^2 + (2*k-200)^2 > 200
            p = p + nchoosek(200,k) * nchoosek(200,j);
        end
    end
end
p = p * (0.5^400)
Using the CLT approximation:
Let R = √(X² + Y²). Since X, Y ∼ Gaussian(0, 200), and X and Y are
independent, R is Rayleigh with parameter σ = 10√2.
Therefore, P[R > 10√2] = e^{−(10√2/σ)²/2} = e^{−0.5} = 0.60653.
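The exact binomial computation for parts a and b is one line in Python with math.comb (a sketch replacing the MATLAB hint):

```python
from math import comb

# X-hat = (X + 200)/2 ~ Binomial(200, 1/2); P[X > 10] = P[X-hat >= 106].
p = sum(comb(200, k) for k in range(106, 201)) / 2 ** 200   # ~ 0.2184
p_both = p * p        # P[X > 10, Y > 10], by independence   ~ 0.0477
```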
5.4 Drill Problems
Section 5.1.1 - PDF of the sum of 2 RVs
1. X ∼ Uniform(0, 1) and Y ∼ Uniform(0, 2) are independent RVs. Compute
the PDF fW (w) of W = X + Y .
Ans
fW(w) = 0.5w for 0 ≤ w < 1;  0.5 for 1 ≤ w < 2;  0.5(3 − w) for 2 ≤ w < 3;  0 otherwise
2. X ∼ Uniform(0, 1) and Y ∼ Uniform(0, 2) are independent RVs. Compute
the PDF fW (w) of W = X − Y .
Ans
fW(w) = 0.5(w + 2) for −2 ≤ w < −1;  0.5 for −1 ≤ w < 0;  0.5(1 − w) for 0 ≤ w < 1;  0 otherwise
3. X, Y ∼ Exponential(λ) are independent RVs. Compute the PDF fW (w) of
W =X +Y.
Ans
fW(w) = λ²we^{−λw} for w ≥ 0, and 0 otherwise
Figure 5.1: figure used in Question 7
Section 5.1.2 - Expected values of sums
4. The random variable U has a mean of 0.3 and a variance of 1.5. Ui, i ∈
{1, . . . , 53} are independent realizations of U .
a) Find the mean and variance of Y if Y = (1/53) Σ_{i=1}^{53} Ui.
b) Find the mean and variance of Z if Z = Σ_{i=1}^{53} Ui.
Ans
a) E[Y] = 0.3; VAR[Y] = 1.5/53 = 0.0283
b) E[Z] = 0.3(53) = 15.9; VAR[Z] = 1.5(53) = 79.5
Section 5.1.3 - MGF
5. Compute the moment generating function φX(s) = E[e^{sX}] of a random
variable X exponentially distributed with a parameter α.
Let W = X + Y , where X and Y are Exponential(α). Compute the moment
generating function φW(s) of W using those of X and Y .
Ans
φX(s) = (1 − s/α)⁻¹;  φW(s) = (1 − s/α)⁻²
6. Compute the MGF of an Erlang(n, λ) RV. Calculate the mean and variance.
Ans
(1 − s/λ)⁻ⁿ;  mean = n/λ;  variance = n/λ²
7. Find the MGF φX (s) of a uniform random variable X ∼ Uniform(a, b).
a) Derive the mean and variance of X.
b) Find E[X 3 ] and E[X 5 ].
102
Sums of Random Variables
Ans
φX(s) = (e^{sb} − e^{sa})/(s(b − a))
a) E[X] = (a + b)/2; VAR[X] = (b − a)^2/12
b) E[X^3] = (a^3 + a^2 b + a b^2 + b^3)/4;
E[X^5] = (a^5 + a^4 b + a^3 b^2 + a^2 b^3 + a b^4 + b^5)/6
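The pattern in part (b) follows from E[X^n] = (b^{n+1} − a^{n+1}) / ((n+1)(b − a)), which expands to the symmetric sum (a^n + a^{n−1}b + … + b^n)/(n + 1). A small exact-arithmetic check (a Python sketch, not part of the original notes; a = 2, b = 5 are arbitrary test values):

```python
from fractions import Fraction

def uniform_moment(a, b, n):
    # E[X^n] for X ~ Uniform(a, b)
    return Fraction(b ** (n + 1) - a ** (n + 1), (n + 1) * (b - a))

a, b = 2, 5
assert uniform_moment(a, b, 1) == Fraction(a + b, 2)
assert uniform_moment(a, b, 3) == Fraction(a**3 + a**2 * b + a * b**2 + b**3, 4)
assert uniform_moment(a, b, 5) == Fraction(sum(a**k * b**(5 - k) for k in range(6)), 6)
```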
Appendix A
2009 Quizzes
A.1 Quiz Number 1
1. True or False, Justify your answer.
a) If A ⊂ B, and B ⊂ C, and C ⊂ A, then A = B = C.
b) A − (B − C) = (A − B) − C.
2. Consider the experiment of flipping a coin four times and recording the T and
H sequence. For example, THTT is a possible outcome.
a) How many elements does the event A = {at least one heads} have?
b) Let B = {even number of tails}. All outcomes are equally likely. Find P[A or B].
A.2 Quiz Number 2
1.
a) Using the axioms of probability prove that P [Ac ] = 1 − P [A].
b) It is known that P[A ∪ B] = 0.24, P[A] = 0.15, P[B] = 0.18. Find P[A|B].
2. Events D, E and F form an event space. Calculate P [F |A].
P [D] = 0.35 P [A|D] = 0.4
P [E] = 0.55 P [A|E] = 0.2
P [F ] = 0.10 P [A|F ] = 0.3
A.3 Quiz Number 3
1. From a group of five women and seven men, two women and three men are randomly selected for a committee.
a) How many ways can the committee be selected?
b) Suppose that two of the men (say, John and Tim) cannot serve in the committee
together. What is the probability that a randomly selected committee meets this
requirement?
2. In a lot of 100 used computers, 18 have faulty hard drives and 12 have faulty monitors. Assume that these two problems are independent. If a computer is chosen at random, find the probability that
(a) it has a hard disc problem,
(b) it does not have a faulty monitor,
(c) it has a hard disc problem only.
A.4 Quiz Number 4
1. A fair die is rolled twice, and the two scores are recorded. The random variable
X is 1 if both scores are equal. If not, X is the minimum of the two scores. For
example, if the first score is 3 and the second one is 5, then X = 3, but if both
are 3, then X = 1.
(a) Write SX , the range, and PX (x), the PMF of X. Be sure to write the value of
PX (x) for all x from −∞ to ∞.
(b) Find the probability of X > 3.
2.
(a) Two percent of the resistors manufactured by a company are defective. You
need 23 good resistors for a project. Suppose you have a big box of the resistors
and you keep on picking resistors until you have 23 good ones. Let X be the total
number of resistors that you pick. Write down the PMF of X.
(b) A student takes a multiple choice test with 20 questions. Each question has
5 answers (only one of which is correct). The student blindly guesses. Let X be
the number of correct answers. Find the PMF of X.
A.5 Quiz Number 5
1. The CDF of a random variable is given as
FX(x) = { 0,        x ≤ 0
          x^4/16,   0 < x < 2
          1,        2 ≤ x }
a) Find P[1 < X < 2].
b) Find the PDF of X, fX(x).
2. The PDF of a random variable is given as
fX(x) = { x + 1/2,   if 0 < x < 1
          0,         otherwise }
a) Find FX(0.5).
b) A PDF is given by
fY(y) = { c y^{−1/2},   0 < y < 1
          0,            otherwise }
Find the value of the constant c.
A.6 Quiz Number 6
1. X is a continuous Uniform(−2, +2) random variable.
a) Find E[X 3 ].
b) Find E[eX ].
2. The number of sales a retailer has is modelled as X ∼ Gaussian(50, 100). Considering the overhead costs and the cost of products, the net profit of the retailer is Y = X/5 − 5.
a) Find E[Y] and Var[Y].
b) Find the probability of a positive net profit, i.e., P[Y ≥ 0]. Leave your answer in terms of the Φ(·) function.
3. Telephone calls arrive at a switchboard at the average rate of 2 per hour. You can assume that the time between two calls is an exponential random variable. Find the probability that it will be at least 3 hours between two calls.
A.7 Quiz Number 7
X is a Uniform(2,6) random variable and event B = {X < 3}.
(a) Find fX|B(x), E[X|B] and VAR[X|B].
(b) Suppose Y = g(X) = 1/X. Find the CDF of Y, FY(a). Be sure to consider all cases for −∞ < a < ∞.
A.8 Quiz Number 8
1. Random variables X and Y have joint PDF
fX,Y(x, y) = { 2,   if 0 ≤ y ≤ x ≤ 1
               0,   otherwise }
Let W = 2X/Y.
(a) What is the range of W?
(b) Find FW(a). Be sure to consider all cases for −∞ < a < ∞.
(c) Find P[X ≤ 2Y].
Appendix B
2009 Quizzes: Solutions
B.1 Quiz Number 1
Quiz # 1, EE 387
1. True or False, Justify your answer.
a) If A ⊂ B, and B ⊂ C, and C ⊂ A, then A = B = C.
Solution: True. Since A ⊂ B and B ⊂ C, one can conclude that A ⊂ C.
Moreover, one has C ⊂ A. Consequently, A = C. A similar discussion holds for
B. C ⊂ A and A ⊂ B, therefore C ⊂ B, and since B ⊂ C, one has B = C = A.
b) A − (B − C) = (A − B) − C.
Solution: False. It can be shown easily using a counterexample. If A = {1, 2, 3, 4},
B = {1, 2, 3}, and C = {1},
A − (B − C) = A − {2, 3} = {1, 4}
while
(A − B) − C = {4} − C = {4}.
2. Consider the experiment of flipping a coin four times and recording the T and
H sequence. For example, THTT is a possible outcome.
a) How many elements does the event A = {at least one heads} have?
Solution: A={at least one H} is equivalent to the set of all outcomes except for
the TTTT. Since there are 2 × 2 × 2 × 2 = 16 outcomes in S, A has 15 elements.
b) Let B = {even number of tails}. All outcomes are equally likely. Find P[A or B].
Solution: B has 8 elements (0 is an even number), i.e.
B = {HHT T, HT HT, HT T H, T HHT, T HT H, T T HH, T T T T, HHHH}.
A includes all the elements of B except for TTTT, and therefore
A ∩ B = {HHTT, HTHT, HTTH, THHT, THTH, TTHH, HHHH} has 7 elements.
Hence A ∪ B = S has 15 + 8 − 7 = 16 elements, and P[A or B] = 16/16 = 1.
B.2 Quiz Number 2
1.
a) Using the axioms of probability prove that P [Ac ] = 1 − P [A].
Solution: Using axiom 3, since A ∩ A^c = ∅,
P[A ∪ A^c] = P[A] + P[A^c].
Moreover, using axiom 2, A ∪ A^c = S ⇒ P[A ∪ A^c] = P[S] = 1.
Therefore,
1 = P[A] + P[A^c] ⇒ P[A^c] = 1 − P[A]
b) It is known that P[A ∪ B] = 0.24, P[A] = 0.15, P[B] = 0.18. Find P[A|B].
Solution:
P[A|B] = P[A ∩ B]/P[B]
P[A ∩ B] = P[A] + P[B] − P[A ∪ B] = 0.15 + 0.18 − 0.24 = 0.09
P[A|B] = 0.09/0.18 = 0.5
2. Events D, E and F form an event space. Calculate P [F |A].
P [D] = 0.35 P [A|D] = 0.4
P [E] = 0.55 P [A|E] = 0.2
P [F ] = 0.10 P [A|F ] = 0.3
Solution:
P[A] = P[D]P[A|D] + P[E]P[A|E] + P[F]P[A|F] = 0.14 + 0.11 + 0.03 = 0.28
P[F|A] = P[A|F]P[F]/P[A] = 0.03/0.28 = 0.107
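The two steps of this solution (the law of total probability, then Bayes' rule) translate directly into a few lines (a Python sketch, not part of the original notes):

```python
priors = {"D": 0.35, "E": 0.55, "F": 0.10}       # P[D], P[E], P[F]
likelihood = {"D": 0.4, "E": 0.2, "F": 0.3}      # P[A | each partition event]

p_a = sum(priors[k] * likelihood[k] for k in priors)   # law of total probability
posterior_f = likelihood["F"] * priors["F"] / p_a      # Bayes' rule

assert abs(p_a - 0.28) < 1e-9
assert abs(posterior_f - 0.03 / 0.28) < 1e-9
```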
B.3 Quiz Number 3
1. From a group of five women and seven men, two women and three men are randomly selected for a committee.
a) How many ways can the committee be selected?
Solution: Number of ways = C(5, 2) × C(7, 3) = 350
b) Suppose that two of the men (say, John and Tim) cannot serve on the committee together. What is the probability that a randomly selected committee meets this requirement?
Solution: The number of committees that do not meet the requirement equals the number of committees containing both John and Tim; then only one other man must be selected out of the remaining 5 men. If we call the event that the committee meets the requirement A,
P[A] = 1 − C(5, 2) × C(5, 1) / 350 = 300/350
One can also find the same result by counting the committees that meet the requirement directly. Divide the men's group into two groups: one with two members (John and Tim) and one with the other 5 possible members. The committee meets the requirement if exactly one member is selected from the first group and two from the second group, or if all three men are selected from the second group:
P[A] = C(5, 2) × [C(2, 1)C(5, 2) + C(2, 0)C(5, 3)] / 350 = 300/350
2. In a lot of 100 used computers, 18 have faulty hard drives and 12 have faulty monitors. Assume that these two problems are independent. If a computer is chosen at random, find the probability that
(a) it has a hard disc problem,
(b) it does not have a faulty monitor,
(c) it has a hard disc problem only.
Solution:
(a) P[A] = 0.18
(b) P[B^c] = 1 − 0.12 = 0.88
(c) P[A ∩ B^c] = P[A]P[B^c] = 0.18 × 0.88 = 0.1584
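The committee counts from question 1 above can be reproduced with `math.comb` (a Python sketch, not part of the original notes):

```python
from math import comb

total = comb(5, 2) * comb(7, 3)   # choose 2 of 5 women, 3 of 7 men
assert total == 350

# complement: committees with both John and Tim (plus 1 of the other 5 men)
both_in = comb(5, 2) * comb(2, 2) * comb(5, 1)
p_ok = 1 - both_in / total
assert abs(p_ok - 300 / 350) < 1e-12

# direct count: exactly one of {John, Tim}, or neither of them
direct = comb(5, 2) * (comb(2, 1) * comb(5, 2) + comb(2, 0) * comb(5, 3))
assert direct == 300
```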
B.4 Quiz Number 4
1. A fair die is rolled twice, and the two scores are recorded. The random variable
X is 1 if both scores are equal. If not, X is the minimum of the two scores. For
example, if the first score is 3 and the second one is 5, then X = 3, but if both
are 3, then X = 1.
(a) Write SX , the range, and PX (x), the PMF of X. Be sure to write the value of
PX (x) for all x from −∞ to ∞.
Solution:
a) SX = {1, 2, 3, 4, 5}.
In the following, {nm} means {die 1 = n and die 2 = m}.
PX(1) = P{11, 22, 33, 44, 55, 66, 12, 21, 13, 31, 14, 41, 15, 51, 16, 61} = 16/36
PX(2) = P{23, 32, 24, 42, 25, 52, 26, 62} = 8/36
PX(3) = P{34, 43, 35, 53, 36, 63} = 6/36
PX(4) = P{45, 54, 46, 64} = 4/36
PX(5) = P{56, 65} = 2/36
Note that PX(x) = 0 if x is not a member of SX = {1, 2, 3, 4, 5}.
(b) Find the probability of X > 3.
Solution:
b) P[X > 3] = PX(4) + PX(5) = (4 + 2)/36 = 6/36
2.
(a) Two percent of the resistors manufactured by a company are defective. You
need 23 good resistors for a project. Suppose you have a big box of the resistors
and you keep on picking resistors until you have 23 good ones. Let X be the total
number of resistors that you pick. Write down the PMF of X.
Solution: Using the information given, the x-th resistor picked must be a good one. Moreover, exactly 22 of the first x − 1 resistors must be good too. Therefore:
PX(x) = C(x − 1, 22) (0.98)^{23} (0.02)^{x−23}, for x = 23, 24, . . .
You could also argue that X has a Pascal(23, 0.98) distribution (23 successes needed, with success probability 0.98) and get the exact same result immediately.
(b) A student takes a multiple choice test with 20 questions. Each question has
5 answers (only one of which is correct). The student blindly guesses. Let X be
the number of correct answers. Find the PMF of X.
Solution: X has a binomial distribution:
PX(x) = C(20, x) (0.2)^x (0.8)^{20−x}, x = 0, 1, . . . , 20
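Both PMFs can be sanity-checked numerically — each should sum to 1 over its range, and the binomial mean should be np = 4 (a Python sketch, not part of the original notes; the truncation point 200 for the Pascal tail is an arbitrary large cutoff):

```python
from math import comb

def pascal_pmf(x, k=23, p=0.98):
    # P[X = x]: the x-th pick is the k-th good resistor
    return comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k) if x >= k else 0.0

assert abs(sum(pascal_pmf(x) for x in range(23, 200)) - 1.0) < 1e-9

def binom_pmf(x, n=20, p=0.2):
    return comb(n, x) * p**x * (1 - p) ** (n - x)

assert abs(sum(binom_pmf(x) for x in range(21)) - 1.0) < 1e-9
assert abs(sum(x * binom_pmf(x) for x in range(21)) - 4.0) < 1e-9   # mean = np
```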
B.5 Quiz Number 5
1. The CDF of a random variable is given as
FX(x) = { 0,        if x ≤ 0
          x^4/16,   if 0 < x < 2
          1,        if 2 ≤ x }
a) Find P[1 < X < 2].
Solution:
P[1 < X < 2] = FX(2) − FX(1) = 1 − 1/16 = 15/16
b) Find the PDF of X, fX(x).
Solution:
fX(x) = d/dx FX(x) = { x^3/4,   0 < x < 2
                       0,       otherwise }
2. The PDF of a random variable is given as
fX(x) = { x + 1/2,   if 0 < x < 1
          0,         otherwise }
a) Find FX(0.5).
Solution:
FX(0.5) = P[X ≤ 0.5] = ∫_0^{0.5} (x + 1/2) dx = (x^2/2 + x/2)|_0^{0.5} = 3/8
b) A PDF is given by
fY(y) = { c y^{−1/2},   0 < y < 1
          0,            otherwise }
Find the value of the constant c.
Solution: Using the fact that ∫_0^1 fY(y) dy = 1,
∫_0^1 c y^{−1/2} dy = (2c y^{1/2})|_0^1 = 2c = 1
∴ c = 1/2
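Both numbers can be confirmed with a crude midpoint-rule integration (a Python sketch, not part of the original notes; the grid size is arbitrary):

```python
def integrate(f, a, b, n=100_000):
    # midpoint rule
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# F_X(0.5) = integral of (x + 1/2) over (0, 0.5) = 3/8
assert abs(integrate(lambda x: x + 0.5, 0.0, 0.5) - 3 / 8) < 1e-6

# with c = 1/2, f_Y(y) = c / sqrt(y) integrates to 1 over (0, 1)
# (the singularity at 0 makes the midpoint rule slow, hence the loose tolerance)
assert abs(integrate(lambda y: 0.5 * y ** -0.5, 0.0, 1.0) - 1.0) < 1e-2
```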
B.6 Quiz Number 6
1. X is a continuous Uniform(−2, +2) random variable.
a) Find E[X^3].
Solution:
X ∼ Uniform(−2, +2) ⇒ fX(x) = { 1/4,   −2 ≤ x ≤ 2
                                 0,     otherwise }
E[X^3] = ∫_{−2}^{2} (1/4) x^3 dx = 0 (from odd symmetry of the integrand)
b) Find E[e^X].
Solution:
E[e^X] = ∫_{−2}^{2} (1/4) e^x dx = (e^x/4)|_{−2}^{2} = (e^2 − e^{−2})/4 = 1.81343
2. The number of sales a retailer has is modeled as X ∼ Gaussian(50, 100). Considering the overhead costs and the cost of products, the net profit of the retailer is Y = X/5 − 5.
a) Find E[Y] and Var[Y].
Solution: Given E[X] = 50, Var[X] = 100,
E[Y] = E[X]/5 − 5 = 5
Var[Y] = Var[X]/5^2 = 4
b) Find the probability of a positive net profit, i.e., P[Y ≥ 0]. Leave your answer in terms of the Φ(·) function.
Solution: P[Y ≥ 0] = P[(Y − 5)/2 ≥ −2.5] = 1 − Φ(−2.5) = Φ(2.5) = 0.99379
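Φ(·) values like the one above are easy to evaluate via the error function, Φ(x) = (1 + erf(x/√2))/2 (a Python sketch, not part of the original notes):

```python
from math import erf, sqrt

def phi(x):
    # standard normal CDF
    return 0.5 * (1 + erf(x / sqrt(2)))

# Y ~ Gaussian with mean 5 and standard deviation 2:
# P[Y >= 0] = 1 - Phi((0 - 5)/2) = Phi(2.5)
p = 1 - phi((0 - 5) / 2)
assert abs(p - phi(2.5)) < 1e-9
assert abs(phi(2.5) - 0.99379) < 1e-4
```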
3. Telephone calls arrive at a switchboard at the average rate of 2 per hour. You can assume that the time between two calls is an exponential random variable. Find the probability that it will be at least 3 hours between two calls.
Solution:
Given an average call arrival rate of 2 per hour (i.e., an average inter-arrival time of 0.5 hours), the inter-arrival time T ∼ Exponential(2).
∴ P[T > 3] = e^{−3λ}|_{λ=2} = e^{−6} = 2.4787 × 10^{−3}
B.7 Quiz Number 7
X is a Uniform(2,6) random variable and event B = {X < 3}.
(a) Find fX|B(x), E[X|B] and VAR[X|B].
Solution:
By definition,
fX|B(x) = { fX(x)/P[B],   when B is true
            0,            when B is false }
There's no need to compute P[B] separately, since the above equation shows that X|B ∼ Uniform(2,3).
∴ fX|B(x) = { 1,   2 ≤ x < 3
              0,   otherwise }
E[X|B] = (2 + 3)/2 = 2.5
Var[X|B] = (3 − 2)^2/12 = 1/12 = 0.0833
(b) Suppose Y = g(X) = 1/X. Find the CDF of Y, FY(a). Be sure to consider all cases for −∞ < a < ∞.
Solution: By definition of the CDF, for −∞ < a < ∞,
FY(a) = P[Y ≤ a] = P[1/X ≤ a] = { P[X ≥ 1/a],   a > 0
                                  P[X ≤ 1/a],   a < 0 }
      = { 1 − FX(1/a),   a > 0
          FX(1/a),       a < 0 }
But we know that
FX(x) = { 0,           x < 2
          (x − 2)/4,   2 ≤ x ≤ 6
          1,           x > 6 }
Hence, we get
FY(a) = { 0,                    a < 1/6
          1 − (1/a − 2)/4,      1/6 ≤ a ≤ 1/2
          1,                    a > 1/2 }
      = { 0,                a < 1/6
          (6a − 1)/(4a),    1/6 ≤ a ≤ 1/2
          1,                a > 1/2 }
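The closed-form FY for this problem (FY(a) = (6a − 1)/(4a) on 1/6 ≤ a ≤ 1/2, and 0 or 1 outside) can be checked against simulated values of Y = 1/X with X ∼ Uniform(2,6) (a Python sketch, not part of the original notes; the seed, sample size, and evaluation points are arbitrary):

```python
import random

random.seed(42)

def F_Y(a):
    # closed-form CDF of Y = 1/X, X ~ Uniform(2, 6)
    if a < 1 / 6:
        return 0.0
    if a <= 1 / 2:
        return (6 * a - 1) / (4 * a)
    return 1.0

n = 100_000
ys = [1 / random.uniform(2, 6) for _ in range(n)]
for a in (0.2, 0.3, 0.45):
    emp = sum(y <= a for y in ys) / n    # empirical CDF
    assert abs(emp - F_Y(a)) < 0.01
```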
B.8 Quiz Number 8
1. Random variables X and Y have joint PDF
fX,Y(x, y) = { 2,   if 0 ≤ y ≤ x ≤ 1
               0,   otherwise }
Let W = 2X/Y.
(a) What is the range of W?
Solution: Consider w = 2x/y, the relationship between a particular instantiation w of the RV W and the corresponding instantiations x, y of X and Y.
Since y ≤ x, w is minimized when x = y; thus min(w) = 2.
The maximum of w is approached as y → 0 while x ̸= 0, so max(w) → ∞.
Being a smooth function of x and y, w takes all intermediate values.
Thus, the range of W is [2, ∞).
(b) Find FW (a). Be sure to consider all cases for −∞ < a < ∞.
Solution:
Figure B.1: region of integration (for case: a ≥ 2)
Since x and y are non-negative, FW(a) = 0 whenever a < 0.
Consider the a ≥ 0 case. By definition,
FW(a) = P[W ≤ a] = P[2X/Y ≤ a] = P[Y ≥ 2X/a]
      = { 0,                                        0 ≤ a < 2
          ∫_{x=0}^{1} ∫_{y=2x/a}^{x} fX,Y(x, y) dy dx,   a ≥ 2 }
      = { 0,                                  0 ≤ a < 2
          ∫_{x=0}^{1} (∫_{y=2x/a}^{x} 2 dy) dx,   a ≥ 2 }
      = { 0,                              0 ≤ a < 2
          (1 − 2/a) ∫_{x=0}^{1} 2x dx,    a ≥ 2 }
      = { 0,         0 ≤ a < 2
          1 − 2/a,   a ≥ 2 }
Therefore,
FW(a) = { 0,         a < 2
          1 − 2/a,   a ≥ 2 }
(c) Find P[X ≤ 2Y].
Solution: P[X ≤ 2Y] = P[X/Y ≤ 2] = P[W = 2X/Y ≤ 4] = FW(4) = 1 − 2/4 = 1/2
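FW(a) = 1 − 2/a and the part (c) answer can both be checked by sampling uniformly from the triangle 0 ≤ y ≤ x ≤ 1, where the joint density is the constant 2 (a Python sketch, not part of the original notes; the seed and sample size are arbitrary):

```python
import random

random.seed(7)
pairs = []
while len(pairs) < 100_000:
    x, y = random.random(), random.random()
    if 0 < y <= x:               # accept points in the triangle (density is uniform there)
        pairs.append((x, y))

ws = [2 * x / y for x, y in pairs]
for a in (3.0, 5.0, 10.0):
    emp = sum(w <= a for w in ws) / len(ws)   # empirical F_W(a)
    assert abs(emp - (1 - 2 / a)) < 0.01

# part (c): P[X <= 2Y] = F_W(4) = 1/2
emp_c = sum(x <= 2 * y for x, y in pairs) / len(pairs)
assert abs(emp_c - 0.5) < 0.01
```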
Appendix C
2010 Quizzes
C.1 Quiz Number 1
1. [2 marks] Use algebra of sets to prove (A − B) − C = A − (B ∪ C).
2. A fair die is rolled twice and the sum is recorded.
a) [1 mark] Give the sample space (S) of this experiment.
b) [1 mark] Are the outcomes of this sample space equally likely? Explain.
c) [1 mark] Let B={sum is less than or equal to 2}. Find P [B].
3. In a company 40% of employees are female. Also, 15% of the male (M)
employees and 10% of female (F) employees hold managerial positions.
a) [2 marks] Let A be the event that a randomly selected employee of this company
holds a managerial position. Find P [A]?
b) [1 mark] In part (a), what is the probability that the employee does not have
a managerial position?
c) [2 marks] A randomly selected employee is found to have a managerial position.
What is the probability that this person is female?
C.2 Quiz Number 2
1. [5 marks] Events A and B are independent and events A and C are disjoint
(mutually exclusive). Let P [A] = 0.2, P [B] = 0.4, and P [C] = 0.1. Please answer
the following parts:
a) [1 mark] Find P [A ∪ B].
b) [1 mark] Find P [A|B].
c) [1 mark] Find P [Ac ∪ B].
d) [1 mark] Find P [A|C].
e) [1 mark] Find P [A ∩ B ∩ C].
2. For this question, you may leave your answers as ratios of binomial-coefficient C(n, k) terms.
From a class of 20 boys and 10 girls a team of 5 is selected.
a) [1 mark] Find the probability that the team consists of 2 boys and 3 girls.
b) [2 marks] Find the probability that the majority of the team members are girls.
c) [2 marks] There are 6 students in this class that do not like to be in the team. Find the probability that the randomly chosen team has none of these 6 students.
C.3 Quiz Number 3
1. A discrete random variable X has the following probability mass function (PMF)
PX(x) = { A,     x = −4
          A,     x = −1
          0.3,   x = 0
          0.3,   x = 4
          0,     otherwise }
(a) (1 mark) Find A.
(b) (3 marks) Sketch the PMF. Find FX (0.5), where FX (·) represents the CDF of
X.
(c) (1 mark) Find P[0.5 < X ≤ 3].
(d) (1 mark) Find P [X > 2].
2. (4 marks) A biased coin with P [T ] = 0.2 and P [H] = 0.8 is tossed repeatedly.
Identify the type of the random variable (for example, X ∼Binomial(10,0.1)) in
each of the following cases.
a) X is the number of tosses before the first H (inclusive).
b) X is the number of tosses before the third T.
c) X is the number of heads (H) in 5 tosses.
d) After the occurrence of the first H, X is the number of extra tosses before the
second H (inclusive).
C.4 Quiz Number 4
1. The CDF of a random variable is given as
FX(x) = { 0,       if x ≤ 0
          x^2/9,   if 0 < x < 3
          1,       if 3 ≤ x }
a) [2 marks] Find P[1 < X < 2].
b) [2 marks] Find P[X > 1].
c) [3 marks] Find the PDF of X.
d) [3 marks] Find E[X].
C.5 Quiz Number 5
1. The lifetime of a transistor in years is modeled as Exponential(0.2). Answer
the following:
(a) [1 mark] Find the average lifetime.
(b) [2 marks] Find the probability that it lasts longer than 3 years.
(c) [2 marks] Given that it has lasted for 5 years, what is the probability that it
lasts for another 3 years.
(d) [1 mark] An EE has designed a circuit using two of above-mentioned transistors
such that one is active and the second one is the spare (i.e., the second one becomes
active when the first one dies). The circuit can last until both transistors are dead.
What random variable can model the lifetime of this circuit? Give the parameters
of its PDF.
2. For a Uniform(1,3) random variable:
(a) [2 marks] Find E[1/X^2].
(b) [2 marks] Find E[4X − 5].
C.6 Quiz Number 6
1. X is a Gaussian random variable with µ = 8, σ 2 = 16. Answer the following
(leave your answers in Φ(·) form with positive argument).
(a) [1 mark] Find P [12 < X < 16].
(b) [2 marks] Find P [X > 0].
(c) [2 marks] Define Y = X/4 + 6. Find the average and variance of Y.
(d) [1 mark] Define W = AX + B. Find A and B such that W is a Gaussian with
mean zero and variance 4.
2. Consider X ∼ Exponential(2) and Y = X 3 .
(a) [1 mark] Find the range of Y.
(b) [3 marks] Find the PDF of Y .
C.7 Quiz Number 7
1. The joint PDF of two random variables is given as
fX,Y(x, y) = { cx,   0 ≤ y/2 ≤ x ≤ 1
               0,    elsewhere }
[2 marks] Find c.
[2 marks] Find the marginal PDF of X, fX(x). Be sure to consider the whole range −∞ < x < ∞.
[2 marks] Find the marginal PDF of Y, fY(y). Be sure to consider the whole range −∞ < y < ∞.
[4 marks] Find the PDF of Z = Y/X, fZ(a). Be sure to consider the whole range −∞ < a < ∞.
Appendix D
2010 Quizzes: Solutions
D.1 Quiz Number 1
1. [2 marks] Use algebra of sets to prove (A − B) − C = A − (B ∪ C).
Solution: (A−B)−C = (A∩B c )∩C c = A∩(B c ∩C c ) = A∩(B ∪C)c = A−(B ∪C)
2. A fair die is rolled twice and the sum is recorded.
a) [1 mark] Give the sample space (S) of this experiment.
Solution: S = {2, 3, . . . , 12}
b) [1 mark] Are the outcomes of this sample space equally likely? Explain.
Solution: No. Some outcomes are more likely than others. For example, the sum 2 occurs only when both die rolls result in 1 (i.e., (x_1, x_2) = (1, 1)), while 7 occurs when any of the pairs (1, 6), (2, 5), (3, 4), (4, 3), (5, 2), or (6, 1) happens.
c) [1 mark] Let B={sum is less than or equal to 2}. Find P [B].
Solution: A sum less than or equal to 2 means both die rolls resulted in 1. Hence,
P[B] = P[(x_1, x_2) = (1, 1)] = P(x_1 = 1) × P(x_2 = 1) = (1/6) × (1/6) = 1/36.
Note that the die rolls are independent.
3. In a company 40% of employees are female. Also, 15% of the male (M)
employees and 10% of female (F) employees hold managerial positions.
a) [2 marks] Let A be the event that a randomly selected employee of this company
holds a managerial position. Find P [A]?
Solution: P [A] = P [A|F ]P [F ] + P [A|F c ]P [F c ] = 0.1 × 0.4 + 0.15 × 0.6 = 0.13
b) [1 mark] In part (a), what is the probability that the employee does not have
a managerial position?
Solution: P [Ac ] = 1 − P [A] = 1 − 0.13 = 0.87
c) [2 marks] A randomly selected employee is found to have a managerial position.
What is the probability that this person is female?
Solution: P[F|A] = P[A|F]P[F]/P[A] = (0.1 × 0.4)/0.13 = 4/13 ≃ 0.3077
D.2 Quiz Number 2
1. [5 marks] Events A and B are independent and events A and C are disjoint
(mutually exclusive). Let P [A] = 0.2, P [B] = 0.4, and P [C] = 0.1. Please answer
the following parts:
a) [1 mark] Find P [A ∪ B].
Solution: P [A ∪ B] = P [A] + P [B] − P [A ∩ B] = P [A] + P [B] − P [A]P [B] =
0.2 + 0.4 − 0.08 = 0.52
b) [1 mark] Find P [A|B].
Solution: P [A|B] =
P [A∩B]
P [B]
=
P [A]P [B]
P [B]
=
0.2×0.4
0.4
=
0.08
0.4
= 0.2
c) [1 mark] Find P [Ac ∪ B].
Solution: P [Ac ∪ B] = P [Ac ] + P [B] − P [Ac ∩ B] = (1 − P [A]) + P [B] − (1 −
P [A])P [B] = (1 − 0.2) + 0.4 − (1 − 0.2) × 0.4 = 0.88
d) [1 mark] Find P [A|C].
Solution: P [A|C] =
P (A∩C)
P [C]
=
P [∅]
P [C]
=0
e) [1 mark] Find P [A ∩ B ∩ C].
Solution: P [A ∩ B ∩ C] = P [(A ∩ C) ∩ B] = P [∅ ∩ B] = P [∅] = 0
2. For this question, you may leave your answers as ratios of binomial-coefficient C(n, k) terms.
From a class of 20 boys and 10 girls a team of 5 is selected.
a) [1 mark] Find the probability that the team consists of 2 boys and 3 girls.
Solution: C(20, 2) × C(10, 3) / C(30, 5)
b) [2 marks] Find the probability that the majority of the team members are girls.
Solution: [C(20, 2)C(10, 3) + C(20, 1)C(10, 4) + C(20, 0)C(10, 5)] / C(30, 5)
c) [2 marks] There are 6 students in this class that do not like to be in the team. Find the probability that the randomly chosen team has none of these 6 students.
Solution: C(30 − 6, 5) / C(30, 5) = C(24, 5)/C(30, 5)
D.3 Quiz Number 3
1. A discrete random variable X has the following probability mass function (PMF)
PX(x) = { A,     x = −4
          A,     x = −1
          0.3,   x = 0
          0.3,   x = 4
          0,     otherwise }
(a) (1 mark) Find A.
Solution: 0.3 + 0.3 + A + A = 1 ⇒ A = 0.2
(b) (3 marks) Sketch the PMF. Find FX (0.5), where FX (·) represents the CDF of
X.
Solution: FX(0.5) = 0.2 + 0.2 + 0.3 = 0.7
(c) (1 mark) Find P[0.5 < X ≤ 3].
Solution: No mass ⇒ P [0.5 < X ≤ 3] = 0
(d) (1 mark) Find P [X > 2].
Solution: P [X > 2] = 0.3
2. (4 marks) A biased coin with P [T ] = 0.2 and P [H] = 0.8 is tossed repeatedly.
Identify the type of the random variable (for example, X ∼Binomial(10,0.1)) in
each of the following cases.
a) X is the number of tosses before the first H (inclusive).
Solution: Geometric(0.8)
b) X is the number of tosses before the third T.
Solution: Pascal(3,0.2)
c) X is the number of heads (H) in 5 tosses.
Solution: Binomial(5,0.8)
d) After the occurrence of the first H, X is the number of extra tosses before the
second H (inclusive).
Solution: Geometric(0.8)
D.4 Quiz Number 4
1. The CDF of a random variable is given as
FX(x) = { 0,       if x ≤ 0
          x^2/9,   if 0 < x < 3
          1,       if 3 ≤ x }
a) [2 marks] Find P[1 < X < 2].
Solution: P[1 < X < 2] = FX(2) − FX(1) = 4/9 − 1/9 = 3/9 = 1/3
b) [2 marks] Find P[X > 1].
Solution: P[X > 1] = 1 − P[X ≤ 1] = 1 − FX(1) = 1 − 1/9 = 8/9
c) [3 marks] Find the PDF of X.
Solution: fX(x) = { 2x/9,   0 < x < 3
                    0,      otherwise }
d) [3 marks] Find E[X].
Solution: E[X] = ∫_{−∞}^{+∞} x fX(x) dx = ∫_0^3 x (2x/9) dx = (2x^3/27)|_0^3 = 2
D.5 Quiz Number 5
1. The lifetime of a transistor in years is modeled as Exponential(0.2). Answer
the following:
(a) [1 mark] Find the average lifetime.
Solution: 1/λ = 1/0.2 = 5 years
(b) [2 marks] Find the probability that it lasts longer than 3 years.
Solution: P [T > 3] = e−λ×3 = e−0.6
(c) [2 marks] Given that it has lasted for 5 years, what is the probability that it
lasts for another 3 years.
Solution: P [T > 8|T > 5] = P [T > 3] = e−λ×3 = e−0.6
(d) [1 mark] An EE has designed a circuit using two of above-mentioned transistors
such that one is active and the second one is the spare (i.e., the second one becomes
active when the first one dies). The circuit can last until both transistors are dead.
What random variable can model the lifetime of this circuit? Give the parameters
of its PDF.
Solution: Erlang(2, 0.2)
2. For a Uniform(1,3) random variable:
(a) [2 marks] Find E[1/X^2].
Solution: ∫_1^3 (1/x^2)(1/2) dx = (−x^{−1}/2)|_1^3 = 1/2 − 1/6 = 1/3
(b) [2 marks] Find E[4X − 5].
Solution: E[4X − 5] = 4 × E[X] − 5 = 4 × 2 − 5 = 3
D.6 Quiz Number 6
1. X is a Gaussian random variable with µ = 8, σ 2 = 16. Answer the following
(leave your answers in Φ(·) form with positive argument).
(a) [1 mark] Find P [12 < X < 16].
Solution: P[12 < X < 16] = P[(12 − 8)/4 < Z < (16 − 8)/4] = Φ(2) − Φ(1)
(b) [2 marks] Find P [X > 0].
Solution: P[X > 0] = 1 − P[X ≤ 0] = 1 − P[Z ≤ (0 − 8)/4] = 1 − Φ(−2) = Φ(2)
(c) [2 marks] Define Y = X/4 + 6. Find the average and variance of Y.
Solution: E[Y] = E[X]/4 + 6 = 8
VAR[Y] = VAR[X]/16 = 1
(d) [1 mark] Define W = AX + B. Find A and B such that W is a Gaussian with mean zero and variance 4.
Solution: VAR[W] = A^2 VAR[X] = 4 ⇒ A = 1/2
E[W] = A E[X] + B = 0 ⇒ B = −4
2. Consider X ∼ Exponential(2) and Y = X^3.
(a) [1 mark] Find the range of Y.
Solution: Since the range of X is x ≥ 0, the range of Y is y ≥ 0.
(b) [3 marks] Find the PDF of Y.
Solution: fY(y) = fX(x)/|g′(x)| evaluated at x = g^{−1}(y), where g(x) = x^3, g′(x) = 3x^2, and g^{−1}(y) = y^{1/3}. Hence
fY(y) = (2/3) y^{−2/3} e^{−2 y^{1/3}}, for y ≥ 0.
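The derived density can be verified by differentiating the CDF FY(y) = FX(y^{1/3}) = 1 − e^{−2 y^{1/3}} numerically (a Python sketch, not part of the original notes; the test points and step size are arbitrary):

```python
import math

# CDF of Y = X^3 with X ~ Exponential(2); x^3 is increasing, so F_Y(y) = F_X(y^(1/3))
F_Y = lambda y: 1 - math.exp(-2 * y ** (1 / 3))

def f_Y(y):
    # formula from the solution
    return (2 / 3) * y ** (-2 / 3) * math.exp(-2 * y ** (1 / 3))

h = 1e-6
for y in (0.1, 0.5, 2.0):
    numeric = (F_Y(y + h) - F_Y(y - h)) / (2 * h)   # central difference
    assert abs(numeric - f_Y(y)) < 1e-4
```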
D.7 Quiz Number 7
1. The joint PDF of two random variables is given as
fX,Y(x, y) = { cx,   0 ≤ y/2 ≤ x ≤ 1
               0,    elsewhere }
[2 marks] Find c.
Solution:
∫_0^1 ∫_0^{2x} cx dy dx = ∫_0^1 2cx^2 dx = 1 ⇒ 2c/3 = 1 ⇒ c = 3/2
[2 marks] Find the marginal PDF of X, fX(x). Be sure to consider the whole range −∞ < x < ∞.
Solution:
fX(x) = { ∫_0^{2x} (3/2) x dy = 3x^2,   0 < x < 1
          0,                            otherwise }
[2 marks] Find the marginal PDF of Y, fY(y). Be sure to consider the whole range −∞ < y < ∞.
Solution:
fY(y) = { ∫_{y/2}^{1} (3/2) x dx = 3/4 − 3y^2/16,   0 < y < 2
          0,                                        otherwise }
[4 marks] Find the PDF of Z = Y/X, fZ(a). Be sure to consider the whole range −∞ < a < ∞.
Solution: For 0 < a < 2,
FZ(a) = P[Z ≤ a] = P[Y ≤ aX] = ∫_0^1 ∫_0^{ax} (3/2) x dy dx = a/2.
For a ≥ 2, FZ(a) = 1, and for a ≤ 0, FZ(a) = 0.
⇒ FZ(a) = { 0,     a ≤ 0
            a/2,   0 < a < 2
            1,     a ≥ 2 }
⇒ fZ(a) = { 1/2,   0 < a < 2
            0,     otherwise }
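The normalization c = 3/2 and the resulting uniform Z = Y/X (so E[Z] = 1) can be double-checked with a midpoint-rule double integral over the region 0 ≤ y ≤ 2x, 0 ≤ x ≤ 1 (a Python sketch, not part of the original notes; the grid size is arbitrary):

```python
def dbl_integrate(f, n=400):
    # midpoint rule over the region 0 <= y <= 2x, 0 <= x <= 1
    hx = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * hx
        hy = 2 * x / n          # inner grid spans (0, 2x)
        for j in range(n):
            y = (j + 0.5) * hy
            total += f(x, y) * hx * hy
    return total

c = 1.5
assert abs(dbl_integrate(lambda x, y: c * x) - 1.0) < 1e-4            # density integrates to 1
assert abs(dbl_integrate(lambda x, y: (y / x) * c * x) - 1.0) < 1e-3  # E[Z] = E[Y/X] = 1
```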
Appendix E
2011 Quizzes
E.1 Quiz Number 1
Quiz #1, EE 387, Time: 15 min, Last Name:
First Name:
1. [2 marks] Let P [A] = P [B] = .2 and P [A ∪ B] = .3. Find P [(A ∩ B)c ].
2. [2 marks] Let P [A] = 0.2, P [B] = 0.3 and A and B are mutually exclusive.
Find P [A|B c ].
4. Consider couples adhering to the following family-planning policy. Each couple
will stop when they have a child of each sex, or stop when they have 3 children.
Consider a collection of such families and use the notation: B=boy and G=girl.
You pick such a family and observe the kids in that family. For example, one possible outcome is GB (i.e., younger girl and older boy).
a) [1 mark] Give the sample space (S) of this experiment.
b) [1 mark] Are all outcomes in S equally likely? Briefly justify your answer.
5. In a cosmetic product store 30% of products are for males and 70% for females.
Also, 50% of the male products are hair-care products, whereas 20% of female
products are hair-care products.
a) [2 marks] Let A be the event “a randomly selected product of this store is a
hair-care product”. Find P [A]?
c) [2 marks] A randomly selected product is observed not to be a hair-care product. What is the probability that it is a male product?
E.2 Quiz Number 2
Quiz #2, EE 387, Time: 15 min, Last Name:
First Name:
1. [4 marks] Events A and B are independent and events A and C are disjoint
(mutually exclusive). Let P [A] = 0.2, P [B] = 0.4, and P [C] = 0.1. Please answer
the following parts:
a) [1 mark] Find P [A ∪ B c ].
b) [1 mark] Find P [Ac |B].
2. [2 mark] A student goes to EE387 class on a snowy day with probability 0.4,
but on a nonsnowy day attends with probability 0.7. Suppose that 20% of the
days in March are snowy in Edmonton. What is the probability that it snowed
on March 10 given that the student was in class on that day?
3. For this question, you may leave your answers as ratios of binomial-coefficient C(n, k) terms.
From a class of 20 boys and 10 girls a team of 5 is randomly selected for a
competition.
a) [1 mark] Find the probability that the team consists of at least one boy and
one girl.
b) [1 mark] Find the probability that the team has an odd number of girls.
c) [1 mark] Three students in this class cannot attend the competition. Find the
probability that the randomly chosen team has none of these 3 students.
E.3 Quiz Number 3
Quiz #3, EE 387, Time: 15 min, Last Name:
First Name:
1. A discrete random variable X has the following probability mass function (PMF)
PX(x) = { A,     x = −2, −1
          A/2,   x = 0, 1
          0,     otherwise }
(a) (1 mark) Find A.
(b) (1 mark) Sketch the PMF.
(c) (1 mark) Find P[0.5 < X ≤ 3].
2. (3 marks) The PMF of random variable X is given by PX(x) = 1/3 for x = 1, 2, 3 and PX(x) = 0 for all other x. Derive the CDF FX(a) = P[X ≤ a] for all values of a. Sketch the CDF.
3. (4 marks) Mary writes a 20-question multiple-choice examination in Chemistry 101.
Each question has 5 answers. Because she has skipped many lectures, she must
take random guesses. Suppose X is the number of questions that she gets right.
a) (2 marks) Write down the PMF of X.
b) (1 mark) What is the probability that she gets 19 or more questions right?
c) (1 mark) What is the probability that she gets no questions right?
E.4 Quiz Number 4
Quiz #4, EE 387, Time: 15 min, Last Name:
First Name:
1. A discrete random variable X has the following probability mass function (PMF)
PX(x) = { 0.1,   x = −3, −2, −1
          0.2,   x = 0, 1
          0.3,   x = 2
          0,     otherwise }
(a) (2 marks) Find the PMF of W = |X|.
(b) (2 marks) Find the PMF of Y = X 2 + 1.
(c) (2 marks) Find the PMF of Z = 1 − X.
(d) (2 marks) Find E[X 2 + 1].
(e) (2 marks) A fair coin is tossed three times. Let X be the number of Heads.
Find E[X], the expected value of X.
E.5 Quiz Number 5
Quiz #5, EE 387, Time: 15 min, Last Name:
First Name:
1. The wealth of an individual in a country is related to the continuous random variable X, which has the following cumulative distribution function (CDF)
FX(x) = { 1 − 1/x^4,   1 ≤ x < ∞
          0,           x < 1 }
(a) (2 marks) Derive the PDF of X.
(b) (2 marks) Find the mean of X, E[X].
(c) (2 marks) Find the mean square of X, E[X^2]. Find the variance of X.
(d) (2 marks) Suppose the wealth measured in dollars is given by Y = 10000X + 2000. Find the mean wealth E[Y] and the standard deviation STD[Y].
(e) (2 marks) What percentage of individuals have wealth exceeding $22,000?
E.6 Quiz Number 6
Quiz #6, EE 387, Time: 15 min, Last Name:
First Name:
1. X is a Gaussian random variable with µ = 10, σ 2 = 4. Answer the following
(leave your answers in Φ(·) form with positive argument).
(a) [2 marks] Find P [12 < X < 16].
(b) [1 mark] Find P [X > 0].
2. The time T in minutes between two successive bus arrivals at a bus stop is Exponential(0.2).
(a) [ 1 mark] When you just arrive at the bus stop, what is the probability that
you have to wait for more than 5 minutes?
(b) [1 mark] What is the average value of your waiting time?
(c) [ 1 mark] You are waiting for a bus, and no bus has arrived in the past 2
minutes. You decide to go to the adjacent coffee shop to grab a coffee. It takes
you 5 minutes to grab your coffee and be back at the bus station. Determine the
probability that you will not miss the bus.
3. [ 4 marks] You borrow your friend’s car to drive to Hinton to see your significant other. The driving distance is 100 km. The gas gauge is broken, so you don’t
know how much gas is in the car. The tank holds 40 liters and the car gets 15 km
per liter, so you decided to take a chance.
(a) [2 marks] Suppose X is the distance (km) that you can drive until the car
runs out of gas. Out of Uniform, Exponential and Gaussian PDFs, which one is
most suitable for modeling X? Briefly justify your choice. Use your choice with
the appropriate parameters to answer the following questions.
(b) [ 1 mark] What is the probability that you make it to Hinton without running
out of gas?
(c) [1 mark] If you don’t run out of gas on the way, what is the probability that
you will not run out of gas on the way back if you decide to take a chance again?
Appendix F
2011 Quizzes: Solutions
F.1 Quiz Number 1
Quiz #1, EE 387, Time: 15 min, Last Name:
First Name:
1. [2 marks] Let P [A] = P [B] = .2 and P [A ∪ B] = .3. Find P [(A ∩ B)c ].
Solution:
P [A ∩ B] = P [A] + P [B] − P [A ∪ B] = 0.2 + 0.2 − 0.3 = 0.1
P [(A ∩ B)c ] = 1 − P [A ∩ B] = 0.9.
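The two steps above (inclusion-exclusion, then the complement rule) can be checked numerically; this short sketch just reproduces the quiz arithmetic:

```python
# Check of Quiz 1, Problem 1: P[A] = P[B] = 0.2, P[A u B] = 0.3.
p_a, p_b, p_a_or_b = 0.2, 0.2, 0.3

# Inclusion-exclusion: P[A n B] = P[A] + P[B] - P[A u B]
p_a_and_b = p_a + p_b - p_a_or_b

# Complement rule: P[(A n B)^c] = 1 - P[A n B]
p_complement = 1 - p_a_and_b

print(p_a_and_b, p_complement)  # about 0.1 and 0.9 (up to float rounding)
```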
2. [2 marks] Let P [A] = 0.2, P [B] = 0.3 and A and B are mutually exclusive.
Find P [A|B c ].
Solution:
P [A|B c ] = P [A ∩ B c ] / P [B c ] = P [A] / P [B c ] = 0.2/0.7.
Note that P [A] = P [A ∩ B] + P [A ∩ B c ] and P [A ∩ B] = 0.
4. Consider couples adhering to the following family-planning policy. Each couple
will stop when they have a child of each sex, or stop when they have 3 children.
Consider a collection of such families and use the notation: B=boy and G=girl.
You pick such a family and observe the kids in that family. For example, one
possible outcome is GB (i.e., younger girl and older boy).
a) [1 mark] Give the sample space (S) of this experiment.
Solution:
S = {GB, GGB, GGG, BG, BBG, BBB}.
b) [1 mark] Are all outcomes in S equally likely? Briefly justify your answer.
Solution:
No. Clearly GB is more likely than GGB.
5. In a cosmetic product store 30% of products are for males and 70% for females.
Also, 50% of the male products are hair-care products, whereas 20% of female
products are hair-care products.
a) [2 marks] Let A be the event “a randomly selected product of this store is a
hair-care product”. Find P [A].
Solution:
P [A] = P [A|M ]P [M ] + P [A|F ]P [F ] = 0.5 × 0.3 + 0.2 × 0.7 = 0.29.
c) [2 marks] A randomly selected product is observed not to be a hair-care product.
What is the probability that it is a male product?
Solution:
P [M |Ac ] = P [Ac ∩ M ] / P [Ac ] = (0.5 × 0.3) / (1 − 0.29) = 0.15/0.71.
F.2 Quiz Number 2
Quiz #2, EE 387, Time: 15 min, Last Name:
First Name:
1. [2 marks] Events A and B are independent. Also P [A] = 0.2 and P [B] = 0.4.
a) [1 mark] Find P [A ∪ B c ].
Solution:
P [A ∪ B c ] = P [A] + P [B c ] − P [A ∩ B c ] = 0.2 + (1 − 0.4) − 0.2(1 − 0.4) = 0.68.
b) [1 mark] Find P [Ac |B].
Solution:
P [Ac |B] = P [Ac ] = 0.8.
2. [2 marks] A student goes to EE387 class on a snowy day with probability 0.4,
but on a non-snowy day attends with probability 0.7. Suppose that 20% of the
days in March are snowy in Edmonton. What is the probability that it snowed
on March 10 given that the student was in class on that day?
Solution:
P [C|S] = 0.4, P [C|S c ] = 0.7, P [S] = 0.2
P [C] = P [C|S]P [S] + P [C|S c ]P [S c ] = 0.4 × 0.2 + 0.7 × 0.8 = 0.64
P [S|C] = P [C|S]P [S] / P [C] = (0.4 × 0.2) / 0.64 = 0.125.
3. [6 marks] For this question, you may leave your answers as ratios of C(n, k)
terms.
From a class of 20 boys and 10 girls a team of 5 is randomly selected for a
competition.
a) [2 marks] Find the probability that the team consists of at least one boy and
one girl.
Solution:
1 − [C(20, 5) + C(10, 5)] / C(30, 5).
b) [2 marks] Find the probability that the team has an odd number of girls.
Solution:
[C(10, 1)C(20, 4) + C(10, 3)C(20, 2) + C(10, 5)C(20, 0)] / C(30, 5).
c) [2 marks] Three students in this class cannot attend the competition. Find the
probability that the randomly chosen team has none of these 3 students.
Solution:
C(27, 5) / C(30, 5).
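The three answers above are left as ratios of binomial coefficients; a quick numeric sketch using Python's math.comb (the counts 20 boys, 10 girls, team of 5 are from the quiz) gives their values:

```python
from math import comb

total = comb(30, 5)  # all ways to pick a team of 5 from 30 students

# (a) at least one boy and one girl: complement of all-boy or all-girl teams
p_a = 1 - (comb(20, 5) + comb(10, 5)) / total

# (b) odd number of girls: 1, 3, or 5 girls on the team
p_b = sum(comb(10, g) * comb(20, 5 - g) for g in (1, 3, 5)) / total

# (c) team avoids 3 particular students: choose all 5 from the other 27
p_c = comb(27, 5) / total

print(p_a, p_b, p_c)
```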
F.3 Quiz Number 3
Quiz #3, EE 387, Time: 15 min, Last Name:
First Name:
1. A discrete random variable X has the following probability mass function
(PMF)
PX (x) =
    A,    x = −2, −1
    A/2,  x = 0, 1
    0,    otherwise.
(a) (1 mark) Find A.
Solution:
2 × A + 2 × (A/2) = 1 ⇒ A = 1/3.
(b) (1 mark) Sketch the PMF.
Solution:
[Sketch: PMF spikes of height 1/3 at x = −2, −1 and height 1/6 at x = 0, 1.]
(c) (1 marks) Find P [0.5 < X ≤ 3].
Solution:
P [0.5 < X ≤ 3] = P [X = 1] = 1/6.
2. (3 marks) The PMF of random variable X is given by PX (x) = 1/3 for x = 1, 2, 3
and PX (x) = 0 for all other x. Derive the CDF FX (a) = P [X ≤ a] for all values
of a. Sketch the CDF.
Solution:
FX (a) = 0 for a < 1, 1/3 for 1 ≤ a < 2, 2/3 for 2 ≤ a < 3, and 1 for a ≥ 3.
[Sketch: staircase CDF with jumps of 1/3 at a = 1, 2, 3.]
3. (4 marks) Mary writes a 20-question multiple-choice examination in Chemistry 101.
Each question has 5 answers. Because she has skipped many lectures, she must
take random guesses. Suppose X is the number of questions that she gets right.
a) (2 marks) Write down the PMF of X.
Solution:
PX (i) = C(20, i) (1/5)^i (4/5)^(20−i),  i = 0, 1, . . . , 20.
b) (1 mark) What is the probability that she gets 19 or more questions right?
Solution:
C(20, 19) (1/5)^19 (4/5)^1 + C(20, 20) (1/5)^20 (4/5)^0 = 81/5^20.
c) (1 mark) What is the probability that she gets no questions right?
Solution:
C(20, 0) (1/5)^0 (4/5)^20 = (4/5)^20.
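The binomial PMF used in this problem is easy to check numerically; this sketch defines a small helper (the name binom_pmf is ours, not from the notes) and evaluates the two quiz answers:

```python
from math import comb

def binom_pmf(n, k, p):
    """P[X = k] for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# 20 questions, 5 choices each, random guessing: X ~ Binomial(20, 1/5)
p_19_or_more = binom_pmf(20, 19, 0.2) + binom_pmf(20, 20, 0.2)
p_none = binom_pmf(20, 0, 0.2)

print(p_19_or_more, 81 / 5**20)   # these two should agree
print(p_none, (4 / 5)**20)        # and so should these
```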
F.4 Quiz Number 4
Quiz #4, EE 387, Time: 15 min, Last Name:
First Name:
1. A discrete random variable X has the following probability mass function
(PMF)
PX (x) =
    0.1,  x = −3, −2, −1
    0.2,  x = 0, 1
    0.3,  x = 2
    0,    otherwise.
(a) (2 marks) Find the PMF of W = |X|.
x = 0 =⇒ w = 0
x = ±1 =⇒ w = 1
x = ±2 =⇒ w = 2
x = −3 =⇒ w = 3
PW (w) =
    0.2,  w = 0
    0.3,  w = 1
    0.4,  w = 2
    0.1,  w = 3
    0,    otherwise.
(b) (2 marks) Find the PMF of Y = X 2 + 1.
x = 0 =⇒ y = 1
x = ±1 =⇒ y = 2
x = ±2 =⇒ y = 5
x = −3 =⇒ y = 10
PY (y) =
    0.2,  y = 1
    0.3,  y = 2
    0.4,  y = 5
    0.1,  y = 10
    0,    otherwise.
(c) (2 marks) Find the PMF of Z = 1 − X.
x = −3 =⇒ z = 4
x = −2 =⇒ z = 3
x = −1 =⇒ z = 2
x = 0 =⇒ z = 1
x = 1 =⇒ z = 0
x = 2 =⇒ z = −1
PZ (z) =
    0.3,  z = −1
    0.2,  z = 0, 1
    0.1,  z = 2, 3, 4
    0,    otherwise.
(d) (2 marks) Find E[X^2 + 1].
E[X^2 + 1] = E[X^2] + 1 = Σ x^2 PX (x) + 1 = 2.8 + 1 = 3.8,
or E[X^2 + 1] = E[Y ] = Σ y PY (y) = 3.8.
(e) (2 marks) A fair coin is tossed three times. Let X be the number of Heads.
Find E[X], the expected value of X.
S = {TTT, TTH, THT, THH, HTT, HTH, HHT, HHH }
P (x = 0) = 1/8
P (x = 1) = 3/8
P (x = 2) = 3/8
P (x = 3) = 1/8
E[X] = Σ x P (x) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 1.5.
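The enumeration of the sample space above can be reproduced by brute force; this sketch lists all 8 equally likely outcomes and averages the head counts:

```python
from itertools import product

# All 8 equally likely outcomes of three fair coin tosses.
outcomes = list(product("HT", repeat=3))

# X = number of heads; E[X] = average head count over equally likely outcomes.
expected = sum(seq.count("H") for seq in outcomes) / len(outcomes)
print(expected)  # 1.5
```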
F.5 Quiz Number 5
Quiz #5, EE 387, Time: 15 min, Last Name:
First Name:
1. The wealth of an individual in a country is related to the continuous random
variable X, which has the following cumulative distribution function (CDF)
FX (x) =
    1 − 1/x^4,   1 ≤ x < ∞
    0,           x ≤ 1.
(a) (2 marks) Derive the PDF of X.
fX (x) = dFX (x)/dx =
    4/x^5,  1 ≤ x < ∞
    0,      x ≤ 1.
(b) (2 marks) Find the mean of X, E[X].
E[X] = ∫_1^∞ x fX (x) dx = ∫_1^∞ x (4/x^5) dx = ∫_1^∞ 4x^−4 dx = −(4/3) x^−3 |_1^∞ = 4/3.
(c) (2 marks) Find the mean square of X, E[X 2 ]. Find the variance of X.
E[X^2] = ∫_1^∞ x^2 fX (x) dx = ∫_1^∞ 4x^−3 dx = −2x^−2 |_1^∞ = 2
VAR[X] = E[X^2] − (E[X])^2 = 2 − (4/3)^2 = 2/9.
(d) (2 marks) Suppose the wealth measured in dollars is given by Y = 10000X +
2000. Find the mean wealth E[Y ] and standard deviation STD[Y ].
E[Y ] = E[10000X + 2000] = 10000E[X] + 2000 = 40000/3 + 2000 ≈ 15333
VAR[Y] = 10000^2 VAR[X] = (2/9) × 10^8
STD[Y] = 10000 STD[X] = √VAR[Y] ≈ 4714.
(e) (2 marks) What is the percentage of individuals whose income exceeds
$22,000?
P [Y > 22000] = P [10000X + 2000 > 22000] = P [X > 2]
= 1 − P [X < 2] = 1 − FX (2) = 1 − (1 − 1/16) = 1/16 = 6.25%.
F.6 Quiz Number 6
Quiz #6, EE 387, Time: 15 min, Last Name:
First Name:
1. X is a Gaussian random variable with µ = 10, σ 2 = 4. Answer the following
(leave your answers in Φ(·) form with positive argument).
(a) [2 marks] Find P [12 < X < 16].
Solution:
P [12 < X < 16] = P [1 < (X − 10)/2 < 3] = Φ((16 − 10)/2) − Φ((12 − 10)/2) = Φ(3) − Φ(1).
(b) [1 mark] Find P [X > 0].
Solution:
P [X > 0] = P [(X − 10)/2 > (0 − 10)/2] = 1 − P [(X − 10)/2 < −5] = 1 − Φ(−5) = 1 − (1 − Φ(5)) = Φ(5).
2. The time T in minutes between two successive bus arrivals at a bus stop is
Exponential(0.2).
(a) [ 1 mark] When you just arrive at the bus stop, what is the probability that
you have to wait for more than 5 minutes?
Solution:
P [T > 5] = e−λ·5 = e−0.2·5 = e−1 .
(b) [1 mark] What is the average value of your waiting time?
Solution:
E[T ] = 1/λ = 5.
(c) [ 1 mark] You are waiting for a bus, and no bus has arrived in the past 2
minutes. You decide to go to the adjacent coffee shop to grab a coffee. It takes
you 5 minutes to grab your coffee and be back at the bus station. Determine the
probability that you will not miss the bus.
Solution:
P [T > 7|T > 2] = P [T > 5] = e−1 .
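The step P[T > 7 | T > 2] = P[T > 5] is the memoryless property of the exponential distribution; it can be confirmed directly from the tail formula P[T > t] = e^(−λt):

```python
from math import exp

lam = 0.2  # rate of the Exponential(0.2) interarrival time

# Conditional probability from the definition:
# P[T > 7 | T > 2] = P[T > 7] / P[T > 2]
p_conditional = exp(-lam * 7) / exp(-lam * 2)

# Memorylessness says this equals the "fresh" 5-minute tail P[T > 5].
p_fresh = exp(-lam * 5)

print(p_conditional, p_fresh)  # both e^-1, about 0.3679
```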
3. [4 marks] You borrow your friend’s car to drive to Hinton to see your significant
other. The driving distance is 100 km. The gas gauge is broken, so you don’t
know how much gas is in the car. The tank holds 40 liters and the car gets 15 km
per liter, so you decide to take a chance.
(a) [2 marks] Suppose X is the distance (km) that you can drive until the car
runs out of gas. Out of Uniform, Exponential and Gaussian PDFs, which one is
most suitable for modeling X? Briefly justify your choice. Use your choice with
the appropriate parameters to answer the following questions.
Solution:
First note that X is bounded: the tank holds at most 40 liters and the car gets
15 km per liter, so X cannot exceed 600 km and must have zero probability beyond
that value. Since there is no information about how much gas is in the tank, every
value between 0 and 600 is equally likely. Therefore
X ∼ Uniform(0, 600).
(b) [ 1 mark] What is the probability that you make it to Hinton without running
out of gas?
Solution:
P [X > 100] = 1 − P [X < 100] = 1 − (100 − 0)/600 = 5/6.
(c) [1 mark] If you don’t run out of gas on the way, what is the probability that
you will not run out of gas on the way back if you decide to take a chance again?
Solution:
P [X > 200|X > 100] = P [(X > 200) ∩ (X > 100)] / P [X > 100] = P [X > 200] / P [X > 100] = (400/600)/(500/600) = 4/5.