MH2500
Probability
Nanyang Technological University Singapore
Tutorial 7 (Chapter 6 in Ross’s book)
HINTS
In the tutorial, some time will also be spent on solutions to Test 2.
1. Suppose that p(x, y), the joint probability mass function of X and Y ,
is given by
p(0, 0) = 0.4, p(0, 1) = 0.2, p(1, 0) = 0.1, p(1, 1) = 0.3.
(a) Calculate the marginal probability mass functions of X and Y
respectively.
(b) Calculate the conditional probability mass function of X given
that Y = 1.
Answer:
(a) By pX(x) = \sum_y p(x, y), we have

pX(0) = p(0, 0) + p(0, 1) = 0.4 + 0.2 = 0.6;
pX(1) = p(1, 0) + p(1, 1) = 0.1 + 0.3 = 0.4.

By pY(y) = \sum_x p(x, y), we have

pY(0) = p(0, 0) + p(1, 0) = 0.4 + 0.1 = 0.5;
pY(1) = p(0, 1) + p(1, 1) = 0.2 + 0.3 = 0.5.
(b) pX|Y(0|1) = Pr{X = 0, Y = 1} / Pr{Y = 1} = p(0, 1) / pY(1) = 0.2 / 0.5 = 2/5;

pX|Y(1|1) = Pr{X = 1, Y = 1} / Pr{Y = 1} = p(1, 1) / pY(1) = 0.3 / 0.5 = 3/5.
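The computations in Question 1 can be sketched numerically; this is a quick check, not part of the original solution, using the joint pmf table from the problem statement.

```python
# Joint pmf p(x, y) from the problem statement.
p = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.3}

# Marginals: pX(x) = sum over y of p(x, y); pY(y) = sum over x of p(x, y).
p_X = {x: sum(v for (a, b), v in p.items() if a == x) for x in (0, 1)}
p_Y = {y: sum(v for (a, b), v in p.items() if b == y) for y in (0, 1)}

# Conditional pmf of X given Y = 1: p(x, 1) / pY(1).
p_X_given_Y1 = {x: p[(x, 1)] / p_Y[1] for x in (0, 1)}

print({x: round(v, 6) for x, v in p_X.items()})
print({y: round(v, 6) for y, v in p_Y.items()})
print({x: round(v, 6) for x, v in p_X_given_Y1.items()})
```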
2. Consider n independent trials, with each trial being a success with
probability p. Given a total of k successes, prove that all possible
orderings of the k successes and n − k failures are equally likely.
Answer: Let X denote the number of successes, and consider k successes and n − k failures, in any order. E.g.,
\underbrace{s\,s \cdots s}_{k\ \text{many successes}}\ \underbrace{f\,f \cdots f}_{n-k\ \text{many failures}}
for k many consecutive successes, followed by n − k many failures. Let
O denote any such sequence of k successes and n − k failures. That is,
O just means a possible sequence of k successes and n − k failures in a
certain order and hence
P r{O and X = k} = P r{O}.
This gives
Pr{O | X = k} = Pr{O and X = k} / Pr{X = k}
             = Pr{O} / Pr{X = k}
             = p^k (1 − p)^{n−k} / ( \binom{n}{k} p^k (1 − p)^{n−k} )
             = 1 / \binom{n}{k}.
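The claim in Question 2 can be checked by brute force for a small case; this is a sketch, assuming the illustrative values n = 5, k = 2, p = 0.3, which are not from the problem.

```python
from itertools import product
from math import comb, isclose

n, k, p = 5, 2, 0.3  # assumed illustrative values

def seq_prob(seq):
    """Probability of one specific sequence of trial results (1 = success)."""
    s = sum(seq)
    return p**s * (1 - p)**(len(seq) - s)

# P{X = k}: sum over all 2^n outcome sequences with exactly k successes.
total_k = sum(seq_prob(s) for s in product((0, 1), repeat=n) if sum(s) == k)

# Conditional probability of each ordering with exactly k successes.
cond = [seq_prob(s) / total_k
        for s in product((0, 1), repeat=n) if sum(s) == k]

# Every ordering should have conditional probability 1 / C(n, k).
assert all(isclose(c, 1 / comb(n, k)) for c in cond)
print(len(cond), 1 / comb(n, k))  # C(5, 2) = 10 orderings, each 0.1
```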
3. Let X and Y be independent binomial random variables with respective
parameters (n, p) and (m, p). Calculate the distribution of X + Y .
Answer: Recall the interpretation of a binomial random variable, and
without any computation at all, we can immediately conclude that
X + Y is binomially distributed, Bin(n + m, p). This follows because
X represents the number of successes in n independent trials, each of
which results in a success with probability p; similarly, Y represents the
number of successes in m independent trials, each of which results in
a success with probability p. Hence, given that X and Y are assumed
independent, it follows that X + Y represents the number of successes
in n + m independent trials when each trial has a probability p of
resulting in a success. Therefore, X + Y is a binomial random variable
with parameters Bin(n + m, p).
Here is the proof: Let q = 1 − p. Then
Pr{X + Y = k} = \sum_{i=0}^{n} Pr{X = i and Y = k − i}
             = \sum_{i=0}^{n} Pr{X = i} Pr{Y = k − i}
             = \sum_{i=0}^{n} \binom{n}{i} p^i (1 − p)^{n−i} · \binom{m}{k−i} p^{k−i} (1 − p)^{m−(k−i)}
             = \sum_{i=0}^{n} \binom{n}{i} \binom{m}{k−i} p^k (1 − p)^{n+m−k}
             = p^k (1 − p)^{n+m−k} · \sum_{i=0}^{n} \binom{n}{i} \binom{m}{k−i}.
From combinatorics (Vandermonde's identity), we know that

\sum_{i=0}^{n} \binom{n}{i} \binom{m}{k−i} = \binom{n+m}{k}.

Thus we have

Pr{X + Y = k} = \binom{n+m}{k} p^k (1 − p)^{n+m−k}.

That is,

X + Y ∼ Bin(n + m, p).
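The convolution computed above can be verified term by term for sample parameters; a quick sketch, assuming the illustrative values n = 4, m = 6, p = 0.35.

```python
from math import comb, isclose

def binom_pmf(n, p, k):
    """Bin(n, p) probability mass at k."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, m, p = 4, 6, 0.35  # assumed illustrative values

# Convolution of Bin(n, p) and Bin(m, p) should match Bin(n+m, p) at every k.
for k in range(n + m + 1):
    conv = sum(binom_pmf(n, p, i) * binom_pmf(m, p, k - i)
               for i in range(max(0, k - m), min(n, k) + 1))
    assert isclose(conv, binom_pmf(n + m, p, k))
print("convolution matches Bin(n+m, p)")
```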
4. Suppose that the number of people who enter a post office on a given
day is a Poisson random variable with parameter λ. Prove that if each
person who enters the post office is a male with probability p and a
female with probability 1 − p, then the numbers of males and females
entering the post office are independent Poisson random variables with
respective parameters λp and λ(1 − p).
Proof: Let X and Y denote the numbers of males and females that
enter the post office, respectively. We will prove that for any i, j,
p(i, j) = pX (i)pY (j).
To obtain Pr{X = i, Y = j}, we condition on X + Y:

Pr{X = i, Y = j} = Pr{X = i, Y = j | X + Y = i + j} Pr{X + Y = i + j}
                 + Pr{X = i, Y = j | X + Y ≠ i + j} Pr{X + Y ≠ i + j}.

Recall the formula

Pr(A) = Pr(A|B) Pr(B) + Pr(A|B′) Pr(B′),

where B′ is the complement of B. Since Pr{X = i, Y = j | X + Y ≠ i + j} = 0, we thus have

Pr{X = i, Y = j} = Pr{X = i, Y = j | X + Y = i + j} Pr{X + Y = i + j}.
As X + Y is the total number of people who enter the post office, which by assumption is a Poisson random variable with parameter λ, we have

Pr{X + Y = i + j} = λ^{i+j} e^{−λ} / (i + j)!.
Furthermore, given that i + j people do enter the post office, since
each person entering will be male with probability p, it follows that
the probability that exactly i of them will be male (and thus j of them
female) is just the binomial probability
\binom{i+j}{i} p^i (1 − p)^j.

That is,

Pr{X = i, Y = j | X + Y = i + j} = \binom{i+j}{i} p^i (1 − p)^j.
This gives

Pr{X = i, Y = j} = \binom{i+j}{i} p^i (1 − p)^j · λ^{i+j} e^{−λ} / (i + j)!
                 = (λp)^i (λ(1 − p))^j e^{−λ} / (i! · j!)
                 = (λp)^i e^{−λp} / i! · (λ(1 − p))^j e^{−λ(1−p)} / j!.
This gives

pX(i) = \sum_j p(i, j) = \sum_{j=0}^{∞} Pr{X = i, Y = j}
      = \sum_{j=0}^{∞} (λp)^i e^{−λp} / i! · (λ(1 − p))^j e^{−λ(1−p)} / j!
      = (λp)^i e^{−λp} / i! · \sum_{j=0}^{∞} (λ(1 − p))^j e^{−λ(1−p)} / j!
      = (λp)^i e^{−λp} / i!,
the Poisson distribution with parameter λp.
By the same calculation, we also have
pY(j) = (λ(1 − p))^j e^{−λ(1−p)} / j!,
the Poisson distribution with parameter λ(1 − p).
As for each i, j,
p(i, j) = Pr{X = i, Y = j} = (λp)^i e^{−λp} / i! · (λ(1 − p))^j e^{−λ(1−p)} / j! = pX(i) · pY(j),
X and Y are independent.
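The factorization proved in Question 4 can be checked pointwise on a truncated grid; a sketch assuming the illustrative parameters λ = 3 and p = 0.4.

```python
from math import comb, exp, factorial, isclose

lam, p = 3.0, 0.4  # assumed illustrative values

def poisson_pmf(mu, k):
    """Poisson(mu) probability mass at k."""
    return mu**k * exp(-mu) / factorial(k)

for i in range(15):
    for j in range(15):
        # p(i, j) = P{i+j people enter} * P{exactly i of them are male}.
        joint = (poisson_pmf(lam, i + j)
                 * comb(i + j, i) * p**i * (1 - p)**j)
        # Claimed factorization: Poisson(lam*p) times Poisson(lam*(1-p)).
        assert isclose(joint,
                       poisson_pmf(lam * p, i) * poisson_pmf(lam * (1 - p), j))
print("joint pmf factors: X ~ Poisson(lam*p), Y ~ Poisson(lam*(1-p))")
```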
5. One of the most important joint distributions is the multinomial distribution, which arises when a sequence of n independent and identical experiments is performed. Suppose that each experiment can result in any one of r possible outcomes, with respective probabilities
p1 , p2 , · · · , pr with p1 + p2 + · · · + pr = 1.
Let Xi denote the number of the n experiments that result in outcome
number i. Prove the following
Pr{X1 = n1, · · · , Xr = nr} = n! / (n1! · · · nr!) · p1^{n1} · · · pr^{nr}.

† Note that the multinomial distribution is a direct generalization
of the binomial distribution. That is, when r = 2 and p2 = 1 − p1,

Pr{X1 = n1, X2 = n − n1} = n! / (n1! (n − n1)!) · p1^{n1} · p2^{n−n1}
                         = \binom{n}{n1} p1^{n1} (1 − p1)^{n−n1}.
As an application of the multinomial distribution, suppose that a fair
die is rolled 9 times. Calculate the probability that 1 appears three
times, 2 and 3 twice each, 4 and 5 once each, and 6 not at all.
Answer: The proof is combinatorial: by independence, each particular sequence of outcomes with ni occurrences of outcome i (i = 1, · · · , r) has probability p1^{n1} · · · pr^{nr}, and the number of such sequences is the multinomial coefficient n! / (n1! · · · nr!). You can also try to prove it by induction on r. For the example given, the probability that 1 appears three times, 2 and 3 twice each, 4 and 5 once each, and 6 not at all is
9! / (3! 2! 2! 1! 1! 0!) · (1/6)^3 (1/6)^2 (1/6)^2 (1/6)^1 (1/6)^1 (1/6)^0
= 9 · 8 · 7 · 6 · 5 · (1/6)^9
= 70 / 6^6.
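The die example above can be computed with a generic multinomial pmf; a short sketch checking that the answer equals 70/6^6.

```python
from math import factorial, isclose, prod

def multinomial_pmf(counts, probs):
    """Multinomial probability n!/(n1!...nr!) * p1^n1 * ... * pr^nr."""
    n = sum(counts)
    coef = factorial(n)
    for c in counts:
        coef //= factorial(c)  # each prefix division is exact
    return coef * prod(q**c for q, c in zip(probs, counts))

counts = [3, 2, 2, 1, 1, 0]  # faces 1..6 in 9 rolls of a fair die
probs = [1 / 6] * 6
val = multinomial_pmf(counts, probs)
assert isclose(val, 70 / 6**6)
print(val)
```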
6. In the multinomial distribution above, suppose we are given that nj of
the trials resulted in outcome j for j = k + 1, · · · , r, where nk+1 + · · · +
nr = m ≤ n. Find the conditional distribution
P r{X1 = n1 , · · · , Xk = nk | Xk+1 = nk+1 , · · · , Xr = nr }.
Answer: Let p = p1 + · · · + pk. Computing the denominator below by grouping
outcomes 1, · · · , k into a single outcome that occurs with probability p on
each trial, we get

Pr{X1 = n1, · · · , Xk = nk | Xk+1 = nk+1, · · · , Xr = nr}
= Pr{X1 = n1, · · · , Xk = nk, Xk+1 = nk+1, · · · , Xr = nr} / Pr{Xk+1 = nk+1, · · · , Xr = nr}
= [ n! / (n1! · · · nk! nk+1! · · · nr!) · p1^{n1} · · · pk^{nk} · pk+1^{nk+1} · · · pr^{nr} ]
  / [ n! / ((n − m)! nk+1! · · · nr!) · p^{n−m} · pk+1^{nk+1} · · · pr^{nr} ]
= (n − m)! / (n1! · · · nk!) · (p1^{n1} · · · pk^{nk}) / p^{n−m}
= \binom{n − m}{n1, · · · , nk} (p1/p)^{n1} · · · (pk/p)^{nk},
as n1 + · · · + nk = n − m and p1 + · · · + pk = p. It is a multinomial
distribution for n−m independent and identical experiments, with each
experiment resulting in any one of k possible outcomes, with respective
probabilities p1/p, p2/p, · · · , pk/p. Note that

p1/p + p2/p + · · · + pk/p = 1.
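The conditional distribution derived in Question 6 can be checked numerically for one concrete case; a sketch assuming r = 4, n = 8, probabilities (0.1, 0.2, 0.3, 0.4), and conditioning on the counts of outcomes 3 and 4.

```python
from math import factorial, isclose, prod

def multinomial_pmf(counts, probs):
    """Multinomial probability n!/(n1!...nr!) * p1^n1 * ... * pr^nr."""
    n = sum(counts)
    coef = factorial(n)
    for c in counts:
        coef //= factorial(c)
    return coef * prod(q**c for q, c in zip(probs, counts))

probs = [0.1, 0.2, 0.3, 0.4]  # assumed illustrative values
n = 8
n3, n4 = 2, 1                 # the conditioned counts, so m = 3
m = n3 + n4
p = probs[0] + probs[1]       # p = p1 + p2

# P{X3 = n3, X4 = n4}: group outcomes 1, 2 into one outcome with prob p.
denom = multinomial_pmf([n - m, n3, n4], [p, probs[2], probs[3]])

for n1 in range(n - m + 1):
    n2 = n - m - n1
    cond = multinomial_pmf([n1, n2, n3, n4], probs) / denom
    # Claimed: multinomial over n-m trials with probabilities p1/p, p2/p.
    claimed = multinomial_pmf([n1, n2], [probs[0] / p, probs[1] / p])
    assert isclose(cond, claimed)
print("conditional distribution is multinomial(n-m; p1/p, ..., pk/p)")
```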