6.1.2 Sums of Random Variables
In many applications, we need to work with a sum of several random variables. In particular, we might need to
study a random variable Y given by
$$Y = X_1 + X_2 + \cdots + X_n.$$
The linearity of expectation tells us that
$$EY = EX_1 + EX_2 + \cdots + EX_n.$$
We can also find the variance of Y based on our discussion in Section 5.3. In particular, we saw that the
variance of a sum of two random variables is
$$\operatorname{Var}(X_1 + X_2) = \operatorname{Var}(X_1) + \operatorname{Var}(X_2) + 2\operatorname{Cov}(X_1, X_2).$$
For $Y = X_1 + X_2 + \cdots + X_n$,
we can obtain a more general version of the above equation. We can write
$$\begin{aligned}
\operatorname{Var}(Y) &= \operatorname{Cov}\left(\sum_{i=1}^{n} X_i, \sum_{j=1}^{n} X_j\right)\\
&= \sum_{i=1}^{n} \sum_{j=1}^{n} \operatorname{Cov}(X_i, X_j) \qquad \text{(using part 7 of Lemma 5.3)}\\
&= \sum_{i=1}^{n} \operatorname{Var}(X_i) + 2\sum_{i<j} \operatorname{Cov}(X_i, X_j).
\end{aligned}$$

$$\operatorname{Var}\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} \operatorname{Var}(X_i) + 2\sum_{i<j} \operatorname{Cov}(X_i, X_j)$$
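The two-variable version of this identity can be checked numerically. The NumPy sketch below (an illustration added here, with an arbitrary linear dependence between the two samples) confirms that the sample variance of a sum matches the sum of the sample variances plus twice the sample covariance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated samples: X2 depends on X1, so Cov(X1, X2) != 0.
x1 = rng.normal(size=100_000)
x2 = 0.5 * x1 + rng.normal(size=100_000)

lhs = np.var(x1 + x2)  # Var(X1 + X2), population normalization
rhs = np.var(x1) + np.var(x2) + 2 * np.cov(x1, x2, bias=True)[0, 1]

print(lhs, rhs)  # the two values agree up to floating-point error
```

Note that the identity holds exactly for the sample moments themselves (with matching normalization, here `ddof=0` and `bias=True`), not just in the limit of many samples.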
If the $X_i$'s are independent, then $\operatorname{Cov}(X_i, X_j) = 0$ for $i \neq j$. In this case, we can write

If $X_1, X_2, \dots, X_n$ are independent, then
$$\operatorname{Var}\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} \operatorname{Var}(X_i).$$
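For instance, a $Binomial(n, p)$ random variable is a sum of $n$ independent $Bernoulli(p)$ variables, so its variance should be $np(1-p)$. A brief simulation sketch (with parameter values chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 0.3

# A Binomial(n, p) variable is a sum of n independent Bernoulli(p)
# variables, so Var = n * p * (1 - p) = 20 * 0.3 * 0.7 = 4.2.
samples = rng.binomial(n, p, size=500_000)
print(samples.var())  # close to 4.2
```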
Example 6.2
N people sit around a round table, where N > 5. Each person tosses a coin. Anyone whose outcome is
different from his/her two neighbors will receive a present. Let X be the number of people who receive
presents. Find E X and Var(X).
Solution
Number the N people from 1 to N . Let Xi be the indicator random variable for the ith person,
that is, Xi = 1 if the ith person receives a present and zero otherwise. Then
$$X = X_1 + X_2 + \cdots + X_N.$$
First note that $P(X_i = 1) = \frac{1}{4}$. This is the probability that the person to the right has a different outcome times the probability that the person to the left has a different outcome. In other words, if we define $H_i$ and $T_i$ to be the events that the $i$th person's outcome is heads and tails respectively, then we can write
$$EX_i = P(X_i = 1) = P(H_{i-1}, T_i, H_{i+1}) + P(T_{i-1}, H_i, T_{i+1}) = \frac{1}{8} + \frac{1}{8} = \frac{1}{4}.$$
Thus, we find
$$EX = EX_1 + EX_2 + \cdots + EX_N = \frac{N}{4}.$$
Next, we can write
$$\operatorname{Var}(X) = \sum_{i=1}^{N} \operatorname{Var}(X_i) + \sum_{i=1}^{N} \sum_{j \neq i} \operatorname{Cov}(X_i, X_j).$$
Since $X_i \sim Bernoulli\left(\frac{1}{4}\right)$, we have
$$\operatorname{Var}(X_i) = \frac{1}{4} \cdot \frac{3}{4} = \frac{3}{16}.$$
It remains to find $\operatorname{Cov}(X_i, X_j)$. First note that $X_i$ and $X_j$ are independent if there are at least two people between the $i$th person and the $j$th person. In other words, if $2 < |i - j| < N - 2$, then $X_i$ and $X_j$ are independent, so
$$\operatorname{Cov}(X_i, X_j) = 0, \quad \text{for } 2 < |i - j| < N - 2.$$
Also, note that there is a lot of symmetry in the problem:
$$\operatorname{Cov}(X_1, X_2) = \operatorname{Cov}(X_2, X_3) = \cdots = \operatorname{Cov}(X_{N-1}, X_N) = \operatorname{Cov}(X_N, X_1),$$
$$\operatorname{Cov}(X_1, X_3) = \operatorname{Cov}(X_2, X_4) = \cdots = \operatorname{Cov}(X_{N-1}, X_1) = \operatorname{Cov}(X_N, X_2).$$
Thus, we can write
$$\begin{aligned}
\operatorname{Var}(X) &= N\operatorname{Var}(X_1) + 2N\operatorname{Cov}(X_1, X_2) + 2N\operatorname{Cov}(X_1, X_3)\\
&= \frac{3N}{16} + 2N\operatorname{Cov}(X_1, X_2) + 2N\operatorname{Cov}(X_1, X_3).
\end{aligned}$$
So we need to find Cov(X1 , X2 ) and Cov(X1 , X3 ) . We have
$$\begin{aligned}
E[X_1 X_2] = P(X_1 = 1, X_2 = 1) &= P(H_N, T_1, H_2, T_3) + P(T_N, H_1, T_2, H_3)\\
&= \frac{1}{16} + \frac{1}{16} = \frac{1}{8}.
\end{aligned}$$
Thus,
$$\operatorname{Cov}(X_1, X_2) = E[X_1 X_2] - E[X_1]E[X_2] = \frac{1}{8} - \frac{1}{16} = \frac{1}{16},$$
$$\begin{aligned}
E[X_1 X_3] = P(X_1 = 1, X_3 = 1) &= P(H_N, T_1, H_2, T_3, H_4) + P(T_N, H_1, T_2, H_3, T_4)\\
&= \frac{1}{32} + \frac{1}{32} = \frac{1}{16}.
\end{aligned}$$
Thus,
$$\operatorname{Cov}(X_1, X_3) = E[X_1 X_3] - E[X_1]E[X_3] = \frac{1}{16} - \frac{1}{16} = 0.$$
Therefore,
$$\begin{aligned}
\operatorname{Var}(X) &= \frac{3N}{16} + 2N\operatorname{Cov}(X_1, X_2) + 2N\operatorname{Cov}(X_1, X_3)\\
&= \frac{3N}{16} + \frac{2N}{16}\\
&= \frac{5N}{16}.
\end{aligned}$$
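Both answers can be verified by brute force. The sketch below (added here as an illustration) enumerates all $2^N$ equally likely coin patterns for a small $N > 5$ and computes $EX$ and $\operatorname{Var}(X)$ exactly using rational arithmetic:

```python
from fractions import Fraction
from itertools import product

N = 8  # any N > 5 works; 2**8 = 256 equally likely outcomes
p = Fraction(1, 2**N)  # probability of each coin pattern
mean = Fraction(0)
second = Fraction(0)   # E[X^2]

for coins in product((0, 1), repeat=N):
    # Person i receives a present iff their outcome differs from both
    # circular neighbors; coins[i - 1] wraps around via Python indexing.
    x = sum(coins[i] != coins[i - 1] and coins[i] != coins[(i + 1) % N]
            for i in range(N))
    mean += p * x
    second += p * x * x

var = second - mean**2
print(mean, var)  # N/4 = 2 and 5N/16 = 5/2
```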
We now know how to find the mean and variance of a sum of $n$ random variables, but we might need to go beyond that. Specifically, what if we need to know the PDF of $Y = X_1 + X_2 + \cdots + X_n$? In fact, we have already addressed this problem for the case where $Y = X_1 + X_2$ and $X_1$ and $X_2$ are independent (Example 5.30 in Section 5.2.4). For that case, we found that the PDF of $Y$ is given by the convolution of the PDFs of $X_1$ and $X_2$, that is,
$$f_Y(y) = f_{X_1}(y) * f_{X_2}(y) = \int_{-\infty}^{\infty} f_{X_1}(x) f_{X_2}(y - x)\, dx.$$
For $Y = X_1 + X_2 + \cdots + X_n$, we can use the above formula repeatedly to obtain the PDF of $Y$:
$$f_Y(y) = f_{X_1}(y) * f_{X_2}(y) * \cdots * f_{X_n}(y).$$
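As a short numerical sketch (an illustration added here, not part of the original text), a single convolution can be evaluated on a discrete grid. For two independent $Uniform(0,1)$ variables, this recovers the familiar triangular PDF of their sum on $[0, 2]$:

```python
import numpy as np

# Discretize the PDFs on a grid and convolve numerically.
dx = 0.001
x = np.arange(0, 1, dx)
f1 = np.ones_like(x)  # Uniform(0,1) PDF on its support
f2 = np.ones_like(x)

fY = np.convolve(f1, f2) * dx  # PDF of Y = X1 + X2, supported on [0, 2]

# The exact answer is the triangular PDF:
# fY(y) = y for 0 <= y <= 1, and 2 - y for 1 <= y <= 2.
print(fY[int(0.5 / dx)])  # close to fY(0.5) = 0.5
```

The grid step `dx` controls the discretization error; the total probability `fY.sum() * dx` stays 1 because each discretized PDF integrates to 1.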
Nevertheless, this quickly becomes computationally difficult. Thus, we often resort to other methods if we can.
One method that is often useful is using moment generating functions, as we discuss in the next section.