1.2 Repetitions and random variables.

1.2.4 Joint probability mass functions and marginal probability mass functions

Suppose X and Y are two random variables and x1, x2, ..., xn and y1, y2, ..., ym, respectively, are the values X and Y assume. The function

    f_xiyj = f(xi, yj) = Pr{X = xi, Y = yj}

which assigns to each pair (xi, yj) the probability that X assumes the value xi and Y assumes the value yj is called the joint probability mass function of X and Y. In this context, the probability mass functions of X and Y individually are sometimes called the marginal probability mass functions of X and Y.

It is natural to represent the joint probability mass function by the matrix f = {f_ij} = {f_xiyj}, where i runs from 1 to n and j runs from 1 to m. In that case the marginal pmf of X is the vector obtained by summing across the rows of f, and the marginal pmf of Y is the vector obtained by summing down the columns of f.

Example 5. Consider Example 1, where an office copier on any particular day is either in good condition (g or 1), poor condition (p or 2) or broken (b or 3), and we observe the copier today and tomorrow with X1 = condition of the copier today and X2 = condition of the copier tomorrow. Then the matrix p in (1) and (3) specifies the joint probability mass function, i.e.

            [ f_gg  f_gp  f_gb ]   [ 0.4    0.03   0.07 ]
        p = [ f_pg  f_pp  f_pb ] = [ 0      0.04   0.06 ]
            [ f_bg  f_bp  f_bb ]   [ 0.128  0.032  0.24 ]

The marginal pmf's of X1 and X2 are given in (2), i.e.

                                      [ fX1(g) ]   [ 0.5 ]
        fX1 = marginal pmf of X1  =   [ fX1(p) ] = [ 0.1 ]
                                      [ fX1(b) ]   [ 0.4 ]

                                      [ fX2(g) ]   [ 0.528 ]
        fX2 = marginal pmf of X2  =   [ fX2(p) ] = [ 0.102 ]
                                      [ fX2(b) ]   [ 0.37  ]

Certain questions involving the two random variables are most naturally answered using the joint pmf of the random variables. For example, suppose we want to know the probability that the copier is broken either today or tomorrow or both. Then we can just sum the values of f_jk where one or both of j and k is b, i.e.

        Pr{X1 = b or X2 = b} = f_gb + f_pb + f_bg + f_bp + f_bb
                             = 0.07 + 0.06 + 0.128 + 0.032 + 0.24 = 0.53
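As a supplement to the notes, here is a small Python sketch (not part of the original text) that reproduces the arithmetic of Example 5: it stores the joint pmf p as a 3 x 3 table, recovers the marginal pmf's of X1 and X2 as row and column sums, and computes Pr{X1 = b or X2 = b} both by summing the qualifying entries and by an inclusion-exclusion check. The identifier names (states, joint, marginal_X1, ...) are illustrative, not taken from the notes.

    # Joint pmf of (X1, X2) from Example 5, indexed by the states g, p, b.
    states = ["g", "p", "b"]

    # joint[i][j] = Pr{X1 = states[i], X2 = states[j]}  (the matrix p above)
    joint = [
        [0.4,   0.03,  0.07],   # X1 = g
        [0.0,   0.04,  0.06],   # X1 = p
        [0.128, 0.032, 0.24],   # X1 = b
    ]

    # Marginal pmf of X1: sum across each row.
    marginal_X1 = [sum(row) for row in joint]                             # [0.5, 0.1, 0.4]

    # Marginal pmf of X2: sum down each column.
    marginal_X2 = [sum(joint[i][j] for i in range(3)) for j in range(3)]  # [0.528, 0.102, 0.37]

    # Pr{X1 = b or X2 = b}: sum the entries whose row index or column index is b.
    b = states.index("b")
    prob_broken = sum(
        joint[i][j]
        for i in range(3)
        for j in range(3)
        if i == b or j == b
    )                                                                     # 0.53

    # Same answer by inclusion-exclusion:
    # Pr{X1 = b} + Pr{X2 = b} - Pr{X1 = b, X2 = b} = 0.4 + 0.37 - 0.24 = 0.53
    assert abs(prob_broken - (marginal_X1[b] + marginal_X2[b] - joint[b][b])) < 1e-12

    print(marginal_X1, marginal_X2, round(prob_broken, 3))

Running the sketch prints the two marginal vectors and 0.53, matching the values quoted in the notes.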