Independence
Liang Zhang (UofU), Applied Statistics I, June 16, 2008

Example: A fair die is tossed and we want to guess the outcome. The outcomes are 1, 2, 3, 4, 5, 6, each with probability 1/6. Suppose we are interested in the following events: A = {1, 3, 5}, B = {1, 2, 3}, and C = {3, 4, 5, 6}. Then we can calculate the probability of each event:

P(A) = P(B) = 3/6 = 1/2, and P(C) = 4/6 = 2/3.

If someone tells you that after one toss event C happened, i.e. the outcome is one of {3, 4, 5, 6}, then what is the probability that event A happened, and what is it for event B?

P(A | C) = P(A ∩ C)/P(C) = (1/3)/(2/3) = 1/2;
P(B | C) = P(B ∩ C)/P(C) = (1/6)/(2/3) = 1/4.

So P(A | C) = P(A), while P(B | C) ≠ P(B).

Definition. Two events A and B are independent if P(A | B) = P(A), and are dependent otherwise.

Remark:
1. P(A | B) = P(A) ⇒ P(B | A) = P(B). This is natural, since the definition of independence should be symmetric. Indeed,

P(B | A) = P(A ∩ B)/P(A) = P(A | B) · P(B)/P(A) = P(A) · P(B)/P(A) = P(B).

2. If events A and B are mutually disjoint, then they cannot be independent. Intuitively, if we know event A happens, we then know that B does not happen, since A ∩ B = ∅. Mathematically,

P(A | B) = P(A ∩ B)/P(B) = P(∅)/P(B) = 0 ≠ P(A),

unless P(A) = 0, which is trivial. E.g., for the die-tossing example, if A = {1, 3, 5} and B = {2, 4, 6}, then P(A ∩ B) = P(∅) = 0, and therefore P(A | B) = 0. However, P(A) = 0.5.

The Multiplication Rule for Independent Events

The general multiplication rule tells us P(A ∩ B) = P(A | B) · P(B). If A and B are independent, then since P(A | B) = P(A), the equation becomes P(A ∩ B) = P(A) · P(B). Furthermore, we have the following:

Proposition. Events A and B are independent if and only if P(A ∩ B) = P(A) · P(B).

In words, events A and B are independent iff (if and only if) the probability that both occur (A ∩ B) is the product of the two individual probabilities.

In real life, we often use this multiplication rule without noticing it.
The probability of getting {HH} when you toss a fair coin twice is 1/4, obtained as 1/2 · 1/2.
The probability of getting {6, 5, 4, 3, 2, 1} when you toss a fair die six times is (1/6)^6, obtained as 1/6 · 1/6 · 1/6 · 1/6 · 1/6 · 1/6.
The probability of getting {♠♠♠} when you draw three cards from a deck of well-shuffled cards with replacement is 1/64, obtained as 1/4 · 1/4 · 1/4.
However, if you draw the cards without replacement, the multiplication rule for independent events fails, since the event {the first card is ♠} is no longer independent of the event {the second card is ♠}. In fact, P(the second card is ♠ | the first card is ♠) = 12/51.

Example: Exercise 89. Suppose identical tags are placed on both the left ear and the right ear of a fox. The fox is then let loose for a period of time. Consider the two events C1 = {left ear tag is lost} and C2 = {right ear tag is lost}. Let π = P(C1) = P(C2), and assume C1 and C2 are independent events. Derive an expression (involving π) for the probability that exactly one tag is lost, given that at most one is lost.
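To make the arithmetic on these slides concrete, here is a small Python sketch, not part of the original notes: it re-derives the die-tossing conditional probabilities by enumeration, checks the spade-drawing probabilities with and without replacement, and verifies, for a few values of π, one way of setting up Exercise 89. Function names such as prob and cond_prob are illustrative only.

```python
from fractions import Fraction

# Equally likely outcomes of one toss of a fair die.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Probability of an event under the uniform distribution on omega."""
    return Fraction(len(event & omega), len(omega))

def cond_prob(event, given):
    """P(event | given) = P(event ∩ given) / P(given)."""
    return prob(event & given) / prob(given)

A, B, C = {1, 3, 5}, {1, 2, 3}, {3, 4, 5, 6}
print(prob(A), prob(B), prob(C))           # 1/2 1/2 2/3
print(cond_prob(A, C), cond_prob(B, C))    # 1/2 1/4
print(cond_prob(A, C) == prob(A))          # True:  A is independent of C
print(cond_prob(B, C) == prob(B))          # False: B is not independent of C

# Multiplication rule: drawing three spades from a 52-card deck.
with_replacement = Fraction(13, 52) ** 3                               # 1/64
without_replacement = Fraction(13, 52) * Fraction(12, 51) * Fraction(11, 50)
print(with_replacement, without_replacement)                           # 1/64 11/850

# Exercise 89, one way to set it up: with independent tag losses,
# P(exactly one lost) = 2*pi*(1 - pi) and P(at most one lost) = 1 - pi**2,
# so the conditional probability simplifies to 2*pi / (1 + pi).
for tag_loss_prob in (Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)):
    exactly_one = 2 * tag_loss_prob * (1 - tag_loss_prob)
    at_most_one = 1 - tag_loss_prob ** 2
    assert exactly_one / at_most_one == 2 * tag_loss_prob / (1 + tag_loss_prob)
```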
Remark:
1. If events A and B are independent, then so are events A′ and B, events A and B′, as well as events A′ and B′. For example,

P(A′ | B) = P(A′ ∩ B)/P(B) = [P(B) − P(A ∩ B)]/P(B) = 1 − P(A ∩ B)/P(B) = 1 − P(A | B) = 1 − P(A) = P(A′).

2. We can use the condition P(A ∩ B) = P(A) · P(B) to define the independence of the two events A and B.

Independence of More Than Two Events

Definition. Events A1, A2, . . . , An are mutually independent if for every k (k = 2, 3, . . . , n) and every subset of indices i1, i2, . . . , ik,

P(Ai1 ∩ Ai2 ∩ · · · ∩ Aik) = P(Ai1) · P(Ai2) · · · · · P(Aik).

In words, n events are mutually independent if the probability of the intersection of any subset of the n events is equal to the product of the individual probabilities.

A very interesting example: Exercise 113. A box contains the following four slips of paper, each having exactly the same dimensions: (1) win prize 1; (2) win prize 2; (3) win prize 3; and (4) win prizes 1, 2, and 3. One slip will be randomly selected. Let A1 = {win prize 1}, A2 = {win prize 2}, and A3 = {win prize 3}. Are these three events mutually independent?

Example: Consider a system of seven identical components connected as follows: components 1 and 2 in parallel, then the series pair of components 3 and 4 in parallel with the series pair of components 5 and 6, and finally component 7 in series. For the system to work properly, the current must be able to flow through the system from the left end to the right end. If the components work independently of one another and P(component works) = 0.9, then what is the probability that the system works?

Let A = {the system works} and Ai = {component i works}. Then

A = (A1 ∪ A2) ∩ ((A3 ∩ A4) ∪ (A5 ∩ A6)) ∩ A7.
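Continuing in the same spirit, the following sketch (again illustrative code, not from the original slides) checks the prize-slip events of Exercise 113 by enumeration and evaluates the seven-component system's reliability directly from the decomposition A = (A1 ∪ A2) ∩ ((A3 ∩ A4) ∪ (A5 ∩ A6)) ∩ A7.

```python
from fractions import Fraction
from itertools import combinations

# Exercise 113: one of four equally likely slips is drawn; slip 4 awards all three prizes.
slips = {1: {1}, 2: {2}, 3: {3}, 4: {1, 2, 3}}

def P(event):
    """Probability that the randomly selected slip lies in `event` (a set of slip labels)."""
    return Fraction(len(event), len(slips))

# prize_events[i] = set of slips on which prize i is won.
prize_events = {i: {s for s, prizes in slips.items() if i in prizes} for i in (1, 2, 3)}

# Pairwise independence: P(Ai ∩ Aj) = P(Ai) * P(Aj) holds for every pair.
for i, j in combinations((1, 2, 3), 2):
    pair = prize_events[i] & prize_events[j]
    print(i, j, P(pair) == P(prize_events[i]) * P(prize_events[j]))   # True for all pairs

# Mutual independence also requires the triple-product condition, which fails here.
triple = prize_events[1] & prize_events[2] & prize_events[3]
print(P(triple), P(prize_events[1]) * P(prize_events[2]) * P(prize_events[3]))  # 1/4 vs 1/8

# Seven-component system: each component works independently with probability p.
p = 0.9

def parallel(*probs):
    """Probability that at least one of several independent subsystems works."""
    fail = 1.0
    for r in probs:
        fail *= 1.0 - r
    return 1.0 - fail

# A = (A1 ∪ A2) ∩ ((A3 ∩ A4) ∪ (A5 ∩ A6)) ∩ A7, with independent components.
p_system = parallel(p, p) * parallel(p * p, p * p) * p
print(round(p_system, 4))   # about 0.8588
```

The enumeration confirms that every pairwise product condition holds while the triple-product condition fails (1/4 versus 1/8), so the three prize events are pairwise independent but not mutually independent.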