COR-STAT1202 Introductory Statistics
Seminar 3 – Probability
Readings: Statistics for Business and Economics (Chapter 3), Handouts
Please Print Double-Sided To Save Paper!
Any content in a dotted box is for your understanding only and will not be included in the examinations.

Introduction
To recap, knowledge of probability is required when performing inferential statistics, as it helps us understand the relationship between a population and a sample drawn from it.

Events, Sample Spaces and Probability
Experiment: A process of observation leading to a single outcome that cannot be predicted with certainty
Sample point or simple event: The most basic outcome of an experiment (see also the definition of event below)
Event: A specific collection of sample points
• A simple event contains only a single sample point
• A compound event contains two or more sample points
Sample space, S: The collection of all of an experiment's sample points
Probability of an event E: Calculated by summing the probabilities of all the sample points for event E in the sample space S

Example 3.1: Die Throw
Consider the outcome of a single throw of a 6-sided fair die.
The experiment is the single throw of the die.
One possible sample point or simple event is the throw of 1.
One possible compound event is the throw of an odd number, i.e. 1, 3 or 5.
The sample space S is {1, 2, 3, 4, 5, 6}.
The probability of the event of throwing 1 is 1/6.
The probability of the event of throwing an odd number is 3/6 = 1/2.

The three probability axioms upon which all probability theory is based are:
1) The probability of an event E is a non-negative real number, i.e. P(E) ≥ 0, P(E) ∈ ℝ, ∀E ∈ S
2) The probabilities of all the sample points in a sample space S must sum to 1, i.e. ∑(all i) P(Ei) = 1, where Ei is a sample point in sample space S
3) The probability of a union of any countable sequence of disjoint events (or mutually exclusive events) E1, E2, … is the sum of the individual probabilities, i.e. P(⋃(all i) Ei) = ∑(all i) P(Ei)

The three main consequences of the three probability axioms are:
1) The probability of an empty set is 0, i.e. P(∅) = 0
2) If event A is a subset of event B, then the probability of event A is less than or equal to the probability of event B, i.e. if A ⊆ B, then P(A) ≤ P(B)
3) The probability of any event is between 0 and 1 inclusive, i.e. 0 ≤ P(E) ≤ 1, ∀E ∈ S

Example 3.2: Die Throw (continued)
Continuing with the setup in Example 3.1, we note the following with respect to the three probability axioms:
• Probability of each simple event = 1/6 ≥ 0
• Probability of all the sample points or simple events = 1/6 + 1/6 + 1/6 + 1/6 + 1/6 + 1/6 = 1
• Noting that the events are disjoint, e.g. it is not possible to throw a 1 and a 6 at the same time, P(odd) = P(1) + P(3) + P(5) = 1/6 + 1/6 + 1/6 = 3/6 = 1/2
With respect to the three main consequences, we note the following:
• P(∅) = 0 and 0 ≤ P(E) ≤ 1, ∀E ∈ S can be observed easily
• The event of throwing 1 ⊆ the event of throwing an odd number, and P(1) = 1/6 ≤ P(odd) = 1/2

Two tools useful for presenting an experiment's sample space and its events are:
• Tree diagram: Events are shown chronologically using lines and nodes
• Venn diagram: Events are shown as shapes inside a big rectangle that represents the sample space S

Example 3.3: Coin Throws
Consider the outcomes of two throws of a fair coin. Denoting H for a throw of head and T for a throw of tail, the sample points are HH, HT, TH and TT.
Note that the order of the throws matters, i.e. HT is a different sample point from TH!
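Because the two throws produce a small sample space of equally likely points, it can also be enumerated by machine. The following is a minimal sketch in Python (not part of the original handout); the event "at least one head" is an extra example added here purely for illustration:

```python
from itertools import product

# Enumerate the sample space of two throws of a fair coin
sample_space = ["".join(outcome) for outcome in product("HT", repeat=2)]
print(sample_space)  # ['HH', 'HT', 'TH', 'TT']

# With 4 equally likely sample points, an event's probability is
# (number of sample points in the event) / 4, e.g. "at least one head"
event = [s for s in sample_space if "H" in s]
print(len(event) / len(sample_space))  # 0.75
```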
[Tree diagram: the 1st throw branches into H and T, and each branch splits again into H and T for the 2nd throw, giving the sample points HH, HT, TH and TT.]
[Venn diagram: the four sample points HH, HT, TH and TT shown inside the rectangle representing S.]

Counting Rules
Three counting rules are available for counting events in probability calculations:
• Multiplicative rule: The number of ways to arrange n distinct items is n!, where n! = n × (n − 1) × … × 2 × 1 and 0! is defined to be equal to 1
• Combinations rule: The number of ways to arrange n items into 2 groups of size k and (n − k) respectively, where the order within both groups is ignored, is nCk, also written (n choose k):
  nCk = n!/(k!(n − k)!)
  The combinations rule can be expanded to a more general rule for more than 2 groups, of sizes n1, n2, n3, …, to give n!/(n1! n2! n3! …)
• Permutations rule: The number of ways to arrange n items into 2 groups of size k and (n − k) respectively, where the order within the group of size (n − k) is ignored, is nPk:
  nPk = n!/(n − k)!

Example 3.4: Your Ah-Gong and Ah-Ma's Favourite Hobby?
Consider the numbers 1, 2, 3, 4. The number of ways to arrange 1, 2, 3, 4 is 4! = 24. This is an example of the application of the multiplicative rule.
Now consider the numbers 1, 2, 3, 3. The number of ways to arrange 1, 2, 3, 3 is 4!/2! = 12. This is an example of the application of the permutations rule.
Now consider the numbers 1, 1, 3, 3. The number of ways to arrange 1, 1, 3, 3 is 4!/(2!2!) = 6. This is an example of the application of the combinations rule.
Now consider the numbers 1, 1, 1, 3. The number of ways to arrange 1, 1, 1, 3 is 4!/3! = 4.
Now consider the numbers 1, 1, 1, 1. The number of ways to arrange 1, 1, 1, 1 is 4!/4! = 1.

Exercise 3.1: Students' Seating Arrangement
Consider 7 students, namely Alice, Brandon, Carol, Dan, Elaine, Fred and Gigi. They are seated in a row of 7 chairs.
How many different ways can we arrange the students' seating, assuming we treat all of them as individuals?
7! = 5040
How many different ways can we arrange the students' seating, assuming we treat Alice, Carol, Elaine and Gigi homogeneously as girls and treat Brandon, Dan and Fred homogeneously as boys?
7!/(4!3!) = 35
How many different ways can we arrange the students' seating, assuming we treat the girls as individuals and treat Brandon, Dan and Fred homogeneously as boys?
7!/3! = 840
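The counts in Example 3.4 and Exercise 3.1 can be checked with the standard library. A minimal sketch in Python (not part of the handout); math.comb and math.perm require Python 3.8 or later:

```python
from itertools import permutations
from math import factorial, comb, perm

# Multiplicative (factorial) rule: 4 distinct items can be arranged in 4! ways
print(factorial(4))                          # 24

# Permutations rule: arrangements of 1, 2, 3, 3 -> 4!/2! = 4P2
print(len(set(permutations([1, 2, 3, 3]))))  # 12
print(perm(4, 2))                            # 12

# Combinations rule: arrangements of 1, 1, 3, 3 -> 4!/(2!2!) = 4C2
print(len(set(permutations([1, 1, 3, 3]))))  # 6
print(comb(4, 2))                            # 6

# Exercise 3.1: 4 girls treated alike and 3 boys treated alike -> 7!/(4!3!)
print(factorial(7) // (factorial(4) * factorial(3)))  # 35
```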
Unions, Intersections, Complementary Events
A number of compound events are frequently used in probability calculations:
• Intersection: The intersection of two events A and B, denoted by A ∩ B, is the event that occurs if both A and B occur in a single experiment
  Event A ∩ B shown pictorially, with a Venn diagram: [the overlapping region of the circles A and B inside the rectangle S]
  Intersection can also be defined with more than two events, e.g. A ∩ B ∩ C
• Union: The union of two events A and B, denoted by A ∪ B, is the event that occurs if either A or B or both A and B occur in a single experiment
  P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
  Event A ∪ B shown pictorially, with a Venn diagram: [the combined region covered by the circles A and B inside the rectangle S]
  Union can also be defined with more than two events, e.g. A ∪ B ∪ C
• Complementary: The complement of an event A, denoted by A^c or A′, is the event that A does not occur
  P(A′) = 1 − P(A), or P(A) + P(A′) = 1
  Event A′ shown pictorially, with a Venn diagram: [everything inside the rectangle S but outside the circle A]

Example 3.5: Die Throw (continued)
Continuing with the setup in Examples 3.1 and 3.2, define event A as the throw of a number ≤ 3 and event B as the throw of an odd number:
Event A ∩ B is the event of throwing 1 or 3.
P(A ∩ B) = 2/6 = 1/3
Event A ∪ B is the event of throwing 1, 2, 3 or 5.
P(A ∪ B) = 4/6 = 2/3, or P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 3/6 + 3/6 − 2/6 = 2/3
Event A′ is the event of throwing 4, 5 or 6.
P(A′) = 3/6 = 1/2

Exercise 3.2: Formula for Union of 3 Events
Derive a general formula for the probability of the union of 3 events, i.e. P(A ∪ B ∪ C). Assume P(A ∩ B) > 0, P(A ∩ C) > 0, P(B ∩ C) > 0 and P(A ∩ B ∩ C) > 0.
[Venn diagram: three overlapping circles A, B and C inside the rectangle S; their combined region is A ∪ B ∪ C.]
If we sum up P(A), P(B) and P(C), we will be double-counting P(A ∩ B), P(A ∩ C) and P(B ∩ C) and triple-counting P(A ∩ B ∩ C).
If we then subtract P(A ∩ B), P(A ∩ C) and P(B ∩ C) from the sum of P(A), P(B) and P(C), we will be removing P(A ∩ B ∩ C) completely.
If we then add P(A ∩ B ∩ C) back, we will have counted every area exactly once.
Thus, P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C)

Conditional Probability
The conditional probability that event A occurs given that event B has occurred is
P(A|B) = P(A ∩ B)/P(B), assuming P(B) > 0
Let us understand P(A|B) with the help of a Venn diagram:
[Venn diagram: circles A and B overlapping in the region A ∩ B inside the rectangle S.]
Since we are told event B has occurred, we only need to consider the inside of the circle representing event B. Anything outside of event B, i.e. event B′, is ignored.
Another perspective is that we can treat event B as a "reduced sample space". We then observe that the only way for event A to occur within this "reduced sample space" is via the event A ∩ B. As such, we arrive at the ratio P(A ∩ B)/P(B).

Example 3.6: Die Throw (continued)
Continuing with the setup in Examples 3.1, 3.2 and 3.5, define event A as the throw of 1 and event B as the throw of an odd number:
Event A ∩ B is the event of throwing 1. P(A ∩ B) = 1/6
Event B is the event of throwing 1, 3 or 5. P(B) = 3/6 = 1/2
Event A|B is the event of a throw of 1 given that the same throw is an odd number.
Substituting, P(A|B) = P(A ∩ B)/P(B) = (1/6)/(1/2) = 1/3

Exercise 3.3: Students' Seating Arrangement (continued)
Continuing with the setup in Exercise 3.1, given that Alice is seated on the leftmost seat, what is the probability that Alice is sitting next to another girl? Show full working.
Let event A be the event of Alice sitting next to another girl and event B be the event that Alice is seated on the leftmost seat. We are therefore solving for P(A|B).
Number of ways that the 7 students can be seated = 7! = 5040
Number of ways that the 7 students can be seated but with Alice on the leftmost seat = 1 × 6! = 720
Thus, P(B) = 720/5040 = 1/7
Number of ways that the 7 students can be seated but with Alice on the leftmost seat and another girl on the 2nd leftmost seat = 1 × 3 × 5! = 360
Thus, P(A ∩ B) = 360/5040 = 1/14
Substituting, P(A|B) = P(A ∩ B)/P(B) = (1/14)/(1/7) = 1/2
It is also possible to get to the answer via logical reasoning. After removing Alice from consideration, there are 6 possible choices for the 2nd leftmost seat. Since 3 of these choices are girls, the probability of picking a girl for the 2nd leftmost seat must be 3/6 = 1/2.
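Exercise 3.3 is also small enough to check by brute force over all 5040 seatings. A minimal sketch in Python (not part of the handout), using exact fractions so the three probabilities appear exactly as in the working above:

```python
from fractions import Fraction
from itertools import permutations

students = ["Alice", "Brandon", "Carol", "Dan", "Elaine", "Fred", "Gigi"]
girls = {"Alice", "Carol", "Elaine", "Gigi"}

n_B = 0        # seatings with Alice on the leftmost seat
n_A_and_B = 0  # ... and another girl on the 2nd leftmost seat

for seating in permutations(students):   # all 7! = 5040 seatings
    if seating[0] == "Alice":
        n_B += 1
        if seating[1] in girls:
            n_A_and_B += 1

P_B = Fraction(n_B, 5040)
P_A_and_B = Fraction(n_A_and_B, 5040)
print(P_B, P_A_and_B, P_A_and_B / P_B)   # 1/7 1/14 1/2
```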
Mutually Exclusive Events and Independent Events
Two properties relating to events are frequently mentioned and/or used in probability calculations:
• Mutually exclusive events: Two events A and B are mutually exclusive if A and B cannot both occur at the same time in a single experiment
  P(A ∩ B) = 0
  Mutually exclusive events A and B shown pictorially, with a Venn diagram: [the circles A and B do not overlap inside the rectangle S]
  It follows that for mutually exclusive events, P(A ∪ B) = P(A) + P(B)
• Independent events: Two events A and B are independent of each other if the probability of B occurring is not affected by whether A has occurred
  P(A ∩ B) = P(A) × P(B), or P(A|B) = P(A), or P(B|A) = P(B)
  The above three tests for independence are equivalent to one another, so only one of them needs to be carried out when checking for independence between events

Example 3.7: More Die Throws
Continuing with the setup in Examples 3.1, 3.2, 3.5 and 3.6, define event A as the throw of 1 and event B as the throw of 2:
Events A and B are mutually exclusive events because we can only obtain a single outcome on a single throw of the die. It is not possible to obtain 1 and 2 on the same throw.
P(A ∩ B) = 0
Suppose the die used above is blue in colour. A new red die is introduced. Define event A as the throw of 1 on the blue die and event C as the throw of 3 on the red die.
Events A and C are independent events because the outcome of the throw with the blue die does not affect the outcome of the throw with the red die.
P(A ∩ C) = P(A) × P(C) = 1/6 × 1/6 = 1/36

Exercise 3.4: First Throw and Sum of Two Throws
Consider the outcome of two throws of one 6-sided fair die. Three events are defined as follows:
• Event A: First throw is 1
• Event B: Sum of the two throws is 6
• Event C: Sum of the two throws is 7
Are events A and B independent of each other? Are events A and C independent of each other? Are events A and B mutually exclusive? Are events A and C mutually exclusive?
P(A) = 1/6
P(B) = 5/36 (cases of 1 + 5, 2 + 4, 3 + 3, 4 + 2, 5 + 1)
P(A ∩ B) = 1/36 (case of 1 + 5) ≠ P(A) × P(B) = 1/6 × 5/36 = 5/216
Thus, events A and B are not independent of each other.
Similarly, we have P(A|B) = P(A ∩ B)/P(B) = (1/36)/(5/36) = 1/5 ≠ P(A) = 1/6
We also have P(B|A) = P(A ∩ B)/P(A) = (1/36)/(1/6) = 1/6 ≠ P(B) = 5/36
P(C) = 6/36 = 1/6 (cases of 1 + 6, 2 + 5, 3 + 4, 4 + 3, 5 + 2, 6 + 1)
P(A ∩ C) = 1/36 (case of 1 + 6) = P(A) × P(C) = 1/6 × 1/6 = 1/36
Thus, events A and C are independent of each other.
Similarly, we have P(A|C) = P(A ∩ C)/P(C) = (1/36)/(1/6) = 1/6 = P(A)
Similarly, we also have P(C|A) = P(A ∩ C)/P(A) = (1/36)/(1/6) = 1/6 = P(C)
Some independent events are not immediately obvious upon observation, so the best way to find out is to carry out one of the checks: P(A ∩ B) = P(A) × P(B), P(A|B) = P(A) or P(B|A) = P(B).
Since P(A ∩ B) = 1/36 ≠ 0, events A and B are not mutually exclusive.
Since P(A ∩ C) = 1/36 ≠ 0, events A and C are not mutually exclusive.
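Exercise 3.4 can also be checked by enumerating the 36 equally likely outcomes of the two throws. A minimal sketch in Python (not part of the handout), using exact fractions so the equality checks are not affected by rounding:

```python
from fractions import Fraction
from itertools import product

# The 36 equally likely outcomes of two throws of a fair die
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event given as a predicate on (first, second)."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] == 1      # first throw is 1
B = lambda o: sum(o) == 6    # sum of the two throws is 6
C = lambda o: sum(o) == 7    # sum of the two throws is 7

P_A, P_B, P_C = prob(A), prob(B), prob(C)
P_AB = prob(lambda o: A(o) and B(o))
P_AC = prob(lambda o: A(o) and C(o))

print(P_AB == P_A * P_B)     # False -> A and B are not independent
print(P_AC == P_A * P_C)     # True  -> A and C are independent
print(P_AB == 0, P_AC == 0)  # False False -> neither pair is mutually exclusive
```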
Bayes' Theorem
Given k mutually exclusive and exhaustive events B1, B2, …, Bk, i.e. P(B1) + P(B2) + … + P(Bk) = 1, and an observed event A,
P(Bi|A) = P(Bi)P(A|Bi) / [P(B1)P(A|B1) + P(B2)P(A|B2) + … + P(Bk)P(A|Bk)]
We are primarily interested in the probability of one of the events B1, B2, …, Bk. Event A contains some information about the events B1, B2, …, Bk.
P(A|Bi) is the accuracy of A in predicting B1, B2, …, Bk, based on past experience.
P(Bi), known as the prior probability, is the estimate of the probability of Bi before any external information is received and taken into consideration. Hence, "prior".
P(Bi|A), known as the posterior probability, is the estimate of the probability of Bi after some external information in the form of event A is received and taken into consideration. Hence, "posterior".

Example 3.8: Your Lecturer's Morning Umbrella Dilemma
I am a lazy lecturer who dislikes carrying an umbrella. Consider therefore the question I face when I wake up in the morning: will it rain later today?
Based on my past experience of living in Singapore, I have my own estimates of the probability of various weather types on a typical September day. These are my prior probabilities. For example, B1 is sunny, B2 is cloudy, B3 is rainy and so on, and I have P(B3) based on my past experience.
I then switch on the radio and the DJ reads a weather report stating it will rain later today. This is the external information A. This new information will affect my estimate of the probability of rain later today, so my objective is now to calculate P(B3|A), to take into account the external information A.
For many days in the past, I have recorded the weather forecasts and noted what the weather was like later on those same days. This allows me to calculate P(A|Bi) for all the Bi's. For example, P(A|B3) is the probability that the weather report stated it would rain, given that it rained that day.
We can then apply Bayes' Theorem to arrive at what I need,
P(B3|A) = P(B3)P(A|B3) / [P(B1)P(A|B1) + P(B2)P(A|B2) + … + P(Bk)P(A|Bk)]
Note that the LHS is the probability we are interested in and the RHS contains all the information (past and current) we have on hand.

Proof of Bayes' Theorem: P(Bi|A) = P(Bi)P(A|Bi) / [P(B1)P(A|B1) + P(B2)P(A|B2) + … + P(Bk)P(A|Bk)]
We start from the LHS and apply the definition of conditional probability, P(Bi|A) = P(Bi ∩ A)/P(A). The proof proceeds by showing
• The numerators are equal: P(Bi ∩ A) = P(Bi)P(A|Bi), and
• The denominators are equal: P(A) = P(B1)P(A|B1) + P(B2)P(A|B2) + … + P(Bk)P(A|Bk)
Numerator
Applying the definition of conditional probability, P(A|Bi) = P(A ∩ Bi)/P(Bi).
Since P(A ∩ Bi) = P(Bi ∩ A), we have P(A|Bi) = P(Bi ∩ A)/P(Bi).
Rearranging, P(Bi ∩ A) = P(Bi)P(A|Bi).
Denominator
Since the events B1, B2, …, Bk are mutually exclusive and exhaustive, we have P(A) = P(A ∩ B1) + P(A ∩ B2) + … + P(A ∩ Bk).
Applying the definition of conditional probability again, P(A|Bi) = P(A ∩ Bi)/P(Bi).
Since P(A ∩ B1) = P(B1 ∩ A), we have P(A|B1) = P(B1 ∩ A)/P(B1).
Rearranging, P(B1 ∩ A) = P(B1)P(A|B1).
Similarly, we have P(A ∩ B2) = P(B2)P(A|B2) and so on.
Substituting back, P(A) = P(B1)P(A|B1) + P(B2)P(A|B2) + … + P(Bk)P(A|Bk).
Combining, P(Bi|A) = P(Bi)P(A|Bi) / [P(B1)P(A|B1) + P(B2)P(A|B2) + … + P(Bk)P(A|Bk)].
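Before plugging in numbers, the formula itself can be written as a tiny helper. A minimal sketch in Python (not part of the handout; the function name and argument layout are my own) that computes a posterior P(Bi|A) from the priors P(Bj) and the likelihoods P(A|Bj); Example 3.9 below supplies concrete values:

```python
from typing import Sequence

def posterior(priors: Sequence[float], likelihoods: Sequence[float], i: int) -> float:
    """P(Bi|A) for mutually exclusive, exhaustive events B1..Bk, given the
    priors P(Bj) and the likelihoods P(A|Bj) listed in the same order."""
    numerator = priors[i] * likelihoods[i]
    denominator = sum(p * q for p, q in zip(priors, likelihoods))  # this is P(A)
    return numerator / denominator
```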
Example 3.9: Your Lecturer's Morning Umbrella Dilemma (cont.)
Continuing with the setup in Example 3.8, suppose there are only three weather types, namely sunny, cloudy and rainy. I believe the probabilities of sunny, cloudy and rainy weather on a typical September day are 0.25, 0.35 and 0.4 respectively.
From past experience, 15% of the weather forecasts predicted rainy weather when it turned out sunny, 30% of the weather forecasts predicted rainy weather when it turned out cloudy, and 85% of the weather forecasts predicted rainy weather when it turned out rainy.
Given that the weather forecast predicted rainy weather later today, what is the probability that it will be rainy later in the day?
Let FR be the event that the weather forecast predicts rainy weather, WS be the event of sunny weather, WC be the event of cloudy weather and WR be the event of rainy weather. We want to calculate P(WR|FR).
Applying Bayes' Theorem,
P(WR|FR) = P(WR)P(FR|WR) / [P(WS)P(FR|WS) + P(WC)P(FR|WC) + P(WR)P(FR|WR)]
From the question,
P(WS) = 0.25, P(WC) = 0.35 and P(WR) = 0.4
P(FR|WS) = 0.15, P(FR|WC) = 0.3 and P(FR|WR) = 0.85
Substituting,
P(WR|FR) = (0.4 × 0.85) / (0.25 × 0.15 + 0.35 × 0.3 + 0.4 × 0.85) = 0.704663
Given that P(WR|FR) is quite high, I had better bring an umbrella today!
Note that the prior probability of rain was 0.4 and it has increased to more than 0.7 after receiving external information predicting rain. This is because the weather forecast has been reasonably accurate in the past. We can see this by noting that the proportions of incorrect predictions of rainy weather are low, i.e. P(FR|WS) and P(FR|WC) are low, and that the proportion of correct predictions of rainy weather is high, i.e. P(FR|WR) is high.

There are three main methods for applying Bayes' Theorem:
• Formula method: Apply P(Bi|A) = P(Bi)P(A|Bi) / [P(B1)P(A|B1) + P(B2)P(A|B2) + … + P(Bk)P(A|Bk)]
• Tree diagram method: Draw a tree diagram, calculate each branch's probability, then consider directly the branches of interest
• Tableau method: Set out the probabilities in a table and calculate the probabilities systematically

Example 3.10: Your Lecturer's Morning Umbrella Dilemma (cont.)
Apply the three methods for Bayes' Theorem to solve Example 3.9.
The formula method has already been covered under Example 3.9.
To apply the tree diagram and tableau methods, we need to define two more events. Let FS be the event that the weather forecast predicts sunny weather and FC be the event that the weather forecast predicts cloudy weather.

Tree Diagram
[Tree diagram: the first stage branches by weather, WS (0.25), WC (0.35) and WR (0.4); each weather branch then splits by weather forecast into FS, FC and FR with the conditional probabilities P(Fj|Wi), e.g. P(FR|WS) = 0.15, P(FR|WC) = 0.3 and P(FR|WR) = 0.85; each complete path ends in a sample point such as WS ∩ FS, WS ∩ FC, WS ∩ FR, …, WR ∩ FR.]
Recall that we want to calculate P(WR|FR).
P(WR|FR) = P(WR ∩ FR)/P(FR)
From the tree diagram, we observe that P(WR ∩ FR) = P(WR) × P(FR|WR) = 0.4 × 0.85 = 0.34
From the tree diagram, we also observe that P(FR) = P(WS ∩ FR) + P(WC ∩ FR) + P(WR ∩ FR)
From the tree diagram, we again observe
P(WS ∩ FR) = P(WS) × P(FR|WS) = 0.25 × 0.15 = 0.0375
P(WC ∩ FR) = P(WC) × P(FR|WC) = 0.35 × 0.3 = 0.105
Substituting, P(FR) = P(WS ∩ FR) + P(WC ∩ FR) + P(WR ∩ FR) = 0.0375 + 0.105 + 0.34 = 0.4825
Substituting, P(WR|FR) = P(WR ∩ FR)/P(FR) = 0.34/0.4825 = 0.704663
Note that the branches for FS and FC are drawn, but their probabilities are not required in solving for P(WR|FR).

Tableau Method

        P(Wi)   P(Fj|Wi)             P(Fj ∩ Wi)             P(Wi|Fj)
                FS     FC     FR     FS     FC     FR       FS     FC     FR
WS      0.25    −      −      0.15   −      −      0.0375   −      −      0.0778
WC      0.35    −      −      0.3    −      −      0.105    −      −      0.2176
WR      0.4     −      −      0.85   −      −      0.34     −      −      0.7047
P(Fj)                                −      −      0.4825

P(Wi) and P(FR|Wi) are given in the question.
P(FR ∩ Wi) = P(Wi) × P(FR|Wi)
P(FR) = ∑(all i) P(FR ∩ Wi)
P(Wi|FR) = P(FR ∩ Wi)/P(FR)
Reading off the table, P(WR|FR) = P(WR ∩ FR)/P(FR) = 0.34/0.4825 = 0.704663
Note that the cells for FS and FC are included, but their probabilities are not required in solving for P(WR|FR).
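The tree diagram and tableau calculations are easy to reproduce in a few lines. A minimal sketch in Python (not part of the handout) that rebuilds the joint-probability and posterior columns of the tableau:

```python
# Priors P(Wi) and likelihoods P(FR|Wi) from Example 3.9
prior = {"WS": 0.25, "WC": 0.35, "WR": 0.4}
forecast_rain = {"WS": 0.15, "WC": 0.3, "WR": 0.85}

# Joint probabilities P(FR ∩ Wi) = P(Wi) × P(FR|Wi)  (third block of the tableau)
joint = {w: prior[w] * forecast_rain[w] for w in prior}
p_FR = sum(joint.values())                      # P(FR) ≈ 0.4825

# Posteriors P(Wi|FR) = P(FR ∩ Wi) / P(FR)       (last block of the tableau)
posterior = {w: joint[w] / p_FR for w in joint}

print(joint)      # ≈ {'WS': 0.0375, 'WC': 0.105, 'WR': 0.34}
print(p_FR)       # ≈ 0.4825
print(posterior)  # 'WR' ≈ 0.7047, matching the formula and tree diagram methods
```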
Exercise 3.5: Lie Detector
A lie detector is known to be accurate, giving 5% false positives and 1% false negatives. A police detective administers the lie detector test on a suspect, whom he believes has a 75% probability of lying. The lie detector test gives a "lying" result and the police detective accuses the suspect of lying based on it. What is the probability that the police detective's accusation is incorrect?
Let DL be the event that the lie detector gives a "lying" result, DN be the event that the lie detector gives a "not lying" result, L be the event that the suspect is lying and T be the event that the suspect is not lying, i.e. telling the truth.
P(L) = 0.75, so that P(T) = 1 − 0.75 = 0.25
P(DL|T) = 0.05, so that P(DN|T) = 1 − 0.05 = 0.95
P(DN|L) = 0.01, so that P(DL|L) = 1 − 0.01 = 0.99
We want to calculate P(T|DL).
Applying Bayes' Theorem,
P(T|DL) = P(T)P(DL|T) / [P(T)P(DL|T) + P(L)P(DL|L)] = (0.25 × 0.05)/(0.25 × 0.05 + 0.75 × 0.99) = 0.016556
Thus, there is a probability of 1.66% that the police detective's accusation is incorrect.

Food for Thought Question 3
There are 45 students in the G22 class of COR-STAT1202. What is the probability that at least two students in G22 share the same birthday? Ignore 29th February, i.e. assume there are 365 days in a year.
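The handout leaves this question open, but here is one way to explore it numerically, a minimal sketch in Python (not part of the handout) that applies the complement rule from this seminar: compute the probability that all 45 birthdays are different, then subtract from 1. It assumes each birthday is independent and equally likely to fall on any of the 365 days.

```python
from math import prod

n_students = 45
days = 365

# P(no two share a birthday) = (365/365) × (364/365) × … × ((365 − 44)/365),
# assuming birthdays are independent and uniform over 365 days
p_all_different = prod((days - i) / days for i in range(n_students))

# Complement rule: P(at least two share a birthday) = 1 − P(no two share a birthday)
print(1 - p_all_different)  # ≈ 0.94
```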