8.1 - Probability and Sample Spaces

Probability → a numerical measure of chance. It is the proportion of one specific outcome out of all possible outcomes.

P(A) = nA / n, where nA = number of favourable outcomes and n = total number of outcomes

● Probability must be between 0 and 1 because it is a proportion (part of a whole).
● Experiment: a process that produces a single outcome whose result is not certain (tossing a die, flipping a coin, shuffling a deck).

Counting Rules (a short code check of these formulas appears after the probability rules below)
1. mn rule → if an experiment is performed in two stages with m ways to accomplish stage 1 and n ways to accomplish stage 2, the number of outcomes is m × n.
   a. Example: flipping two coins gives 2 × 2 = 4 outcomes.
2. Combination (sampling without replacement) → the number of distinct groups of r objects that can be formed from n distinct objects.
   a. Taking r at a time
   b. Order doesn't matter (AB = BA, so they count as one choice)
   C(n, r) = n! / (r!(n − r)!)
3. Permutation (counting possible sequences) → the number of ways to arrange n distinct objects, taking them r at a time.
   a. Order matters (AB and BA are distinct arrangements)
   P(n, r) = n! / (n − r)!
4. Classical or Theoretical Approach → if all outcomes are equally likely, the probability of an event is the number of outcomes in the event divided by the total number of outcomes.
   a. Example: rolling two dice and observing the total

8.3 - Approaches to Assigning Probabilities

Relative Frequency or Empirical Approach → the Law of Large Numbers (LLN) states that as the number of repetitions of an experiment increases, the observed relative frequency of an outcome approaches its probability. *Data-driven approach

Subjective Approach → probability is defined by a degree of belief.
Examples:
- The probability of rain based on past observation and current conditions.
- Choosing a CEO out of 4 candidates; an analyst comes up with a subjective probability that each candidate will be chosen.

Interpreting Probability:
● No matter which method is used to assign a probability, it is interpreted in the relative frequency sense.
● Example: in a lottery where 6 numbers are chosen out of 49, the classical approach says the probability that a given number is drawn is 6/49. We interpret this as a long-run relative frequency.
● Why? → as n gets larger, the relative frequency approaches the probability.

Probability and Events

Simple event → an outcome observed on a single repetition of an experiment. Denoted as E with a subscript.
Sample space → the collection of all possible outcomes.
Event of interest → a collection of one or more simple events that interest us (e.g., rolling a die and hoping for an odd number).
Probability tree → a simple method of applying probability rules. Works well for displaying experiments performed in stages.

8.4 Probability Rules (Venn diagram)

Event Relationships →
● A ∪ B (Union: either A or B or both; *the entire Venn diagram)
  Example: throwing two dice, where event A is "the first die shows 1" and event B is "the second die shows 5"
  A = {(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6)}
  B = {(1, 5), (2, 5), (3, 5), (4, 5), (5, 5), (6, 5)}
  A ∪ B = {(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6), (2, 5), (3, 5), (4, 5), (5, 5), (6, 5)}
● A ∩ B (Intersection: both A and B; *the overlapping middle part of the Venn diagram)
● A ∪ B with A ∩ B = ∅ (Union of mutually exclusive events: either A or B, never both)
● A^C (Not A) (Complement)

Probability Rules
1. Probability must be between 0 and 1. If 0, the event never occurs; if 1, it always occurs.
   0 ≤ P(A) ≤ 1
2. The probability of the set of all possible outcomes (the sample space S) must be 1.
   P(S) = 1
3. Rule of Complement: the probability of event A not happening is 1 minus the probability of A happening.
   P(A^C) = 1 − P(A)
4. Addition Rule: used to compute the probability of A or B or both (the union of A and B).
   P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
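The counting rules and probability rules above can be verified numerically. Below is a minimal sketch in Python; the two-dice experiment and the specific events are hypothetical choices made here for illustration (they mirror the union example above), not part of the course material.

```python
# Minimal sketch: checking the counting rules and probability rules 1-4
# on a hypothetical two-dice experiment (standard library only).
from fractions import Fraction
from itertools import product
from math import comb, perm

# mn rule: two stages with m = 6 and n = 6 ways -> 6 x 6 = 36 outcomes.
sample_space = set(product(range(1, 7), repeat=2))
assert len(sample_space) == 6 * 6

# Combination vs. permutation for n = 4 objects taken r = 2 at a time.
assert comb(4, 2) == 6    # order doesn't matter: AB and BA count once
assert perm(4, 2) == 12   # order matters: AB and BA are distinct

# Classical approach: P(event) = favourable outcomes / total outcomes.
def P(event):
    return Fraction(len(event), len(sample_space))

A = {s for s in sample_space if s[0] == 1}   # first die shows 1
B = {s for s in sample_space if s[1] == 5}   # second die shows 5

assert 0 <= P(A) <= 1                        # Rule 1
assert P(sample_space) == 1                  # Rule 2
assert P(sample_space - A) == 1 - P(A)       # Rule 3: complement
assert P(A | B) == P(A) + P(B) - P(A & B)    # Rule 4: addition rule

print(P(A), P(B), P(A | B))                  # 1/6 1/6 11/36
```

Using Fraction keeps every probability exact, so the rule checks hold without any floating-point tolerance.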
8.5 Contingency Tables

Contingency table and probability: events laid out in an r × c contingency table where both variables are qualitative. Each cell cross-classifies one combination of categories, so cell proportions give joint probabilities.

8.6 Conditional Probability and Independence

Conditional probability → the probability of an event given that another event has occurred.

P(A | B) = P(A ∩ B) / P(B), where P(B) > 0
(Probability of A given B: A is what we want, B has already occurred.)

If P(A | B) = P(A), or equivalently P(A ∩ B) = P(A) × P(B), then A and B are independent.

5. Multiplication Rule → the joint occurrence of A and B
   P(A ∩ B) = P(A | B) × P(B)

8.8 Probability Trees

Probability tree → a diagram that displays probabilities along branches, stage by stage (typically three layers of branches).
1. First branch: marginal probability → P(B)
2. Second branch: conditional probability → P(A | B)
3. Third branch: joint probability → P(A ∩ B) = P(B) × P(A | B)
Rule: the probabilities of any set of branches starting at the same node must add to 1.

8.9 Bayes' Rule

Bayes' Rule → a way to formally incorporate new information.

When we are given:
● Prior events A1, A2, ..., Ak
● A new event B
● Prior probabilities P(A1), P(A2), ..., P(Ak)

We want to:
● Revise → P(A1 | B), P(A2 | B), ..., P(Ak | B)

Note:
P(B) = P(B ∩ A1) + P(B ∩ A2) + ... + P(B ∩ Ak)
     = P(A1)P(B | A1) + P(A2)P(B | A2) + ... + P(Ak)P(B | Ak)

Revised (posterior) probability:
P(Ai | B) = P(B ∩ Ai) / [P(A1)P(B | A1) + P(A2)P(B | A2) + ... + P(Ak)P(B | Ak)]
          = P(B ∩ Ai) / [P(B ∩ A1) + P(B ∩ A2) + ... + P(B ∩ Ak)]

Steps (these follow the probability-tree branches; see the sketch below):
1. Define the prior events.
2. Determine the probabilities of the prior events (first branch).
3. Define a new event that could alter the prior probabilities.
4. Determine the conditional probabilities (second branch).
5. Calculate the joint probabilities (third branch).
6. Calculate the revised (posterior) probabilities.
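Below is a minimal sketch of the six steps in Python; the two-supplier defect scenario and every number in it are assumptions made up for illustration, not taken from the notes.

```python
# Minimal sketch of Bayes' Rule following the six steps above.
# The two-supplier scenario and all probabilities are hypothetical.
from fractions import Fraction

# Steps 1-2: prior events and their probabilities P(Ai) (first branch).
prior = {"supplier_1": Fraction(3, 5),
         "supplier_2": Fraction(2, 5)}

# Steps 3-4: new event B = "part is defective"; conditionals P(B | Ai)
# (second branch).
likelihood = {"supplier_1": Fraction(1, 50),
              "supplier_2": Fraction(1, 20)}

# Step 5: joint probabilities P(B ∩ Ai) = P(Ai) * P(B | Ai) (third branch).
joint = {a: prior[a] * likelihood[a] for a in prior}

# Total probability of the new event: P(B) is the sum of the joints.
p_b = sum(joint.values())

# Step 6: revised (posterior) probabilities P(Ai | B) = P(B ∩ Ai) / P(B).
posterior = {a: joint[a] / p_b for a in joint}

assert sum(posterior.values()) == 1   # revised probabilities sum to 1
print(p_b)        # 4/125
print(posterior)  # {'supplier_1': Fraction(3, 8), 'supplier_2': Fraction(5, 8)}
```

Reading the output: after observing a defective part, the revised probabilities shift weight toward supplier 2, whose defect rate is higher, exactly as the revision formula above prescribes.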