Example
A manufacturer has a 5% rate of defects when making thermostats, which are produced in batches of 4. Let's assume that production involves independent events; that is, the failure of any individual thermostat does not affect the probability of failure for any other thermostat.

The number of trials is fixed (4).
The trials are independent (according to the given assumption).
Each trial has two categories of outcome: the thermostat is manufactured successfully or it is a failure.
The probability of failure (0.05) remains constant from thermostat to thermostat.

[Figure: tree diagram listing the 16 possible success/defect (S/D) sequences for a batch of 4, grouped by the number of successes.]

# Successes   Arrangements   Probability
4             1              (0.95)^4            = 0.8145
3             4              4(0.95)^3(0.05)^1   = 0.1715
2             6              6(0.95)^2(0.05)^2   = 0.0135
1             4              4(0.95)^1(0.05)^3   = 0.0005
0             1              (0.05)^4            = 0.0000
                             Total               = 1.0000

Example
Suppose that, in the previous example, batches of size 5 were being examined and the overall defect rate was 12%.

Defects   Prob. of D's   Non-defects   Prob. of ND's   Arrangements
0         (.12)^0        5             (.88)^5         1
1         (.12)^1        4             (.88)^4         5
2         (.12)^2        3             (.88)^3         10
3         (.12)^3        2             (.88)^2         10
4         (.12)^4        1             (.88)^1         5
5         (.12)^5        0             (.88)^0         1

P(0) = 0.527732, P(1) = 0.359817, P(2) = 0.098132, P(3) = 0.013382, P(4) = 0.000912, P(5) = 0.000025

A binomial experiment meets all of the following requirements:
1. The experiment must have a fixed number of trials.
2. The trials must be independent. (The outcome of any individual trial doesn't affect the probabilities in the other trials.)
3. Each trial must have all outcomes classified into two categories (even though the sample space may have more than two simple events).
4. The probabilities must remain constant for each trial.

The Binomial Probability Formula
n denotes the fixed number of trials.
x denotes a specific number of successes in n trials, so that x can be any whole number between 0 and n.
p denotes the probability of success in one of the n trials.
1 - p denotes the probability of failure in one of the n trials.
P(x) denotes the probability of getting exactly x successes among the n trials.

P(x) = \frac{n!}{(n - x)!\, x!} \, p^x (1 - p)^{n - x}

Example
A manufacturer has a 15% rate of defects when making microchips, which are produced in batches of 10. Under a QC inspection plan, batches are considered acceptable if there are fewer than 2 defects per batch. Let X denote the number of defects per batch of 10, and p = 0.15 the defect rate.

P(x) = \frac{10!}{(10 - x)!\, x!} \, (0.15)^x (0.85)^{10 - x}

P(0) = 0.197 and P(1) = 0.347, so the probability of accepting a batch equals P(0) + P(1) = 0.544.

The Binomial Probability Mass Function
Example: A cosmetics salesperson who calls potential customers to sell her products has determined that 30% of her telephone calls result in a sale. Determine the probability distribution for her next three calls.

P(X = 0) = \frac{3!}{3!\, 0!} (0.30)^0 (0.70)^3 = 0.343
P(X = 1) = \frac{3!}{2!\, 1!} (0.30)^1 (0.70)^2 = 0.441
P(X = 2) = \frac{3!}{1!\, 2!} (0.30)^2 (0.70)^1 = 0.189
P(X = 3) = \frac{3!}{0!\, 3!} (0.30)^3 (0.70)^0 = 0.027

Determine the cumulative distribution for her next three calls. The cumulative distribution function (CDF) of any discrete random variable is defined as P(X ≤ k), where k ranges over the sample space.

P(X ≤ 0) = P(X = 0) = 0.343
P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.343 + 0.441 = 0.784
P(X ≤ 2) = P(X ≤ 1) + P(X = 2) = 0.784 + 0.189 = 0.973
P(X ≤ 3) = P(X ≤ 2) + P(X = 3) = 0.973 + 0.027 = 1.0

[Figure: bar charts of the PMF (0.343, 0.441, 0.189, 0.027) and the CDF (0.343, 0.784, 0.973, 1.0) for X = 0, 1, 2, 3.]
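These PMF and CDF values are easy to check numerically. The short Python sketch below is an illustrative addition (not part of the original notes; the function name binomial_pmf is ours); it simply applies the binomial formula with n = 3 and p = 0.30 for the salesperson example.

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for a binomial RV: C(n, x) * p^x * (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 3, 0.30          # three calls, 30% chance of a sale on each call
cdf = 0.0
for x in range(n + 1):
    pmf = binomial_pmf(x, n, p)
    cdf += pmf          # running sum gives the CDF value P(X <= x)
    print(f"P(X = {x}) = {pmf:.3f}   P(X <= {x}) = {cdf:.3f}")
# Reproduces 0.343, 0.441, 0.189, 0.027 and cumulative 0.343, 0.784, 0.973, 1.000
```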
Example: The records of a department store show that 20% of their customers who make a purchase return the merchandise in order to exchange it. What is the probability that, in the next six purchases,

(a) exactly three customers return the merchandise?

P(X = 3) = \frac{6!}{3!\, 3!} (0.20)^3 (0.80)^3 = 0.08192

(b) at least three customers return the merchandise?

P(X ≥ 3) = 1 - [P(0) + P(1) + P(2)] = 1 - (0.262 + 0.393 + 0.246) = 1 - 0.901 = 0.099

Random Variables and Expectations
Take a discrete random variable X with probability mass function P(x_i).

Define the expectation of X as
E\{X\} = \sum_i x_i P(x_i)

Define the variance of X as
Var\{X\} = \sum_i (x_i - E\{X\})^2 P(x_i)

A Bernoulli experiment is an experiment in which the outcome can be classified as a "success" (coded 1) or a "failure" (coded 0), as follows:
P\{X = 0\} = 1 - p
P\{X = 1\} = p
where p, 0 < p < 1, is the probability that a trial is a "success".

Expected Value and Variance of a Bernoulli RV
E\{X\} = \sum_i x_i P(x_i) = 1 \cdot P(X = 1) + 0 \cdot P(X = 0) = 1 \cdot p + 0 \cdot (1 - p) = p

Var\{X\} = \sum_i (x_i - E\{X\})^2 P(x_i) = \sum_i (x_i - p)^2 P(x_i) = (1 - p)^2 p + (0 - p)^2 (1 - p) = p(1 - p)

Rules for Expectations of RVs
If a and b are scalars, then
E\{aX + bY\} = a E\{X\} + b E\{Y\}   (in general)
Var\{aX\} = a^2 Var\{X\}   (in general)
Var\{X + Y\} = Var\{X\} + Var\{Y\}   (for independent variables X and Y)

Expected Value and Variance of a Binomial RV
First note that a binomial RV can be written as the sum of n independent Bernoulli RVs: X = X_1 + X_2 + ... + X_n. Therefore
E\{X\} = E\{X_1 + X_2 + ... + X_n\} = np
Var\{X\} = Var\{X_1 + X_2 + ... + X_n\} = Var\{X_1\} + Var\{X_2\} + ... + Var\{X_n\} = np(1 - p)

Counting alpha-particles (1910)
In this classic set of data, Rutherford and Geiger counted the number of scintillations in 72-second intervals caused by radioactive decay of a quantity of the element polonium. Altogether there were 10097 scintillations during 2608 such intervals. Can we model these data?

Count   Frequency
0       57
1       203
2       383
3       525
4       532
5       408
6       273
7       139
8       45
9       27
10      10
11      4
12      0
13      1
14      1

[Figure: bar chart of the observed frequencies against the number of observed alpha-particles.]

Characteristics of a Poisson Random Variable
The experiment consists of counting the number of times a certain event occurs during a given unit of time or in a given area or volume (or weight, or distance, or any other unit of measurement).
The probability that an event occurs in a given unit of time, area, or volume is the same for all the units.
The number of events that occur in one unit of time, area, or volume is independent of the number that occur in other units.
The mean (or expected) number of events in each unit (of time, area, etc.) is denoted by the Greek letter lambda, \lambda.

The probability mass function for a Poisson RV is given by
P(x) = \frac{\lambda^x e^{-\lambda}}{x!},   x = 0, 1, 2, ...

Here x is the number of observed events (alpha-particles) that occur in a given time (or space) unit, and P(x) is the probability of observing x events in any given unit of time. \lambda remains unchanged regardless of which time unit is being inspected. \lambda is often referred to as the rate at which the events occur, since \lambda itself has units: number of particles per 72-second interval.

Is it legitimate to describe the Rutherford-Geiger data by the Poisson probability function? As scientists we should (1) estimate \lambda, (2) calculate P(x) using our estimate of \lambda, and (3) visually compare the observed counts with the expected counts from the Poisson probability model.

\hat{\lambda} = \frac{10097}{2608} = 3.87 particles per 72-second interval

Evaluate
P(x) = \frac{(3.87)^x e^{-3.87}}{x!},   x = 0, 1, 2, ...

P(0) = e^{-3.87} = 0.02085
P(1) = 3.87 \, e^{-3.87} = 0.08072
P(2) = \frac{(3.87)^2 e^{-3.87}}{2!} = 0.1562
P(3) = \frac{(3.87)^3 e^{-3.87}}{3!} = 0.20149
P(4) = \frac{(3.87)^4 e^{-3.87}}{4!} = 0.19495
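The full observed/expected comparison in the next table can be reproduced with a short calculation. The Python sketch below is an illustrative addition (not part of the original notes; the function name poisson_pmf is ours): it estimates \lambda from the frequency table and scales the Poisson probabilities by the 2608 intervals to get expected counts, matching the table to rounding.

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) = lam^x * e^(-lam) / x! for a Poisson random variable."""
    return lam**x * exp(-lam) / factorial(x)

observed = [57, 203, 383, 525, 532, 408, 273, 139, 45, 27, 10, 4, 0, 1, 1]
n_intervals = sum(observed)                                   # 2608 intervals
total_particles = sum(x * f for x, f in enumerate(observed))  # 10097 scintillations
lam_hat = round(total_particles / n_intervals, 2)             # 3.87, as in the notes

for x, obs in enumerate(observed):
    p = poisson_pmf(x, lam_hat)
    print(f"x = {x:2d}  observed = {obs:3d}  P(x) = {p:.6f}  expected = {p * n_intervals:.3f}")
```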
Xi   Observed   P(x)       Expected = P(x) × 2608
0    57         0.020858   54.399
1    203        0.080722   210.523
2    383        0.156197   407.361
3    525        0.201494   525.496
4    532        0.194945   508.418
5    408        0.150888   393.515
6    273        0.097323   253.817
7    139        0.053805   140.325
8    45         0.026028   67.882
9    27         0.011192   29.189
10   10         0.004331   11.296
11   4          0.001524   3.974
12   0          0.000491   1.282
13   1          0.000146   0.382
14   1          0.000040   0.105

[Figure: observed and expected frequencies plotted against Xi.]

How Many Beds??
The mean number of patients admitted per day to the emergency room of a small hospital is 2.5. If, on a given day, there are only four beds available for new patients, what is the probability that the hospital will not have enough beds to accommodate its newly admitted patients?

Let X be the number of patients admitted. We require
P(X ≥ 5) = 1 - [P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4)]
using
P(x) = \frac{(2.5)^x e^{-2.5}}{x!}

P(0) = e^{-2.5} = 0.0821
P(1) = 2.5 \, P(0) = 0.2052
P(2) = \frac{2.5}{2} \, P(1) = 0.2565
P(3) = \frac{2.5}{3} \, P(2) = 0.2138
P(4) = \frac{2.5}{4} \, P(3) = 0.1336

P(X ≤ 4) = 0.0821 + 0.2052 + 0.2565 + 0.2138 + 0.1336 = 0.8912
P(X ≥ 5) = 1 - 0.8912 = 0.1088

For a Poisson RV we have that
E\{X\} = \sum_{x=0}^{\infty} x P(x) = \lambda
and
Var\{X\} = \sum_{x=0}^{\infty} (x - \lambda)^2 P(x) = \lambda

The Poisson probability distribution also provides a good approximation to a binomial probability distribution with mean \lambda = np when n is large, p is small, and np < 7.

The probability mass function for a Poisson RV is given by
P(x) = \frac{\lambda^x e^{-\lambda}}{x!},   x = 0, 1, 2, ...

Now
\sum_{x=0}^{\infty} P(x) = \sum_{x=0}^{\infty} \frac{\lambda^x e^{-\lambda}}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{\lambda^x}{x!} = e^{-\lambda} e^{\lambda} = 1
so that the sum of the probabilities equals 1, as is the case for any mass function.

E\{X\} = \sum_{x=0}^{\infty} x P(x) = \sum_{x=0}^{\infty} x \frac{\lambda^x e^{-\lambda}}{x!} = 0 + e^{-\lambda} \sum_{x=1}^{\infty} \frac{\lambda^x}{(x - 1)!} = \lambda e^{-\lambda} \sum_{J=0}^{\infty} \frac{\lambda^J}{J!} = \lambda e^{-\lambda} e^{\lambda} = \lambda
so that E\{X\} = \lambda for a Poisson RV.
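As a closing numerical check of the two facts just derived (the Poisson probabilities sum to 1 and E{X} = \lambda), the sketch below, added for illustration only, uses the emergency-room rate \lambda = 2.5 and truncates the infinite sums at a large cutoff; it also recomputes P(X ≥ 5) from the beds example.

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) = lam^x * e^(-lam) / x! for a Poisson random variable."""
    return lam**x * exp(-lam) / factorial(x)

lam = 2.5                                            # admissions per day
probs = [poisson_pmf(x, lam) for x in range(100)]    # truncate the infinite sum

total = sum(probs)                                   # should be (numerically) 1
mean = sum(x * p for x, p in enumerate(probs))       # should equal lam
tail = 1 - sum(probs[:5])                            # P(X >= 5): not enough beds
print(f"sum of P(x) = {total:.6f}, E{{X}} = {mean:.4f}, P(X >= 5) = {tail:.4f}")
# Expected output: sum of P(x) = 1.000000, E{X} = 2.5000, P(X >= 5) = 0.1088
```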