Statistics and Their Distributions: Deriving Sampling Distributions

Example
A certain system consists of two identical components. The lifetime of each component is assumed to have an exponential distribution with parameter λ, and the two components operate independently. The lifetime of the system is the sum of the two component lifetimes. Let X1 and X2 be the lifetimes of the two components, respectively. What can we say about the lifetime of the system, T0 = X1 + X2?

Distribution for Linear Combinations

Proposition
Let X1, X2, ..., Xn have mean values µ1, µ2, ..., µn, respectively, and variances σ1², σ2², ..., σn², respectively.

1. Whether or not the Xi's are independent,
E(a_1X_1 + a_2X_2 + \cdots + a_nX_n) = a_1E(X_1) + a_2E(X_2) + \cdots + a_nE(X_n) = a_1\mu_1 + a_2\mu_2 + \cdots + a_n\mu_n

2. If X1, X2, ..., Xn are independent,
V(a_1X_1 + a_2X_2 + \cdots + a_nX_n) = a_1^2V(X_1) + a_2^2V(X_2) + \cdots + a_n^2V(X_n) = a_1^2\sigma_1^2 + a_2^2\sigma_2^2 + \cdots + a_n^2\sigma_n^2

3. More generally, for any X1, X2, ..., Xn,
V(a_1X_1 + a_2X_2 + \cdots + a_nX_n) = \sum_{i=1}^{n}\sum_{j=1}^{n} a_i a_j \,\mathrm{Cov}(X_i, X_j)

We call a_1X_1 + a_2X_2 + \cdots + a_nX_n a linear combination of the Xi's.

Example (Problem 64)
Suppose your waiting time for a bus in the morning is uniformly distributed on [0, 8], whereas waiting time in the evening is uniformly distributed on [0, 10], independent of morning waiting time.
a. If you take the bus each morning and evening for a week, what is your total expected waiting time?
b. What is the variance of your total waiting time?
c. What are the expected value and variance of the difference between morning and evening waiting times on a given day?
d. What are the expected value and variance of the difference between total morning waiting time and total evening waiting time for a particular week?
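As a quick check on Problem 64, here is a minimal sketch that applies parts 1 and 2 of the proposition directly. It assumes a "week" means 5 commuting days; that count is an assumption for illustration, not something stated in the problem.

```python
# Sketch: applying the linear-combination rules to Problem 64.
# Assumption: a "week" means 5 commuting days (not stated in the problem).

days = 5

# Uniform[0, b] has mean b/2 and variance b**2 / 12.
mu_m, var_m = 8 / 2, 8**2 / 12      # morning wait on [0, 8]
mu_e, var_e = 10 / 2, 10**2 / 12    # evening wait on [0, 10]

# (a) Expected total waiting time: the sum of all individual means.
E_total = days * (mu_m + mu_e)

# (b) Variance of the total: the waits are independent, so variances add.
V_total = days * (var_m + var_e)

# (c) Morning minus evening on one day: E(X - Y) = E(X) - E(Y), and
#     V(X - Y) = V(X) + V(Y) when X and Y are independent.
E_diff_day, V_diff_day = mu_m - mu_e, var_m + var_e

# (d) Total morning minus total evening over the week: the same rules
#     applied to sums of `days` independent waits on each side.
E_diff_week, V_diff_week = days * (mu_m - mu_e), days * (var_m + var_e)

print(E_total, V_total)          # 45.0  68.33...
print(E_diff_day, V_diff_day)    # -1.0  13.66...
print(E_diff_week, V_diff_week)  # -5.0  68.33...
```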
Corollary
E(X1 − X2) = E(X1) − E(X2) and, if X1 and X2 are independent, V(X1 − X2) = V(X1) + V(X2).

Proposition
If X1, X2, ..., Xn are independent, normally distributed rv's (with possibly different means and/or variances), then any linear combination of the Xi's also has a normal distribution. In particular, the difference X1 − X2 between two independent, normally distributed variables is itself normally distributed.

Example (Problem 62)
Manufacture of a certain component requires three different machining operations. Machining time for each operation has a normal distribution, and the three times are independent of one another. The mean values are 15, 30, and 20 min, respectively, and the standard deviations are 1, 2, and 1.5 min, respectively. What is the probability that it takes at most 1 hour of machining time to produce a randomly selected component?
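By the proposition, the total machining time T = X1 + X2 + X3 is normal with mean 15 + 30 + 20 = 65 min and variance 1² + 2² + 1.5² = 7.25 min², so P(T ≤ 60) = Φ((60 − 65)/√7.25). A minimal sketch of that computation (SciPy is used only as a convenient source for the standard normal cdf):

```python
from math import sqrt
from scipy.stats import norm

# Total machining time T = X1 + X2 + X3 for independent normal operation times.
mean_T = 15 + 30 + 20             # 65 minutes
var_T = 1**2 + 2**2 + 1.5**2      # 7.25 minutes squared

# P(T <= 60 minutes), i.e. at most one hour of machining time.
z = (60 - mean_T) / sqrt(var_T)
p = norm.cdf(z)
print(round(z, 3), round(p, 4))   # approx. -1.857 and 0.0317
```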