Section 7.2 Part 2 – Means and Variances of Random Variables

Effects of Linear Transformation

Recall from Chapter 1 the effect of a linear transformation on measures of center (mean and median) and spread (standard deviation and IQR):

x_new = a + bx

Adding the same number, a, to each observation in a data set adds a to measures of ______________________, but does not change _________________________. Multiplying each observation in a data set by a positive number, b, multiplies both measures of center and measures of spread by b.

Rules for Means

The rules for means and variances when working with random variables are similar.

Rule 1: If X is a random variable and a and b are fixed numbers, then
    μ_(a+bX) = a + b·μ_X

Rule 2: If X and Y are random variables, then
    μ_(X+Y) = μ_X + μ_Y

See example 7.10 on p.419. (Numerical checks of these rules appear in the short sketches at the end of this handout.)

Rules for Variances

Adding a constant value, a, to a random variable ________________________, because the mean increases by the ____________________. Multiplying a random variable by a constant, b, increases the ____________________ by the ________________ _________________________.

These two rules combined state the following:

Rule 1: If X is a random variable and a and b are fixed numbers, then
    σ²_(a+bX) = b²·σ²_X

Two random variables X and Y are independent if the value of X has no effect on the value of Y.

Rule 2: If X and Y are independent random variables, then
    σ²_(X+Y) = σ²_X + σ²_Y   and   σ²_(X−Y) = σ²_X + σ²_Y

Why add for the difference of variables?

We buy some cereal. The box says "16 ounces." We know that's not precisely the weight of the cereal in the box, just close; after all, one corn flake more or less would change the weight ever so slightly. Weights of such boxes of cereal vary somewhat, and our uncertainty about the exact weight is expressed by the variance (or standard deviation) of those weights.

Next we get out a bowl that holds 3 ounces of cereal and pour it full. Our pouring skill certainly is not very precise, so the bowl now contains about 3 ounces with some variability (uncertainty). How much cereal is left in the box? Well, we'd assume about 13 ounces. But notice that we're less certain about this remaining weight than we were about the weight before we poured out the bowlful. The variability of the weight in the box has increased even though we subtracted cereal.

Moral: Every time something happens at random, whether it adds to the pile or subtracts from it, uncertainty (read "variance") increases.

See example 7.11 on p.420-421

When variables are not independent, the variance of their sum depends on the correlation between the two variables as well as on their individual variances. For example, let X represent the percent of income spent and Y represent the percent of income saved. When X increases, Y decreases by the same amount. This relationship prevents their variances from adding, since their sum is always 100% and does not vary at all.

The correlation between two random variables is represented by ______________________________. This correlation coefficient has ________________________________.

Rule 3: If X and Y have correlation ρ, then
    σ²_(X+Y) = σ²_X + σ²_Y + 2ρσ_X σ_Y   and   σ²_(X−Y) = σ²_X + σ²_Y − 2ρσ_X σ_Y

This is the general addition rule for variances of random variables. If two variables are independent (not correlated), then the term 2ρσ_X σ_Y will equal zero.

See example 7.12 on p.422-423

Combining Normal Random Variables

If a random variable is normally distributed, we can use its mean and variance to compute probabilities. If we have two normal random variables, the following rule applies: Any linear combination of independent normal random variables is also normally distributed.
If X and Y are independent normal random variables and a and b are any fixed numbers, then aX + bY is also normally distributed.

See example 7.14 on p.424-425
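Sketch 1: checking the rules for means. This is a minimal Python/NumPy sketch, not a textbook example; the distribution for X, the constants a and b, and the normal Y below are all made-up numbers chosen only to show that μ_(a+bX) = a + b·μ_X and μ_(X+Y) = μ_X + μ_Y.

```python
import numpy as np

# Assumed discrete distribution for X (values and probabilities invented for illustration)
x_vals  = np.array([0, 1, 2, 3])
x_probs = np.array([0.1, 0.3, 0.4, 0.2])
mu_x = np.sum(x_vals * x_probs)                  # mu_X = 1.7

# Rule 1 for means: mu_(a+bX) = a + b*mu_X
a, b = 5, 2
mu_linear = np.sum((a + b * x_vals) * x_probs)   # mean of a + bX computed directly
print(mu_linear, a + b * mu_x)                   # both print 8.4

# Rule 2 for means: mu_(X+Y) = mu_X + mu_Y, checked by simulation with an assumed Y
rng = np.random.default_rng(0)
x = rng.choice(x_vals, size=100_000, p=x_probs)
y = rng.normal(10, 2, size=100_000)              # assumed Y with mean 10
print(np.mean(x + y), mu_x + 10)                 # both approximately 11.7
```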
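Sketch 2: checking the rules for variances. The parameters below are assumptions (X and Y are taken to be independent normals with SDs 4 and 3); the point is only that adding a constant leaves the variance unchanged, multiplying by b multiplies the variance by b², and variances of independent variables add for both the sum and the difference.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(50, 4, size=200_000)   # assumed X with sigma_X = 4, so var(X) = 16
y = rng.normal(30, 3, size=200_000)   # assumed Y with sigma_Y = 3, so var(Y) = 9; independent of X

a, b = 7, 2
# Rule 1 for variances: var(a + bX) = b^2 * var(X); the added constant a has no effect on spread
print(np.var(a + b * x), b**2 * np.var(x))   # both approximately 64

# Rule 2 for variances: for independent X and Y, variances ADD for both the sum and the difference
print(np.var(x + y))   # approximately 16 + 9 = 25
print(np.var(x - y))   # also approximately 25, not 16 - 9
```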
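Sketch 3: the cereal-box story as a simulation. The numbers are assumptions (box weights with mean 16 oz and SD 0.2 oz, poured amounts with mean 3 oz and SD 0.1 oz); they just illustrate that the remaining weight is more variable than the full box even though cereal was removed.

```python
import numpy as np

rng = np.random.default_rng(2)
box  = rng.normal(16.0, 0.2, size=100_000)   # assumed box weights: mean 16 oz, SD 0.2 oz
pour = rng.normal(3.0, 0.1, size=100_000)    # assumed poured amounts: mean 3 oz, SD 0.1 oz

remaining = box - pour
print(np.mean(remaining))                    # about 13 oz
print(np.std(box), np.std(remaining))
# SD of the remaining cereal (~0.224) exceeds SD of the full box (0.2):
# sqrt(0.2**2 + 0.1**2) = sqrt(0.05) ≈ 0.224, because variances add even under subtraction
```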
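Sketch 4: the general addition rule (Rule 3) when X and Y are correlated. The correlation ρ = −0.8 and the SDs 4 and 3 are assumed values; the simulated variance of X + Y is compared with σ²_X + σ²_Y + 2ρσ_X σ_Y.

```python
import numpy as np

rng = np.random.default_rng(3)
rho, sigma_x, sigma_y = -0.8, 4.0, 3.0       # assumed correlation and standard deviations
cov = [[sigma_x**2, rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]
x, y = rng.multivariate_normal([0, 0], cov, size=200_000).T

# General addition rule: var(X + Y) = var(X) + var(Y) + 2*rho*sigma_X*sigma_Y
predicted = sigma_x**2 + sigma_y**2 + 2 * rho * sigma_x * sigma_y
print(np.var(x + y), predicted)              # both approximately 16 + 9 - 19.2 = 5.8
```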
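Sketch 5: combining normal random variables. Because aX + bY is itself normal when X and Y are independent normals, a probability for the combination comes straight from a normal calculation. The means, SDs, and the cutoff 55 below are invented for illustration and are not example 7.14.

```python
from math import sqrt
from scipy.stats import norm

# Assumed independent normal random variables
mu_x, sigma_x = 20.0, 4.0
mu_y, sigma_y = 30.0, 3.0

# X + Y is normal with mean mu_X + mu_Y and variance sigma_X^2 + sigma_Y^2
mu_sum = mu_x + mu_y                         # 50
sigma_sum = sqrt(sigma_x**2 + sigma_y**2)    # sqrt(16 + 9) = 5

# Example probability: P(X + Y > 55)
z = (55 - mu_sum) / sigma_sum                # z = 1
print(1 - norm.cdf(z))                       # approximately 0.1587
```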