STT 430, Summer 2006
Lecture 3

Materials Covered: Chapter 4
Suggested Exercises: 4.1, 4.3, 4.4, 4.6, 4.11, 4.15, 4.18, 4.32, 4.46, 4.47, 4.57, 4.64, 4.77, 4.117.

1. Cumulative Distribution Function.

Definition 4.1: Let Y denote any random variable. The cumulative distribution function (CDF) of Y, denoted by F(y), is given by

    F(y) = P(Y \le y), for -\infty < y < \infty.

Example 4.1: Suppose that Y has a binomial distribution with n = 2 and p = 1/2. Find F(y).

Distribution functions for discrete random variables are always step functions, because the CDF increases only at a countable number of points.

Properties of a CDF.

Theorem 4.1: If F(y) is a CDF, then
(1). F(-\infty) = \lim_{y \to -\infty} F(y) = 0.
(2). F(\infty) = \lim_{y \to \infty} F(y) = 1.
(3). F(y) is a nondecreasing function of y.

2. Continuous Random Variables and Their Probability Distributions.

Definition 4.2: Let Y denote a random variable with CDF F(y). Y is said to be continuous if the distribution function F(y) is continuous for -\infty < y < \infty.

Definition 4.3: Let F(y) be the CDF for a continuous random variable Y. Then f(y), given by

    f(y) = \frac{dF(y)}{dy} = F'(y)

wherever the derivative exists, is called the probability density function (PDF) of the random variable Y.

Therefore, F(y) can be written as

    F(y) = \int_{-\infty}^{y} f(t)\,dt.

Graphically, F(y) is the area under the density curve f(t) to the left of y.

Properties of a PDF.

Theorem 4.2: If f(y) is a PDF, then
(1). f(y) \ge 0 for any value of y.
(2). \int_{-\infty}^{\infty} f(y)\,dy = 1.

Example 4.2: Suppose that

    F(y) = 0 for y < 0,  F(y) = y for 0 \le y \le 1,  F(y) = 1 for y > 1.

Find the PDF of Y, and graph it.

Example 4.3: Let Y be a continuous random variable with PDF given by

    f(y) = 3y^2 for 0 \le y \le 1,  f(y) = 0 otherwise.

Find F(y). Graph both f(y) and F(y).

Theorem 4.3: If the random variable Y has density function f(y) and a \le b, then the probability that Y falls in the interval [a, b] is

    P(a \le Y \le b) = \int_a^b f(y)\,dy.

Example 4.4: Given f(y) = cy^2 for 0 \le y \le 2, and f(y) = 0 elsewhere, find the value of c for which f(y) is a valid density function.

Example 4.5: Find P(1 \le Y \le 2) for Example 4.4. Also find P(1 < Y < 2).

3.
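Examples 4.4 and 4.5 can be spot-checked numerically. The sketch below is an illustration only, assuming nothing beyond the Python standard library; the helper `integrate` and the grid size are arbitrary choices, not anything from the text. Solving the normalization condition \int_0^2 c y^2\,dy = 8c/3 = 1 gives c = 3/8, and then P(1 \le Y \le 2) = (1/8)(8 - 1) = 7/8.

```python
# Numerical check for Examples 4.4 and 4.5: f(y) = c*y^2 on [0, 2].

def integrate(f, a, b, n=100_000):
    """Midpoint Riemann sum approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

c = 3 / 8                          # normalizing constant from Example 4.4
f = lambda y: c * y ** 2           # the density on [0, 2]

total = integrate(f, 0, 2)         # should be close to 1 (valid density)
p_1_2 = integrate(f, 1, 2)         # P(1 <= Y <= 2) = 7/8

print(round(total, 6))             # -> 1.0
print(round(p_1_2, 6))             # -> 0.875
```

Because Y is continuous, P(1 < Y < 2) equals P(1 \le Y \le 2): the endpoints carry zero probability.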
Expected Values for Continuous Random Variables.

Definition 4.4: The expected value of a continuous random variable Y is

    E(Y) = \int_{-\infty}^{\infty} y f(y)\,dy,

provided that the integral exists.

Theorem 4.4: Let g(Y) be a function of Y; then the expected value of g(Y) is given by

    E[g(Y)] = \int_{-\infty}^{\infty} g(y) f(y)\,dy,

provided that the integral exists.

Theorem 4.5: Let c be a constant, and let g_1(Y), g_2(Y), \dots, g_k(Y) be functions of a continuous random variable Y. Then the following results hold:
(1). E(c) = c.
(2). E[c g(Y)] = c E[g(Y)].
(3). E[g_1(Y) + g_2(Y) + \dots + g_k(Y)] = E[g_1(Y)] + E[g_2(Y)] + \dots + E[g_k(Y)].

Example 4.6: In Example 4.4 we determined that f(y) = (3/8)y^2 for 0 \le y \le 2, f(y) = 0 elsewhere, is a valid density function. If the random variable Y has this density function, find E(Y) and V(Y).

4. Examples of Continuous Random Variables.

(1). The Uniform Probability Distribution.

Definition 4.5: If \theta_1 < \theta_2, a random variable Y is said to have a uniform probability distribution on the interval (\theta_1, \theta_2), denoted by Y ~ U(\theta_1, \theta_2), if and only if the density function of Y is

    f(y) = \frac{1}{\theta_2 - \theta_1} for \theta_1 \le y \le \theta_2,  f(y) = 0 otherwise.

Theorem 4.6: If Y ~ U(\theta_1, \theta_2), then

    E(Y) = \frac{\theta_1 + \theta_2}{2}  and  V(Y) = \frac{(\theta_2 - \theta_1)^2}{12}.

Proof:

Example 4.7: It is known that, during a given 30-minute period, one customer will arrive at a checkout counter at some time within the 30-minute period. Find the probability that the customer will arrive during the last 5 minutes of the 30-minute period.

(2). The Normal Probability Distribution.

Definition 4.7: A random variable Y is said to have a normal probability distribution, denoted by Y ~ N(\mu, \sigma), if and only if the density function of Y is given by

    f(y) = \frac{1}{\sigma\sqrt{2\pi}} e^{-(y - \mu)^2 / (2\sigma^2)},  -\infty < y < \infty,

where -\infty < \mu < \infty and \sigma > 0.

Theorem 4.7: If Y ~ N(\mu, \sigma), then E(Y) = \mu and V(Y) = \sigma^2.

Example 4.8: Let Z denote a normal random variable with mean 0 and standard deviation 1.
a. Find P(Z > 2).
b. Find P(-2 \le Z \le 2).
c. Find P(0 \le Z \le 1.73).

Theorem: If Y ~ N(\mu, \sigma), then

    Z = \frac{Y - \mu}{\sigma} ~ N(0, 1).
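The uniform and standard normal probabilities asked for in the two examples above can be checked in code. This is a minimal sketch assuming only the Python standard library: the uniform probability is just a ratio of interval lengths, and the standard normal CDF can be written through the error function as \Phi(z) = (1 + \mathrm{erf}(z/\sqrt{2}))/2.

```python
import math

# Uniform arrival time (Example 4.7): Y ~ U(0, 30), so the probability of
# arriving in the last 5 minutes is P(25 <= Y <= 30) = (30 - 25)/(30 - 0).
p_last5 = (30 - 25) / (30 - 0)
print(round(p_last5, 4))             # -> 0.1667

# Standard normal CDF via the error function:
# Phi(z) = P(Z <= z) = (1 + erf(z / sqrt(2))) / 2.
def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(round(1 - phi(2), 4))          # P(Z > 2)          -> 0.0228
print(round(phi(2) - phi(-2), 4))    # P(-2 <= Z <= 2)   -> 0.9545
print(round(phi(1.73) - phi(0), 4))  # P(0 <= Z <= 1.73)
```

The same `phi` helper, combined with the standardization theorem Z = (Y - \mu)/\sigma, handles probabilities for any normal random variable, not just the standard one.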
Example 4.9: The achievement scores for a college entrance examination are normally distributed with mean 75 and standard deviation 10. What fraction of the scores lies between 80 and 90?

(3). The Gamma Probability Distribution.

Definition 4.8: A random variable Y is said to have a gamma probability distribution with parameters \alpha > 0 and \beta > 0 if and only if the density function of Y is given by

    f(y) = \frac{y^{\alpha - 1} e^{-y/\beta}}{\beta^\alpha \Gamma(\alpha)} for y \ge 0,  f(y) = 0 elsewhere,

where \Gamma(\alpha) = \int_0^\infty y^{\alpha - 1} e^{-y}\,dy.

Theorem 4.8: If Y has a gamma distribution with parameters \alpha and \beta, then

    E(Y) = \alpha\beta  and  V(Y) = \alpha\beta^2.

Proof:

Some Special Gamma Distributions.

(i) Chi-square probability distribution: Let v be a positive integer. A random variable Y is said to have a chi-square distribution with v degrees of freedom if and only if Y has a gamma distribution with parameters \alpha = v/2 and \beta = 2. It is easy to see that if Y is a chi-square random variable with v degrees of freedom, then E(Y) = v and V(Y) = 2v.

(ii) Exponential probability distribution: A random variable Y is said to have an exponential distribution with parameter \beta > 0 if and only if Y has a gamma distribution with parameters \alpha = 1 and \beta > 0. It is easy to see that the density function of Y is given by

    f(y) = \frac{1}{\beta} e^{-y/\beta} for y \ge 0,  f(y) = 0 elsewhere,

and E(Y) = \beta, V(Y) = \beta^2.

Example 4.10: Suppose that Y has an exponential probability density function. Show that, if a > 0 and b > 0, then

    P(Y > a + b \mid Y > a) = P(Y > b).

(4). The Beta Probability Distribution (Omit).

5. Tchebysheff's Theorem.

Theorem 4.13: Let Y be a random variable with finite mean \mu and variance \sigma^2. Then for any k > 0,

    P(|Y - \mu| < k\sigma) \ge 1 - \frac{1}{k^2},  or equivalently  P(|Y - \mu| \ge k\sigma) \le \frac{1}{k^2}.

Proof:

Example 4.17: Suppose that experience has shown that the length of time Y (in minutes) required to conduct a periodic maintenance check on a dictating machine follows a gamma distribution with \alpha = 3.1 and \beta = 2. A new maintenance worker takes 22.5 minutes to check the machine. Does this length of time to perform a maintenance check disagree with prior experience?
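Two of the claims above lend themselves to a quick numerical illustration: the memoryless property of Example 4.10 and the Tchebysheff bound applied to Example 4.17. The sketch below assumes only the Python standard library; the values of beta, a, and b in the first part are arbitrary illustrative choices, not values from the text.

```python
import math

# Memoryless property (Example 4.10), using the exponential survival
# function P(Y > y) = exp(-y / beta).
beta, a, b = 2.0, 1.5, 0.7          # arbitrary illustrative values
surv = lambda y: math.exp(-y / beta)

lhs = surv(a + b) / surv(a)         # P(Y > a + b | Y > a)
rhs = surv(b)                       # P(Y > b)
print(abs(lhs - rhs) < 1e-12)       # -> True

# Tchebysheff bound for Example 4.17: gamma with alpha = 3.1, beta = 2,
# so mu = alpha*beta = 6.2 and sigma^2 = alpha*beta^2 = 12.4.
alpha, beta = 3.1, 2.0
mu = alpha * beta
sigma = math.sqrt(alpha * beta ** 2)

k = (22.5 - mu) / sigma             # standard deviations from the mean
bound = 1 / k ** 2                  # P(|Y - mu| >= k*sigma) <= 1/k^2
print(round(k, 2), round(bound, 3)) # -> 4.63 0.047
```

Since 22.5 minutes lies about 4.6 standard deviations above the mean, Tchebysheff's theorem bounds the probability of so extreme a deviation by roughly 0.047, which suggests the observed time does disagree with prior experience.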
6. Other Expected Values of Continuous Random Variables.

(1). k-th moment about the origin: \mu_k' = E(Y^k), k = 1, 2, \dots
(2). k-th moment about the mean: \mu_k = E[(Y - \mu)^k], k = 1, 2, \dots
(3). Moment generating function: m(t) = E(e^{tY}), provided the expectation exists for t in an open interval containing 0.
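For a concrete distribution these quantities can be approximated numerically. The sketch below is an illustration only, assuming Python's standard library; the truncation point, grid size, and the choice t = 0.1 are arbitrary. It estimates the first two moments about the origin and one value of the MGF for an exponential random variable, where the closed forms E(Y^k) = k!\,\beta^k and m(t) = 1/(1 - \beta t) for t < 1/\beta are known.

```python
import math

def expect(g, beta, upper=100.0, n=200_000):
    """Approximate E[g(Y)] for Y ~ exponential(beta) by a midpoint
    Riemann sum of g(y) * f(y) over [0, upper]."""
    h = upper / n
    total = 0.0
    for i in range(n):
        y = (i + 0.5) * h
        total += g(y) * math.exp(-y / beta) / beta
    return total * h

beta = 2.0
m1 = expect(lambda y: y, beta)                    # 1st moment: beta = 2
m2 = expect(lambda y: y ** 2, beta)               # 2nd moment: 2*beta^2 = 8
mgf = expect(lambda y: math.exp(0.1 * y), beta)   # m(0.1) = 1/(1 - 0.2) = 1.25

print(round(m1, 3), round(m2, 3), round(mgf, 3))  # -> 2.0 8.0 1.25
```

Note that V(Y) = \mu_2' - \mu^2 = 8 - 4 = \beta^2, matching the exponential variance stated in Section 4.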