Stat 330 (Spring 2015): Slide set 8
Statistics of R.V.s
Last update: January 16, 2015

Statistics of R.V.s

Expectation: The expected value of a function h(X) of a random variable X
is defined as

    E[h(X)] := Σ_i h(xi) · pX(xi).

The most important version of this is the case h(x) = x:

    E[X] = Σ_i xi · pX(xi) =: μ.

E[X] is usually denoted by the symbol μ.

The expected value of a random variable, E[X], is a measure of the average
value of the possible values of the random variable. We see that it is
actually a weighted average of the values of X, weighted by the point
masses pX(xi).

Example: Toss a Die

Toss a fair die, and denote by X the number of spots on the upturned face.
What is the expected value of X?

The probability mass function of X is pX(i) = 1/6 for all i ∈ {1, 2, 3, 4, 5, 6}.

Therefore, using the definition,

    E(X) = Σ_{i=1}^{6} i · pX(i)
         = 1 · 1/6 + 2 · 1/6 + 3 · 1/6 + 4 · 1/6 + 5 · 1/6 + 6 · 1/6
         = 3.5.

This formula shows that E(X) is also the center of gravity of the masses
pX(xi) placed at the corresponding points xi.

Statistics of R.V.s

Variance: A second common measure for describing a random variable is a
measure of how far its values are spread out. The variance of a random
variable X is defined as

    Var(X) := E[{X − E(X)}^2] = Σ_i {xi − E(X)}^2 · pX(xi).

Var(X) is usually denoted by the symbol σ^2; the variance is measured in
squared units of X.

σ := √Var(X) is called the standard deviation of X; its units are the
original units of the values of X.
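The definitions above can be spot-checked with a few lines of Python (a sketch; the `expectation` helper and the pmf dictionary are our own constructions, not part of the slides):

```python
# E[h(X)] = sum_i h(x_i) * pX(x_i), illustrated with the fair die,
# where pX(i) = 1/6 for i = 1, ..., 6.

def expectation(pmf, h=lambda x: x):
    """Weighted average of h(x) with weights pX(x)."""
    return sum(h(x) * p for x, p in pmf.items())

pmf = {i: 1 / 6 for i in range(1, 7)}  # fair die

mu = expectation(pmf)                               # E(X), h(x) = x
var = expectation(pmf, h=lambda x: (x - mu) ** 2)   # Var(X) by definition

print(mu)   # ≈ 3.5
print(var)  # ≈ 2.9167 (i.e. 35/12)
```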
Example: Toss a Die (continued...)

Toss a fair die, and denote by X the number of spots on the upturned face.
What is the variance of X?

Looking at the definition of Var(X) above, we see that we need to know the
probability mass function and the value of E(X) for this computation.

Recall: the probability mass function of X is pX(i) = 1/6 for all
i ∈ {1, 2, 3, 4, 5, 6}, and E(X) = 3.5. Therefore,

    Var(X) = Σ_{i=1}^{6} (xi − 3.5)^2 · pX(i)
           = 6.25 · 1/6 + 2.25 · 1/6 + 0.25 · 1/6 + 0.25 · 1/6 + 2.25 · 1/6 + 6.25 · 1/6
           = 2.917 (spots^2).

The standard deviation of X is

    σ = √Var(X) = 1.71 (spots).

Some properties of E(X) and Var(X)

Theorem: For two random variables X and Y and two real numbers a, b, the
following holds:

    E(aX + bY) = aE(X) + bE(Y).

Theorem: For a random variable X and a real number a, the following hold:

    (i)  Var(X) = E(X^2) − (E(X))^2
    (ii) Var(aX) = a^2 · Var(X)

Example: Toss the doctored die

Recall the example of the doctored die we talked about earlier. In that
example, Z denoted the number of spots on the upturned face after a toss of
the die, and the pmf of Z was shown to be

    z      1    2    3    4
    p(z)  1/6  1/3  1/3  1/6

Calculate E(Z) and Var(Z):

    E(Z) = 1 · 1/6 + 2 · 1/3 + 3 · 1/3 + 4 · 1/6 = 2.5

    Var(Z) = (1 − 2.5)^2 · 1/6 + (2 − 2.5)^2 · 1/3 + (3 − 2.5)^2 · 1/3 + (4 − 2.5)^2 · 1/6
           = 0.9167

Chebyshev's Inequality

For any positive real number k and random variable X with variance σ^2:

    P(|X − E(X)| ≤ kσ) ≥ 1 − 1/k^2.

Cumulative Distribution Function

Motivation: Very often we are interested in the probability of a whole
range of values, like P(X ≤ 5) or P(4 ≤ X ≤ 16).

The function FX(t) := P(X ≤ t) is called the cumulative distribution
function, or cdf, of X. (Note that in Hofmann's notes a different
terminology is used; she calls it the probability distribution function.
But we will use the terminology used in Baron's book.)

What is the relationship between cdf and pmf, FX and pX? Assume X is a
discrete random variable. Its image can then be written as
{x1, x2, x3, . . .}; we are therefore interested in all xi with xi ≤ t:

    FX(t) = P(X ≤ t) = P({xi | xi ≤ t}) = Σ_{i: xi ≤ t} pX(xi).
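The identities from the variance and Chebyshev slides can be verified numerically for the doctored die (a sketch; the `E` helper and the choice a = 3, k = 2 are our own, not from the slides):

```python
# Check Var(Z) = E(Z^2) - (E(Z))^2, Var(aZ) = a^2 Var(Z),
# and Chebyshev's bound, using the doctored-die pmf.
pmf = {1: 1 / 6, 2: 1 / 3, 3: 1 / 3, 4: 1 / 6}

E = lambda h: sum(h(z) * p for z, p in pmf.items())

mu = E(lambda z: z)                        # E(Z) = 2.5
var_def = E(lambda z: (z - mu) ** 2)       # Var(Z) by definition
var_short = E(lambda z: z * z) - mu ** 2   # shortcut formula (i)
assert abs(var_def - var_short) < 1e-12    # the two agree: ≈ 0.9167

a = 3                                      # property (ii) with a = 3
var_aZ = E(lambda z: (a * z - a * mu) ** 2)
assert abs(var_aZ - a ** 2 * var_def) < 1e-12

k = 2                                      # Chebyshev with k = 2
sigma = var_def ** 0.5
prob_within = sum(p for z, p in pmf.items() if abs(z - mu) <= k * sigma)
assert prob_within >= 1 - 1 / k ** 2       # P(|Z - mu| <= 2σ) >= 3/4
```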
Example of a cdf

Roll a fair die: X = number of spots on the upturned face.

Sample space: Ω = {1, 2, 3, 4, 5, 6}.

The probability mass function of X therefore is

    x       1    2    3    4    5    6
    pX(x)  1/6  1/6  1/6  1/6  1/6  1/6

What is the cumulative distribution function?

Solution: FX(t) = Σ_{i ≤ t} pX(i) = Σ_{i=1}^{⌊t⌋} pX(i) = ⌊t⌋/6, where ⌊t⌋
is the truncated (integer) value of t.

See Examples 3.1 and 3.2 in Baron, and Figure 3.1 for another example.

Example of a cdf (continued...)

Tabulating FX over the intervals between the jumps:

    t      (−∞, 1)  [1, 2)  [2, 3)  [3, 4)  [4, 5)  [5, 6)  [6, ∞)
    FX(t)     0       1/6     2/6     3/6     4/6     5/6      1

[Plot of the cdf for the fair die tossing problem: a step function that
jumps by 1/6 at each of t = 1, . . . , 6.]

Plots of pmf and cdf for the doctored die example

Let Z denote the number of spots on the upturned face after a toss of the
doctored die. The pmf of Z was shown to be

    z      1    2    3    4
    p(z)  1/6  1/3  1/3  1/6

[Plots of the pmf and cdf of Z.]

Example of computing CDFs

Let X be a random variable with the pmf

    x      0    1    2    3
    p(x)  1/8  1/2  1/4  1/8

Evaluate FX(0.5), FX(1), FX(3):

    FX(0.5) = P(X ≤ 0.5) = P(X = 0) = 1/8

    FX(1) = P(X ≤ 1) = P(X = 0 or X = 1) = P(X = 0) + P(X = 1)
          = 1/8 + 1/2 = 5/8

    FX(3) = P(X ≤ 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) = 1.0
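A cdf built directly from a pmf reproduces these values (a sketch; the `cdf` helper is our own construction, not from the slides):

```python
# FX(t) = P(X <= t) = sum of pX(x) over all x <= t.
pmf = {0: 1 / 8, 1: 1 / 2, 2: 1 / 4, 3: 1 / 8}

def cdf(pmf, t):
    """Cumulative distribution function of a discrete r.v. at t."""
    return sum(p for x, p in pmf.items() if x <= t)

print(cdf(pmf, 0.5))  # 0.125  (= 1/8)
print(cdf(pmf, 1))    # 0.625  (= 5/8)
print(cdf(pmf, 3))    # 1.0
```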
Properties of FX

The following properties hold for the cdf FX of a random variable X:

  • 0 ≤ FX(t) ≤ 1 for all t ∈ R
  • FX is nondecreasing, i.e. if x1 ≤ x2 then FX(x1) ≤ FX(x2)
  • lim_{t→−∞} FX(t) = 0 and lim_{t→∞} FX(t) = 1
  • FX(t) is right continuous with respect to t
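These properties can be spot-checked for the fair-die cdf FX(t) = ⌊t⌋/6 (a sketch; the clamped `F` and the grid of test points are our own choices, not from the slides):

```python
import math

def F(t):
    """cdf of a fair die: floor(t)/6, clamped to [0, 1] outside {1,...,6}."""
    return min(max(math.floor(t) / 6, 0.0), 1.0)

ts = [x / 4 for x in range(-20, 41)]  # grid from -5 to 10 in steps of 0.25

assert all(0.0 <= F(t) <= 1.0 for t in ts)               # bounded in [0, 1]
assert all(F(a) <= F(b) for a, b in zip(ts, ts[1:]))     # nondecreasing
assert F(-100) == 0.0 and F(100) == 1.0                  # limiting values
assert all(abs(F(t + 1e-9) - F(t)) < 1e-6 for t in ts)   # right continuity (spot check)
```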