MOMENT GENERATING FUNCTION AND STATISTICAL DISTRIBUTIONS
MOMENT GENERATING FUNCTION
The m.g.f. of a random variable X is defined as

$$M_X(t) = E\big(e^{tX}\big) = \begin{cases} \displaystyle\int_{\text{all } x} e^{tx} f(x)\,dx & \text{if } X \text{ is continuous} \\[6pt] \displaystyle\sum_{\text{all } x} e^{tx} f(x) & \text{if } X \text{ is discrete} \end{cases}$$

for $t \in (-h, h)$ for some $h > 0$.
Properties of m.g.f.
• M(0) = E[1] = 1
• If a r.v. X has m.g.f. M(t), then Y = aX + b has m.g.f. $e^{bt} M(at)$.
• $E(X^k) = M^{(k)}(0)$, where $M^{(k)}$ is the k-th derivative.
• The m.g.f. does not always exist (e.g., the Cauchy distribution).
Example
• Suppose that X has the following p.d.f.:
$$f(x) = x e^{-x} \quad \text{for } x > 0$$
Find the m.g.f., expectation, and variance.
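As a side check (a sketch, not part of the original slides), sympy can compute this m.g.f. and the moments symbolically; the density above is the Gamma(2, 1) density:

```python
import sympy as sp

t, x = sp.symbols('t x', real=True)

# Example p.d.f.: f(x) = x * exp(-x) for x > 0 (the Gamma(2, 1) density)
f = x * sp.exp(-x)

# M(t) = E[e^{tX}] = integral over (0, oo); the integral converges for t < 1
M = sp.simplify(sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo), conds='none'))
# expected: M(t) = 1/(1 - t)**2

# Moments from derivatives at t = 0: E[X^k] = M^{(k)}(0)
EX  = sp.diff(M, t, 1).subs(t, 0)        # E[X]   = 2
EX2 = sp.diff(M, t, 2).subs(t, 0)        # E[X^2] = 6
print(M, EX, sp.simplify(EX2 - EX**2))   # Var(X) = E[X^2] - E[X]^2 = 2
```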
CHARACTERISTIC FUNCTION
The c.h.f. of a random variable X is defined as

$$\phi_X(t) = E\big(e^{itX}\big) = \begin{cases} \displaystyle\int_{\text{all } x} e^{itx} f(x)\,dx & \text{if } X \text{ is continuous} \\[6pt] \displaystyle\sum_{\text{all } x} e^{itx} f(x) & \text{if } X \text{ is discrete} \end{cases}$$

for all real numbers t, where $i = \sqrt{-1}$ (so $i^2 = -1$).
The c.h.f. always exists.
Uniqueness
Theorem:
1. If two r.v.s have m.g.f.s that exist and are equal, then they have the same distribution.
2. If two r.v.s have the same distribution, then they have the same m.g.f. (if it exists).
Similar statements are true for the c.h.f.
Problem
• It is sometimes the case that the exact values of random variables (Y1, Y2, …) cannot be observed, but we can observe whether they are greater than some fixed value. Let Y1, Y2, … be i.i.d. r.v.s and let a be a fixed number on the real line. For i = 1, 2, … define

$$X_i = \begin{cases} 1 & \text{if } Y_i \ge a \\ 0 & \text{if } Y_i < a \end{cases}$$
Problem, cont.
For example, if a manufacturing process produces parts with strength Yi that are tested to see if they can withstand stress a, then Xi denotes whether the strength is at least a or less than a. In such a case, we cannot directly observe the strength Yi of the ith part, but we can observe whether it breaks in the stress test.
Problem, cont.
• Define p = P(Y1 ≥ a), q = 1 − p, and Sn = X1 + X2 + … + Xn. Note that Sn is the number of Y1, …, Yn that are at least a.
i) Define the characteristic function, say $\phi_X(t)$, of a r.v. X.
ii) Find $\phi_{X_1}(t)$.
iii) Find $\phi_{S_n}(t)$.
iv) Find $P(S_n = j)$.
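A hedged sketch of parts ii)–iv) (the slide leaves them as exercises): each $X_i$ is a Bernoulli(p) indicator, and the $X_i$ are independent because the $Y_i$ are, so

$$\phi_{X_1}(t) = E\big(e^{itX_1}\big) = q + p\,e^{it},\qquad
\phi_{S_n}(t) = \prod_{i=1}^{n}\phi_{X_i}(t) = \big(q + p\,e^{it}\big)^{n},$$

which is the c.h.f. of a Bin(n, p) random variable; by the uniqueness theorem,

$$P(S_n = j) = \binom{n}{j}\,p^{j} q^{\,n-j}, \qquad j = 0, 1, \ldots, n.$$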
Other generating functions
• $\log M(t)$ is called the cumulant generating function.
• $\Psi_X(t) = E\big(t^X\big)$ is the factorial moment generating function.
• Note: $M_X(\ln t) = E\big(e^{X \ln t}\big) = E\big(e^{\ln(t^X)}\big) = E\big(t^X\big)$, so there is a simple relation between the m.g.f. and the f.m.g.f.
Other generating functions

$$\frac{d}{dt}\,\Psi_X(t)\Big|_{t=1} = E\big(X t^{X-1}\big)\Big|_{t=1} = E(X)$$

$$\frac{d^2}{dt^2}\,\Psi_X(t)\Big|_{t=1} = E\big(X(X-1)\big)$$

$$\vdots$$

$$\frac{d^k}{dt^k}\,\Psi_X(t)\Big|_{t=1} = E\big(X(X-1)\cdots(X-k+1)\big)$$
Example
• Suppose X has the following p.m.f.:
$$P(X = x) = \frac{e^{-\lambda}\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots;\ \lambda > 0$$
Find the expectation and variance of X.
• Solution: Let's use the factorial m.g.f.
Example

$$\Psi_X(t) = E\big(t^X\big) = \sum_{x=0}^{\infty} \frac{(\lambda t)^x e^{-\lambda}}{x!} = e^{-\lambda} e^{\lambda t} = e^{\lambda(t-1)}$$

$$\Psi_X'(1) = \lambda e^{\lambda(t-1)}\Big|_{t=1} = \lambda = E(X)$$

$$\Psi_X''(1) = \lambda^2 e^{\lambda(t-1)}\Big|_{t=1} = \lambda^2 = E\big(X(X-1)\big)$$

$$\mathrm{Var}(X) = E(X^2) - (E(X))^2 = E\big(X(X-1)\big) + E(X) - (E(X))^2 = \lambda^2 + \lambda - \lambda^2 = \lambda$$
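The same factorial-m.g.f. computation can be verified symbolically (a sketch using sympy; not part of the original slides):

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
x = sp.symbols('x', integer=True, nonnegative=True)

# Factorial m.g.f. of Poisson(lam): Psi(t) = E[t^X]
Psi = sp.simplify(sp.summation(t**x * sp.exp(-lam) * lam**x / sp.factorial(x),
                               (x, 0, sp.oo)))     # expected: exp(lam*(t - 1))

EX    = sp.diff(Psi, t, 1).subs(t, 1)              # E[X] = lam
EXXm1 = sp.diff(Psi, t, 2).subs(t, 1)              # E[X(X-1)] = lam**2
print(Psi, EX, sp.simplify(EXXm1 + EX - EX**2))    # Var(X) = lam
```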
STATISTICAL DISTRIBUTIONS
Recall
• Random variable: A function defined
on the sample space S that
associates a real number with each
outcome in S.
Example
• Toss three coins
• Sample space
S={s1=HHH,s2=HHT,…,s6=THT,s7=TTH,s8=TTT}
• Define X=number of heads:
X(s1)=3,X(s6)=1,X(s8)=0
• Define Y=number of tails before first head:
Y(s1)=0, Y(s6)=1, Y(s8)=3
Random variables
• A random variable is continuous if its CDF,
F(x)=P(X≤x), is continuous.
• A random variable is discrete if its CDF,
F(x)=P(X≤x), is a step function.
• It is possible for a CDF to have continuous
pieces and steps, but we will mostly
concentrate on the previous two bullets in
this course.
SOME DISCRETE
PROBABILITY
DISTRIBUTIONS
Degenerate, Uniform, Bernoulli,
Binomial, Poisson, Negative
Binomial, Geometric,
Hypergeometric
DEGENERATE DISTRIBUTION
• An rv X is degenerate at point k if
$$P(X = x) = \begin{cases} 1, & x = k \\ 0, & \text{o.w.} \end{cases}$$
The cdf:
$$F(x) = P(X \le x) = \begin{cases} 0, & x < k \\ 1, & x \ge k \end{cases}$$
UNIFORM DISTRIBUTION
• A finite number of equally spaced values are equally likely to be observed.
$$P(X = x) = \frac{1}{N}; \quad x = 1, 2, \ldots, N;\ N = 1, 2, \ldots$$
• Example: throw a fair die.
P(X = 1) = … = P(X = 6) = 1/6
$$E(X) = \frac{N+1}{2}; \qquad \mathrm{Var}(X) = \frac{(N+1)(N-1)}{12}$$
BERNOULLI DISTRIBUTION
• A Bernoulli trial is an experiment with only two outcomes. An r.v. X has a Bernoulli(p) distribution if
$$X = \begin{cases} 1 & \text{with probability } p \\ 0 & \text{with probability } 1-p \end{cases}; \quad 0 \le p \le 1$$
$$P(X = x) = p^x (1-p)^{1-x} \quad \text{for } x = 0, 1; \text{ and } 0 \le p \le 1$$
BERNOULLI DISTRIBUTION
• P(X = 0) = 1 − p and P(X = 1) = p
• E(X) = p
• $\mathrm{Var}(X) = E\big(X - E(X)\big)^2 = (0-p)^2(1-p) + (1-p)^2 p = p(1-p)$
BINOMIAL DISTRIBUTION
• Define an rv Y by
Y = total number of successes in n Bernoulli trials, where:
1. There are n trials (n is finite and fixed).
2. Each trial can result in a success or a failure.
3. The probability p of success is the same for all the trials.
4. All the trials of the experiment are independent.
$$Y = \sum_{i=1}^{n} X_i \sim \mathrm{Bin}(n, p) \quad \text{where } X_i \sim \mathrm{Ber}(p).$$
Let $X_i \sim \mathrm{Bin}(n_i, p)$ be independent. Then
$$\sum_{i=1}^{k} X_i \sim \mathrm{Bin}\big(n_1 + n_2 + \cdots + n_k,\ p\big).$$
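The additivity property above follows from the m.g.f. (a sketch, using the binomial m.g.f. $M(t) = [pe^t + (1-p)]^n$ stated a few slides below and the uniqueness theorem):

$$M_{\sum_i X_i}(t) = \prod_{i=1}^{k} M_{X_i}(t) = \prod_{i=1}^{k}\big[p e^t + (1-p)\big]^{n_i} = \big[p e^t + (1-p)\big]^{n_1 + \cdots + n_k},$$

which is the m.g.f. of $\mathrm{Bin}(n_1 + \cdots + n_k,\ p)$.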
BINOMIAL DISTRIBUTION
• Example:
• There are black and white balls in a box. Select a ball and record its color. Put it back and re-pick (sampling with replacement).
• n: number of independent and identical trials
• p: probability of success (e.g. the probability of picking a black ball)
• X: number of successes in n trials
BINOMIAL THEOREM
• For any real numbers x and y and any integer n > 0,
$$(x + y)^n = \sum_{i=0}^{n} \binom{n}{i} x^i y^{\,n-i}$$
BINOMIAL DISTRIBUTION
• If Y ~ Bin(n, p), then
$$P(Y = y) = \binom{n}{y} p^y (1-p)^{n-y}, \quad y = 0, 1, \ldots, n;\ 0 \le p \le 1$$
$$E(Y) = np, \qquad \mathrm{Var}(Y) = np(1-p)$$
$$M_Y(t) = \big[p e^t + (1-p)\big]^n$$
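These formulas are easy to check numerically (a sketch; the values of n and p below are illustrative):

```python
from scipy.stats import binom

n, p = 10, 0.3
Y = binom(n, p)

print(Y.pmf(3))                  # P(Y = 3) = C(10,3) * 0.3**3 * 0.7**7 ≈ 0.2668
print(Y.mean(), n * p)           # E(Y) = np = 3.0
print(Y.var(), n * p * (1 - p))  # Var(Y) = np(1 - p) = 2.1
```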
POISSON DISTRIBUTION
• The number of occurrences in a given time
interval can be modeled by the Poisson
distribution.
• e.g. number of customers to arrive in a bank
between 13:00 and 13:30.
• Another application is in spatial distributions.
• e.g. modeling the distribution of bomb hits in an
area or the distribution of fish in a lake.
POISSON DISTRIBUTION
• If X ~ Poi(λ), then
$$P(X = x) = \frac{e^{-\lambda}\lambda^x}{x!}, \quad x = 0, 1, 2, \ldots;\ \lambda > 0$$
• E(X) = Var(X) = λ
• $M_X(t) = \exp\{\lambda[\exp(t) - 1]\}$
Relationship between Binomial and Poisson

$X \sim \mathrm{Bin}(n, p)$ with m.g.f. $M_X(t) = \left[p e^t + (1-p)\right]^n$. Let $\lambda = np$. Then

$$\lim_{n\to\infty} M_X(t) = \lim_{n\to\infty}\left[p e^t + (1-p)\right]^n = \lim_{n\to\infty}\left[1 + \frac{\lambda\left(e^t - 1\right)}{n}\right]^n = e^{\lambda(e^t - 1)} = M_Y(t),$$

the m.g.f. of Poisson(λ). The limiting distribution of the Binomial r.v. is the Poisson distribution.
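A numerical illustration of this limit (a sketch; λ and the grid of n values below are chosen for illustration): the Bin(n, λ/n) pmf approaches the Poisson(λ) pmf as n grows.

```python
import numpy as np
from scipy.stats import binom, poisson

lam = 3.0
ks = np.arange(10)

for n in (10, 100, 1000):
    b = binom.pmf(ks, n, lam / n)      # Bin(n, p) with p = lam / n
    p = poisson.pmf(ks, lam)           # Poisson(lam)
    print(n, np.abs(b - p).max())      # max pmf gap shrinks as n grows
```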
NEGATIVE BINOMIAL DISTRIBUTION (PASCAL OR WAITING TIME DISTRIBUTION)
• X: number of failures in a sequence of Bernoulli trials before the r-th success occurs; or, alternatively,
• Y: number of Bernoulli trials required to get a fixed number of successes, r.
NEGATIVE BINOMIAL DISTRIBUTION (PASCAL OR WAITING TIME DISTRIBUTION)
X ~ NB(r, p)
$$P(X = x) = \binom{r + x - 1}{x} p^r (1-p)^x; \quad x = 0, 1, \ldots;\ 0 < p \le 1$$
$$M_X(t) = p^r \big[1 - (1-p) e^t\big]^{-r}$$
$$E(X) = \frac{r(1-p)}{p}, \qquad \mathrm{Var}(X) = \frac{r(1-p)}{p^2}$$
NEGATIVE BINOMIAL DISTRIBUTION
• An alternative form of the pdf:
$$P(Y = y) = \binom{y - 1}{r - 1} p^r (1-p)^{y-r}; \quad y = r, r+1, \ldots;\ 0 < p \le 1$$
Note: Y = X + r
$$E(Y) = E(X) + r = \frac{r}{p}, \qquad \mathrm{Var}(Y) = \mathrm{Var}(X) = \frac{r(1-p)}{p^2}$$
GEOMETRIC DISTRIBUTION
• Distribution of the number of Bernoulli trials required to get the first success.
• It is the special case of the Negative Binomial distribution with r = 1.
X ~ Geometric(p)
$$P(X = x) = p(1-p)^{x-1}, \quad x = 1, 2, \ldots$$
$$E(X) = \frac{1}{p}, \qquad \mathrm{Var}(X) = \frac{1-p}{p^2}$$
GEOMETRIC DISTRIBUTION
• Example: If the probability is 0.001 that a light bulb will fail on any given day, then what is the probability that it will last at least 30 days?
• Solution:
$$P(X > 30) = \sum_{x=31}^{\infty} 0.001\,(1 - 0.001)^{x-1} = (0.999)^{30} \approx 0.97$$
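A one-line check of this calculation (a sketch; scipy's geometric distribution has support x = 1, 2, …):

```python
from scipy.stats import geom

# P(X > 30) for X ~ Geometric(p = 0.001)
print(geom.sf(30, 0.001))   # survival function = (0.999)**30 ≈ 0.9704
print(0.999 ** 30)          # closed form, same value
```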
HYPERGEOMETRIC DISTRIBUTION
• A box contains N marbles. Of these, M are red. Suppose that n marbles are drawn randomly from the box without replacement. The distribution of the number of red marbles, X, is X ~ Hypergeometric(N, M, n):
$$P(X = x) = \frac{\dbinom{M}{x}\dbinom{N-M}{n-x}}{\dbinom{N}{n}}, \quad x = 0, 1, \ldots, n$$
It deals with a finite population.
HYPERGEOMETRIC DISTRIBUTION
• As N → ∞, the hypergeometric distribution converges to the binomial.
• In that case, sampling with or without replacement does not make much difference (especially if n/N is small).
MULTIVARIATE DISTRIBUTIONS
EXTENDED HYPERGEOMETRIC DISTRIBUTION
• Suppose that a collection consists of a finite number of items, N, and that there are k+1 different types: M1 of type 1, M2 of type 2, and so on. Select n items at random without replacement, and let Xi be the number of items of type i that are selected. The vector X = (X1, X2, …, Xk) has an extended hypergeometric distribution with joint pdf
$$f(x_1, x_2, \ldots, x_k) = \frac{\dbinom{M_1}{x_1}\dbinom{M_2}{x_2}\cdots\dbinom{M_k}{x_k}\dbinom{M_{k+1}}{x_{k+1}}}{\dbinom{N}{n}}, \quad x_i \in \{0, 1, \ldots, M_i\}$$
where $M_{k+1} = N - \sum_{i=1}^{k} M_i$ and $x_{k+1} = n - \sum_{i=1}^{k} x_i$.
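A hedged numerical sketch (the item counts below are chosen for illustration): scipy implements this joint pmf as the multivariate hypergeometric distribution.

```python
from scipy.stats import multivariate_hypergeom

# N = 10 items of k+1 = 3 types, M = (2, 3, 5); draw n = 4 without replacement
rv = multivariate_hypergeom(m=[2, 3, 5], n=4)

# f(0, 2, 2) = C(2,0) C(3,2) C(5,2) / C(10,4) = 30/210
print(rv.pmf([0, 2, 2]))    # ≈ 0.142857
```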
MULTINOMIAL DISTRIBUTION
• Let E1, E2, …, Ek, Ek+1 be k+1 mutually exclusive and exhaustive events which can occur on any trial of an experiment, with P(Ei) = pi, i = 1, 2, …, k+1. On n independent trials of the experiment, let Xi be the number of occurrences of the event Ei. Then the vector X = (X1, X2, …, Xk) has a multinomial distribution with joint pdf
$$f(x_1, x_2, \ldots, x_k) = \frac{n!}{x_1!\,x_2!\cdots x_{k+1}!}\; p_1^{x_1} p_2^{x_2} \cdots p_{k+1}^{x_{k+1}}, \quad x_i \in \{0, 1, \ldots, n\}$$
where $x_{k+1} = n - \sum_{i=1}^{k} x_i$ and $p_{k+1} = 1 - \sum_{i=1}^{k} p_i$.
MULTINOMIAL DISTRIBUTION
• Experiment involves drawing with
replacement.
• Binomial is a special case of multinomial
with k+1=2
MULTINOMIAL DISTRIBUTION
• Consider the trinomial case for simplicity.
$$f(x_1, x_2) = \frac{n!}{x_1!\,x_2!\,(n - x_1 - x_2)!}\; p_1^{x_1} p_2^{x_2} (1 - p_1 - p_2)^{\,n - x_1 - x_2}, \quad x_i \in \{0, 1, \ldots, n\}$$
$$M(t_1, t_2) = E\big(e^{t_1 X_1 + t_2 X_2}\big) = \sum_{x_1=0}^{n}\ \sum_{x_2=0}^{n-x_1} \frac{n!}{x_1!\,x_2!\,(n - x_1 - x_2)!}\,(p_1 e^{t_1})^{x_1} (p_2 e^{t_2})^{x_2} (1 - p_1 - p_2)^{\,n - x_1 - x_2}$$
$$= \big(p_1 e^{t_1} + p_2 e^{t_2} + 1 - p_1 - p_2\big)^n$$
MULTINOMIAL DISTRIBUTION
• M.g.f. of X1:
$$M(t_1, 0) = E\big(e^{t_1 X_1 + 0 \cdot X_2}\big) = \big(p_1 e^{t_1} + 1 - p_1\big)^n$$
X1 ~ Bin(n, p1)
Similarly, X2 ~ Bin(n, p2).
But Cov(X1, X2) ≠ 0!
Cov(X1, X2) = ? (see the sketch below)
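One way to answer this (a sketch, not worked out on the original slide) is to differentiate the joint m.g.f. above:

$$E(X_1 X_2) = \frac{\partial^2 M(t_1, t_2)}{\partial t_1\,\partial t_2}\bigg|_{t_1 = t_2 = 0} = n(n-1)\,p_1 p_2,$$

so

$$\mathrm{Cov}(X_1, X_2) = E(X_1 X_2) - E(X_1)E(X_2) = n(n-1)p_1 p_2 - (np_1)(np_2) = -n\,p_1 p_2.$$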
MULTINOMIAL DISTRIBUTION
• Example: Suppose we have a bowl with
10 marbles - 2 red marbles, 3 green
marbles, and 5 blue marbles. We
randomly select 4 marbles from the bowl,
with replacement. What is the probability
of selecting 2 green marbles and 2 blue
marbles?
MULTINOMIAL DISTRIBUTION
• n = 4, k+1 = 3, n_red = 0, n_green = 2, n_blue = 2
• p_red = 0.2, p_green = 0.3, p_blue = 0.5
• P = [n! / (n_1! · n_2! · … · n_k!)] · (p_1^{n_1} · p_2^{n_2} · … · p_k^{n_k})
P = [4! / (0! · 2! · 2!)] · [(0.2)^0 · (0.3)^2 · (0.5)^2]
P = 0.135
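A quick numerical check of this example (a sketch using scipy; not part of the original slides):

```python
from scipy.stats import multinomial

# Trinomial: n = 4 draws with replacement, p = (red, green, blue) = (0.2, 0.3, 0.5)
rv = multinomial(4, [0.2, 0.3, 0.5])
print(rv.pmf([0, 2, 2]))    # P(0 red, 2 green, 2 blue) = 0.135
```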
Problem
1. a) Does a distribution exist for which the m.g.f. is
$$M_X(t) = \frac{t}{1-t}\,?$$
If yes, find it. If no, prove it.
b) Does a distribution exist for which the m.g.f. is $M_X(t) = e^t$? If yes, find it. If no, prove it.
Solution
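One possible solution sketch (the original slide leaves this blank):
a) No. Every m.g.f. satisfies $M(0) = E[1] = 1$, but here $M_X(0) = \frac{0}{1-0} = 0 \ne 1$, so $\frac{t}{1-t}$ cannot be an m.g.f.
b) Yes. Take X degenerate at 1, i.e. $P(X = 1) = 1$; then $M_X(t) = E(e^{tX}) = e^t$.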
Problem
2. An appliance store receives a shipment of
30 microwave ovens, 5 of which are
(unknown to the manager) defective. The
store manager selects 4 ovens at random,
without replacement, and tests to see if
they are defective. Let X=number of
defectives found. Calculate the pmf and
cdf of X.
Solution
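One way to compute the pmf and cdf numerically (a sketch, not the original solution; here X is Hypergeometric with N = 30, M = 5 defectives, and n = 4 draws):

```python
import numpy as np
from scipy.stats import hypergeom

# scipy's parameterization: hypergeom(M=population size, n=#defective, N=#drawn)
X = hypergeom(30, 5, 4)

xs = np.arange(0, 5)        # X can be 0, 1, 2, 3, or 4 defectives
print(X.pmf(xs))            # pmf: P(X = x) = C(5,x) C(25,4-x) / C(30,4)
print(X.cdf(xs))            # cdf: P(X <= x)
```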
Problem
3. Let X denote the number of “do loops” in
a Fortran program and Y the number of
runs needed for a novice to debug the
program. Assume that the joint density for
(X,Y) is given in the following table.
Problem

x \ y    1        2        3        4
0        0.059    0.100    0.050    0.001
1        0.093    0.120    0.082    0.003
2        0.065    0.102    0.100    0.010
3        0.050    0.075    0.070    0.020
Problem
a) Find the probability that a randomly selected program contains at most one "do loop" and requires at least two runs to debug the program.
b) Find E[XY].
c) Find the marginal densities for X and Y. Find the mean and variance of both X and Y.
d) Find the probability that a randomly selected program requires at least two runs to debug, given that it contains exactly one "do loop".
e) Find Cov(X,Y). Find the correlation between X and Y. Based on the observed value of the correlation, can you claim that X and Y are not independent? Why?
Solution
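A computational sketch for Problem 3 (the original solution slides are blank); it evaluates parts a)–e) directly from the table above:

```python
import numpy as np

# Joint pmf f(x, y): rows are x = 0..3 ("do loops"), columns are y = 1..4 (runs)
f = np.array([
    [0.059, 0.100, 0.050, 0.001],
    [0.093, 0.120, 0.082, 0.003],
    [0.065, 0.102, 0.100, 0.010],
    [0.050, 0.075, 0.070, 0.020],
])
xs = np.arange(4)               # x = 0, 1, 2, 3
ys = np.arange(1, 5)            # y = 1, 2, 3, 4

# a) P(X <= 1, Y >= 2): rows x = 0, 1 and columns y = 2, 3, 4
print(f[:2, 1:].sum())

# b) E[XY] = sum over all cells of x * y * f(x, y)
EXY = float(np.outer(xs, ys).ravel() @ f.ravel())
print(EXY)

# c) Marginal densities, means, and variances
fx, fy = f.sum(axis=1), f.sum(axis=0)
EX, EY = xs @ fx, ys @ fy
VX, VY = (xs**2) @ fx - EX**2, (ys**2) @ fy - EY**2
print(fx, EX, VX)
print(fy, EY, VY)

# d) P(Y >= 2 | X = 1) = P(X = 1, Y >= 2) / P(X = 1)
print(f[1, 1:].sum() / fx[1])

# e) Cov(X, Y) and correlation; a nonzero correlation implies dependence
cov = EXY - EX * EY
print(cov, cov / np.sqrt(VX * VY))
```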