Discrete Random Variable
Expectation

o X ∼ f(x): the expectation (mean or average) of X is
  μ_X = E(X) = Σ_x x · f(x)
o Suppose H(x) is a function. Then H(X) is a random variable, with
  E[H(X)] = Σ_x H(x) · f(x)

Variance

o Var(X) = σ_X² = E(X²) − [E(X)]², where μ = E(X)
o Standard deviation: σ_X = √Var(X)
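A quick numerical check of these definitions (a minimal sketch assuming NumPy; the support and pmf values below are made up for illustration):

    # E(X), Var(X), and the standard deviation for a small discrete pmf f(x)
    import numpy as np

    x = np.array([0, 1, 2, 3])           # support of X (illustrative)
    f = np.array([0.1, 0.4, 0.3, 0.2])   # pmf values, must sum to 1

    mean = np.sum(x * f)                 # E(X) = sum_x x * f(x)
    var = np.sum(x**2 * f) - mean**2     # Var(X) = E(X^2) - [E(X)]^2
    sd = np.sqrt(var)                    # sigma_X
    print(mean, var, sd)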
Discrete Distribution: PMF, P(X=x)

Binomial distribution:
o (i) n independent trials;
o (ii) each trial has two possible outcomes, success and failure, with probability p and q = 1 − p, respectively;
o (iii) X = the total number of successes in the n trials.
o f(x) = C(n, x) · p^x · (1 − p)^(n−x),  x = 0, 1, 2, …, n, where C(n, x) = n!/(x!(n−x)!)
o E(X) = np
o V(X) = npq, where q = 1 − p
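A sketch of the binomial facts above using scipy.stats.binom (the values of n and p are illustrative, not from the notes):

    from scipy.stats import binom

    n, p = 10, 0.3
    print(binom.pmf(2, n, p))                # f(2) = C(10, 2) p^2 (1-p)^8
    print(binom.mean(n, p), n * p)           # both equal np
    print(binom.var(n, p), n * p * (1 - p))  # both equal npq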
Negative Binomial distribution:
o (i) independent trials;
o (ii) each trial has two possible outcomes, success and failure, with probability p and 1 − p, respectively;
o (iii) X = the number of trials needed to produce the r-th success.
o f(x) = C(x − 1, r − 1) · p^r · (1 − p)^(x−r),  x = r, r + 1, r + 2, …
o E(X) = r/p
o V(X) = r(1 − p)/p²
Poisson Distribution:
o pmf: f(x) = P(X = x) = λ^x e^(−λ) / x!,  x = 0, 1, 2, …
o E(X) = λ,  V(X) = λ
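A sketch of the negative binomial and Poisson pmfs with scipy.stats (r, p, x, and λ are illustrative). Note that scipy's nbinom counts the number of failures before the r-th success, so the "number of trials" X above corresponds to failures + r:

    from math import comb
    from scipy.stats import nbinom, poisson

    r, p, x = 3, 0.4, 7
    print(nbinom.pmf(x - r, r, p))                        # P(X = x), X = trials to r-th success
    print(comb(x - 1, r - 1) * p**r * (1 - p)**(x - r))   # same value from the formula
    print(r / p, r * (1 - p) / p**2)                      # E(X), V(X)

    lam = 2.5
    print(poisson.pmf(4, lam))                  # P(X = 4) = lam^4 e^(-lam) / 4!
    print(poisson.mean(lam), poisson.var(lam))  # both equal lam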
Two Discrete RV’s PMF

p(x,y) = P(X=x and Y=y)

X and Y are independent if p(x, y) = p_X(x) · p_Y(y)

Marginal Probability Mass Functions of X and Y, respectively:
o p_X(x) = Σ_y p(x, y), for each possible value of x.
o p_Y(y) = Σ_x p(x, y), for each possible value of y.

Conditional Probability Mass Function of X given Y = y:
o p_X|Y(x | y) = p(x, y) / p_Y(y)
Conditional Probability of X given Y = y:
o P(X ∈ A | Y = y) = Σ_{x ∈ A} p_X|Y(x | y)

Expectation:
o E[h(X, Y)] = Σ_x Σ_y h(x, y) · p(x, y)
Conditional Expectation of H(X) given Y = y:
o E[H(X) | Y = y] = Σ_x H(x) · p_X|Y(x | y)
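A sketch of the two-variable formulas with a joint pmf stored as a NumPy table (the table entries and h(x, y) = x + y are illustrative):

    import numpy as np

    x_vals = np.array([0, 1])
    y_vals = np.array([0, 1, 2])
    p = np.array([[0.10, 0.20, 0.10],
                  [0.25, 0.20, 0.15]])   # p[i, j] = P(X = x_i, Y = y_j), sums to 1

    p_X = p.sum(axis=1)                  # p_X(x) = sum_y p(x, y)
    p_Y = p.sum(axis=0)                  # p_Y(y) = sum_x p(x, y)
    p_X_given_y1 = p[:, 1] / p_Y[1]      # p_X|Y(x | y = 1)

    h = np.add.outer(x_vals, y_vals)     # h(x, y) = x + y
    E_h = np.sum(h * p)                  # E[h(X, Y)] = sum_x sum_y h(x, y) p(x, y)
    print(p_X, p_Y, p_X_given_y1, E_h)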
Continuous Random Variable

Continuous random variable X takes values in a subinterval of the real line ℝ.
Definition: The probability density function (pdf) of X is a function f(x) such that:
o f(x) ≥ 0;
o ∫_{−∞}^{∞} f(x) dx = 1;
o P(X ∈ D) = ∫_D f(x) dx.
Definition: The (cumulative) distribution function (df or cdf) of X:
o F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(u) du
o cdf = F(x)
o pdf = F′(x) = f(x)
Expectation:
o Definition: X ∼ f(x), the expectation (mean or average) of X is defined by:
  μ_X = E(X) = ∫_{−∞}^{∞} x · f(x) dx
o Var(X) = σ_X² = E(X²) − [E(X)]²
o Standard deviation: σ_X = √Var(X)
Percentile:
o For 0 ≤ p ≤ 1, the (100p)-th percentile η(p) satisfies
  p = F(η(p)) = ∫_{−∞}^{η(p)} f(u) du
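These continuous-case definitions can be checked by numerical integration. A minimal sketch using SciPy, with the Exp(β = 2) density as an illustrative pdf:

    import numpy as np
    from scipy.integrate import quad
    from scipy.optimize import brentq

    beta = 2.0
    f = lambda x: np.exp(-x / beta) / beta              # pdf, x > 0

    mean, _ = quad(lambda x: x * f(x), 0, np.inf)       # E(X)
    m2, _ = quad(lambda x: x**2 * f(x), 0, np.inf)      # E(X^2)
    var = m2 - mean**2                                  # Var(X)
    F = lambda x: quad(f, 0, x)[0]                      # cdf F(x)
    eta_90 = brentq(lambda x: F(x) - 0.90, 0, 100)      # 90th percentile, p = 0.90
    print(mean, var, eta_90)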
Common Continuous Distributions

Uniform Distribution:
o X follows a uniform distribution on [a, b] with pdf:
  f(x) = 1/(b − a),  a ≤ x ≤ b
o E(X) = (a + b)/2,  Var(X) = (b − a)²/12
o F(x) = P(X ≤ x) = 0 for x < a;
  F(x) = ∫_a^x 1/(b − a) du = (x − a)/(b − a) for a ≤ x ≤ b;
  F(x) = 1 for x > b.
Exponential Distribution: X ∼ Exp(β), with rate λ = 1/β
o f(x) = (1/β) exp(−x/β),  x > 0
o F(x) = 1 − exp(−x/β),  x > 0
o E(X) = β,  V(X) = β²
Location Exponential Distribution:
o f(x) = (1/β) exp(−(x − μ)/β),  x > μ
o E(X) = β + μ,  V(X) = β²
Double Exponential Distribution:
o f(x) = (1/(2β)) exp(−|x − μ|/β),  x ∈ (−∞, ∞)
o E(X) = μ,  V(X) = 2β²
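A sketch checking the uniform, exponential, and double exponential facts with scipy.stats (parameter values are illustrative; note scipy's parameterizations uniform(loc=a, scale=b-a), expon(scale=β), laplace(loc=μ, scale=β)):

    import numpy as np
    from scipy.stats import uniform, expon, laplace

    a, b, beta, mu = 1.0, 5.0, 2.0, 0.5
    U = uniform(loc=a, scale=b - a)               # Uniform on [a, b]
    print(U.mean(), (a + b) / 2)                  # E(X) = (a+b)/2
    print(U.var(), (b - a)**2 / 12)               # Var(X) = (b-a)^2/12

    E = expon(scale=beta)                         # Exp(beta), rate 1/beta
    print(E.cdf(3.0), 1 - np.exp(-3.0 / beta))    # F(x) = 1 - exp(-x/beta)

    L = laplace(loc=mu, scale=beta)               # double exponential
    print(L.mean(), L.var(), 2 * beta**2)         # E(X) = mu, V(X) = 2*beta^2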
Normal (or Gaussian) Distribution:
o f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)),  x ∈ (−∞, ∞)
o Standard normal: suppose X ∼ N(μ, σ²); then Z = (X − μ)/σ ∼ N(0, 1), and X = σZ + μ
o P(a ≤ X ≤ b) = Φ((b − μ)/σ) − Φ((a − μ)/σ)
o P(X ≤ a) = Φ((a − μ)/σ),  P(X ≥ b) = 1 − Φ((b − μ)/σ)
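A sketch of standardization with scipy.stats.norm (μ, σ, a, b below are illustrative):

    from scipy.stats import norm

    mu, sigma, a, b = 10.0, 2.0, 9.0, 13.0
    z_a, z_b = (a - mu) / sigma, (b - mu) / sigma
    print(norm.cdf(z_b) - norm.cdf(z_a))          # Phi((b-mu)/sigma) - Phi((a-mu)/sigma)
    print(norm.cdf(b, loc=mu, scale=sigma)
          - norm.cdf(a, loc=mu, scale=sigma))     # same probability, unstandardized
    print(1 - norm.cdf(z_b))                      # P(X >= b)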
Log-normal Distribution
o ln X ∼ N(μ, σ²)
o E(X) = exp(μ + σ²/2)
o Var(X) = exp(2μ + σ²) · (e^(σ²) − 1)
o F(x; μ, σ) = P(X ≤ x) = P(ln X ≤ ln x) = P(Z ≤ (ln x − μ)/σ) = Φ((ln x − μ)/σ)
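A sketch of the log-normal moments; in scipy, lognorm(s=σ, scale=exp(μ)) corresponds to ln X ∼ N(μ, σ²) (μ and σ are illustrative):

    import numpy as np
    from scipy.stats import lognorm, norm

    mu, sigma = 0.5, 0.8
    X = lognorm(s=sigma, scale=np.exp(mu))
    print(X.mean(), np.exp(mu + sigma**2 / 2))                          # E(X)
    print(X.var(), np.exp(2*mu + sigma**2) * (np.exp(sigma**2) - 1))    # Var(X)
    print(X.cdf(2.0), norm.cdf((np.log(2.0) - mu) / sigma))             # Phi((ln x - mu)/sigma)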
Normal Approximation:
o Suppose Y ∼ Bin(n, p). For np ≥ 10 and n(1 − p) ≥ 10, the binomial probability for Y can be approximated by a
  normal distribution with mean np and variance np(1 − p). That is, for integers a and b,
  P(a ≤ Y ≤ b) ≈ P( (a − 0.5 − np)/√(npq) ≤ Z ≤ (b + 0.5 − np)/√(npq) ),  where q = 1 − p
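A sketch comparing the continuity-corrected approximation with the exact binomial probability (n, p, a, b are illustrative and satisfy np ≥ 10, n(1 − p) ≥ 10):

    import numpy as np
    from scipy.stats import binom, norm

    n, p, a, b = 100, 0.3, 25, 35
    q = 1 - p
    exact = binom.cdf(b, n, p) - binom.cdf(a - 1, n, p)   # P(a <= Y <= b)
    z_lo = (a - 0.5 - n*p) / np.sqrt(n*p*q)
    z_hi = (b + 0.5 - n*p) / np.sqrt(n*p*q)
    print(exact, norm.cdf(z_hi) - norm.cdf(z_lo))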
Gamma Distribution
o X ∼ Gamma(α, β)
o f(x) = (1/(β^α Γ(α))) · x^(α−1) · exp(−x/β),  x > 0
o where Γ(α) = ∫_0^∞ u^(α−1) e^(−u) du; for a positive integer k, Γ(k) = (k − 1)!
o E(X) = αβ
o Var(X) = αβ²
Weibull Distribution
o X ∼ Weibull(α, β), characterized by (X/β)^α ∼ Exp(1)
o E(X) = β · Γ(1 + 1/α)
o Var(X) = β² · [ Γ(1 + 2/α) − [Γ(1 + 1/α)]² ]
o P(X ≤ c) = P( (X/β)^α ≤ (c/β)^α ) = 1 − exp(−(c/β)^α)
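A sketch of the gamma and Weibull facts with scipy.stats, which use scale=β and shape α for both (α, β, c are illustrative):

    import numpy as np
    from scipy.stats import gamma, weibull_min
    from scipy.special import gamma as Gamma      # the Gamma function

    alpha, beta, c = 2.5, 1.5, 3.0
    G = gamma(alpha, scale=beta)
    print(G.mean(), alpha * beta)                 # E(X) = alpha*beta
    print(G.var(), alpha * beta**2)               # Var(X) = alpha*beta^2

    W = weibull_min(alpha, scale=beta)
    print(W.mean(), beta * Gamma(1 + 1/alpha))            # E(X)
    print(W.cdf(c), 1 - np.exp(-(c / beta)**alpha))       # P(X <= c)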
Beta Distribution
o X ∼ Beta(α, β)
o f(x) = (1/B(α, β)) · x^(α−1) · (1 − x)^(β−1),  0 ≤ x ≤ 1
o where B(α, β) = Γ(α)Γ(β) / Γ(α + β)
o E(X) = α/(α + β)
o Var(X) = αβ / [ (α + β)² (α + β + 1) ]
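A sketch of the Beta moments with scipy.stats.beta (the shape values are illustrative):

    from scipy.stats import beta as beta_dist

    a, b = 2.0, 5.0
    B = beta_dist(a, b)
    print(B.mean(), a / (a + b))                          # E(X) = alpha/(alpha+beta)
    print(B.var(), a * b / ((a + b)**2 * (a + b + 1)))    # Var(X)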
Two Continuous RV’s
o PDF: f(x, y)
o Probability: P((X, Y) ∈ D) = ∫∫_{(x,y)∈D} f(x, y) dx dy
o Expectation:
  o E[h(X, Y)] = ∫∫ h(x, y) f(x, y) dx dy
  o E(X) = ∫ x · f_X(x) dx
o Variance: Var(X) = E(X²) − [E(X)]²
o Marginal Distribution of X: f_X(x) = ∫ f(x, y) dy

o Conditional pdf of X given Y = y: f_X|Y(x | y) = f(x, y) / f_Y(y)
o Conditional Expectation of H(X) given Y = y: E[H(X) | Y = y] = ∫ H(x) · f_X|Y(x | y) dx
Polar coordinates (r, θ) ←→ Cartesian coordinates (x, y):
o x = r cos θ
o y = r sin θ
o dx dy = r dr dθ
o r² = x² + y²
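A sketch of the joint-pdf computations by numerical integration, using the illustrative density f(x, y) = x + y on the unit square:

    from scipy.integrate import dblquad, quad

    f = lambda x, y: x + y                          # joint pdf on [0,1] x [0,1]

    # P((X, Y) in D) for D = {x <= 0.5, y <= 0.5}; dblquad takes func(y, x)
    prob, _ = dblquad(lambda y, x: f(x, y), 0, 0.5, 0, 0.5)

    # marginal f_X(x) = integral of f(x, y) dy, then E(X) = integral of x f_X(x) dx
    f_X = lambda x: quad(lambda y: f(x, y), 0, 1)[0]
    EX, _ = quad(lambda x: x * f_X(x), 0, 1)
    print(prob, EX)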


Covariance:
o Cov(X, Y) = E(XY) − E(X)E(Y)
Correlation:
o Corr(X, Y) = ρ = Cov(X, Y) / (σ_X · σ_Y)
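A sketch of these definitions on simulated data (a sample-based check with NumPy; the data-generating values are made up):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)
    y = 0.6 * x + rng.normal(scale=0.8, size=1000)

    cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)   # E(XY) - E(X)E(Y)
    rho = cov_xy / (np.std(x) * np.std(y))              # Cov(X,Y) / (sigma_X * sigma_Y)
    print(cov_xy, rho, np.corrcoef(x, y)[0, 1])         # rho matches np.corrcoef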

Independence:
o X and Y are independent if f(x, y) = f_X(x) · f_Y(y)
Multinomial Distribution:
o 1) n independent trials
o 2) each trial has k possible outcomes; the i-th outcome occurs with probability p_i, i = 1, …, k
o 3) X_i = total number of times the i-th outcome appears in the n trials
o p(x_1, …, x_k) = P(X_1 = x_1, X_2 = x_2, …, X_k = x_k) = [ n! / (x_1! ⋯ x_k!) ] · p_1^(x_1) ⋯ p_k^(x_k)
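A sketch of the multinomial pmf with scipy.stats.multinomial (n, the probabilities, and the counts are illustrative):

    from math import factorial
    from scipy.stats import multinomial

    n, probs = 10, [0.2, 0.5, 0.3]
    counts = [2, 5, 3]                                   # must sum to n
    print(multinomial.pmf(counts, n=n, p=probs))
    # same value from n!/(x1!...xk!) * p1^x1 ... pk^xk
    coef = factorial(n) / (factorial(2) * factorial(5) * factorial(3))
    print(coef * 0.2**2 * 0.5**5 * 0.3**3)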
Central Limit Theorem: Suppose X_1, …, X_n are independent, each with mean μ and variance σ². For large n, the sample mean X̄ approximately follows a normal distribution with mean μ and variance σ²/n.
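A quick simulation check of the statement (a sketch: sample means of Exp(1) draws, so μ = 1 and σ² = 1, compared with N(μ, σ²/n)):

    import numpy as np

    rng = np.random.default_rng(0)
    n, reps = 50, 20000
    xbars = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
    print(xbars.mean(), 1.0)       # close to mu
    print(xbars.var(), 1.0 / n)    # close to sigma^2 / n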