Random Variable and its Properties

Prof. Dr. S. K. Bhattacharjee
Department of Statistics
University of Rajshahi
Random Variable and its Properties
 Basic concepts, Discrete and Continuous random variables, Density and Distribution functions, Mathematical expectations and Variance, Functions of a random variable, Marginal and Conditional distributions, Conditional expectation and conditional variance, Conditional independence, Entropy.
Random Variable
 Let Ω be a sample space corresponding to some experiment ξ, and let X: Ω → R be a function from the sample space to the real line. Then X is called a random variable.
 In short, a random variable is a function from a sample space into the real numbers.
 Each outcome of the experiment is thereby assigned a real number: the observed value of the random variable.
Examples of Random Variables
Discrete and Continuous R.V.
Examples of Discrete R.V.
 No. of students attending a class.
 No. of accidents on a particular day.
 No. of mobile calls in an hour.
Examples of Continuous R.V.
 Height of a man.
 Amount of water a person drinks per day.
 Amount of rice a family consumes per month.
Probability Distribution
 Since random variables cannot be predicted exactly, they must be described in the language of probability.
 Every outcome of the experiment has a probability associated with it.
 The pattern of probabilities assigned to the values of the random variable is called the probability distribution of the random variable.
 Two types of distribution: Discrete Distributions and Continuous Distributions.
Probability Distributions of Discrete Random Variables
The probability distribution of a discrete random variable Y is the
set of values that this random variable can take, together with
their associated probabilities. Probabilities are numbers between
zero and one inclusive that always add to one when summed over
all possible values of the random variable.
Example: Toss a fair coin twice, and let Y be the total number of heads that turn up.
The probability distribution of Y is

Possible values of Y   : 0     1     2
Associated probability : 0.25  0.50  0.25
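As a quick illustration (a sketch added here, not part of the original slides), this distribution can be generated by enumerating the four equally likely outcomes of the two tosses:

```python
from itertools import product
from fractions import Fraction

# Enumerate the four equally likely outcomes of two fair coin tosses
# and tabulate the distribution of Y = total number of heads.
dist = {}
for outcome in product("HT", repeat=2):
    y = outcome.count("H")                       # value of Y for this outcome
    dist[y] = dist.get(y, Fraction(0)) + Fraction(1, 4)

for y in sorted(dist):
    print(f"Pr(Y = {y}) = {dist[y]}")            # 1/4, 1/2, 1/4
```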
Probability Distribution and CDF
The mathematical form of this distribution is

Pr\{Y = x\} = \binom{2}{x}\left(\frac{1}{2}\right)^{x}\left(1 - \frac{1}{2}\right)^{2-x}, \qquad x = 0, 1, 2.

The CDF is defined by

F(x) = Pr\{Y \le x\} = \sum_{k \le x} \binom{2}{k}\left(\frac{1}{2}\right)^{k}\left(1 - \frac{1}{2}\right)^{2-k}.
Graph of CDF for Discrete Distribution
Example
Probability Distribution and CDF
Density Functions of Binomial and Poisson Distributions
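The density plots on this slide are not reproduced in the extract; for reference, the standard probability mass functions being plotted are

Binomial(n, p): \quad Pr\{X = x\} = \binom{n}{x} p^{x} (1-p)^{n-x}, \qquad x = 0, 1, \ldots, n,

Poisson(\lambda): \quad Pr\{X = x\} = \frac{e^{-\lambda} \lambda^{x}}{x!}, \qquad x = 0, 1, 2, \ldots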
Examples of Discrete Probability Distribution
 Bernoulli
 Binomial
 Poisson
 Negative Binomial
 Uniform
 Hypergeometric
Density Function for Continuous Random Variables
Density Function and CDF for Continuous R.V.
Density Function for Continuous Random Variables
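The formulas on these slides are images in the extract; the standard definitions they cover are that a continuous random variable X has a density function f satisfying

f(x) \ge 0 \quad \text{and} \quad \int_{-\infty}^{\infty} f(x)\,dx = 1,

with CDF and interval probabilities given by

F(x) = Pr\{X \le x\} = \int_{-\infty}^{x} f(t)\,dt, \qquad Pr\{a \le X \le b\} = \int_{a}^{b} f(x)\,dx.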
Examples of Continuous Distribution
 Uniform
 Normal
 Exponential
 Gamma
 Beta
Normal Probability Distribution
The pdf of this distribution is given by

f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left\{-\frac{(x - \mu)^{2}}{2\sigma^{2}}\right\}, \qquad -\infty < x < \infty,

where μ is the mean and σ² is the variance of the distribution.
Density Curve of Normal Distribution
Example of Normal Distribution
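The worked example and its answers are images in the extract; as a minimal sketch (with hypothetical numbers, not the slide's own), such normal-probability questions are answered numerically like this:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Pr(X <= x) for X ~ N(mu, sigma^2), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical example: heights ~ N(170, 6^2) cm.
# Pr(165 < X < 180) = F(180) - F(165)
p = normal_cdf(180, mu=170, sigma=6) - normal_cdf(165, mu=170, sigma=6)
print(f"Pr(165 < X < 180) = {p:.4f}")
```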
Expectations of Random Variables
Variance of R.V.
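The formulas on these two slides are images in the extract; the standard definitions are

E(X) = \sum_{x} x\,p(x) \quad \text{(discrete)}, \qquad E(X) = \int_{-\infty}^{\infty} x\,f(x)\,dx \quad \text{(continuous)},

Var(X) = E\{(X - E(X))^{2}\} = E(X^{2}) - \{E(X)\}^{2}.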
Example
Consider as an example the New York State Daily Numbers lottery game. The simplest version of the game works as follows: each day a three-digit number between 000 and 999, inclusive, is chosen. You pay $1 to bet on a particular number in a game. If your number comes up, you get back $500 (for a net profit of $499), and if your number doesn't come up, you get nothing (for a net loss of $1). Consider the random process that corresponds to one play of the game, and define the random variable W to be the net winnings from a play. The following table summarizes the properties of the random variable as they relate to the random process:
Outcome of process            Probability      Value of W
Your number comes up          p = 1/1000       W = 499
Your number doesn't come up   p = 999/1000     W = −1
Mean
In the long run, we expect to win 1 time out of every 1000 plays, where we'd win $499, and we expect to lose 999 out of every 1000 plays, where we'd lose $1 (this is just the frequency-theory definition of probabilities). That is, our rate of winnings per play, in the long run, would be $499, .001 of the time, and −$1, .999 of the time, or

(499)(.001) + (−1)(.999) = −.5.

In the long run, we lose 50¢ each time we play. Note that on any one play, we never lose 50¢ (we either win $499 or lose $1); rather, this is saying that if you play the game 10,000 times, you can expect to be roughly $5,000 down at the end. An even better way to look at it is that if 10 million people play the game every day, the state takes in $10 million but can expect to have to give back only about $5 million, for a daily profit of a cool $5 million (this is why states run lotteries!).
Variance
Examples
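The variance formulas on these slides are images in the extract; as a sketch, E(W) and Var(W) for the lottery example can be computed directly from the definitions:

```python
# Distribution of W: win $499 w.p. 1/1000, lose $1 w.p. 999/1000.
values = [499, -1]
probs = [0.001, 0.999]

mean = sum(v * p for v, p in zip(values, probs))              # E(W)
second_moment = sum(v**2 * p for v, p in zip(values, probs))  # E(W^2)
variance = second_moment - mean**2                            # Var(W) = E(W^2) - {E(W)}^2

print(f"E(W)   = {mean}")        # -0.5 (lose 50 cents per play on average)
print(f"Var(W) = {variance}")    # 249.75
```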
Expectations of Functions of Random Variable
Properties of Expectations
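The content of these slides is an image in the extract; the standard properties of expectation usually listed here are

E(c) = c, \qquad E(aX + b) = aE(X) + b, \qquad E(X + Y) = E(X) + E(Y),

Var(aX + b) = a^{2}\,Var(X), \qquad E(XY) = E(X)E(Y) \ \text{when } X \text{ and } Y \text{ are independent}.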
Joint Probability Mass Function
Marginal Distribution
Conditional Distribution
Conditional Variance
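The formulas on these slides are images in the extract; for two discrete random variables X and Y with joint pmf p(x, y), the standard definitions are

p_X(x) = \sum_{y} p(x, y) \quad \text{(marginal)}, \qquad p(x \mid y) = \frac{p(x, y)}{p_Y(y)} \quad \text{(conditional)},

E(X \mid Y = y) = \sum_{x} x\,p(x \mid y), \qquad Var(X \mid Y = y) = E(X^{2} \mid Y = y) - \{E(X \mid Y = y)\}^{2}.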
Marginal and Conditional Distribution
X↓ Y→      0     1     2     P(X)   P(X|Y=1)
0          1/8   1/8   0     1/4    1/4
1          1/8   1/4   1/8   1/2    1/2
2          0     1/8   1/8   1/4    1/4
P(Y)       1/4   1/2   1/4
P(Y|X=2)   0     1/2   1/2
Conditional Mean and Variance
 Marginal distribution of X:
X    : 0    1    2
P(X) : 1/4  1/2  1/4
 Conditional distribution of X given Y = 1:
X        : 0    1    2
P(X|Y=1) : 1/4  1/2  1/4
E(X|Y=1) = 0 × 1/4 + 1 × 1/2 + 2 × 1/4 = 1
E(X²|Y=1) = 0² × 1/4 + 1² × 1/2 + 2² × 1/4 = 3/2
V(X|Y=1) = E(X²|Y=1) − {E(X|Y=1)}² = 3/2 − 1 = 1/2
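As a sketch (not from the slides), these marginal and conditional quantities can be computed mechanically from the joint table above:

```python
from fractions import Fraction as F

# Joint pmf p(x, y) from the table above.
joint = {
    (0, 0): F(1, 8), (0, 1): F(1, 8), (0, 2): F(0),
    (1, 0): F(1, 8), (1, 1): F(1, 4), (1, 2): F(1, 8),
    (2, 0): F(0),    (2, 1): F(1, 8), (2, 2): F(1, 8),
}

# Marginal of Y at y = 1, then the conditional distribution of X given Y = 1.
p_y1 = sum(p for (x, y), p in joint.items() if y == 1)
cond = {x: joint[(x, 1)] / p_y1 for x in (0, 1, 2)}

mean = sum(x * p for x, p in cond.items())       # E(X|Y=1) = 1
second = sum(x**2 * p for x, p in cond.items())  # E(X^2|Y=1) = 3/2
print(cond, mean, second - mean**2)              # V(X|Y=1) = 1/2
```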
Conditional Density and Conditional Expectation of Continuous R.V.
Conditional Density of Continuous R.V.
Conditional Variance of Continuous R.V.
Marginal and Conditional Distribution for Continuous R.V.
Conditional Mean and Variance for Continuous R.V
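The formulas on these slides are images in the extract; for continuous X and Y with joint density f(x, y), the standard definitions are

f(x \mid y) = \frac{f(x, y)}{f_Y(y)}, \qquad E(X \mid Y = y) = \int_{-\infty}^{\infty} x\,f(x \mid y)\,dx,

Var(X \mid Y = y) = \int_{-\infty}^{\infty} x^{2}\,f(x \mid y)\,dx - \{E(X \mid Y = y)\}^{2}.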
Independence of R.V.
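The body of this slide is likewise an image; the standard criterion is that X and Y are independent if and only if the joint density (or pmf) factors as

f(x, y) = f_X(x)\,f_Y(y) \quad \text{for all } x, y,

in which case f(x \mid y) = f_X(x), so knowledge of Y carries no information about X.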
Entropy
For a random variable X with outcomes \{x_1, x_2, \ldots, x_n\}, the Shannon entropy, a measure of uncertainty, is defined as

H(X) = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i), \qquad (1)

where P(x_i) is the probability mass function of outcome x_i.
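A minimal sketch of this computation in Python (not from the slides; log base 2 gives entropy in bits):

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum p_i * log2(p_i), in bits; terms with p_i = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25, 0.5, 0.25]))   # 1.5 bits
print(shannon_entropy([1/6] * 6))           # fair die: log2(6) ≈ 2.585 bits
```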
Conditional Entropy
To understand the meaning of Eq. (1), first consider a set of n possible outcomes (events), each with equal probability p = 1/n; an example would be a fair die with values from 1 to 6. For such a set of outcomes, Eq. (1) reduces to H(X) = \log_2 n, the maximum possible uncertainty. The conditional entropy of two random variables X and Y, taking values x_i and y_j respectively, is defined as

H(Y \mid X) = -\sum_{i,j} p(x_i, y_j) \log_2 p(y_j \mid x_i).
Differential Entropy
Example: Uniform Distribution
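The formulas on these two slides are images in the extract; the standard definition of differential entropy, and its value for the uniform example, are

h(X) = -\int_{-\infty}^{\infty} f(x) \log f(x)\,dx,

and for X uniform on [a, b], f(x) = 1/(b − a) on [a, b], so h(X) = \log(b − a), which, unlike discrete entropy, can be negative (when b − a < 1).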
Relative Entropy and Mutual Information
Joint and Conditional Differential Entropy