Exam #2 Review
Chapter 3: Several Random Variables
Sections
3.1 Two Random Variables
3.2 Conditional Probability--Revisited
3.3 Statistical Independence
3.4 Correlation between Random Variables
3.5 Density Function of the Sum of Two Random Variables
3.6 Probability Density Function of a Function of Two Random Variables
3.7 The Characteristic Function
Chapter 4: Elements of Statistics
Sections 4-1, 4-2 & 4-3, 4-4, 4-5, 4-6, 4-7
Introduction
The Sampling Problem
Unbiased Estimators
Sampling Theory -- The Sample Mean and Variance
Sampling Theorem
Sampling Distributions and Confidence Intervals
Student's T-Distribution
Hypothesis Testing
Curve Fitting and Linear Regression
Correlation Between Two Sets of Data
5. Random Processes
Sections
5-1 Introduction
5-2 Continuous and Discrete Random Processes
5-3 Deterministic and Nondeterministic Random Processes
5-4 Stationary and Nonstationary Random Processes
5-5 Ergodic and Nonergodic Random Processes
5-6 Measurement of Process Parameters
5-7 Smoothing Data with a Moving Window Average
Homework problems
Previous homework problem solutions as examples – Dr. Severance’s Skill Examples
Skills #3
Skills #4
Skills #5
Notes and figures are based on or taken from materials in the course textbook: Probabilistic Methods of Signal and System
Analysis (3rd ed.) by George R. Cooper and Clare D. McGillem; Oxford Press, 1999. ISBN: 0-19-512354-9.
B.J. Bazuin, Spring 2015
3. Several Random Variables
3.1. Two Random Variables
Joint Probability Distribution Function (PDF)
F(x, y) = \Pr(X \le x, Y \le y), \quad -\infty < x < \infty, \; -\infty < y < \infty
- 0 \le F(x, y) \le 1
- F(-\infty, y) = F(x, -\infty) = F(-\infty, -\infty) = 0
- F(\infty, \infty) = 1
- F(x, y) is non-decreasing as either x or y increases
- F(x, \infty) = F_X(x) and F(\infty, y) = F_Y(y)
Joint Probability Density Function (pdf)
f(x, y) = \frac{\partial^2 F(x, y)}{\partial x \, \partial y}, \quad -\infty < x < \infty, \; -\infty < y < \infty
- f(x, y) \ge 0
- \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y) \, dx \, dy = 1

F(x, y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(u, v) \, du \, dv

f_X(x) = \int_{-\infty}^{\infty} f(x, y) \, dy \quad \text{and} \quad f_Y(y) = \int_{-\infty}^{\infty} f(x, y) \, dx

\Pr(x_1 < X \le x_2, \; y_1 < Y \le y_2) = \int_{y_1}^{y_2} \int_{x_1}^{x_2} f(x, y) \, dx \, dy
Expected Values
E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \, f(x, y) \, dx \, dy
Correlation
E[X \, Y] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \, y \, f(x, y) \, dx \, dy
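These joint-density formulas translate directly into numerical approximations. Below is a minimal sketch (an assumed illustrative density, not one from the notes): it discretizes f(x, y) = x + y on the unit square and approximates the total probability, the marginal f_X(x), the mean E[X], and the correlation E[XY] with Riemann sums. The exact values for this example are E[X] = 7/12 and E[XY] = 1/3.

```python
import numpy as np

# Assumed example density (illustration only): f(x, y) = x + y on the unit square.
d = 1e-3
x = np.arange(0.0, 1.0, d) + d / 2             # midpoints for the Riemann sums
y = np.arange(0.0, 1.0, d) + d / 2
X, Y = np.meshgrid(x, y, indexing="ij")
f = X + Y                                      # joint pdf sampled on the grid

print("total probability ~", np.sum(f) * d * d)      # should be ~1.0
f_X = np.sum(f, axis=1) * d                           # marginal: integrate f(x, y) over y
E_X = np.sum(x * f_X) * d                             # ~7/12
E_XY = np.sum(X * Y * f) * d * d                      # ~1/3
print("E[X] ~", E_X, " E[XY] ~", E_XY)
```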
3.2. Conditional Probability--Revisited
F_X(x \mid Y \le y) = \Pr(X \le x \mid M) = \frac{F(x, y)}{\Pr(M)} = \frac{F(x, y)}{F_Y(y)}

F_X(x \mid y_1 < Y \le y_2) = \frac{F(x, y_2) - F(x, y_1)}{F_Y(y_2) - F_Y(y_1)}

f_X(x \mid Y = y) = \frac{f(x, y)}{f_Y(y)} \quad \text{and} \quad f_Y(y \mid X = x) = \frac{f(x, y)}{f_X(x)}

f(x, y) = f(x \mid Y = y) \, f_Y(y) = f(y \mid X = x) \, f_X(x)

For the multiple-random-variable Bayes theorem, use
f(x \mid Y = y) = \frac{f(y \mid X = x) \, f_X(x)}{f_Y(y)} \quad \text{and} \quad f(y \mid X = x) = \frac{f(x \mid Y = y) \, f_Y(y)}{f_X(x)}
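As a quick numerical check of the conditional-density relations, the sketch below continues the assumed density f(x, y) = x + y on the unit square (illustration only): the slice f(x | Y = y0) = f(x, y0)/f_Y(y0) should integrate to one, and the product f(x0 | y0) f_Y(y0) should recover the joint density f(x0, y0).

```python
import numpy as np

# Continuing the assumed example f(x, y) = x + y on the unit square (illustration only).
d = 1e-3
x = np.arange(0.0, 1.0, d) + d / 2
y0 = 0.25                                  # condition on Y = y0
f_slice = x + y0                           # f(x, y0) along the slice
f_Y_y0 = 0.5 + y0                          # marginal f_Y(y) = 1/2 + y for this density
f_cond = f_slice / f_Y_y0                  # f(x | Y = y0) = f(x, y0) / f_Y(y0)

print("conditional pdf integrates to ~1:", np.sum(f_cond) * d)

# Product-rule check at one point: f(x0 | y0) * f_Y(y0) should equal f(x0, y0)
i0 = np.argmin(np.abs(x - 0.6))
print("f(x0|y0)*f_Y(y0) =", f_cond[i0] * f_Y_y0, " f(x0, y0) =", x[i0] + y0)
```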
3.3. Statistical Independence
f(x, y) = f_X(x) \, f_Y(y)
E[X \, Y] = E[X] \, E[Y] = \bar{X} \, \bar{Y}
3.4. Correlation between Random Variables
E[X \, Y] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \, y \, f(x, y) \, dx \, dy

Covariance:
E[(X - E[X])(Y - E[Y])] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \bar{X})(y - \bar{Y}) \, f(x, y) \, dx \, dy
Correlation coefficient or normalized covariance:
\rho = E\left[ \left( \frac{X - \bar{X}}{\sigma_X} \right) \left( \frac{Y - \bar{Y}}{\sigma_Y} \right) \right]
= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \left( \frac{x - \bar{X}}{\sigma_X} \right) \left( \frac{y - \bar{Y}}{\sigma_Y} \right) f(x, y) \, dx \, dy
= \frac{E[X \, Y] - \bar{X} \, \bar{Y}}{\sigma_X \, \sigma_Y}
3.5. Density Function of the Sum of Two Random Variables
Z = X + Y, with X and Y statistically independent.

F_Z(z) = \int_{-\infty}^{\infty} \int_{-\infty}^{z - y} f(x, y) \, dx \, dy = \int_{-\infty}^{\infty} f_Y(y) \left[ \int_{-\infty}^{z - y} f_X(x) \, dx \right] dy

f_Z(z) = \int_{-\infty}^{\infty} f_X(x) \, f_Y(z - x) \, dx = \int_{-\infty}^{\infty} f_Y(y) \, f_X(z - y) \, dy

That is, the density of the sum is the convolution of the two individual densities.
3.6. Probability Density Function of a Function of Two Random Variables
Define the functions as
Z = \phi_1(X, Y) \quad \text{and} \quad W = \phi_2(X, Y)
and the inverse as
X = \psi_1(Z, W) \quad \text{and} \quad Y = \psi_2(Z, W)
The original pdf is f(x, y), with the derived pdf in the transform space being g(z, w).
Then it can be proven that:
\Pr(z_1 < Z \le z_2, \; w_1 < W \le w_2) = \Pr(x_1 < X \le x_2, \; y_1 < Y \le y_2)
or equivalently
\int_{w_1}^{w_2} \int_{z_1}^{z_2} g(z, w) \, dz \, dw = \int_{y_1}^{y_2} \int_{x_1}^{x_2} f(x, y) \, dx \, dy
Using the advanced-calculus theorem for a transformation of coordinates:
g(z, w) = f\big( \psi_1(z, w), \psi_2(z, w) \big) \, |J|, \quad \text{where} \quad J = \det \begin{bmatrix} \partial x / \partial z & \partial x / \partial w \\ \partial y / \partial z & \partial y / \partial w \end{bmatrix}
so that
\int_{w_1}^{w_2} \int_{z_1}^{z_2} g(z, w) \, dz \, dw = \int_{w_1}^{w_2} \int_{z_1}^{z_2} f\big( \psi_1(z, w), \psi_2(z, w) \big) \, |J| \, dz \, dw
If only one output variable is desired (for example, letting W = X), integrate over all w to find the density of Z; do not integrate over z:
g(z) = \int_{-\infty}^{\infty} g(z, w) \, dw = \int_{-\infty}^{\infty} f\big( \psi_1(z, w), \psi_2(z, w) \big) \, |J| \, dw
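A short worked example of the transformation method (standard material, included as a reminder rather than taken verbatim from the notes): let Z = \phi_1(X, Y) = X + Y and W = \phi_2(X, Y) = X. The inverse is X = \psi_1(Z, W) = W and Y = \psi_2(Z, W) = Z - W, so
J = \det \begin{bmatrix} \partial x / \partial z & \partial x / \partial w \\ \partial y / \partial z & \partial y / \partial w \end{bmatrix} = \det \begin{bmatrix} 0 & 1 \\ 1 & -1 \end{bmatrix} = -1, \qquad |J| = 1
Therefore g(z, w) = f(w, z - w), and g(z) = \int_{-\infty}^{\infty} f(w, z - w) \, dw, which reduces to the convolution result of Section 3.5 when X and Y are independent.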
3.7. The Characteristic Function (Material will not be on the exams for ECE 3800)
\phi(u) = E[\exp(j u X)] = \int_{-\infty}^{\infty} f(x) \, \exp(j u x) \, dx

The inverse of the characteristic function is then defined as:
f(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} \phi(u) \, \exp(-j u x) \, du

Computing other moments is performed similarly, where:
\frac{d^n \phi(u)}{du^n} = \int_{-\infty}^{\infty} f(x) \, (j x)^n \, \exp(j u x) \, dx

\left. \frac{d^n \phi(u)}{du^n} \right|_{u = 0} = \int_{-\infty}^{\infty} (j x)^n \, f(x) \, dx = j^n \int_{-\infty}^{\infty} x^n \, f(x) \, dx = j^n \, E[X^n]
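Although this material is not on the exam, a small numerical illustration may help (assumed example): for an exponential pdf with rate \lambda = 2, the characteristic function is \phi(u) = \lambda / (\lambda - j u), so E[X] = \phi'(0)/j = 1/\lambda = 0.5. The sketch approximates \phi(u) by a Riemann sum and \phi'(0) by a central difference.

```python
import numpy as np

lam = 2.0                                        # assumed exponential rate
d = 1e-4
x = np.arange(0.0, 50.0, d) + d / 2
f = lam * np.exp(-lam * x)                       # exponential pdf

def phi(u):
    # phi(u) = integral of f(x) * exp(j u x) dx, approximated by a Riemann sum
    return np.sum(f * np.exp(1j * u * x)) * d

h = 1e-3
dphi0 = (phi(h) - phi(-h)) / (2 * h)             # central-difference estimate of phi'(0)
print("E[X] ~", (dphi0 / 1j).real, "(exact 1/lambda =", 1 / lam, ")")
```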
4. Elements of Statistics
4.1. Introduction
4.2. Sampling Theory--The Sample Mean
Sample Mean
\hat{X} = \frac{1}{n} \sum_{i=1}^{n} X_i, where X_i are random variables with a pdf.

Variance of the sample mean
\operatorname{Var}(\hat{X}) = E[\hat{X}^2] - \bar{X}^2 = \frac{1}{n} \left( \overline{X^2} - \bar{X}^2 \right) = \frac{\sigma_X^2}{n}

Destructive testing or sampling without replacement in a finite population results in another expression:
\operatorname{Var}(\hat{X}) = \frac{\sigma^2}{n} \left( \frac{N - n}{N - 1} \right)
4.3. Sampling Theory--The Sample Variance
S^2 = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \hat{X} \right)^2, \qquad E[S^2] = \frac{n - 1}{n} \, \sigma^2

This is biased. To make it unbiased,
\tilde{S}^2 = \frac{n}{n - 1} \, S^2 = \frac{1}{n - 1} \sum_{i=1}^{n} \left( X_i - \hat{X} \right)^2, \qquad E[\tilde{S}^2] = \frac{n}{n - 1} \, E[S^2] = \sigma^2

When the population is not large (sampling without replacement), the biased and unbiased estimates become
E[S^2] = \frac{N}{N - 1} \cdot \frac{n - 1}{n} \, \sigma^2 \qquad \text{and} \qquad E[\tilde{S}^2] = \frac{n}{n - 1} \, E[S^2] = \frac{N}{N - 1} \, \sigma^2
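A brief simulation sketch of these results (assumed population parameters): it draws many size-n samples, then checks that Var(X-hat) is about sigma^2/n and that the 1/n sample variance is biased by the factor (n-1)/n while the 1/(n-1) version is not. numpy's ddof argument selects the divisor (ddof=0 gives 1/n, ddof=1 gives 1/(n-1)).

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 10, 20000
sigma2 = 4.0                                     # assumed true population variance
samples = rng.normal(loc=5.0, scale=np.sqrt(sigma2), size=(trials, n))

xbar = samples.mean(axis=1)                      # sample means
s2_biased = samples.var(axis=1, ddof=0)          # divisor n
s2_unbiased = samples.var(axis=1, ddof=1)        # divisor n - 1

print("Var(Xhat) ~", xbar.var(), " sigma^2/n =", sigma2 / n)
print("E[S^2] ~", s2_biased.mean(), " (n-1)/n*sigma^2 =", (n - 1) / n * sigma2)
print("E[S~^2] ~", s2_unbiased.mean(), " sigma^2 =", sigma2)
```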
4.4. Sampling Distributions and Confidence Intervals
Gaussian:
f_X(x) = \frac{1}{\sqrt{2 \pi} \, \sigma_X} \exp\left( -\frac{(x - \bar{X})^2}{2 \sigma_X^2} \right)
The Student’s t probability density function (letting v=n-1, the degrees of freedom) is defined as
f_T(t) = \frac{\Gamma\left( \frac{v + 1}{2} \right)}{\sqrt{\pi v} \; \Gamma\left( \frac{v}{2} \right)} \left( 1 + \frac{t^2}{v} \right)^{-\frac{v + 1}{2}}
where \Gamma is the gamma function:
\Gamma(k + 1) = k \, \Gamma(k) for any k, \qquad \Gamma(k + 1) = k! for k an integer, \qquad \Gamma\left( \tfrac{1}{2} \right) = \sqrt{\pi}
Confidence Intervals based on Gaussian and Student’s t
Gaussian:
Z = \frac{\hat{X} - \bar{X}}{\sigma_X / \sqrt{n}}, \qquad \hat{X} - \frac{k \, \sigma_X}{\sqrt{n}} \le \bar{X} \le \hat{X} + \frac{k \, \sigma_X}{\sqrt{n}}

Student's t distribution:
T = \frac{\hat{X} - \bar{X}}{S / \sqrt{n - 1}} = \frac{\hat{X} - \bar{X}}{\tilde{S} / \sqrt{n}}, \qquad \hat{X} - \frac{t \, \tilde{S}}{\sqrt{n}} \le \bar{X} \le \hat{X} + \frac{t \, \tilde{S}}{\sqrt{n}}
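A sketch of both interval computations for an assumed data vector; scipy.stats is used only to obtain the Gaussian and Student's t percentiles, and the intervals themselves follow the expressions above. (The Gaussian case really requires a known sigma; the sample value is substituted here purely for illustration.)

```python
import numpy as np
from scipy import stats

data = np.array([4.9, 5.3, 5.1, 4.7, 5.4, 5.0, 5.2, 4.8])   # assumed measurements
n = data.size
xhat = data.mean()
s_tilde = data.std(ddof=1)                                    # unbiased sample std

conf = 0.95
k = stats.norm.ppf(0.5 + conf / 2)                            # Gaussian percentile
gauss_ci = (xhat - k * s_tilde / np.sqrt(n), xhat + k * s_tilde / np.sqrt(n))
t = stats.t.ppf(0.5 + conf / 2, df=n - 1)                     # Student's t, n-1 dof
t_ci = (xhat - t * s_tilde / np.sqrt(n), xhat + t * s_tilde / np.sqrt(n))
print("Gaussian CI:", gauss_ci)
print("Student t CI:", t_ci)
```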
4.5. Hypothesis Testing
The null hypothesis, H0:
Accept H0 if the computed value "passes" the significance test.
Reject H0 if the computed value "fails" the significance test.
One-tail or two-tail testing.
Using confidence interval value computations.
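A minimal two-tail test sketch at an assumed 5% significance level, reusing the assumed data above: H0 is that the true mean equals 5.0, and H0 is accepted if the t statistic falls inside the critical interval.

```python
import numpy as np
from scipy import stats

data = np.array([4.9, 5.3, 5.1, 4.7, 5.4, 5.0, 5.2, 4.8])   # same assumed data as above
mu0, alpha = 5.0, 0.05                                        # H0: mean = mu0, 5% significance
n = data.size
t_stat = (data.mean() - mu0) / (data.std(ddof=1) / np.sqrt(n))
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)                 # two-tail critical value

print("t =", t_stat, " critical = +/-", t_crit)
print("Accept H0" if abs(t_stat) <= t_crit else "Reject H0")
```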
4.6. Curve Fitting and Linear Regression
For y = a + b \, x:
a = \frac{1}{n} \left( \sum_{i=1}^{n} y_i - b \sum_{i=1}^{n} x_i \right)
and
b = \frac{n \sum_{i=1}^{n} x_i y_i - \left( \sum_{i=1}^{n} x_i \right) \left( \sum_{i=1}^{n} y_i \right)}{n \sum_{i=1}^{n} x_i^2 - \left( \sum_{i=1}^{n} x_i \right)^2}

Or, using 2nd-moment, correlation, and covariance values:
a = \frac{\hat{Y} \, R_{XX} - \hat{X} \, R_{XY}}{R_{XX} - \hat{X}^2} \qquad \text{and} \qquad b = \frac{R_{XY} - \hat{Y} \, \hat{X}}{R_{XX} - \hat{X}^2} = \frac{C_{XY}}{C_{XX}}
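A short sketch with assumed (x, y) data that evaluates the summation formulas for a and b directly and cross-checks the result against numpy.polyfit.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # assumed data for illustration
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])
n = x.size

b = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x ** 2) - np.sum(x) ** 2)
a = (np.sum(y) - b * np.sum(x)) / n
print("a =", a, " b =", b)
print("polyfit check (slope, intercept):", np.polyfit(x, y, 1))
```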
4.7. Correlation Between Two Sets of Data
\bar{X} = E[X] \approx \frac{1}{n} \sum_{i=1}^{n} x_i

E[X^2] = R_{XX} \approx \frac{1}{n} \sum_{i=1}^{n} x_i^2

\sigma_X^2 = C_{XX} \approx \frac{1}{n} \sum_{i=1}^{n} x_i^2 - \left( \frac{1}{n} \sum_{i=1}^{n} x_i \right)^2 = R_{XX} - \bar{X}^2

R_{XY} = E[X \, Y] \approx \frac{1}{n} \sum_{i=1}^{n} x_i y_i

C_{XY} = E[(X - \bar{X})(Y - \bar{Y})] \approx \frac{1}{n} \sum_{i=1}^{n} x_i y_i - \bar{X} \, \bar{Y} = R_{XY} - \bar{X} \, \bar{Y}

r = \rho_{XY} = E\left[ \left( \frac{X - \bar{X}}{\sigma_X} \right) \left( \frac{Y - \bar{Y}}{\sigma_Y} \right) \right] \approx \frac{\frac{1}{n} \sum_{i=1}^{n} x_i y_i - \bar{X} \, \bar{Y}}{\sigma_X \, \sigma_Y} = \frac{C_{XY}}{\sigma_X \, \sigma_Y}
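Using the same assumed (x, y) data as the regression sketch, this snippet computes the 1/n sample moments exactly as written above and compares the resulting r with numpy.corrcoef (the different 1/n versus 1/(n-1) conventions cancel in the ratio).

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])           # same assumed data as the regression sketch
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])
n = x.size

mx, my = x.mean(), y.mean()
Rxy = np.sum(x * y) / n                            # sample R_XY
Cxy = Rxy - mx * my                                # sample C_XY
sx = np.sqrt(np.sum(x ** 2) / n - mx ** 2)         # 1/n-based standard deviations
sy = np.sqrt(np.sum(y ** 2) / n - my ** 2)
r = Cxy / (sx * sy)
print("r =", r, " corrcoef check:", np.corrcoef(x, y)[0, 1])
```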
5. Random Processes
5.1. Introduction
A random process is a collection of time functions and an associated probability
description.
The entire collection of possible time functions is an ensemble, designated as \{x(t)\}, where one particular member of the ensemble, designated as x(t), is a sample function of the ensemble. In general, only one sample function of a random process can be observed!
5.2. Continuous and Discrete Random Processes
5.3. Deterministic and Nondeterministic Random Processes
A nondeterministic random process is one where future values of the ensemble cannot be
predicted from previously observed values.
A deterministic random process is one where one or more observed samples allow all
future values of the sample function to be predicted (or pre-determined).
5.4. Stationary and Nonstationary Random Processes
If all marginal and joint density functions of a process do not depend upon the choice of
the time origin, the process is said to be stationary.
Wide-Sense Stationary: the mean value of any random variable is independent of the choice of time t, and the correlation of two random variables depends only upon the time difference between them.
5.5. Ergodic and Nonergodic Random Processes
The probability-generated (ensemble) means and moments are equivalent to the time-averaged means and moments.
A Process for Determining Stationarity and Ergodicity
a) Find the mean and the 2nd moment based on the probability.
b) Find the time-sample mean and time-sample 2nd moment based on time averaging.
c) If the means or 2nd moments are functions of time, the process is non-stationary.
d) If the time-averaged mean and moments are not equal to the probabilistic mean and moments, or if the process is not stationary, then it is nonergodic.
5.6. Measurement of Process Parameters
The process of taking discrete time measurements of a continuous process in order to
compute the desired statistics and probabilities.
5.7. Smoothing Data with a Moving Window Average
An example of using an FIR filter to smooth high-frequency noise from a slowly time-varying signal of interest.
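A minimal moving-window-average sketch (assumed signal, noise level, and window length): an M-point FIR averaging filter is applied with np.convolve to a noisy, slowly varying sinusoid, and the residual noise standard deviation is compared before and after smoothing.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0.0, 10.0, 0.01)
signal = np.sin(2 * np.pi * 0.2 * t)                 # assumed slowly varying signal of interest
noisy = signal + 0.5 * rng.standard_normal(t.size)   # additive noise

M = 21                                               # assumed moving-average window length
h = np.ones(M) / M                                   # FIR averaging-filter coefficients
smoothed = np.convolve(noisy, h, mode="same")        # moving window average
print("residual std before:", np.std(noisy - signal), " after:", np.std(smoothed - signal))
```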
Example Computations for means and 2nd moment (pdf/pmf based and time average based):
\bar{X} = \int_{-\infty}^{\infty} x \, f_X(x) \, dx \quad \text{and} \quad \overline{X^2} = \int_{-\infty}^{\infty} x^2 \, f_X(x) \, dx

\langle x(t) \rangle = \hat{X} = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t) \, dt \quad \text{and} \quad \langle x(t)^2 \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)^2 \, dt
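A numerical sketch for a commonly used example process (assumed here; not necessarily the one worked in class): X(t) = A cos(wt + Theta) with Theta uniform on [0, 2*pi). The ensemble mean is 0 and the ensemble 2nd moment is A^2/2; the time averages of a single sample function give the same values, which is consistent with an ergodic (and stationary) process.

```python
import numpy as np

A, w = 2.0, 2 * np.pi * 5.0                      # assumed amplitude and angular frequency
rng = np.random.default_rng(1)

# Ensemble averages over the random phase Theta ~ Uniform[0, 2*pi) at a fixed time t0
theta = rng.uniform(0.0, 2 * np.pi, size=200000)
x_t0 = A * np.cos(w * 0.3 + theta)
print("ensemble mean ~", x_t0.mean(), " ensemble 2nd moment ~", (x_t0 ** 2).mean())  # ~0, ~A^2/2

# Time averages of one sample function (a single fixed theta)
t = np.arange(0.0, 100.0, 1e-3)
x = A * np.cos(w * t + theta[0])
print("time mean ~", x.mean(), " time 2nd moment ~", (x ** 2).mean())                # ~0, ~A^2/2
```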
Example Exam Questions
1. [60 pts] Consider a joint density function defined for random variables X and Y as follows:
f_{XY}(x, y) = ???
a. (10) Find the mean of X and Y.
b. (10) Find the 2nd moment of X and Y.
c. (10) Find the variance of X and Y.
d. (10) Find the correlation E[XY].
e. (10) Find the correlation coefficient.
f.i. (5) Are X and Y independent? Why?
f.ii. (5) Are X and Y uncorrelated? Why?
2. [55 pts] Consider the random process X(t) = ???, where ….
a. (10) Find the probabilistic mean E[X(t)].
b. (10) Find the time-average mean \langle x(t) \rangle.
c. (10) Find the probabilistic 2nd moment E[X(t)^2].
d. (10) Find the time-average 2nd moment \langle x(t)^2 \rangle.
e.i. (5) Is the random process stationary? Why or why not?
e.ii. (5) Is the random process ergodic? Why or why not?
e.iii. (5) Is the random process deterministic? Why or why not?
3. [40 pts] Confidence intervals
a. (20) Student's t distribution.
b. (20) Gaussian distribution.
4. [50 pts] Statistical Computations
a. (10) Create the scatter plot.
b. (5) Calculate the means of X and Y.
c. (5) Calculate the variances of X and Y.
d. (5) Calculate the correlation.
e. (5) Calculate the correlation coefficient.
f. (20) Perform a linear regression.
5. [35 pts] Suppose X and Y are independent random variables with the following density functions:
Also, let Z = a \cdot X + b \cdot Y + c.
a. (15) Find the density function f_Z(z).
b. (10) Find the mean of Z. Hint: X and Y are independent.
c. (10) Find the variance of Z. Hint: X and Y are independent.
Additional ideas ….
Remember the ECE student catching the bus example ….
Homework problems that were skipped, but are very similar to those you did …
etc.