EE 640 Applied Random Processes Fall 2002 Outline

Course
Instructor: James Yee
Office: POST 205I, Office Hours: MWF 1:30-2:20 or by appointment,
Phone Number: 956-7576, Email: jyee@spectra.eng.hawaii.edu.
Prerequisites
Math 471 or EE 342 (Probability Theory)
Linear Time Invariant Systems
Fourier Transforms, Laplace Transforms
Grading (Approximate)
HW (quizzes): 20%
Midterm: 30%
Final: 50%
Course Description (see Lecture Summary)
Probability Theory, Random Variables, and Hilbert Spaces
sigma fields, Probability Axioms, Independence, Random Variables, Multivariate Distributions,
Stochastic Convergence, Limit Theorems, Conditional Expectation, Hilbert Spaces, Projection
Theorem.
Random Processes
Stationarity, Ergodicity, Gaussian Random Processes, Markov Processes, Poisson Processes, Second
Order Processes, Power Spectral Density, Linear Time Invariant Systems, Karhunen Loeve
Expansion, Baseband and Narrowband Processes.
Applications
Bayesian Estimation, Linear Mean Square Estimation, Wiener Filters, Kalman Filters, Hypothesis
Testing.
References
H. Stark and J. W. Woods, Probability, Random Processes, and Estimation Theory for
Engineers, Prentice Hall, 2nd Ed., 1994.
G. Grimmett and D. Stirzaker, Probability and Random Processes, Oxford University Press,
2nd Ed., 1992.
A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill, 3rd
Ed., 1991.
J. B. Thomas, An Introduction to Communication Theory and Systems, Springer-Verlag, 1988.
E. Wong, Introduction to Random Processes, Springer-Verlag, 1983.
E. Wong and B. Hajek, Stochastic Processes in Engineering Systems, Springer-Verlag, 1985.
T. Kailath, Lectures on Wiener and Kalman Filtering, Springer-Verlag, 1981.
R. Mortensen, Random Signals and Systems, John Wiley, 1987.
W. Davenport and W. Root, An Introduction to the Theory of Random Signals and Noise,
IEEE Press, 1987.
W. Feller, An Introduction to Probability Theory and Its Applications, Vols. I and II, John
Wiley, 1968 and 1971.
D. Luenberger, Optimization by Vector Space Methods, John Wiley, 1969.
Lecture Summary
Lecture 1: Introduction. Why random processes? Engineering Applications:
communication system (matched filter), ATM switch, information theory.
Lecture 2: Probability triple, sample space, events, set operations, fields, sigma fields,
probability measure, probability axioms, properties, conditional probability, Theorem of
total probability, Bayes rule, examples.
Lecture 3: Conditional probability example (BSC), independence, random variable
definition, distribution function and properties, continuous random variables, pdf,
examples.
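The binary symmetric channel (BSC) example above is a direct application of Bayes' rule. A minimal Python sketch (the function name and parameter values are illustrative, not taken from the course materials):

```python
def bsc_posterior(p1, eps):
    """Binary symmetric channel with crossover probability eps and
    prior P(X=1) = p1.  Given the observation Y=1, Bayes' rule gives
      P(X=1 | Y=1) = P(Y=1|X=1) P(X=1)
                     / [P(Y=1|X=1) P(X=1) + P(Y=1|X=0) P(X=0)]."""
    num = (1.0 - eps) * p1          # correct transmission of a 1
    den = num + eps * (1.0 - p1)    # plus a 0 flipped into a 1
    return num / den

# With a uniform prior the posterior equals the channel reliability:
post = bsc_posterior(p1=0.5, eps=0.1)   # 0.9
```

For a skewed prior the posterior shifts accordingly; e.g. `bsc_posterior(0.9, 0.1)` weighs the prior and the likelihood together.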
Lecture 4: Review RVs and distribution functions, discrete RVs, multiple RVs, joint
distribution function and properties, joint pdf, marginals, conditional density and
distribution functions, independence.
Lecture 5: Functions of one random variable, transformation of multiple RVs, auxiliary
method, distribution function method, examples, simulation of RVs.
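The distribution-function method for simulating RVs can be illustrated with the inverse transform: if U is Uniform(0,1) and F is a continuous CDF, then F^{-1}(U) has distribution F. A short Python sketch for the exponential case (names and parameters are mine, chosen for illustration):

```python
import math
import random

def sample_exponential(lam, n, seed=0):
    """Draw n exponential(lam) samples by the inverse-transform method:
    for U ~ Uniform(0,1), X = -ln(1 - U)/lam has CDF F(x) = 1 - exp(-lam x)."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

samples = sample_exponential(lam=2.0, n=100_000)
mean = sum(samples) / len(samples)   # should be close to 1/lam = 0.5
```

The same recipe works for any RV whose CDF can be inverted in closed form.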
Lecture 6: Examples of functions of RVs, Expectation, definition, properties, and
examples, quiz 1
Lecture 7: Expectation, moments, moment generating function, characteristic function,
expectation of multiple random variables, correlation, covariance, joint characteristic
function, properties, conditional expectation, examples.
Lecture 8: Review representation of RVs, Jointly Gaussian random variables, properties.
Expectation, definition, properties.
Lecture 9: Linear transformations of Gaussian RVs, similarity transformations,
conditional Gaussian RVs, sequences of RVs, convergence, sequence of events and
convergence.
Lecture 10: Sequence of events; almost-sure, quadratic-mean, in-probability, and
in-distribution convergence. Relationships among the modes of convergence, with proofs.
Lecture 11: Proof that almost-sure convergence implies convergence in probability, examples. quiz 2
Lecture 12: Review convergence relations and definitions, mutual convergence, Borel-Cantelli lemma, examples, Weak Law of Large Numbers, Chernoff bounds.
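The weak law and the Chernoff-style bound can be checked numerically. The sketch below (a Monte-Carlo illustration of my own, not from the course) estimates the deviation probability for a Bernoulli(1/2) sample mean and compares it to the Hoeffding form of the bound for bounded summands, 2 exp(-2 n eps^2):

```python
import math
import random

def deviation_prob(n, p=0.5, eps=0.1, trials=2000, seed=1):
    """Monte-Carlo estimate of P(|S_n/n - p| >= eps), where S_n is a
    Binomial(n, p) sum -- the quantity the weak law drives to zero."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(1 for _ in range(n) if rng.random() < p)
        if abs(s / n - p) >= eps:
            hits += 1
    return hits / trials

def hoeffding_bound(n, eps=0.1):
    """Chernoff-style (Hoeffding) tail bound for bounded summands."""
    return 2.0 * math.exp(-2.0 * n * eps * eps)

ns = (25, 100, 400)
empirical = {n: deviation_prob(n) for n in ns}
bounds = {n: hoeffding_bound(n) for n in ns}
# the empirical deviation probabilities fall with n and sit below the bounds
```

The exponential decay in n is the point of the Chernoff argument: the bound shrinks far faster than the Chebyshev rate 1/n.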
Lecture 13: (2 hours) Strong Law of Large Numbers (SLLN), Martingales, Martingale
Convergence Theorem (MCT), proof of SLLN using MCT, renewal theory application,
Central Limit Theorem, Bayesian estimation, minimum mean squared error estimation,
properties of conditional expectation, examples.
Lecture 14: Linear vector spaces, normed vector spaces, inner product spaces, complete
spaces, Banach and Hilbert space properties.
Lecture 15: Proof of projection theorem, normal equations, application to linear
estimation, examples.
Lecture 16: Minimum mean square error estimation, linear MMSE estimation, examples.
Lecture 17: quiz 3
Lecture 18: Estimation of wide-sense Markov processes, random processes, definitions,
characterizations, joint distribution functions, joint pdf/mass functions.
Lecture 19: Separability, mean function, autocorrelation function, examples, Bernoulli
process, Binomial process, sinusoid process with random phase.
Lecture 20: Poisson process, process of independent increments, Gaussian processes,
Brownian motion.
Lecture 21: Gaussian processes, Brownian motion, AR processes. quiz 4
Lecture 22: Markov processes and properties, Chapman-Kolmogorov equations, Markov
chains, state transition diagrams, state transition matrix, stationary processes.
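For a finite Markov chain the Chapman-Kolmogorov equations reduce to matrix multiplication: the n-step transition matrix is the n-th power of the one-step matrix, and P^(m+n) = P^(m) P^(n). A small sketch (the two-state chain here is an example of my own):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# One-step transition matrix of a two-state chain; each row sums to 1.
P = [[0.9, 0.1],
     [0.2, 0.8]]

P2 = mat_mul(P, P)        # 2-step transition probabilities
P3 = mat_mul(P2, P)       # 3-step, composing 2-step with 1-step
P3_alt = mat_mul(P, P2)   # Chapman-Kolmogorov: same result either way
```

Each power remains a stochastic matrix, and iterating the powers is one way to watch the chain approach its stationary distribution.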
Lecture 23: Stationary processes, wide-sense stationary processes, two random processes,
orthogonal processes, independent processes, jointly wide-sense stationary processes,
examples.
Lecture 24: Self-similar and long range dependent processes, second order self-similar
processes and properties, short range dependent processes, second order processes,
properties of autocorrelation function.
Lecture 25: Second-order processes, properties of the autocorrelation function, continuous-process definitions. quiz 5
Lecture 26: Continuous processes, quadratic-mean continuity, second-order calculus,
quadratic-mean derivatives, quadratic-mean integrals.
Lecture 27: Time averages, ergodic processes, mean ergodic, ergodic in correlation,
ergodic in distribution, examples.
Lecture 28: Power spectral density, properties, examples, cross power spectral density,
stationary processes and linear time invariant systems.
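The input-output relation S_y(f) = |H(f)|^2 S_x(f) for a WSS process through an LTI system has a simple time-domain consequence: white noise of variance sigma^2 through an FIR filter h has output power sigma^2 * sum h[k]^2. The sketch below (filter taps and sample sizes are my own illustrative choices) checks this numerically:

```python
import random

def filtered_white_noise_power(h, sigma, n=200_000, seed=5):
    """Pass white Gaussian noise (variance sigma^2) through an FIR
    filter h and estimate the output power, which should approach
    sigma^2 * sum(h[k]^2) -- the time-domain face of
    S_y(f) = |H(f)|^2 S_x(f)."""
    rng = random.Random(seed)
    x = [rng.gauss(0.0, sigma) for _ in range(n)]
    m = len(h)
    y = [sum(h[k] * x[i - k] for k in range(m)) for i in range(m - 1, n)]
    return sum(v * v for v in y) / len(y)

h = [0.5, 0.3, 0.2]
sigma = 1.0
power = filtered_white_noise_power(h, sigma)
predicted = sigma ** 2 * sum(c * c for c in h)   # R_y(0) = sigma^2 * ||h||^2
```

The identity follows by integrating S_y(f) = |H(f)|^2 * sigma^2 over frequency (Parseval), since white noise has a flat spectrum.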
Lecture 29: Exam.
Lecture 30: autocorrelation and power spectral density review, linear time invariant
systems, examples, Karhunen Loeve expansion.
Lecture 31: Karhunen Loeve expansion, Mercer's Theorem, convergence, examples.
Lecture 32: KL expansion examples, KL application to waveform detection in additive
Gaussian noise, baseband processes, narrowband processes. quiz 6
Lecture 33: Hilbert Transform, narrowband processes, quadrature representation,
envelope and phase representation, Gaussian narrowband processes, spectral
factorization.
Lecture 34: Spectral factorization, minimum-phase systems, innovations filter, whitening
filter, Paley-Wiener condition, discrete-process factorization, state-space representation.
Lecture 35: (11/20) Matched Filter, unconstrained solution using Schwarz Inequality,
optimal filter, signal to noise power ratio, examples: white noise and nonwhite noise.
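The unconstrained matched-filter result for white noise can be verified by simulation: correlating the received waveform against the signal itself maximizes the output SNR, which equals the signal energy over the noise variance. A Monte-Carlo sketch (signal shape, noise level, and trial count are my own illustrative choices):

```python
import random

def matched_filter_snr(s, sigma, trials=5000, seed=3):
    """For r = s + white Gaussian noise, the filter matched to s
    (inner product with s) gives peak signal output s.s and
    noise-only output variance sigma^2 * s.s, so the output
    SNR = (s.s)^2 / (sigma^2 * s.s) = s.s / sigma^2."""
    energy = sum(x * x for x in s)           # s.s, the signal energy
    rng = random.Random(seed)
    noise_outs = []
    for _ in range(trials):
        n = [rng.gauss(0.0, sigma) for _ in s]
        noise_outs.append(sum(si * ni for si, ni in zip(s, n)))
    noise_var = sum(v * v for v in noise_outs) / trials
    return energy ** 2 / noise_var, energy / sigma ** 2

s = [1.0, 2.0, 3.0, 2.0, 1.0]
empirical, theoretical = matched_filter_snr(s, sigma=0.5)
```

The Schwarz-inequality argument shows no other filter can beat this: |<h, s>|^2 <= ||h||^2 ||s||^2, with equality when h is proportional to s.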
Lecture 36: Realizable Matched Filter, integral equation, frequency domain solution
using spectral factorization and additive decompositions, examples.
Lecture 37: Matched Filter example, comparison between unconstrained and realizable,
Wiener Filter introduction. quiz 7
Lecture 38: Wiener Filter, principle of orthogonality, integral equation, unconstrained
case, examples, realizable filter derivation.
Lecture 39: Realizable Wiener Filter, examples, pure prediction, innovations processes.
Lecture 40: (12/4) Innovations process derivation, estimation using innovations,
example, state space model.
Lecture 41: Discrete-time Kalman filter, measurement update and time update equations,
covariance update equation, gain equation, example.
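The alternating time-update / measurement-update structure is easiest to see in the scalar case. A minimal sketch (the model, noise variances, and tracking scenario are illustrative assumptions of mine, not a course example):

```python
import random

def kalman_scalar(zs, a=1.0, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Scalar discrete-time Kalman filter for the model
        x_{k+1} = a x_k + w_k,   z_k = x_k + v_k,
    with Var(w) = q and Var(v) = r.  Returns the filtered estimates."""
    x, p = x0, p0
    out = []
    for z in zs:
        # time update (predict)
        x = a * x
        p = a * a * p + q
        # measurement update (correct)
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # blend prediction with measurement
        p = (1.0 - k) * p          # covariance update
        out.append(x)
    return out

# Track a constant level through noisy measurements.
rng = random.Random(4)
truth = 1.0
zs = [truth + rng.gauss(0.0, 0.5) for _ in range(200)]
est = kalman_scalar(zs)
```

The gain k trades off prediction against measurement: small r trusts the data, small q trusts the model, and p converges to the steady-state error variance.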
Lecture 42: quiz 8