EE-302 Stochastic Processes Syllabus - Fall 2012
August 19, 2012

Instructor: Gagan Mirchandani
Office: 355 Votey Hall
Office Hours: TR 10:00-11:00am
Phone: 656-4587
URL: http://www.cems.uvm.edu/~mirchand
Email: gmirchan@uvm.edu

Catalog Description: Probability theory, random variables and stochastic processes. Response of linear systems to stochastic inputs. Applications in engineering. (4 credit hours).

Course Prerequisites: EE 171 Signals & Systems and STAT 151 Applied Probability. (Basic knowledge of linear algebra, vector spaces, probability & statistics, and signals & systems. Programming experience in a high-level language.)

Course Objectives: The objectives of this course are to:
• understand the fundamental meaning of probability and the fundamental concepts of random variables and stochastic processes.
• understand basic concepts of Bayesian analysis, features and classifiers.

Course Outcomes: Upon satisfactory completion of the course, the student will be able to:
• solve problems in discrete and continuous probability, ranging from "counting" problems to problems involving signal and noise.
• propose possible methods to resolve basic problems in data analysis requiring dimensionality reduction, feature extraction and classification of data, and provide an error analysis.

Course Lectures: We have some 15 weeks of classes, for a total of 25 lectures (100 minutes each). Lectures and associated readings are as follows. (Note that allocated time and order of presentation may vary somewhat from that described below.)

• Lectures 1 & 2: Classic and frequency definitions of probability; binomial distribution; Gaussian distribution. Analysis versus computer simulation. Using a random number generator to generate a desired distribution. Histograms and pdfs. Multiple random variables. Scattergrams. (Chs. 1, 2 Kay). (See the first simulation sketch at the end of this syllabus.)
• Lectures 3 & 4: Concept of a set. Set elements, subsets and set operations. Assigning and determining probabilities using sets. Notion of an event. Axioms of probability. Finite and countably infinite sets. Properties of probability functions. Continuous sample spaces. Combinatorics. Binomial probability law. Application and computer simulation. (Ch. 3 Kay, Ch. 2 PP).
• Lecture 5: Conditional probability. Joint events. Independence and mutually exclusive events. Bayes theorem. Examples with medical diagnosis, cluster recognition. (Ch. 4 Kay, Chs. 2.3, 3.1, 3.2 PP).
• Lecture 6: Random variable (RV) and probability of a random variable. Important probability mass functions (PMFs). Functions of a random variable. Cumulative distribution functions (CDFs). Computer simulation. Supermarket example. (Ch. 5 Kay, Chs. 4.1, 4.2, 4.3 PP).
• Lecture 7: Expected value of an RV. Expected value of functions of RVs. Characteristic function. Real-world example of data compression. Computer simulation. (Ch. 6 Kay, Chs. 5.1-5.5 PP).
• Lecture 8: Multiple discrete RVs. Sample space. Joint distribution. Marginal PMFs and CDFs. Independence. Transformations. Expected value. Joint moments. Covariance and independence. Correlation and causality. Computer simulation of random vectors. Example illustrating assessing health risks. (Ch. 7 Kay, Chs. 5.1-5.5 PP).
• Lecture 9: Conditional PMFs for compound experiments. Joint, conditional and marginal PMFs. Conditional expectation. Realizing joint PMFs through computer simulation. Modeling human learning. (Ch. 8 Kay, Chs. 6.1-6.6 except Ch. 6.6 PP).
• Lecture 10: Discrete N-dimensional RVs. Random vectors and joint PMFs. Expected value and covariance matrix. Conditional PMFs. Decorrelation. Example of image coding. (Ch. 9 Kay, Chs. 7.1-7.4 PP).
• Lecture 11: Continuous RVs. PDFs, CDFs. Transformation of RVs. Real-world example of critical software testing. (Ch. 10 Kay, Chs. 5.1-5.5 PP).
• Lecture 12: Expected value of common RVs. Characteristic function. Moments of RVs. Chebyshev inequality. Real-world example of importance sampling. (Ch. 11 Kay, Ch. 6 PP).
• Lecture 13: Conditional PDFs. Joint, conditional and marginal PDFs. Independent RVs. Transformations. Expected values. Computer simulation of joint continuous RVs. Real-world example of optical character recognition. (Ch. 12 Kay, Ch. 6 PP).
• Lecture 14: Continuous N-dimensional RVs. Functions of random vectors. Real-world example of signal detection.
• Lecture 15: Law of large numbers. Central limit theorem. Computer simulation of joint continuous RVs. Real-world example of retirement planning. (Ch. 13 Kay, Ch. 6 PP).
• Lecture 16: Stochastic processes. Stationarity. White Gaussian noise. Moving average random process. Real-world example of data analysis. (Ch. 16 Kay, Chs. 9.1-9.5 PP). (See the second simulation sketch at the end of this syllabus.)
• Lecture 17: WSS stochastic processes. Autocorrelation. Ergodicity and temporal averages. Power spectral density. Random vibration testing. (Ch. 17 Kay, Chs. 9.1-9.5 PP).
• Lecture 18: Linear systems and WSS RPs. Wiener filtering. Real-world example of speech synthesis. (Ch. 18 Kay, Chs. 9.1-9.5 PP).
• Lecture 19: Concept of features. Dimensionality reduction. PCA vs LDA. (Class Notes).
• Lectures 20, 21: Concepts of classifiers. Bayesian estimation. Likelihood function. Error probability. Biased and unbiased estimators. (Class Notes, parts of Chs. 8.1-8.3 PP).
• Lectures 22, 23: Discrete Markov processes and hidden Markov models. (Ch. 15 PP, Class Notes).
• (2 lectures reserved for exams).

Text: Intuitive Probability and Random Processes using MATLAB, S. Kay (Springer, 2006), Chs. 1-18.

Recommended: Probability, Random Variables and Stochastic Processes, 4th Edition, A. Papoulis & S. U. Pillai (McGraw-Hill, 2002).

Schedule - Tentative

Week 1: Lectures 1 and 2
Week 2: Lectures 3 and 4
Week 3: Lecture 5
Week 4: Lecture 6
Week 5: Lecture 7
Week 6: Lecture 8
Week 7: Lecture 9
Week 8: Lecture 10
Week 9: Lecture 11
Week 10: Lectures 12 and 13
Week 11: Lectures 14 and 15
Week 12: Lectures 16 and 17
Week 13: Lectures 18 and 19
Week 14: Lectures 20 and 21
Week 15: Lectures 22 and 23

Homework: HW 1 through HW 12, assigned over the course of the semester.

Reference Texts:
Modern Probability Theory and Its Applications, E. Parzen (John Wiley & Sons, 1960), Chs. 1-10.
An Introduction to Probability Theory and its Applications, Vol. 1, 2nd Edition, W. Feller (John Wiley & Sons, 1957), Chs. 1, 2, 5, 6, 10, 15.
Probability and Random Processes with Applications to Signal Processing, 3rd Edition, H. Stark & J. W. Woods (July 2001), Chs. 1-7.
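
Simulation Sketches: The two sketches below are minimal illustrations of the kind of computer simulation referenced in the lectures. They are written in Python rather than MATLAB (the Kay text works in MATLAB), and all variable names are illustrative, not part of any assigned material. The first sketch, referenced in Lectures 1 & 2, uses a uniform random number generator to generate samples of a desired distribution (here exponential, via the inverse CDF) and compares a normalized histogram against the analytic pdf.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)   # uniform random number generator, seeded for repeatability
    N = 100_000

    # Inverse-transform sampling: map uniform(0,1) samples to an exponential(1) distribution.
    u = rng.uniform(size=N)
    x = -np.log(1.0 - u)

    # Compare the normalized histogram (a pdf estimate) with the analytic pdf exp(-t), t >= 0.
    counts, edges = np.histogram(x, bins=60, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])

    plt.step(centers, counts, where="mid", label="histogram estimate")
    plt.plot(centers, np.exp(-centers), label="analytic pdf")
    plt.xlabel("x")
    plt.legend()
    plt.show()

Replacing the transform with rng.standard_normal(N) and the analytic curve with the Gaussian density gives the corresponding comparison for the Gaussian distribution discussed in the same lectures.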
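The second sketch, referenced in Lecture 16, is a similarly hedged illustration of a moving-average random process driven by white Gaussian noise: it compares the sample autocorrelation at small lags (a temporal average over one realization) with the analytic values.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 200_000

    # White Gaussian noise and a two-tap moving-average random process x[n] = (w[n] + w[n-1]) / 2.
    w = rng.standard_normal(N)
    x = 0.5 * (w[1:] + w[:-1])

    # Biased sample autocorrelation estimate at lags 0..maxlag.
    def sample_autocorr(x, maxlag):
        x = x - x.mean()
        return np.array([np.dot(x[:x.size - k], x[k:]) / x.size for k in range(maxlag + 1)])

    r_hat = sample_autocorr(x, 3)
    r_theory = np.array([0.5, 0.25, 0.0, 0.0])   # analytic autocorrelation for this process

    print("estimated:", np.round(r_hat, 3))
    print("analytic: ", r_theory)

Because this process is WSS and ergodic, the temporal average converges to the ensemble autocorrelation, which is the connection developed in Lecture 17.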