Introduction to Neural Networks
CMSC 475/675, Fall 2011
Mon/Wed 2:30 – 3:45, SOND 101

Instructor: Professor Yun Peng
ITE 341, x5-3816, ypeng@umbc.edu
Office Hours: MW 1:00 – 2:00 PM or by appointment

The most important thing you should remember: the URL for the course
http://www.csee.umbc.edu/~ypeng/F11NN/NN.html
• All information important to the class is/will be posted on this web site.
• No hard-copy handouts will be given.
• Read class announcements on the web.
• Contact me by email outside of office hours.

Course Overview
• Text: Elements of Artificial Neural Networks, by Mehrotra, Mohan, and Ranka, MIT Press, 1997.
• Prerequisites:
  – CMSC 341 (or experience in other programming languages)
  – Knowledge of algorithms and AI is helpful
  – Some knowledge of linear algebra/matrices and differential equations
  – Biology/psychology/neuroscience/cognitive science not required

Course Overview
• Grading
  – 2 projects: 20% each
  – 2 exams: 30% each (additional questions for 675 students)
  – Project 3: 20%, for 675 students only
• Academic dishonesty
  – UMBC Student Honor Code (from the Student Handbook): "… Cheating, fabrication, plagiarism, and helping others to commit these acts are all forms of academic dishonesty, ... Academic misconduct could result in disciplinary action that may include, but is not limited to, suspension or dismissal…"

Course Overview
• Highlights
  – Central theme borrowed from networks of neurons in animal nervous systems
  – An alternative approach to computational models for problem solving (in contrast to the algorithmic, von Neumann approach)
  – Multi-disciplinary
    • Computer science
    • Mathematics/statistics/physics
    • Neuroscience (medicine, zoology, biology)
    • Cognitive science/psychology
    • Engineering

Course Overview
• Emphasis of this course:
  – Computational aspects of NN
    • Network structures
    • Computation and learning methods
  – Not on modeling of
    • Biological nervous systems and their functions
    • Cognitive behaviors
  – Limited mathematical treatment
  – Limited coverage of applications

Course Overview
• Main topics:
  – Basics of NN
    • Neurons and inter-neuron connections
    • Differences from conventional von Neumann computing
  – Major NN models
    • Adaline
    • Perceptron
    • Multi-layer feedforward nets
    • Recurrent nets
    • Hopfield nets and other thermodynamic models
    • Self-organizing nets

Course Overview
  – Learning (see the code sketch after this overview)
    • Hebbian rule
    • Backpropagation (error-driven and gradient descent)
    • Supervised and unsupervised learning
    • Reinforcement learning
  – Applications
    • Function approximation
    • Pattern analysis
    • Prediction
    • Optimization

CMSC 475/675
All the best!!!!
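As a preview of the first two learning rules listed above, the sketch below contrasts a Hebbian weight update with an error-driven gradient-descent update for a single linear neuron. This is a minimal illustration only; the learning rate, the example input, and the initial weights are placeholder values of my own, not anything prescribed by the textbook or the projects.

```python
import numpy as np

eta = 0.1                          # learning rate (illustrative value)
x = np.array([1.0, -1.0, 0.5])     # one input pattern (placeholder)
w = np.array([0.2, -0.1, 0.05])    # weights of a single linear neuron
target = 1.0                       # desired output for this pattern

# Hebbian rule: strengthen each weight when its input and the
# neuron's output are active together (delta_w = eta * y * x).
y = w @ x
w_hebb = w + eta * y * x

# Error-driven (gradient-descent) rule, as in Adaline/backpropagation:
# move the weights opposite the gradient of the squared error (target - y)^2,
# i.e., delta_w = eta * (target - y) * x.
error = target - y
w_gd = w + eta * error * x

print("Hebbian update:         ", w_hebb)
print("Gradient-descent update:", w_gd)
```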
10/5/2009
• Exam 1: Oct 12 (next Monday)
  – Review for Exam 1 is posted
  – Revised slides for Ch 4 are posted
• Project 1
  – Due Oct 19 (by the end of that day's class)
  – What to submit
    • Source code
    • Running results
    • Project report
• Authors' errata
  – http://www.cis.syr.edu/~mohan/html/book.html

10/19/2009
• Exam 1 results
  – 5 graduate students (out of 120 points): min 79, max 116, average 96
  – 6 undergraduate students (out of 100 points): min 52, max 88, average 75.6

10/26/2009
• Project 2
  – Due Monday, 11/23
  – Experimenting with SOM

10/26/2009
• Project 2
  – Due Monday, 11/23
• Project 3 (for 675 students only)
  – Experiment with different NN models for auto-associative memory (a minimal sketch follows these announcements)
    • Discrete Hopfield model (DHM)
    • Backpropagation learning + recurrent recall (BDRR)
  – Test their storage capacities and other properties (e.g., pattern correction, cross-talk)
  – Project description will be uploaded to the class website by tomorrow
  – Due May 15, the last day of class (also the day of the final exam)

11/16/2009
• Project 2
  – Due Nov. 23
• Project 3 (for 675 students only)
  – Due Dec 14 (the last day of class)
  – Proposal due 11/18
• Final exam
  – Dec 14
  – Material since the midterm exam
  – A review on Dec 9 (Wed.)
• No class on Wed. Nov. 25

5/8/2006
• Project 3 (for 675 students only)
  – Due May 15
• No class Wed. 5/10
• Final exam
  – May 15
  – Material since the midterm exam
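For 675 students thinking ahead to Project 3, the following is a minimal sketch of a discrete Hopfield model: bipolar patterns are stored with Hebbian weights and recalled by repeated asynchronous sign updates. The pattern length, the example patterns, and the function names are placeholders of my own; consult the posted project description for the actual requirements.

```python
import numpy as np

def store(patterns):
    """Hebbian storage: W = sum of outer(p, p) over patterns, zero diagonal."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, n_sweeps=10):
    """Recurrent recall: asynchronous sign updates until the state stabilizes."""
    s = probe.copy()
    for _ in range(n_sweeps):
        changed = False
        for i in np.random.permutation(len(s)):
            new_si = 1 if W[i] @ s >= 0 else -1
            if new_si != s[i]:
                s[i] = new_si
                changed = True
        if not changed:
            break
    return s

# Two illustrative bipolar (+1/-1) patterns of length 8 (placeholders).
patterns = np.array([[ 1, -1,  1, -1,  1, -1,  1, -1],
                     [ 1,  1,  1,  1, -1, -1, -1, -1]])
W = store(patterns)

# Corrupt one bit of the first pattern and check whether recall corrects it.
probe = patterns[0].copy()
probe[2] *= -1
print(recall(W, probe))
```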