Arbib: CS564 "Brain Theory and AI" Syllabus Fall 2007 1 CSCI 564

advertisement
CSCI 564 (NEUR 535): Fall 2008
Brain Theory and Artificial Intelligence
Tu Th 11:00am-12:20pm
Instructor: Prof. Michael A. Arbib; HNB-03, (213)740-9220, arbib@pollux.usc.edu. (Office hours: 1-2 pm
Tuesdays, HNB 03.)
TA: Jinyong Lee. Office Hours: to be announced, HNB 10.
Brains have proved highly successful in integrating perception, planning, memory and action in guiding creatures
that interact with a complex world. The course has two overlapping aims: “To understand the workings of our own
brains” and “To explore the implications of brain function for developing exotic, highly distributed adaptive
embodied computing systems.” As we move to distributed computation, sensor networks, embedded systems, and
robots interacting with humans in complex ways, we will discover Brain Operating Principles (BOPs) that will not
only illuminate our understanding of ourselves but will also guide us in the development of new brain-style adaptive,
distributed embedded computing technologies.
The course will introduce you to the basic facts about the brain, teach you how to model the brain conceptually, how to implement those models in our Neural Simulation Language (NSL), and how to keep track of BOPs and brain models in our Brain Operation Database (BODB). A first taste of such a model is sketched just below.
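
To give a feel for what such modeling looks like, here is a minimal sketch of the course's basic building block, the leaky-integrator neuron, written in plain MatLab rather than NSL; every parameter value is illustrative only, not taken from any course model.

    % A single leaky-integrator neuron: tau * dm/dt = -m + s,
    % integrated with the Euler method. Illustrative values throughout.
    tau = 10;                  % membrane time constant (ms)
    dt  = 1;                   % Euler integration step (ms)
    m   = 0;                   % membrane potential, starting at rest
    s   = 1;                   % constant external input
    for t = 1:100
        m = m + (dt/tau) * (-m + s);
    end
    f = max(0, m);             % firing rate: rectified membrane potential
    fprintf('final rate %.3f (relaxes toward the input s = %g)\n', f, s);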
Course Requirements:
One “mid-term” will cover the entire contents of the lectures and required readings up to that time. The final will
emphasize, but not be restricted to, material covered after the mid-term.
Each student will be required to prepare a four-part project to get an overall feel for the architecture of a largish
brain model, understand how models are related to empirical data, and think through the details of at least one
important subsystem. Joint work on Parts 3 and 4 of the Project is allowed but not required.
Prerequisites:
Graduate standing; ability to program in Java, C++, or MatLab, or willingness to learn to program in one of these languages. Basic background in neuroscience will be supplied, but students with experience in this area are still invited to join the course to gain an understanding of the computational approach to the brain.
Neuroscience students less skilled in computer programming will still study MatLab and the NSL homework, but
may either (a) negotiate a project that involves analysis of a neural system without computer implementation, or (b)
conduct joint work on Parts 3 and 4 of the Project taking responsibility for literature review and system design rather
than programming.
Texts:
[NSL Book]: A. Weitzenfeld, M.A. Arbib and A. Alexander, 2002, NSL Neural Simulation Language, MIT Press
(A draft version is available at http://neuroinformatics.usc.edu/cs564/nslbook.htm)
[HBTNN] Selected articles from M.A. Arbib, Ed., 2003, The Handbook of Brain Theory and Neural Networks,
MIT Press, Second Edition. (The Handbook is available as one of the reference works on-line at the Cognet website
of The MIT Press. This can be reached from USC machines by going to http://cognet.mit.edu.)
Other articles will be placed on the class Website, including extracts from
[TMB2] M.A. Arbib, 1989, The Metaphorical Brain 2: Neural Networks and Beyond, Wiley-Interscience.
Arbib: CS564 "Brain Theory and AI" Syllabus
Fall 2007
2
Grades: Homework: 20%; Mid-term: 20%; Final: 20%; Project: 40% (5% for Part 1; 10% for Part 2; 5% for Part 3; 20% for Part 4).
Syllabus

1. 8/26: Brain-Inspired Computer Architecture 1
Readings: Arbib, M.A., 2003, Towards a neurally-inspired computer architecture, Natural Computing, 2:1-46. TMB2: Chapter 1; Section 2.3; Section 9.1.

2. 8/28: The Brain as a Network of Neurons
Readings: HBTNN: Part I, Sections I.1 and I.2; Part III, Single Cell Models.

3. 9/2: The Structure of Brains
Readings: TMB2: Section 2.4.

4. 9/4: Early Visual Processing
Readings: TMB2: Chapter 3.3. HBTNN 2e: Feature Analysis.

5. 9/9: Schemas & Cooperative Computation; Brief introduction to BOPs and BODB
Readings: TMB2: Sections 2.1, 2.2, 4.2 (on HEARSAY). Anon Plangprasopchok, Nantana Tinroongroj, and Michael A. Arbib, User's Manual for the Brain Operation Database.
Supplementary Reading: HBTNN: Schema Theory; Multiagent Systems. TMB2: Section 5.1 (more on frogs) and 5.4 (brief look at language). HBTNN: Visuomotor Coordination in Frog and Toad; Hybrid Symbolic/Connectionist Systems.

6. 9/11: Differential equations for Neural Networks; Arrays; Winner-Take-All (a minimal winner-take-all sketch appears at the end of this syllabus)
Readings: TMB2: 4.3, pp. 194-197, Prey Selection or Winner Takes All; 4.4, A Mathematical Analysis of Neural Competition. HBTNN: Part I, Sections I.1 and I.2; Part III, Single Cell Models.

7. 9/16: Introduction to the MatLab & Simulink Environment (Lee)
Readings: MatLab & Simulink documentation (to be specified).

8. 9/18: Practical Introduction to NSL-MatLab 1: Winner-Take-All (Lee)
Readings: NSL Book: Chapters 1 & 2. (The book uses Java and C++, but we will use the MatLab version of NSL.)

9. 9/23: Eye Movements
Required Readings: TMB2: Section 6.2. HBTNN: Collicular Visuomotor Transformations for Saccades.

10. 9/25: Dominey-Arbib model; Introduction to the Project
Readings: Reprint: Dominey, P. F., and Arbib, M. A., 1992, A Cortico-Subcortical Model for Generation of Spatially Accurate Sequential Saccades, Cerebral Cortex, 2:153-175.
Supplementary Reading: NSL Book: Chapter 14, The Modular Design of the Oculomotor System in Monkeys. [Even though it uses Java-NSL, not MatLab-NSL, this will help you prepare for lecture 16, Practical Introduction to NSL-MatLab 3: Basal Ganglia: Learning Associations with Reinforcement Learning.]
Arbib: CS564 "Brain Theory and AI" Syllabus
11. 9/30
Adaptive networks 1: Hebbian learning,
Perceptrons; Landmark learning
[Include overview of later lectures on RL and
backprop.]
12. 10/2
Supplementary reading (not a lecture): Hopfield
Networks, Constraint Satisfaction, and
Optimization
Adaptive networks 2: Reinforcement learning;
Conditional motor learning
13. 10/7
Adaptive networks 3: Gradient descent and
backpropagation; Forward & Inverse Models 1
14. 10/9
15. 10/14
Mid-Term
Practical Introduction to NSL-MatLab 2: Dynamic
Remapping (Lee)
17. 10/21
Practical Introduction to NSL-MatLab 3: Basal
Ganglia: Learning Associations with Reinforcement
Learning (Lee)
Scene perception & Visual Attention
18. 10/23
The FARS model of control of grasping
19. 10/28
The first model of the mirror neuron system
20. 10/30
Basal Ganglia: Learning Associations and
Sequences
16. 10/16
Fall 2007
3
Required Readings: TMB2: Section 3.4.
HBTNN: Associative Networks:
Perceptrons, Adalines, and BackPropagation.
Supplementary Readings: HBTNN:
Competitive Learning; Hebbian Synaptic
Plasticity. NSL Book: Chapter 12: The
Associative Search Network: Landmark
Learning and Hill Climbing
Required Reading: TMB2: 3.2. Material
on Stability (pp.106-114); 8.2 Material on
Associative Networks and Hopfield
Networks (pp.375-382); HBTNN 2e:
Associative Networks; Computing with
Attractors (Hertz)
Required Reading: HBTNN:
Reinforcement Learning; Reinforcement
Learning in Motor Control
HBTNN: I.3 Dynamics and Adaptation in
Neural Networks; Perceptrons, Adalines,
and Backpropagation; Sensorimotor
Learning
Supplementary Reading: HBTNN:
Backpropagation
Closed Book
NSL Book: Chapter 14 – The Modular
Design of the Oculomotor System in
Monkeys
TMB2 Section 5.3; Itti &Koch;
Navalpakkam, Itti & Arbib; Didday &
Arbib; A nod to Bayesian models.
TMB 2, Sections 5.3, 6.3. Fagg, A. H.,
and Arbib, M. A., 1998, Modeling
Parietal-Premotor Interactions in Primate
Control of Grasping, Neural Networks,
11:1277-1303.
Supplementary Reading from HBTNN:
Decoding Population Codes; Grasping
Movements: Visuomotor Transformations.
Oztop, E., and Arbib, M.A., 2002, Schema
Design and Implementation of the GraspRelated Mirror Neuron System, Biological
Cybernetics, 87:116-140
HBTNN: Basal Ganglia; Dopamine, Roles
of.
Reprint: Dominey, P.F., Arbib, M.A., and
Joseph, J.-P., 1995, A Model of
Corticostriatal Plasticity for Learning
Associations and Sequences, J. Cog.
Neurosci., 7:311-336
Arbib: CS564 "Brain Theory and AI" Syllabus
21. 11/4
Adaptive networks 4: Learning Sequences
22. 11/6
23. 11/11
Practical Introduction to NSL-MatLab 4: From
Backprop to Learning Sequences (BPTT) (Bonaiuto)
Further modeling of the mirror neuron system
24. 11/13
(Augmented) Competitive Queuing
25. 11/18
Neural Models of Imitation
26. 11/20
Prefrontal cortex: Working memories; Neural
mechanisms for planning
27. 11/25
28. 12/2
29. 12/4
From Action to Language: The Mirror System
Hypothesis 1
From Action to Language: The Mirror System
Hypothesis 2
Project 4 due
Final Exam
Fall 2007
4
Required: TMB2: 3.2. Material on
Stability; 8.2 Material on Associative
Networks and Hopfield Networks.
HBTNN 2: Recurrent Networks: Learning
Algorithms (Doya); Associative Networks
(Anderson); Computing with Attractors
(Hertz).
Optional: HBTNN: Dynamics and
bifurcations in neural nets (Ermentrout).
Bonaiuto, J., Rosta, E., and Arbib, M.A.,
2005, Extending the Mirror Neuron
System Model, I: Audible Actions and
Invisible Grasps
Bonaiuto & Arbib: ACQ paper;
HBTNN: Competitive Queuing
Oztop, E., Kawato, M., & Arbib, M.A.,
2006, Mirror Neurons and Imitation: A
Computationally Guided Review, Neural
Networks
HBTNN 2e. Required: Cortical Memory;
Prefrontal Cortex in Temporal
Organization of Action; Patricia S.
Goldman-Rakic, Seamas P. O Scalaidhe,
and Matthew V. Chafee, 2000, Domain
Specificity in Cognitive Systems, In The
New Cognitive Neurosciences (2nd ed.),
Edited by Michael S. Gazzaniga et al.,
Cambridge, MA: MIT Press. (You can
find this book by going to Cognet.)
Required: Arbib, M.A., 2005, From
Monkey-like Action Recognition to
Human Language: An Evolutionary
Framework for Neurolinguistics,
Behavioral and Brain Sciences, 28:105167.) with supplement at
http://www.bbsonline.org/Preprints/Arbib05012002/Supplemental/Arbib.EResponse_Supplemental.pdf
Readings from the previous lecture
continued.
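
The following is the minimal winner-take-all sketch referenced from lectures 6 and 8. It is plain MatLab rather than NSL, and all weights and inputs are illustrative: each leaky-integrator unit excites itself and inhibits the others, so the unit with the strongest input ends up as the only active one.

    % Winner-take-all competition among three leaky-integrator units.
    % Illustrative parameter values, not taken from any course model.
    tau = 10; dt = 1;               % membrane time constant and Euler step (ms)
    s   = [0.9; 1.0; 0.8];          % external inputs; unit 2 is the strongest
    w_e = 0.5;                      % self-excitation weight
    w_i = 1.5;                      % mutual-inhibition weight
    m   = zeros(3,1);               % membrane potentials
    for t = 1:500
        f = max(0, m);              % firing rates: rectified potentials
        inhib = w_i * (sum(f) - f); % each unit is inhibited by all the others
        m = m + (dt/tau) * (-m + w_e*f - inhib + s);
    end
    f = max(0, m);                  % final firing rates
    disp(f')                        % approximately [0 2 0]: unit 2 has won

With self-excitation below 1, the winner's activity settles at s/(1 - w_e) rather than growing without bound.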
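Finally, the minimal Hebbian-learning sketch referenced from lecture 11, again in plain MatLab with illustrative values: each weight grows in proportion to the product of presynaptic and postsynaptic activity.

    % Hebbian learning at the weights onto one linear output unit.
    % Illustrative values, not taken from any course model.
    eta = 0.1;                      % learning rate
    w   = 0.1 * ones(1, 3);         % small initial weights (all-zero weights
                                    % could never learn, since y would stay 0)
    x   = [1 0 1];                  % a repeatedly presented input pattern
    for trial = 1:20
        y = w * x';                 % postsynaptic activity
        w = w + eta * y * x;        % Hebbian update: change ~ post * pre
    end
    disp(w)                         % weights on the active inputs have grown;
                                    % the weight on the inactive input is unchanged

Note that unchecked Hebbian growth is unstable; one standard remedy is some form of weight normalization.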