BSCS 2013
Computational Neuroscience
Summary
This course aims to provide a comprehensive picture of quantitative approaches to understanding
neural dynamics and function. We will discuss how to create mathematical models of various
structures and functions of the brain at different levels of abstraction, how to fit these models to
measurement data, and what conclusions can be drawn from their analysis. The course emphasizes
explaining cognitive phenomena with computational models and connects them to related research
areas such as biophysics and machine learning. Informal discussions of questions arising during the
course will take place.
Mihály Bányai
Wigner Research Centre for Physics of the Hungarian Academy of Sciences
Computational Systems Neuroscience Lab
cneuro.rmki.kfki.hu/people/banmi
banyai.mihaly@wigner.mta.hu
DETAILED SCHEDULE
Day 1. Introduction
What is computational neuroscience? The neural code. Brief history of computational
neuroscience.
What is a model? Types of models. Interplay between computational neuroscience and machine
learning. Structure, dynamics and function.
Case study - visual classification. A model neuron. Brief recap of probability theory. A network
model. Setting the parameters. Inference and data generation.
Validating a model on data.
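To give a concrete flavour of the Day 1 case study, the sketch below is illustrative only: the sigmoidal model neuron, the synthetic stimuli and all parameter values are assumptions rather than course material. It fits the weights of a single model neuron to labelled data by gradient ascent on the Bernoulli log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

def firing_rate(w, x):
    """Model neuron: response probability is a sigmoid of the weighted input."""
    return 1.0 / (1.0 + np.exp(-x @ w))

# Synthetic "visual" stimuli and binary class labels (illustrative data only).
true_w = np.array([1.5, -2.0, 0.5])
X = rng.normal(size=(200, 3))
y = rng.binomial(1, firing_rate(true_w, X))

# Fit the weights by gradient ascent on the Bernoulli log-likelihood.
w = np.zeros(3)
for _ in range(500):
    grad = X.T @ (y - firing_rate(w, X)) / len(y)
    w += 0.5 * grad

print("recovered weights:", np.round(w, 2))  # should lie close to true_w
```

The same fitted model can also generate synthetic responses, which is the sense in which inference and data generation are two directions of using one model.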
Day 2. Bottom-up modelling
How to choose a scale? Choosing the spike - the Hodgkin-Huxley model. The dynamical systems
picture. Abstracting away all but what we care about.
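As a preview of the Hodgkin-Huxley discussion, here is a minimal forward-Euler simulation sketch with the standard squid-axon parameters; the step size, input current and initial conditions are illustrative choices rather than course material.

```python
import numpy as np

# Standard Hodgkin-Huxley parameters (squid axon, rest near -65 mV).
C, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3       # uF/cm^2, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.4             # mV

# Voltage-dependent opening/closing rates of the gating variables m, h, n.
a_m = lambda V: 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
b_m = lambda V: 4.0 * np.exp(-(V + 65) / 18)
a_h = lambda V: 0.07 * np.exp(-(V + 65) / 20)
b_h = lambda V: 1.0 / (1 + np.exp(-(V + 35) / 10))
a_n = lambda V: 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
b_n = lambda V: 0.125 * np.exp(-(V + 65) / 80)

dt, T, I_ext = 0.01, 50.0, 10.0                 # ms, ms, uA/cm^2
V, m, h, n = -65.0, 0.05, 0.6, 0.32             # initial conditions near rest
trace = []
for _ in range(int(T / dt)):
    # Ionic currents through the sodium, potassium and leak conductances.
    I_ion = (g_Na * m**3 * h * (V - E_Na)
             + g_K * n**4 * (V - E_K)
             + g_L * (V - E_L))
    # Forward-Euler update of the membrane potential and gating variables.
    V += dt * (I_ext - I_ion) / C
    m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
    h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
    n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
    trace.append(V)

print("peak membrane potential (mV):", round(max(trace), 1))
```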
Temporal codes. Oscillations and phase codes. Rate codes. Digital neurons.
The opposite direction - spatially extended models. Beyond spikes: dendritic computation,
retrograde signals, other signaling pathways.
Synapse models. Network models. Example: synaptic theory of working memory, limitations.
Towards the unit of computation. Canonical microcircuits. Canonical computations.
Day 3. Modelling how the brain models things
What are learning and memory? Types of memory. Types of learning problems. Neural models for
learning.
Perceptron. Learning rules: Hebbian and delta. Learning logical functions.
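A minimal sketch of a perceptron learning the logical AND function with an error-driven (delta-style) update; the learning rate and epoch count are illustrative assumptions.

```python
import numpy as np

# Truth table for logical AND, with a bias input fixed at 1.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])
t = np.array([0, 0, 0, 1])

w = np.zeros(3)
eta = 0.1
for epoch in range(20):
    for x_i, t_i in zip(X, t):
        y = 1 if x_i @ w > 0 else 0          # threshold unit
        w += eta * (t_i - y) * x_i           # error-driven weight update

print("weights:", w, "outputs:", (X @ w > 0).astype(int))
```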
Feed-forward network. Error backpropagation.
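A small feed-forward network trained by error backpropagation on XOR, the classic function a single perceptron cannot learn, might look like the sketch below; the layer sizes, learning rate and iteration count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# XOR: not linearly separable, so a hidden layer is needed.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output
eta = 1.0

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # Backward pass: squared-error loss, chain rule applied layer by layer.
    d2 = (y - t) * y * (1 - y)
    d1 = (d2 @ W2.T) * h * (1 - h)
    W2 -= eta * h.T @ d2
    b2 -= eta * d2.sum(0)
    W1 -= eta * X.T @ d1
    b1 -= eta * d1.sum(0)

print("outputs after training:", np.round(y.ravel(), 2))
```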
Reinforcement learning with neural models. State-space representations. Temporal difference
learning.
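A minimal sketch of temporal difference learning, here TD(0) estimating state values on a five-state random walk; the task and all constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 5-state random walk: start in the middle, step left/right at random,
# reward 1 for exiting on the right, 0 for exiting on the left.
n_states, alpha, gamma = 5, 0.1, 1.0
V = np.zeros(n_states)

for episode in range(5000):
    s = n_states // 2
    while True:
        s_next = s + rng.choice([-1, 1])
        if s_next < 0:                      # exit left, reward 0
            r, v_next, done = 0.0, 0.0, True
        elif s_next >= n_states:            # exit right, reward 1
            r, v_next, done = 1.0, 0.0, True
        else:
            r, v_next, done = 0.0, V[s_next], False
        # TD(0) update: move V(s) toward the bootstrapped target r + gamma*V(s').
        V[s] += alpha * (r + gamma * v_next - V[s])
        if done:
            break
        s = s_next

print("estimated values:", np.round(V, 2))   # true values: 1/6, 2/6, ..., 5/6
```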
Hopfield network. Attractor dynamics.
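A minimal sketch of a Hopfield network: patterns are stored with the Hebbian outer-product rule and a corrupted pattern is cleaned up by the attractor dynamics. Network size, pattern count and corruption level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store two random +/-1 patterns with the Hebbian outer-product rule.
N = 100
patterns = rng.choice([-1, 1], size=(2, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)                      # no self-connections

# Start from a corrupted copy of the first pattern and let the attractor
# dynamics clean it up with asynchronous threshold updates.
state = patterns[0].copy()
flip = rng.choice(N, size=25, replace=False)
state[flip] *= -1

for _ in range(5):                           # a few sweeps over all units
    for i in rng.permutation(N):
        state[i] = 1 if W[i] @ state >= 0 else -1

print("overlap with stored pattern:", (state @ patterns[0]) / N)
```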
Day 4. Representing uncertainty and learning structure
From uncertainty to probability and other choices. Probabilistic models. Causality.
Perception as probabilistic inference. Probabilistic perceptron. Possible representations of
random quantities in the cortex.
Probabilistic extension of the Hopfield model. Boltzmann machines. Sampling from a
distribution. Learning distributions.
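A minimal sketch of Gibbs sampling from a small Boltzmann machine with randomly chosen weights; the sizes, weight scales and number of sweeps are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# A small fully connected Boltzmann machine over binary units s in {0,1}
# with energy E(s) = -0.5 * s^T W s - b^T s (W symmetric, zero diagonal).
N = 6
W = rng.normal(scale=0.5, size=(N, N))
W = (W + W.T) / 2
np.fill_diagonal(W, 0)
b = rng.normal(scale=0.5, size=N)

# Gibbs sampling: repeatedly resample each unit from its conditional
# P(s_i = 1 | rest) = sigmoid(sum_j W_ij s_j + b_i).
s = rng.integers(0, 2, size=N).astype(float)
samples = []
for sweep in range(5000):
    for i in range(N):
        p_on = sigmoid(W[i] @ s + b[i])
        s[i] = float(rng.random() < p_on)
    if sweep >= 1000:                        # discard burn-in sweeps
        samples.append(s.copy())

print("mean activation per unit:", np.round(np.mean(samples, axis=0), 2))
```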
Choosing an optimal model. Learning the structure of the data.
Prerequisites
Only an understanding of basic mathematical principles will be assumed. All mathematical concepts
used from calculus, algebra, dynamical systems and probability theory will be thoroughly discussed.
Where needed, references to textbook material on mathematical concepts will be provided.
Basic biological knowledge of neural systems and measurement techniques will be assumed, at the
level provided by preceding BSCS courses.
Recommended reading
The course does not follow a single textbook; chapters from the following books cover most of the
material:
P. Dayan and L. F. Abbott: Theoretical Neuroscience
C. M. Bishop: Pattern Recognition and Machine Learning
R. P. N. Rao, B. A. Olshausen and M. S. Lewicki: Probabilistic Models of the Brain
D. Marr: Vision
C. Koch: Biophysics of Computation
The following articles from the Scholarpedia Encyclopedia of Computational Neuroscience
provide additional details and overviews of some topics of the course:
http://www.scholarpedia.org/article/Working_memory
http://www.scholarpedia.org/article/Models_of_visual_cortex
http://www.scholarpedia.org/article/Conductance-based_models
http://www.scholarpedia.org/article/Thalamocortical_oscillations
http://www.scholarpedia.org/article/Attractor_network
http://www.scholarpedia.org/article/Boltzmann_machine
http://www.scholarpedia.org/article/Hopfield_network
http://www.scholarpedia.org/article/Reinforcement_learning
http://www.scholarpedia.org/article/Temporal_difference_learning
http://www.scholarpedia.org/article/Bayesian_statistics
Exam
A written exam covering all topics will take place on Day 5.