
FFR 135 – Neural Networks (FY 4210 – Neural Networks)
7.5 ECTS credit units
Grading: Fail, 3, 4, 5
Level: Advanced
Department: Applied Physics
Teaching language: English
1st quarter
In programs
Engineering Physics
FCMAS (Complex Adaptive Systems)
NKASM (Complex Adaptive Systems)
TDATA (Computer Science and Engineering)
TELTA (Electrical Engineering)
TITEA (Information Technology, year 3)
TITEA (Information Technology, year 4)
TAUTA (Automation and Mechatronics)
Examiner
Professor Bernhard Mehlig
Aim
Neural networks are distributed computational models inspired by the structure of the
human brain, consisting of many simple processing elements which are connected in a
network. Neural networks are increasingly used in the engineering sciences for tasks
such as pattern recognition, prediction and control. The theory of neural networks is an
interdisciplinary field (neurobiology, computer science and statistical physics).
The course gives an overview and a basic understanding of neural-network algorithms,
so that students can develop an understanding of when neural networks are useful in application problems.
Learning outcomes
After successfully completing this course the students shall be able to
• understand and explain strengths and weaknesses of the neural-network algorithms discussed in class
• determine under which circumstances neural networks are useful in real applications
• distinguish between supervised and unsupervised learning and explain the key principles of the corresponding algorithms
• efficiently and reliably implement the algorithms introduced in class on a computer, and interpret the results of computer simulations
• describe principles of more general optimisation algorithms
• write well-structured technical reports in English presenting and explaining analytical calculations and numerical results
• communicate results and conclusions in a clear and logical fashion
Content
• Introduction to neural networks (McCulloch–Pitts neurons, associative memory problem, Hopfield model and Hebb's rule, storage capacity, energy function); see the Hopfield sketch after this list
• Stochastic neural networks (noise, order parameter, mean-field theory for the storage capacity)
• Optimisation
• Supervised learning: perceptrons and layered networks (feed-forward networks, multilayer perceptrons, gradient descent, backpropagation, conjugate-gradient methods, performance of multilayer networks); see the backpropagation sketch after this list
• Unsupervised learning (Hebbian learning, Oja's rule, competitive learning, topographic maps)
• Recurrent networks and time-series analysis (recurrent backpropagation, backpropagation through time)
• Reinforcement learning
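
To give a flavour of the first topic, the following is a minimal sketch of a Hopfield network trained with Hebb's rule. It is an illustration only, not part of the official course material: the function names (hebb_weights, recall), the network size of 100 units and the two stored patterns are arbitrary choices made for this example.

    # Minimal Hopfield-network sketch: store +/-1 patterns with Hebb's rule,
    # then recover a stored pattern from a noisy probe by asynchronous updates.
    import numpy as np

    def hebb_weights(patterns):
        # Hebb's rule: W_ij = (1/N) * sum over patterns of x_i * x_j, zero diagonal.
        N = patterns.shape[1]
        W = patterns.T @ patterns / N
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, x, sweeps=10):
        # Asynchronous updates: s_i <- sign(sum_j W_ij s_j), swept over all units.
        s = x.copy()
        for _ in range(sweeps):
            for i in range(len(s)):
                s[i] = 1 if W[i] @ s >= 0 else -1
        return s

    rng = np.random.default_rng(0)
    patterns = rng.choice([-1, 1], size=(2, 100))   # two random patterns, N = 100
    W = hebb_weights(patterns)
    noisy = patterns[0].copy()
    noisy[rng.choice(100, size=10, replace=False)] *= -1   # flip 10 bits
    print(np.mean(recall(W, noisy) == patterns[0]))  # fraction of correctly recalled bits

With only two stored patterns the network is far below its storage capacity, so the noisy probe should relax back to the stored pattern.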
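For the supervised-learning topic, here is a minimal sketch of batch gradient descent with backpropagation for a two-layer network with tanh units, fitted to the XOR problem. Again this is only an illustration under assumptions made for the example: the layer sizes, learning rate, number of epochs and quadratic error function are arbitrary choices, not prescribed by the course.

    # Minimal backpropagation sketch: one hidden layer of tanh units, tanh output,
    # trained by batch gradient descent on the XOR problem.
    import numpy as np

    rng = np.random.default_rng(1)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    t = np.array([[0.], [1.], [1.], [0.]])              # XOR targets

    W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
    eta = 0.1                                           # learning rate

    for epoch in range(5000):
        # forward pass
        h = np.tanh(X @ W1 + b1)
        y = np.tanh(h @ W2 + b2)
        # backward pass: deltas for the quadratic error (y - t)^2 / 2
        d2 = (y - t) * (1 - y**2)
        d1 = (d2 @ W2.T) * (1 - h**2)
        # gradient-descent updates
        W2 -= eta * h.T @ d2; b2 -= eta * d2.sum(axis=0)
        W1 -= eta * X.T @ d1; b1 -= eta * d1.sum(axis=0)

    print(np.round(y.ravel(), 2))   # after training, should approach 0, 1, 1, 0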
Organization
Lectures, set homework problems, and examples classes.
Web-based course evaluation.
Literature
Lecture notes will be made available. They are based on the course book: J. Hertz, A. Krogh, and R. G. Palmer, Introduction to the Theory of Neural Computation, Addison-Wesley, Redwood City (1991).
Additional reading: S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed., Prentice Hall, New Jersey (1999).
Examination
The examination is based on exercises and homework assignments (100%). The
examiner must be informed within a week after the start of the course if a student wishes
to receive ECTS grades.
Prerequisites
Sufficient knowledge of mathematics (analysis in one real variable, linear algebra) and basic programming skills.