1 Introduction

Intelligent Systems I - Empirical Inference
Michael Hirsch & Jonas Peters
16 April 2015
Intelligent Systems Lecture Series
Inference & Intelligence
Reasoning under uncertainty

Intelligence
[“Mainstream Science on Intelligence”, WSJ, 13 Dec. 1994]
“A very general mental capability that, among other things, involves the
ability to reason, plan, solve problems, think abstractly, comprehend
complex ideas, learn quickly, and learn from experience.”

Definition (Intelligence, for our purposes)
An intelligent system is an artifact that
• collects information about the world
• discovers (infers) a structured model of the world
• acts to influence the world
[Course timeline: last semester → this course → next semester]
Outline
1 Logistics
2 Introduction and Motivation
3 Probability Theory
Logistics & Organisation
Lecturers: Michael Hirsch | michael.hirsch@tuebingen.mpg.de
Jonas Peters | jonas.peters@tuebingen.mpg.de
Tutors: Behzad Tabibian | btabibian@tuebingen.mpg.de
Carl Johann Simon-Gabriel | cjsimon@tuebingen.mpg.de
Lecture: Thursdays, 8 am c.t. - 10 am in A104
Webpage: http://webdav.tuebingen.mpg.de/lectures/ei-SS2015/
contains slides and exercise sheets
Mailing List: https://groups.google.com/d/forum/is-ei-2015
Exercises: Tuesdays, 8 am c.t. - 10 am, A302
need 30% correct to be admitted to the exam
50% correct → 0.3 bonus on the final grade
70% correct → 0.6 bonus on the final grade
90% correct → 1.0 bonus on the final grade
Final grade: 100% exam
Credit: If (and only if) you pass the exam, you will get the
credits (4) for this course.
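The admission and bonus rules above can be sketched as small functions. This is a minimal sketch; whether the thresholds are inclusive is an assumption here, not stated on the slides:

```python
def admitted_to_exam(percent_correct):
    """Exam admission rule: at least 30% of exercise points."""
    return percent_correct >= 30

def grade_bonus(percent_correct):
    """Bonus on the final grade earned through exercise performance."""
    if percent_correct >= 90:
        return 1.0
    if percent_correct >= 70:
        return 0.6
    if percent_correct >= 50:
        return 0.3
    return 0.0

print(admitted_to_exam(45), grade_bonus(45))  # True 0.0
print(admitted_to_exam(75), grade_bonus(75))  # True 0.6
```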
Exercises
• Tuesdays, 8 am c.t. - 10 am in A302.
• First session next Tuesday, 21 April.
• Introduction to Python and IPython Notebooks
• Bring your laptop along
• First exercise:
Follow these instructions
http://btabibian.github.io/notebooks/learnpython/
to install a Python environment incl. numpy, scipy, and matplotlib.
• Each week one sheet with homework assignment
• Team up with a fellow student (teams of 2)
• No copying; otherwise 0 points on the exercise sheet
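To verify the installation from the instructions above, a stdlib-only check along these lines can be used (the package names are the ones listed on the slide):

```python
import importlib.util

def check_packages(names):
    """Map each package name to whether it is importable in this environment."""
    return {n: importlib.util.find_spec(n) is not None for n in names}

for name, ok in check_packages(["numpy", "scipy", "matplotlib"]).items():
    print(f"{name}: {'found' if ok else 'missing - please install'}")
```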
Related Courses
• Andreas Schilling: Maschinelles Lernen (Machine Learning)
Probability Theory, Bayes Theorem, Likelihood and MAP estimation, PCA, EM
algorithm, Bayesian Networks, Hidden Markov Models, kNN, SVM, Structural
Risk Minimization
• Martin Giese: Machine Learning II
Probability Theory, Bayesian networks, Hidden Markov Models, Variational
Inference, Sampling methods, GPs, SVM
• Alexandra Kirsch: Advanced Artificial Intelligence
Search Methods, Probabilistic Reasoning, Bayesian Networks, AI Learning
(case-based reasoning, explanation-based learning, decision trees, neural
networks, Naive Bayes), Overview of Cognitive Systems, Action Selection (incl. AI
planning, MDPs), Perception (incl. dynamic Bayesian networks), Agent
Architectures
• Martin Spüler: Neuronal Computing
• Martin Butz: Advanced Neural Networks
Backpropagation, Deep Learning, Reservoir Networks, Echo State Networks,
Convolutional Networks, Entropy and Free Energy based Networks
Overview of covered topics
• Probability Theory
• Unsupervised Learning
• Density Estimation
• Mixture Models
• Dimensionality Reduction incl. PCA, ICA, NMF, LDA
• Supervised Learning
• Nearest Neighbor Search, Clustering
• Classification
• Linear and Kernel Regression
• Gaussian Processes
• Support Vector Machines and Kernels
• Neural Networks
• Statistical Learning Theory
• Short introduction to Causality (core concepts)
Some Literature
(this course is not explicitly based on any of them)
Free online:
• David J C MacKay
Information Theory, Inference, and Learning Algorithms
Cambridge, 2003
http://www.inference.phy.cam.ac.uk/itprnn/book.pdf
• David Barber
Bayesian Reasoning and Machine Learning
Cambridge, 2012
http://web4.cs.ucl.ac.uk/staff/D.Barber/textbook/270212.pdf
• Carl E Rasmussen & Christopher K I Williams
Gaussian Processes for Machine Learning
MIT, 2006
http://www.gaussianprocess.org/gpml/chapters/RW.pdf
Non-free:
• Judea Pearl
Probabilistic Reasoning in Intelligent Systems
Morgan Kaufmann, 1988
• Christopher Bishop
Pattern Recognition and Machine Learning
Springer, 2007
• Bernhard Schölkopf & Alexander J Smola
Learning with Kernels
MIT, 2001
• Edwin T Jaynes & G Larry Bretthorst
Probability Theory – the Logic of Science
Cambridge, 2003
Credit
• Some slides are borrowed from the lecture of Prof. Stefan
Harmeling and Dr. Philipp Hennig (WS 2013/2014)
Outline
1 Logistics
2 Introduction and Motivation
3 Probability Theory
Inference
Reasoning under Uncertainty
Definition (Inference)
An inference problem requires statements about the value of an
unobserved (latent) variable x based on observations y which are
related to x, but not sufficient to fully determine x. This requires a
notion of uncertainty.
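As a toy illustration of this definition (not from the slides): infer a coin's unknown heads probability x from observed flips y. The observations constrain x but never fully determine it, so the answer is a posterior distribution; a minimal grid-based sketch with a uniform prior:

```python
# Latent x: probability the coin lands heads. Observed y: a list of flips (1 = heads).
# Bayes' rule on a grid of candidate values, with a uniform prior over [0, 1].
def posterior_heads_prob(flips, grid_size=101):
    grid = [i / (grid_size - 1) for i in range(grid_size)]
    heads = sum(flips)
    tails = len(flips) - heads
    # Likelihood of the data under each candidate value of x.
    unnorm = [p**heads * (1 - p)**tails for p in grid]
    z = sum(unnorm)  # normalizing constant
    return grid, [u / z for u in unnorm]

grid, post = posterior_heads_prob([1, 1, 1, 0, 1, 1, 0, 1])
best = grid[max(range(len(post)), key=post.__getitem__)]
print(f"Most probable heads probability: {best:.2f}")  # 0.75 (6 heads in 8 flips)
```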
Inference in real life
Johannes Kepler (1571 - 1630)
T₁² / T₂² = a₁³ / a₂³  (Kepler's third law)
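In Sun-centered units (period T in years, semi-major axis a in astronomical units, with Earth's values normalized to 1), the law above gives a = T^(2/3). A minimal sketch; the Mars period used below is an assumption for illustration, not from the slides:

```python
def semi_major_axis(period_years):
    """Kepler's third law in solar units: a^3 = T^2, so a = T^(2/3)."""
    return period_years ** (2.0 / 3.0)

print(f"Earth: {semi_major_axis(1.0):.3f} AU")   # Earth: 1.000 AU
print(f"Mars:  {semi_major_axis(1.881):.3f} AU")  # roughly 1.52 AU
```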
Stock market
Spam classification
Large Hadron Collider - Higgs Boson
Object recognition
Challenges - Semantics
Challenges - Causality
Causal Inference
[Figure: activity of gene 4710 vs. activity of gene 5954; panels show
observational training data, interventional training data (interventions
on genes other than 5954 and 4710), and an interventional test data point
(intervention on gene 5954)]
Outline
1 Logistics
2 Introduction and Motivation
3 Probability Theory