Exact inference: message passing, variable elimination, factor graphs from DAGs, clique graphs/trees, inference with evidence, the junction tree algorithm, etc.
Approximate inference: loopy belief propagation, the Monte Carlo principle, Markov chain Monte Carlo (MCMC), variational methods, MAP inference, etc.
Learning: parameter estimation, the maximum likelihood method, conjugate priors, Gaussian, Beta, and Dirichlet distributions, partially observed data, the gradient ascent method, Expectation Maximization (EM), etc.
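As a brief illustration of the conjugate-prior idea listed above (this sketch is not course material; the coin-flip data and the Beta(2, 2) prior are assumptions chosen purely for demonstration), a Beta prior on a Bernoulli parameter can be updated in closed form and compared with the maximum likelihood estimate:

```python
import numpy as np

# Illustrative sketch: Beta-Bernoulli conjugacy.
# With a Beta(a, b) prior on the bias theta of a coin and k heads in
# n flips, the posterior is Beta(a + k, b + n - k).

rng = np.random.default_rng(0)
theta_true = 0.7
flips = rng.random(20) < theta_true       # 20 Bernoulli(theta_true) draws
k, n = int(flips.sum()), flips.size

a, b = 2.0, 2.0                           # Beta(2, 2) prior (an arbitrary choice)
a_post, b_post = a + k, b + n - k         # conjugate posterior update

theta_mle = k / n                         # maximum likelihood estimate
theta_pm = a_post / (a_post + b_post)     # posterior mean
print(f"MLE: {theta_mle:.3f}, posterior mean: {theta_pm:.3f}")
```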
Intended learning outcomes
After passing the course, the student should be able to
explain and discuss how different graphs represent both factorization and independence relations
explain and discuss exact inference in graphical models
use message passing algorithms for inference
explain and discuss methods for learning uncertainties in a model's parameters
explain and discuss approximate inference methods such as sampling, loopy belief propagation and variational methods.
Students can obtain higher grades by explaining how the methods above can be used to solve specific problems. The highest grade can be obtained by explaining how these methods are used in complex, real research.
Learning activities
The course consists of a series of 8 lectures in which the core concepts will be covered. Each lecture has required reading, to be completed beforehand, from the
Course book:
Probabilistic Graphical Models: Principles and Techniques, by Daphne Koller and Nir Friedman.
This basic knowledge is examined by four quizzes, each given after the lecture on its topic.
In addition, there are three required tutorials that serve both as teaching and as examination, covering the first three learning goals.
There is no final exam.
To obtain a grade above E, one can choose to do up to 4 additional tutorials from a selection of 7. These vary in form and level of difficulty. All tutorials require an uploaded written report and an individual oral examination. Some have separate theory and practical parts, with the theory covered in a separate seminar in which the assignment is discussed in groups.
The tutorials give a deeper knowledge and practical examples. Topics include: Factor Graphs (SLAM); Imitation Learning; Partially Observed Data; Markov Chain Monte Carlo; Variational Inference on Gaussian Mixture Models; Latent Dirichlet Allocation; and Variational inference with Sequential Monte Carlo.
Doing these tutorials requires the student to actively fill any gaps in their knowledge. Some of the topics are quite advanced, and the referenced material must be read in order to understand the assignment fully; passive students will struggle with the advanced tutorials. A small illustrative sketch of one such topic follows.
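To give a feel for one of the more advanced tutorial topics (Markov Chain Monte Carlo), here is a minimal random-walk Metropolis sampler; the standard-normal target and the step size are assumptions made purely for illustration, not the tutorial's actual assignment:

```python
import numpy as np

# Illustrative sketch: random-walk Metropolis targeting a standard
# normal density. Step size and chain length are arbitrary choices.

def log_target(x):
    return -0.5 * x**2                    # log N(0, 1), up to a constant

rng = np.random.default_rng(1)
x, samples = 0.0, []
for _ in range(10_000):
    prop = x + 0.5 * rng.normal()         # random-walk proposal
    # Accept with probability min(1, p(prop) / p(x))
    if np.log(rng.random()) < log_target(prop) - log_target(x):
        x = prop
    samples.append(x)

print(f"sample mean {np.mean(samples):.2f}, variance {np.var(samples):.2f}")
```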
Detailed plan
The lectures are coupled to the quizzes and tutorials according to this table:
Lectures | Quizzes | Tutorials | Learning Goal | Required for Passing
1-3 | - | 1 Message Passing | Representations; Exact Inference | Tutorial 1
4 | - | 2 Bayes Nets; 3 Conditional Random Fields | Representations; Exact Inference; Learning | Tutorials 2 and 3
5 | A - Learning | 4 GraphSLAM; 6 Imitation Learning | Learning | Quiz A
5 | B - Partially Observed Data | 5 Partially Observed Data | Learning | Quiz B
6 | C - Monte Carlo Methods | 7 Markov Chain Monte Carlo | Approximate Inference | Quiz C
7 | D - Variational Inference | 8 Variational Inference on Gaussian Mixture Models; 9 Latent Dirichlet Allocation; 10 Variational Inference with Sequential Monte Carlo | Approximate Inference; Learning | Quiz D
Lectures:
The recommended reading should be done in preparation for each lecture.
Preparations before course start
Recommended prerequisites
SF1625 Calculus in One Variable;
SF1901 Probability Theory and Statistics;
either DD2421 Machine Learning or DD2434 Machine Learning, Advanced Course;
Programming in MATLAB and Python.
Literature
Probabilistic Graphical Models: Principles and Techniques, by Daphne Koller and Nir Friedman.
Support for students with disabilities
Students at KTH with a permanent disability can get support during their studies from Funka.
Examination
OVN1 - Exercises, 2.5 credits, Grading scale: P, F
OVN2 - Exercises, 2.5 credits, Grading scale: P, F
TENT - Written exam, 2.5 credits, Grading scale: P, F
Based on a recommendation from KTH’s coordinator for disabilities, the examiner will decide how to adapt an examination for students with a documented disability.
The examiner may apply another examination format when re-examining individual students.
Ethical approach
All members of a group are responsible for the group's work.
In any assessment, every student shall honestly disclose any help received and sources used.
In an oral assessment, every student shall be able to present and answer questions about the entire assignment and solution.