DD2601 Deep Generative Models and Synthesis 7.5 credits

Generative AI is changing the world. This course teaches you the most common paradigms in deep generative modelling, along with key principles for using and evaluating such models for synthesis.

Course contents:
- Relevant concepts from probability theory and estimation
- Introduction to synthesis problems and generative models
- Principles of synthesis versus classification
- Regression versus probabilistic modelling
- Modelling goals and evaluation
- Mixture density networks (MDNs)
- Autoregression and large language models (LLMs)
- Normalising flows
- Variational autoencoders (VAEs)
- Diffusion models and flow matching
- Generative adversarial networks (GANs)
- Subjective evaluation
- Hybrid approaches
- Recent developments
- Ethical aspects of generative AI
Take this course if you want to…
- Learn generative modelling and deep generative modelling
- Learn the principles behind GenAI and other synthesis applications of machine learning
- Learn to train and tune generative models
Conversely, be aware that this course does not teach you to…
- Perform prompt engineering and content production with existing GenAI tools
- Develop new theory and paradigms in generative modelling
- Become an expert in generative models for a specific application domain, for example images or text
The course assumes prior knowledge of deep learning (see prerequisites), so it will not teach you what various deep-learning architectures such as RNNs or transformers look like on the inside.
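To give a flavour of what "generative modelling" means in practice (an illustrative sketch, not official course material): the simplest possible generative model is a univariate Gaussian fitted by maximum likelihood, where "training" means estimating the parameters and "synthesis" means drawing new samples from the fitted distribution.

```python
import numpy as np

# Illustrative sketch: a Gaussian as a (trivial) generative model.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=10_000)  # the observed "dataset"

# "Training": maximum-likelihood estimates of the Gaussian parameters.
mu_hat = data.mean()
sigma_hat = data.std()  # the MLE uses the biased (1/N) variance estimator

# "Synthesis": generate new data points from the fitted model.
samples = rng.normal(loc=mu_hat, scale=sigma_hat, size=5)
print(mu_hat, sigma_hat)
```

The deep generative models in the course contents replace this fixed parametric family with neural networks, but the underlying recipe (fit a distribution to data, then sample from it) is the same.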
Information per course offering
Information for programme students, Autumn 2026 (start 24 Aug 2026)
- Course location
KTH Campus
- Duration
- 24 Aug 2026 - 23 Oct 2026
- Periods
Autumn 2026: P1 (7.5 hp)
- Pace of study
50%
- Application code
11296
- Form of study
Normal Daytime
- Language of instruction
English
- Number of places
1 - 125
- Target group
- Open to all master's programmes, provided the course can be included in the student's programme.
- Schedule
- Schedule is not published
Contact
Course syllabus DD2601 (Autumn 2025–)
Content and learning outcomes
Intended learning outcomes
After passing the course, students should be able to:
- characterise synthesis problems, deep generative methods, and their applications
- distinguish different objectives, performance measures, and common problems with generative modelling
- describe the relation between deep generative models and regression-based methods
- train and tune deep generative models on different datasets
- evaluate generative models objectively and subjectively
- discuss ethical aspects of particular relevance to generative AI
in order to
- be able to judiciously use deep generative modelling to solve problems in industry and/or academia.
Literature and preparations
Specific prerequisites
Knowledge in deep learning, 6 credits, equivalent to completed course DD2424/DD2437.
Active participation in DD2424/DD2437, where the final examination has not yet been reported in Ladok, is equated with course completion.
Knowledge and skills in programming, 6 credits, equivalent to completed course DD1337/DD1310-DD1319/DD1321/DD1331/DD100N/ID1018.
Knowledge in multivariable analysis, 7.5 credits, equivalent to completed course SF1626.
Knowledge in probability theory and statistics, 6 credits, equivalent to completed course SF1910-SF1925/SF1935.
Recommended prerequisites
- Good programming skills including Python, PyTorch, Jupyter Notebooks.
- Algebra and geometry including vectors, matrices, systems of linear equations, inner and outer products, norms, triangle inequality, metric spaces, determinants, eigenvalues, linear dependence, subspaces, trace of a matrix.
- Single-variable calculus including functions, domains, ranges, monotonicity, exponentials and logarithms, limits, sequences, change of variables, convex functions, ordinary differential equations (ODEs), Euler’s method.
- Multivariate calculus including partial derivatives, multivariate chain rule, change of variables, gradients, Hessian and Jacobian matrices, Fourier series.
- Probability theory including probability, conditional probability, Bayes’ law, independence, random variables, probability mass and density functions, samples, random sampling, expectation/mean, variance, standard deviation, median, correlation, covariance, uniform distributions, multivariate Gaussian distributions and their properties, conditional expectation, parameter estimation, maximum-likelihood estimation, consistency, change of variables, Jensen’s inequality, Markov chains, least-squares regression.
- Machine learning including optimisation, loss functions, train/val/test sets, mean squared error, classification, accuracy, overfitting, Gaussian mixture models, high-dimensional geometry/curse of dimensionality, baselines, ablation studies. Information theory for machine learning including entropy, bits, cross-entropy, Kullback-Leibler divergence.
- Deep learning including feed-forward networks, activation functions, ReLU, softmax, CNNs, RNNs, residual networks, skip connections, U-Nets, transformer architectures, self-attention, position embeddings, mean and variance normalisation, initialisation, hyperparameters, stochastic gradient descent, updates, epochs, dropout.
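As an informal self-check on the information-theory items listed above (an illustrative sketch, not an official diagnostic): computing entropy, cross-entropy, and the Kullback-Leibler divergence for small discrete distributions should feel routine before taking the course.

```python
import numpy as np

# Two discrete distributions over the same three outcomes.
p = np.array([0.5, 0.25, 0.25])
q = np.array([0.25, 0.5, 0.25])

# Entropy H(p) in bits: the expected surprise under p.
H_p = -np.sum(p * np.log2(p))

# Cross-entropy H(p, q): expected surprise when modelling p with q.
H_pq = -np.sum(p * np.log2(q))

# KL divergence D_KL(p || q) = H(p, q) - H(p); non-negative, zero iff p == q.
D_kl = H_pq - H_p
print(H_p, H_pq, D_kl)  # 1.5 bits, 1.75 bits, 0.25 bits
```

If working through a calculation like this by hand takes real effort, reviewing the information-theory prerequisites before the course starts is advisable.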
Important: Taking DD2437 in parallel (i.e., in the same study period as DD2601) does not satisfy the necessary specific prerequisites for DD2601!
Literature
Examination and completion
Grading scale
Examination
- LAB1 - Digital Assignment with Oral Comprehension Questions, 7.5 credits, grading scale: A, B, C, D, E, FX, F
Based on recommendation from KTH’s coordinator for disabilities, the examiner will decide how to adapt an examination for students with documented disability.
The examiner may apply another examination format when re-examining individual students.
If the course is discontinued, students may request to be examined during the following two academic years.
Examiner
Ethical approach
- All members of a group are responsible for the group's work.
- In any assessment, every student shall honestly disclose any help received and sources used.
- In an oral assessment, every student shall be able to present and answer questions about the entire assignment and solution.
Further information
Course room in Canvas
Offered by
Main field of study
Education cycle
Supplementary information
In this course, the EECS code of honor applies, see:
http://www.kth.se/en/eecs/utbildning/hederskodex