
DD2437 Artificial Neural Networks and Deep Architectures 7.5 credits

The course serves as a fundamental introduction to computational problems in artificial neural networks (ANNs) and provides more detailed insights into the problem of generalisation, the computational nature of supervised as well as unsupervised learning in different network types, and deep learning algorithms. The course offers an opportunity to develop the conceptual and theoretical understanding of the computational capabilities of ANNs, starting from simpler systems and progressively studying more advanced network architectures. An important objective of the course is for the students to gain practical experience of selecting, developing, applying and validating suitable networks and algorithms to effectively address a broad class of regression, classification, temporal prediction, data modelling, explorative data analytics or clustering problems.

Information per course offering

Choose semester and course offering to see current information and more about the course, such as course syllabus, study period, and application information.


Information for Spring 2025 (annda125), programme students

Course location

KTH Campus

Duration
14 Jan 2025 - 16 Mar 2025
Periods
P3 (7.5 credits)
Pace of study

50%

Application code

61667

Form of study

Normal Daytime

Language of instruction

English

Course memo
Course memo is not published
Number of places

Places are not limited

Target group

Open for application to students from year 3 and to students admitted to a master's programme, as long as the course can be included in your programme.

Planned modular schedule
Part of programme

Contact

Examiner
No information inserted
Course coordinator
No information inserted
Teachers
No information inserted
Contact

Pawel Herman, e-mail: paherman@kth.se

Course syllabus as PDF

Please note: all information from the Course syllabus is available on this page in an accessible format.

Course syllabus DD2437 (Autumn 2024–)
Headings with content from the Course syllabus DD2437 (Autumn 2024–) are denoted with an asterisk (*)

Content and learning outcomes

Course contents

The course is concerned with computational problems in massively parallel artificial neural network (ANN) architectures, which rely on distributed simple computational nodes and robust learning algorithms that iteratively adjust the connections between the nodes by making extensive use of available data. The learning rule and the network architecture determine the specific computational properties of the ANN. The course offers the possibility to develop a conceptual and theoretical understanding of the computational capabilities of ANNs, starting from simpler systems and gradually studying more advanced architectures. A wide range of learning types is thus studied, from strictly supervised to purely exploratory unsupervised settings. The course content therefore includes, among others, multi-layer perceptrons (MLPs), self-organising maps (SOMs), Boltzmann machines, Hopfield networks and state-of-the-art deep neural networks (DNNs), along with the corresponding learning algorithms. An important objective of the course is for the students to gain practical experience of selecting, developing, applying and validating suitable networks and algorithms to effectively address a broad class of regression, classification, temporal prediction, data modelling, explorative data analytics or clustering problems. Finally, the course provides revealing insights into the principles behind the generalisation capabilities of ANNs, which underlie their predictive power.
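As a purely illustrative aside (not part of the official syllabus text), the idea of a learning algorithm that iteratively adjusts connection weights using available data can be sketched in a few lines of Python. The toy dataset, learning rate and number of epochs below are arbitrary choices made for this example only; the sketch trains a single linear unit with the batch delta rule on the logical AND problem.

    # Minimal illustration (not course material): a single linear unit trained
    # with the batch delta rule, i.e. weights are adjusted iteratively from data.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy dataset: logical AND, with a constant bias input appended to each pattern.
    X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
    t = np.array([0, 0, 0, 1], dtype=float)

    w = rng.normal(scale=0.1, size=3)   # initial weights, including the bias weight
    eta = 0.1                           # learning rate (arbitrary choice)

    for epoch in range(100):
        y = X @ w                       # linear outputs for all patterns
        e = t - y                       # error signal
        w += eta * X.T @ e              # delta rule: adjust weights to reduce the error

    # Thresholding the linear output recovers the AND function on this toy set.
    print("weights:", w)
    print("predictions:", (X @ w > 0.5).astype(int))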

Intended learning outcomes

After completing the course, the students shall be able to

  • describe the structure and the function of the most common artificial neural network (ANN) types, e.g. (feedforward) multi-layer perceptron, recurrent network, self-organising maps, Boltzmann machine, deep belief network, autoencoder, and give examples of their applications
  • explain the mechanisms of supervised/unsupervised learning from data and of information processing in different ANN architectures, and give an account of the derivations of the basic ANN algorithms discussed in the course
  • demonstrate when and how deep architectures lead to increased performance in pattern recognition and data mining problems
  • quantitatively analyse the process and outcomes of learning in ANNs, and account for their shortcomings and limitations
  • apply, validate and evaluate suggested types of ANNs in typical small problems in the realm of regression, prediction, pattern recognition, scheduling and optimisation (a brief illustrative sketch follows after these learning outcomes)
  • design and implement ANN approaches to selected problems in pattern recognition, system identification or predictive analytics using commonly available development tools, and critically examine their effectiveness

in order to

  • obtain an understanding of the technical potential as well as advantages and limitations of today's learning, adaptive and self-organizing systems,
  • acquire the ANN practitioner’s competence to apply and develop ANN-based solutions to data analytics problems.
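To make the "apply, validate and evaluate" outcome concrete, here is a brief sketch of training a small MLP classifier on a toy two-class problem and checking its generalisation on a held-out test set. The tooling (scikit-learn), dataset and hyperparameters are assumptions made for illustration only and are not prescribed by the course.

    # Illustrative sketch only: evaluating a small MLP on a toy classification task.
    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Toy two-class dataset and a train/test split for validation.
    X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # A small feedforward MLP with one hidden layer of 10 units.
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)

    # Test-set accuracy gives a simple estimate of generalisation performance.
    print("train accuracy:", clf.score(X_train, y_train))
    print("test accuracy:", clf.score(X_test, y_test))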

Literature and preparations

Specific prerequisites

Knowledge and skills in programming, 6 credits, equivalent to completed course DD1337/DD1310-DD1319/DD1321/DD1331/DD100N/ID1018.

Knowledge in linear algebra, 7.5 higher education credits, equivalent to completed course SF1624/SF1672/SF1684.

Knowledge in multivariable calculus, 7.5 higher education credits, equivalent to completed course SF1626/SF1674.

Knowledge in probability theory and statistics, 6 higher education credits, equivalent to completed course SF1910-SF1924/SF1935.

Recommended prerequisites

The mandatory courses in mathematics, numerical analysis and computer science for D, E, and F-students or the equivalent.

Equipment

No information inserted

Literature

[1] Stephen Marsland. Machine Learning: An Algorithmic Perspective. CRC Press, 2009.

[2] Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016.

Additional recommended literature will be provided on the course webpage.

Examination and completion

If the course is discontinued, students may request to be examined during the following two academic years.

Grading scale

A, B, C, D, E, FX, F

Examination

  • KON1 - Partial exam, 1.5 credits, grading scale: P, F
  • LAB2 - Laboratory assignments, 4.0 credits, grading scale: P, F
  • TEN3 - Written exam, 2.0 credits, grading scale: A, B, C, D, E, FX, F

Based on recommendation from KTH’s coordinator for disabilities, the examiner will decide how to adapt an examination for students with documented disability.

The examiner may apply another examination format when re-examining individual students.

Opportunity to complete the requirements via supplementary examination

A passed individual lab assignment can be credited in later course offerings if the assignment is unchanged (bonus points for other lab assignments will be discarded).

Opportunity to raise an approved grade via renewed examination

If the examination/re-examination is taken in later course offerings, all bonus points will be discarded.

Examiner

Ethical approach

  • All members of a group are responsible for the group's work.
  • In any assessment, every student shall honestly disclose any help received and sources used.
  • In an oral assessment, every student shall be able to present and answer questions about the entire assignment and solution.

Further information

Course room in Canvas

Registered students find further information about the implementation of the course in the course room in Canvas. A link to the course room can be found under the tab Studies in the Personal menu at the start of the course.

Offered by

Main field of study

Computer Science and Engineering, Information Technology

Education cycle

Second cycle

Add-on studies

No information inserted

Contact

Pawel Herman, e-mail: paherman@kth.se

Transitional regulations

The previous written examination TEN2 (3.5 higher education credits) is replaced by the written examination TEN3 (2 higher education credits) and three written tests that together form the component KON1 (1.5 higher education credits). During the academic year 2022/2023, the examination can be carried out within the framework of earlier course offerings (with TEN2) or according to the new model with TEN3 and KON1 (which together can be treated as the earlier component TEN2).

Supplementary information

In this course, the EECS code of honor applies, see:
http://www.kth.se/en/eecs/utbildning/hederskodex