Lecture 5, Support Vector Machines [Örjan]
Time: Monday 23 September 2013, 15:00 - 17:00
Location: E1
Activity: Lecture
Student groups: TCSCM1-AS, TCSCM1-BER, TCSCM1-PRS, TCSCM1-SPR, TITMM2, TIVNM1, TKOMK3, TMAIM1, TMAIM1-BIO, TMAIM1-IR, TMAIM1-PC, TSCRM1, TSCRM2
Info:
Support Vector Machines
Readings: Marsland, Chapter 5
- How does linear separation behave in high-dimensional spaces?
- What do empirical and structural risk refer to?
- Why are classification margins good for generalization performance?
- When are slack variables useful?
- How can a support vector machine be trained?
- Why is the dual optimization problem often easier to solve?
- What is a support vector?
- What is the advantage of using kernel functions?