
Lecture 5, Support Vector Machines [Örjan]

Time: Monday 23 September 2013, 15:00–17:00

Kungliga Tekniska högskolan
Autumn term (HT) 2013

Location: E1

Activity: Lecture

Student groups: TCSCM1-AS, TCSCM1-BER, TCSCM1-PRS, TCSCM1-SPR, TITMM2, TIVNM1, TKOMK3, TMAIM1, TMAIM1-BIO, TMAIM1-IR, TMAIM1-PC, TSCRM1, TSCRM2

Info:

Support Vector Machines

Readings: Marsland, Chapter 5

  • How does linear separation behave in high-dimensional spaces?
  • What do empirical and structural risk refer to?
  • Why are classification margins good for generalization performance?
  • When are slack variables useful?
  • How can a support vector machine be trained?
  • Why is the dual optimization problem often easier to solve?
  • What is a support vector?
  • What is the advantage of using kernel functions?
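
For reference, the questions on margins, slack variables, the dual problem, support vectors, and kernels all concern the standard soft-margin SVM formulation. A sketch in conventional textbook notation (the symbols below are the usual ones, not taken from the lecture slides) is:

Primal problem:
\[
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad y_i\,(w\cdot x_i + b) \ge 1 - \xi_i,\ \ \xi_i \ge 0 .
\]

Dual problem:
\[
\max_{\alpha}\ \sum_{i=1}^{n}\alpha_i - \tfrac{1}{2}\sum_{i,j}\alpha_i\alpha_j\,y_i y_j\,K(x_i, x_j)
\quad\text{s.t.}\quad 0 \le \alpha_i \le C,\ \ \sum_{i=1}^{n}\alpha_i y_i = 0 .
\]

Training points with \(\alpha_i > 0\) are the support vectors, and the kernel \(K(x_i, x_j) = \varphi(x_i)\cdot\varphi(x_j)\) replaces an explicit high-dimensional feature map.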

Slides on SVM
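
As a complement to the slides, here is a minimal sketch (not part of the course materials) of how a kernel SVM can be trained in practice. It assumes scikit-learn is available; the toy dataset and parameter values are illustrative choices only:

    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Toy, non-linearly-separable data (illustrative choice, not from the lecture).
    X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # C penalizes the slack variables (soft margin); the RBF kernel stands in
    # for an explicit high-dimensional feature map via the kernel trick.
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    clf.fit(X_train, y_train)

    print("Support vectors per class:", clf.n_support_)
    print("Test accuracy:", clf.score(X_test, y_test))

Only the training points that end up as support vectors (reported per class in clf.n_support_) determine the decision boundary; changing C trades a wider margin against more slack.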
