Relevance LVQ versus SVM

Author(s): Hammer, B
Strickert, M
Villmann, T
Editor(s): Rutkowski, L
Siekmann, J
Tadeusiewicz, R
Zadeh, LA
Keywords: Computer Science; Computer Science, Artificial Intelligence; VECTOR QUANTIZATION
Publication date: 2004
Publisher: SPRINGER-VERLAG BERLIN
Journal: ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING - ICAISC 2004
Lecture Notes in Artificial Intelligence
Volume: 3070
Start page: 592
End page: 597
Abstract:
The support vector machine (SVM) constitutes one of the most successful current learning algorithms, with excellent classification accuracy on large real-life problems and a strong theoretical background. However, an SVM solution is given by a classification that is not intuitive, expressed in terms of extreme values of the training set, and the size of an SVM classifier scales with the number of training data. Generalized relevance learning vector quantization (GRLVQ) has recently been introduced as a simple though powerful expansion of basic LVQ. Unlike SVM, it provides a very intuitive classification in terms of prototypical vectors, the number of which is independent of the size of the training set. Here, we discuss GRLVQ in comparison to the SVM and point out its beneficial theoretical properties, which are similar to those of the SVM while providing sparse and intuitive solutions. In addition, the competitive performance of GRLVQ is demonstrated in an experiment from computational biology.
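The prototype-based classification the abstract refers to can be illustrated with a minimal sketch, assuming the standard GRLVQ setup of a relevance-weighted squared Euclidean distance and nearest-prototype labeling; the function and variable names below are illustrative, not from the paper.

```python
import numpy as np

def grlvq_classify(x, prototypes, labels, relevances):
    """Assign x the label of its nearest prototype under the
    relevance-weighted squared Euclidean distance
    d(x, w) = sum_i lambda_i * (x_i - w_i)^2 (GRLVQ-style)."""
    d = np.sum(relevances * (prototypes - x) ** 2, axis=1)
    return labels[np.argmin(d)]

# Toy example: two prototypes in 2D; the relevance vector lambda
# (nonnegative, typically normalized to sum 1) switches off the
# second feature entirely, so only the first dimension matters.
protos = np.array([[0.0, 5.0], [1.0, -5.0]])
labels = np.array([0, 1])
lam = np.array([1.0, 0.0])
print(grlvq_classify(np.array([0.9, 4.0]), protos, labels, lam))  # → 1
```

Note that the classifier's size is fixed by the chosen number of prototypes, independent of the training set, which is the sparsity argument the abstract makes against SVM support-vector solutions.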
Description:
7th International Conference on Artificial Intelligence and Soft Computing, Zakopane, POLAND, JUN 07-11, 2004
ISBN: 978-3-540-22123-4
ISSN: 0302-9743
