Generalized relevance learning vector quantization

Author(s): Hammer, B.; Villmann, T.
Keywords: adaptive metric; clustering; Computer Science; Computer Science, Artificial Intelligence; learning vector quantization; Neurosciences; Neurosciences & Neurology; relevance determination; selection; self-organizing maps
Publication date: 2002
Publisher: Pergamon-Elsevier Science Ltd
Journal: Neural Networks
Volume: 15
Issue: 8-9
Start page: 1059
End page: 1068
Abstract:
We propose a new scheme for extending generalized learning vector quantization (GLVQ) with weighting factors for the input dimensions. The factors allow an appropriate scaling of the input dimensions according to their relevance. They are adapted automatically during training according to the specific classification task, whereby training can be interpreted as stochastic gradient descent on an appropriate error function. This method yields a more powerful classifier and an adaptive metric at little extra cost compared to standard GLVQ. Moreover, the size of the weighting factors indicates the relevance of the input dimensions, which suggests a scheme for automatically pruning irrelevant input dimensions. The algorithm is verified on artificial data sets and the Iris data from the UCI repository. The method is then compared with several well-known algorithms that determine the intrinsic data dimension on real-world satellite image data. (C) 2002 Elsevier Science Ltd. All rights reserved.
ISSN: 0893-6080
DOI: 10.1016/S0893-6080(02)00079-5
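For illustration, the following minimal Python sketch shows how a relevance-weighted LVQ of the kind summarized in the abstract can be set up. It assumes the weighted squared distance d_lam(x, w) = sum_i lam[i] * (x[i] - w[i])**2 and the GLVQ-style per-sample cost mu = (dJ - dK) / (dJ + dK), with the sigmoid transfer function omitted for simplicity. The function name, learning rates, initialization, and the projection of the relevance vector back onto the simplex are illustrative choices, not the authors' exact formulation.

import numpy as np

def grlvq_sketch(X, y, n_protos_per_class=1, lr_w=0.05, lr_lam=0.01,
                 epochs=30, seed=0):
    """Minimal relevance-weighted LVQ sketch (GRLVQ-style, illustrative).

    Weighted squared distance: d_lam(x, w) = sum_i lam[i] * (x[i] - w[i])**2
    Per-sample cost: mu = (dJ - dK) / (dJ + dK), where dJ is the distance to
    the closest prototype with the correct label and dK the distance to the
    closest prototype with a wrong label.  Prototypes and the relevance
    vector lam are adapted by stochastic gradient descent on mu.
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n_dim = X.shape[1]

    # One (or more) prototypes per class, initialised near the class means.
    protos, labels = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        for _ in range(n_protos_per_class):
            protos.append(Xc.mean(axis=0) + 0.01 * rng.standard_normal(n_dim))
            labels.append(c)
    W = np.array(protos)
    labels = np.array(labels)
    lam = np.full(n_dim, 1.0 / n_dim)          # uniform relevances to start

    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x, target = X[i], y[i]
            dists = ((x - W) ** 2) @ lam        # weighted squared distances
            same = labels == target
            J = np.where(same)[0][np.argmin(dists[same])]    # closest correct
            K = np.where(~same)[0][np.argmin(dists[~same])]  # closest wrong
            dJ, dK = dists[J], dists[K]
            denom = (dJ + dK) ** 2 + 1e-12
            xiJ, xiK = 2.0 * dK / denom, 2.0 * dJ / denom    # gradient factors

            diffJ, diffK = x - W[J], x - W[K]
            # Attract the correct prototype, repel the wrong one.
            W[J] += lr_w * xiJ * lam * diffJ
            W[K] -= lr_w * xiK * lam * diffK
            # Relevance update, then projection back to lam >= 0, sum(lam) = 1.
            lam -= lr_lam * (xiJ * diffJ ** 2 - xiK * diffK ** 2)
            lam = np.clip(lam, 0.0, None)
            lam /= lam.sum()

    return W, labels, lam

As a usage note, on a data set such as Iris one might call W, labels, lam = grlvq_sketch(X, y) and inspect lam after training: dimensions whose relevance stays close to zero are candidates for pruning, while large values indicate the inputs that drive the classification.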
