Distance Learning in Discriminative Vector Quantization

Petra Schneider*, Michael Biehl, Barbara Hammer

*Corresponding author for this work

    Research output: Contribution to journal › Article › Academic › peer-review


    Abstract

    Discriminative vector quantization schemes such as learning vector quantization (LVQ) and extensions thereof offer efficient and intuitive classifiers based on the representation of classes by prototypes. The original methods, however, rely on the Euclidean distance, corresponding to the assumption that the data can be represented by isotropic clusters. For this reason, extensions of the methods to more general metric structures have been proposed, such as relevance adaptation in generalized LVQ (GLVQ) and matrix learning in GLVQ. In these approaches, metric parameters are learned based on the given classification task such that a data-driven distance measure is found. In this letter, we consider full matrix adaptation in advanced LVQ schemes. In particular, we introduce matrix learning to a recent statistical formalization of LVQ, robust soft LVQ, and we compare the results on several artificial and real-life data sets to matrix learning in GLVQ, a derivation of LVQ-like learning based on a (heuristic) cost function. In all cases, matrix adaptation allows a significant improvement of the classification accuracy. Interestingly, however, the principled behavior of the models with respect to prototype locations and extracted matrix dimensions shows several characteristic differences depending on the data sets.
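    The adaptive distance underlying matrix LVQ variants can be sketched as follows. This is a minimal illustration, not the authors' reference implementation: it assumes the common parametrization d_Λ(x, w) = (x − w)ᵀ Λ (x − w) with Λ = Ωᵀ Ω, which keeps the measure positive semidefinite; the function and variable names are hypothetical.

    ```python
    import numpy as np

    def matrix_distance(x, w, omega):
        """Squared adaptive distance (x - w)^T Omega^T Omega (x - w).

        omega is the learned transformation matrix; Lambda = omega.T @ omega
        is positive semidefinite by construction.
        """
        diff = omega @ (x - w)
        return float(diff @ diff)

    rng = np.random.default_rng(0)
    x = rng.normal(size=3)   # data point
    w = rng.normal(size=3)   # class prototype
    omega = np.eye(3)        # identity matrix: no learned metric yet

    # With Omega = I, the adaptive distance reduces to the squared
    # Euclidean distance assumed by the original LVQ methods.
    assert np.isclose(matrix_distance(x, w, omega), np.sum((x - w) ** 2))
    ```

    In matrix learning, Ω is updated alongside the prototypes from the classification cost, so directions irrelevant to the class structure are down-weighted rather than fixed a priori.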

    Original language: English
    Pages (from-to): 2942-2969
    Number of pages: 28
    Journal: Neural Computation
    Volume: 21
    Issue number: 10
    DOIs
    Publication status: Published - Oct 2009

    Keywords

    • GENERALIZATION ABILITY
