The Mathematics of Divergence Based Online Learning in Vector Quantization

Thomas Villmann, Sven Haase, Frank-Michael Schleif, Barbara Hammer, Michael Biehl

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-reviewed



We propose the use of divergences in gradient descent learning for supervised and unsupervised vector quantization as an alternative to the squared Euclidean distance. The approach is based on the determination of the Fréchet derivatives of the divergences, which can be plugged directly into the online learning rules. We provide the mathematical foundation of the respective framework. This framework covers the usual gradient descent learning of prototypes as well as parameter optimization and relevance learning to improve performance.
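The abstract describes replacing the squared Euclidean distance in online vector quantization by a divergence whose Fréchet derivative is plugged into the prototype update rule. The following is a minimal illustrative sketch (not the authors' code) of this idea for one concrete choice, the generalized Kullback-Leibler divergence D(v‖w) = Σᵢ (vᵢ log(vᵢ/wᵢ) − vᵢ + wᵢ), whose derivative with respect to the prototype w is 1 − v/w; all function names and parameters here are hypothetical.

```python
import numpy as np

def gkl_gradient(v, w):
    """Derivative of the generalized Kullback-Leibler divergence
    D(v || w) = sum(v * log(v / w) - v + w) with respect to w."""
    return 1.0 - v / w

def online_vq(data, n_prototypes=3, lr=0.05, epochs=20, seed=0):
    """Unsupervised online vector quantization (winner-takes-all) where the
    divergence gradient replaces the Euclidean gradient in the update.
    Assumes strictly positive data, as required by the divergence."""
    rng = np.random.default_rng(seed)
    # initialize prototypes from randomly chosen data points
    protos = data[rng.choice(len(data), n_prototypes, replace=False)].copy()
    for _ in range(epochs):
        for v in rng.permutation(data):
            # dissimilarities of the sample to all prototypes
            d = np.array([np.sum(v * np.log(v / w) - v + w) for w in protos])
            k = np.argmin(d)  # best-matching (winner) prototype
            # online gradient step on the winner using the Frechet derivative
            protos[k] -= lr * gkl_gradient(v, protos[k])
            # keep the prototype inside the divergence's positive domain
            protos[k] = np.clip(protos[k], 1e-9, None)
    return protos
```

Swapping in a different divergence only requires exchanging `gkl_gradient` and the dissimilarity computation, which is the modularity the framework's Fréchet-derivative formulation provides.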
Original language: English
Title of host publication: Artificial Neural Networks in Pattern Recognition
Subtitle of host publication: Proc. ANNPR 2010
Number of pages: 12
Publication status: Published - 2010

Publication series: Lecture Notes in Computer Science


Keywords:
  • classification
  • clustering
  • information theory
  • divergence based learning
  • vector quantization
