Kernel Learning in Support Vector Machines using Dual-Objective Optimization

Auke-Dirk Pietersma, Lambertus Schomaker, Marco Wiering

Research output: Conference contribution (chapter in conference proceedings), academic, peer-reviewed



Support vector machines (SVMs) are very popular methods for solving classification problems that require mapping input features to target labels. When dealing with real-world data sets, the different classes are usually not linearly separable, and therefore support vector machines employ a particular kernel function. Such a kernel function computes the similarity between two input patterns, but has the drawback that all input dimensions are considered equally important for computing this similarity. In this paper we propose a novel method that uses the dual objective of the SVM to update scaling weight vectors that scale the different input features. We developed a gradient descent method that updates the scaling weight vectors to minimize the dual objective, after which the SVM is retrained, and this procedure is repeated a number of times. Experiments on noisy data sets show that our proposed algorithm leads to significantly higher accuracies than obtained with the standard SVM.
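The loop described above (scale features, train the SVM, take a gradient step on the dual objective with respect to the scaling weights, retrain) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes an RBF kernel with per-feature scaling weights, uses scikit-learn's `SVC` with a precomputed kernel as the SVM trainer, and all function names, the learning rate `eta`, and the iteration count are illustrative choices.

```python
import numpy as np
from sklearn.svm import SVC

def scaled_rbf(X1, X2, w, gamma=1.0):
    # RBF kernel on feature-scaled inputs: K(x, z) = exp(-gamma * ||w * (x - z)||^2)
    d2 = ((X1[:, None, :] * w - X2[None, :, :] * w) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def dual_gradient(X, alpha_y, w, gamma=1.0):
    # Gradient of the dual objective D = sum_i alpha_i - 0.5 * sum_ij a_i a_j K_ij
    # (with a_i = alpha_i * y_i) with respect to each scaling weight w_d.
    K = scaled_rbf(X, X, w, gamma)
    A = np.outer(alpha_y, alpha_y) * K          # a_i * a_j * K_ij
    diff2 = (X[:, None, :] - X[None, :, :]) ** 2
    grad = np.zeros_like(w)
    for d in range(len(w)):
        # dK_ij/dw_d = -2 * gamma * w_d * (x_id - x_jd)^2 * K_ij
        grad[d] = -0.5 * np.sum(A * (-2.0 * gamma * w[d] * diff2[:, :, d]))
    return grad

def learn_weights(X, y, n_iter=10, eta=0.01, gamma=1.0, C=1.0):
    # Alternate between training the SVM and descending the dual objective in w.
    w = np.ones(X.shape[1])
    for _ in range(n_iter):
        svm = SVC(C=C, kernel="precomputed")
        svm.fit(scaled_rbf(X, X, w, gamma), y)
        alpha_y = np.zeros(len(y))
        alpha_y[svm.support_] = svm.dual_coef_[0]   # alpha_i * y_i per support vector
        w = w - eta * dual_gradient(X, alpha_y, w, gamma)
        w = np.clip(w, 1e-6, None)                  # keep scaling weights positive
    return w
```

The intended effect is that weights on noisy, uninformative features shrink, so those dimensions contribute less to the kernel similarity; the final SVM is then trained on the learned scaling.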
Original language: English
Title of host publication: Belgian Dutch Artificial Intelligence Conference
Subtitle of host publication: BNAIC
Number of pages: 8
Publication status: Published - 2011


  • Machine learning
  • Support Vector Machine
