Abstract
Support vector machines (SVMs) are popular methods for solving classification problems that require mapping input features to target labels. In real-world data sets, the different classes are usually not linearly separable, and therefore support vector machines employ a kernel function. Such a kernel function computes the similarity between two input patterns, but has the drawback that all input dimensions are considered equally important for computing this similarity. In this paper we propose a novel method that uses the dual objective of the SVM to update scaling weight vectors that scale the different input features. We developed a gradient descent method that updates the scaling weight vectors to minimize the dual objective, after which the SVM is retrained; this procedure is repeated a number of times. Experiments on noisy data sets show that our proposed algorithm leads to significantly higher accuracies than the standard SVM.
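The alternating procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the kernel form (an RBF kernel over feature-scaled inputs), the finite-difference gradient, the step size `eta`, and the number of outer iterations are all assumptions, and the paper may use an analytical gradient of the dual instead.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

def rbf_kernel(X, Z, w, gamma=0.5):
    # RBF kernel on feature-scaled inputs: k(x, z) = exp(-gamma * ||w*x - w*z||^2)
    Xw, Zw = X * w, Z * w
    sq_x = np.sum(Xw ** 2, axis=1)[:, None]
    sq_z = np.sum(Zw ** 2, axis=1)[None, :]
    return np.exp(-gamma * (sq_x + sq_z - 2.0 * Xw @ Zw.T))

def dual_objective(signed_alpha, K):
    # SVM dual: W(alpha) = sum_i alpha_i - 0.5 * sum_ij alpha_i alpha_j y_i y_j K_ij,
    # where signed_alpha_i = alpha_i * y_i is read off the trained SVM.
    return np.sum(np.abs(signed_alpha)) - 0.5 * signed_alpha @ K @ signed_alpha

# Synthetic data as a stand-in for the noisy benchmark sets (assumption)
X, y = make_classification(n_samples=120, n_features=6, n_informative=3,
                           n_redundant=0, random_state=0)
w = np.ones(X.shape[1])   # initial scaling weights, one per input feature
eta, eps = 0.05, 1e-4     # assumed step size and finite-difference width

for _ in range(5):        # alternate: train the SVM, then update w
    K = rbf_kernel(X, X, w)
    svm = SVC(C=1.0, kernel="precomputed").fit(K, y)
    signed_alpha = np.zeros(len(y))
    signed_alpha[svm.support_] = svm.dual_coef_[0]

    # Numerical gradient of the dual w.r.t. each scaling weight (alpha held fixed)
    grad = np.zeros_like(w)
    for d in range(len(w)):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[d] += eps
        w_minus[d] -= eps
        grad[d] = (dual_objective(signed_alpha, rbf_kernel(X, X, w_plus))
                   - dual_objective(signed_alpha, rbf_kernel(X, X, w_minus))) / (2 * eps)

    # Descend on the dual objective; keep scaling weights nonnegative
    w = np.clip(w - eta * grad, 0.0, None)

# Retrain once more with the final scaling weights
K = rbf_kernel(X, X, w)
svm = SVC(C=1.0, kernel="precomputed").fit(K, y)
train_acc = svm.score(K, y)
```

Features that only add noise should receive shrinking weights under this scheme, since reducing their influence lowers the dual objective; informative features keep their weight, which is the intuition behind the reported accuracy gains on noisy data.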
| Original language | English |
|---|---|
| Title of host publication | Belgian Dutch Artificial Intelligence Conference |
| Subtitle of host publication | BNAIC |
| Number of pages | 8 |
| Publication status | Published - 2011 |
Keywords
- Machine learning
- Support Vector Machine