A framework for feature selection through boosting

Ahmad Alsahaf*, Nicolai Petkov, Vikram Shenoy, George Azzopardi

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

80 Citations (Scopus)
705 Downloads (Pure)


As the dimensionality of datasets in predictive modelling continues to grow, feature selection becomes increasingly important. Datasets with complex feature interactions and high levels of redundancy still present a challenge to existing feature selection methods. We propose a novel framework for feature selection that relies on boosting, or sample re-weighting, to select sets of informative features in classification problems. The method uses as its basis the feature rankings derived from fast and scalable tree-boosting models, such as XGBoost. We compare the proposed method to standard feature selection algorithms on 9 benchmark datasets. We show that the proposed approach reaches higher accuracies with fewer features on most of the tested datasets, and that the selected features have lower redundancy.
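The core idea in the abstract — iteratively picking the top-ranked feature of a tree-boosting model, then re-weighting the samples that the features selected so far still misclassify — can be sketched as follows. This is an illustrative reading of the general technique, not the authors' exact algorithm; the function name `select_features` and the re-weighting factor are hypothetical, and scikit-learn's `GradientBoostingClassifier` stands in for XGBoost, which exposes a similar `feature_importances_` attribute.

```python
# Hedged sketch of boosting-based feature selection via sample re-weighting.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.datasets import make_classification

def select_features(X, y, n_select=5, n_estimators=50, random_state=0):
    """Greedily select features using a tree-boosting ranking, up-weighting
    samples that the current selection misclassifies at each step."""
    n_samples, n_features = X.shape
    weights = np.full(n_samples, 1.0 / n_samples)
    selected = []
    for _ in range(n_select):
        # Rank all features with a boosting model fit on the current weights.
        model = GradientBoostingClassifier(n_estimators=n_estimators,
                                           random_state=random_state)
        model.fit(X, y, sample_weight=weights)
        ranking = np.argsort(model.feature_importances_)[::-1]
        # Take the highest-ranked feature not yet selected.
        best = next(int(f) for f in ranking if int(f) not in selected)
        selected.append(best)
        # Refit on the selected subset; up-weight its misclassified samples
        # so the next iteration focuses on what these features still miss.
        sub = GradientBoostingClassifier(n_estimators=n_estimators,
                                         random_state=random_state)
        sub.fit(X[:, selected], y)
        wrong = sub.predict(X[:, selected]) != y
        weights[wrong] *= 2.0          # illustrative re-weighting factor
        weights /= weights.sum()
    return selected

# Toy usage on synthetic data with 4 informative features out of 10.
X, y = make_classification(n_samples=200, n_features=10, n_informative=4,
                           random_state=0)
print(select_features(X, y, n_select=3))
```

The re-weighting step is what distinguishes this scheme from a plain one-shot importance ranking: redundant copies of an already-selected feature stop helping on the up-weighted samples, so they tend not to be picked again, which is consistent with the lower-redundancy result reported in the abstract.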
Original language: English
Article number: 115895
Journal: Expert Systems with Applications
Early online date: 16 Sept 2021
Publication status: Published - Jan 2022


Keywords:
  • Feature selection
  • Boosting
  • Ensemble learning
  • XGBoost


