Bandit-Aided Boosting

R. Busa-Fekete (1, 2), B. Kégl (1, 2, 3)
(3) TAO - Machine Learning and Optimisation
Inria Saclay - Ile de France, UP11 - Université Paris-Sud - Paris 11, CNRS - Centre National de la Recherche Scientifique : UMR8623, LRI - Laboratoire de Recherche en Informatique
Abstract: In this paper we apply multi-armed bandits (MABs) to accelerate ADABOOST. ADABOOST constructs a strong classifier in a stepwise fashion by selecting simple base classifiers and using their weighted "vote" to determine the final classification. We model this stepwise base classifier selection as a sequential decision problem, and optimize it with MABs. Each arm represents a subset of the base classifier set. The MAB gradually learns the "utility" of the subsets, and selects one of the subsets in each iteration. ADABOOST then searches only this subset instead of optimizing the base classifier over the whole space. The reward is defined as a function of the accuracy of the base classifier. We investigate how the MAB algorithms (UCB, UCT) can be applied in the case of boosted stumps, trees, and products of base classifiers. On benchmark datasets, our bandit-based approach achieves only slightly worse test errors than the standard boosted learners, at a computational cost an order of magnitude smaller than that of standard ADABOOST.
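The selection loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses the standard UCB1 rule over a few hypothetical subsets of the base-classifier pool, and stands in for "search only this subset" with a noisy accuracy draw per subset; the subset qualities, the noise model, and the choice of raw accuracy as the reward are all assumptions for the sake of the sketch.

```python
import math
import random

class UCB1:
    """UCB1 bandit; each arm stands for a subset of the base-classifier set."""
    def __init__(self, n_arms):
        self.counts = [0] * n_arms   # pulls per arm
        self.sums = [0.0] * n_arms   # cumulative reward per arm
        self.t = 0                   # total rounds played

    def select(self):
        self.t += 1
        # Play every arm once before applying the confidence bound.
        for a in range(len(self.counts)):
            if self.counts[a] == 0:
                return a
        # Pick the arm maximizing: empirical mean + exploration bonus.
        return max(
            range(len(self.counts)),
            key=lambda a: self.sums[a] / self.counts[a]
            + math.sqrt(2.0 * math.log(self.t) / self.counts[a]),
        )

    def update(self, arm, reward):
        self.counts[arm] += 1
        self.sums[arm] += reward

# Toy stand-in for running the base learner on the chosen subset:
# each subset yields a noisy base-classifier accuracy, and the reward
# is that accuracy (the paper defines the reward as some function of
# the accuracy; using it directly is an assumption here).
random.seed(0)
mean_accuracy = [0.5, 0.6, 0.9]   # hypothetical qualities of 3 subsets
bandit = UCB1(n_arms=3)
for _ in range(5000):             # boosting iterations
    arm = bandit.select()
    accuracy = min(1.0, max(0.0, random.gauss(mean_accuracy[arm], 0.05)))
    bandit.update(arm, reward=accuracy)

best = max(range(3), key=lambda a: bandit.counts[a])
```

After a few thousand rounds the bandit concentrates its pulls on the most rewarding subset, so most boosting iterations search only a small fraction of the base-classifier space; this is the source of the speed-up the abstract reports.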
Document type:
Poster
OPT 2009: 2nd NIPS Workshop on Optimization for Machine Learning, Dec 2009, Whistler, Canada


http://hal.in2p3.fr/in2p3-00580588
Contributor: Sabine Starita
Submitted on: Monday, March 28, 2011 - 16:01:06
Last modified on: Wednesday, July 20, 2016 - 09:44:52
Archived on: Wednesday, June 29, 2011 - 02:35:42

File

OPT2009-BusaFekete.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : in2p3-00580588, version 1

Citation

R. Busa-Fekete, B. Kégl. Bandit-Aided Boosting. OPT 2009: 2nd NIPS Workshop on Optimization for Machine Learning, Dec 2009, Whistler, Canada. <in2p3-00580588>
