Open Access
February 2004
On the Bayes-risk consistency of regularized boosting methods
Gábor Lugosi, Nicolas Vayatis
Ann. Statist. 32(1): 30-55 (February 2004). DOI: 10.1214/aos/1079120129

Abstract

The probability of error of classification methods based on convex combinations of simple base classifiers by "boosting" algorithms is investigated. The main result of the paper is that certain regularized boosting algorithms provide Bayes-risk consistent classifiers under the sole assumption that the Bayes classifier may be approximated by a convex combination of the base classifiers. Nonasymptotic distribution-free bounds are also developed which offer interesting new insight into how boosting works and help explain its success in practical classification problems.
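As a concrete illustration of the setting described in the abstract, the display below gives a minimal sketch of a regularized boosting estimator of the kind studied here. The notation (convex cost $\phi$, base class $\mathcal{C}$, smoothing parameter $\lambda_n$) is illustrative shorthand and not a verbatim statement of the paper's theorems.

$$\hat f_n \in \operatorname*{arg\,min}_{f \,\in\, \lambda_n\,\mathrm{conv}(\mathcal{C})} \ \frac{1}{n}\sum_{i=1}^{n} \phi\bigl(-Y_i f(X_i)\bigr), \qquad g_n(x) = \operatorname{sign}\bigl(\hat f_n(x)\bigr),$$

where $\mathcal{C}$ denotes the class of base classifiers, $\mathrm{conv}(\mathcal{C})$ its convex hull, $\phi$ a convex cost function (for instance the exponential or logit cost), and $\lambda_n$ a smoothing parameter that grows suitably with the sample size $n$. Bayes-risk consistency then means that the probability of error $L(g_n)$ converges to the Bayes risk $L^*$, under the assumption that the Bayes classifier can be approximated by convex combinations of the base classifiers.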

Citation


Gábor Lugosi, Nicolas Vayatis. "On the Bayes-risk consistency of regularized boosting methods." Ann. Statist. 32(1): 30–55, February 2004. https://doi.org/10.1214/aos/1079120129

Information

Published: February 2004
First available in Project Euclid: 12 March 2004

zbMATH: 1105.62319
MathSciNet: MR2051000
Digital Object Identifier: 10.1214/aos/1079120129

Subjects:
Primary: 60G99, 62C12, 62G99

Keywords: Bayes-risk consistency, boosting, classification, convex cost functions, empirical processes, penalized model selection, smoothing parameter

Rights: Copyright © 2004 Institute of Mathematical Statistics
