Open Access
December 2007
Analysis of boosting algorithms using the smooth margin function
Cynthia Rudin, Robert E. Schapire, Ingrid Daubechies
Ann. Statist. 35(6): 2723-2768 (December 2007). DOI: 10.1214/009053607000000785


We introduce a useful tool for analyzing boosting algorithms called the “smooth margin function,” a differentiable approximation of the usual margin for boosting algorithms. We present two boosting algorithms based on this smooth margin, “coordinate ascent boosting” and “approximate coordinate ascent boosting,” which are similar to Freund and Schapire’s AdaBoost algorithm and Breiman’s arc-gv algorithm. We give convergence rates to the maximum margin solution for both of our algorithms and for arc-gv. We then study AdaBoost’s convergence properties using the smooth margin function. We precisely bound the margin attained by AdaBoost when the edges of the weak classifiers fall within a specified range. This shows that a previous bound proved by Rätsch and Warmuth is exactly tight. Furthermore, we use the smooth margin to capture explicit properties of AdaBoost in cases where cyclic behavior occurs.
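The smooth margin described in the abstract can be illustrated with a short sketch. One common form of this quantity is G(λ) = −ln(Σᵢ exp(−(Mλ)ᵢ)) / ‖λ‖₁, where Mᵢⱼ = yᵢhⱼ(xᵢ) records whether weak classifier j is correct on example i; this is a hedged reconstruction for illustration, not the paper's exact notation, and the function names and random data below are invented for the example.

```python
import numpy as np

def usual_margin(M, lam):
    # Minimum normalized margin over the training examples:
    # min_i (M lam)_i / ||lam||_1.
    return (M @ lam).min() / np.abs(lam).sum()

def smooth_margin(M, lam):
    # Differentiable approximation of the margin (illustrative form):
    # -ln(sum_i exp(-(M lam)_i)) / ||lam||_1.
    m = M @ lam
    return -np.log(np.exp(-m).sum()) / np.abs(lam).sum()

# Toy data: M[i, j] = +1 if weak classifier j is correct on example i, else -1.
rng = np.random.default_rng(0)
M = rng.choice([-1.0, 1.0], size=(20, 5))
lam = rng.random(5)  # nonnegative combination coefficients

print(smooth_margin(M, lam), usual_margin(M, lam))
```

Because Σᵢ exp(−mᵢ) ≥ exp(−minᵢ mᵢ), the smooth margin always lower-bounds the usual margin, and it sits within ln(n)/‖λ‖₁ of it for n training examples; that sandwich is what makes it a useful differentiable proxy.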




Published: December 2007
First available in Project Euclid: 22 January 2008

zbMATH: 1132.68827
MathSciNet: MR2382664
Digital Object Identifier: 10.1214/009053607000000785

Primary: 68Q25 , 68W40
Secondary: 68Q32

Keywords: AdaBoost, arc-gv, boosting, convergence rates, coordinate descent, large margin classification

Rights: Copyright © 2007 Institute of Mathematical Statistics
