Open Access
2010 Consistency of Boosting under Normality
C. Andy Tsao, W. Drago Chen
Taiwanese J. Math. 14(6): 2125-2136 (2010). DOI: 10.11650/twjm/1500406066

Abstract

Boosting is one of the important ensemble classifiers to emerge in the past decade. [10] provides a statistical insight: AdaBoost can be viewed as Newton-like updates minimizing an exponential criterion. This powerful insight, however, does not address (1) whether the Newton updates converge, and (2) whether, if they do converge, the resulting procedure converges to the Bayes procedure. Under a normal-normal setting, we cast the learning problem as a Bayesian minimization problem. It is shown that the Bayes procedure can be obtained via iterative Newton updates minimizing the exponential criterion. In addition, the step sizes of AdaBoost are shown to be highly effective and lead to one-step convergence. While our results are based on strong distributional assumptions, they require few conditions on the complexity of the base learners and no regularization of step sizes or the number of boosting iterations.
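The exponential criterion referred to above can be recalled briefly (the notation here is a common formulation and may differ from the paper's own). For labels $y \in \{-1,+1\}$, [10] shows that the population criterion

\[ J(F) \;=\; \mathbb{E}\!\left[e^{-yF(x)}\right] \]

is minimized pointwise by

\[ F^{*}(x) \;=\; \tfrac{1}{2}\log\frac{P(y=1\mid x)}{P(y=-1\mid x)}, \]

so that $\operatorname{sign}\bigl(F^{*}(x)\bigr)$ coincides with the Bayes classifier under 0-1 loss; AdaBoost's stagewise fitting can then be read as Newton-like steps toward this minimizer.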

Citation


C. Andy Tsao, W. Drago Chen. "Consistency of Boosting under Normality." Taiwanese J. Math. 14 (6), 2125-2136, 2010. https://doi.org/10.11650/twjm/1500406066

Information

Published: 2010
First available in Project Euclid: 18 July 2017

zbMATH: 05896850
MathSciNet: MR2742355
Digital Object Identifier: 10.11650/twjm/1500406066

Subjects:
Primary: 62C10, 62G05, 68T05

Keywords: Bayesian optimization, boosting, loss approximation, statistical machine learning

Rights: Copyright © 2010 The Mathematical Society of the Republic of China
