Open Access
Additive logistic regression: a statistical view of boosting (With discussion and a rejoinder by the authors)
Jerome Friedman, Trevor Hastie, Robert Tibshirani
Ann. Statist. 28(2): 337-407 (April 2000). DOI: 10.1214/aos/1016218223

Abstract

Boosting is one of the most important recent developments in classification methodology. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data and then taking a weighted majority vote of the sequence of classifiers thus produced. For many classification algorithms, this simple strategy results in dramatic improvements in performance. We show that this seemingly mysterious phenomenon can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood. For the two-class problem, boosting can be viewed as an approximation to additive modeling on the logistic scale using maximum Bernoulli likelihood as a criterion. We develop more direct approximations and show that they exhibit nearly identical results to boosting. Direct multiclass generalizations based on multinomial likelihood are derived that exhibit performance comparable to other recently proposed multiclass generalizations of boosting in most situations, and far superior in some. We suggest a minor modification to boosting that can reduce computation, often by factors of 10 to 50. Finally, we apply these insights to produce an alternative formulation of boosting decision trees. This approach, based on best-first truncated tree induction, often leads to better performance, and can provide interpretable descriptions of the aggregate decision rule. It is also much faster computationally, making it more suitable to large-scale data mining applications.
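
The reweighting-and-voting scheme described in the first two sentences is, in the two-class case, discrete AdaBoost. As a rough sketch of the mechanics (not the authors' implementation), the following Python fragment boosts scikit-learn decision stumps on reweighted data; labels are assumed coded as -1/+1, and names such as adaboost_fit and n_rounds are illustrative. Under the paper's view, the accumulated additive fit F(x) estimates one-half the log-odds log[P(y = 1 | x) / P(y = -1 | x)].

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def adaboost_fit(X, y, n_rounds=50):
        """Boost decision stumps on reweighted data; y is a numpy array coded -1/+1."""
        n = len(y)
        w = np.full(n, 1.0 / n)                # start with uniform weights
        learners, alphas = [], []
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X, y, sample_weight=w)   # base learner on weighted data
            pred = stump.predict(X)
            err = np.clip(w[pred != y].sum() / w.sum(), 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1.0 - err) / err)
            w = w * np.exp(-alpha * y * pred)  # upweight misclassified points
            w = w / w.sum()
            learners.append(stump)
            alphas.append(alpha)
        return learners, alphas

    def adaboost_predict(X, learners, alphas):
        """Weighted majority vote: the sign of the additive fit F(x)."""
        F = sum(a * clf.predict(X) for a, clf in zip(alphas, learners))
        return np.sign(F)

In the paper's reformulation, the same additive structure is instead fit directly by Newton-style steps on the Bernoulli log-likelihood (LogitBoost), rather than by reweighted classification and exponential loss.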

Citation


Jerome Friedman, Trevor Hastie, Robert Tibshirani. "Additive logistic regression: a statistical view of boosting (With discussion and a rejoinder by the authors)." Ann. Statist. 28(2): 337–407, April 2000. https://doi.org/10.1214/aos/1016218223

Information

Published: April 2000
First available in Project Euclid: 15 March 2002

zbMATH: 1106.62323
MathSciNet: MR1790002
Digital Object Identifier: 10.1214/aos/1016218223

Subjects:
Primary: 62G05, 62G07, 68T05, 68T10

Keywords: classification, machine learning, nonparametric estimation, stagewise fitting, tree

Rights: Copyright © 2000 Institute of Mathematical Statistics
