The Annals of Applied Statistics

Remembrance of Leo Breiman

Peter Bühlmann

Full-text: Open access

Article information

Ann. Appl. Stat., Volume 4, Number 4 (2010), 1638-1641.

First available in Project Euclid: 4 January 2011


Primary: 62G08 (nonparametric regression); 62G09 (resampling methods)
Secondary: 68T10 (pattern recognition, speech recognition)

Keywords: bagging; boosting; classification and regression trees; random forests


Bühlmann, Peter. Remembrance of Leo Breiman. Ann. Appl. Stat. 4 (2010), no. 4, 1638--1641. doi:10.1214/10-AOAS381.

References
  • Amit, Y. and Geman, D. (1997). Shape quantization and recognition with randomized trees. Neural Comput. 9 1545–1588.
  • Audrino, F. and Bühlmann, P. (2001). Tree-structured generalized autoregressive conditional heteroscedastic models. J. Roy. Statist. Soc. Ser. B 63 727–744.
  • Biau, G., Devroye, L. and Lugosi, G. (2008). Consistency of random forests and other averaging classifiers. J. Mach. Learn. Res. 9 2015–2033.
  • Breiman, L. (1996a). Bagging predictors. Mach. Learn. 24 123–140.
  • Breiman, L. (1996b). Heuristics of instability and stabilization in model selection. Ann. Statist. 24 2350–2383.
  • Breiman, L. (1998). Arcing classifiers (with discussion). Ann. Statist. 26 801–849.
  • Breiman, L. (1999). Prediction games and arcing algorithms. Neural Comput. 11 1493–1517.
  • Breiman, L. (2001). Random forests. Mach. Learn. 45 5–32.
  • Breiman, L., Friedman, J. H., Olshen, R. A. and Stone, C. J. (1984). Classification and Regression Trees. Wadsworth, Belmont, CA.
  • Bühlmann, P. and Wyner, A. J. (1999). Variable length Markov chains. Ann. Statist. 27 480–513.
  • Bühlmann, P. and Yu, B. (2000). Discussion of “Additive logistic regression: A statistical view of boosting” by J. Friedman, T. Hastie and R. Tibshirani. Ann. Statist. 28 377–386.
  • Bühlmann, P. and Yu, B. (2002). Analyzing bagging. Ann. Statist. 30 927–961.
  • Bühlmann, P. and Yu, B. (2003). Boosting with the L2 loss: Regression and classification. J. Amer. Statist. Assoc. 98 324–339.
  • Bühlmann, P. and Yu, B. (2006). Sparse boosting. J. Mach. Learn. Res. 7 1001–1024.
  • Diaz-Uriarte, R. and Alvarez de Andres, S. (2006). Gene selection and classification of microarray data using random forest. BMC Bioinformatics 7 1–25.
  • Freund, Y. and Schapire, R. E. (1996). Experiments with a new boosting algorithm. In Proceedings of the Thirteenth International Conference on Machine Learning 148–156. Morgan Kaufmann, San Francisco, CA.
  • Friedman, J. H. (2001). Greedy function approximation: A gradient boosting machine. Ann. Statist. 29 1189–1232.
  • Friedman, J. H., Hastie, T. and Tibshirani, R. (2000). Additive logistic regression: A statistical view of boosting (with discussion). Ann. Statist. 28 337–407.
  • Lin, Y. and Jeon, Y. (2006). Random forests and adaptive nearest neighbors. J. Amer. Statist. Assoc. 101 578–590.
  • Meinshausen, N. and Bühlmann, P. (2010). Stability selection (with discussion). J. Roy. Statist. Soc. Ser. B 72 417–473.
  • Menze, B. H., Kelm, B. M., Masuch, R., Himmelreich, U., Bachert, P., Petrich, W. and Hamprecht, F. A. (2009). A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data. BMC Bioinformatics 10 1–16.