Bernoulli


Nonparametric bootstrap prediction

Tadayoshi Fushiki, Fumiyasu Komaki, and Kazuyuki Aihara

Full-text: Open access

Abstract

Ensemble learning has recently been intensively studied in the field of machine learning. 'Bagging' is an ensemble learning method that uses bootstrap data to construct various predictors; the required prediction is then obtained by averaging these predictors. Harris proposed using this technique with the parametric bootstrap predictive distribution to construct predictive distributions, and showed that the parametric bootstrap predictive distribution gives asymptotically better prediction than a plug-in distribution with the maximum likelihood estimator. In this paper, we investigate nonparametric bootstrap predictive distributions. The nonparametric bootstrap predictive distribution is precisely what is obtained by applying bagging to the statistical prediction problem. We show that the nonparametric bootstrap predictive distribution gives predictions asymptotically as good as the parametric bootstrap predictive distribution.
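The construction described in the abstract — resample the data, fit a plug-in distribution to each resample, and average the resulting densities — can be sketched in a few lines. The following is a minimal illustration for a simple normal model; the model choice, function name, and parameters are assumptions for the sake of the example, not taken from the paper.

```python
import numpy as np

def bootstrap_predictive_density(data, grid, n_boot=2000, seed=0):
    """Nonparametric bootstrap predictive density (bagging applied
    to prediction), illustrated with a normal plug-in model.

    For each bootstrap resample of `data`, fit the MLE (mean, std)
    and evaluate the plug-in normal density on `grid`; the predictive
    density is the average of these plug-in densities.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    densities = np.zeros_like(grid, dtype=float)
    for _ in range(n_boot):
        # Nonparametric bootstrap: resample the data with replacement.
        resample = rng.choice(data, size=n, replace=True)
        mu, sigma = resample.mean(), resample.std()
        # Plug-in normal density for this resample's MLE.
        densities += np.exp(-0.5 * ((grid - mu) / sigma) ** 2) / (
            sigma * np.sqrt(2.0 * np.pi)
        )
    return densities / n_boot

# Usage: average the plug-in densities over bootstrap resamples.
rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=50)
grid = np.linspace(-4.0, 4.0, 201)
pred = bootstrap_predictive_density(data, grid)
```

Averaging the plug-in densities, rather than using a single MLE plug-in, is what distinguishes the bootstrap predictive distribution from the plug-in distribution compared in the abstract.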

Article information

Source
Bernoulli, Volume 11, Number 2 (2005), 293-307.

Dates
First available in Project Euclid: 17 May 2005

Permanent link to this document
https://projecteuclid.org/euclid.bj/1116340296

Digital Object Identifier
doi:10.3150/bj/1116340296

Mathematical Reviews number (MathSciNet)
MR2132728

Zentralblatt MATH identifier
1063.62062

Keywords
asymptotic theory; bagging; bootstrap; predictive distribution; information geometry; Kullback-Leibler divergence

Citation

Fushiki, Tadayoshi; Komaki, Fumiyasu; Aihara, Kazuyuki. Nonparametric bootstrap prediction. Bernoulli 11 (2005), no. 2, 293--307. doi:10.3150/bj/1116340296. https://projecteuclid.org/euclid.bj/1116340296


