Bernoulli, Volume 11, Number 2 (2005), 293-307.

Nonparametric bootstrap prediction

Tadayoshi Fushiki, Fumiyasu Komaki, and Kazuyuki Aihara



Ensemble learning has recently been intensively studied in the field of machine learning. 'Bagging' is an ensemble learning method that uses bootstrap samples of the data to construct multiple predictors; the required prediction is then obtained by averaging these predictors. Harris proposed using this technique with the parametric bootstrap predictive distribution to construct predictive distributions, and showed that the parametric bootstrap predictive distribution gives asymptotically better prediction than a plug-in distribution with the maximum likelihood estimator. In this paper, we investigate nonparametric bootstrap predictive distributions. The nonparametric bootstrap predictive distribution is precisely the one obtained by applying bagging to the statistical prediction problem. We show that the nonparametric bootstrap predictive distribution gives predictions asymptotically as good as the parametric bootstrap predictive distribution.
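The construction described in the abstract can be sketched in code. Below is a minimal, hedged illustration (not the paper's own implementation) of a nonparametric bootstrap predictive distribution for a Gaussian model: resample the observed data with replacement, fit the maximum likelihood estimates on each resample, and average the resulting plug-in densities. The function name, the choice of a normal model, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_predictive_density(x, y_grid, n_boot=200):
    """Illustrative nonparametric bootstrap predictive density.

    For each bootstrap resample of the data x, fit the Gaussian MLE
    (sample mean and standard deviation) and evaluate the plug-in
    density on y_grid; the predictive density is the average of these
    plug-in densities over resamples (i.e., bagging applied to the
    prediction problem).
    """
    n = len(x)
    dens = np.zeros_like(y_grid, dtype=float)
    for _ in range(n_boot):
        xb = rng.choice(x, size=n, replace=True)   # nonparametric bootstrap resample
        mu, sigma = xb.mean(), xb.std()            # Gaussian MLE on the resample
        dens += np.exp(-0.5 * ((y_grid - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return dens / n_boot

# Usage sketch: average plug-in Gaussian densities over bootstrap resamples.
x = rng.normal(loc=1.0, scale=2.0, size=50)
grid = np.linspace(-7.0, 9.0, 401)
p = bootstrap_predictive_density(x, grid)
```

The averaged density is a proper (approximately normalized) distribution, in contrast to a single plug-in density it smooths over the sampling variability of the estimator, which is the source of the asymptotic improvement studied in the paper.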

Article information


First available in Project Euclid: 17 May 2005


Keywords: asymptotic theory; bagging; bootstrap predictive distribution; information geometry; Kullback-Leibler divergence


Fushiki, Tadayoshi; Komaki, Fumiyasu; Aihara, Kazuyuki. Nonparametric bootstrap prediction. Bernoulli 11 (2005), no. 2, 293--307. doi:10.3150/bj/1116340296.
