Open Access
An Improvement of the Jackknife Distribution Function Estimator
James G. Booth, Peter Hall
Ann. Statist. 21(3): 1476-1485 (September, 1993). DOI: 10.1214/aos/1176349268

Abstract

In a recent paper, C. F. J. Wu showed that the jackknife estimator of a distribution function has optimal convergence rate $O(n^{-1/2})$, where $n$ denotes the sample size. This rate is achieved by retaining $O(n)$ data values from the original sample during the jackknife algorithm. Wu's result is particularly important since it permits a direct comparison of jackknife and bootstrap methods for distribution estimation. In the present paper we show that a very simple, nonempirical modification of the jackknife estimator improves the convergence rate from $O(n^{-1/2})$ to $O(n^{-5/6})$, and that this rate may be achieved by retaining only $O(n^{2/3})$ data values from the original sample. Our technique consists of mixing the jackknife distribution estimator with the standard normal distribution in an appropriate proportion. The convergence rate of $O(n^{-5/6})$ makes the jackknife significantly more competitive with the bootstrap, which enjoys a convergence rate of $O(n^{-1})$ in this particular problem.
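The mixture idea described above can be sketched in code. The snippet below is an illustrative implementation only: it estimates the distribution of a standardized subsample mean via a "jackknife histogram" built from subsamples drawn without replacement of size $m \approx n^{2/3}$, then mixes that estimate with the standard normal distribution. The subsample count, the mixing proportion `eps = m / n`, and the choice of statistic are assumptions for illustration, not the exact choices made in the paper.

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal distribution function Phi(x)."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def jackknife_mixture_cdf(x, t, n_subsamples=2000, rng=None):
    """Illustrative mixed jackknife estimate of P(T <= t) for the
    standardized mean.  The subsample size m = n^{2/3} reflects the
    paper's claim that O(n^{2/3}) retained values suffice; the mixing
    weight eps is a hypothetical choice, not the paper's formula."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = max(2, int(round(n ** (2.0 / 3.0))))  # retain O(n^{2/3}) values
    xbar = x.mean()
    stats = np.empty(n_subsamples)
    for i in range(n_subsamples):
        # Jackknife resampling: draw m values without replacement
        sub = rng.choice(x, size=m, replace=False)
        # Standardized subsample mean, centered at the full-sample mean
        stats[i] = sqrt(m) * (sub.mean() - xbar) / sub.std(ddof=1)
    jack = np.mean(stats <= t)   # jackknife histogram CDF at t
    eps = m / n                  # illustrative mixing proportion
    # Mix the jackknife estimator with the standard normal distribution
    return (1.0 - eps) * jack + eps * norm_cdf(t)
```

For example, applied to a sample of 200 standard normal observations, the estimate at $t = 0$ is close to $1/2$ and approaches 1 for large $t$, as a distribution function estimate should.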

Citation


James G. Booth, Peter Hall. "An Improvement of the Jackknife Distribution Function Estimator." Ann. Statist. 21 (3): 1476-1485, September 1993. https://doi.org/10.1214/aos/1176349268

Information

Published: September, 1993
First available in Project Euclid: 12 April 2007

zbMATH: 0792.62036
MathSciNet: MR1241275
Digital Object Identifier: 10.1214/aos/1176349268

Subjects:
Primary: 62G05
Secondary: 62D05

Keywords: asymptotic normality, bootstrap, convergence rate, distribution estimation, Edgeworth expansion, jackknife, mixture, sampling without replacement

Rights: Copyright © 1993 Institute of Mathematical Statistics
