## The Annals of Statistics

### An Improvement of the Jackknife Distribution Function Estimator

#### Abstract

In a recent paper, C. F. J. Wu showed that the jackknife estimator of a distribution function has optimal convergence rate $O(n^{-1/2})$, where $n$ denotes the sample size. This rate is achieved by retaining $O(n)$ data values from the original sample during the jackknife algorithm. Wu's result is particularly important since it permits a direct comparison of jackknife and bootstrap methods for distribution estimation. In the present paper we show that a very simple, nonempirical modification of the jackknife estimator improves the convergence rate from $O(n^{-1/2})$ to $O(n^{-5/6})$, and that this rate may be achieved by retaining only $O(n^{2/3})$ data values from the original sample. Our technique consists of mixing the jackknife distribution estimator with the standard normal distribution in an appropriate proportion. The convergence rate of $O(n^{-5/6})$ makes the jackknife significantly more competitive with the bootstrap, which enjoys a convergence rate of $O(n^{-1})$ in this particular problem.
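The mixture idea described above can be sketched in a few lines. The abstract does not give the paper's mixing proportion or the exact jackknife construction, so everything below is an illustrative assumption: `eps` is a placeholder weight (not the paper's optimal proportion), and the delete-1 studentized mean is used only as a simple stand-in for the statistic being estimated.

```python
import math
import random

def normal_cdf(x):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def jackknife_replicates(sample):
    """Delete-1 jackknife replicates of a studentized leave-one-out mean.
    This particular statistic is an illustrative choice, not the paper's."""
    n = len(sample)
    mean = sum(sample) / n
    reps = []
    for i in range(n):
        sub = sample[:i] + sample[i + 1:]
        m = len(sub)
        mu = sum(sub) / m
        s2 = sum((x - mu) ** 2 for x in sub) / (m - 1)
        # studentized deviation of the leave-one-out mean from the full mean
        reps.append((mu - mean) / math.sqrt(s2 / m))
    return reps

def mixed_cdf_estimate(reps, x, eps):
    """Mixture (1 - eps) * [jackknife ECDF] + eps * [standard normal CDF].
    eps is a placeholder; the paper chooses the proportion to attain
    the improved convergence rate."""
    ecdf = sum(r <= x for r in reps) / len(reps)
    return (1.0 - eps) * ecdf + eps * normal_cdf(x)

if __name__ == "__main__":
    random.seed(0)
    data = [random.gauss(0.0, 1.0) for _ in range(50)]
    reps = jackknife_replicates(data)
    print(mixed_cdf_estimate(reps, 0.0, 0.2))
```

Because both components of the mixture are distribution functions, the estimate stays in [0, 1] and is nondecreasing in `x`; with `eps = 0` it reduces to the plain jackknife ECDF, and with `eps = 1` to the standard normal CDF.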

#### Article information

- **Source:** Ann. Statist., Volume 21, Number 3 (1993), 1476–1485.
- **Dates:** First available in Project Euclid: 12 April 2007
- **Permanent link:** https://projecteuclid.org/euclid.aos/1176349268
- **Digital Object Identifier:** doi:10.1214/aos/1176349268
- **Mathematical Reviews number (MathSciNet):** MR1241275
- **Zentralblatt MATH identifier:** 0792.62036