Abstract
In a recent paper, C. F. J. Wu showed that the jackknife estimator of a distribution function has optimal convergence rate $O(n^{-1/2})$, where $n$ denotes the sample size. This rate is achieved by retaining $O(n)$ data values from the original sample during the jackknife algorithm. Wu's result is particularly important since it permits a direct comparison of jackknife and bootstrap methods for distribution estimation. In the present paper we show that a very simple, nonempirical modification of the jackknife estimator improves the convergence rate from $O(n^{-1/2})$ to $O(n^{-5/6})$, and that this rate may be achieved by retaining only $O(n^{2/3})$ data values from the original sample. Our technique consists of mixing the jackknife distribution estimator with the standard normal distribution in an appropriate proportion. The convergence rate of $O(n^{-5/6})$ makes the jackknife significantly more competitive with the bootstrap, which enjoys a convergence rate of $O(n^{-1})$ in this particular problem.
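To make the construction concrete, the following Python sketch implements the mixture idea described in the abstract. Only the orders of magnitude come from the abstract: subsets of size $r = O(n^{2/3})$ and a rate improvement from mixing the jackknife histogram with the standard normal distribution. The Monte Carlo approximation over random subsets, the function names, and the specific mixing weight lam = sqrt(r/n) are illustrative assumptions; the paper derives the exact nonempirical proportion.

import numpy as np
from scipy.stats import norm

def jackknife_histogram(x, r, x_grid, n_subsets=2000, rng=None):
    """Delete-d jackknife (jackknife histogram) estimate of the
    distribution of the standardized sample mean, on x_grid.

    Averages over n_subsets random subsets of size r, a Monte Carlo
    approximation to the average over all C(n, r) subsets.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar, s = x.mean(), x.std(ddof=1)
    f = r / n  # sampling fraction
    # Under sampling without replacement, Var(xbar_S - xbar) is
    # approximately s^2 (1 - f) / r, so rescale subset means to
    # have roughly unit variance.
    scale = np.sqrt(r / (1.0 - f)) / s
    z = np.empty(n_subsets)
    for b in range(n_subsets):
        sub = rng.choice(x, size=r, replace=False)
        z[b] = scale * (sub.mean() - xbar)
    # Empirical CDF of the standardized subset means on the grid.
    return (z[:, None] <= x_grid[None, :]).mean(axis=0)

def mixed_jackknife_cdf(x, x_grid, n_subsets=2000, rng=None):
    """Mixture estimator: lam * F_jack + (1 - lam) * Phi.

    Subset size r = n^{2/3} matches the order stated in the abstract;
    the weight lam = sqrt(r / n) is an assumed, purely illustrative
    choice of the "appropriate proportion" (it depends only on n and
    r, hence is nonempirical, but may not be the paper's exact value).
    """
    n = len(x)
    r = max(2, int(round(n ** (2.0 / 3.0))))  # retain O(n^{2/3}) values
    lam = np.sqrt(r / n)                      # assumed mixing weight
    f_jack = jackknife_histogram(x, r, x_grid, n_subsets, rng)
    return lam * f_jack + (1.0 - lam) * norm.cdf(x_grid)

# Example: estimate P(sqrt(n)(Xbar - mu)/sigma <= x) for a skewed sample.
rng = np.random.default_rng(0)
sample = rng.exponential(size=200)
grid = np.linspace(-3, 3, 13)
print(mixed_jackknife_cdf(sample, grid, rng=rng))

Because the mixing weight depends only on $n$ and $r$, not on the data, the modification adds essentially no computational cost beyond the jackknife resampling itself.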
Citation
James G. Booth and Peter Hall. "An Improvement of the Jackknife Distribution Function Estimator." Ann. Statist. 21 (3): 1476-1485, September 1993. https://doi.org/10.1214/aos/1176349268