The Annals of Statistics

A General Theory for Jackknife Variance Estimation

Jun Shao and C. F. J. Wu



The delete-1 jackknife is known to give inconsistent variance estimators for nonsmooth estimators such as the sample quantiles. This well-known deficiency can be rectified by using a more general jackknife with $d$, the number of observations deleted, depending on a smoothness measure of the point estimator. Our general theory explains why the jackknife works or fails. It also shows that (i) for "sufficiently smooth" estimators, the jackknife variance estimators with bounded $d$ are consistent and asymptotically unbiased and (ii) for "nonsmooth" estimators, $d$ has to go to infinity at a rate explicitly determined by a smoothness measure to ensure consistency and asymptotic unbiasedness. Improved results are obtained for several classes of estimators. In particular, for the sample $p$-quantiles, the jackknife variance estimators with $d$ satisfying $n^{1/2}/d \rightarrow 0$ and $n - d \rightarrow \infty$ are consistent and asymptotically unbiased.
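The delete-$d$ estimator discussed in the abstract can be illustrated with a short sketch. The function below computes the delete-$d$ jackknife variance estimate $v_J = \frac{n-d}{d}\,\binom{n}{d}^{-1} \sum_S (\hat\theta_S - \bar\theta)^2$ over all $\binom{n}{d}$ subsets; the function name and the toy data are illustrative, not from the paper, and a practical implementation would sample subsets rather than enumerate them for large $n$.

```python
import itertools
import statistics

def delete_d_jackknife_variance(x, estimator, d):
    """Delete-d jackknife variance estimate, enumerating all C(n, d) subsets.

    For d = 1 this reduces to the classical delete-1 jackknife formula.
    """
    n = len(x)
    # Recompute the estimator on every retained subset of size n - d.
    thetas = [estimator([x[i] for i in keep])
              for keep in itertools.combinations(range(n), n - d)]
    theta_bar = sum(thetas) / len(thetas)
    # Scale factor (n - d) / d from the delete-d jackknife formula.
    return (n - d) / d * sum((t - theta_bar) ** 2 for t in thetas) / len(thetas)

# Example: the sample median is a nonsmooth estimator, the setting where
# the paper shows d must grow with n; here n is small and d is fixed.
data = [2.1, 3.4, 1.7, 5.0, 4.2, 2.9, 3.8, 1.1, 4.6, 3.0]
v1 = delete_d_jackknife_variance(data, statistics.median, d=1)
v4 = delete_d_jackknife_variance(data, statistics.median, d=4)
```

For a smooth estimator such as the sample mean, the $d=1$ case recovers the familiar estimate $s^2/n$ of the variance of the mean, which is one way to sanity-check the formula.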

Article information

Ann. Statist., Volume 17, Number 3 (1989), 1176-1197.

First available in Project Euclid: 12 April 2007

Digital Object Identifier: doi:10.1214/aos/1176347263


Primary: 62G05: Estimation
Secondary: 62E20: Asymptotic distribution theory 62G99: None of the above, but in this section

Keywords: Asymptotic unbiasedness; balanced subsampling; consistency; Fréchet differentiability; grouped jackknife; $L$-estimator; $M$-estimator; sample quantile; smoothness of an estimator; $U$-statistic; von Mises expansion


Shao, Jun; Wu, C. F. J. A General Theory for Jackknife Variance Estimation. Ann. Statist. 17 (1989), no. 3, 1176--1197. doi:10.1214/aos/1176347263.
