Bernoulli
Volume 25, Number 1 (2019), 654-682.

Second order correctness of perturbation bootstrap M-estimator of multiple linear regression parameter

Debraj Das and S.N. Lahiri


Consider the multiple linear regression model $y_{i}=\mathbf{x}'_{i}\boldsymbol{\beta}+\varepsilon_{i}$, where the $\varepsilon_{i}$'s are independent and identically distributed random variables, the $\mathbf{x}_{i}$'s are known design vectors, and $\boldsymbol{\beta}$ is the $p\times1$ vector of parameters. An effective way of approximating the distribution of the M-estimator $\bar{\boldsymbol{\beta}}_{n}$, after proper centering and scaling, is the perturbation bootstrap method. In the present work, second order properties of this non-naive bootstrap method are investigated. Second order correctness is important because it reduces the approximation error uniformly to $o(n^{-1/2})$, yielding better inference. We show that the classical studentized version of the bootstrapped estimator fails to be second order correct. We introduce a modification of the studentized version of the bootstrapped statistic and show that the modified bootstrapped pivot is second order correct (S.O.C.) for approximating the distribution of the studentized M-estimator. Additionally, we show that the perturbation bootstrap continues to be S.O.C. when the errors $\varepsilon_{i}$'s are independent but not necessarily identically distributed. These findings establish the perturbation bootstrap approximation as a significant improvement over asymptotic normality in regression M-estimation.
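As a rough illustration of the resampling scheme discussed above (a minimal sketch, not the authors' implementation), the perturbation bootstrap replaces the equal weights in the M-estimation criterion with i.i.d. positive random weights of mean one, and re-minimizes. The Huber loss, the Exponential(1) weights, and the simulated design below are illustrative choices; SciPy is assumed available.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def huber_loss(r, c=1.345):
    """Huber rho function, a common choice of M-estimation loss."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r**2, c * a - 0.5 * c**2)

def m_estimate(X, y, w=None):
    """Minimize sum_i w_i * rho(y_i - x_i' beta); w=None gives the plain M-estimator."""
    if w is None:
        w = np.ones(len(y))
    obj = lambda b: np.sum(w * huber_loss(y - X @ b))
    b0 = np.linalg.lstsq(X, y, rcond=None)[0]  # least-squares starting value
    return minimize(obj, b0, method="BFGS").x

# Simulated data from the model y_i = x_i' beta + eps_i (illustrative values)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.standard_normal(n)

beta_hat = m_estimate(X, y)

# Perturbation bootstrap: i.i.d. mean-one weights G_i, re-minimize each time
B = 200
boot = np.empty((B, p))
for b in range(B):
    G = rng.exponential(1.0, size=n)
    boot[b] = m_estimate(X, y, w=G)

# The spread of sqrt(n)(beta*_b - beta_hat) approximates that of
# sqrt(n)(beta_hat - beta); here Exponential(1) weights have mu = sigma = 1.
se_boot = np.sqrt(n) * boot.std(axis=0)
```

In practice one would studentize the bootstrapped statistic; the paper's point is that the classical studentization must be modified for the approximation error to reach $o(n^{-1/2})$.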

Article information

Received: May 2016
Revised: September 2017
First available in Project Euclid: 12 December 2018


Keywords: Edgeworth expansion; generalized bootstrap; M-estimation; perturbation bootstrap; residual bootstrap; S.O.C.; Studentization; wild bootstrap


Das, Debraj; Lahiri, S.N. Second order correctness of perturbation bootstrap M-estimator of multiple linear regression parameter. Bernoulli 25 (2019), no. 1, 654--682. doi:10.3150/17-BEJ1001.



Supplemental materials

  • Supplement to “Second order correctness of perturbation bootstrap M-estimator of multiple linear regression parameter”. Details of the proofs are provided.