Electronic Journal of Statistics

Thresholding least-squares inference in high-dimensional regression models

Mihai Giurcanu

Full-text: Open access

Abstract

We propose a thresholding least-squares method of inference for high-dimensional regression models when the number of parameters, $p$, tends to infinity with the sample size, $n$. Extending the asymptotic behavior of the F-test in high dimensions, we establish the oracle property of the thresholding least-squares estimator when $p=o(n)$. We propose two automatic selection procedures for the thresholding parameter using the Scheffé and Bonferroni methods. We show that, under additional regularity conditions, the results continue to hold even if $p=\exp(o(n))$. Lastly, we show that, if properly centered, the residual-bootstrap estimator of the distribution of the thresholding least-squares estimator is consistent, while a naive bootstrap estimator is inconsistent. In an intensive simulation study, we assess the finite-sample properties of the proposed methods for various sample sizes and model parameters. The analysis of a real-world data set illustrates an application of the methods in practice.
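The core idea described above can be illustrated with a minimal sketch: fit ordinary least squares, hard-threshold the coefficients, and refit on the surviving support. This is a hypothetical toy implementation, not the paper's exact procedure; in particular, the threshold `lam` is user-supplied here, whereas the paper selects it automatically via Scheffé or Bonferroni methods.

```python
import numpy as np

def thresholding_ols(X, y, lam):
    """Sketch of a thresholding least-squares fit (assumed form, not the
    paper's exact algorithm): compute the OLS estimate, zero out
    coefficients whose magnitude falls below the threshold `lam`, then
    refit OLS on the surviving support."""
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    support = np.abs(beta_ols) > lam          # hard-thresholding step
    beta = np.zeros_like(beta_ols)
    if support.any():                         # refit on selected columns
        beta[support], *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
    return beta

# Toy example with a sparse truth and n much larger than p
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.standard_normal(n)

beta_hat = thresholding_ols(X, y, lam=0.5)
print(np.nonzero(beta_hat)[0])  # typically recovers the true support {0, 1, 2}
```

In this well-conditioned toy setting, the noise in each OLS coefficient is of order $\sigma/\sqrt{n}$, far below the threshold, so the estimator behaves like the oracle OLS fit on the true support, which is the oracle property the paper establishes under $p=o(n)$.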

Article information

Source
Electron. J. Statist., Volume 10, Number 2 (2016), 2124-2156.

Dates
Received: January 2016
First available in Project Euclid: 18 July 2016

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1468849973

Digital Object Identifier
doi:10.1214/16-EJS1160

Mathematical Reviews number (MathSciNet)
MR3522671

Zentralblatt MATH identifier
1347.62131

Subjects
Primary: 62J05: Linear regression
Secondary: 62E20: Asymptotic distribution theory

Keywords
Regression models; high-dimensional inference; F-test; thresholding least-squares; residual-bootstrap

Citation

Giurcanu, Mihai. Thresholding least-squares inference in high-dimensional regression models. Electron. J. Statist. 10 (2016), no. 2, 2124--2156. doi:10.1214/16-EJS1160. https://projecteuclid.org/euclid.ejs/1468849973

