The Annals of Statistics

Consistent parameter estimation for LASSO and approximate message passing

Ali Mousavi, Arian Maleki, and Richard G. Baraniuk


Abstract

This paper studies the optimal tuning of the regularization parameter in LASSO and of the threshold parameters in approximate message passing (AMP). Considering a model in which the design matrix and noise are zero-mean i.i.d. Gaussian, we propose a data-driven approach for estimating the regularization parameter of LASSO and the threshold parameters of AMP. Our estimates are consistent, that is, they converge in probability to their asymptotically optimal values as $n$, the number of observations, and $p$, the ambient dimension of the sparse vector, grow to infinity while $n/p$ converges to a fixed number $\delta$. As a byproduct of our analysis, we shed light on the asymptotic properties of the solution paths of LASSO and AMP.
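To make the tuning problem concrete, the sketch below shows a generic AMP iteration with soft thresholding for the model $y = Ax + w$. The threshold at each iteration is a multiplier `alpha` times an estimate of the effective noise level; the fixed `alpha` used here is a hypothetical stand-in for the data-driven, consistently estimated tuning that the paper develops, not the authors' procedure.

```python
import numpy as np

def soft_threshold(x, tau):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def amp(A, y, alpha=1.5, n_iter=30):
    """Generic AMP iteration for y = A x + w with i.i.d. Gaussian A.

    At each iteration the pseudo-data r = x + A^T z behaves, in the
    large-system limit, like the true signal plus Gaussian noise, so a
    soft threshold of alpha times the estimated noise level is applied.
    Here alpha is fixed for illustration; choosing it optimally from the
    data is the tuning problem studied in the paper.
    """
    n, p = A.shape
    x = np.zeros(p)
    z = y.copy()
    for _ in range(n_iter):
        r = x + A.T @ z                          # effective observation
        sigma = np.linalg.norm(z) / np.sqrt(n)   # noise-level estimate
        x_new = soft_threshold(r, alpha * sigma)
        # Onsager correction term: distinguishes AMP from plain
        # iterative soft thresholding and keeps r approximately Gaussian.
        onsager = z * np.count_nonzero(x_new) / n
        z = y - A @ x_new + onsager
        x = x_new
    return x
```

On a well-conditioned synthetic instance (e.g., $\delta = n/p = 0.5$ with 5% nonzeros), this iteration recovers the sparse vector to small error, but its performance depends sharply on `alpha`, which is why consistent estimation of the optimal threshold matters.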

Article information

Source
Ann. Statist., Volume 45, Number 6 (2017), 2427-2454.

Dates
Received: November 2015
Revised: November 2016
First available in Project Euclid: 15 December 2017

Permanent link to this document
https://projecteuclid.org/euclid.aos/1513328578

Digital Object Identifier
doi:10.1214/16-AOS1529

Mathematical Reviews number (MathSciNet)
MR3737897

Zentralblatt MATH identifier
06838138

Subjects
Primary: 62G05 (Estimation); 62J05 (Linear regression)

Keywords
LASSO; estimation; sparsity; approximate message passing

Citation

Mousavi, Ali; Maleki, Arian; Baraniuk, Richard G. Consistent parameter estimation for LASSO and approximate message passing. Ann. Statist. 45 (2017), no. 6, 2427--2454. doi:10.1214/16-AOS1529. https://projecteuclid.org/euclid.aos/1513328578


Supplemental materials

  • Supplement to “Consistent parameter estimation for LASSO and approximate message passing”. This supplementary material contains the proofs of the theorems and additional simulation results.