## Institute of Mathematical Statistics Collections

### The Lasso, correlated design, and improved oracle inequalities

#### Abstract

We study high-dimensional linear models and the $\ell_1$-penalized least squares estimator, also known as the Lasso estimator. In the literature, oracle inequalities have been derived under restricted eigenvalue or compatibility conditions. In this paper, we complement these with entropy conditions that allow one to improve the dual norm bound, and we demonstrate how this leads to new oracle inequalities. The new oracle inequalities show that a smaller choice of the tuning parameter and a trade-off between $\ell_1$-norms and small compatibility constants are possible. In particular for correlated design, this yields improved bounds on the prediction error of the Lasso estimator compared to methods based on restricted eigenvalue or compatibility conditions alone.
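The estimator under study is the standard Lasso, $\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^p} \left\{ \|Y - X\beta\|_2^2/n + \lambda \|\beta\|_1 \right\}$. A minimal numerical sketch (not from the paper; the equicorrelated design, noise level, and scikit-learn's parametrization of the penalty are illustrative assumptions) shows the role of the tuning parameter on a correlated design:

```python
# Illustrative sketch only: fit the Lasso on an equicorrelated Gaussian design
# and report prediction error for several tuning parameters. scikit-learn's
# Lasso minimizes ||y - X b||_2^2 / (2n) + alpha * ||b||_1, so alpha plays
# the role of the tuning parameter lambda (up to a constant factor).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                 # samples, dimension, sparsity

# Equicorrelated design: pairwise correlation rho between all covariates.
rho = 0.9
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), cov, size=n)

beta = np.zeros(p)
beta[:s] = 1.0                        # s-sparse true coefficient vector
y = X @ beta + rng.normal(scale=0.5, size=n)

for alpha in (0.5, 0.1, 0.02):
    fit = Lasso(alpha=alpha, max_iter=50_000).fit(X, y)
    # In-sample prediction error ||X(beta_hat - beta)||_2^2 / n.
    pred_err = np.mean((X @ (fit.coef_ - beta)) ** 2)
    print(f"alpha={alpha:5.2f}  prediction error={pred_err:.4f}  "
          f"nonzeros={np.count_nonzero(fit.coef_)}")
```

The paper's contribution concerns when such smaller tuning-parameter choices are theoretically justified; the simulation above merely illustrates the quantities involved.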

#### Chapter information

Source
Banerjee, M., Bunea, F., Huang, J., Koltchinskii, V., and Maathuis, M. H., eds., From Probability to Statistics and Back: High-Dimensional Models and Processes -- A Festschrift in Honor of Jon A. Wellner (Beachwood, Ohio, USA: Institute of Mathematical Statistics, 2013), 303–316

Dates
First available in Project Euclid: 8 March 2013

Permanent link to this document
https://projecteuclid.org/euclid.imsc/1362751196

Digital Object Identifier
doi:10.1214/12-IMSCOLL922

Zentralblatt MATH identifier
1327.62426

Subjects
Primary: 62J05: Linear regression
Secondary: 62J99: None of the above, but in this section

Rights
Copyright © 2010, Institute of Mathematical Statistics

#### Citation

van de Geer, Sara; Lederer, Johannes. The Lasso, correlated design, and improved oracle inequalities. From Probability to Statistics and Back: High-Dimensional Models and Processes -- A Festschrift in Honor of Jon A. Wellner, 303--316, Institute of Mathematical Statistics, Beachwood, Ohio, USA, 2013. doi:10.1214/12-IMSCOLL922. https://projecteuclid.org/euclid.imsc/1362751196
