Open Access
Slope meets Lasso: Improved oracle bounds and optimality
Pierre C. Bellec, Guillaume Lecué, Alexandre B. Tsybakov
Ann. Statist. 46(6B): 3603-3642 (December 2018). DOI: 10.1214/17-AOS1670

Abstract

We show that two polynomial-time methods, a Lasso estimator with adaptively chosen tuning parameter and a Slope estimator, adaptively achieve the minimax prediction and $\ell_{2}$ estimation rate $(s/n)\log(p/s)$ in high-dimensional linear regression on the class of $s$-sparse vectors in $\mathbb{R}^{p}$. This is done under the Restricted Eigenvalue (RE) condition for the Lasso and under a slightly more constraining assumption on the design for the Slope. The main results have the form of sharp oracle inequalities accounting for the model misspecification error. The minimax optimal bounds are also obtained for the $\ell_{q}$ estimation errors with $1\le q\le2$ when the model is well specified. The results are nonasymptotic, and hold both in probability and in expectation. The assumptions that we impose on the design are satisfied with high probability for a large class of random matrices with independent and possibly anisotropically distributed rows. We give a comparative analysis of conditions under which oracle bounds for the Lasso and Slope estimators can be obtained. In particular, we show that several known conditions, such as the RE condition and the sparse eigenvalue condition, are equivalent if the $\ell_{2}$-norms of regressors are uniformly bounded.
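The Slope estimator discussed in the abstract minimizes a least-squares criterion plus a sorted-$\ell_1$ penalty with a decreasing weight sequence. A minimal sketch (not the authors' code) of its key computational ingredient, the proximal operator of the sorted-$\ell_1$ norm, which reduces to soft-thresholding followed by an isotonic (nonincreasing) projection computed via pool-adjacent-violators; the weight constants `A` and `sigma` below are illustrative assumptions, not the paper's exact tuning:

```python
# Illustrative sketch, not the authors' implementation: the prox of the
# sorted-l1 (Slope) penalty via threshold + isotonic projection (PAVA).
import numpy as np

def _project_nonincreasing(z):
    """Euclidean projection onto nonincreasing sequences (pool-adjacent-violators)."""
    vals, wts = [], []
    for x in z:
        vals.append(float(x)); wts.append(1)
        # pool adjacent blocks that violate the nonincreasing order
        while len(vals) > 1 and vals[-2] < vals[-1]:
            w = wts[-1] + wts[-2]
            vals[-2] = (vals[-1] * wts[-1] + vals[-2] * wts[-2]) / w
            wts[-2] = w
            vals.pop(); wts.pop()
    return np.repeat(vals, wts)

def prox_sorted_l1(y, lam):
    """Prox of the sorted-l1 penalty; lam must be sorted in nonincreasing order."""
    sign = np.sign(y)
    a = np.abs(y)
    order = np.argsort(-a)                 # sort |y| in decreasing order
    z = _project_nonincreasing(a[order] - lam)
    z = np.maximum(z, 0.0)                 # threshold at zero
    out = np.empty_like(z)
    out[order] = z                         # undo the sort
    return sign * out

def slope_weights(n, p, sigma=1.0, A=1.1):
    """Decreasing weights of the form A*sigma*sqrt(log(2p/j)/n); with such a
    sequence the Slope penalty matches the (s/n)log(p/s) rate in the abstract
    (constants here are illustrative, not the paper's exact choice)."""
    j = np.arange(1, p + 1)
    return A * sigma * np.sqrt(np.log(2 * p / j) / n)
```

Plugging `prox_sorted_l1` into a standard proximal-gradient loop on the least-squares loss yields a basic Slope solver; with all weights equal, the prox reduces to ordinary soft-thresholding and the loop reduces to a Lasso solver.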

Citation


Pierre C. Bellec, Guillaume Lecué, Alexandre B. Tsybakov. "Slope meets Lasso: Improved oracle bounds and optimality." Ann. Statist. 46(6B): 3603-3642, December 2018. https://doi.org/10.1214/17-AOS1670

Information

Received: 1 May 2016; Revised: 1 May 2017; Published: December 2018
First available in Project Euclid: 11 September 2018

zbMATH: 1405.62056
MathSciNet: MR3852663
Digital Object Identifier: 10.1214/17-AOS1670

Subjects:
Primary: 60K35, 62G08
Secondary: 62C20, 62G05, 62G20

Keywords: high-dimensional statistics, Lasso, minimax rates, Slope, sparse linear regression

Rights: Copyright © 2018 Institute of Mathematical Statistics
