Open Access
Some sharp performance bounds for least squares regression with L1 regularization
Tong Zhang
Ann. Statist. 37(5A): 2109-2144 (October 2009). DOI: 10.1214/08-AOS659

Abstract

We derive sharp performance bounds for least squares regression with L1 regularization from the perspectives of parameter estimation accuracy and feature selection quality. The main result proved for L1 regularization extends a similar result in [Ann. Statist. 35 (2007) 2313–2351] for the Dantzig selector, and it gives an affirmative answer to an open question in [Ann. Statist. 35 (2007) 2358–2364]. Moreover, the result leads to an extended view of feature selection that requires less restrictive conditions than some recent work. Based on these theoretical insights, a novel two-stage L1-regularization procedure with selective penalization is analyzed. It is shown that if the target parameter vector can be decomposed as the sum of a sparse parameter vector with large coefficients and another, less sparse vector with relatively small coefficients, then the two-stage procedure can lead to improved performance.
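As a rough illustration of the two-stage idea the abstract describes (not the paper's exact algorithm), the sketch below first fits an ordinary Lasso, i.e. minimizes (1/2n)||y − Xb||² + λ||b||₁, then refits with the L1 penalty applied only to coordinates whose first-stage coefficients are small, leaving the large ones unpenalized ("selective penalization"). The ISTA solver, the function names lasso_ista and two_stage_lasso, and the tuning parameters lam1, lam2, tau are all illustrative assumptions.

```python
# A minimal numpy sketch of two-stage L1 regularization with selective
# penalization. Solver, threshold rule, and penalty levels are assumptions
# for illustration, not the paper's prescribed choices.
import numpy as np

def lasso_ista(X, y, weights, lam, n_iter=2000):
    """Proximal gradient (ISTA) for 0.5/n * ||y - X b||^2 + lam * sum(weights * |b|)."""
    n, p = X.shape
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)  # 1 / Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        z = b - step * grad
        # soft-threshold each coordinate by its own penalty level;
        # a zero weight leaves that coordinate unpenalized
        thr = step * lam * weights
        b = np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)
    return b

def two_stage_lasso(X, y, lam1, lam2, tau):
    p = X.shape[1]
    # Stage 1: standard Lasso, every coordinate penalized equally.
    b1 = lasso_ista(X, y, np.ones(p), lam1)
    # Coordinates with large first-stage coefficients are treated as the
    # sparse "large" component and left unpenalized in the refit.
    weights = np.where(np.abs(b1) > tau, 0.0, 1.0)
    # Stage 2: selective penalization on the remaining coordinates only.
    return lasso_ista(X, y, weights, lam2)
```

Under the decomposition in the abstract, the second stage removes the shrinkage bias on the large sparse component while still regularizing the small, less sparse remainder, which is the mechanism behind the claimed improvement.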

Citation


Tong Zhang. "Some sharp performance bounds for least squares regression with L1 regularization." Ann. Statist. 37 (5A) 2109 - 2144, October 2009. https://doi.org/10.1214/08-AOS659

Information

Published: October 2009
First available in Project Euclid: 15 July 2009

MathSciNet: MR2543687
Digital Object Identifier: 10.1214/08-AOS659

Subjects:
Primary: 62G05
Secondary: 62J05

Keywords: L1 regularization, Lasso, parameter estimation, regression, sparsity, variable selection

Rights: Copyright © 2009 Institute of Mathematical Statistics
