Open Access
Iteratively reweighted ℓ1-penalized robust regression
Xiaoou Pan, Qiang Sun, Wen-Xin Zhou
Electron. J. Statist. 15(1): 3287-3348 (2021). DOI: 10.1214/21-EJS1862

Abstract

This paper investigates tradeoffs among optimization errors, statistical rates of convergence, and the effect of heavy-tailed errors for high-dimensional robust regression with nonconvex regularization. When the additive errors in linear models have only bounded second moments, we show that the iteratively reweighted ℓ1-penalized adaptive Huber regression estimator satisfies exponential deviation bounds and oracle properties, including the oracle convergence rate and variable selection consistency, under a weak beta-min condition. Computationally, we need as many as O(log s + log log d) iterations to reach such an oracle estimator, where s and d denote the sparsity and ambient dimension, respectively. Extension to a general class of robust loss functions is also considered. Numerical studies lend strong support to our methodology and theory.
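The procedure described in the abstract can be sketched in a few lines of NumPy: an inner proximal-gradient solver for a weighted-ℓ1-penalized Huber loss, wrapped in an outer loop that updates the penalty weights via a folded-concave (SCAD-type) penalty derivative. This is a minimal illustration, not the authors' implementation; the function names, step-size rule, SCAD parameter a = 3.7, and iteration counts are all assumptions made for the sketch.

```python
import numpy as np

def huber_grad(r, tau):
    # Derivative of the Huber loss: r when |r| <= tau, tau * sign(r) otherwise.
    return np.clip(r, -tau, tau)

def soft_threshold(z, t):
    # Componentwise prox of the weighted l1 penalty (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_l1_huber(X, y, lam_weights, tau, beta0, step, n_inner=200):
    # Inner solver: proximal gradient descent on the weighted-l1-penalized
    # Huber regression objective (illustrative, fixed step size).
    n = X.shape[0]
    beta = beta0.copy()
    for _ in range(n_inner):
        grad = -X.T @ huber_grad(y - X @ beta, tau) / n
        beta = soft_threshold(beta - step * grad, step * lam_weights)
    return beta

def irw_huber(X, y, lam, tau, a=3.7, n_outer=3):
    # Outer loop: start from uniform (lasso-type) weights, then reweight each
    # coordinate by the SCAD penalty derivative at the current estimate.
    n, d = X.shape
    beta = np.zeros(d)
    step = n / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    weights = np.full(d, lam)
    for _ in range(n_outer):
        beta = weighted_l1_huber(X, y, weights, tau, beta, step)
        ab = np.abs(beta)
        # SCAD derivative: lam on [0, lam], linearly decaying to 0 at a * lam.
        weights = np.where(ab <= lam, lam,
                           np.maximum(a * lam - ab, 0.0) / (a - 1.0))
    return beta
```

On simulated data with heavy-tailed (Student-t) noise and a strong sparse signal, even this crude version recovers the support while shrinking the reweighted penalty toward zero on large coefficients, which is the mechanism behind the oracle rate discussed above.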

Funding Statement

Sun was supported in part by the NSERC Grant RGPIN-2018-06484. Zhou acknowledges the support from NSF Grant DMS-1811376 and UCSD General Campus Research Grant.

Acknowledgments

The authors are very grateful to the Editor and two anonymous referees for their careful reading of the manuscript and for many valuable remarks and suggestions.

Citation


Xiaoou Pan, Qiang Sun, Wen-Xin Zhou. "Iteratively reweighted ℓ1-penalized robust regression." Electron. J. Statist. 15(1): 3287-3348, 2021. https://doi.org/10.1214/21-EJS1862

Information

Received: 1 December 2020; Published: 2021
First available in Project Euclid: 22 June 2021

arXiv: 1907.04027
Digital Object Identifier: 10.1214/21-EJS1862

Subjects:
Primary: 62A01
Secondary: 62J07

Keywords: Adaptive Huber regression, convex relaxation, heavy-tailed noise, nonconvex regularization, optimization error, oracle property, oracle rate
