The Annals of Statistics
- Ann. Statist.
- Volume 39, Number 5 (2011), 2766-2794.
Robust linear least squares regression
We consider the problem of robustly predicting as well as the best linear combination of d given functions in least squares regression, and variants of this problem including constraints on the parameters of the linear combination. For the ridge estimator and the ordinary least squares estimator, and their variants, we provide new risk bounds of order d/n without the logarithmic factor that appears in some standard results, where n is the size of the training data. We also provide a new estimator with better deviations in the presence of heavy-tailed noise. It is based on truncating differences of losses in a min–max framework and satisfies a d/n risk bound both in expectation and in deviations. A surprising feature common to these results is that exponential deviations are achieved without any exponential moment condition on the output distribution. All risk bounds are obtained through a PAC-Bayesian analysis of truncated differences of losses. Experimental results strongly back up our truncated min–max estimator.
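To make the idea concrete, the following is a minimal toy sketch, in the spirit of the abstract, of selecting a linear parameter by truncated differences of losses in a min–max fashion. It is not the authors' estimator: the truncation function `psi`, the scale `lam`, and the finite candidate set are all illustrative assumptions (the paper works with a continuous parameter space and a specific influence function).

```python
import numpy as np

def psi(x, lam=1.0):
    # Assumed truncation: clip the scaled loss difference to a bounded
    # range, so a single heavy-tailed output has bounded influence.
    return np.clip(lam * x, -1.0, 1.0)

def truncated_minmax_select(X, y, candidates, lam=1.0):
    """Toy min-max selection: pick the candidate theta whose worst-case
    (over rival candidates) sum of truncated loss differences is smallest."""
    # Per-sample squared losses for each candidate parameter vector.
    losses = [(y - X @ th) ** 2 for th in candidates]
    best, best_score = None, np.inf
    for i, Li in enumerate(losses):
        # max over rivals of the truncated empirical sum of loss differences
        score = max(psi(Li - Lj, lam).sum()
                    for j, Lj in enumerate(losses) if j != i)
        if score < best_score:
            best, best_score = candidates[i], score
    return best
```

On well-specified data with moderate noise, the candidate closest to the true parameter wins the min–max comparison, because its truncated loss differences against every rival are predominantly negative.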
First available in Project Euclid: 22 December 2011
Keywords: Linear regression; generalization error; shrinkage; PAC-Bayesian theorems; risk bounds; robust statistics; resistant estimators; Gibbs posterior distributions; randomized estimators; statistical learning theory
Audibert, Jean-Yves; Catoni, Olivier. Robust linear least squares regression. Ann. Statist. 39 (2011), no. 5, 2766--2794. doi:10.1214/11-AOS918. https://projecteuclid.org/euclid.aos/1324563355
- Supplementary material: Supplement to "Robust linear least squares regression". The supplementary material provides the proofs of Theorems 2.1, 2.2 and 3.1.