Open Access
October 2011

Robust linear least squares regression
Jean-Yves Audibert, Olivier Catoni
Ann. Statist. 39(5): 2766–2794 (October 2011). DOI: 10.1214/11-AOS918

Abstract

We consider the problem of robustly predicting as well as the best linear combination of d given functions in least squares regression, together with variants of this problem that place constraints on the parameters of the linear combination. For the ridge estimator and the ordinary least squares estimator, and their variants, we provide new risk bounds of order d/n, where n is the size of the training data, without the logarithmic factor present in some standard results. We also provide a new estimator with better deviations in the presence of heavy-tailed noise. It is based on truncating differences of losses in a min–max framework and satisfies a d/n risk bound both in expectation and in deviations. A common and surprising feature of these results is that exponential deviations are achieved without any exponential moment condition on the output distribution. All risk bounds are obtained through a PAC-Bayesian analysis of truncated differences of losses. Experimental results strongly back up our truncated min–max estimator.
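To make the "truncating differences of losses in a min–max framework" idea concrete, here is a minimal illustrative sketch in one dimension. It is not the authors' exact estimator: the influence function `psi` below is a common Catoni-style choice, the grid search and the tuning constant `alpha` are simplifications introduced here for illustration, and the paper's construction differs in detail. The candidate that minimizes the worst-case truncated empirical mean of loss differences is returned.

```python
import numpy as np

def psi(x):
    # Bounded, Catoni-style influence function (an illustrative choice;
    # the paper's truncation differs in detail).
    return np.where(x >= 0,
                    np.log1p(x + x ** 2 / 2),
                    -np.log1p(-x + x ** 2 / 2))

def truncated_minmax_1d(X, Y, grid, alpha):
    """Min-max estimator over a 1-d candidate grid of slopes.

    D[i, j] = empirical mean of psi(alpha * (loss(theta_i) - loss(theta_j))).
    The estimate is argmin over theta_i of max over theta_j of D[i, j].
    """
    losses = (Y[None, :] - grid[:, None] * X[None, :]) ** 2   # (k, n)
    diffs = losses[:, None, :] - losses[None, :, :]           # (k, k, n)
    D = psi(alpha * diffs).mean(axis=2)                       # (k, k)
    return grid[np.argmin(D.max(axis=1))]

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=n)
theta_star = 2.0
# Heavy-tailed noise: Student t with 2.5 degrees of freedom,
# so the output has no exponential moments.
Y = theta_star * X + rng.standard_t(df=2.5, size=n)

grid = np.linspace(0.0, 4.0, 81)
theta_hat = truncated_minmax_1d(X, Y, grid, alpha=0.1)
theta_ols = (X @ Y) / (X @ X)
print("truncated min-max:", theta_hat, " OLS:", theta_ols)
```

The truncation bounds the influence of any single heavy-tailed observation on the comparison between two candidates, which is the mechanism behind the exponential deviation bounds described in the abstract.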

Citation

Jean-Yves Audibert, Olivier Catoni. "Robust linear least squares regression." Ann. Statist. 39(5): 2766–2794, October 2011. https://doi.org/10.1214/11-AOS918

Information

Published: October 2011
First available in Project Euclid: 22 December 2011

zbMATH: 1231.62126
MathSciNet: MR2906886
Digital Object Identifier: 10.1214/11-AOS918

Subjects:
Primary: 62J05, 62J07

Keywords: Generalization error, Gibbs posterior distributions, Linear regression, PAC-Bayesian theorems, randomized estimators, resistant estimators, risk bounds, robust statistics, shrinkage, statistical learning theory

Rights: Copyright © 2011 Institute of Mathematical Statistics
