Open Access
On Bayesian robust regression with diverging number of predictors
Daniel Nevo, Ya’acov Ritov
Electron. J. Statist. 10(2): 3045-3062 (2016). DOI: 10.1214/16-EJS1205

Abstract

This paper concerns the robust regression model when the number of predictors and the number of observations grow at a similar rate. Theory for M-estimators in this regime has recently been developed by several authors (El Karoui et al., 2013; Bean et al., 2013; Donoho and Montanari, 2013). Motivated by the inability of M-estimators to successfully estimate the Euclidean norm of the coefficient vector, we consider a Bayesian framework for this model. We suggest a two-component mixture of normals prior for the coefficients and develop a Gibbs sampler procedure for sampling from the relevant posterior distributions, utilizing a scale mixture of normals representation for the error distribution. Unlike M-estimators, the proposed Bayes estimator is consistent in the Euclidean norm sense. Simulation results demonstrate the superiority of the Bayes estimator over traditional estimation methods.
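The abstract's ingredients can be illustrated with a minimal Gibbs sampler sketch. It is an assumption-laden toy, not the paper's algorithm: here the errors are taken to be Student-t (a scale mixture of normals with inverse-gamma mixing), the coefficient prior is a two-component normal mixture (a small-variance "spike" and a large-variance "slab"), and all hyperparameter values (`nu`, `tau0`, `tau1`, `p_slab`) are illustrative choices, not those of the paper.

```python
import numpy as np


def gibbs_robust_regression(X, y, n_iter=600, burn=300, nu=3.0,
                            tau0=0.01, tau1=10.0, p_slab=0.5, seed=0):
    """Toy Gibbs sampler: t_nu errors via an inverse-gamma scale mixture
    of normals, and a two-component normal mixture (spike/slab) prior on
    the coefficients.  Returns the posterior mean of beta."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    lam = np.ones(n)                      # latent per-observation error variances
    z = np.ones(p, dtype=int)             # prior component: 0 = spike, 1 = slab
    tau2 = np.array([tau0 ** 2, tau1 ** 2])
    draws = []
    for it in range(n_iter):
        # 1) beta | lam, z: Gaussian full conditional
        w = 1.0 / lam
        prec = X.T @ (X * w[:, None]) + np.diag(1.0 / tau2[z])
        cov = np.linalg.inv(prec)
        mean = cov @ (X.T @ (w * y))
        beta = rng.multivariate_normal(mean, cov)
        # 2) lam_i | beta: inverse-gamma full conditional (t_nu mixing)
        r = y - X @ beta
        lam = 1.0 / rng.gamma((nu + 1.0) / 2.0, 2.0 / (nu + r ** 2))
        # 3) z_j | beta_j: Bernoulli, posterior odds of slab vs spike
        log_slab = -0.5 * beta ** 2 / tau2[1] - 0.5 * np.log(tau2[1]) + np.log(p_slab)
        log_spike = -0.5 * beta ** 2 / tau2[0] - 0.5 * np.log(tau2[0]) + np.log(1 - p_slab)
        prob_slab = 1.0 / (1.0 + np.exp(np.clip(log_spike - log_slab, -50, 50)))
        z = (rng.uniform(size=p) < prob_slab).astype(int)
        if it >= burn:
            draws.append(beta.copy())
    return np.mean(draws, axis=0)


# Illustrative run on simulated data with heavy-tailed (t_3) errors.
rng = np.random.default_rng(1)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.0, 0.0, 1.5, 0.0])
y = X @ beta_true + rng.standard_t(3, size=n)
beta_hat = gibbs_robust_regression(X, y)
```

Each full conditional is available in closed form precisely because of the two mixture representations: conditioning on the latent scales `lam` makes the likelihood Gaussian, and conditioning on the indicators `z` makes the prior Gaussian, so `beta` is drawn from a multivariate normal at every sweep.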

Citation


Daniel Nevo, Ya’acov Ritov. "On Bayesian robust regression with diverging number of predictors." Electron. J. Statist. 10(2): 3045-3062, 2016. https://doi.org/10.1214/16-EJS1205

Information

Received: 1 July 2015; Published: 2016
First available in Project Euclid: 9 November 2016

zbMATH: 1366.62139
MathSciNet: MR3571962
Digital Object Identifier: 10.1214/16-EJS1205

Subjects:
Primary: 62F15, 62H12, 62J05

Keywords: Bayesian estimation, high dimensional regression, MCMC, robust regression

Rights: Copyright © 2016 The Institute of Mathematical Statistics and the Bernoulli Society
