Open Access
MAP model selection in Gaussian regression
Felix Abramovich, Vadim Grinshtein
Electron. J. Statist. 4: 932-949 (2010). DOI: 10.1214/10-EJS573

Abstract

We consider a Bayesian approach to model selection in Gaussian linear regression, where the number of predictors might be much larger than the number of observations. From a frequentist viewpoint, the proposed procedure amounts to penalized least squares estimation with a complexity penalty associated with a prior on the model size. We investigate the optimality properties of the resulting model selector. We establish an oracle inequality and specify conditions on the prior that imply its asymptotic minimaxity within a wide range of sparse and dense settings for “nearly-orthogonal” and “multicollinear” designs.
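To illustrate the frequentist reading of the procedure, the sketch below performs complexity-penalized least squares: each candidate model (subset of predictors) is scored by its residual sum of squares plus a penalty that depends only on the model size. This is a minimal, hypothetical illustration; the function names, the exhaustive-search strategy, and the particular "k log(p/k)"-type penalty constants are assumptions for exposition, not the paper's exact prior-induced penalty or algorithm.

```python
import itertools
import numpy as np

def map_model_select(X, y, penalty, max_size=None):
    """Exhaustive complexity-penalized least squares over subsets of predictors.

    For each candidate model M (a subset of columns of X), compute
    RSS(M) + penalty(|M|) and return the minimizing subset.  Only feasible
    when the number of predictors (or max_size) is small.
    """
    n, p = X.shape
    max_size = p if max_size is None else max_size
    # The empty model: RSS is just the squared norm of y.
    best_score, best_subset = float(np.sum(y ** 2)) + penalty(0), ()
    for k in range(1, max_size + 1):
        for subset in itertools.combinations(range(p), k):
            Xm = X[:, subset]
            beta, *_ = np.linalg.lstsq(Xm, y, rcond=None)
            rss = float(np.sum((y - Xm @ beta) ** 2))
            score = rss + penalty(k)
            if score < best_score:
                best_score, best_subset = score, subset
    return best_subset, best_score

def make_penalty(p, sigma2, c=2.0):
    """Illustrative penalty of the 'sigma^2 * k * log(p/k)'-type common in
    sparse settings; the constant c is a placeholder, not the paper's calibration."""
    return lambda k: c * sigma2 * k * (np.log(p / k) + 1.0) if k > 0 else 0.0
```

For moderate or large numbers of predictors the exhaustive search above is infeasible; in practice one would restrict attention to a data-driven path of candidate models (e.g., a greedy or stochastic search) while keeping the same size-dependent penalty.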

Citation


Felix Abramovich, Vadim Grinshtein. "MAP model selection in Gaussian regression." Electron. J. Statist. 4: 932-949, 2010. https://doi.org/10.1214/10-EJS573

Information

Published: 2010
First available in Project Euclid: 24 September 2010

zbMATH: 1329.62051
MathSciNet: MR2721039
Digital Object Identifier: 10.1214/10-EJS573

Subjects:
Primary: 62C99
Secondary: 62C10, 62C20, 62G05

Keywords: adaptivity, complexity penalty, Gaussian linear regression, maximum a posteriori rule, minimax estimation, model selection, oracle inequality, sparsity

Rights: Copyright © 2010 The Institute of Mathematical Statistics and the Bernoulli Society
