Abstract
We consider a Bayesian approach to model selection in Gaussian linear regression, where the number of predictors may be much larger than the number of observations. From a frequentist viewpoint, the proposed procedure amounts to penalized least squares estimation with a complexity penalty associated with a prior on the model size. We investigate the optimality properties of the resulting model selector: we establish an oracle inequality and specify conditions on the prior under which the selector is asymptotically minimax over a wide range of sparse and dense settings for “nearly-orthogonal” and “multicollinear” designs.
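To make the criterion concrete, the sketch below implements penalized least squares over candidate subsets of predictors with a complexity penalty that depends only on the model size, which is the general form described in the abstract. The specific penalty `complexity_penalty`, the constant `c`, and the helper names (`map_model_select`) are illustrative assumptions for this sketch; the paper's exact penalty is determined by the chosen prior on the model size and the design, and the brute-force subset search is used here only for small p to illustrate the criterion, not as the paper's algorithm.

```python
# Minimal sketch: penalized least squares model selection with a
# complexity penalty on the model size. Penalty form and constants are
# illustrative assumptions, not the paper's exact specification.
import itertools
from math import lgamma

import numpy as np


def complexity_penalty(k, p, sigma2, c=2.0):
    """Illustrative penalty on a model of size k among p predictors."""
    if k == 0:
        return 0.0
    # log of the number of size-k subsets, computed stably via lgamma
    log_n_choose_k = lgamma(p + 1) - lgamma(k + 1) - lgamma(p - k + 1)
    return c * sigma2 * (k + log_n_choose_k)


def map_model_select(X, y, sigma2, max_size=None):
    """Exhaustive search minimizing RSS(M) + pen(|M|) over subsets M."""
    n, p = X.shape
    max_size = p if max_size is None else max_size
    best_subset = ()
    best_score = float(y @ y)  # empty model: RSS = ||y||^2, pen(0) = 0
    for k in range(1, max_size + 1):
        pen = complexity_penalty(k, p, sigma2)
        for subset in itertools.combinations(range(p), k):
            XM = X[:, list(subset)]
            beta, *_ = np.linalg.lstsq(XM, y, rcond=None)
            rss = float(np.sum((y - XM @ beta) ** 2))
            if rss + pen < best_score:
                best_subset, best_score = subset, rss + pen
    return best_subset, best_score


if __name__ == "__main__":
    # Small simulated example with a sparse true model
    rng = np.random.default_rng(0)
    n, p, sigma = 50, 8, 1.0
    X = rng.standard_normal((n, p))
    beta_true = np.zeros(p)
    beta_true[[0, 3]] = [2.0, -1.5]  # two active predictors
    y = X @ beta_true + sigma * rng.standard_normal(n)
    subset, score = map_model_select(X, y, sigma ** 2, max_size=4)
    print("selected predictors:", subset)  # ideally (0, 3)
```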
Citation
Felix Abramovich, Vadim Grinshtein. "MAP model selection in Gaussian regression." Electron. J. Statist. 4: 932–949, 2010. https://doi.org/10.1214/10-EJS573