Open Access
Gaussian model selection with an unknown variance
Yannick Baraud, Christophe Giraud, Sylvie Huet
Ann. Statist. 37(2): 630-672 (April 2009). DOI: 10.1214/07-AOS573

Abstract

Let Y be a Gaussian vector whose components are independent with a common unknown variance. We consider the problem of estimating the mean μ of Y by model selection. More precisely, we start with a collection $\mathcal{S}=\{S_{m},m\in\mathcal{M}\}$ of linear subspaces of $\mathbb{R}^{n}$ and associate to each of these the least-squares estimator of μ on $S_{m}$. Then, we use a data-driven penalized criterion to select one estimator among them. Our first objective is to analyze the performance of the estimators associated with classical criteria such as FPE, AIC, BIC and AMDL. Our second objective is to propose better penalties that are versatile enough to take into account both the complexity of the collection $\mathcal{S}$ and the sample size. We then apply these to solve various statistical problems such as variable selection, change-point detection and signal estimation, among others. Our results are based on a nonasymptotic risk bound with respect to the Euclidean loss for the selected estimator. Analogous results are also established for the Kullback loss.
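The procedure described in the abstract (project Y onto each candidate subspace, then select the projection minimizing a penalized criterion) can be sketched numerically. The snippet below is only an illustration under assumed ingredients, namely a nested collection of polynomial subspaces and the classical unknown-variance criterion n log(RSS_m / n) + pen(D_m) with textbook AIC, BIC and AMDL penalties; it is not the data-driven penalty constructed in the paper.

# A minimal sketch (not the authors' penalty) of penalized least-squares model
# selection when the noise variance is unknown. We observe Y = mu + eps in R^n
# with eps ~ N(0, sigma^2 I), project Y onto each candidate linear subspace S_m,
# and select the model minimizing  n * log(RSS_m / n) + pen(D_m),
# where D_m = dim(S_m). The polynomial collection and penalty choices below
# are illustrative assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

n = 100
x = np.linspace(0, 1, n)
mu = 1.0 + 2.0 * x - 3.0 * x**2           # true mean (quadratic trend)
sigma = 0.5                                # unknown to the procedure
Y = mu + sigma * rng.standard_normal(n)

def design(degree):
    """Design matrix of S_m: columns 1, x, ..., x^degree (nested subspaces)."""
    return np.vander(x, degree + 1, increasing=True)

def rss(X):
    """Residual sum of squares of the least-squares projection of Y onto col(X)."""
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return float(np.sum((Y - X @ beta) ** 2))

# Classical penalties pen(D) for the unknown-variance criterion.
criteria = {
    "AIC":  lambda D: 2.0 * D,
    "BIC":  lambda D: D * np.log(n),
    "AMDL": lambda D: 3.0 * D * np.log(n),
}

for name, pen in criteria.items():
    scores = []
    for degree in range(10):
        D = degree + 1                     # dimension of S_m
        crit = n * np.log(rss(design(degree)) / n) + pen(D)
        scores.append((crit, degree))
    best = min(scores)[1]
    print(f"{name}: selected polynomial degree {best}")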

Citation


Yannick Baraud, Christophe Giraud, Sylvie Huet. "Gaussian model selection with an unknown variance." Ann. Statist. 37(2): 630-672, April 2009. https://doi.org/10.1214/07-AOS573

Information

Published: April 2009
First available in Project Euclid: 10 March 2009

zbMATH: 1162.62051
MathSciNet: MR2502646
Digital Object Identifier: 10.1214/07-AOS573

Subjects:
Primary: 62G08

Keywords: adaptive estimation, AIC, AMDL, BIC, change-point detection, FPE, model selection, penalized criterion, variable selection

Rights: Copyright © 2009 Institute of Mathematical Statistics
