The Annals of Statistics

Admissible Selection of an Accurate and Parsimonious Normal Linear Regression Model

Charles J. Stone

Full-text: Open access

Abstract

Let $M_0$ be a normal linear regression model and let $M_1,\cdots, M_K$ be distinct proper linear submodels of $M_0$. Let $\hat k \in \{0,\cdots, K\}$ be a model selection rule based on observed data from the true model. Given $\hat k$, let the unknown parameters of the selected model $M_{\hat k}$ be fitted by the maximum likelihood method. A loss function is introduced which depends additively on two parts: (i) a measure of the difference between the fitted model $M_{\hat k}$ and the true model; and (ii) a measure $C_{\hat k}$ of the "complexity" of the selected model. A natural model selection rule $\bar{k}$, which minimizes an empirical version of this loss, is shown to be admissible and very nearly Bayes.
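The selection rule described in the abstract can be illustrated with a minimal sketch. The loss below is an AIC/Mallows-$C_p$-style stand-in, not Stone's exact criterion: part (i) is measured by the residual sum of squares of the least-squares (maximum likelihood) fit, and part (ii) by a penalty $C_k$ proportional to model size, assuming unit noise variance. The data, candidate models, and penalty constant are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: the true model uses only the first two predictors.
n, p = 100, 4
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.standard_normal(n)

# Candidate models M_0, ..., M_K given as column index sets; M_0 is the full model.
models = [(0, 1, 2, 3), (0,), (0, 1), (0, 1, 2)]

def empirical_loss(X, y, cols, c=2.0):
    """RSS of the ML (least-squares) fit plus a complexity penalty c * |cols|.

    A stand-in for Stone's loss: part (i) is the lack of fit of the
    selected model, part (ii) the complexity C_k of the selected model,
    here taken proportional to the number of fitted coefficients.
    """
    Xs = X[:, cols]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = float(np.sum((y - Xs @ beta) ** 2))
    return rss + c * len(cols)

# The selection rule: pick the index minimizing the empirical loss.
k_hat = min(range(len(models)), key=lambda k: empirical_loss(X, y, models[k]))
print("selected model:", models[k_hat])
```

With this penalty the underfitted model $(0,)$ incurs a large lack-of-fit term, while the full model pays a larger complexity charge, so the rule trades the two off exactly as the additive loss in the abstract prescribes.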

Article information

Source
Ann. Statist., Volume 9, Number 3 (1981), 475-485.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176345452

Digital Object Identifier
doi:10.1214/aos/1176345452

Mathematical Reviews number (MathSciNet)
MR615424

Zentralblatt MATH identifier
0499.62056


Subjects
Primary: 62J05: Linear regression
Secondary: 62C15: Admissibility

Keywords
Admissibility, normal linear regression model, generalized Bayes, parsimony, complexity

Citation

Stone, Charles J. Admissible Selection of an Accurate and Parsimonious Normal Linear Regression Model. Ann. Statist. 9 (1981), no. 3, 475--485. doi:10.1214/aos/1176345452. https://projecteuclid.org/euclid.aos/1176345452
