The Annals of Statistics

A Note on Admissibility When Precision is Unbounded

Charles Anderson and Nabendu Pal


Abstract

The estimation of a common mean vector $\theta$ from two independent normal observations $X \sim N_p(\theta, \sigma^2_x I)$ and $Y \sim N_p(\theta, \sigma^2_y I)$ is reconsidered. Although the estimator $\eta X + (1 - \eta)Y$ is known to be inadmissible for $\eta \in (0, 1)$, we show that the opposite holds at the endpoints: when $\eta$ is 0 or 1, the estimator is admissible. The general picture is that an estimator $X^\ast$ can be improved by shrinkage when there exists a statistic $B$ which, in a certain sense, estimates a lower bound on the risk of $X^\ast$. Conversely, an estimator is admissible under very general conditions if there is no reasonable way to detect that its risk is small.
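The setup in the abstract can be illustrated numerically. The following Monte Carlo sketch (not from the paper; the dimension, variances, and weight $\eta$ are illustrative choices) compares the risk of $X$ alone, of the variance-weighted combination $\eta X + (1-\eta)Y$ with $\eta = \sigma^2_y/(\sigma^2_x + \sigma^2_y)$, and of a James–Stein shrinkage of $X$, at the point $\theta = 0$ where shrinkage helps most.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 10                         # dimension (illustrative)
theta = np.zeros(p)            # true common mean; theta = 0 favors shrinkage
sigma_x, sigma_y = 1.0, 2.0    # known standard deviations (illustrative)
n_rep = 20000                  # Monte Carlo replications

# Draw independent observations X ~ N_p(theta, sigma_x^2 I), Y ~ N_p(theta, sigma_y^2 I)
X = theta + sigma_x * rng.standard_normal((n_rep, p))
Y = theta + sigma_y * rng.standard_normal((n_rep, p))

def risk(est):
    """Monte Carlo estimate of E ||est - theta||^2 (squared-error risk)."""
    return np.mean(np.sum((est - theta) ** 2, axis=1))

# Variance-weighted convex combination; its exact risk is
# p * sigma_x^2 * sigma_y^2 / (sigma_x^2 + sigma_y^2) = 8 here, versus p*sigma_x^2 = 10 for X.
eta = sigma_y**2 / (sigma_x**2 + sigma_y**2)
combined = eta * X + (1 - eta) * Y

# James-Stein shrinkage of X alone, which dominates X for p >= 3.
js = (1 - (p - 2) * sigma_x**2 / np.sum(X**2, axis=1, keepdims=True)) * X

print(f"risk(X)        = {risk(X):.2f}")
print(f"risk(combined) = {risk(combined):.2f}")
print(f"risk(JS)       = {risk(js):.2f}")
```

At $\theta = 0$ the combination improves on $X$ and the shrinkage estimator improves further, consistent with the abstract's point that a statistic detecting small risk (here $\|X\|^2$) enables improvement; the paper's admissibility result concerns the endpoint cases $\eta \in \{0, 1\}$, which no such statistic can improve uniformly.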

Article information

Source
Ann. Statist., Volume 23, Number 2 (1995), 593-597.

Dates
First available in Project Euclid: 11 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176324537

Digital Object Identifier
doi:10.1214/aos/1176324537

Mathematical Reviews number (MathSciNet)
MR1332583

Zentralblatt MATH identifier
0824.62007


Subjects
Primary: 62C15: Admissibility
Secondary: 62H12: Estimation

Keywords
Inadmissibility; shrinkage estimation; Stein's normal identity

Citation

Anderson, Charles; Pal, Nabendu. A Note on Admissibility When Precision is Unbounded. Ann. Statist. 23 (1995), no. 2, 593--597. doi:10.1214/aos/1176324537. https://projecteuclid.org/euclid.aos/1176324537
