## The Annals of Mathematical Statistics

### The Order of the Minimum Variance in a Non-Regular Case

Thomas Polfeldt

#### Abstract

Let $f(y)$ be a probability density function on the real line, and $F(y)$ the corresponding distribution function. It is assumed that \begin{equation*}\tag{1} F(y) = 0 \text{ for } y \leqq 0,\quad F(y) > 0 \text{ for } y > 0.\end{equation*} Let $\theta$ be a location parameter, and let $X = (x_1, \cdots, x_n)$ denote a sample of $n$ independent observations, with each $x_i$ distributed according to $F(x - \theta)$. In this paper, we study the minimum variance of unbiased estimators $t = t(X)$ of $\theta$, with special reference to the order, in $n$, of that variance. For example, if $F(y) = 1 - e^{-y}$ $(y > 0)$, the minimum variance is of order $n^{-2}$ rather than of order $n^{-1}$ as in regular cases. We shall state conditions on $f(y)$ which determine this order. One of these is that $f(y)$ varies regularly at zero with exponent $c - 1$ $(c > 0)$ (cf. [3], chapter 8, sect. 8-9). Under the conditions imposed, the smallest attainable variance order is $n^{-1}$ if $c > 2$, but $(F^{-1}(n^{-1}))^2$ if $0 < c < 2$. The case $c = 2$ has special features. Since $F(y)$ varies regularly with exponent $c$, the minimum variance order will be $n^{-2/c}L(n^{-1})$ with slowly varying $L$ (for $0 < c < 2$; also true for $c = 2$). When $c > \frac{1}{2}$, the Chapman and Robbins inequality [2] is used to obtain a lower bound for the minimum variance. For $0 < c \leqq \frac{1}{2}$, a new inequality is used; we then restrict slightly the class of unbiased estimators. The results carry over, of course, to distributions with $F(y) < 1$ for $y < 0$, $F(y) = 1$ for $y \geqq 0$. A generalization to biased estimators (or to mean square error) is straightforward, but some conditions on the bias function will be necessary. Some questions recently raised by Blischke et al. [1] are answered by the theorems. The conditions imposed here can probably be relaxed to some extent.

Notation. $K$ and $K'$ denote positive, finite constants. If there exist $K$ and $K'$ such that $K < a(x)/b(x) < K'$ for $|x| < x_0$, we shall write $a(x) = \Omega(b(x))$ $(x \rightarrow 0)$. The qualification $(x \rightarrow 0)$ will often be omitted.
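The exponential example above can be checked numerically. For the location family with $F(y) = 1 - e^{-y}$, the minimum $\min_i x_i$ has mean $\theta + 1/n$, so $t = \min_i x_i - 1/n$ is unbiased with variance exactly $n^{-2}$. The sketch below (an illustrative Monte Carlo check, not part of the paper; the function name `simulate_variance` is ours) verifies the $n^{-2}$ order:

```python
import random

def simulate_variance(n, theta=0.0, reps=20000, seed=1):
    """Monte Carlo variance of the unbiased estimator t = min(x_i) - 1/n
    for the exponential location family F(y - theta), F(y) = 1 - e^{-y}."""
    rng = random.Random(seed)
    ests = []
    for _ in range(reps):
        # Sample n observations x_i = theta + E_i, E_i ~ Exp(1).
        m = min(rng.expovariate(1.0) + theta for _ in range(n))
        ests.append(m - 1.0 / n)  # unbiased: E[min] = theta + 1/n
    mean = sum(ests) / reps
    return sum((e - mean) ** 2 for e in ests) / reps

if __name__ == "__main__":
    for n in (10, 20, 40):
        v = simulate_variance(n)
        # n^2 * variance should hover near 1, confirming the n^{-2} order.
        print(n, v, n * n * v)
```

Doubling $n$ should roughly quarter the simulated variance, in contrast to the halving seen in regular ($c > 2$) cases.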

#### Article information

Source
Ann. Math. Statist., Volume 41, Number 2 (1970), 667-672.

Dates
First available in Project Euclid: 27 April 2007

https://projecteuclid.org/euclid.aoms/1177697111

Digital Object Identifier
doi:10.1214/aoms/1177697111

Mathematical Reviews number (MathSciNet)
MR256499

Zentralblatt MATH identifier
0193.47302
