The Annals of Mathematical Statistics

The Theory of Unbiased Estimation

Paul R. Halmos

Abstract

Let $F(P)$ be a real-valued function defined on a subset $\mathscr{D}$ of the set $\mathscr{D}^\ast$ of all probability distributions on the real line. A function $f$ of $n$ real variables is an unbiased estimate of $F$ if for every system, $X_1, \cdots, X_n$, of independent random variables with the common distribution $P$, the expectation of $f(X_1, \cdots, X_n)$ exists and equals $F(P)$, for all $P$ in $\mathscr{D}$. A necessary and sufficient condition for the existence of an unbiased estimate is given (Theorem 1), and the way in which this condition applies to the moments of a distribution is described (Theorem 2). Under the assumptions that this condition is satisfied and that $\mathscr{D}$ contains all purely discontinuous distributions, it is shown that there is a unique symmetric unbiased estimate (Theorem 3); the most general (non-symmetric) unbiased estimates are described (Theorem 4); and it is proved that among them the symmetric one is best in the sense of having the least variance (Theorem 5). Thus the classical estimates of the mean and the variance are justified from a new point of view, and also, from the theory, computable estimates of all higher moments are easily derived. It is interesting to note that for $n$ greater than 3 neither the sample $n$th moment about the sample mean nor any constant multiple thereof is an unbiased estimate of the $n$th moment about the mean. Attention is called to a paradoxical situation arising in estimating such non-linear functions as the square of the first moment.
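As a concrete illustration of the abstract's claims about the classical variance estimate, the sketch below (not from the paper; the three-point distribution and sample size $n = 4$ are arbitrary choices for the example) checks unbiasedness exactly, by enumerating all $n$-tuples from a discrete $P$: the symmetric estimate with divisor $n - 1$ has expectation equal to the true variance, while the divisor-$n$ version is biased by a factor $(n-1)/n$.

```python
from itertools import product

# A small purely discontinuous distribution P (arbitrary illustrative choice).
support = [0.0, 1.0, 3.0]
probs = [0.5, 0.3, 0.2]

mu = sum(p * x for x, p in zip(support, probs))              # first moment of P
var = sum(p * (x - mu) ** 2 for x, p in zip(support, probs))  # second central moment of P

def bessel_variance(xs):
    """Symmetric unbiased estimate of the variance (requires n >= 2)."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

def naive_variance(xs):
    """Sample second moment about the sample mean (biased)."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / n

def expectation(f, n):
    """E[f(X_1, ..., X_n)] by exhaustive enumeration over all n-tuples,
    each weighted by its product probability under independent sampling."""
    total = 0.0
    for tup in product(range(len(support)), repeat=n):
        weight = 1.0
        for i in tup:
            weight *= probs[i]
        total += weight * f([support[i] for i in tup])
    return total

n = 4
print(abs(expectation(bessel_variance, n) - var) < 1e-12)            # unbiased
print(abs(expectation(naive_variance, n) - (n - 1) / n * var) < 1e-12)  # bias factor (n-1)/n
```

Because the expectation is computed exactly rather than by simulation, the check is deterministic up to floating-point rounding; the same enumeration applied to, say, the cube of the sample third central moment would exhibit the bias the abstract notes for higher moments.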

Article information

Source
Ann. Math. Statist., Volume 17, Number 1 (1946), 34-43.

Dates
First available in Project Euclid: 28 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aoms/1177731020

Digital Object Identifier
doi:10.1214/aoms/1177731020

Mathematical Reviews number (MathSciNet)
MR15746

Zentralblatt MATH identifier
0063.01891

JSTOR