The Annals of Mathematical Statistics

Locally Optimal Designs for Estimating Parameters

Herman Chernoff



It is desired to estimate $s$ parameters $\theta_1, \theta_2, \cdots, \theta_s.$ There is available a set of experiments which may be performed. The probability distribution of the data obtained from any of these experiments may depend on $\theta_1, \theta_2, \cdots, \theta_k, k \geqq s.$ One is permitted to select a design consisting of $n$ of these experiments to be performed independently. The repetition of experiments is permitted in the design. We shall show that, under mild conditions, locally optimal designs for large $n$ may be approximated by selecting a certain set of $r \leqq k + (k - 1) + \cdots + (k - s + 1)$ of the available experiments and by repeating each of these $r$ experiments in certain specified proportions. Examples are given illustrating how this result considerably simplifies the problem of obtaining optimal designs. The criterion of optimality that is employed is one that involves the use of Fisher's information matrix. For the case where it is desired to estimate one of the $k$ parameters, this criterion corresponds to minimizing the variance of the asymptotic distribution of the maximum likelihood estimate of that parameter. The result of this paper constitutes a generalization of a result of Elfving [1]. As in Elfving's paper, the results extend to the case where the cost depends on the experiment and the amount of money to be allocated to experimentation is determined instead of the sample size.
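The criterion described above can be illustrated numerically. The sketch below is not from the paper; it is a hypothetical example in its spirit: for the quadratic model $y = \theta_0 + \theta_1 x + \theta_2 x^2 + \varepsilon$ with candidate experiments $x \in \{-1, 0, 1\}$, a design is a vector of proportions $w_i$, its Fisher information matrix (up to a constant) is $M(\xi) = \sum_i w_i f(x_i) f(x_i)^{\mathsf T}$ with $f(x) = (1, x, x^2)$, and estimating $\theta_2$ alone means minimizing the $(3,3)$ entry of $M(\xi)^{-1}$. A crude grid search over the proportions recovers the known optimum.

```python
import numpy as np
from itertools import product

# Hypothetical illustration (not from the paper): candidate experiments
# are the design points x = -1, 0, 1 for y = th0 + th1*x + th2*x^2 + noise.
design_points = np.array([-1.0, 0.0, 1.0])
F = np.column_stack([np.ones(3), design_points, design_points**2])

def var_theta2(weights):
    """Asymptotic variance of the MLE of theta_2 under the design `weights`:
    the (3,3) entry of M(xi)^{-1}, where M(xi) = sum_i w_i f(x_i) f(x_i)^T."""
    M = F.T @ np.diag(weights) @ F
    if np.linalg.matrix_rank(M) < 3:   # theta_2 not estimable
        return np.inf
    return np.linalg.inv(M)[2, 2]

# Crude grid search over the probability simplex of proportions.
grid = np.linspace(0.0, 1.0, 101)
best_w, best_v = None, np.inf
for w1, w2 in product(grid, grid):
    if w1 + w2 <= 1.0:
        w = np.array([w1, w2, 1.0 - w1 - w2])
        v = var_theta2(w)
        if v < best_v:
            best_w, best_v = w, v

print("optimal proportions:", best_w)          # ~ [0.25, 0.5, 0.25]
print("minimal asymptotic variance:", best_v)  # ~ 4.0
```

The search returns proportions $(1/4, 1/2, 1/4)$ on the three points with minimal variance $4$, a design supported on $r = 3$ experiments, consistent with the paper's bound $r \leqq k$ for the case $s = 1$, $k = 3$.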

Article information

Ann. Math. Statist., Volume 24, Number 4 (1953), 586-602.

First available in Project Euclid: 28 April 2007




Chernoff, Herman. Locally Optimal Designs for Estimating Parameters. Ann. Math. Statist. 24 (1953), no. 4, 586--602. doi:10.1214/aoms/1177728915.
