Open Access
Application of a Measure of Information to the Design and Comparison of Regression Experiments
M. Stone
Ann. Math. Statist. 30(1): 55-70 (March, 1959). DOI: 10.1214/aoms/1177706359

Abstract

A normal regression experiment can be represented by \begin{equation*}\tag{1.1} Y_i = \sum_{j=1}^k X_{ij} \theta_j + \eta_i\qquad (i = 1, \cdots, n)\end{equation*} where $\{\eta_i : i = 1, \cdots, n\}$ is a set of normally distributed random variables with zero means and non-singular dispersion matrix $C$, $\theta = (\theta_1, \cdots, \theta_k)$ is the parameter vector of interest, and $X = (X_{ij})$ is a known $n \times k$ matrix which will be called the allocation matrix. The rows of $X$ will be called the allocation vectors. We denote the experiment by $\varepsilon(X, C)$. We assume that $C$ is known; generally it will be a function of $X$, written $C(X)$. The particular realisation of $Y$ will be denoted by $y$. The matrix $F = X'C^{-1}X$ is the Fisher information matrix of $\varepsilon(X, C)$. When $F$ is non-singular, one answer to the question "What information does $y$ give about $\theta$?" is to quote $F^{-1}$, the dispersion matrix of the maximum-likelihood estimates of $\theta$. A strong argument in favour of this is that $F^{-1}$ is independent of both $\theta$ and $y$: independence of $\theta$ means that the answer is not "local", and independence of $y$ leads to simplicity. This approach is taken by Box and Hunter [1] in their work on rotatable designs. However, many experimenters wish to have a one-dimensional answer to the question; that is, we must associate with $\varepsilon(X, C)$ a single number which we call the "information". For instance, Elfving [5] has developed the use of trace $F^{-1}$. In this paper we adopt the measure of information introduced by Lindley [7].

In Section 2 we generalise Lindley's treatment of the regression situation to include the singular case, explain the uses of the measure and compare it with that of Elfving. Section 3 deals with the analogue of Elfving's main theorem. Theorems 4.1 and 4.2 of Section 4 provide links with the traditional variance approach. In Section 5 we derive the asymptotic form of the measure as the $n$ of (1.1) increases and show that this form can also be derived from Neyman-Pearsonian theory. In Section 6 the influence of nuisance parameters is discussed and an analogue of a theorem of Chernoff [2] is established.
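As a rough numerical illustration of the quantities named above, the sketch below builds the Fisher information matrix $F = X'C^{-1}X$ for a small hypothetical experiment and evaluates two one-dimensional summaries. The allocation matrix and dispersion matrix are invented for the example, and the use of $\log \det F$ as a stand-in for Lindley's Shannon-information measure is an assumption based on its known form in the normal case, not a quotation from the paper.

    import numpy as np

    # Hypothetical allocation matrix X (n = 4 runs, k = 2 parameters) and
    # error dispersion matrix C; both are invented for illustration only.
    X = np.array([[1.0, -1.0],
                  [1.0, -0.5],
                  [1.0,  0.5],
                  [1.0,  1.0]])
    C = np.eye(4)  # uncorrelated, unit-variance errors

    # Fisher information matrix F = X' C^{-1} X of the experiment e(X, C).
    F = X.T @ np.linalg.solve(C, X)

    # Elfving's one-dimensional summary: trace F^{-1}, the sum of the
    # variances of the maximum-likelihood estimates of theta.
    elfving = np.trace(np.linalg.inv(F))

    # Proxy for Lindley's measure: in the normal, known-C case the expected
    # gain in Shannon information is monotone in det F (assumed here, up to
    # additive constants).
    sign, log_det_F = np.linalg.slogdet(F)

    print(f"trace F^-1 = {elfving:.4f}")
    print(f"log det F  = {log_det_F:.4f}")

Comparing two candidate allocation matrices then amounts to comparing these scalars: under Elfving's criterion a smaller trace $F^{-1}$ is preferred, while under the determinant-based proxy a larger $\det F$ is preferred, and the two orderings need not agree.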

Citation


M. Stone. "Application of a Measure of Information to the Design and Comparison of Regression Experiments." Ann. Math. Statist. 30(1): 55-70, March 1959. https://doi.org/10.1214/aoms/1177706359

Information

Published: March, 1959
First available in Project Euclid: 27 April 2007

zbMATH: 0094.13602
MathSciNet: MR106528
Digital Object Identifier: 10.1214/aoms/1177706359

Rights: Copyright © 1959 Institute of Mathematical Statistics
