## The Annals of Mathematical Statistics

### Linear Spaces and Unbiased Estimation--Application to the Mixed Linear Model

Justus Seely

#### Abstract

This paper has two primary aims: to exemplify the theory developed in [9] using a linear space of random variables other than linear combinations of the components of a random vector, and to treat unbiased estimation of the parameters of a mixed linear model using quadratic estimators. For a random vector $Y$ with expectation $X\beta$ and covariance matrix $\sum_i\nu_iV_i$ (where $\nu_1, \cdots, \nu_m$ and $\beta$ denote the parameters), interest centers on quadratic estimability for parametric functions of the form $\sum_{i\leqq j}\gamma_{ij}\beta_i\beta_j + \sum_k\gamma_k\nu_k$ and on procedures for obtaining quadratic estimators of such parametric functions. Special emphasis is given to parametric functions of the form $\sum_k\gamma_k\nu_k$, since quadratic estimability for functions of this form is the basis of unbiased estimation of variance components. Concerning variance component models, Airy, in 1861 (Scheffé [6]), appears to have been the first to introduce a model with more than one source of variation; such a model is also implied (Scheffé [6]) by Chauvenet in 1863. Fisher [1], [2] reintroduced variance component models and discussed, apparently for the first time, unbiased estimation in such models. Since Fisher's introduction and discussion of unbiased estimation in models with more than one source of variation, considerable literature has been published on the subject. One notable contribution is that of Henderson [5], which popularized three methods (now known as Henderson's Methods I, II, and III) for obtaining unbiased estimates of variance components; we mention these methods because they are commonly used in the estimation of variance components. For a review, as well as a matrix formulation of the methods, see Searle [7].
Among the several pieces of work dealing with Henderson's methods, only that of Harville [4] seems to have been concerned with the consistency of the equations leading to the estimators and with the existence of unbiased (quadratic) estimators under various conditions; Harville, however, treats only a completely random two-way classification model with interaction. One other result on the existence of unbiased quadratic estimators in a completely random model is given by Graybill and Hultquist [3]. In Section 2 the form we assume for a mixed linear model is introduced, and the pertinent quantities needed to apply the results in [9] are obtained; definitions, terminology, and notation are consistent with the usage in [9]. Section 3 considers parametric functions of the form $\sum_{i\leqq j}\gamma_{ij}\beta_i\beta_j + \sum_k\gamma_k\nu_k$, and Section 4 concerns parametric functions of the form $\sum_k\gamma_k\nu_k$. Section 4 also presents a method for obtaining unbiased estimators of linear combinations of variance components that is computationally simpler than Henderson's Method III, the most widely used general approach applicable to any mixed linear model. The method has the added advantage of yielding necessary and sufficient conditions for the existence of unbiased quadratic estimators, which is not always the case with Henderson's Method III. In the last section an example is given which illustrates the Henderson Method III procedure from the viewpoint of this paper.
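The model underlying the abstract can be illustrated numerically. The sketch below is not from the paper; the dimensions, design matrix, and covariance structures are our own hypothetical choices. It uses the standard identity $E[Y'AY] = \beta'X'A X\beta + \sum_k \nu_k\,\mathrm{tr}(A V_k)$ for a symmetric matrix $A$: if $A$ is chosen so that $X'AX = 0$ (here by sandwiching an arbitrary symmetric matrix between residual projectors), then $Y'AY$ is an unbiased quadratic estimator of $\sum_k \gamma_k\nu_k$ with $\gamma_k = \mathrm{tr}(A V_k)$.

```python
import numpy as np

# Hypothetical small setting: one fixed effect, two variance components.
rng = np.random.default_rng(0)
n = 6
X = np.ones((n, 1))                 # design matrix for the fixed effect
V1 = np.eye(n)                      # covariance structure of component 1
Z = rng.standard_normal((n, 2))
V2 = Z @ Z.T                        # covariance structure of component 2
beta = np.array([2.0])
nu = np.array([1.5, 0.7])           # true variance components

# M projects onto the orthogonal complement of col(X), so A = M B M
# satisfies X'AX = 0 for any symmetric B.
M = np.eye(n) - X @ np.linalg.pinv(X)
B = np.diag(np.arange(1.0, n + 1))  # an arbitrary symmetric choice
A = M @ B @ M

mu = X @ beta
Sigma = nu[0] * V1 + nu[1] * V2
# E[Y'AY] = tr(A Sigma) + mu'A mu; the second term vanishes here,
# leaving sum_k nu_k tr(A V_k) exactly.
expected = np.trace(A @ Sigma) + mu @ A @ mu
target = nu[0] * np.trace(A @ V1) + nu[1] * np.trace(A @ V2)
print(np.isclose(expected, target))
```

This reproduces only the elementary unbiasedness identity; the paper's contribution concerns when such an $A$ exists for a *prescribed* $\sum_k\gamma_k\nu_k$ and how to construct it systematically.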

#### Article information

Source
Ann. Math. Statist., Volume 41, Number 5 (1970), 1735-1748.

Dates
First available in Project Euclid: 27 April 2007

https://projecteuclid.org/euclid.aoms/1177696818

Digital Object Identifier
doi:10.1214/aoms/1177696818

Mathematical Reviews number (MathSciNet)
MR275560

Zentralblatt MATH identifier
0263.62041
