Abstract
The general mixed linear model can be written $y = X\alpha + Zb$, where $\alpha$ is a vector of fixed effects and $b$ is a vector of random variables. Assume that $E(b) = 0$ and that $\operatorname{Var}(b) = \sigma^2 D$ with $D$ known. Consider the estimation of $\lambda_1'\alpha + \lambda_2'\beta$, where $\lambda_1'\alpha$ is estimable and $\beta$ is the realized, though unobservable, value of $b$. Among linear estimators $c + r'y$ having $E(c + r'y) \equiv E(\lambda_1'\alpha + \lambda_2'b)$, the mean squared error $E(c + r'y - \lambda_1'\alpha - \lambda_2'b)^2$ is minimized by $\lambda_1'\hat{\alpha} + \lambda_2'\hat{\beta}$, where $\hat{\beta} = DZ'V^{-}(y - X\hat{\alpha})$, $\hat{\alpha} = (X'V^{-}X)^{-}X'V^{-}y$, and $V^{-}$ is any generalized inverse of $V = ZDZ'$ belonging to the Zyskind-Martin class. It is shown that $\hat{\alpha}$ and $\hat{\beta}$ can be computed from the solution to any of a certain class of linear systems, and that doing so facilitates the exploitation, for computational purposes, of the kind of structure associated with ANOVA models. These results extend the Gauss-Markov theorem. The results can also be applied in a certain Bayesian setting.
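As a concrete illustration (not from the paper), the sketch below evaluates the abstract's formulas for $\hat{\alpha}$ and $\hat{\beta}$ on a small simulated one-way layout. The design matrices, group sizes, and variance ratios are all illustrative assumptions, as is the use of NumPy's Moore-Penrose pseudo-inverse as the generalized inverse; the residual error is folded into $b$ by augmenting $Z$ with an identity block, so $V = ZDZ'$ is nonsingular here and the pseudo-inverse coincides with $V^{-1}$, which belongs to the Zyskind-Martin class.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative one-way layout: 3 groups of 4 observations each (assumed sizes).
n, q = 12, 3
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # fixed-effect design, p = 2
Z1 = np.kron(np.eye(q), np.ones((n // q, 1)))          # group-indicator columns

# Fold the residual error into b, as the model y = X alpha + Z b permits:
# Z = [Z1, I], b = (group effects, errors), D block-diagonal and known.
Z = np.hstack([Z1, np.eye(n)])
D = np.diag(np.concatenate([np.full(q, 2.0), np.ones(n)]))  # assumed variance ratios

alpha = np.array([1.0, 0.5])                     # "true" fixed effects (assumed)
b = rng.multivariate_normal(np.zeros(q + n), D)  # realized value of the random vector
y = X @ alpha + Z @ b

V = Z @ D @ Z.T                    # Var(y) up to the factor sigma^2
V_g = np.linalg.pinv(V)            # generalized inverse; V is nonsingular here,
                                   # so this equals V^{-1} (Zyskind-Martin class)

alpha_hat = np.linalg.pinv(X.T @ V_g @ X) @ X.T @ V_g @ y  # BLUE of alpha
beta_hat = D @ Z.T @ V_g @ (y - X @ alpha_hat)             # BLUP of the realized b

print("alpha_hat:", alpha_hat)
print("predicted group effects:", beta_hat[:q])
```

For any estimable $\lambda_1'\alpha$ and any $\lambda_2$, the combination `lam1 @ alpha_hat + lam2 @ beta_hat` then gives the minimum-MSE linear unbiased estimator of $\lambda_1'\alpha + \lambda_2'\beta$ described in the abstract.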
Citation
David Harville. "Extension of the Gauss-Markov Theorem to Include the Estimation of Random Effects." Ann. Statist. 4(2): 384-395, March 1976. https://doi.org/10.1214/aos/1176343414