The Annals of Statistics

Estimation Via Linearly Combining Two Given Statistics

J. K. Baksalary and R. Kala



We consider the problems of (i) covariance adjustment of an unbiased estimator, (ii) combining two unbiased estimators, and (iii) improving upon an unbiased estimator. All these problems consist in determining a minimum dispersion linear unbiased combination of two given statistics, one of which is an unbiased estimator of a vector parameter $\mathbf{\theta} \in \mathscr{H}$, while the expectation of the other is the zero vector in the problem of covariance adjustment, is equal to $\mathbf{\theta}$ in the problem of combining, and is equal to a subvector of $\mathbf{\theta}$ in the problem of improving. The solutions obtained are substantial generalizations of known results, in the sense that they are valid for an arbitrary joint dispersion matrix of the given statistics as well as for the parameter space $\mathscr{H}$ being an arbitrary subspace of $\mathscr{R}^k$.
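To illustrate problem (ii) in its simplest setting, the following sketch computes the minimum-variance unbiased combination of two scalar unbiased estimators with a known joint covariance structure. This is only the classical scalar special case, not the paper's general matrix result: for unbiased $T_1, T_2$ with variances $v_1, v_2$ and covariance $c$, the combination $wT_1 + (1-w)T_2$ has minimum variance at $w = (v_2 - c)/(v_1 + v_2 - 2c)$. The function names are illustrative, not from the paper.

```python
def optimal_weight(v1, v2, c):
    """Weight on T1 in the minimum-variance unbiased combination
    w*T1 + (1 - w)*T2, for variances v1, v2 and covariance c."""
    return (v2 - c) / (v1 + v2 - 2.0 * c)

def combined_variance(v1, v2, c):
    """Variance of the optimally combined estimator."""
    w = optimal_weight(v1, v2, c)
    return w**2 * v1 + (1 - w)**2 * v2 + 2.0 * w * (1 - w) * c

# Independent estimators (c = 0) reduce to inverse-variance weighting:
w = optimal_weight(1.0, 4.0, 0.0)        # 0.8
var = combined_variance(1.0, 4.0, 0.0)   # 0.8, below both 1.0 and 4.0
```

Note that the combined variance never exceeds the smaller of the two input variances, which is the sense in which combining "improves upon" either estimator alone; the paper extends this to vector parameters, arbitrary joint dispersion matrices, and restricted parameter spaces.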

Article information

Ann. Statist., Volume 11, Number 2 (1983), 691-696.

First available in Project Euclid: 12 April 2007



Primary: 62F10: Point estimation
Secondary: 62J99: None of the above, but in this section

Keywords: covariance adjustment of an unbiased estimator; combining two unbiased estimators; improving upon an unbiased estimator


Baksalary, J. K.; Kala, R. Estimation Via Linearly Combining Two Given Statistics. Ann. Statist. 11 (1983), no. 2, 691--696. doi:10.1214/aos/1176346173.
