## The Annals of Statistics

### Estimation Via Linearly Combining Two Given Statistics

#### Abstract

We consider the problems of (i) covariance adjustment of an unbiased estimator, (ii) combining two unbiased estimators, and (iii) improving upon an unbiased estimator. All these problems consist in determining a minimum dispersion linear unbiased combination of two given statistics, one of which is an unbiased estimator of a vector parameter $\mathbf{\theta} \in \mathscr{H}$, while the expectation of the other is a zero vector in the problem of covariance adjustment, equals $\mathbf{\theta}$ in the problem of combining, and equals a subvector of $\mathbf{\theta}$ in the problem of improving. The solutions obtained substantially generalize known results, in the sense that they are valid for an arbitrary joint dispersion matrix of the given statistics as well as for a parameter space $\mathscr{H}$ that is an arbitrary subspace of $\mathscr{R}^k$.
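The flavor of problem (ii) can be illustrated in its simplest special case, not treated at this generality in the abstract: a scalar parameter and two (possibly correlated) unbiased estimators. This is a hedged sketch with illustrative function names of my own choosing; the paper itself works with vector parameters and arbitrary joint dispersion matrices.

```python
# Illustrative scalar special case (not the paper's general result):
# given unbiased estimators T1, T2 of theta with variances v1, v2 and
# covariance c, the minimum-variance unbiased combination a*T1 + (1-a)*T2
# uses the weight a = (v2 - c) / (v1 + v2 - 2c).

def optimal_weight(v1, v2, c):
    """Weight on T1 minimizing Var(a*T1 + (1-a)*T2) subject to unbiasedness."""
    return (v2 - c) / (v1 + v2 - 2.0 * c)

def combined_variance(v1, v2, c):
    """Variance of the optimally combined estimator a*T1 + (1-a)*T2."""
    a = optimal_weight(v1, v2, c)
    return a**2 * v1 + (1 - a)**2 * v2 + 2 * a * (1 - a) * c

# Uncorrelated case: weights are inversely proportional to the variances,
# so the more precise estimator receives the larger weight.
v1, v2 = 1.0, 3.0
a = optimal_weight(v1, v2, 0.0)
assert abs(a - 0.75) < 1e-12

# The optimal combination never does worse than the better single estimator.
assert combined_variance(v1, v2, 0.0) <= min(v1, v2)
```

In the uncorrelated example above the combined variance is 0.75, below the smaller individual variance of 1.0; with nonzero covariance the same formula applies but the weights are no longer simply inverse-variance proportional.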

#### Article information

**Source:** Ann. Statist., Volume 11, Number 2 (1983), 691-696.

**Dates:** First available in Project Euclid: 12 April 2007

**Permanent link:** https://projecteuclid.org/euclid.aos/1176346173

**Digital Object Identifier:** doi:10.1214/aos/1176346173

**Mathematical Reviews number (MathSciNet):** MR696079

**Zentralblatt MATH identifier:** 0515.62053