The Annals of Statistics

Estimation Via Linearly Combining Two Given Statistics

J. K. Baksalary and R. Kala


Abstract

We consider the problems of (i) covariance adjustment of an unbiased estimator, (ii) combining two unbiased estimators, and (iii) improving upon an unbiased estimator. All these problems consist in determining a minimum dispersion linear unbiased combination of two given statistics, one of which is an unbiased estimator of a vector parameter $\mathbf{\theta} \in \mathscr{H}$, while the expectation of the other is the zero vector in the problem of covariance adjustment, equals $\mathbf{\theta}$ in the problem of combining, and equals a subvector of $\mathbf{\theta}$ in the problem of improving. The solutions obtained are substantial generalizations of known results, in the sense that they are valid for an arbitrary joint dispersion matrix of the given statistics as well as for the parameter space $\mathscr{H}$ being an arbitrary subspace of $\mathscr{R}^k$.
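As a minimal sketch of the combining problem (ii) in the simplest scalar case, not the authors' general matrix result: given two unbiased estimators $T_1, T_2$ of $\theta$ with variances $v_1, v_2$ and covariance $c$, any combination $wT_1 + (1-w)T_2$ remains unbiased, and minimizing its variance over $w$ gives the classical optimal weight. The function names below are illustrative.

```python
def optimal_weight(v1, v2, c):
    # Minimize Var(w*T1 + (1-w)*T2) = v1*w^2 + v2*(1-w)^2 + 2*c*w*(1-w)
    # over w; setting the derivative to zero yields:
    return (v2 - c) / (v1 + v2 - 2 * c)

def combined_variance(v1, v2, c):
    # Variance of the optimally weighted combination.
    w = optimal_weight(v1, v2, c)
    return v1 * w**2 + v2 * (1 - w)**2 + 2 * c * w * (1 - w)

# Example: v1 = 4, v2 = 1, uncorrelated estimators.
# The noisier estimator gets weight 0.2, and the combined variance is 0.8,
# smaller than either input variance.
w = optimal_weight(4.0, 1.0, 0.0)
v = combined_variance(4.0, 1.0, 0.0)
```

The paper's contribution is the vector analogue of this calculation for an arbitrary joint dispersion matrix (possibly singular) and an arbitrary subspace parameter space, where the scalar weight becomes a matrix and generalized inverses are needed.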

Article information

Source
Ann. Statist., Volume 11, Number 2 (1983), 691-696.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176346173

Digital Object Identifier
doi:10.1214/aos/1176346173

Mathematical Reviews number (MathSciNet)
MR696079

Zentralblatt MATH identifier
0515.62053

JSTOR
links.jstor.org

Subjects
Primary: 62F10: Point estimation
Secondary: 62J99: None of the above, but in this section

Keywords
Covariance adjustment of an unbiased estimator; combining two unbiased estimators; improving upon an unbiased estimator

Citation

Baksalary, J. K.; Kala, R. Estimation Via Linearly Combining Two Given Statistics. Ann. Statist. 11 (1983), no. 2, 691--696. doi:10.1214/aos/1176346173. https://projecteuclid.org/euclid.aos/1176346173
