The Annals of Statistics

A Bound for the Euclidean Norm of the Difference Between the Least Squares and the Best Linear Unbiased Estimators

J. K. Baksalary and R. Kala

Abstract

Haberman's bound for a norm of the difference between the least squares and the best linear unbiased estimators in a linear model with nonsingular covariance structure is examined in the particular case in which the vector norm involved is the Euclidean norm. For this frequently occurring case, a new, substantially improved bound is developed which, moreover, applies without any additional conditions.
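The two estimators compared in the paper can be illustrated numerically. The sketch below (not taken from the paper; all variable names and the simulated model are assumptions for illustration) computes the least squares estimator and the best linear unbiased (generalized least squares, Aitken) estimator in a linear model with a nonsingular error covariance, and then evaluates the Euclidean norm of their difference, i.e. the quantity the bound concerns.

```python
import numpy as np

# Illustrative sketch: linear model y = X beta + e with nonsingular
# error covariance V (here diagonal and heteroscedastic).
rng = np.random.default_rng(0)

n, p = 20, 3
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5])

V = np.diag(rng.uniform(0.5, 2.0, size=n))      # nonsingular covariance
e = rng.multivariate_normal(np.zeros(n), V)
y = X @ beta + e

# Least squares estimator: (X'X)^{-1} X'y
beta_ls = np.linalg.solve(X.T @ X, X.T @ y)

# Best linear unbiased estimator: (X'V^{-1}X)^{-1} X'V^{-1}y
Vinv = np.linalg.inv(V)
beta_blue = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

# Euclidean norm of the difference between the two estimators --
# the quantity bounded in the paper.
diff_norm = np.linalg.norm(beta_ls - beta_blue)
print(diff_norm)
```

When V is a scalar multiple of the identity, the least squares estimator is itself the BLUE and the difference vanishes; the norm above measures how far the two drift apart under a general nonsingular covariance.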

Article information

Source
Ann. Statist., Volume 6, Number 6 (1978), 1390-1393.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176344383

Digital Object Identifier
doi:10.1214/aos/1176344383

Mathematical Reviews number (MathSciNet)
MR523772

Zentralblatt MATH identifier
0392.62051

Subjects
Primary: 62J05: Linear regression
Secondary: 62J10: Analysis of variance and covariance

Keywords
Linear model; least squares estimator; best linear unbiased estimator; Euclidean norm; spectral norm

Citation

Baksalary, J. K.; Kala, R. A Bound for the Euclidean Norm of the Difference Between the Least Squares and the Best Linear Unbiased Estimators. Ann. Statist. 6 (1978), no. 6, 1390--1393. doi:10.1214/aos/1176344383. https://projecteuclid.org/euclid.aos/1176344383
