Open Access
On Estimating the Slope of a Straight Line when Both Variables are Subject to Error
Clifford Spiegelman
Ann. Statist. 7(1): 201-206 (January, 1979). DOI: 10.1214/aos/1176344565

Abstract

Let $X_i$ and $Y_i$ be random variables related to other random variables $U_i, V_i$, and $W_i$ as follows: $X_i = U_i + W_i$, $Y_i = \alpha + \beta U_i + V_i$, $i = 1, \ldots, n$, where $\alpha$ and $\beta$ are finite constants. Here $X_i$ and $Y_i$ are observable while $U_i, V_i$ and $W_i$ are not. This model is customarily referred to as the regression problem with errors in both variables, and the central question is the estimation of $\beta$. We give a class of estimates for $\beta$ which are asymptotically normal with mean $\beta$ and variance proportional to $1/n^{\frac{1}{2}}$, under weak assumptions. We then show how to choose a good estimate of $\beta$ from this class.
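Why ordinary least squares will not do here can be seen in a short simulation. The sketch below (an illustrative numerical example, not the paper's estimator; the distributions and variances are hypothetical choices, and the paper's assumptions are weaker) generates data from the model above and shows the well-known attenuation of the naive slope toward zero by the factor $\operatorname{Var}(U)/(\operatorname{Var}(U)+\operatorname{Var}(W))$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
alpha, beta = 1.0, 2.0           # true intercept and slope

# Latent regressor and independent errors (illustrative distributions).
U = rng.normal(0.0, 1.0, n)      # Var(U) = 1
W = rng.normal(0.0, 0.5, n)      # measurement error in X, Var(W) = 0.25
V = rng.normal(0.0, 0.5, n)      # error in Y

X = U + W                        # observed regressor
Y = alpha + beta * U + V         # observed response

# Naive least-squares slope Cov(X, Y) / Var(X) converges not to beta
# but to beta * Var(U) / (Var(U) + Var(W)) = 2 * (1 / 1.25) = 1.6.
b_naive = np.cov(X, Y)[0, 1] / np.var(X, ddof=1)
print(b_naive)
```

With these variances the naive slope settles near 1.6 rather than the true value 2, which is why a different class of estimates is needed when both variables carry error.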

Citation


Clifford Spiegelman. "On Estimating the Slope of a Straight Line when Both Variables are Subject to Error." Ann. Statist. 7 (1) 201 - 206, January, 1979. https://doi.org/10.1214/aos/1176344565

Information

Published: January, 1979
First available in Project Euclid: 12 April 2007

zbMATH: 0412.62048
MathSciNet: MR515694
Digital Object Identifier: 10.1214/aos/1176344565

Subjects:
Primary: 62J05

Keywords: 62-02, asymptotic distribution, errors in variables, regression

Rights: Copyright © 1979 Institute of Mathematical Statistics
