## The Annals of Statistics

### Regression Analysis Under Link Violation

#### Abstract

We study the behavior of regression analysis when there might be some violation of the assumed link function, the functional form of the model relating the outcome variable $y$ to the regressor variable $\mathbf{x}$ and the random error. We allow the true link function to be completely arbitrary, except that $y$ depends on $\mathbf{x}$ only through a linear combination $\beta^\top\mathbf{x}$; the slope vector $\beta$ is identified only up to a multiplicative scalar. Under appropriate conditions, any maximum likelihood-type regression estimate is shown to be consistent for $\beta$ up to a multiplicative scalar, even though the estimate might be based on a misspecified link function. The crucial conditions are:

1. the estimate is based on minimizing a criterion function $L(\theta, y)$ which is convex in $\theta$, where $\theta = a + b^\top\mathbf{x}$;
2. the expected criterion function $E\lbrack L(a + b^\top\mathbf{x}, y)\rbrack$ has a proper minimizer; and
3. the regressor variable $\mathbf{x}$ is sampled randomly from a probability distribution such that $E(b^\top\mathbf{x}\mid\beta^\top\mathbf{x})$ is linear in $\beta^\top\mathbf{x}$ for all linear combinations $b^\top\mathbf{x}$.

The least squares estimate, the GLM estimates, and the $M$-estimates for robust regression are discussed in detail. These estimates are asymptotically normal. Under the assumption that the regressor variable has an elliptically symmetric distribution, we show that under a scale-invariant null hypothesis of the form $H_0: \beta W = 0$, the asymptotic covariance matrix of $\hat{\beta}W$ is proportional to the one derived by treating the assumed link function as true. Both the Wald test and the likelihood ratio test for a scale-invariant null hypothesis have the correct asymptotic null distribution after an appropriate rescaling of the test statistic to account for the proportionality constant between the two asymptotic covariance matrices.

For normally distributed $\mathbf{x}$, the rescaling factor for $M$-estimates is the same as the one used in robust regression, while the rescaling factor for GLM estimates is related to the adjustment for overdispersion. Confidence sets can be constructed by inverting Wald tests. The impact of violating the linear conditional expectation condition (3) is discussed. A new dimension is added to regression diagnostics by exploring the elliptical symmetry of the design distribution. A connection between this work and adaptive estimation is briefly discussed.
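The central claim, that a misspecified-link estimate remains consistent for $\beta$ up to a multiplicative scalar when the design satisfies condition (3), can be illustrated numerically. The following sketch (not from the paper; the cubic link, dimensions, and sample size are illustrative choices) draws $\mathbf{x}$ from a standard normal distribution, for which $E(b^\top\mathbf{x}\mid\beta^\top\mathbf{x})$ is linear for every $b$, generates $y$ from a nonlinear link, and fits an ordinary least squares line anyway; the fitted slope still points in the direction of $\beta$.

```python
# Hypothetical illustration of consistency up to a scalar under link
# violation: x is Gaussian (so condition (3) holds), the true link is
# cubic, yet the OLS slope is nearly proportional to beta.
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
beta = np.array([1.0, 2.0, -1.0])          # true direction, known up to scale

x = rng.standard_normal((n, 3))            # elliptically symmetric design
y = (x @ beta) ** 3 + rng.standard_normal(n)  # nonlinear true link + error

# Fit the misspecified linear model y = a + b'x by least squares.
X = np.column_stack([np.ones(n), x])
b_hat = np.linalg.lstsq(X, y, rcond=None)[0][1:]

# Cosine similarity between the OLS slope and beta should be near 1,
# even though the magnitude of b_hat is not that of beta.
cos = b_hat @ beta / (np.linalg.norm(b_hat) * np.linalg.norm(beta))
print(round(cos, 4))
```

Only the direction of $\beta$ is recovered: the proportionality constant depends on the unknown link, which is exactly why $\beta$ is identified only up to a multiplicative scalar.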

#### Article information

Source
Ann. Statist., Volume 17, Number 3 (1989), 1009-1052.

Dates
First available in Project Euclid: 12 April 2007

https://projecteuclid.org/euclid.aos/1176347254

Digital Object Identifier
doi:10.1214/aos/1176347254

Mathematical Reviews number (MathSciNet)
MR1015136

Zentralblatt MATH identifier
0753.62041
