Abstract
This paper deals with $M$-estimators for the linear model $y_i = \mathbf{x}_i'\boldsymbol{\theta} + u_i$, $1 \leqslant i \leqslant n$, where the $\mathbf{x}_i$ are fixed $p$-dimensional vectors and the $u_i$ are i.i.d. random variables with distribution $F$. The estimators considered are solutions $\hat{\boldsymbol{\theta}}$ of the equation $\sum^n_{j=1}\psi(y_j - \mathbf{x}_j'\hat{\boldsymbol{\theta}})\mathbf{x}_j = \mathbf{0}$ for some function $\psi$. Let $\mathbf{X}$ be the matrix whose $i$th row is $\mathbf{x}_i'$. Then it is proved that $(\hat{\boldsymbol{\theta}} - \boldsymbol{\theta})'\mathbf{X}'\mathbf{X}(\hat{\boldsymbol{\theta}} - \boldsymbol{\theta})$ is bounded in probability under a set of conditions on $\psi$, which include that $\psi$ be monotone and that $\mathbf{X}$ have full rank. This implies that a sufficient condition for consistency is that the smallest eigenvalue of $\mathbf{X}'\mathbf{X}$ tend to infinity. For the case in which $p = p_n \rightarrow \infty$ it is proved that $p^{-1}(\hat{\boldsymbol{\theta}} - \boldsymbol{\theta})'\mathbf{X}'\mathbf{X}(\hat{\boldsymbol{\theta}} - \boldsymbol{\theta})$ is bounded in probability, assuming that $p\varepsilon \rightarrow 0$, where $\varepsilon = \max_{1\leqslant i \leqslant n} \mathbf{x}_i'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{x}_i$. The asymptotic normality of these estimators is proved both for fixed $p$ and for $p \rightarrow \infty$. The former is an easy consequence of a result of Bickel on one-step $M$-estimators. In the case of $p \rightarrow \infty$ we assume that $\psi$ has a bounded derivative and that $p^{3/2}\varepsilon \rightarrow 0$. This improves an analogous result of Huber, who requires $p^2\varepsilon \rightarrow 0$.
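As a purely illustrative aside (the paper studies asymptotics, not computation), the estimating equation $\sum_j \psi(y_j - \mathbf{x}_j'\hat{\boldsymbol{\theta}})\mathbf{x}_j = \mathbf{0}$ can be solved numerically by iteratively reweighted least squares. The sketch below uses Huber's $\psi$ as one example of a monotone function with bounded derivative; the tuning constant `c`, tolerance, and iteration cap are illustrative choices not taken from the paper.

```python
# Minimal IRLS sketch for the M-estimating equation (assumptions: Huber psi,
# illustrative tuning constant c = 1.345, ad hoc convergence tolerance).
import numpy as np

def huber_psi(r, c=1.345):
    """Huber's psi: identity for small residuals, clipped at +/- c."""
    return np.clip(r, -c, c)

def m_estimate(X, y, c=1.345, tol=1e-8, max_iter=100):
    """Solve sum_j psi(y_j - x_j' theta) x_j = 0 by iteratively reweighted LS."""
    theta = np.linalg.lstsq(X, y, rcond=None)[0]  # least-squares start
    for _ in range(max_iter):
        r = y - X @ theta
        # IRLS weights w_j = psi(r_j)/r_j, set to 1 where r_j is essentially 0
        w = np.where(np.abs(r) > 1e-12, huber_psi(r, c) / r, 1.0)
        # Weighted normal equations: X' W X theta = X' W y
        XtW = X.T * w
        theta_new = np.linalg.solve(XtW @ X, XtW @ y)
        if np.linalg.norm(theta_new - theta) < tol * (1 + np.linalg.norm(theta)):
            return theta_new
        theta = theta_new
    return theta

# Example: fixed design with n = 200, p = 3, and heavy-tailed i.i.d. errors u_i.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true + rng.standard_t(df=2, size=200)
print(m_estimate(X, y))
```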
Citation
Victor J. Yohai, Ricardo A. Maronna. "Asymptotic Behavior of $M$-Estimators for the Linear Model." Ann. Statist. 7(2): 258-268, March 1979. https://doi.org/10.1214/aos/1176344610