The Annals of Statistics

Nonparametric Regression with Errors in Variables

Jianqing Fan and Young K. Truong



Abstract

The effect of errors in variables on nonparametric regression estimation is examined. To account for errors in the covariates, deconvolution is used to construct a new class of kernel estimators. It is shown that the optimal local and global rates of convergence of these kernel estimators are characterized by the tail behavior of the characteristic function of the error distribution; in fact, there are two types of convergence rates, according to whether the error is ordinary smooth or super smooth. These results hold uniformly over a class of joint distributions of the response and the covariate that is rich enough for many practical applications. Finally, to establish optimality, it is shown that the convergence rates of all possible estimators are bounded below by the rates attained by the kernel estimators.
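The estimators studied here replace the ordinary kernel with a deconvoluting kernel K_n(u) = (1/2π) ∫ e^{-itu} φ_K(t)/φ_U(t/h) dt, where φ_K is the Fourier transform of a kernel and φ_U the (known) characteristic function of the measurement error. The following is a minimal NumPy sketch, not the authors' implementation: it assumes Laplace (ordinary smooth) error with known scale `sigma`, the illustrative Fourier kernel φ_K(t) = (1 − t²)³ on [−1, 1], and hypothetical function names.

```python
import numpy as np

def deconv_kernel(u, h, sigma):
    """Deconvoluting kernel K_n(u) = (1/2pi) * int e^{-itu} phi_K(t)/phi_U(t/h) dt.

    Illustrative assumptions: Laplace(0, sigma) measurement error, so
    phi_U(t) = 1/(1 + sigma^2 t^2) (ordinary smooth), and the compactly
    supported Fourier kernel phi_K(t) = (1 - t^2)^3 on [-1, 1].
    """
    u = np.asarray(u, dtype=float)
    t = np.linspace(-1.0, 1.0, 401)
    dt = t[1] - t[0]
    phi_K = (1.0 - t**2) ** 3
    # 1/phi_U(t/h) = 1 + (sigma*t/h)^2 is bounded on [-1, 1], so the
    # inversion integral is stable; cos() suffices since the integrand is even.
    integrand = np.cos(u[..., None] * t) * phi_K * (1.0 + (sigma * t / h) ** 2)
    return integrand.sum(axis=-1) * dt / (2.0 * np.pi)

def deconv_regression(x, W, Y, h, sigma):
    """Nadaraya-Watson-type estimate of m(x) = E[Y | X = x] from (W, Y),
    where only the noisy covariate W = X + U is observed."""
    K = deconv_kernel((np.asarray(x)[:, None] - W[None, :]) / h, h, sigma)
    return (K @ Y) / K.sum(axis=1)
```

For super smooth errors (e.g., Gaussian), 1/φ_U(t/h) grows exponentially in t/h, which is what drives the slower logarithmic rates described in the abstract; the compact support of φ_K is what keeps the integral finite in either case.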

Article information

Ann. Statist., Volume 21, Number 4 (1993), 1900-1925.

First available in Project Euclid: 12 April 2007



Primary: 62G20: Asymptotic properties
Secondary: 62G05: Estimation 62J99: None of the above, but in this section

Keywords: errors in variables; nonparametric regression; deconvolution; kernel estimator; optimal rates of convergence


Fan, Jianqing; Truong, Young K. Nonparametric Regression with Errors in Variables. Ann. Statist. 21 (1993), no. 4, 1900--1925. doi:10.1214/aos/1176349402.
