The Annals of Statistics

Multivariate Locally Weighted Least Squares Regression

D. Ruppert and M. P. Wand



Nonparametric regression using locally weighted least squares was first discussed by Stone and by Cleveland. Recently, it was shown by Fan and by Fan and Gijbels that the local linear kernel-weighted least squares regression estimator has asymptotic properties making it superior, in certain senses, to the Nadaraya-Watson and Gasser-Müller kernel estimators. In this paper we extend their results on asymptotic bias and variance to the case of multivariate predictor variables. We derive the leading bias and variance terms for general multivariate kernel weights using weighted least squares matrix theory. This approach is especially convenient when analysing the asymptotic conditional bias and variance of the estimator at points near the boundary of the support of the predictors. We also investigate the asymptotic properties of the multivariate local quadratic least squares regression estimator discussed by Cleveland and Devlin and, in the univariate case, higher-order polynomial fits and derivative estimation.
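To make the estimator concrete, the following is a minimal sketch of the multivariate local linear kernel-weighted least squares fit described above. A Gaussian kernel with bandwidth matrix H is used as one illustrative weighting choice (the paper treats general multivariate kernel weights); the function name and parameterisation are our own, not the authors'.

```python
import numpy as np

def local_linear_fit(X, y, x0, H):
    """Local linear kernel-weighted least squares estimate of m(x0).

    X  : (n, d) matrix of predictors
    y  : (n,) vector of responses
    x0 : (d,) evaluation point
    H  : (d, d) nonsingular bandwidth matrix (illustrative choice)
    """
    n, d = X.shape
    Z = X - x0                                # centre predictors at x0
    # Gaussian kernel weights K(H^{-1}(x_i - x0)); one common choice
    U = Z @ np.linalg.inv(H).T
    w = np.exp(-0.5 * np.sum(U**2, axis=1))
    # Local linear design matrix: intercept plus linear terms in (x - x0)
    D = np.hstack([np.ones((n, 1)), Z])
    W = np.diag(w)
    # Weighted least squares normal equations
    beta = np.linalg.solve(D.T @ W @ D, D.T @ W @ y)
    return beta[0]                            # intercept estimates m(x0)
```

Because the fit is exactly a weighted least squares problem, the intercept of the local fit is the regression estimate at x0, and the remaining coefficients estimate the gradient of m at x0; when m itself is linear, the estimator reproduces it without bias, consistent with the bias expansions discussed in the paper.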

Article information

Ann. Statist., Volume 22, Number 3 (1994), 1346-1370.

First available in Project Euclid: 11 April 2007


Primary: 62G07: Density estimation
Secondary: 62J02: General nonlinear regression

Keywords: bandwidth matrix, boundary effects, derivative estimation, kernel estimator, local polynomial fitting, nonparametric regression, weighted least squares


Ruppert, D.; Wand, M. P. Multivariate Locally Weighted Least Squares Regression. Ann. Statist. 22 (1994), no. 3, 1346--1370. doi:10.1214/aos/1176325632.
