Open Access
October 2005
Cross-validation in nonparametric regression with outliers
Denis Heng-Yan Leung
Ann. Statist. 33(5): 2291-2310 (October 2005). DOI: 10.1214/009053605000000499

Abstract

A popular data-driven method for choosing the bandwidth in standard kernel regression is cross-validation. Even when there are outliers in the data, robust kernel regression can be used to estimate the unknown regression curve [Robust and Nonlinear Time Series Analysis. Lecture Notes in Statist. (1984) 26 163–184]. Under these circumstances, however, standard cross-validation is no longer a satisfactory bandwidth selector because it is unduly influenced by the extreme prediction errors that the outliers produce. The method proposed here is a robust cross-validation criterion that discounts extreme prediction errors. In large samples the robust method chooses consistent bandwidths, and this consistency is practically independent of the form in which extreme prediction errors are discounted. A simulation study of the method's finite-sample behavior shows that it performs favorably. The method can also be applied to other problems that require cross-validation, for example, model selection.
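The core idea, replacing the squared leave-one-out prediction error with a criterion that discounts extreme errors, can be sketched in a few lines. The sketch below is illustrative only: it pairs an ordinary Nadaraya-Watson smoother with Huber-type discounting of the leave-one-out errors, whereas the paper works with a robust kernel smoother and establishes consistency for a general class of discounting functions. All function names, the Gaussian kernel, and the Huber tuning constant are assumptions made for this example, not the paper's notation.

```python
# Sketch (assumed, not the paper's exact criterion): bandwidth selection by
# leave-one-out cross-validation, comparing squared-error CV with a robust CV
# that bounds the influence of extreme prediction errors via Huber's loss.
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson estimate at x0 with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def huber_rho(r, c=1.345):
    """Huber loss: quadratic for small residuals, linear for large ones.
    In practice residuals would first be standardized by a robust scale
    estimate (e.g., MAD); that step is omitted here for brevity."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r ** 2, c * a - 0.5 * c ** 2)

def loo_cv(x, y, h, loss):
    """Leave-one-out CV score for bandwidth h under a given loss function."""
    n = len(x)
    errors = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        errors[i] = y[i] - nw_estimate(x[i], x[mask], y[mask], h)
    return np.mean(loss(errors))

def select_bandwidth(x, y, grid, loss):
    """Return the bandwidth in `grid` that minimizes the chosen CV criterion."""
    scores = [loo_cv(x, y, h, loss) for h in grid]
    return grid[int(np.argmin(scores))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 1, 100))
    y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(100)
    y[rng.choice(100, 5, replace=False)] += 5.0        # inject a few outliers

    grid = np.linspace(0.02, 0.3, 30)
    h_std = select_bandwidth(x, y, grid, lambda r: r ** 2)  # standard CV
    h_rob = select_bandwidth(x, y, grid, huber_rho)          # robust CV
    print(f"standard CV bandwidth: {h_std:.3f}, robust CV bandwidth: {h_rob:.3f}")
```

With the injected outliers, the squared-error criterion tends to favor a smaller bandwidth that chases the contaminated points, while the bounded loss discounts those extreme prediction errors; the abstract's point is that, asymptotically, the particular choice of discounting function matters little.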

Citation


Denis Heng-Yan Leung. "Cross-validation in nonparametric regression with outliers." Ann. Statist. 33 (5) 2291 - 2310, October 2005. https://doi.org/10.1214/009053605000000499

Information

Published: October 2005
First available in Project Euclid: 25 November 2005

zbMATH: 1086.62055
MathSciNet: MR2211087
Digital Object Identifier: 10.1214/009053605000000499

Subjects:
Primary: 62G08
Secondary: 62F35, 62F40

Keywords: bandwidth, cross-validation, kernel, nonparametric regression, robust, smoothing

Rights: Copyright © 2005 Institute of Mathematical Statistics
