Open Access
August 2007
Consistency and robustness of kernel-based regression in convex risk minimization
Andreas Christmann, Ingo Steinwart
Bernoulli 13(3): 799-819 (August 2007). DOI: 10.3150/07-BEJ5102

Abstract

We investigate statistical properties of a broad class of modern kernel-based regression (KBR) methods. These kernel methods were developed during the last decade and are inspired by convex risk minimization in infinite-dimensional Hilbert spaces; one leading example is support vector regression. We first describe the relationship between the loss function L of the KBR method and the tail of the distribution of the response variable. We then establish L-risk consistency for KBR, which provides the mathematical justification for the claim that these methods are able to “learn”. Next, we consider robustness properties of such kernel methods. In particular, our results allow us to choose the loss function and the kernel so as to obtain computationally tractable and consistent KBR methods that have bounded influence functions. Furthermore, bounds for the bias and for the sensitivity curve, a finite-sample version of the influence function, are developed, and the relationship between KBR and classical M-estimators is discussed.
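For orientation, the following is a sketch of the standard objects behind these results; the notation (the RKHS H, the estimator f_{D,\lambda}) follows common convention and is not taken verbatim from the paper. KBR estimates a regression function by regularized empirical risk minimization over a reproducing kernel Hilbert space H:

\[ f_{D,\lambda} = \arg\min_{f \in H} \; \frac{1}{n} \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr) + \lambda \|f\|_H^2 , \]

where L is a convex loss function, for example the \varepsilon-insensitive loss L(y,t) = \max\{0, |y-t| - \varepsilon\} of support vector regression, and \lambda > 0 is a regularization parameter. Robustness is quantified via Hampel's influence function of a statistical functional T at a distribution P,

\[ \mathrm{IF}(z; T, P) = \lim_{\varepsilon \downarrow 0} \frac{T\bigl((1-\varepsilon)P + \varepsilon \delta_z\bigr) - T(P)}{\varepsilon} , \]

and its finite-sample analogue, Tukey's sensitivity curve,

\[ \mathrm{SC}_n(z) = n \Bigl( T_n(z_1, \ldots, z_{n-1}, z) - T_{n-1}(z_1, \ldots, z_{n-1}) \Bigr) . \]

A bounded influence function means that a single contaminating point z has only a limited effect on the estimate.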

Citation


Andreas Christmann and Ingo Steinwart. "Consistency and robustness of kernel-based regression in convex risk minimization." Bernoulli 13(3): 799–819, August 2007. https://doi.org/10.3150/07-BEJ5102

Information

Published: August 2007
First available in Project Euclid: 7 August 2007

zbMATH: 1129.62031
MathSciNet: MR2348751
Digital Object Identifier: 10.3150/07-BEJ5102

Keywords: consistency, convex risk minimization, influence function, nonparametric regression, robustness, sensitivity curve, support vector regression

Rights: Copyright © 2007 Bernoulli Society for Mathematical Statistics and Probability
