Journal of Applied Mathematics

Approximation Analysis of Learning Algorithms for Support Vector Regression and Quantile Regression

Dao-Hong Xiang, Ting Hu, and Ding-Xuan Zhou

Full-text: Open access

Abstract

We study learning algorithms generated by regularization schemes in reproducing kernel Hilbert spaces associated with an ε-insensitive pinball loss. This loss function is motivated by the ε-insensitive loss for support vector regression and the pinball loss for quantile regression. Approximation analysis is conducted for these algorithms by means of a variance-expectation bound when a noise condition is satisfied by the underlying probability measure. The rates are derived explicitly under a priori conditions on the approximation ability and capacity of the reproducing kernel Hilbert space. As an application, we obtain approximation orders for support vector regression and quantile regularized regression.
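For orientation, the sketch below records the two standard losses named in the abstract, one natural way they may be combined into an ε-insensitive pinball loss, and the generic form of a kernel regularization scheme. The combined loss L^ε_τ, the sample notation z = {(x_i, y_i)}, and the regularization parameter λ are illustrative assumptions consistent with the abstract, not necessarily the exact definitions used in the paper (for instance, the insensitive zone could be placed asymmetrically).

% Standard epsilon-insensitive loss used in support vector regression:
\[
  \psi_\epsilon(u) = \max\{|u| - \epsilon,\, 0\}, \qquad \epsilon \ge 0.
\]
% Standard pinball loss at quantile level tau in (0,1):
\[
  \rho_\tau(u) =
  \begin{cases}
    \tau u, & u \ge 0,\\
    (\tau - 1)u, & u < 0.
  \end{cases}
\]
% One plausible (assumed) form of the epsilon-insensitive pinball loss:
% pinball slopes outside a symmetric insensitive band of width 2*epsilon.
\[
  L^{\epsilon}_{\tau}(u) =
  \begin{cases}
    \tau (u - \epsilon), & u > \epsilon,\\
    0, & |u| \le \epsilon,\\
    (1 - \tau)(-u - \epsilon), & u < -\epsilon.
  \end{cases}
\]
% Generic regularization scheme over an RKHS (H_K, ||.||_K) with sample
% z = {(x_i, y_i)}_{i=1}^m and regularization parameter lambda > 0:
\[
  f_z = \arg\min_{f \in \mathcal{H}_K}
  \Big\{ \frac{1}{m} \sum_{i=1}^{m} L^{\epsilon}_{\tau}\big(y_i - f(x_i)\big)
  + \lambda \|f\|_K^2 \Big\}.
\]

Under this assumed form, setting ε = 0 recovers the pinball loss of quantile regression, while setting τ = 1/2 gives the ε-insensitive loss of support vector regression up to a factor of 1/2, which matches the motivation stated in the abstract.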

Article information

Source
J. Appl. Math., Volume 2012 (2012), Article ID 902139, 17 pages.

Dates
First available in Project Euclid: 17 October 2012

Permanent link to this document
https://projecteuclid.org/euclid.jam/1350479410

Digital Object Identifier
doi:10.1155/2012/902139

Mathematical Reviews number (MathSciNet)
MR2880823

Zentralblatt MATH identifier
1235.68206

Citation

Xiang, Dao-Hong; Hu, Ting; Zhou, Ding-Xuan. Approximation Analysis of Learning Algorithms for Support Vector Regression and Quantile Regression. J. Appl. Math. 2012 (2012), Article ID 902139, 17 pages. doi:10.1155/2012/902139. https://projecteuclid.org/euclid.jam/1350479410
