Abstract and Applied Analysis

Regularized Ranking with Convex Losses and $\ell^1$-Penalty

Heng Chen and Jitao Wu

Full-text: Open access

Abstract

In the ranking problem, one compares two different observations and decides the ordering between them. The problem has received increasing attention in both the statistics and machine learning literature. This paper considers $\ell^1$-regularized ranking rules with convex losses. Under some mild conditions, a learning rate is established.
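To fix ideas, a schematic version of this type of estimator (the notation below is illustrative and is not taken from the paper itself) minimizes an $\ell^1$-penalized empirical ranking risk over a coefficient-based hypothesis space:

$$
f_{\mathbf z} \;=\; \operatorname*{arg\,min}_{f=\sum_{k=1}^{m} c_k K(\cdot,\,x_k)}
\;\frac{1}{m(m-1)} \sum_{i\neq j} \phi\bigl(\operatorname{sgn}(y_i-y_j)\,(f(x_i)-f(x_j))\bigr)
\;+\; \lambda \sum_{k=1}^{m} |c_k|,
$$

where $\phi$ is a convex surrogate loss (for instance the hinge loss $\phi(t)=\max\{0,\,1-t\}$), $K$ is a kernel, and $\lambda>0$ trades off the pairwise fit against sparsity of the coefficients $c_k$.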

Article information

Source
Abstr. Appl. Anal., Volume 2013, Special Issue (2013), Article ID 927827, 8 pages.

Dates
First available in Project Euclid: 26 February 2014

Permanent link to this document
https://projecteuclid.org/euclid.aaa/1393450309

Digital Object Identifier
doi:10.1155/2013/927827

Mathematical Reviews number (MathSciNet)
MR3139446

Zentralblatt MATH identifier
07095500

Citation

Chen, Heng; Wu, Jitao. Regularized Ranking with Convex Losses and ${\ell}^{1}$-Penalty. Abstr. Appl. Anal. 2013, Special Issue (2013), Article ID 927827, 8 pages. doi:10.1155/2013/927827. https://projecteuclid.org/euclid.aaa/1393450309


