Support vector machines with a reject option
Abstract
This paper studies $\ell_1$ regularization with high-dimensional features for support vector machines with a built-in reject option (meaning that the decision to classify an observation can be withheld at a cost lower than that of misclassification). The procedure can be conveniently implemented as a linear program and computed using standard software. We prove that the minimizer of the penalized population risk favors sparse solutions and show that the behavior of the empirical risk minimizer mimics that of the population risk minimizer. We also introduce a notion of classification complexity and prove that our minimizers adapt to the unknown complexity. Using a novel oracle inequality for the excess risk, we identify situations where fast rates of convergence occur.
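The abstract does not reproduce the surrogate loss or the exact linear program, so the following is a minimal sketch only. It assumes the generalized hinge loss $\phi_d(z) = \max(0,\, 1-z,\, 1-az)$ with $a = (1-d)/d$ for a rejection cost $d \in (0, 1/2)$ (the reject-option surrogate of Bartlett and Wegkamp), a linear score $f(x) = w \cdot x + b$, and an $\ell_1$ penalty on $w$; the function names, the regularization level `lam`, and the reject threshold `delta` are illustrative choices, not the authors' specification.

```python
# A minimal sketch (not from the paper) of an l1-penalized reject-option SVM
# solved as a linear program with scipy.optimize.linprog.
#
# Assumed loss: phi_d(z) = max(0, 1 - z, 1 - a*z), a = (1 - d)/d,
# with rejection cost d in (0, 1/2); lam and delta are illustrative.

import numpy as np
from scipy.optimize import linprog


def fit_reject_svm_l1(X, y, d=0.2, lam=0.1):
    """Minimize (1/n) * sum_i phi_d(y_i f(x_i)) + lam * ||w||_1 via an LP.

    Variables: w = w_plus - w_minus (both >= 0), intercept b, and slacks
    xi_i >= max(0, 1 - y_i f(x_i), 1 - a y_i f(x_i)).
    """
    n, p = X.shape
    a = (1.0 - d) / d  # slope of the loss on the negative-margin side

    # Variable order: [w_plus (p), w_minus (p), b (1), xi (n)]
    c = np.concatenate([lam * np.ones(p), lam * np.ones(p),
                        [0.0], np.ones(n) / n])

    # Constraints xi_i >= 1 - s * y_i (x_i.(w+ - w-) + b) for s in {1, a},
    # rewritten as A_ub @ vars <= b_ub.
    rows = []
    for s in (1.0, a):
        block = np.zeros((n, 2 * p + 1 + n))
        block[:, :p] = -s * y[:, None] * X        # w_plus coefficients
        block[:, p:2 * p] = s * y[:, None] * X    # w_minus coefficients
        block[:, 2 * p] = -s * y                  # intercept
        block[:, 2 * p + 1:] = -np.eye(n)         # slack xi_i
        rows.append(block)
    A_ub = np.vstack(rows)
    b_ub = -np.ones(2 * n)

    bounds = [(0, None)] * (2 * p) + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w = res.x[:p] - res.x[p:2 * p]
    b = res.x[2 * p]
    return w, b


def predict_with_reject(X, w, b, delta=0.5):
    """Return +1/-1 labels, or 0 when |f(x)| <= delta (reject)."""
    f = X @ w + b
    out = np.sign(f)
    out[np.abs(f) <= delta] = 0
    return out


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 200, 50                        # high-dimensional toy data
    X = rng.standard_normal((n, p))
    y = np.sign(X[:, 0] - X[:, 1] + 0.3 * rng.standard_normal(n))
    w, b = fit_reject_svm_l1(X, y, d=0.2, lam=0.05)
    print("nonzero coefficients:", np.sum(np.abs(w) > 1e-6))
    print("rejection rate:", np.mean(predict_with_reject(X, w, b) == 0))
```

Because the surrogate is a maximum of affine functions and the $\ell_1$ penalty splits into positive and negative parts, the whole problem is a single linear program, which is the "standard software" point made in the abstract; the sparsity of the fitted $w$ reflects the $\ell_1$ penalty.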
Article information
Source
Bernoulli, Volume 17, Number 4 (2011), 1368-1385.
Dates
First available in Project Euclid: 4 November 2011
Permanent link to this document
https://projecteuclid.org/euclid.bj/1320417508
Digital Object Identifier
doi:10.3150/10-BEJ320
Mathematical Reviews number (MathSciNet)
MR2854776
Zentralblatt MATH identifier
1243.68256
Keywords
adaptive prediction; classification with a reject option; lasso; oracle inequalities; sparsity; support vector machines; statistical learning
Citation
Wegkamp, Marten; Yuan, Ming. Support vector machines with a reject option. Bernoulli 17 (2011), no. 4, 1368--1385. doi:10.3150/10-BEJ320. https://projecteuclid.org/euclid.bj/1320417508