Open Access
Improved classification rates under refined margin conditions
Ingrid Blaschzyk, Ingo Steinwart
Electron. J. Statist. 12(1): 793-823 (2018). DOI: 10.1214/18-EJS1406

Abstract

In this paper we present a simple partitioning-based technique to refine the statistical analysis of classification algorithms. The core of our idea is to divide the input space into two parts, such that the first part contains a suitable vicinity around the decision boundary while the second part is sufficiently far away from it. Using a set of margin conditions, we are then able to control the classification error on each part separately. By balancing these two error terms we obtain a refined error analysis in a final step. We apply this general idea to the histogram rule and show that, under certain assumptions, even this simple method achieves better rates than those known for support vector machines, for certain plug-in classifiers, and for a recently analyzed tree-based adaptive-partitioning ansatz. Moreover, we show that a margin condition which sets the critical noise in relation to the decision boundary makes it possible to improve upon the optimal rates proven for distributions without this margin condition.
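For readers unfamiliar with the histogram rule analyzed in the paper, the following is a minimal illustrative sketch: the input space is partitioned into cubes of side length h, and each cell is labeled by a majority vote over the training labels it contains. Function names, the unit-cube domain, and the {-1, +1} label convention are our assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def histogram_rule_fit(X, y, h):
    """Partition [0, 1)^d into cubes of side length h and record,
    per cell, the counts of negative and positive training labels.
    (Illustrative sketch; not the paper's formal construction.)"""
    cells = {}
    for x, label in zip(X, y):
        key = tuple(np.floor(x / h).astype(int))  # index of the cube containing x
        counts = cells.setdefault(key, [0, 0])    # [count of -1, count of +1]
        counts[0 if label < 0 else 1] += 1
    return cells

def histogram_rule_predict(cells, X, h):
    """Classify each point by the majority label of its cell;
    empty cells default to +1 via the tie-breaking rule below."""
    preds = []
    for x in X:
        key = tuple(np.floor(x / h).astype(int))
        counts = cells.get(key, [0, 0])
        preds.append(1 if counts[1] >= counts[0] else -1)
    return np.array(preds)

# Usage: 1-D data with decision boundary at 0.5, aligned with the grid.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 1))
y = np.where(X[:, 0] > 0.5, 1, -1)
cells = histogram_rule_fit(X, y, h=0.1)
preds = histogram_rule_predict(cells, X, h=0.1)
```

The paper's refinement can be read in these terms: cells near the decision boundary contribute the dominant error term, while cells far from it are classified essentially without error, and the rate is obtained by balancing the two contributions via the choice of h.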

Citation


Ingrid Blaschzyk, Ingo Steinwart. "Improved classification rates under refined margin conditions." Electron. J. Statist. 12(1): 793-823, 2018. https://doi.org/10.1214/18-EJS1406

Information

Received: 1 October 2016; Published: 2018
First available in Project Euclid: 3 March 2018

zbMATH: 06864477
MathSciNet: MR3770888
Digital Object Identifier: 10.1214/18-EJS1406

Subjects:
Primary: 62H30
Secondary: 62G20 , 68T05

Keywords: classification, excess risk, fast rates of convergence, histogram rule, statistical learning

Vol. 12 • No. 1 • 2018