Electronic Journal of Statistics

Sufficient dimension reduction via principal L$q$ support vector machine

Andreas Artemiou and Yuexiao Dong


Abstract

The principal support vector machine was recently proposed by Li, Artemiou and Li (2011) to combine the L1 support vector machine with sufficient dimension reduction. We introduce the principal L$q$ support vector machine as a unified framework for linear and nonlinear sufficient dimension reduction. Since the solution of the L1 support vector machine may not be unique, we set $q>1$ to ensure uniqueness of the solution. The asymptotic distributions of the proposed estimators are derived for $q>1$. We demonstrate through numerical studies that the proposed L2 support vector machine estimators improve on existing methods in accuracy and are less sensitive to the choice of tuning parameter.
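
For orientation, here is a minimal sketch of the population-level objective behind this idea, assuming it takes the form of the principal support vector machine objective of Li, Artemiou and Li (2011) with the hinge loss raised to the $q$-th power; the notation below follows that paper rather than this article:

$$\min_{\psi,\,t}\ \psi^{\top}\Sigma\,\psi + \lambda\,E\bigl[1-\tilde{Y}\bigl(\psi^{\top}(X-E X)-t\bigr)\bigr]_{+}^{q},$$

where $\Sigma=\mathrm{var}(X)$, $\tilde{Y}\in\{-1,1\}$ is the response dichotomized at a slicing point, $\lambda>0$ is a tuning parameter, and $[u]_{+}=\max(u,0)$. Taking $q=1$ recovers the hinge loss of the L1 support vector machine, whose minimizer need not be unique; for $q>1$ the loss is differentiable, which underlies the uniqueness result described in the abstract.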

Article information

Source
Electron. J. Statist., Volume 10, Number 1 (2016), 783–805.

Dates
Received: August 2014
First available in Project Euclid: 6 April 2016

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1459967423

Digital Object Identifier
doi:10.1214/16-EJS1122

Mathematical Reviews number (MathSciNet)
MR3486417

Zentralblatt MATH identifier
06576607

Keywords
Inverse regression; L2 support vector machine; Reproducing kernel Hilbert space

Citation

Artemiou, Andreas; Dong, Yuexiao. Sufficient dimension reduction via principal L$q$ support vector machine. Electron. J. Statist. 10 (2016), no. 1, 783–805. doi:10.1214/16-EJS1122. https://projecteuclid.org/euclid.ejs/1459967423



References

  • Abe, S. (2002). Analysis of support vector machines. In Neural Networks for Signal Processing XII – Proceedings of the 2002 IEEE Signal Processing Society Workshop, 89–98.
  • Abe, S. (2010). Support Vector Machines for Pattern Classification. Second Edition. Springer.
  • Artemiou, A. and Shu, M. (2014). A cost based reweighted scheme of principal support vector machine. In Topics in Nonparametric Statistics, Springer Proceedings in Mathematics & Statistics, 74, 1–12.
  • Bura, E. and Pfeiffer, R. (2008). On the distribution of the left singular vectors of a random matrix and its applications. Statistics and Probability Letters, 78, 2275–2280.
  • Burges, C. J. C. and Crisp, D. J. (1999). Uniqueness of the SVM solution. In Proceedings of Neural Information Processing Systems, 12, 223–229.
  • Cook, R. D. (1998a). Regression Graphics: Ideas for Studying Regressions through Graphics. New York: Wiley.
  • Cook, R. D. (1998b). Principal Hessian directions revisited (with discussion). Journal of the American Statistical Association, 93, 84–100.
  • Cook, R. D. (2004). Testing predictor contributions in sufficient dimension reduction. The Annals of Statistics, 32, 1062–1092.
  • Cook, R. D. (2007). Fisher lecture: dimension reduction in regression. Statistical Science, 22, 1–40.
  • Cook, R. D. and Weisberg, S. (1991). Discussion of “Sliced inverse regression for dimension reduction”. Journal of the American Statistical Association, 86, 328–332.
  • Cortes, C. and Vapnik, V. (1995). Support-vector networks. Machine Learning, 20, 273–297.
  • Fukumizu, K., Bach, F. R., and Jordan, M. I. (2009). Kernel dimension reduction in regression. The Annals of Statistics, 37, 1871–1905.
  • Jiang, B., Zhang, X., and Cai, T. (2008). Estimating the confidence interval for prediction errors of support vector machine classifiers. Journal of Machine Learning Research, 9, 521–540.
  • Koo, J.-Y., Lee, Y., Kim, Y. and Park, C. (2008). A Bahadur representation of the linear support vector machine. Journal of Machine Learning Research, 9, 1343–1368.
  • Li, B., Artemiou, A. and Li, L. (2011). Principal support vector machine for linear and nonlinear sufficient dimension reduction. The Annals of Statistics, 39, 3182–3210.
  • Li, B. and Wang, S. (2007). On directional regression for dimension reduction. Journal of the American Statistical Association, 102, 997–1008.
  • Li, B., Zha, H., and Chiaromonte, F. (2005). Contour regression: a general approach to dimension reduction. The Annals of Statistics, 33, 1580–1616.
  • Li, K. C. (1991). Sliced inverse regression for dimension reduction (with discussion). Journal of the American Statistical Association, 86, 316–342.
  • Shin, S. J., Wu, Y., Zhang, H. H. and Liu, Y. (2014). Probability-enhanced sufficient dimension reduction for binary classification. Biometrics, 70, 546–555.
  • Suykens, J. A. K., Van Gestel, T., De Brabanter, J., De Moor, B. and Vandewalle, J. (2002). Least Squares Support Vector Machines. World Scientific Pub. Co., Singapore.
  • Vapnik, V. N. (1998). Statistical Learning Theory. John Wiley & Sons, Inc.
  • Wang, Q. and Yin, X. (2008). Sufficient dimension reduction and variable selection for regression mean function with categorical predictors. Statistics and Probability Letters, 78, 2798–2803.
  • Wu, H. M. (2008). Kernel sliced inverse regression with applications to classification. Journal of Computational and Graphical Statistics, 17, 590–610.
  • Xia, Y., Tong, H., Li, W. K. and Zhu, L. X. (2002). An adaptive estimation of dimension reduction space. Journal of the Royal Statistical Society, Series B, 64, 363–410.
  • Ye, Z. and Weiss, R. (2003). Using the bootstrap to select one of a new class of dimension reduction methods. Journal of the American Statistical Association, 98, 968–979.
  • Yeh, I. C. (2007). Modeling slump flow of concrete using second-order regressions and artificial neural networks. Cement and Concrete Composites, 29, 474–480.
  • Yeh, Y. R., Huang, S. Y. and Lee, Y. Y. (2009). Nonlinear dimension reduction with kernel sliced inverse regression. IEEE Transactions on Knowledge and Data Engineering, 21, 1590–1603.
  • Yin, X., Li, B. and Cook, R. D. (2008). Successive direction extraction for estimating the central subspace in a multiple-index regression. Journal of Multivariate Analysis, 99, 1733–1757.
  • Zhu, L. X., Miao, B. and Peng, H. (2006). On sliced inverse regression with large dimensional covariates. Journal of the American Statistical Association, 101, 630–643.