The Annals of Statistics

Dimension reduction based on constrained canonical correlation and variable filtering

Jianhui Zhou and Xuming He


Abstract

The “curse of dimensionality” has remained a challenge for high-dimensional data analysis in statistics. The sliced inverse regression (SIR) and canonical correlation (CANCOR) methods aim to reduce the dimensionality of data by replacing the explanatory variables with a small number of composite directions without losing much information. However, the estimated composite directions generally involve all of the variables, making their interpretation difficult. To simplify the direction estimates, Ni, Cook and Tsai [Biometrika 92 (2005) 242–247] proposed the shrinkage sliced inverse regression (SSIR) based on SIR. In this paper, we propose the constrained canonical correlation (C3) method based on CANCOR, followed by a simple variable filtering method. As a result, each composite direction consists of a subset of the variables for interpretability as well as predictive power. The proposed method aims to identify simple structures without sacrificing the desirable properties of the unconstrained CANCOR estimates. The simulation studies demonstrate the performance advantage of the proposed C3 method over the SSIR method. We also use the proposed method in two examples for illustration.
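As a rough, hand-rolled sketch of the general idea only (not the C3 algorithm or the variable filtering step of the paper), the Python snippet below computes unconstrained CANCOR-style directions between the predictors and a spline basis of the response, then soft-thresholds the leading direction to mimic an L_1-type sparsity constraint. The data-generating model, the truncated-power basis, and the soft-thresholding step are illustrative assumptions.

import numpy as np

def spline_basis(y, degree=3, n_knots=5):
    """Truncated-power spline basis of the (standardized) response y."""
    y = (y - y.mean()) / y.std()
    knots = np.quantile(y, np.linspace(0, 1, n_knots + 2)[1:-1])
    cols = [y ** d for d in range(1, degree + 1)]
    cols += [np.maximum(y - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)

def cancor_directions(X, F):
    """Canonical-correlation directions of X with the basis matrix F."""
    Xc, Fc = X - X.mean(axis=0), F - F.mean(axis=0)
    n = len(X)
    Sxx, Sff, Sxf = Xc.T @ Xc / n, Fc.T @ Fc / n, Xc.T @ Fc / n
    # Eigenvalues of M are the squared canonical correlations.
    M = np.linalg.solve(Sxx, Sxf) @ np.linalg.solve(Sff, Sxf.T)
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(-vals.real)
    return vals.real[order], vecs.real[:, order]

def soft_threshold(b, lam):
    """Soft-threshold a direction and renormalize: a crude sparsity surrogate."""
    s = np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)
    nrm = np.linalg.norm(s)
    return s / nrm if nrm > 0 else s

# Toy single-index model: y depends on X only through x1 + x2.
rng = np.random.default_rng(0)
n, p = 400, 10
X = rng.normal(size=(n, p))
y = (X[:, 0] + X[:, 1]) ** 3 + rng.normal(size=n)

vals, B = cancor_directions(X, spline_basis(y))
b1 = B[:, 0] / np.linalg.norm(B[:, 0])
print("squared canonical correlation:", round(vals[0], 3))
print("leading direction:", np.round(b1, 2))
print("after soft-thresholding:", np.round(soft_threshold(b1, 0.2), 2))

In this toy example the leading direction concentrates on the first two coordinates, and the soft-thresholding step zeroes out the remaining small loadings, which is the kind of interpretable, sparse composite direction that the constrained estimate and variable filtering in the paper are designed to produce in a principled way.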

Article information

Source
Ann. Statist., Volume 36, Number 4 (2008), 1649–1668.

Dates
First available in Project Euclid: 16 July 2008

Permanent link to this document
https://projecteuclid.org/euclid.aos/1216237295

Digital Object Identifier
doi:10.1214/07-AOS529

Mathematical Reviews number (MathSciNet)
MR2435451

Zentralblatt MATH identifier
1142.62045

Subjects
Primary: 62J07: Ridge regression; shrinkage estimators
Secondary: 62H20: Measures of association (correlation, canonical correlation, etc.)

Keywords
Canonical correlation; dimension reduction; L_1-norm constraint

Citation

Zhou, Jianhui; He, Xuming. Dimension reduction based on constrained canonical correlation and variable filtering. Ann. Statist. 36 (2008), no. 4, 1649–1668. doi:10.1214/07-AOS529. https://projecteuclid.org/euclid.aos/1216237295



References

  • [1] Chen, C.-H. and Li, K.-C. (1998). Can SIR be as popular as multiple linear regression? Statist. Sinica 8 289–316.
  • [2] Cook, R. D. (1994). Using dimension-reduction subspaces to identify important inputs in models of physical systems. In 1994 Proceedings of the Section on Physical Engineering Sciences 18–25. Amer. Statist. Assoc., Alexandria, VA.
  • [3] Cook, R. D. (2004). Testing predictor contributions in sufficient dimension reduction. Ann. Statist. 32 1062–1092.
  • [4] Cook, R. D. and Critchley, F. (2000). Identifying outliers and regression mixtures graphically. J. Amer. Statist. Assoc. 95 781–794.
  • [5] Cook, R. D. and Weisberg, S. (1991). Discussion of “Sliced inverse regression for dimension reduction” by K. C. Li. J. Amer. Statist. Assoc. 86 328–332.
  • [6] Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348–1360.
  • [7] Fan, J. and Peng, H. (2004). Nonconcave penalized likelihood with a diverging number of parameters. Ann. Statist. 32 928–961.
  • [8] Fung, W. K., He, X., Liu, L. and Shi, P. (2002). Dimension reduction based on canonical correlation. Statist. Sinica 12 1093–1113.
  • [9] Li, B., Zha, H. and Chiaromonte, F. (2005). Contour regression: A general approach to dimension reduction. Ann. Statist. 33 1580–1616.
  • [10] Li, L. (2007). Sparse sufficient dimension reduction. Biometrika 94 603–613.
  • [11] Li, K.-C. (1991). Sliced inverse regression for dimension reduction (with discussion). J. Amer. Statist. Assoc. 86 316–327.
  • [12] Li, K.-C. (1992). On principal Hessian directions for data visualization and dimension reduction: Another application of Stein’s lemma. J. Amer. Statist. Assoc. 87 1025–1039.
  • [13] Li, K.-C. (2000). High dimensional data analysis via the SIR/PHD approach. Available at http://www.stat.ucla.edu/~kcli/sir-PHD.pdf.
  • [14] Li, K.-C. and Duan, N. (1989). Regression analysis under link violation. Ann. Statist. 17 1009–1052.
  • [15] Muirhead, R. J. and Waternaux, C. M. (1980). Asymptotic distributions in canonical correlation analysis and other multivariate procedures for nonnormal populations. Biometrika 67 31–43.
  • [16] Naik, P. A. and Tsai, C.-L. (2001). Single-index model selections. Biometrika 88 821–832.
  • [17] Ni, L., Cook, R. D. and Tsai, C.-L. (2005). A note on shrinkage sliced inverse regression. Biometrika 92 242–247.
  • [18] Shi, P. and Tsai, C.-L. (2002). Regression model selection—a residual likelihood approach. J. Roy. Statist. Soc. Ser. B 64 237–252.
  • [19] Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. Roy. Statist. Soc. Ser. B 58 267–288.
  • [20] Xia, Y., Tong, H., Li, W. K. and Zhu, L.-X. (2002). An adaptive estimation of dimension reduction space. J. Roy. Statist. Soc. Ser. B 64 363–410.
  • [21] Zhou, J. (2008). Robust dimension reduction based on canonical correlation. Preprint.