Electronic Journal of Statistics

Automatic grouping using smooth-threshold estimating equations

Masao Ueki and Yoshinori Kawasaki


Abstract

Redundant statistical models are common in practical data analysis. The most widely investigated form of redundancy is the inclusion of irrelevant predictors, which is resolved by setting their coefficients to zero. It is also useful, however, to consider overlapping parameters whose values are similar. Grouping such parameters, by treating a set of them as a single parameter, yields a more parsimonious parameterization and increases estimation accuracy through dimension reduction.
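As a concrete illustration of the grouping idea (a toy example of my own construction, not the paper's procedure): if two regression coefficients are equal, the corresponding design columns can be summed and the pair replaced by a single shared parameter, so the model is fit with fewer parameters and lower variance.

```python
import numpy as np

# Toy illustration (not the paper's method): when beta_1 == beta_2, the model
#   y = b1*x1 + b2*x2 + b3*x3 + e  collapses to  y = g*(x1 + x2) + b3*x3 + e,
# so grouping columns 0 and 1 fits 2 parameters instead of 3.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))
y = X @ np.array([1.0, 1.0, -2.0]) + 0.5 * rng.standard_normal(100)

Xg = np.column_stack([X[:, 0] + X[:, 1], X[:, 2]])   # grouped design matrix
coef, *_ = np.linalg.lstsq(Xg, y, rcond=None)        # least squares on 2 params
print(np.round(coef, 2))  # shared coefficient near 1.0, third near -2.0
```

The point of the paper is to decide such groupings from the data automatically rather than assuming them in advance.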

The paper proposes a data-adaptive automatic grouping of parameters that simultaneously enables variable selection, yielding sparse solutions, by applying smooth-thresholding. The new procedure is applicable to several estimating-equation-based methods and is shown to possess the oracle property. No convex optimization is needed for its implementation. Numerical studies, including the large-p-small-n setting, are performed. The proposed automatic grouping is applied to interaction modeling for the Ohio wheeze data and for credit scoring data.
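The variable-selection side of the smooth-threshold estimating-equation idea (reference [18]) can be sketched for least squares, where the modified equation is linear, illustrating the claim that no convex optimization is needed. The sketch below is an assumption-laden illustration rather than the paper's exact algorithm: the thresholds delta_j = min(1, lam / |ols_j|^(1+gamma)) are built from an OLS pilot estimate, and the modified normal equations ((I - D) X'X + D) beta = (I - D) X'y are solved as one linear system; delta_j = 1 forces beta_j = 0 exactly.

```python
import numpy as np

def stee_linear(X, y, lam=0.1, gamma=1.0):
    """Smooth-threshold estimating equations for least squares (sketch).

    Solves ((I - D) X'X + D) beta = (I - D) X'y, where
    D = diag(delta_j) and delta_j = min(1, lam / |ols_j|**(1 + gamma)).
    A threshold delta_j = 1 zeroes out beta_j exactly (variable selection);
    the whole fit is a single linear solve, with no convex optimization.
    """
    ols, *_ = np.linalg.lstsq(X, y, rcond=None)               # pilot estimate
    delta = np.minimum(1.0, lam / np.abs(ols) ** (1 + gamma))
    D = np.diag(delta)
    I = np.eye(X.shape[1])
    A = (I - D) @ (X.T @ X) + D
    b = (I - D) @ (X.T @ y)
    return np.linalg.solve(A, b)

# toy data: two relevant predictors, three irrelevant ones
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
beta_true = np.array([2.0, 0.0, 0.0, 1.5, 0.0])
y = X @ beta_true + rng.standard_normal(200)
print(np.round(stee_linear(X, y), 3))
```

For coefficients whose pilot estimate is small, delta_j hits its cap of 1 and the corresponding row of the system reduces to beta_j = 0, while well-supported coefficients are only mildly shrunk. Tuning lam and gamma, and extending the thresholds to pairwise coefficient differences for grouping, is where the paper's actual contribution lies.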

Article information

Source
Electron. J. Statist., Volume 5 (2011), 309-328.

Dates
First available in Project Euclid: 28 April 2011

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1303996032

Digital Object Identifier
doi:10.1214/11-EJS608

Mathematical Reviews number (MathSciNet)
MR2802045

Zentralblatt MATH identifier
1274.62470

Subjects
Primary: 62J07: Ridge regression; shrinkage estimators
Secondary: 62J10: Analysis of variance and covariance

Keywords
Automatic grouping; lasso; smooth-thresholding; variable selection

Citation

Ueki, Masao; Kawasaki, Yoshinori. Automatic grouping using smooth-threshold estimating equations. Electron. J. Statist. 5 (2011), 309--328. doi:10.1214/11-EJS608. https://projecteuclid.org/euclid.ejs/1303996032



References

  • [1] Bondell, H. D. and Reich, B. J. (2009). Simultaneous factor selection and collapsing levels in ANOVA. Biometrics 65 169–177.
  • [2] Buckley, J. J. and James, I. R. (1979). Linear regression with censored data. Biometrika 66 429–436.
  • [3] Chipman, H. (1996). Bayesian variable selection with related predictors. Canadian Journal of Statistics 24 407–499.
  • [4] Choi, N. H., Li, W. and Zhu, J. (2010). Variable selection with the strong heredity constraint and its oracle property. Journal of the American Statistical Association 105 354–364.
  • [5] Fahrmeir, L. and Tutz, G. (2001). Multivariate Statistical Modelling Based on Generalized Linear Models, 2nd Edition. New York: Springer.
  • [6] Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association 96 1348–1360.
  • [7] Fitzmaurice, G. M. and Laird, N. M. (1993). A likelihood-based method for analysing longitudinal binary responses. Biometrika 80 141–151.
  • [8] Fu, W. J. (2003). Penalized estimating equations. Biometrics 59 126–132.
  • [9] Hamada, M. and Wu, C. (1992). Analysis of designed experiments with complex aliasing. Journal of Quality Technology 24 130–137.
  • [10] Jiang, W. and Liu, X. (2004). Consistent model selection based on parameter estimates. Journal of Statistical Planning and Inference 121 265–283.
  • [11] Johnson, B. A., Lin, D. Y. and Zeng, D. (2008). Penalized estimating functions and variable selection in semiparametric regression models. Journal of the American Statistical Association 103 672–680.
  • [12] Joseph, V. (2006). A Bayesian approach to the design and analysis of fractionated experiments. Technometrics 48 219–229.
  • [13] Lai, T. L. and Ying, Z. (1991). Large sample theory of a modified Buckley–James estimator for regression analysis with censored data. Annals of Statistics 19 1370–1402.
  • [14] Liang, K. and Zeger, S. (1986). Longitudinal data analysis using generalized linear models. Biometrika 73 13–22.
  • [15] Rockafellar, R. T. (1979). Convex Analysis. Princeton University Press.
  • [16] Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B 58 267–288.
  • [17] Tsiatis, A. A. (2006). Semiparametric Theory and Missing Data. New York: Springer.
  • [18] Ueki, M. (2009). A note on automatic variable selection using smooth-threshold estimating equations. Biometrika 96 1005–1011.
  • [19] van der Vaart, A. W. (1998). Asymptotic Statistics. New York: Cambridge University Press.
  • [20] Wang, H. and Leng, C. (2007). Unified lasso estimation via least squares approximation. Journal of the American Statistical Association 102 1039–1048.
  • [21] Wang, H., Li, R. and Tsai, C. L. (2007). Tuning parameter selectors for the smoothly clipped absolute deviation method. Biometrika 94 553–568.
  • [22] Ware, J. H., Dockery, D. W., Spiro, A. III, Speizer, F. E. and Ferris, B. G. Jr (1984). Passive smoking, gas cooking and respiratory health in children living in six cities. American Review of Respiratory Disease 129 366–374.
  • [23] Zeger, S. L., Liang, K. Y. and Albert, P. S. (1988). Models for longitudinal data: a generalized estimating equation approach. Biometrics 44 1049–1060.
  • [24] Zheng, X. and Loh, W. Y. (1995). Consistent variable selection in linear models. Journal of the American Statistical Association 90 151–156.
  • [25] Zou, H. (2006). The adaptive lasso and its oracle properties. Journal of the American Statistical Association 101 1418–1429.
  • [26] Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society, Series B 67 301–320.