Electronic Journal of Statistics
- Electron. J. Statist.
- Volume 6 (2012), 38-90.
Minimax risks for sparse regressions: Ultra-high dimensional phenomenons
Consider the standard Gaussian linear regression model Y = Xθ0 + ε, where Y ∈ ℝn is a response vector and X ∈ ℝn×p is a design matrix. Much work has been devoted to building efficient estimators of θ0 when p is much larger than n. In such a situation, a classical approach amounts to assuming that θ0 is approximately sparse. This paper studies the minimax risks of estimation and testing over classes of k-sparse vectors θ0. These bounds shed light on the limitations due to high dimensionality. The results encompass the problem of prediction (estimation of Xθ0), the inverse problem (estimation of θ0), and linear testing (testing Xθ0 = 0). Interestingly, an elbow effect occurs when k log(p/k) becomes large compared to n: the minimax risks and hypothesis separation distances blow up in this ultra-high dimensional setting. We also prove that even dimension-reduction techniques cannot provide satisfying results in an ultra-high dimensional setting. Moreover, we compute the minimax risks when the variance of the noise is unknown. Knowledge of this variance is shown to play a significant role in the optimal rates of estimation and testing. All these minimax bounds provide a characterization of statistical problems that are so difficult that no procedure can provide satisfying results.
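The regression model and sparsity class described in the abstract can be sketched in a few lines of simulation code. This is a minimal illustration of the setup only, not the paper's procedures; the parameter values (n, p, k, σ) are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

n, p, k = 50, 500, 5   # high-dimensional regime: p much larger than n
sigma = 1.0            # noise standard deviation

# Design matrix X in R^{n x p} with i.i.d. standard Gaussian entries
X = rng.standard_normal((n, p))

# k-sparse coefficient vector theta0: exactly k nonzero coordinates
theta0 = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
theta0[support] = rng.standard_normal(k)

# Gaussian linear regression model Y = X theta0 + eps
eps = sigma * rng.standard_normal(n)
Y = X @ theta0 + eps

# The "ultra-high dimensional" phenomenon in the paper kicks in roughly
# when k * log(p/k) becomes large compared to n
effective_dim = k * np.log(p / k)
print(f"k log(p/k) = {effective_dim:.1f}  vs  n = {n}")
```

Here k log(p/k) ≈ 23 is still small compared to n = 50, so this instance sits below the elbow; shrinking n or growing k and p pushes the simulation into the ultra-high dimensional regime the paper analyzes.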
First available in Project Euclid: 25 January 2012
Verzelen, Nicolas. Minimax risks for sparse regressions: Ultra-high dimensional phenomenons. Electron. J. Statist. 6 (2012), 38--90. doi:10.1214/12-EJS666. https://projecteuclid.org/euclid.ejs/1327505822
- Supplementary material: Technical Appendix to “Minimax risks for sparse regressions: Ultra-high dimensional phenomenons”.