Abstract
We consider the problem of predictive density estimation under Kullback-Leibler loss in a high-dimensional Gaussian model with exact sparsity constraints on the location parameters. For non-asymptotic sparsity levels, the least favorable prior is discrete. Here, we study the first-order asymptotic minimax risk of Bayes predictive density estimates in the regime where the proportion of non-zero coordinates converges to zero as the dimension increases. Motivated by an optimal thresholding rule in Mukherjee and Johnstone (2015), we propose a discrete prior and show that its Bayes predictive density estimate is minimax optimal. This yields a nonsubjective discrete prior distribution that minimizes the maximum posterior predictive relative entropy regret. We discuss the decision-theoretic implications and the structural differences between our proposed prior and its closest predecessor, the geometrically decaying discrete prior of Johnstone (1994a), which produced minimax optimal point estimators under quadratic loss. Through numerical experiments, we report the non-asymptotic worst-case risk of our proposed estimator across different sparsity levels.
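For concreteness, a minimal sketch of the setup, assuming the usual Gaussian sequence formulation of predictive density estimation (the symbols $v_x$, $v_y$, $s_n$ and the sparsity class $\Theta_n[s_n]$ below are illustrative notation, not taken verbatim from the paper): one observes past data $X$ and must produce a density estimate for future data $Y$, where
$$
X \mid \theta \sim N_n(\theta, v_x I_n), \qquad Y \mid \theta \sim N_n(\theta, v_y I_n),
$$
and the location parameter is exactly sparse, $\theta \in \Theta_n[s_n] = \{\theta \in \mathbb{R}^n : \#\{i : \theta_i \neq 0\} \le s_n\}$ with $s_n/n \to 0$. A predictive density $\hat{p}(\cdot \mid X)$ for $Y$ incurs the Kullback-Leibler loss
$$
L(\theta, \hat{p}) = \int p(y \mid \theta)\, \log \frac{p(y \mid \theta)}{\hat{p}(y \mid X)}\, dy,
$$
and the Bayes rule under a prior $\pi$ is the posterior predictive density
$$
\hat{p}_{\pi}(y \mid X) = \int p(y \mid \theta)\, \pi(\theta \mid X)\, d\theta .
$$
The abstract's claim is that a suitably constructed discrete prior $\pi$ makes $\hat{p}_{\pi}$ attain the first-order asymptotic minimax KL risk over $\Theta_n[s_n]$.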
Funding Statement
This research was partially supported by NSF grant DMS-1811866.
Acknowledgments
GM is indebted to Professor Iain Johnstone for numerous stimulating discussions which led to many of the ideas in this paper.
Citation
Ujan Gangopadhyay and Gourab Mukherjee. "On discrete priors and sparse minimax optimal predictive densities." Electron. J. Statist. 15(1): 1636–1660, 2021. https://doi.org/10.1214/21-EJS1818