Open Access
Improved minimax predictive densities under Kullback–Leibler loss
Edward I. George, Feng Liang, Xinyi Xu
Ann. Statist. 34(1): 78-91 (February 2006). DOI: 10.1214/009053606000000155

Abstract

Let $X \mid \mu \sim N_p(\mu, v_x I)$ and $Y \mid \mu \sim N_p(\mu, v_y I)$ be independent p-dimensional multivariate normal vectors with common unknown mean $\mu$. Based on only observing $X = x$, we consider the problem of obtaining a predictive density $\hat{p}(y \mid x)$ for $Y$ that is close to $p(y \mid \mu)$ as measured by expected Kullback–Leibler loss. A natural procedure for this problem is the (formal) Bayes predictive density $\hat{p}_U(y \mid x)$ under the uniform prior $\pi_U(\mu) \equiv 1$, which is best invariant and minimax. We show that any Bayes predictive density will be minimax if it is obtained by a prior yielding a marginal that is superharmonic or whose square root is superharmonic. This yields wide classes of minimax procedures that dominate $\hat{p}_U(y \mid x)$, including Bayes predictive densities under superharmonic priors. Fundamental similarities and differences with the parallel theory of estimating a multivariate normal mean under quadratic loss are described.
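For concreteness, here is a sketch of the standard definitions behind the abstract (textbook material, not the paper's own development). The Kullback–Leibler loss and risk of a predictive density $\hat{p}$ are

    $L(\mu, \hat{p}(\cdot \mid x)) = \int p(y \mid \mu) \, \log \frac{p(y \mid \mu)}{\hat{p}(y \mid x)} \, dy$,
    $R(\mu, \hat{p}) = \int p(x \mid \mu) \, L(\mu, \hat{p}(\cdot \mid x)) \, dx$,

and a prior $\pi$ enters through its marginal $m_\pi(x; v) = \int \phi_v(x - \mu) \, \pi(\mu) \, d\mu$, where $\phi_v$ is the $N_p(0, vI)$ density; superharmonic means $\nabla^2 m_\pi \le 0$. Under $\pi_U(\mu) \equiv 1$ the posterior of $\mu$ given $x$ is $N_p(x, v_x I)$, so

    $\hat{p}_U(y \mid x) = N_p(y; \, x, \, (v_x + v_y) I)$,  with constant risk  $R(\mu, \hat{p}_U) = \frac{p}{2} \log\!\left(1 + \frac{v_x}{v_y}\right)$.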

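The constant-risk property of $\hat{p}_U$ can be checked numerically, since the KL divergence between two isotropic normal densities has a closed form. The following is a minimal Monte Carlo sketch (illustrative code, not from the paper; the function name kl_risk_pU is invented here):

    import numpy as np

    def kl_risk_pU(mu, vx, vy, n_sims=100_000, seed=0):
        """Monte Carlo estimate of the KL risk of the uniform-prior
        predictive density p_U(y|x) = N_p(x, (vx+vy) I).

        For each draw x ~ N_p(mu, vx I), the divergence
        KL( N_p(mu, vy I) || N_p(x, s I) ), with s = vx + vy, equals
        0.5 * [ p*log(s/vy) + p*vy/s + ||x - mu||^2 / s - p ];
        the risk is its expectation over x.
        """
        rng = np.random.default_rng(seed)
        mu = np.asarray(mu, dtype=float)
        p, s = mu.size, vx + vy
        x = mu + np.sqrt(vx) * rng.standard_normal((n_sims, p))
        sq = np.sum((x - mu) ** 2, axis=1)
        kl = 0.5 * (p * np.log(s / vy) + p * vy / s + sq / s - p)
        return kl.mean()

    # The estimated risk does not depend on mu and matches (p/2)*log(1 + vx/vy):
    p, vx, vy = 5, 1.0, 1.0
    for mu in (np.zeros(p), np.full(p, 3.0)):
        print(kl_risk_pU(mu, vx, vy))      # both approx. (5/2)*log(2) = 1.733
    print(0.5 * p * np.log(1 + vx / vy))

Improving on $\hat{p}_U$ means beating this constant risk at every $\mu$, which the paper shows is possible via Bayes predictive densities under superharmonic priors.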
Citation


Edward I. George, Feng Liang, Xinyi Xu. "Improved minimax predictive densities under Kullback–Leibler loss." Ann. Statist. 34(1): 78-91, February 2006. https://doi.org/10.1214/009053606000000155

Information

Published: February 2006
First available in Project Euclid: 2 May 2006

zbMATH: 1091.62003
MathSciNet: MR2275235
Digital Object Identifier: 10.1214/009053606000000155

Subjects:
Primary: 62C20
Secondary: 62C10, 62F15

Keywords: Bayes rules, heat equation, inadmissibility, multiple shrinkage, multivariate normal, prior distributions, shrinkage estimation, superharmonic marginals, superharmonic priors, unbiased estimate of risk

Rights: Copyright © 2006 Institute of Mathematical Statistics
