Open Access
On improved predictive density estimation with parametric constraints
Dominique Fourdrinier, Éric Marchand, Ali Righi, William E. Strawderman
Electron. J. Statist. 5: 172-191 (2011). DOI: 10.1214/11-EJS603


We consider the problem of predictive density estimation for normal models under Kullback-Leibler (KL) loss when the parameter space is constrained to a convex set. More specifically, we assume that $X\sim {\cal N}_{p}(\mu,v_{x}I)$ is observed and that we wish to estimate the density of $Y\sim {\cal N}_{p}(\mu,v_{y}I)$ under KL loss when $\mu$ is restricted to a convex set $C\subset\mathbb{R}^{p}$. We show that the best unrestricted invariant predictive density estimator $\hat{p}_{U}$ is dominated by the Bayes estimator $\hat{p}_{\pi_{C}}$ associated with the uniform prior $\pi_{C}$ on $C$. We also study so-called plug-in estimators, giving conditions under which domination of one estimator of the mean vector $\mu$ over another under the usual quadratic loss translates into a domination result for the corresponding plug-in density estimators under KL loss. Risk comparisons and domination results are also given for comparisons between plug-in estimators and Bayes predictive density estimators. Additionally, minimaxity and domination results are given for the cases where: (i) $C$ is a cone, and (ii) $C$ is a ball.
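As an illustrative sketch of the unrestricted version of this comparison, the Monte Carlo snippet below contrasts the KL risk of the plug-in density ${\cal N}_{p}(X, v_{y}I)$ with that of the best invariant predictive density ${\cal N}_{p}(X, (v_{x}+v_{y})I)$. The dimension, variances, and true mean chosen here are arbitrary assumptions for the simulation, not values from the paper; the KL divergence between two isotropic normals is evaluated in closed form, and only the expectation over $X$ is approximated by simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
p, vx, vy = 3, 1.0, 1.0          # dimension and variances (illustrative choices)
mu = np.zeros(p)                 # true mean, known to the simulation only

def kl_risk_term(mu_true, centers, s):
    # KL( N(mu_true, vy*I) || N(center, s*I) ) in closed form, per sample:
    # 0.5 * [ p*vy/s + ||center - mu_true||^2 / s - p + p*log(s/vy) ]
    d2 = np.sum((centers - mu_true) ** 2, axis=-1)
    return 0.5 * (p * vy / s + d2 / s - p + p * np.log(s / vy))

# Draw many observations X ~ N_p(mu, vx*I) and average the KL loss over them.
X = rng.normal(mu, np.sqrt(vx), size=(200_000, p))
risk_plugin = kl_risk_term(mu, X, vy).mean()           # plug-in: N(X, vy*I)
risk_invariant = kl_risk_term(mu, X, vx + vy).mean()   # best invariant: N(X, (vx+vy)*I)

print(risk_plugin, risk_invariant)
```

With these settings the exact risks are $p\,v_{x}/(2v_{y}) = 1.5$ for the plug-in estimator and $(p/2)\log\!\big((v_{x}+v_{y})/v_{y}\big) \approx 1.04$ for the invariant predictive density, so the simulation should show the invariant estimator winning; the paper's results concern the harder constrained setting, where the uniform-prior Bayes estimator on $C$ improves further.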




Published: 2011
First available in Project Euclid: 14 April 2011

zbMATH: 1274.62079
MathSciNet: MR2792550
Digital Object Identifier: 10.1214/11-EJS603

Primary: 62C15 , 62C20 , 62F10 , 62H12

Keywords: Bayes estimators , cones , convex sets , Kullback-Leibler loss , multivariate normal , predictive estimation , quadratic loss , risk function , uniform priors

Rights: Copyright © 2011 The Institute of Mathematical Statistics and the Bernoulli Society
