Electronic Journal of Statistics
- Electron. J. Statist., Volume 5 (2011), 172-191.
On improved predictive density estimation with parametric constraints
We consider the problem of predictive density estimation for normal models under Kullback-Leibler (KL) loss when the parameter space is constrained to a convex set. More particularly, we assume that X ∼ Np(μ, vxIp) is observed and that we wish to estimate the density of Y ∼ Np(μ, vyIp) under KL loss when μ is restricted to the convex set C ⊂ ℝp. We show that the best unrestricted invariant predictive density estimator p̂U is dominated by the Bayes estimator p̂πC associated with the uniform prior πC on C. We also study so-called plug-in estimators, giving conditions under which domination of one estimator of the mean vector μ over another under the usual quadratic loss translates into a domination result for the corresponding plug-in density estimators under KL loss. Risk comparisons and domination results are also given between plug-in estimators and Bayes predictive density estimators. Additionally, minimaxity and domination results are given for the cases where: (i) C is a cone, and (ii) C is a ball.
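As background for the KL risk comparisons described above, the sketch below estimates by Monte Carlo the KL risk of the best invariant predictive density p̂U, which in this normal setup is the N(X, (vx+vy)Ip) density, against a naive plug-in density N(X, vyIp). This is a simpler, unconstrained illustration of the kind of risk comparison the paper performs (it does not implement the constrained-space Bayes estimator p̂πC); the choices p = 5, vx = vy = 1 and the function names are ours, not the paper's.

```python
import numpy as np

def kl_normal(mu, v1, m, v2):
    """KL( N(mu, v1*I) || N(m, v2*I) ) between p-variate normals
    with scalar covariance matrices (closed form)."""
    p = mu.size
    return 0.5 * (p * (np.log(v2 / v1) + v1 / v2 - 1.0)
                  + np.sum((mu - m) ** 2) / v2)

def mc_kl_risk(mu, vx, vy, estimator, n_rep=20000, seed=0):
    """Monte Carlo KL risk of a predictive density estimator that maps
    an observation X ~ N(mu, vx*I) to a normal density N(mhat, vhat*I)
    for Y ~ N(mu, vy*I)."""
    rng = np.random.default_rng(seed)
    p = mu.size
    total = 0.0
    for _ in range(n_rep):
        x = mu + np.sqrt(vx) * rng.standard_normal(p)
        mhat, vhat = estimator(x)
        total += kl_normal(mu, vy, mhat, vhat)
    return total / n_rep

# Illustrative configuration (our choice, not from the paper).
p, vx, vy = 5, 1.0, 1.0
mu = np.zeros(p)

# Best invariant predictive density: N(X, (vx+vy)*I).
risk_U = mc_kl_risk(mu, vx, vy, lambda x: (x, vx + vy))
# Naive plug-in density: N(X, vy*I).
risk_plugin = mc_kl_risk(mu, vx, vy, lambda x: (x, vy))

print(risk_U, risk_plugin)
```

The exact risks here are (p/2)·log(1 + vx/vy) for p̂U and (p/2)·(vx/vy) for the naive plug-in, so p̂U has strictly smaller KL risk for every μ; the Monte Carlo estimates should reflect this gap.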
First available in Project Euclid: 14 April 2011
Fourdrinier, Dominique; Marchand, Éric; Righi, Ali; Strawderman, William E. On improved predictive density estimation with parametric constraints. Electron. J. Statist. 5 (2011), 172-191. doi:10.1214/11-EJS603. https://projecteuclid.org/euclid.ejs/1302784852