Open Access
March 2015
Asymptotic Properties of Bayesian Predictive Densities When the Distributions of Data and Target Variables are Different
Fumiyasu Komaki
Bayesian Anal. 10(1): 31-51 (March 2015). DOI: 10.1214/14-BA886

Abstract

Bayesian predictive densities are investigated, using the framework of information geometry, in the setting where the observed data x and the target variable y to be predicted have different distributions. The performance of predictive densities is evaluated by the Kullback–Leibler divergence. The parametric models are formulated as Riemannian manifolds. In the conventional setting, in which x and y have the same distribution, the Fisher–Rao metric and the Jeffreys prior play essential roles. In the present setting, in which x and y have different distributions, the corresponding roles are played by a new metric, called the predictive metric, which is constructed from the Fisher information matrices of x and y, and by the volume element based on this metric. It is shown that Bayesian predictive densities based on priors constructed from non-constant positive superharmonic functions with respect to the predictive metric asymptotically dominate those based on the volume element prior of the predictive metric.
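For orientation, here is a minimal sketch of the standard objects the abstract refers to; the notation is assumed for illustration and is not taken from the paper. If x is observed from a model p(x | θ) and y is to be predicted from a possibly different model q(y | θ), the Bayesian predictive density based on a prior π is

\[ p_\pi(y \mid x) = \int q(y \mid \theta)\, \pi(\theta \mid x)\, \mathrm{d}\theta , \]

and its performance at θ is measured by the expected Kullback–Leibler risk

\[ R(\theta, p_\pi) = \int p(x \mid \theta) \int q(y \mid \theta) \log \frac{q(y \mid \theta)}{p_\pi(y \mid x)}\, \mathrm{d}y \,\mathrm{d}x . \]

One predictive density asymptotically dominates another when its risk is asymptotically no larger for every θ and strictly smaller for some θ; the predictive metric, built from the Fisher information matrices of the x- and y-models, determines which priors achieve this improvement.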

Citation


Fumiyasu Komaki. "Asymptotic Properties of Bayesian Predictive Densities When the Distributions of Data and Target Variables are Different." Bayesian Anal. 10(1): 31-51, March 2015. https://doi.org/10.1214/14-BA886

Information

Published: March 2015
First available in Project Euclid: 28 January 2015

zbMATH: 1335.62053
MathSciNet: MR3420896
Digital Object Identifier: 10.1214/14-BA886

Keywords: Differential geometry, Fisher–Rao metric, Jeffreys prior, Kullback–Leibler divergence, predictive metric

Rights: Copyright © 2015 International Society for Bayesian Analysis
