Open Access
Importance Sampling Schemes for Evidence Approximation in Mixture Models
Jeong Eun Lee, Christian P. Robert
Bayesian Anal. 11(2): 573-597 (June 2016). DOI: 10.1214/15-BA970

Abstract

The marginal likelihood is a central tool for drawing Bayesian inference about the number of components in mixture models. It is often approximated since its exact form is unavailable. A bias in the approximation may arise from incomplete exploration of the collection of posterior modes by a simulated Markov chain (e.g. a Gibbs sequence), a phenomenon also known as lack of label switching: the chain must visit all possible label permutations in order to converge and hence overcome the bias. In an importance sampling approach, imposing label switching on the importance function makes the computational cost grow exponentially with the number of components. In this paper, two importance sampling schemes are proposed through choices of the importance function: a maximum likelihood estimate (MLE) proposal and a Rao–Blackwellised importance function. The second scheme is called dual importance sampling. We demonstrate that this dual importance sampling is a valid estimator of the evidence. To reduce the resulting computational demand, the original importance function is approximated; a suitable approximation can produce an estimate with the same precision at a lower computational cost.
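The abstract builds on the basic importance-sampling identity for the evidence, p(y) = E_q[p(y|θ)p(θ)/q(θ)]. The following is a minimal sketch of that identity on a toy two-component Gaussian mixture with unknown means, using a proposal centred near crude mode estimates (loosely in the spirit of an MLE-centred proposal). All data, parameter values, and the proposal are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy data: two-component Gaussian mixture with known weight/scale,
# unknown means (mu1, mu2). Priors: mu_k ~ N(0, tau^2). All values illustrative.
w, sigma, tau = 0.5, 1.0, 5.0
y = np.concatenate([rng.normal(-2.0, sigma, 50), rng.normal(2.0, sigma, 50)])

def log_lik(mu1, mu2):
    # Mixture log-likelihood, vectorised over proposal draws (shape (S,))
    dens = (w * norm.pdf(y[:, None], mu1, sigma)
            + (1 - w) * norm.pdf(y[:, None], mu2, sigma))
    return np.log(dens).sum(axis=0)

# Importance function q: independent Gaussians centred at rough mode
# estimates (a stand-in for an MLE-centred proposal); assumed, not the paper's.
S = 20000
s1 = rng.normal(-2.0, 0.5, S)
s2 = rng.normal(2.0, 0.5, S)

# log importance weights: log p(y|theta) + log p(theta) - log q(theta)
log_w = (log_lik(s1, s2)
         + norm.logpdf(s1, 0.0, tau) + norm.logpdf(s2, 0.0, tau)
         - norm.logpdf(s1, -2.0, 0.5) - norm.logpdf(s2, 2.0, 0.5))
log_evidence = np.logaddexp.reduce(log_w) - np.log(S)

# With an exchangeable prior the posterior has K! = 2 symmetric modes; a
# proposal concentrated on one labelling effectively captures only one of
# them, so a common practical correction is to add log K!.
log_evidence += np.log(2.0)
print(log_evidence)
```

Note how this illustrates the trade-off the abstract describes: a proposal covering all label permutations would need K! mixture components, whereas restricting to one labelling and correcting by log K! keeps the cost fixed but relies on well-separated modes.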

Citation


Jeong Eun Lee, Christian P. Robert. "Importance Sampling Schemes for Evidence Approximation in Mixture Models." Bayesian Anal. 11(2): 573-597, June 2016. https://doi.org/10.1214/15-BA970

Information

Published: June 2016
First available in Project Euclid: 25 August 2015

zbMATH: 1357.62116
MathSciNet: MR3472003
Digital Object Identifier: 10.1214/15-BA970

Keywords: importance sampling, marginal likelihood, mixture models, model evidence

Rights: Copyright © 2016 International Society for Bayesian Analysis
