Statistical Science, Volume 28, Number 2 (2013), 168-188.
Variational Inference for Generalized Linear Mixed Models Using Partially Noncentered Parametrizations
The effects of different parametrizations on the convergence of Bayesian computational algorithms for hierarchical models are well explored. Techniques such as centering, noncentering and partial noncentering can be used to accelerate convergence in MCMC and EM algorithms, but they are still not well studied for variational Bayes (VB) methods. As a fast deterministic approach to posterior approximation, VB is attracting increasing interest due to its suitability for large, high-dimensional datasets. The choice of parametrization in VB has not only computational but also statistical implications, as different parametrizations are associated with different factorized posterior approximations. We examine the use of partially noncentered parametrizations in VB for generalized linear mixed models (GLMMs). Our paper makes four contributions. First, we show how to implement an algorithm called nonconjugate variational message passing for GLMMs. Second, we show that the partially noncentered parametrization can adapt to the quantity of information in the data and determine a near-optimal parametrization. Third, we show that partial noncentering can accelerate convergence and produce more accurate posterior approximations than centering or noncentering. Finally, we demonstrate how the variational lower bound, produced as part of the computation, can be useful for model selection.
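The interpolation between centered and noncentered parametrizations can be illustrated on a toy conjugate case. The sketch below is not the paper's nonconjugate variational message passing algorithm for GLMMs; it is a minimal one-observation Gaussian hierarchy (with an assumed flat prior on the group mean) where the joint posterior is available in closed form, so one can see directly how a partial noncentering weight w between 0 (centered) and 1 (noncentered) decorrelates the group mean from the reparametrized random effect:

```python
import numpy as np

# Toy Gaussian hierarchy with a flat prior on mu (illustrative only):
#   y | b  ~ N(b,  sigma2)
#   b | mu ~ N(mu, sigma2_b)
sigma2, sigma2_b = 1.0, 4.0

# Joint posterior precision of (b, mu), read off the log density
Q = np.array([[1 / sigma2 + 1 / sigma2_b, -1 / sigma2_b],
              [-1 / sigma2_b,              1 / sigma2_b]])
Sigma = np.linalg.inv(Q)  # posterior covariance; rows/cols ordered (b, mu)

# Partial noncentering: b_tilde = b - w * mu, with w in [0, 1].
# w = 0 recovers the centered and w = 1 the noncentered parametrization.
# Since Cov(mu, b_tilde | y) = Cov(mu, b | y) - w * Var(mu | y),
# the weight that makes mu and b_tilde a posteriori uncorrelated is:
w_opt = Sigma[0, 1] / Sigma[1, 1]
print(w_opt)  # 0.2, i.e. sigma2 / (sigma2 + sigma2_b)

# At w_opt the posterior covariance between mu and b_tilde vanishes,
# so a factorized (mean-field) approximation loses nothing here.
cov_tilde = Sigma[0, 1] - w_opt * Sigma[1, 1]
print(abs(cov_tilde) < 1e-12)  # True
```

When the data are informative (sigma2 small) the optimal weight approaches 0, favoring centering; when the data are weak it approaches 1, favoring noncentering, which mirrors the adaptivity claimed for the partially noncentered parametrization in the abstract.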
First available in Project Euclid: 21 May 2013
Tan, Linda S. L.; Nott, David J. Variational Inference for Generalized Linear Mixed Models Using Partially Noncentered Parametrizations. Statist. Sci. 28 (2013), no. 2, 168--188. doi:10.1214/13-STS418. https://projecteuclid.org/euclid.ss/1369147910