Bayesian Analysis

Variational Message Passing for Elaborate Response Regression Models

M. W. McLean and M. P. Wand

Full-text: Open access

Abstract

We build on recent work concerning message passing approaches to approximate fitting and inference for arbitrarily large regression models. The focus is on regression models where the response variable is modeled to have an elaborate distribution, which is loosely defined to mean a distribution that is more complicated than common distributions such as those in the Bernoulli, Poisson and Normal families. Examples of elaborate response families considered here are the Negative Binomial and t families. Variational message passing is more challenging for such models because some of the conjugate exponential families are non-standard and numerical integration is required. Nevertheless, a factor graph fragment approach means the requisite calculations only need to be done once for a particular elaborate response distribution family. Computer code can be compartmentalized, including the parts involving numerical integration. A major finding of this work is that the modularity of variational message passing extends to elaborate response regression models.
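A standard device behind variational treatments of elaborate response families such as the t family (see, e.g., Wand et al., 2011, in the references below) is an auxiliary-variable representation: the t distribution is a scale mixture of Normals, which restores conditional conjugacy and hence tractable message passing updates. The following sketch (not the authors' code; variable names are illustrative) checks this representation empirically: if a ~ Inverse-Gamma(ν/2, ν/2) and y | a ~ N(μ, aσ²), then marginally y ~ t_ν(μ, σ).

```python
import numpy as np

# Illustrative sketch: the t distribution as a scale mixture of Normals.
# If a ~ Inverse-Gamma(nu/2, nu/2) and y | a ~ N(mu, a * sigma^2),
# then marginally y ~ t_nu(mu, sigma).  Auxiliary representations like
# this are what make conjugate variational updates tractable.

rng = np.random.default_rng(0)
nu, mu, sigma, n = 8.0, 2.0, 1.5, 200_000

# Inverse-Gamma(nu/2, nu/2) draws via reciprocals of Gamma draws
# (NumPy's gamma uses a shape/scale parameterization, so rate nu/2
# corresponds to scale 2/nu).
a = 1.0 / rng.gamma(shape=nu / 2, scale=2.0 / nu, size=n)
y = rng.normal(loc=mu, scale=sigma * np.sqrt(a))

# The marginal variance of a t_nu(mu, sigma) variate is sigma^2 * nu / (nu - 2);
# the Monte Carlo estimate should be close to this value.
print(y.mean(), y.var(), sigma**2 * nu / (nu - 2))
```

The same trick underlies several other elaborate families treated in the variational literature (e.g., the Laplace and Asymmetric Laplace distributions admit analogous mixture representations).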

Article information

Source
Bayesian Anal., Volume 14, Number 2 (2019), 371-398.

Dates
First available in Project Euclid: 25 May 2018

Permanent link to this document
https://projecteuclid.org/euclid.ba/1527213628

Digital Object Identifier
doi:10.1214/18-BA1098

Mathematical Reviews number (MathSciNet)
MR3934090

Zentralblatt MATH identifier
07045435

Subjects
Primary: 62F15: Bayesian inference; 62J05: Linear regression
Secondary: 62G08: Nonparametric regression

Keywords
Bayesian computing; factor graph; generalized additive models; generalized linear mixed models; mean field variational Bayes; support vector machine classification

Rights
Creative Commons Attribution 4.0 International License.

Citation

McLean, M. W.; Wand, M. P. Variational Message Passing for Elaborate Response Regression Models. Bayesian Anal. 14 (2019), no. 2, 371--398. doi:10.1214/18-BA1098. https://projecteuclid.org/euclid.ba/1527213628


References

  • Azzalini, A. (2017). The R package sn: The skew-normal and related distributions, such as the skew-t (version 1.5). URL http://azzalini.stat.unipd.it/SN
  • Azzalini, A. and Dalla Valle, A. (1996). “The multivariate skew-normal distribution.” Biometrika, 83: 715–726.
  • Frühwirth-Schnatter, S., Frühwirth, R., Held, L., and Rue, H. (2009). “Improved auxiliary mixture sampling for hierarchical models of non-Gaussian data.” Statistics and Computing, 19: 479–492.
  • Frühwirth-Schnatter, S. and Pyne, S. (2010). “Bayesian inference for finite mixtures of univariate and multivariate skew-normal and skew-t distributions.” Biostatistics, 11: 317–336.
  • Frühwirth-Schnatter, S. and Wagner, H. (2006). “Auxiliary mixture sampling for parameter-driven models of time series of counts with applications to state space modelling.” Biometrika, 93: 827–841.
  • Hoffman, M. D., Blei, D. M., Wang, C., and Paisley, J. W. (2013). “Stochastic variational inference.” Journal of Machine Learning Research, 14: 1303–1347.
  • Knowles, D. A. and Minka, T. (2011). “Non-conjugate variational message passing for multinomial and binary regression.” In Advances in Neural Information Processing Systems, 1701–1709.
  • Kotz, S., Kozubowski, T. J., and Podgórski, K. (2001). The Laplace Distribution and Generalizations. Boston: Birkhäuser.
  • Kucukelbir, A., Tran, D., Ranganath, R., Gelman, A., and Blei, D. M. (2017). “Automatic differentiation variational inference.” Journal of Machine Learning Research, 18: 1–45.
  • Lachos, V. H., Ghosh, P., and Arellano-Valle, R. B. (2010). “Likelihood based inference for skew-normal independent linear mixed models.” Statistica Sinica, 303–322.
  • Lange, K. L., Little, R. J. A., and Taylor, J. M. G. (1989). “Robust statistical modeling using the t distribution.” Journal of the American Statistical Association, 84: 881–896.
  • Luts, J. and Ormerod, J. T. (2014). “Mean field variational Bayesian inference for support vector machine classification.” Computational Statistics & Data Analysis, 73: 163–176.
  • Luts, J. and Wand, M. P. (2015). “Variational inference for count response semiparametric regression.” Bayesian Analysis, 10: 991–1023.
  • McLean, M. W. and Wand, M. P. (2018). “Supplement for: Variational Message Passing for Elaborate Response Regression Models.” Bayesian Analysis.
  • Minka, T. (2005). “Divergence measures and message passing.” Microsoft Research Technical Report Series, MSR-TR-2005-173: 1–17.
  • Minka, T. and Winn, J. (2008). “Gates: A graphical notation for mixture models.” Microsoft Research Technical Report Series, MSR-TR-2008-185: 1–16.
  • Nadarajah, S. (2008). “A new model for symmetric and skewed data.” Probability in the Engineering and Informational Sciences, 22: 261–271.
  • Ormerod, J. T. and Wand, M. P. (2010). “Explaining variational approximations.” The American Statistician, 64: 140–153.
  • Polson, N. G. and Scott, S. L. (2011). “Data augmentation for support vector machines.” Bayesian Analysis, 6: 1–23.
  • R Core Team (2017). R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/
  • Rue, H., Martino, S., Lindgren, F., Simpson, D., Riebler, A., and Krainski, E. (2016). The R package ‘INLA’: Functions which allow one to perform full Bayesian analysis of latent Gaussian models using integrated nested Laplace approximation (version 0.0). URL http://www.r-inla.org
  • Ruppert, D., Wand, M. P., and Carroll, R. J. (2003). Semiparametric Regression. New York: Cambridge University Press.
  • Tipping, M. E. and Lawrence, N. D. (2003). “A variational approach to robust Bayesian interpolation.” In Institute of Electrical and Electronics Engineers Workshop on Neural Networks for Signal Processing, 229–238.
  • Titsias, M. K. and Lázaro-Gredilla, M. (2014). “Doubly stochastic variational Bayes for non-conjugate inference.” Proceedings of Machine Learning Research, 32: 1971–1979.
  • Verdinelli, I. and Wasserman, L. (1991). “Bayesian analysis of outlier problems using the Gibbs sampler.” Statistics and Computing, 1: 105–117.
  • Wand, M. P. (2017). “Fast approximate inference for arbitrarily large semiparametric regression models via message passing (with discussion).” Journal of the American Statistical Association, 112: 137–168.
  • Wand, M. P. and Ormerod, J. T. (2008). “On semiparametric regression with O’Sullivan penalized splines.” Australian & New Zealand Journal of Statistics, 50: 179–198.
  • Wand, M. P., Ormerod, J. T., Padoan, S. A., and Frühwirth, R. F. (2011). “Mean field variational Bayes for elaborate distributions.” Bayesian Analysis, 6: 847–900.
  • Winn, J. and Bishop, C. M. (2005). “Variational message passing.” Journal of Machine Learning Research, 6: 661–694.
  • Yang, Y., Wang, H. J., and He, X. (2016). “Posterior inference in Bayesian quantile regression with Asymmetric Laplace likelihood.” International Statistical Review, 84: 327–344.
  • Yu, K. and Moyeed, R. A. (2001). “Bayesian quantile regression.” Statistics and Probability Letters, 54: 437–447.

Supplemental materials