Electronic Journal of Statistics

Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors

Xin Wang and Vivekananda Roy

Full-text: Open access

Abstract

In this article, we consider Markov chain Monte Carlo (MCMC) algorithms for exploring the intractable posterior density associated with Bayesian probit linear mixed models under improper priors on the regression coefficients and variance components. In particular, we construct a two-block Gibbs sampler using data augmentation (DA) techniques. We then prove geometric ergodicity of this Gibbs sampler, which provides the foundation for central limit theorems for MCMC-based estimators and subsequent inferences. The conditions for geometric convergence are similar to those guaranteeing posterior propriety. We also provide conditions for the propriety of posterior distributions with a general link function when the design matrices take commonly observed forms. The Haar parameter expansion for DA (PX-DA) algorithm has been shown to be theoretically at least as good as the DA algorithm. Here we construct a Haar PX-DA algorithm that has essentially the same computational cost as the two-block Gibbs sampler.
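To make the abstract concrete, the flavor of a two-block DA Gibbs sampler for a probit linear mixed model can be sketched as below. This is a minimal illustration in the Albert and Chib (1993) latent-variable style, not the paper's exact blocking, prior specification, or PX-DA construction; the function name, the inverse-gamma hyperparameters, and the update order are all assumptions made for the sketch.

```python
import numpy as np
from scipy.stats import truncnorm, invgamma

def probit_mixed_da_gibbs(y, X, Z, n_iter=2000, a0=0.0, b0=0.0, seed=0):
    """Toy two-block DA Gibbs sampler for a Bayesian probit linear mixed model.

    Model (sketch): P(y_i = 1) = Phi(x_i' beta + z_i' u), u | s2 ~ N(0, s2 I),
    with a flat (improper) prior on beta and an inverse-gamma-type prior on s2.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    q = Z.shape[1]
    W = np.hstack([X, Z])              # combined design for the joint (beta, u) draw
    beta, u, s2 = np.zeros(p), np.zeros(q), 1.0
    draws = []
    for _ in range(n_iter):
        # Block 1: latent variables w_i | beta, u, y are truncated normals,
        # truncated to (0, inf) when y_i = 1 and to (-inf, 0) when y_i = 0.
        eta = X @ beta + Z @ u
        lo = np.where(y == 1, -eta, -np.inf)
        hi = np.where(y == 1, np.inf, -eta)
        w = eta + truncnorm.rvs(lo, hi, random_state=rng)
        # Block 2: (beta, u) | w, s2 is a joint Gaussian draw; then s2 | u.
        prec = W.T @ W
        prec[p:, p:] += np.eye(q) / s2  # ridge term from the random-effects prior
        chol = np.linalg.cholesky(prec)
        mean = np.linalg.solve(prec, W.T @ w)
        theta = mean + np.linalg.solve(chol.T, rng.standard_normal(p + q))
        beta, u = theta[:p], theta[p:]
        s2 = invgamma.rvs(a0 + q / 2.0, scale=b0 + u @ u / 2.0, random_state=rng)
        # A Haar PX-DA variant would insert one extra, cheap rescaling of w
        # (a draw from a working distribution) before Block 2; omitted here.
        draws.append(np.concatenate([beta, u, [s2]]))
    return np.array(draws)
```

The extra Haar PX-DA step mentioned in the comment is a single low-dimensional draw, which is why the resulting algorithm has essentially the same per-iteration cost as the two-block sampler.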

Article information

Source
Electron. J. Statist., Volume 12, Number 2 (2018), 4412-4439.

Dates
Received: November 2017
First available in Project Euclid: 18 December 2018

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1545123629

Digital Object Identifier
doi:10.1214/18-EJS1506

Mathematical Reviews number (MathSciNet)
MR3892344

Zentralblatt MATH identifier
07003247

Subjects
Primary: 60J05: Discrete-time Markov processes on general state spaces
Secondary: 62F15: Bayesian inference

Keywords
Data augmentation; drift condition; geometric ergodicity; GLMM; Haar PX-DA algorithm; Markov chains; posterior propriety

Rights
Creative Commons Attribution 4.0 International License.

Citation

Wang, Xin; Roy, Vivekananda. Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models with improper priors. Electron. J. Statist. 12 (2018), no. 2, 4412--4439. doi:10.1214/18-EJS1506. https://projecteuclid.org/euclid.ejs/1545123629

References

  • Albert, J. H. and Chib, S. (1993). Bayesian analysis of binary and polychotomous response data. Journal of the American Statistical Association 88 669–679.
  • Asmussen, S. and Glynn, P. W. (2011). A new proof of convergence of MCMC via the ergodic theorem. Statistics and Probability Letters 81 1482–1485.
  • Baragatti, M. (2011). Bayesian variable selection for probit mixed models applied to gene selection. Bayesian Analysis 6 209–229.
  • Breslow, N. E. and Clayton, D. G. (1993). Approximate inference in generalized linear mixed models. Journal of the American Statistical Association 88 9–25.
  • Chakraborty, S. and Khare, K. (2017). Convergence properties of Gibbs samplers for Bayesian probit regression with proper priors. Electronic Journal of Statistics 11 177–210.
  • Chen, M.-H. and Shao, Q.-M. (2001). Propriety of posterior distribution for dichotomous quantal response models. Proceedings of the American Mathematical Society 129 293–302.
  • Chen, M.-H., Shao, Q.-M. and Xu, D. (2002). Necessary and sufficient conditions on the propriety of posterior distributions for generalized linear mixed models. Sankhyā: The Indian Journal of Statistics, Series A 64 57–85.
  • Choi, H. M. and Hobert, J. P. (2013). The Polya-Gamma Gibbs sampler for Bayesian logistic regression is uniformly ergodic. Electronic Journal of Statistics 7 2054–2064.
  • Flegal, J. M. and Jones, G. L. (2010). Batch means and spectral variance estimators in Markov chain Monte Carlo. The Annals of Statistics 38 1034–1070.
  • Hobert, J. P. and Marchev, D. (2008). A theoretical comparison of the data augmentation, marginal augmentation and PX-DA algorithms. The Annals of Statistics 36 532–554.
  • Johnson, A. A. and Jones, G. L. (2010). Gibbs sampling for a Bayesian hierarchical general linear model. Electronic Journal of Statistics 4 313–333.
  • Jones, G. L. and Hobert, J. P. (2001). Honest exploration of intractable probability distributions via Markov chain Monte Carlo. Statistical Science 16 312–334.
  • Jones, G. L. and Hobert, J. P. (2004). Sufficient burn-in for Gibbs samplers for a hierarchical random effects model. The Annals of Statistics 32 784–817.
  • Liu, J. S., Wong, W. H. and Kong, A. (1994). Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes. Biometrika 81 27–40.
  • Liu, J. S. and Wu, Y. N. (1999). Parameter expansion for data augmentation. Journal of the American Statistical Association 94 1264–1274.
  • McCulloch, C. E., Searle, S. R. and Neuhaus, J. M. (2011). Generalized, Linear, and Mixed Models. John Wiley & Sons.
  • Meng, X.-L. and Van Dyk, D. A. (1999). Seeking efficient data augmentation schemes via conditional and marginal augmentation. Biometrika 86 301–320.
  • Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer.
  • Polson, N. G., Scott, J. G. and Windle, J. (2013). Bayesian inference for logistic models using Pólya–Gamma latent variables. Journal of the American Statistical Association 108 1339–1349.
  • Roberts, G. O. and Rosenthal, J. S. (1997). Geometric ergodicity and hybrid Markov chains. Electronic Communications in Probability 2 13–25.
  • Roberts, G. O. and Rosenthal, J. S. (2001). Markov chains and de-initializing processes. Scandinavian Journal of Statistics 28 489–504.
  • Román, J. C. (2012). Convergence analysis of block Gibbs samplers for Bayesian general linear mixed models. PhD thesis, University of Florida.
  • Román, J. C. and Hobert, J. P. (2012). Convergence analysis of the Gibbs sampler for Bayesian general linear mixed models with improper priors. The Annals of Statistics 40 2823–2849.
  • Román, J. C. and Hobert, J. P. (2015). Geometric ergodicity of Gibbs samplers for Bayesian general linear mixed models with proper priors. Linear Algebra and its Applications 473 54–77.
  • Roy, V. (2012a). Spectral analytic comparisons for data augmentation. Statistics and Probability Letters 82 103–108.
  • Roy, V. (2012b). Convergence rates for MCMC algorithms for a robust Bayesian binary regression model. Electronic Journal of Statistics 6 2463–2485.
  • Roy, V. (2014). Efficient estimation of the link function parameter in a robust Bayesian binary regression model. Computational Statistics & Data Analysis 73 87–102.
  • Roy, V. and Hobert, J. P. (2007). Convergence rates and asymptotic standard errors for Markov chain Monte Carlo algorithms for Bayesian probit regression. Journal of the Royal Statistical Society: Series B 69 607–623.
  • Tan, A. and Hobert, J. P. (2009). Block Gibbs sampling for Bayesian random effects models with improper priors: Convergence and regeneration. Journal of Computational and Graphical Statistics 18 861–878.
  • Van Dyk, D. A. and Meng, X.-L. (2001). The art of data augmentation (with discussion). Journal of Computational and Graphical Statistics 10 1–50.
  • Wang, X. and Roy, V. (2018). Geometric ergodicity of Pólya-Gamma Gibbs sampler for Bayesian logistic regression with a flat prior. Electronic Journal of Statistics 12 3295–3311.