Electronic Journal of Statistics

Efficient MCMC for Gibbs random fields using pre-computation

Aidan Boland, Nial Friel, and Florian Maire

Full-text: Open access


Bayesian inference for Gibbs random fields (GRFs) is often referred to as a doubly intractable problem, since the normalizing constants of both the likelihood function and the posterior distribution are unavailable in closed form. The posterior distribution of such models is typically explored with a sophisticated Markov chain Monte Carlo (MCMC) method, the exchange algorithm [28], which requires simulation from the likelihood function at each iteration. The purpose of this paper is to consider an approach that dramatically reduces this computational overhead. To this end we introduce a novel class of algorithms which use realizations of the GRF model, simulated offline at locations specified by a grid that spans the parameter space. This strategy dramatically speeds up posterior inference, as illustrated on several examples. However, using the pre-computed graphs introduces noise into the MCMC algorithm, which is no longer exact. We study the theoretical behaviour of the resulting approximate MCMC algorithm and derive convergence bounds using recent theoretical developments on approximate MCMC methods.
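The pre-computation idea described in the abstract can be illustrated with a minimal, self-contained Python sketch. The toy model below (graphs on 4 nodes with the edge count as sufficient statistic), the grid of 9 points, and the nearest-grid-point importance-sampling estimator of the normalizing-constant ratio are all illustrative assumptions for this sketch, not the paper's exact algorithm: realizations are simulated offline at grid locations, then reused inside a noisy Metropolis-Hastings run.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)

# Toy "Gibbs random field": graphs on 4 nodes (6 possible edges) with
# sufficient statistic s(x) = number of edges, so p(x | theta) ~ exp(theta * s(x)).
n_pairs = 6
edge_counts = np.arange(n_pairs + 1)
multiplicities = np.array([comb(n_pairs, k) for k in edge_counts])  # graphs per edge count

def sample_stats(theta, m):
    """Exact sampling of s(x) under p(. | theta) by enumeration (the offline step)."""
    w = multiplicities * np.exp(theta * edge_counts)
    return rng.choice(edge_counts, size=m, p=w / w.sum())

# Offline pre-computation: simulate the model at grid points spanning the parameter space.
grid = np.linspace(-2.0, 2.0, 9)
pre = {t: sample_stats(t, 500) for t in grid}

def log_z_ratio(theta_new, theta_cur):
    """Importance-sampling estimate of log Z(theta_new) - log Z(theta_cur),
    reusing pre-computed draws from the grid point nearest the current state.
    This estimate is the source of the noise that makes the chain approximate."""
    t0 = grid[np.argmin(np.abs(grid - theta_cur))]
    s = pre[t0]
    lz = lambda t: np.log(np.mean(np.exp((t - t0) * s)))  # estimates log Z(t) - log Z(t0)
    return lz(theta_new) - lz(theta_cur)

# Noisy Metropolis-Hastings for theta given an observed statistic, flat prior.
s_obs, theta, chain = 4, 0.0, []
for _ in range(2000):
    prop = theta + rng.normal(0.0, 0.5)
    # log acceptance ratio: unnormalized likelihood term minus estimated log Z ratio
    log_alpha = (prop - theta) * s_obs - log_z_ratio(prop, theta)
    if np.log(rng.uniform()) < log_alpha:
        theta = prop
    chain.append(theta)
chain = np.array(chain)
```

Because the online phase never simulates from the model, each MCMC iteration costs only a table lookup and a vectorized average over the stored statistics, at the price of the approximation error the paper analyses.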

Article information

Electron. J. Statist., Volume 12, Number 2 (2018), 4138-4179.

Received: January 2018
First available in Project Euclid: 13 December 2018

Keywords: Gibbs random fields; MCMC; exponential random graph models

Creative Commons Attribution 4.0 International License.


Boland, Aidan; Friel, Nial; Maire, Florian. Efficient MCMC for Gibbs random fields using pre-computation. Electron. J. Statist. 12 (2018), no. 2, 4138--4179. doi:10.1214/18-EJS1504. https://projecteuclid.org/euclid.ejs/1544670254



  • [1] Alquier, P., Friel, N., Everitt, R., and Boland, A. (2016). Noisy Monte Carlo: convergence of Markov chains with approximate transition kernels. Statistics and Computing, 26(1):29–47.
  • [2] Andrieu, C. and Roberts, G. (2009). The pseudo-marginal approach for efficient Monte Carlo computations. The Annals of Statistics, 37(2):697–725.
  • [3] Andrieu, C. and Thoms, J. (2008). A tutorial on adaptive MCMC. Statistics and Computing, 18(4):343–373.
  • [4] Augustin, N. H., Mugglestone, M. A., and Buckland, S. T. (1996). An autologistic model for the spatial distribution of wildlife. Journal of Applied Ecology, 33(2):339–347.
  • [5] Bardenet, R., Doucet, A., and Holmes, C. (2014). Towards scaling up Markov chain Monte Carlo: an adaptive subsampling approach. In Proceedings of the 31st International Conference on Machine Learning (ICML-14), pages 405–413.
  • [6] Bardenet, R., Doucet, A., and Holmes, C. (2017). On Markov chain Monte Carlo methods for tall data. The Journal of Machine Learning Research, 18(1):1515–1557.
  • [7] Besag, J. E. (1974). Spatial interaction and the statistical analysis of lattice systems. Journal of the Royal Statistical Society, Series B, 36:192–236.
  • [8] Bhamidi, S., Bresler, G., and Sly, A. (2011). Mixing time of exponential random graphs. The Annals of Applied Probability, 21(6):2146–2170.
  • [9] Bouranis, L., Friel, N., and Maire, F. (2017). Efficient Bayesian inference for exponential random graph models by correcting the pseudo-posterior distribution. Social Networks, 50:98–108.
  • [10] Caimo, A. and Friel, N. (2011). Bayesian inference for exponential random graph models. Social Networks, 33(1):41–55.
  • [11] Cucala, L., Marin, J.-M., Robert, C. P., and Titterington, D. (2009). A Bayesian reassessment of nearest-neighbour classification. Journal of the American Statistical Association, 104:263–273.
  • [12] Del Moral, P., Doucet, A., and Jasra, A. (2006). Sequential Monte Carlo samplers. Journal of the Royal Statistical Society, Series B, 68(3):411–436.
  • [13] Everitt, R. (2012). Bayesian parameter estimation for latent Markov random fields and social networks. Journal of Computational and Graphical Statistics, 21(4):940–960.
  • [14] Everitt, R., Prangle, D., Maybank, P., and Bell, M. (2017). Marginal sequential Monte Carlo for doubly intractable models. arXiv:1710.04382.
  • [15] Fellows, I. and Handcock, M. S. (2012). Exponential-family random network models. arXiv preprint arXiv:1208.0121.
  • [16] Friel, N., Mira, A., Oates, C. J., et al. (2016). Exploiting multi-core architectures for reduced-variance estimation with intractable likelihoods. Bayesian Analysis, 11(1):215–245.
  • [17] Gelman, A. and Meng, X.-L. (1998). Simulating normalizing constants: from importance sampling to bridge sampling to path sampling. Statistical Science, pages 163–185.
  • [18] Georgii, H.-O. (2011). Gibbs measures and phase transitions, volume 9. Walter de Gruyter.
  • [19] Gilks, W. R., Richardson, S., and Spiegelhalter, D. (1995). Markov chain Monte Carlo in practice. CRC Press.
  • [20] Handcock, M. S. (2003). Statistical models for social networks: inference and degeneracy. Technical report, University of Washington.
  • [21] Marjoram, P., Molitor, J., Plagnol, V., and Tavaré, S. (2003). Markov chain Monte Carlo without likelihoods. Proceedings of the National Academy of Sciences, 100(26):15324–15328.
  • [22] Marin, J.-M., Pudlo, P., Robert, C. P., and Ryder, R. J. (2012). Approximate Bayesian computational methods. Statistics and Computing, 22(6):1167–1180.
  • [23] Medina-Aguayo, F. J., Lee, A., and Roberts, G. O. (2016). Stability of noisy Metropolis-Hastings. Statistics and Computing, 26(6):1187–1211.
  • [24] Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., and Teller, E. (1953). Equation of state calculations by fast computing machines. The Journal of Chemical Physics, 21(6):1087–1092.
  • [25] Moores, M., Drovandi, C., Mengersen, K., and Robert, C. (2015a). Pre-processing for approximate Bayesian computation in image analysis. Statistics and Computing, 25(1):23–33.
  • [26] Moores, M. T., Pettitt, A. N., and Mengersen, K. (2015b). Scalable Bayesian inference for the inverse temperature of a hidden Potts model. arXiv:1503.08066.
  • [27] Morris, M., Handcock, M., and Hunter, D. (2008). Specification of exponential-family random graph models: terms and computational aspects. Journal of Statistical Software, 24(4):1–24.
  • [28] Murray, I., Ghahramani, Z., and MacKay, D. (2006). MCMC for doubly-intractable distributions. In Proceedings of the 22nd Annual Conference on Uncertainty in Artificial Intelligence (UAI-06). AUAI Press, Arlington, Virginia.
  • [29] Peskun, P. H. (1973). Optimum Monte Carlo sampling using Markov chains. Biometrika, 60(3):607–612.
  • [30] Pritchard, J., Seielstad, M., Perez-Lezaun, A., and Feldman, M. (1999). Population growth of human Y chromosomes: a study of Y chromosome microsatellites. Molecular Biology and Evolution, 16:1791–1798.
  • [31] Propp, J. and Wilson, D. (1996). Exact sampling with coupled Markov chains and applications to statistical mechanics. Random Structures and Algorithms, 9:223–252.
  • [32] Quiroz, M., Tran, M.-N., Villani, M., and Kohn, R. (2018). Speeding up MCMC by delayed acceptance and data subsampling. Journal of Computational and Graphical Statistics, 27(1):12–22.
  • [33] Robbins, H. and Monro, S. (1951). A stochastic approximation method. The Annals of Mathematical Statistics, 22(3):400–407.
  • [34] Robins, G., Pattison, P., Kalish, Y., and Lusher, D. (2007). An introduction to exponential random graph (p*) models for social networks. Social Networks, 29(2):173–191.
  • [35] Rue, H., Martino, S., and Chopin, N. (2009). Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations. Journal of the Royal Statistical Society, Series B, 71(2):319–392.
  • [36] Schweinberger, M. (2011). Instability, sensitivity, and degeneracy of discrete exponential families. Journal of the American Statistical Association, 106(496):1361–1370.
  • [37] Stoehr, J., Benson, A., and Friel, N. (2018). Noisy Hamiltonian Monte Carlo for doubly-intractable distributions. Journal of Computational and Graphical Statistics, (to appear).
  • [38] Stoehr, J. and Friel, N. (2015). Calibration of conditional composite likelihood for Bayesian inference on Gibbs random fields. In AISTATS, Journal of Machine Learning Research: W&CP.
  • [39] Tierney, L. (1998). A note on Metropolis-Hastings kernels for general state spaces. The Annals of Applied Probability, 8(1):1–9.
  • [40] Zachary, W. W. (1977). An information flow model for conflict and fission in small groups. Journal of Anthropological Research, pages 452–473.