Electronic Journal of Statistics

Bayesian learning of weakly structural Markov graph laws using sequential Monte Carlo methods

Jimmy Olsson, Tatjana Pavlenko, and Felix L. Rios

Full-text: Open access


We present a sequential sampling methodology for weakly structural Markov laws, arising naturally in a Bayesian structure learning context for decomposable graphical models. As a key component of our suggested approach, we show that the problem of graph estimation, which in general lacks a natural sequential interpretation, can be recast into a sequential setting by proposing a recursive Feynman-Kac model that generates a flow of junction tree distributions over a space of increasing dimensions. We focus on particle MCMC methods to provide samples on this space, in particular on particle Gibbs (PG), as it allows for generating MCMC chains with global moves on an underlying space of decomposable graphs. To further improve the PG mixing properties, we incorporate a systematic refreshment step implemented through direct sampling from a backward kernel. The theoretical properties of the algorithm are investigated, showing that the proposed refreshment step improves the performance in terms of asymptotic variance of the estimated distribution. The suggested sampling methodology is illustrated through a collection of numerical examples demonstrating high accuracy in Bayesian graph structure learning in both discrete and continuous graphical models.
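To make the weight/resample/propagate pattern behind such a Feynman-Kac flow concrete, the sketch below runs a generic sequential Monte Carlo sampler on a toy problem (a tempered path between two one-dimensional Gaussians). This is purely illustrative and is not the authors' junction-tree algorithm: the target, the multinomial resampling, and the random-walk Metropolis rejuvenation move are all stand-in choices for exposition.

```python
# Minimal generic SMC sampler sketch (illustrative only, NOT the paper's
# junction-tree method): a tempered sequence of targets is traversed by
# reweighting, resampling, and moving a particle population.
import math
import random

random.seed(0)

def log_target(x, t, T):
    # Tempered log-density interpolating N(0, 1) -> N(3, 0.5^2) as t: 0 -> T.
    beta = t / T
    l0 = -0.5 * x * x                        # log N(0, 1), up to a constant
    l1 = -0.5 * ((x - 3.0) / 0.5) ** 2       # log N(3, 0.25), up to a constant
    return (1.0 - beta) * l0 + beta * l1

def smc(n_particles=500, T=20, step=0.5):
    # Initialize particles from the first target N(0, 1).
    xs = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    for t in range(1, T + 1):
        # Incremental importance weights between consecutive targets.
        logw = [log_target(x, t, T) - log_target(x, t - 1, T) for x in xs]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        # Multinomial resampling proportional to the weights.
        xs = random.choices(xs, weights=w, k=n_particles)
        # Random-walk Metropolis move to rejuvenate the particle system.
        moved = []
        for x in xs:
            y = x + random.gauss(0.0, step)
            if math.log(random.random()) < log_target(y, t, T) - log_target(x, t, T):
                x = y
            moved.append(x)
        xs = moved
    return xs

particles = smc()
mean = sum(particles) / len(particles)  # should sit near 3, the final target mean
```

In the paper's setting the role of the tempered targets is played by junction tree distributions on spaces of growing dimension, and the particle system is embedded in a particle Gibbs chain with a backward-kernel refreshment step.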

Article information

Electron. J. Statist., Volume 13, Number 2 (2019), 2865-2897.

Received: June 2018
First available in Project Euclid: 29 August 2019

Permanent link to this document: https://projecteuclid.org/euclid.ejs/1567065622

Digital Object Identifier: doi:10.1214/19-EJS1585

Primary: 62L20: Stochastic approximation
Secondary: 62-09: Graphical methods

Keywords: structure learning; sequential sampling; decomposable graphical models; particle Gibbs

Creative Commons Attribution 4.0 International License.


Olsson, Jimmy; Pavlenko, Tatjana; Rios, Felix L. Bayesian learning of weakly structural Markov graph laws using sequential Monte Carlo methods. Electron. J. Statist. 13 (2019), no. 2, 2865--2897. doi:10.1214/19-EJS1585. https://projecteuclid.org/euclid.ejs/1567065622

