Open Access
June 2015
Scaling It Up: Stochastic Search Structure Learning in Graphical Models
Hao Wang
Bayesian Anal. 10(2): 351-377 (June 2015). DOI: 10.1214/14-BA916

Abstract

Gaussian concentration graph models and covariance graph models are two classes of graphical models that are useful for uncovering latent dependence structures in multivariate data. In the Bayesian literature, graphs are often determined through the use of priors over the space of positive definite matrices with fixed zeros, but these methods present daunting computational burdens in large problems. Motivated by the superior computational efficiency of continuous shrinkage priors for regression analysis, we propose a new framework for structure learning that is based on continuous spike-and-slab priors and uses latent variables to identify graphs. We discuss model specification, computation, and inference for both concentration and covariance graph models. The new approach produces reliable estimates of graphs and efficiently handles problems with hundreds of variables.
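
As a rough illustration of the spike-and-slab idea described above (not the paper's actual block Gibbs sampler), the Python sketch below draws the latent indicator for a single edge given one off-diagonal precision element, under an assumed two-component normal mixture prior. The function name and the parameters v0, v1, and pi_edge are illustrative assumptions, not quantities taken from the paper.

import numpy as np

def sample_edge_indicator(omega_ij, v0=0.02, v1=1.0, pi_edge=0.5, rng=None):
    """Draw the latent edge indicator z_ij given one off-diagonal element omega_ij.

    Assumed (illustrative) continuous spike-and-slab prior:
        omega_ij | z_ij = 0  ~  N(0, v0**2)   (spike: edge absent)
        omega_ij | z_ij = 1  ~  N(0, v1**2)   (slab: edge present)
        z_ij ~ Bernoulli(pi_edge)
    so z_ij | omega_ij is Bernoulli with probability proportional to
    pi_edge times the slab density at omega_ij.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Normal densities at omega_ij; the common 1/sqrt(2*pi) factor cancels in the ratio.
    spike = np.exp(-0.5 * (omega_ij / v0) ** 2) / v0
    slab = np.exp(-0.5 * (omega_ij / v1) ** 2) / v1
    p_edge = pi_edge * slab / (pi_edge * slab + (1.0 - pi_edge) * spike)
    z_ij = int(rng.random() < p_edge)
    return z_ij, p_edge

# Example: a large |omega_ij| favors the slab, i.e. an edge in the graph.
print(sample_edge_indicator(0.8))    # p_edge close to 1
print(sample_edge_indicator(0.001))  # p_edge small: the spike absorbs near-zero elements

In a full sampler, a step of this kind would be combined with updates of the precision (or covariance) matrix itself, but the single-edge conditional above is enough to show how the latent variables identify the graph.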

Citation

Hao Wang. "Scaling It Up: Stochastic Search Structure Learning in Graphical Models." Bayesian Anal. 10(2): 351-377, June 2015. https://doi.org/10.1214/14-BA916

Information

Published: June 2015
First available in Project Euclid: 2 February 2015

zbMATH: 1335.62068
MathSciNet: MR3420886
Digital Object Identifier: 10.1214/14-BA916

Keywords: Bayesian inference, bi-directed graph, block Gibbs, concentration graph models, covariance graph models, credit default swap, structural learning, undirected graph

Rights: Copyright © 2015 International Society for Bayesian Analysis
