Advances in Applied Probability

Convergence of conditional Metropolis-Hastings samplers

Galin L. Jones, Gareth O. Roberts, and Jeffrey S. Rosenthal


We consider Markov chain Monte Carlo algorithms which combine Gibbs updates with Metropolis-Hastings updates, resulting in a conditional Metropolis-Hastings sampler (CMH sampler). We develop conditions under which the CMH sampler will be geometrically or uniformly ergodic. We illustrate our results by analysing a CMH sampler used for drawing Bayesian inferences about the entire sample path of a diffusion process, based only upon discrete observations.
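The structure described above — alternating an exact Gibbs draw for one component with a Metropolis-Hastings update for the other — can be sketched for a toy target. Everything below is an illustrative assumption, not taken from the paper: the target is a bivariate normal with correlation `RHO`, the x-coordinate gets an exact draw from its full conditional, and the y-coordinate gets a random-walk Metropolis-Hastings step targeting its full conditional.

```python
import math
import random

# Illustrative correlation for the assumed bivariate normal target.
RHO = 0.8

def log_cond_density(y, x):
    """Log of pi(y | x) up to an additive constant: N(RHO * x, 1 - RHO^2)."""
    var = 1.0 - RHO ** 2
    return -0.5 * (y - RHO * x) ** 2 / var

def cmh_sampler(n_iter, step=1.0, seed=0):
    """Conditional Metropolis-Hastings sampler (sketch).

    Each sweep does a Gibbs update of x from pi(x | y), then a
    random-walk Metropolis-Hastings update of y targeting pi(y | x).
    """
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    chain = []
    for _ in range(n_iter):
        # Gibbs update: draw x exactly from pi(x | y).
        x = rng.gauss(RHO * y, math.sqrt(1.0 - RHO ** 2))
        # MH update for y given x: propose, then accept/reject.
        y_prop = y + rng.gauss(0.0, step)
        log_alpha = log_cond_density(y_prop, x) - log_cond_density(y, x)
        if log_alpha >= 0 or rng.random() < math.exp(log_alpha):
            y = y_prop
        chain.append((x, y))
    return chain

chain = cmh_sampler(20000)
mean_x = sum(p[0] for p in chain) / len(chain)
mean_y = sum(p[1] for p in chain) / len(chain)
```

Since the MH step here is a random-walk update of the conditional, the chain's ergodicity properties depend on the tail behaviour of the conditional target — exactly the kind of question the paper's conditions address for the general CMH sampler.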

Article information

Adv. in Appl. Probab., Volume 46, Number 2 (2014), 422-445.

First available in Project Euclid: 29 May 2014

Primary: 60J05 (discrete-time Markov processes on general state spaces); 60J22 (computational methods in Markov chains; see also 65C40); 65C40 (computational Markov chains); 62F15 (Bayesian inference)

Keywords: Markov chain Monte Carlo; independence sampler; Gibbs sampler; geometric ergodicity; convergence rate


Jones, Galin L.; Roberts, Gareth O.; Rosenthal, Jeffrey S. Convergence of conditional Metropolis-Hastings samplers. Adv. in Appl. Probab. 46 (2014), no. 2, 422–445. doi:10.1239/aap/1401369701.



References

  • Brooks, S., Gelman, A., Jones, G. L. and Meng, X.-L. (eds) (2011). Handbook of Markov Chain Monte Carlo. CRC Press, Boca Raton, FL.
  • Chan, K. S. and Geyer, C. J. (1994). Discussion: Markov chains for exploring posterior distributions. Ann. Statist. 22, 1747–1758.
  • Diaconis, P. and Saloff-Coste, L. (1993). Comparison theorems for reversible Markov chains. Ann. Appl. Prob. 3, 696–730.
  • Elerian, O., Chib, S. and Shephard, N. (2001). Likelihood inference for discretely observed nonlinear diffusions. Econometrica 69, 959–993.
  • Flegal, J. M., Haran, M. and Jones, G. L. (2008). Markov chain Monte Carlo: can we trust the third significant figure? Statist. Sci. 23, 250–260.
  • Gelfand, A. E. and Smith, A. F. M. (1990). Sampling-based approaches to calculating marginal densities. J. Amer. Statist. Assoc. 85, 398–409.
  • Hobert, J. P. and Geyer, C. J. (1998). Geometric ergodicity of Gibbs and block Gibbs samplers for a hierarchical random effects model. J. Multivariate Anal. 67, 414–430.
  • Jarner, S. F. and Hansen, E. (2000). Geometric ergodicity of Metropolis algorithms. Stoch. Process. Appl. 85, 341–361.
  • Johnson, A. A. and Jones, G. L. (2010). Gibbs sampling for a Bayesian hierarchical general linear model. Electron. J. Statist. 4, 313–333.
  • Johnson, A. A., Jones, G. L. and Neath, R. C. (2013). Component-wise Markov chain Monte Carlo: uniform and geometric ergodicity under mixing and composition. Statist. Sci. 28, 360–375.
  • Jones, G. L. (2004). On the Markov chain central limit theorem. Prob. Surveys 1, 299–320.
  • Jones, G. L. and Hobert, J. P. (2001). Honest exploration of intractable probability distributions via Markov chain Monte Carlo. Statist. Sci. 16, 312–334.
  • Jones, G. L. and Hobert, J. P. (2004). Sufficient burn-in for Gibbs samplers for a hierarchical random effects model. Ann. Statist. 32, 784–817.
  • Jones, G. L., Haran, M., Caffo, B. S. and Neath, R. (2006). Fixed-width output analysis for Markov chain Monte Carlo. J. Amer. Statist. Assoc. 101, 1537–1547.
  • Lawler, G. F. and Sokal, A. D. (1988). Bounds on the $L^2$ spectrum for Markov chains and Markov processes: a generalization of Cheeger's inequality. Trans. Amer. Math. Soc. 309, 557–580.
  • Liu, J. S. (1996). Metropolized independent sampling with comparisons to rejection sampling and importance sampling. Statist. Comput. 6, 113–119.
  • Liu, J. S., Wong, W. H. and Kong, A. (1994). Covariance structure of the Gibbs sampler with applications to the comparisons of estimators and augmentation schemes. Biometrika 81, 27–40.
  • Marchev, D. and Hobert, J. P. (2004). Geometric ergodicity of van Dyk and Meng's algorithm for the multivariate Student's $t$ model. J. Amer. Statist. Assoc. 99, 228–238.
  • Mengersen, K. L. and Tweedie, R. L. (1996). Rates of convergence of the Hastings and Metropolis algorithms. Ann. Statist. 24, 101–121.
  • Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer, London.
  • Papaspiliopoulos, O. and Roberts, G. (2008). Stability of the Gibbs sampler for Bayesian hierarchical models. Ann. Statist. 36, 95–117.
  • Peskun, P. H. (1973). Optimum Monte-Carlo sampling using Markov chains. Biometrika 60, 607–612.
  • Robert, C. P. (1995). Convergence control methods for Markov chain Monte Carlo algorithms. Statist. Sci. 10, 231–253.
  • Roberts, G. O. and Polson, N. G. (1994). On the geometric convergence of the Gibbs sampler. J. R. Statist. Soc. B 56, 377–384.
  • Roberts, G. O. and Rosenthal, J. S. (1997). Geometric ergodicity and hybrid Markov chains. Electron. Commun. Prob. 2, 13–25.
  • Roberts, G. O. and Rosenthal, J. S. (1998). Two convergence properties of hybrid samplers. Ann. Appl. Prob. 8, 397–407.
  • Roberts, G. O. and Rosenthal, J. S. (1999). Convergence of slice sampler Markov chains. J. R. Statist. Soc. B 61, 643–660.
  • Roberts, G. O. and Rosenthal, J. S. (2001). Markov chains and de-initializing processes. Scand. J. Statist. 28, 489–504.
  • Roberts, G. O. and Rosenthal, J. S. (2004). General state space Markov chains and MCMC algorithms. Prob. Surveys 1, 20–71.
  • Roberts, G. O. and Rosenthal, J. S. (2011). Quantitative non-geometric convergence bounds for independence samplers. Methodol. Comput. Appl. Prob. 13, 391–403.
  • Roberts, G. O. and Sahu, S. K. (1997). Updating schemes, correlation structure, blocking and parametrization for the Gibbs sampler. J. R. Statist. Soc. B 59, 291–317.
  • Roberts, G. O. and Stramer, O. (2001). On inference for partially observed nonlinear diffusion models using the Metropolis–Hastings algorithm. Biometrika 88, 603–621.
  • Roberts, G. O. and Tweedie, R. L. (1996). Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms. Biometrika 83, 95–110.
  • Rogers, L. C. G. and Williams, D. (1994). Diffusions, Markov Processes, and Martingales, Vol. 1, Foundations, 2nd edn. John Wiley, Chichester.
  • Rosenthal, J. S. (1995). Minorization conditions and convergence rates for Markov chain Monte Carlo. J. Amer. Statist. Assoc. 90, 558–566. (Correction: 90 (1995), 1136.)
  • Rosenthal, J. S. (1996). Analysis of the Gibbs sampler for a model related to James–Stein estimators. Statist. Comput. 6, 269–275.
  • Roy, V. and Hobert, J. P. (2007). Convergence rates and asymptotic standard errors for Markov chain Monte Carlo algorithms for Bayesian probit regression. J. R. Statist. Soc. B 69, 607–623.
  • Schervish, M. J. and Carlin, B. P. (1992). On the convergence of successive substitution sampling. J. Comput. Graph. Statist. 1, 111–127.
  • Sinclair, A. (1992). Improved bounds for mixing rates of Markov chains and multicommodity flow. Combin. Prob. Comput. 1, 351–370.
  • Smith, R. L. and Tierney, L. (1996). Exact transition probabilities for the independence Metropolis sampler. Tech. Rep., University of North Carolina.
  • Tan, A. and Hobert, J. P. (2009). Block Gibbs sampling for Bayesian random effects models with improper priors: convergence and regeneration. J. Comput. Graph. Statist. 18, 861–878.
  • Tierney, L. (1994). Markov chains for exploring posterior distributions. With discussion and a rejoinder by the author. Ann. Statist. 22, 1701–1762.
  • Tierney, L. (1998). A note on Metropolis–Hastings kernels for general state spaces. Ann. Appl. Prob. 8, 1–9.