Bayesian Analysis

Sequential Monte Carlo Samplers with Independent Markov Chain Monte Carlo Proposals

L. F. South, A. N. Pettitt, and C. C. Drovandi

Full-text: Open access


Sequential Monte Carlo (SMC) methods for sampling from the posterior of static Bayesian models are flexible, parallelisable and capable of handling complex targets. However, it is common practice to adopt a Markov chain Monte Carlo (MCMC) kernel with a multivariate normal random walk (RW) proposal in the move step, which can be both inefficient and detrimental to exploring challenging posterior distributions. We develop new SMC methods with independent proposals which allow recycling of all candidates generated in the SMC process and are embarrassingly parallelisable. A novel evidence estimator that is easily computed from the output of our independent SMC is also proposed. Our independent proposals are constructed via flexible copula-type models calibrated with the population of SMC particles. We demonstrate through several examples that more precise estimates of posterior expectations and the marginal likelihood can be obtained using fewer likelihood evaluations than with the more standard RW approach.
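The general scheme described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: it uses a conjugate Gaussian toy model, a fixed temperature schedule rather than an adaptive one, and a single Gaussian fitted to the particle cloud as a stand-in for the paper's copula-type independent proposal.

```python
import numpy as np

# Minimal SMC sampler with likelihood tempering and an independence MH move
# step. Toy conjugate model: prior theta ~ N(0, 10^2), data y_i ~ N(theta, 1).
rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=20)          # synthetic data
N = 1000                                   # number of particles

def log_prior(theta):
    return -0.5 * theta**2 / 100.0 - 0.5 * np.log(2 * np.pi * 100.0)

def log_like(theta):
    # sum of N(y_i; theta, 1) log-densities, vectorised over particles
    return np.sum(-0.5 * (y[None, :] - theta[:, None])**2
                  - 0.5 * np.log(2 * np.pi), axis=1)

thetas = rng.normal(0.0, 10.0, N)          # initialise from the prior
logw = np.zeros(N)
log_evidence = 0.0
gamma = 0.0
while gamma < 1.0:
    gamma_new = min(1.0, gamma + 0.1)      # fixed temperature schedule
    inc = (gamma_new - gamma) * log_like(thetas)   # incremental log-weights

    # evidence update: log of the weighted mean of the incremental weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    log_evidence += np.log(np.sum(w * np.exp(inc - inc.max()))) + inc.max()

    logw += inc
    gamma = gamma_new

    # multinomial resampling, then reset to equal weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    thetas = thetas[rng.choice(N, size=N, p=w)]
    logw = np.zeros(N)

    # independence MH move: proposal fitted to the current particle cloud
    # (the paper uses copula-type models here; a Gaussian suffices for a toy)
    mu, sd = thetas.mean(), thetas.std()
    prop = rng.normal(mu, sd, N)
    lt = lambda t: gamma * log_like(t) + log_prior(t)      # tempered target
    lq = lambda t: -0.5 * ((t - mu) / sd)**2 - np.log(sd)  # proposal log-density
    log_alpha = lt(prop) - lt(thetas) + lq(thetas) - lq(prop)
    accept = np.log(rng.uniform(size=N)) < log_alpha
    thetas[accept] = prop[accept]

print(thetas.mean(), thetas.std(), log_evidence)
```

Because the proposal does not depend on the current particle, all candidates can be generated and evaluated in parallel, and the accumulated `log_evidence` gives the standard SMC estimate of the log marginal likelihood for comparison with the estimator proposed in the paper.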

Article information

Bayesian Anal., Volume 14, Number 3 (2019), 753–776.

First available in Project Euclid: 11 June 2019


Keywords: copula; evidence; importance sampling; independent proposal; Markov chain Monte Carlo; marginal likelihood

Creative Commons Attribution 4.0 International License.


South, L. F.; Pettitt, A. N.; Drovandi, C. C. Sequential Monte Carlo Samplers with Independent Markov Chain Monte Carlo Proposals. Bayesian Anal. 14 (2019), no. 3, 753--776. doi:10.1214/18-BA1129.



Supplemental materials for this article are available online.