The Annals of Applied Probability

On the convergence of adaptive sequential Monte Carlo methods

Alexandros Beskos, Ajay Jasra, Nikolas Kantas, and Alexandre Thiery

Full-text: Open access


In several implementations of sequential Monte Carlo (SMC) methods it is natural, and important for algorithmic efficiency, to exploit the information in the history of the samples to optimally tune their subsequent propagations. In this article we provide a carefully formulated asymptotic theory for a class of such adaptive SMC methods. The theoretical framework developed here covers, under assumptions, several commonly used SMC algorithms [Chopin, Biometrika 89 (2002) 539–551; Jasra et al., Scand. J. Stat. 38 (2011) 1–22; Schäfer and Chopin, Stat. Comput. 23 (2013) 163–184]. There are only limited results on the theoretical underpinning of such adaptive methods: we bridge this gap by providing a weak law of large numbers (WLLN) and a central limit theorem (CLT) for some of these algorithms. The latter seems to be the first result of its kind in the literature and provides a formal justification of algorithms used in many real data contexts [Jasra et al. (2011); Schäfer and Chopin (2013)]. We establish that, for a general class of adaptive SMC algorithms [Chopin (2002)], the asymptotic variance of the estimators from the adaptive SMC method is identical to that of a “limiting” SMC algorithm which uses ideal proposal kernels. Our results are supported by an application to a complex high-dimensional posterior distribution associated with the Navier–Stokes model, where adapting high-dimensional parameters of the proposal kernels is critical for the efficiency of the algorithm.
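To illustrate the kind of algorithm the abstract describes, the following is a minimal sketch of an adaptive SMC sampler for a static target in the spirit of Chopin (2002) and Jasra et al. (2011). It is not the paper's exact algorithm: it assumes a one-dimensional target, a standard normal prior, and a fixed tempering ladder, and the adaptive element is limited to tuning the random-walk proposal scale of the MCMC move from the current particle cloud (the "history of the samples").

```python
import numpy as np

def adaptive_smc(log_target, n_particles=500, n_steps=20, seed=0):
    """Adaptive SMC sampler for a static model (Chopin-2002 style sketch).

    Tempers from a N(0,1) prior to the target along a fixed ladder;
    at each step the random-walk proposal scale of the MCMC move is
    tuned adaptively from the current particle cloud.
    """
    rng = np.random.default_rng(seed)
    log_prior = lambda y: -0.5 * y ** 2
    x = rng.standard_normal(n_particles)        # particles drawn from the prior
    betas = np.linspace(0.0, 1.0, n_steps + 1)  # fixed tempering ladder
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Incremental importance weights for the temperature step
        # pi_b(y) is proportional to prior(y)^(1-b) * target(y)^b.
        logw = (b - b_prev) * (log_target(x) - log_prior(x))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling.
        x = x[rng.choice(n_particles, n_particles, p=w)]
        # Adaptive MCMC move: proposal scale estimated from the particles.
        scale = 2.38 * x.std() + 1e-12
        prop = x + scale * rng.standard_normal(n_particles)
        log_pi = lambda y: b * log_target(y) + (1.0 - b) * log_prior(y)
        accept = np.log(rng.uniform(size=n_particles)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)
    return x
```

The paper's results concern precisely this kind of scheme: the proposal parameters (here, `scale`) are random functionals of the particle system, yet the WLLN and CLT show that, asymptotically in the number of particles, the estimators behave as if the ideal (limiting) tuning parameters had been used.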

Article information

Ann. Appl. Probab., Volume 26, Number 2 (2016), 1111–1146.

Received: February 2014
Revised: January 2015
First available in Project Euclid: 22 March 2016

Primary: 82C80: Numerical methods (Monte Carlo, series resummation, etc.) 60K35: Interacting random processes; statistical mechanics type models; percolation theory [See also 82B43, 82C43]
Secondary: 60F99: None of the above, but in this section 62F15: Bayesian inference

Keywords: Adaptive sequential Monte Carlo; CLT; MCMC


Beskos, Alexandros; Jasra, Ajay; Kantas, Nikolas; Thiery, Alexandre. On the convergence of adaptive sequential Monte Carlo methods. Ann. Appl. Probab. 26 (2016), no. 2, 1111–1146. doi:10.1214/15-AAP1113.

  • [1] Andrieu, C. and Moulines, É. (2006). On the ergodicity properties of some adaptive MCMC algorithms. Ann. Appl. Probab. 16 1462–1505.
  • [2] Beskos, A., Crisan, D. and Jasra, A. (2014). On the stability of sequential Monte Carlo methods in high dimensions. Ann. Appl. Probab. 24 1396–1445.
  • [3] Beskos, A., Roberts, G. and Stuart, A. (2009). Optimal scalings for local Metropolis–Hastings chains on nonproduct targets in high dimensions. Ann. Appl. Probab. 19 863–898.
  • [4] Cérou, F., Del Moral, P. and Guyader, A. (2011). A nonasymptotic theorem for unnormalized Feynman–Kac particle models. Ann. Inst. Henri Poincaré Probab. Stat. 47 629–649.
  • [5] Cérou, F. and Guyader, A. (2014). Fluctuation analysis of adaptive multilevel splitting. Technical report, INRIA.
  • [6] Chan, H. P. and Lai, T. L. (2013). A general theory of particle filters in hidden Markov models and some applications. Ann. Statist. 41 2877–2904.
  • [7] Chopin, N. (2002). A sequential particle filter method for static models. Biometrika 89 539–551.
  • [8] Chopin, N. (2004). Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference. Ann. Statist. 32 2385–2411.
  • [9] Cotter, S. L., Roberts, G. O., Stuart, A. M. and White, D. (2013). MCMC methods for functions: Modifying old algorithms to make them faster. Statist. Sci. 28 424–446.
  • [10] Crisan, D. and Doucet, A. (2000). Convergence of sequential Monte Carlo methods. Technical report, CUED/F-INFENG/, Cambridge Univ.
  • [11] Del Moral, P. (2004). Feynman–Kac Formulae: Genealogical and Interacting Particle Systems with Applications. Springer, New York.
  • [12] Del Moral, P. (2013). Mean Field Simulation for Monte Carlo Integration. Chapman & Hall, London.
  • [13] Del Moral, P., Doucet, A. and Jasra, A. (2006). Sequential Monte Carlo samplers. J. R. Stat. Soc. Ser. B. Stat. Methodol. 68 411–436.
  • [14] Del Moral, P., Doucet, A. and Jasra, A. (2012). On adaptive resampling strategies for sequential Monte Carlo methods. Bernoulli 18 252–278.
  • [15] Del Moral, P., Doucet, A. and Jasra, A. (2012). An adaptive sequential Monte Carlo method for approximate Bayesian computation. Stat. Comput. 22 1009–1020.
  • [16] Douc, R. and Moulines, E. (2008). Limit theorems for weighted samples with applications to sequential Monte Carlo methods. Ann. Statist. 36 2344–2376.
  • [17] Douc, R., Moulines, E. and Olsson, J. (2014). Long-term stability of sequential Monte Carlo methods under verifiable conditions. Ann. Appl. Probab. 24 1767–1802.
  • [18] Doucet, A. and Johansen, A. (2011). A tutorial on particle filtering and smoothing: Fifteen years later. In Handbook of Nonlinear Filtering (D. Crisan and B. Rozovsky, eds.). Oxford Univ. Press, Oxford.
  • [19] Gelman, A. and Meng, X.-L. (1998). Simulating normalizing constants: From importance sampling to bridge sampling to path sampling. Statist. Sci. 13 163–185.
  • [20] Giraud, F. and Del Moral, P. (2015). Non-asymptotic analysis of adaptive and annealed Feynman–Kac particle models. Bernoulli. To appear.
  • [21] Jasra, A., Stephens, D. A., Doucet, A. and Tsagaris, T. (2011). Inference for Lévy-driven stochastic volatility models via adaptive sequential Monte Carlo. Scand. J. Stat. 38 1–22.
  • [22] Kantas, N., Beskos, A. and Jasra, A. (2014). Sequential Monte Carlo methods for high-dimensional inverse problems: A case study for the Navier–Stokes equations. SIAM/ASA J. Uncertain. Quantificat. 2 464–489.
  • [23] Schäfer, C. and Chopin, N. (2013). Sequential Monte Carlo on large binary sampling spaces. Stat. Comput. 23 163–184.