Annals of Statistics

Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference

Nicolas Chopin



The term “sequential Monte Carlo methods” or, equivalently, “particle filters,” refers to a general class of iterative algorithms that perform Monte Carlo approximations of a given sequence of distributions of interest (πt). We establish in this paper a central limit theorem for the Monte Carlo estimates produced by these computational methods. This result holds under minimal assumptions on the distributions πt, and applies in a general framework which encompasses most of the sequential Monte Carlo methods that have been considered in the literature, including the resample-move algorithm of Gilks and Berzuini [J. R. Stat. Soc. Ser. B Stat. Methodol. 63 (2001) 127–146] and the residual resampling scheme. The corresponding asymptotic variances provide a convenient measure of the precision of a given particle filter. We study, in particular, in some typical examples of Bayesian applications, whether and at which rate these asymptotic variances diverge in time, in order to assess the long-term reliability of the considered algorithm.
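The residual resampling scheme mentioned in the abstract can be illustrated in a few lines. The sketch below is mine, not the paper's implementation: each particle i is copied ⌊N·w_i⌋ times deterministically, and the remaining slots are filled by multinomial draws from the leftover (residual) weights. The toy state-space model in the second half (AR(1) dynamics, Gaussian observation) is purely illustrative.

```python
import numpy as np

def residual_resample(weights, rng):
    """Residual resampling: particle i gets floor(N * w_i) deterministic
    copies; the remaining N - sum(copies) slots are drawn multinomially
    from the residual weights.  Returns an array of N ancestor indices."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    n = w.size
    counts = np.floor(n * w).astype(int)      # deterministic copies
    n_rest = n - counts.sum()                 # slots left to fill
    if n_rest > 0:
        residual = n * w - counts             # leftover weight mass
        counts += rng.multinomial(n_rest, residual / residual.sum())
    return np.repeat(np.arange(n), counts)

# One step of a bootstrap particle filter (Gordon, Salmond and Smith, 1993)
# for the illustrative model x_t = 0.9 x_{t-1} + u_t, y_t = x_t + v_t:
rng = np.random.default_rng(1)
N = 1000
x = rng.normal(size=N)                        # particles approximating pi_{t-1}
y_t = 0.5                                     # a made-up observation
x = 0.9 * x + rng.normal(scale=0.5, size=N)   # propagate through the dynamics
logw = -0.5 * (y_t - x) ** 2                  # Gaussian observation log-density
w = np.exp(logw - logw.max())                 # stabilized importance weights
x = x[residual_resample(w, rng)]              # resample back to equal weights
```

The central limit theorem of the paper concerns estimates such as the weighted particle average of a test function as N grows; replacing multinomial resampling by the residual scheme above changes the asymptotic variance, which is one reason the paper treats it explicitly.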

Article information

Ann. Statist., Volume 32, Number 6 (2004), 2385-2411.

First available in Project Euclid: 7 February 2005


Primary: 65C05 (Monte Carlo methods); 62F15 (Bayesian inference); 60F05 (Central limit and other weak theorems)
Secondary: 82C80 (Numerical methods: Monte Carlo, series resummation, etc.); 62L10 (Sequential analysis)

Markov chain Monte Carlo; particle filter; recursive Monte Carlo filter; resample-move algorithms; residual resampling; state-space model


Chopin, Nicolas. Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference. Ann. Statist. 32 (2004), no. 6, 2385--2411. doi:10.1214/009053604000000698.



  • Andrieu, C. and Doucet, A. (2002). Particle filtering for partially observed Gaussian state space models. J. R. Stat. Soc. Ser. B Stat. Methodol. 64 827–836.
  • Baker, J. E. (1985). Adaptive selection methods for genetic algorithms. In Proc. International Conference on Genetic Algorithms and Their Applications (J. Grefenstette, ed.) 101–111. Erlbaum, Mahwah, NJ.
  • Baker, J. E. (1987). Reducing bias and inefficiency in the selection algorithm. In Genetic Algorithms and Their Applications (J. Grefenstette, ed.) 14–21. Erlbaum, Mahwah, NJ.
  • Billingsley, P. (1995). Probability and Measure, 3rd ed. Wiley, New York.
  • Cappé, O., Guillin, A., Marin, J. M. and Robert, C. P. (2004). Population Monte Carlo. J. Comput. Graph. Statist. 13 907–929.
  • Carpenter, J., Clifford, P. and Fearnhead, P. (1999). Improved particle filter for nonlinear problems. IEE Proc. Radar Sonar Navigation 146 2–7.
  • Chen, R. and Liu, J. (2000). Mixture Kalman filters. J. R. Stat. Soc. Ser. B Stat. Methodol. 62 493–508.
  • Chopin, N. (2001). Sequential inference and state number determination for discrete state-space models through particle filtering. CREST Working Paper 2001-34.
  • Chopin, N. (2002). A sequential particle filter method for static models. Biometrika 89 539–552.
  • Crisan, D. and Doucet, A. (2000). Convergence of sequential Monte Carlo methods. Technical Report CUED/F-INFENG/TR381, Cambridge Univ.
  • Crisan, D., Gaines, J. and Lyons, T. (1998). Convergence of a branching particle method to the solution of the Zakai equation. SIAM J. Appl. Math. 58 1568–1590.
  • Crisan, D. and Lyons, T. (1997). Non-linear filtering and measure-valued processes. Probab. Theory Related Fields 109 217–244.
  • Crisan, D. and Lyons, T. (1999). A particle approximation of the solution of the Kushner–Stratonovich equation. Probab. Theory Related Fields 115 549–578.
  • Crisan, D. and Lyons, T. (2002). Minimal entropy approximations and optimal algorithms. Monte Carlo Methods Appl. 8 343–355.
  • Del Moral, P. and Doucet, A. (2002). Sequential Monte Carlo samplers. Technical Report CUED/F-INFENG/TR443, Cambridge Univ.
  • Del Moral, P. and Guionnet, A. (1999). Central limit theorem for nonlinear filtering and interacting particle systems. Ann. Appl. Probab. 9 275–297.
  • Del Moral, P. and Guionnet, A. (2001). On the stability of interacting processes with applications to filtering and genetic algorithms. Ann. Inst. H. Poincaré Probab. Statist. 37 155–194.
  • Del Moral, P. and Miclo, L. (2000). Branching and interacting particle systems approximations of Feynman–Kac formulae with applications to non-linear filtering. Séminaire de Probabilités XXXIV. Lecture Notes in Math. 1729 1–145. Springer, Berlin.
  • Dobrushin, R. L. (1956). Central limit theorem for non-stationary Markov chains I, II. Theory Probab. Appl. 1 65–80, 329–383.
  • Doucet, A., de Freitas, N. and Gordon, N. J., eds. (2001). Sequential Monte Carlo Methods in Practice. Springer, New York.
  • Doucet, A., Godsill, S. and Andrieu, C. (2000). On sequential Monte Carlo sampling methods for Bayesian filtering. Statist. Comput. 10 197–208.
  • Gilks, W. R. and Berzuini, C. (2001). Following a moving target: Monte Carlo inference for dynamic Bayesian models. J. R. Stat. Soc. Ser. B Stat. Methodol. 63 127–146.
  • Gordon, N. J., Salmond, D. J. and Smith, A. F. M. (1993). Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proc. F Radar Signal Proc. 140 107–113.
  • Künsch, H.-R. (2001). State space and hidden Markov models. In Complex Stochastic Systems (O. E. Barndorff-Nielsen, D. R. Cox and C. Klüppelberg, eds.) 109–173. Chapman and Hall, London.
  • Künsch, H.-R. (2003). Recursive Monte Carlo filters: Algorithms and theoretical analysis. Technical Report 112, Seminar für Statistik, ETH Zürich.
  • Le Gland, F. and Oudjane, N. (2004). Stability and uniform approximation of nonlinear filters using the Hilbert metric, and application to particle filters. Ann. Appl. Probab. 14 144–187.
  • Liu, J. and Chen, R. (1998). Sequential Monte Carlo methods for dynamic systems. J. Amer. Statist. Assoc. 93 1032–1044.
  • Pitt, M. and Shephard, N. (1999). Filtering via simulation: Auxiliary particle filters. J. Amer. Statist. Assoc. 94 590–599.
  • Robert, C. P. and Casella, G. (1999). Monte Carlo Statistical Methods. Springer, New York.
  • Rubin, D. (1988). Using the SIR algorithm to simulate posterior distributions. In Bayesian Statistics 3 (J. M. Bernardo, M. H. DeGroot, D. V. Lindley and A. F. M. Smith, eds.) 395–402. Oxford Univ. Press.
  • Schervish, M. J. (1995). Theory of Statistics. Springer, New York.
  • Tierney, L., Kass, R. E. and Kadane, J. B. (1989). Fully exponential Laplace approximations to expectations and variances of nonpositive functions. J. Amer. Statist. Assoc. 84 710–716.
  • Whitley, D. (1994). A genetic algorithm tutorial. Statist. Comput. 4 65–85.