The Annals of Statistics

Limit theorems for weighted samples with applications to sequential Monte Carlo methods

Randal Douc and Eric Moulines

Abstract

In the last decade, sequential Monte Carlo (SMC) methods have emerged as a key tool in computational statistics [see, e.g., Sequential Monte Carlo Methods in Practice (2001) Springer, New York, Monte Carlo Strategies in Scientific Computing (2001) Springer, New York, Complex Stochastic Systems (2001) 109–173]. These algorithms approximate a sequence of distributions by a sequence of weighted empirical measures associated with a weighted population of particles, which are generated recursively.
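
In generic notation (introduced here only for illustration and not taken from the paper), a weighted sample {(\xi^{N,i}, \omega^{N,i})}_{1 \le i \le N} approximates a target distribution \nu through the self-normalized estimator

    \nu(f) \approx \sum_{i=1}^{N} \frac{\omega^{N,i}}{\sum_{j=1}^{N} \omega^{N,j}} \, f(\xi^{N,i})

for test functions f; the paper's limit theorems describe the behavior of such estimators as the number of particles N grows.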

Despite many theoretical advances [see, e.g., J. Roy. Statist. Soc. Ser. B 63 (2001) 127–146, Ann. Statist. 33 (2005) 1983–2021, Feynman–Kac Formulae. Genealogical and Interacting Particle Systems with Applications (2004) Springer, Ann. Statist. 32 (2004) 2385–2411], the large-sample theory of these approximations remains a question of central interest. In this paper we establish a law of large numbers and a central limit theorem as the number of particles gets large. We introduce the concepts of weighted sample consistency and asymptotic normality, and derive conditions under which the transformations of the weighted sample used in the SMC algorithm preserve these properties. To illustrate our findings, we analyze SMC algorithms that approximate the filtering distribution in state-space models. We show how our techniques allow us to relax restrictive technical conditions used in previously reported work and provide grounds for analyzing more sophisticated sequential sampling strategies, including branching, resampling at randomly selected times, and so on.
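
As a concrete illustration of the kind of algorithm covered by such an analysis, the following Python sketch implements a basic bootstrap particle filter (sequential importance sampling with multinomial resampling) for a hypothetical linear-Gaussian state-space model. The model, parameter values and function names are illustrative assumptions and are not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical model: X_t = phi * X_{t-1} + sigma_x * U_t,  Y_t = X_t + sigma_y * V_t,
    # with U_t, V_t i.i.d. standard Gaussian.
    phi, sigma_x, sigma_y = 0.9, 1.0, 0.5


    def simulate(T):
        """Simulate a state trajectory and the corresponding observations."""
        x = np.zeros(T)
        x[0] = rng.normal(scale=sigma_x)
        for t in range(1, T):
            x[t] = phi * x[t - 1] + sigma_x * rng.normal()
        y = x + sigma_y * rng.normal(size=T)
        return x, y


    def bootstrap_filter(y, n_particles=1000):
        """Approximate the filtering means E[X_t | Y_{0:t}] with a weighted particle sample."""
        T = len(y)
        means = np.zeros(T)
        # Initialize particles from the stationary distribution of the AR(1) state.
        particles = rng.normal(scale=sigma_x / np.sqrt(1.0 - phi ** 2), size=n_particles)
        for t in range(T):
            if t > 0:
                # Mutation step: move particles with the state transition (prior) kernel.
                particles = phi * particles + sigma_x * rng.normal(size=n_particles)
            # Weighting step: importance weights proportional to the observation likelihood.
            log_w = -0.5 * ((y[t] - particles) / sigma_y) ** 2
            w = np.exp(log_w - log_w.max())
            w /= w.sum()
            # Weighted-sample estimate of the filtering mean at time t.
            means[t] = np.sum(w * particles)
            # Selection step: multinomial resampling resets the weights to uniform.
            idx = rng.choice(n_particles, size=n_particles, p=w)
            particles = particles[idx]
        return means


    if __name__ == "__main__":
        x, y = simulate(100)
        est = bootstrap_filter(y)
        print("RMSE of the particle filtering mean:", np.sqrt(np.mean((est - x) ** 2)))

The multinomial resampling step in this sketch could be replaced by branching or by resampling only at randomly selected times, which is exactly the kind of algorithmic variation the paper's weighted-sample framework is designed to accommodate.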

Article information

Source
Ann. Statist., Volume 36, Number 5 (2008), 2344–2376.

Dates
First available in Project Euclid: 13 October 2008

Permanent link to this document
https://projecteuclid.org/euclid.aos/1223908095

Digital Object Identifier
doi:10.1214/07-AOS514

Mathematical Reviews number (MathSciNet)
MR2458190

Zentralblatt MATH identifier
1155.62056

Subjects
Primary: 60F05: Central limit and other weak theorems; 62L10: Sequential analysis; 65C05: Monte Carlo methods
Secondary: 65C35: Stochastic particle methods [See also 82C80]; 65C60: Computational problems in statistics

Keywords
Branching; conditional central limit theorems; particle filtering; sequential importance sampling; sequential Monte Carlo

Citation

Douc, Randal; Moulines, Eric. Limit theorems for weighted samples with applications to sequential Monte Carlo methods. Ann. Statist. 36 (2008), no. 5, 2344–2376. doi:10.1214/07-AOS514. https://projecteuclid.org/euclid.aos/1223908095


References

  • [1] Doucet, A., De Freitas, N. and Gordon, N., eds. (2001). Sequential Monte Carlo Methods in Practice. Springer, New York.
  • [2] Liu, J. (2001). Monte Carlo Strategies in Scientific Computing. Springer, New York.
  • [3] Künsch, H. R. (2001). State space and hidden Markov models. In Complex Stochastic Systems (O. E. Barndorff-Nielsen, D. R. Cox and C. Klüppelberg, eds.) 109–173. Chapman & Hall/CRC, Boca Raton, FL.
  • [4] Gilks, W. R. and Berzuini, C. (2001). Following a moving target—Monte Carlo inference for dynamic Bayesian models. J. Roy. Statist. Soc. Ser. B 63 127–146.
  • [5] Künsch, H. R. (2005). Recursive Monte Carlo filters: Algorithms and theoretical analysis. Ann. Statist. 33 1983–2021.
  • [6] Del Moral, P. (2004). Feynman–Kac Formulae. Genealogical and Interacting Particle Systems with Applications. Springer, Berlin.
  • [7] Chopin, N. (2004). Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference. Ann. Statist. 32 2385–2411.
  • [8] Handschin, J. and Mayne, D. (1969). Monte Carlo techniques to estimate the conditional expectation in multi-stage non-linear filtering. Internat. J. Control 9 547–559.
  • [9] Rubin, D. B. (1987). A noniterative sampling/importance resampling alternative to the data augmentation algorithm for creating a few imputations when the fraction of missing information is modest: The SIR algorithm (discussion of Tanner and Wong). J. Amer. Statist. Assoc. 82 543–546.
  • [10] Landau, D. P. and Binder, K. (2000). A Guide to Monte Carlo Simulations in Statistical Physics. Cambridge Univ. Press.
  • [11] Liu, J. and Chen, R. (1998). Sequential Monte Carlo methods for dynamic systems. J. Amer. Statist. Assoc. 93 1032–1044.
  • [12] Ristic, B., Arulampalam, M. and Gordon, N. (2004). Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House.
  • [13] Cappé, O., Guillin, A., Marin, J. M. and Robert, C. P. (2004). Population Monte Carlo. J. Comput. Graph. Statist. 13 907–929.
  • [14] Del Moral, P. (1996). Nonlinear filtering: Interacting particle solution. Markov Process. Related Fields 2 555–579.
  • [15] Del Moral, P. and Guionnet, A. (1999). Central limit theorem for nonlinear filtering and interacting particle systems. Ann. Appl. Probab. 9 275–297.
  • [16] Del Moral, P. and Miclo, L. (2000). Branching and interacting particle systems approximations of Feynman–Kac formulae with applications to nonlinear filtering. Séminaire de Probabilités XXXIV. Lecture Notes in Math. 1729 1–145. Springer, Berlin.
  • [17] Crisan, D. and Lyons, T. (1997). Nonlinear filtering and measure-valued processes. Probab. Theory Related Fields 109 217–244.
  • [18] Crisan, D. and Doucet, A. (2002). A survey of convergence results on particle filtering methods for practitioners. IEEE Trans. Signal Process. 50 736–746.
  • [19] Berzuini, C. and Gilks, W. R. (2001). Resample-move filtering with cross-model jumps. In Sequential Monte Carlo Methods in Practice (A. Doucet, N. De Freitas and N. Gordon, eds.) 117–138. Springer, Berlin.
  • [20] Pitt, M. K. and Shephard, N. (1999). Filtering via simulation: Auxiliary particle filters. J. Amer. Statist. Assoc. 94 590–599.
  • [21] Liu, J. and Chen, R. (1995). Blind deconvolution via sequential imputations. J. Amer. Statist. Assoc. 90 567–576.
  • [22] Crisan, D., Gaines, J. and Lyons, T. (1998). Convergence of a branching particle method to the solution of the Zakai equation. SIAM J. Appl. Math. 58 1568–1590 (electronic).
  • [23] Crisan, D. (2003). Exact rates of convergence for a branching particle approximation to the solution of the Zakai equation. Ann. Probab. 31 693–718.
  • [24] Liu, J., Chen, R. and Logvinenko, T. (2001). A theoretical framework for sequential importance sampling and resampling. In Sequential Monte Carlo Methods in Practice (A. Doucet, N. De Freitas and N. Gordon, eds.) 225–246. Springer, Berlin.
  • [25] Kong, A., Liu, J. S. and Wong, W. (1994). Sequential imputation and Bayesian missing data problems. J. Amer. Statist. Assoc. 89 278–288.
  • [26] Shiryaev, A. N. (1996). Probability, 2nd ed. Springer, New York.
  • [27] Dvoretzky, A. (1972). Asymptotic normality for sums of dependent random variables. Proc. Sixth Berkeley Symp. Math. Statist. Probab. II (Univ. California, Berkeley, Calif., 1970/1971). Probability Theory 513–535. Univ. California Press, Berkeley.