The Annals of Applied Probability

A sequential Monte Carlo approach to computing tail probabilities in stochastic models

Hock Peng Chan and Tze Leung Lai


Abstract

Sequential Monte Carlo methods which involve sequential importance sampling and resampling are shown to provide a versatile approach to computing probabilities of rare events. By making use of martingale representations of the sequential Monte Carlo estimators, we show how resampling weights can be chosen to yield logarithmically efficient Monte Carlo estimates of large deviation probabilities for multidimensional Markov random walks.
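To fix ideas, the following is a minimal numerical sketch, in Python, of the kind of estimator the abstract describes: it estimates the tail probability P(S_n >= b) for a one-dimensional Gaussian random walk by propagating particles under an exponentially tilted increment distribution, accumulating importance weights, and resampling at every step, with a running product of average weights keeping the final estimate unbiased. The Gaussian increments, the tilt theta = b/n, and resampling proportional to the accumulated importance weights are illustrative assumptions only; the paper's contribution is how the resampling weights should be chosen, for general multidimensional Markov random walks, so that the estimator is logarithmically efficient, i.e., its second moment decays at the same exponential rate as the square of the target probability.

import numpy as np

def sisr_tail_prob(n=50, b=30.0, num_particles=10_000, theta=None, rng=None):
    """Estimate P(S_n >= b) for S_n = X_1 + ... + X_n with X_i ~ N(0, 1) i.i.d.,
    via sequential importance sampling and resampling (SISR).  Illustrative
    sketch only; not the resampling-weight scheme analyzed in the paper."""
    rng = np.random.default_rng() if rng is None else rng
    theta = b / n if theta is None else theta   # exponential tilt pushing the walk toward b

    s = np.zeros(num_particles)       # particle positions S_k
    log_w = np.zeros(num_particles)   # log importance weights since the last resampling
    log_norm = 0.0                    # log of the running product of average weights

    for _ in range(n):
        x = rng.normal(loc=theta, scale=1.0, size=num_particles)  # sample under the tilted law
        s += x
        log_w += -theta * x + 0.5 * theta**2   # log-likelihood ratio dN(0,1)/dN(theta,1)

        # Resampling step: here the resampling weights are simply proportional to the
        # accumulated importance weights (a bootstrap-filter-style choice).
        m = log_w.max()
        w = np.exp(log_w - m)
        idx = rng.choice(num_particles, size=num_particles, p=w / w.sum())
        log_norm += m + np.log(w.mean())       # record the normalizer so the estimate stays unbiased
        s = s[idx]
        log_w = np.zeros(num_particles)        # residual weights are uniform after resampling

    # Unbiased estimate: product of average weights times the fraction of particles in the event.
    return np.exp(log_norm) * np.mean(s >= b)

if __name__ == "__main__":
    # For reference, P(S_50 >= 30) is roughly 1.1e-5 for a standard Gaussian walk.
    print(sisr_tail_prob())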

Article information

Source
Ann. Appl. Probab., Volume 21, Number 6 (2011), 2315-2342.

Dates
First available in Project Euclid: 23 November 2011

Permanent link to this document
https://projecteuclid.org/euclid.aoap/1322057323

Digital Object Identifier
doi:10.1214/10-AAP758

Mathematical Reviews number (MathSciNet)
MR2895417

Zentralblatt MATH identifier
1246.60042

Subjects
Primary: 60F10: Large deviations; 65C05: Monte Carlo methods
Secondary: 60J22: Computational methods in Markov chains [See also 65C40]; 60K35: Interacting random processes; statistical mechanics type models; percolation theory [See also 82B43, 82C43]

Keywords
Exceedance probabilities; large deviations; logarithmic efficiency; sequential importance sampling and resampling

Citation

Chan, Hock Peng; Lai, Tze Leung. A sequential Monte Carlo approach to computing tail probabilities in stochastic models. Ann. Appl. Probab. 21 (2011), no. 6, 2315--2342. doi:10.1214/10-AAP758. https://projecteuclid.org/euclid.aoap/1322057323


