Journal of Applied Probability

Automated state-dependent importance sampling for Markov jump processes via sampling from the zero-variance distribution

Adam W. Grace, Dirk P. Kroese, and Werner Sandmann



Many complex systems can be modeled via Markov jump processes. Applications include chemical reactions, population dynamics, and telecommunication networks. Rare-event estimation for such models can be difficult and is often computationally expensive, because typically many (or very long) paths of the Markov jump process need to be simulated in order to observe the rare event. We present a state-dependent importance sampling approach to this problem that is adaptive and uses Markov chain Monte Carlo to sample from the zero-variance importance sampling distribution. The method is applicable to a wide range of Markov jump processes and achieves high accuracy, while requiring only a small sample to obtain the importance parameters. We demonstrate its efficiency through benchmark examples in queueing theory and stochastic chemical kinetics.
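The zero-variance importance sampling distribution referred to in the abstract is the law of the process conditioned on the rare event occurring; sampling from it exactly would yield a zero-variance estimator, which is why the paper approximates it adaptively. As background for readers unfamiliar with the setting, the following is a minimal sketch of the classic *state-independent* baseline that such state-dependent methods improve on: exponentially tilted simulation of a rare overflow event in an M/M/1-type birth–death chain, where the standard change of measure swaps the up- and down-step probabilities. The function name and parameters are illustrative, not taken from the paper.

```python
import random

def overflow_prob(p, K, n, tilted=False, seed=0):
    """Estimate P(walk started at 1 hits K before 0), up-step prob p < 1/2.

    With tilted=True, simulate under the swapped up-step probability 1-p
    (the classic M/M/1 change of measure) and correct with the
    likelihood ratio accumulated along each path.
    """
    rng = random.Random(seed)
    q_up = (1 - p) if tilted else p   # up-step probability used for simulation
    total = 0.0
    for _ in range(n):
        x, lr = 1, 1.0
        while 0 < x < K:
            if rng.random() < q_up:
                x += 1
                lr *= p / q_up            # likelihood ratio for an up-step
            else:
                x -= 1
                lr *= (1 - p) / (1 - q_up)  # likelihood ratio for a down-step
        if x == K:                         # rare event: overflow before emptying
            total += lr
    return total / n
```

For this toy model the answer is known in closed form (gambler's ruin), so the tilted estimator can be checked against `(r - 1) / (r**K - 1)` with `r = (1-p)/p`; crude Monte Carlo with the same sample size typically observes the event rarely or never. The paper's contribution is to make the change of measure state-dependent and to learn it automatically, which matters for models (e.g. chemical kinetics) where no simple static tilt is efficient.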

Article information

J. Appl. Probab., Volume 51, Number 3 (2014), 741-755.

First available in Project Euclid: 5 September 2014


Primary: 60J28: Applications of continuous-time Markov processes on discrete state spaces
Secondary: 62M05: Markov processes: estimation

Keywords: importance sampling; adaptive; automated; improved cross entropy; state dependent; zero-variance distribution; Markov jump process; continuous-time Markov chain; stochastic chemical kinetics; queueing system


Grace, Adam W.; Kroese, Dirk P.; Sandmann, Werner. Automated state-dependent importance sampling for Markov jump processes via sampling from the zero-variance distribution. J. Appl. Probab. 51 (2014), no. 3, 741--755. doi:10.1239/jap/1409932671.
