Automated State-Dependent Importance Sampling for Markov Jump Processes via Sampling from the Zero-Variance Distribution

Many complex systems can be modeled as Markov jump processes; applications include chemical reaction networks, population dynamics, and telecommunication networks. Rare-event estimation for such models is often computationally expensive, because many (or very long) paths of the process typically need to be simulated before the rare event is observed. We present a state-dependent importance sampling approach to this problem that is adaptive and uses Markov chain Monte Carlo to sample from the zero-variance importance sampling distribution. The method applies to a wide range of Markov jump processes and achieves high accuracy, while requiring only a small sample to obtain the importance parameters. We demonstrate its efficiency through benchmark examples in queueing theory and stochastic chemical kinetics.
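To make the rare-event setting concrete, the sketch below estimates a classic queueing rare event: the probability that an M/M/1 queue, started at level 1, reaches a high level K before emptying. It uses a simple *state-independent* change of measure (swapping the arrival and service rates), not the adaptive state-dependent zero-variance scheme of the paper; the function name and parameter values are illustrative assumptions, chosen only to show the likelihood-ratio mechanics of importance sampling.

```python
import random

def mm1_overflow_is(lam=1.0, mu=4.0, K=15, n_samples=20000, seed=1):
    """Importance-sampling estimate of P(embedded M/M/1 chain hits K
    before 0, starting from level 1), using the classic rate-swap tilt.
    NOTE: illustrative sketch, not the paper's state-dependent method."""
    rng = random.Random(seed)
    p = lam / (lam + mu)       # original up-step probability
    p_is = mu / (lam + mu)     # tilted up-step probability (rates swapped)
    total = 0.0
    for _ in range(n_samples):
        level, lr = 1, 1.0     # lr accumulates the likelihood ratio
        while 0 < level < K:
            if rng.random() < p_is:        # up-step under the IS measure
                level += 1
                lr *= p / p_is             # correct for the tilted up-step
            else:                          # down-step under the IS measure
                level -= 1
                lr *= (1 - p) / (1 - p_is) # correct for the tilted down-step
        if level == K:                     # rare event observed
            total += lr
    return total / n_samples
```

For these parameters the exact (gambler's-ruin) value is 3/(4^15 − 1) ≈ 2.79e−9; under the rate-swap measure the overflow event is no longer rare, so a modest sample already gives a low-variance estimate, whereas crude Monte Carlo would almost never observe it.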
