Markov Chain Monte Carlo in Practice: A Roundtable Discussion

Abstract: Markov chain Monte Carlo (MCMC) methods make possible the use of flexible Bayesian models that would otherwise be computationally infeasible. In recent years, a great variety of such applications have been described in the literature. Applied statisticians who are new to these methods may have several questions and concerns, however: How much effort and expertise are needed to design and use a Markov chain sampler? How much confidence can one have in the answers that MCMC produces? How does the use of MCMC affect the rest of the model-building process? At the Joint Statistical Meetings in August 1996, a panel of experienced MCMC users discussed these and other issues, as well as various “tricks of the trade.” This article is an edited recreation of that discussion. Its purpose is to offer advice and guidance to novice users of MCMC—and to not-so-novice users as well. Topics include building confidence in simulation results, methods for speeding and assessing convergence, estimating standard error...
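
The abstract refers to designing a Markov chain sampler, judging convergence, and attaching Monte Carlo standard errors to the resulting estimates. As a minimal illustration of those ideas (not taken from the panel discussion itself), the Python sketch below implements a random-walk Metropolis sampler with a batch-means standard error; the function name, tuning values, and batch count are illustrative choices, not prescriptions from the article.

```python
import numpy as np

def metropolis(log_target, x0, n_iter=10000, step=1.0, rng=None):
    """Random-walk Metropolis sampler with Gaussian proposals.

    log_target: log of the (unnormalized) target density.
    x0: starting point (float or 1-D array).
    step: proposal standard deviation; tune so a moderate fraction
          of proposals (very roughly 20-50%) is accepted.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    logp = log_target(x)
    draws = np.empty((n_iter, x.size))
    accepted = 0
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(x.size)
        logp_prop = log_target(prop)
        # Accept with probability min(1, target(prop) / target(x)).
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = prop, logp_prop
            accepted += 1
        draws[i] = x
    return draws, accepted / n_iter

# Example: sample a standard normal target, discard a burn-in period,
# and report the posterior mean with a batch-means Monte Carlo standard error.
draws, acc_rate = metropolis(lambda x: -0.5 * np.sum(x**2), x0=0.0, step=2.4)
burned = draws[1000:, 0]                        # discard burn-in
batches = burned.reshape(30, -1).mean(axis=1)   # 30 batch means
mcse = batches.std(ddof=1) / np.sqrt(len(batches))
print(f"acceptance {acc_rate:.2f}, mean {burned.mean():.3f} +/- {mcse:.3f}")
```

The batch-means calculation stands in for the more careful output-analysis methods (multiple chains, convergence diagnostics) that the discussion covers in detail.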
