SUMMARY

A generalization of the sampling method introduced by Metropolis et al. (1953) is presented, along with an exposition of the relevant theory, techniques of application, and methods and difficulties of assessing the error in Monte Carlo estimates. Examples of the methods, including the generation of random orthogonal matrices, and potential applications of the methods to numerical problems arising in statistics are discussed.

For numerical problems in a large number of dimensions, Monte Carlo methods are often more efficient than conventional numerical methods. However, implementation of the Monte Carlo methods requires sampling from high-dimensional probability distributions, and this may be very difficult and expensive in analysis and computer time. General methods for sampling from, or estimating expectations with respect to, such distributions are as follows.

(i) If possible, factorize the distribution into the product of one-dimensional conditional distributions from which samples may be obtained.

(ii) Use importance sampling, which may also be used for variance reduction. That is, in order to evaluate the integral J = ∫ f(x) p(x) dx = E_p(f), where p(x) is a probability density function, instead of obtaining independent samples x_1, ..., x_N from p(x) and using the estimate J_1 = Σ f(x_i)/N, we instead obtain the sample from a distribution with density q(x) and use the estimate J_2 = Σ {f(x_i) p(x_i)}/{q(x_i) N}. This may be advantageous if it is easier to sample from q(x) than from p(x), but it is a difficult method to use in a large number of dimensions, since the values of the weights w(x_i) = p(x_i)/q(x_i) for reasonable values of N may all be extremely small, or a few may be extremely large. In estimating the probability of an event A, however, these difficulties may not be as serious, since the only values of w(x) which are important are those for which x ∈ A.
Since the methods proposed by Trotter & Tukey (1956) for the estimation of conditional expectations require the use of importance sampling, the same difficulties may be encountered in their use.

(iii) Use a simulation technique; that is, if it is difficult to sample directly from p(x) or if p(x) is unknown, sample from some distribution q(y) and obtain the sample x values as some function of the corresponding y values. If we want samples from the conditional distribution ...
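The importance-sampling estimate J_2 described in (ii) can be sketched as follows. This is a minimal one-dimensional illustration, not from the paper: the target p(x) is taken as a standard normal, the proposal q(x) as a wider normal, and f(x) = x^2, so both estimates should be near E_p(X^2) = 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the paper):
#   target p(x): standard normal N(0, 1)
#   proposal q(x): wider normal N(0, 2^2), easy to sample from
#   integrand f(x) = x^2, so J = E_p(f) = 1
def f(x):
    return x ** 2

def p_pdf(x):
    return np.exp(-x ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

def q_pdf(x):
    return np.exp(-x ** 2 / 8.0) / np.sqrt(8.0 * np.pi)

N = 100_000

# Direct estimate J_1 = sum f(x_i) / N, with x_i drawn from p(x).
x_p = rng.normal(0.0, 1.0, N)
J1 = np.mean(f(x_p))

# Importance-sampling estimate J_2 = sum f(x_j) p(x_j) / {q(x_j) N},
# with x_j drawn from q(x) and weights w(x_j) = p(x_j) / q(x_j).
x_q = rng.normal(0.0, 2.0, N)
w = p_pdf(x_q) / q_pdf(x_q)
J2 = np.mean(f(x_q) * w)

print(J1, J2)  # both should be close to 1
```

With a well-matched proposal the weights stay moderate; the difficulty noted above arises when, in many dimensions, almost all weights are tiny and a few dominate the sum.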
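The simulation technique in (iii) can be sketched with the classical inverse-transform construction, a hedged example of "obtaining the sample x values as some function of the corresponding y values" (the exponential target here is an illustrative choice, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample y from an easy distribution q(y), here Uniform(0, 1), and map each
# y to x = -log(1 - y); the resulting x values follow the Exponential(1)
# density p(x) = exp(-x), which we never had to sample from directly.
y = rng.uniform(0.0, 1.0, 200_000)
x = -np.log(1.0 - y)

print(x.mean(), x.var())  # both should be close to 1 for Exponential(1)
```

The same idea underlies the Metropolis-type methods the paper generalizes: the x values are produced by transforming draws from distributions that are easy to simulate.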

[1] Morris H. Hansen, et al. Sample survey methods and theory, 1955.

[2] N. Metropolis, et al. Equation of State Calculations by Fast Computing Machines, Journal of Chemical Physics, 1953.

[3] A. T. James, et al. A generating function for averages over the orthogonal group, Proceedings of the Royal Society of London, Series A, 1955.

[4] G. H. Jowett. The comparison of means of sets of observations from sections of independent stochastic series, 1955.

[5] J. M. Hammersley, et al. Conditional Monte Carlo, JACM, 1956.

[6] E. J. Hannan, et al. The Variance of the Mean of a Stationary Process, 1957.

[7] T. Teichmann, et al. The Measurement of Power Spectra, 1960.

[8] D. Handscomb, et al. Computation of Order Parameters in an Ising Lattice by the Monte Carlo Method, 1960.

[9] D. Handscomb. The Monte Carlo method in quantum statistical mechanics, Mathematical Proceedings of the Cambridge Philosophical Society, 1962.

[10] K. D. Tocher, et al. The art of simulation, 1967.

[11] J. Hammersley, et al. Monte Carlo Methods, 1965.

[12] A. James. Distributions of Matrix Variates and Latent Roots Derived from Normal Samples, 1964.

[13] A. Barker. Monte Carlo calculations of the radial distribution functions for a proton-electron plasma, 1965.

[14] R. B. Blackman. Data Smoothing and Prediction, Addison-Wesley, 1965.

[15] Norman A. Baily, et al. Computing Methods for Scientists and Engineers, 1968.

[16] D. Fraser. The Structure of Inference, 1969.

[17] R. Gnanadesikan, et al. Probability plotting methods for the analysis of data, Biometrika, 1968.