This paper establishes a framework for formal comparison of several leading optimization algorithms, providing guidance to practitioners on when a particular method is or is not appropriate. The focus is on five general algorithm forms: random search, simultaneous perturbation stochastic approximation (SPSA), simulated annealing, evolution strategies, and genetic algorithms. We summarize the available theoretical results on rates of convergence for these five algorithm forms and use them to draw preliminary conclusions about relative efficiency. Our aim is to sort out some of the competing claims of efficiency and to suggest a structure for comparison that is more general and transferable than the usual problem-specific numerical studies.
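To make one of the five algorithm forms concrete, the sketch below shows a minimal SPSA loop in Python. It is an illustrative implementation under standard textbook assumptions (power-law gain sequences, Rademacher perturbations), not the specific variant analyzed in the paper; the function names and default gain parameters are choices made here for the example.

```python
import numpy as np

def spsa_minimize(loss, theta0, n_iter=1000, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, seed=0):
    """Minimize `loss` by simultaneous perturbation stochastic
    approximation: each iteration estimates the full gradient from
    only two loss evaluations, regardless of problem dimension."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(n_iter):
        ak = a / (k + 1) ** alpha   # decaying step-size gain
        ck = c / (k + 1) ** gamma   # decaying perturbation gain
        # Rademacher (+/-1) simultaneous perturbation of all coordinates
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        # Two-sided difference along the random direction; dividing by
        # ck * delta yields the simultaneous perturbation gradient estimate
        g_hat = (loss(theta + ck * delta)
                 - loss(theta - ck * delta)) / (2.0 * ck * delta)
        theta = theta - ak * g_hat
    return theta

# Usage: minimize a simple noiseless quadratic from a nonzero start
theta_star = spsa_minimize(lambda t: float(np.sum(t ** 2)), [2.0, -1.5])
```

The key design point, and the reason SPSA scales well with dimension, is that the same two loss evaluations feed every component of the gradient estimate, in contrast to finite differences, which need two evaluations per coordinate.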