SEARCH, polynomial complexity, and the fast messy genetic algorithm

Blackbox optimization (BBO), optimization in the presence of limited knowledge about the objective function, has recently enjoyed a large increase in interest because of demand from practitioners. This has triggered a race for new high-performance algorithms for solving large, difficult problems; simulated annealing, genetic algorithms, and tabu search are some examples. Unfortunately, each of these algorithms has grown into a separate field in itself, and their use in practice is often guided by personal discretion rather than scientific reasoning. The primary reason behind this confusing situation is the lack of any comprehensive understanding of blackbox search. This dissertation takes a step toward clearing some of the confusion. Its main objectives are: (1) to present SEARCH (Search Envisioned As Relation and Class Hierarchizing), an alternate perspective on blackbox optimization, together with a quantitative analysis that lays the foundation essential for transcending the limits of random enumerative search; and (2) to design and test the fast messy genetic algorithm. SEARCH is a general framework for understanding blackbox optimization in terms of relations, classes, and ordering. The primary motivation comes from the observation that sampling in blackbox optimization is essentially an inductive process (Michalski, 1983), and in the absence of any relation among the members of the search space, induction is no better than enumeration. The foundation of SEARCH is a decomposition of BBO into relation, class, and sample spaces. An ordinal, probabilistic, and approximate framework is developed on this foundation to identify the fundamental principles of blackbox optimization essential for transcending the limits of random enumerative search. Bounds on success probability and sample complexity are derived.
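The relation/class/ordering decomposition can be illustrated with a toy sketch (this is an illustrative reading of the framework, not the dissertation's algorithm; the function names, the onemax objective, and all parameters are assumptions made here for concreteness). A relation is taken as an equivalence over a chosen subset of bit positions; each class fixes those bits, and the classes are ordered by the quality observed in a small random sample:

```python
import random

def evaluate_relation(objective, positions, n_bits, samples_per_class=8):
    """Toy SEARCH-style class comparison: a relation is an equivalence
    over a subset of bit positions; each class fixes those bits, and
    classes are ranked by the best objective value seen among a few
    random samples drawn from each class."""
    classes = {}
    for pattern in range(2 ** len(positions)):
        fixed = {p: (pattern >> i) & 1 for i, p in enumerate(positions)}
        best = None
        for _ in range(samples_per_class):
            x = [random.randint(0, 1) for _ in range(n_bits)]
            for p, b in fixed.items():
                x[p] = b          # project the sample into this class
            v = objective(x)
            best = v if best is None or v > best else best
        classes[tuple(sorted(fixed.items()))] = best
    # order the classes by sampled quality -- the "ordering" in SEARCH
    return sorted(classes.items(), key=lambda kv: kv[1], reverse=True)

# onemax (count of ones) over 10 bits, relation over positions (0, 1):
# with high probability the class fixing both bits to 1 ranks near the top
random.seed(0)
ranking = evaluate_relation(lambda x: sum(x), (0, 1), n_bits=10)
print(ranking)
```

The point of the sketch is the decomposition itself: the relation (which positions matter), the classes it induces (the four bit patterns), and the sample space (the random strings) are separated, so decision making ranks classes rather than individual points.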
I explicitly consider specific blackbox algorithms such as simulated annealing and genetic algorithms, and demonstrate that the fundamental computations in all of them can be captured using SEARCH. SEARCH also offers an alternate perspective on natural evolution that establishes the computational role of gene expression (DNA $\to$ RNA $\to$ protein) in evolution. This model of evolutionary computation hypothesizes a possible mapping of the relation, class, and sample spaces of SEARCH onto transcriptional regulatory mechanisms, proteins, and DNA, respectively. The second part of this dissertation starts by noting the limitations of simple GAs, which fail to properly search for relations and make decision making very noisy by conflating the relation, class, and sample spaces. Messy genetic algorithms (Goldberg, Korb, & Deb, 1989; Deb, 1991) are among the few algorithms that emphasize the search for relations. Despite this strength, messy GAs lack the complete benefits of implicit parallelism (Holland, 1975). The fast messy GA, introduced by Goldberg, Deb, Kargupta, and Harik (1993), brings some of the benefits of implicit parallelism into the messy GA without sacrificing much of its other strengths. This dissertation investigates fast messy GAs and presents test results demonstrating their performance on order-$k$ delineable problems.
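Two mechanisms that distinguish messy GAs from simple GAs can be sketched in miniature (a toy under stated assumptions, not the fast messy GA itself; the function names and the first-come-first-served conflict rule shown are the classic messy-GA conventions, but every parameter here is illustrative). Chromosomes are variable-length lists of (locus, allele) pairs, recombined by cut-and-splice, and under- or over-specified strings are expressed against a template:

```python
import random

def cut_and_splice(parent_a, parent_b, cut_prob=0.5):
    """Toy messy-GA recombination: cut each variable-length parent at a
    random point (with some probability), then splice the pieces, so
    offspring lengths can grow or shrink."""
    def cut(chrom):
        if len(chrom) > 1 and random.random() < cut_prob:
            point = random.randint(1, len(chrom) - 1)
            return chrom[:point], chrom[point:]
        return chrom, []
    a_head, a_tail = cut(parent_a)
    b_head, b_tail = cut(parent_b)
    return a_head + b_tail, b_head + a_tail

def express(chrom, template):
    """First-come-first-served expression: the earliest gene for a locus
    wins conflicts (over-specification); loci the chromosome never
    mentions are filled in from the template (under-specification)."""
    x = list(template)
    seen = set()
    for locus, allele in chrom:
        if locus not in seen:
            x[locus] = allele
            seen.add(locus)
    return x

random.seed(1)
a = [(0, 1), (2, 1), (0, 0)]   # over-specified: locus 0 appears twice
b = [(1, 1), (3, 0)]           # under-specified: loci 0 and 2 missing
child1, child2 = cut_and_splice(a, b)
print(express(child1, template=[0, 0, 0, 0]))
```

Because genes carry their locus explicitly, a chromosome can name exactly the positions a candidate relation covers, which is what lets messy GAs search for relations directly instead of relying on fixed positional linkage.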

[1]  William Feller,et al.  An Introduction to Probability Theory and Its Applications , 1950 .

[2]  F. Young Biochemistry , 1955, The Indian Medical Gazette.

[3]  J. Monod,et al.  Genetic regulatory mechanisms in the synthesis of proteins. , 1961, Journal of molecular biology.

[4]  Lawrence J. Fogel,et al.  Artificial Intelligence through Simulated Evolution , 1966 .

[5]  John Daniel. Bagley,et al.  The behavior of adaptive systems which employ genetic and correlation algorithms : technical report , 1967 .

[6]  R. Rosenberg Simulation of genetic populations with biochemical properties : technical report , 1967 .

[7]  Nils J. Nilsson,et al.  A Formal Basis for the Heuristic Determination of Minimum Cost Paths , 1968, IEEE Trans. Syst. Sci. Cybern..

[9]  K. Dejong,et al.  An analysis of the behavior of a class of genetic adaptive systems , 1975 .

[10]  John H. Holland,et al.  Adaptation in natural and artificial systems , 1975 .

[11]  I. Olkin,et al.  Selecting and Ordering Populations: A New Statistical Methodology , 1977 .

[12]  Jacobus P. H. Wessels,et al.  The Art and Theory of Dynamic Programming , 1979 .

[13]  Temple F. Smith Occam's razor , 1980, Nature.

[14]  Anne Brindle,et al.  Genetic algorithms for function optimization , 1980 .

[15]  Nesa L'abbe Wu,et al.  Linear programming and extensions , 1981 .

[16]  Lashon B. Booker,et al.  Intelligent Behavior as an Adaptation to the Task Environment , 1982 .

[17]  J. Davies,et al.  Molecular Biology of the Cell , 1983, Bristol Medico-Chirurgical Journal.

[18]  C. D. Gelatt,et al.  Optimization by Simulated Annealing , 1983, Science.

[19]  Leslie G. Valiant,et al.  A theory of the learnable , 1984, STOC '84.

[20]  Francesco Archetti,et al.  A survey on the global optimization problem: General theory and computational approaches , 1984, Ann. Oper. Res..

[21]  Alexander H. G. Rinnooy Kan,et al.  Stochastic methods for global optimization , 1984 .

[22]  David E. Goldberg,et al.  Alleles, loci and the traveling salesman problem , 1985 .

[23]  D. E. Goldberg,et al.  Simple Genetic Algorithms and the Minimal, Deceptive Problem , 1987 .

[24]  David J. Sirag,et al.  Toward a unified thermodynamic genetic operator , 1987 .

[25]  Lawrence Davis,et al.  Genetic Algorithms and Simulated Annealing , 1987 .

[26]  D. Ackley A connectionist machine for genetic hillclimbing , 1987 .

[27]  David E. Goldberg,et al.  Genetic Algorithms in Search Optimization and Machine Learning , 1988 .

[28]  David Haussler,et al.  Quantifying Inductive Bias: AI Learning Algorithms and Valiant's Learning Framework , 1988, Artif. Intell..

[29]  Gilbert Syswerda,et al.  Uniform Crossover in Genetic Algorithms , 1989, ICGA.

[30]  J. Mockus,et al.  The Bayesian approach to global optimization , 1989 .

[31]  John J. Grefenstette,et al.  How Genetic Algorithms Work: A Critical Look at Implicit Parallelism , 1989, ICGA.

[32]  Kalyanmoy Deb,et al.  Messy Genetic Algorithms: Motivation, Analysis, and First Results , 1989, Complex Syst..

[33]  Fred W. Glover,et al.  Tabu Search - Part I , 1989, INFORMS J. Comput..

[34]  Kalyanmoy Deb,et al.  Messy Genetic Algorithms Revisited: Studies in Mixed Size and Scale , 1990, Complex Syst..

[35]  David E. Goldberg,et al.  A Note on Boltzmann Tournament Selection for Genetic Algorithms and Population-Oriented Simulated Annealing , 1990, Complex Syst..

[36]  D. Goldberg,et al.  An investigation of messy genetic algorithms , 1990 .

[37]  Gerhard W. Dueck,et al.  Threshold accepting: a general purpose optimization algorithm appearing superior to simulated annealing , 1990 .

[38]  Kalyanmoy Deb,et al.  A Comparative Analysis of Selection Schemes Used in Genetic Algorithms , 1990, FOGA.

[39]  S. Vavasis Nonlinear optimization: complexity issues , 1991 .

[40]  K. Deb Binary and floating-point function optimization using messy genetic algorithms , 1991 .

[41]  Nicholas J. Radcliffe,et al.  Forma Analysis and Random Respectful Recombination , 1991, ICGA.

[42]  Fabio Schoen,et al.  Stochastic techniques for global optimization: A survey of recent advances , 1991, J. Glob. Optim..

[43]  Kalyanmoy Deb,et al.  Genetic Algorithms, Noise, and the Sizing of Populations , 1992, Complex Syst..

[44]  Nicholas J. Radcliffe,et al.  Genetic Set Recombination , 1992, FOGA.

[45]  Kalyanmoy Deb,et al.  Accounting for Noise in the Sizing of Populations , 1992, FOGA.

[46]  David E. Goldberg,et al.  A Genetic Algorithm for Parallel Simulated Annealing , 1992, PPSN.

[47]  B. K. Natarajan  Machine Learning: A Theoretical Approach , 1992, IEEE Expert.

[48]  Kenneth A. De Jong,et al.  Are Genetic Algorithms Function Optimizers? , 1992, PPSN.

[49]  H. Kargupta Drift, Diffusion And Boltzmann Distribution In Simple Genetic Algorithm , 1992, Workshop on Physics and Computation.

[50]  Kalyanmoy Deb,et al.  Ordering Genetic Algorithms and Deception , 1992, PPSN.

[51]  Kalyanmoy Deb,et al.  Analyzing Deception in Trap Functions , 1992, FOGA.

[52]  Andrew Dymek An Examination of Hypercube Implementations of Genetic Algorithms , 1992 .

[53]  Lashon B. Booker,et al.  Recombination Distributions for Genetic Algorithms , 1992, FOGA.

[54]  Melanie Mitchell,et al.  Relative Building-Block Fitness and the Building Block Hypothesis , 1992, FOGA.

[55]  Günter Rudolph,et al.  Massively Parallel Simulated Annealing and Its Relation to Evolutionary Algorithms , 1993, Evolutionary Computation.

[56]  Gary B. Lamont,et al.  Comparison of Parallel Messy Genetic Algorithm Data Distribution Strategies , 1993, ICGA.

[57]  Afonso Ferreira,et al.  BOUNDING THE PROBABILITY OF SUCCESS OF STOCHASTIC METHODS FOR GLOBAL OPTIMIZATION , 1993 .

[58]  Kalyanmoy Deb,et al.  Rapid, Accurate Optimization of Difficult Problems Using Fast Messy Genetic Algorithms , 1993, ICGA.

[59]  Dirk Thierens,et al.  Toward a Better Understanding of Mixing in Genetic Algorithms , 1993 .

[60]  Hillol Kargupta Information Transmission in Genetic Algorithm and Shannon's Second Theorem , 1993, ICGA.

[61]  Dirk Thierens,et al.  Mixing in Genetic Algorithms , 1993, ICGA.

[62]  Ryszard S. Michalski,et al.  A theory and methodology of inductive learning , 1993 .

[63]  Hillol Kargupta,et al.  Temporal sequence processing based on the biological reaction-diffusion process , 1994, Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94).

[64]  W. Hart Adaptive global optimization with local search , 1994 .

[65]  D. Goldberg  Genetic Algorithm Difficulty and the Modality of Fitness Landscapes , 1994 .

[66]  G. Rudolph  Massively Parallel Simulated Annealing and Its Relation to Evolutionary Algorithms , 1994 .

[67]  Terry Jones,et al.  A Description of Holland's Royal Road Function , 1994, Evolutionary Computation.

[68]  Joseph C. Culberson,et al.  Mutation-Crossover Isomorphisms and the Construction of Discriminating Functions , 1994, Evolutionary Computation.

[69]  David E. Goldberg  Decision Making in Genetic Algorithms: A Signal-to-Noise Perspective , 1994 .

[70]  Sylvian R. Ray,et al.  A Temporal Sequence Processor Based on the Biological Reaction-diffusion Process , 1993, Complex Syst..

[71]  Hillol Kargupta,et al.  Signal-to-noise, Crosstalk, and Long Range Problem Difficulty in Genetic Algorithms , 1995, ICGA.

[72]  Terry Jones,et al.  Fitness Distance Correlation as a Measure of Problem Difficulty for Genetic Algorithms , 1995, ICGA.

[73]  Terry Jones  Evolutionary Algorithms, Fitness Landscapes and Search , 1995 .

[74]  M. R. Rao,et al.  Combinatorial Optimization , 1992, NATO ASI Series.

[75]  H. Mühlenbein  How Genetic Algorithms Really Work: I. Mutation and Hillclimbing , 1992 .