Meta-Lamarckian learning in memetic algorithms

Over the last decade, memetic algorithms (MAs) have employed a variety of methods as the local improvement procedure. Recent studies have shown that the choice of local search method significantly affects search efficiency. Given the limited theoretical knowledge available in this area and the limited progress made on mitigating the effects of an incorrect local search method choice, we present strategies for MA control that decide, at runtime, which local method is chosen to locally improve the next chromosome. The use of multiple local methods during an MA search in the spirit of Lamarckian learning is here termed Meta-Lamarckian learning. Two adaptive strategies for Meta-Lamarckian learning are proposed in this paper, and experimental studies with these strategies on continuous parametric benchmark problems are presented. Further, the best proposed strategy is applied to a real-world aerodynamic wing design problem, with encouraging results. It is shown that the proposed approaches aid designers working on complex engineering problems by reducing the probability of employing inappropriate local search methods in an MA, while at the same time yielding robust and improved design search performance.
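To make the runtime control idea concrete, below is a minimal sketch of an MA loop that adaptively selects which local search method refines each chromosome. It assumes a simple reward-biased roulette-wheel choice among candidate local searchers; the function names, the decayed-credit reward scheme, and the omission of the global evolutionary operators are illustrative assumptions, not the exact adaptive strategies proposed in the paper.

```python
import random


def meta_lamarckian_ma(population, fitness, local_searchers, generations=50):
    """Sketch of an MA that adaptively picks a local search method per chromosome.

    Assumptions (illustrative, not the paper's exact scheme):
      - `fitness` is to be minimised,
      - `local_searchers` is a list of callables mapping a chromosome to an
        improved chromosome (Lamarckian: the improved genotype is written back),
      - each method's credit is the decayed average fitness gain it produced,
        and credits bias a roulette-wheel choice for the next individual.
    """
    rewards = {ls: 1.0 for ls in local_searchers}  # optimistic initial credit

    def pick_searcher():
        # Roulette-wheel selection proportional to accumulated credit.
        total = sum(rewards.values())
        r, acc = random.uniform(0, total), 0.0
        for ls, w in rewards.items():
            acc += w
            if r <= acc:
                return ls
        return local_searchers[-1]

    for _ in range(generations):
        # Global evolutionary operators (selection, crossover, mutation)
        # would be applied here; omitted to keep the sketch short.
        for i, chrom in enumerate(population):
            ls = pick_searcher()
            improved = ls(chrom)                                   # local refinement
            gain = max(fitness(chrom) - fitness(improved), 0.0)    # improvement (minimisation)
            rewards[ls] = 0.8 * rewards[ls] + 0.2 * gain           # decayed credit update
            population[i] = improved                               # Lamarckian write-back
    return min(population, key=fitness)
```

The design point this illustrates is that the choice of meme is no longer fixed in advance: methods that keep producing fitness gains earn more credit and are selected more often, which reduces the penalty for including a poorly suited local searcher in the pool.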
