Evolving networks: using the genetic algorithm with connectionist learning

It is appealing to consider hybrids of neural-network learning algorithms with evolutionary search procedures, simply because Nature has so successfully done so. In fact, computational models of learning and evolution offer theoretical biology new tools for addressing questions about Nature that have dogged that field since Darwin [Belew, 1990]. The concern of this paper, however, is strictly artificial: Can hybrids of connectionist learning algorithms and genetic algorithms produce more efficient and effective algorithms than either technique applied in isolation? The paper begins with a survey of recent work (by us and others) that combines Holland's Genetic Algorithm (GA) with connectionist techniques and delineates some of the basic design problems these hybrids share. This analysis suggests the dangers of overly literal representations of the network on the genome (e.g., encoding each weight explicitly). A preliminary set of experiments that use the GA to find unusual but successful values for BP parameters (learning rate, momentum) is also reported. The focus of the report is a series of experiments that use the GA to explore the space of initial weight values, from which two different gradient techniques (conjugate gradient and back propagation) are then allowed to optimize. We find that use of the GA provides much greater confidence in the face of the stochastic variation that can plague gradient techniques, and can also allow training times to be reduced by as much as two orders of magnitude. Computational trade-offs between BP and the GA are considered, including discussion of a software facility that exploits the parallelism inherent in GA/BP hybrids. This evidence leads us to conclude that the GA's global sampling characteristics complement connectionist local search techniques well, leading to efficient and reliable hybrids.
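The experimental loop described above — a GA sampling the space of initial weight vectors, with a short burst of backpropagation serving as each genome's fitness evaluation — can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's code: the network size, the GA operators (truncation selection, one-point crossover, Gaussian mutation), and all hyperparameters are assumptions, and XOR stands in for whatever training tasks were actually used.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR task as a stand-in benchmark.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

N_HID = 2
N_W = 2 * N_HID + N_HID + N_HID + 1   # W1 (2x2) + b1 (2) + W2 (2x1) + b2 (1) = 9

def unpack(w):
    """Split a flat genome into the 2-2-1 network's weight matrices."""
    return (w[:4].reshape(2, N_HID), w[4:6],
            w[6:8].reshape(N_HID, 1), w[8:])

def loss_after_bp(w, epochs=200, lr=0.5):
    """Fitness: sum-squared error after a short run of plain backprop,
    starting from the genome's initial weights."""
    W1, b1, W2, b2 = (a.copy() for a in unpack(w))
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)
        dY = (Y - T) * Y * (1 - Y)          # output-layer delta
        dH = (dY @ W2.T) * H * (1 - H)      # hidden-layer delta
        W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(0)
        W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)
    Y = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(((Y - T) ** 2).sum())

def ga(pop_size=20, gens=15, sigma=0.3, mut_rate=0.1):
    pop = rng.uniform(-1, 1, (pop_size, N_W))
    for _ in range(gens):
        fit = np.array([loss_after_bp(w) for w in pop])
        parents = pop[np.argsort(fit)[:pop_size // 2]]   # lower loss = fitter
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, N_W)                    # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child += rng.normal(0, sigma, N_W) * (rng.random(N_W) < mut_rate)
            children.append(child)
        pop = np.vstack([parents, children])
    fit = np.array([loss_after_bp(w) for w in pop])
    return pop[np.argmin(fit)], float(fit.min())

best, best_loss = ga()
```

The point of evaluating fitness *after* the gradient burst is the division of labor the abstract argues for: the GA samples the weight space globally, shielding the result from runs where gradient descent stalls in a poor basin, while backprop does the fine-grained local search.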

[1]  Lawrence J. Fogel,et al.  Artificial Intelligence through Simulated Evolution , 1966 .

[2]  John H. Holland,et al.  Adaptation in natural and artificial systems , 1975 .

[3]  Albert Donally Bethke,et al.  Genetic Algorithms as Function Optimizers , 1980 .

[4]  K. De Jong Adaptive System Design: A Genetic Approach , 1980, IEEE Transactions on Systems, Man, and Cybernetics.

[5]  R. Brady Optimization strategies gleaned from biological evolution , 1985, Nature.

[6]  Geoffrey E. Hinton,et al.  Learning internal representations by error propagation , 1986 .

[7]  M. Rizki,et al.  Computing the theory of evolution , 1986 .

[8]  S. Kauffman,et al.  Adaptive automata based on Darwinian selection , 1986 .

[9]  J. David Schaffer,et al.  An Adaptive Crossover Distribution Mechanism for Genetic Algorithms , 1987, ICGA.

[10]  D. Ackley A connectionist machine for genetic hillclimbing , 1987 .

[11]  David E. Goldberg,et al.  Genetic Algorithms in Search Optimization and Machine Learning , 1988 .

[12]  J. David Schaffer,et al.  Representation and Hidden Bias: Gray vs. Binary Coding for Genetic Algorithms , 1988, ML.

[13]  D. Purves Body and Brain: A Trophic Theory of Neural Connections , 1988 .

[14]  Marvin Minsky,et al.  Perceptrons: expanded edition , 1988 .

[15]  Paulien Hogeweg,et al.  Genetic Algorithms and Information Accumulation during the Evolution of Gene Regulation , 1989, ICGA.

[16]  M. F. Tenorio,et al.  Self-Organizing Neural Network for Optimum Supervised Learning , 1989 .

[17]  Jack D. Cowan,et al.  Development and Regeneration of Eye-Brain Maps: A Computational Model , 1989, NIPS.

[18]  Tariq Samad,et al.  Towards the Genetic Synthesis of Neural Networks , 1989, ICGA.

[19]  L. Darrell Whitley,et al.  Optimizing Neural Networks Using Faster, More Accurate Genetic Search , 1989, ICGA.

[20]  S. Smoliar Neural darwinism: The theory of neuronal group selection: Gerald M. Edelman, (Basic Books; New York, 1987); xxii + 371 pages , 1989 .

[21]  Ronald A. Cole,et al.  A neural-net training program based on conjugate-gradient optimization , 1989 .

[22]  Peter M. Todd,et al.  Designing Neural Networks using Genetic Algorithms , 1989, ICGA.

[23]  John F. Kolen,et al.  Backpropagation is Sensitive to Initial Conditions , 1990, Complex Syst..

[24]  Stephen José Hanson,et al.  What connectionist models learn: Learning and representation in connectionist networks , 1990, Behavioral and Brain Sciences.

[25]  Richard K. Belew,et al.  Evolution, Learning, and Culture: Computational Metaphors for Adaptive Algorithms , 1990, Complex Syst..

[26]  W. Daniel Hillis,et al.  Co-evolving parasites improve simulated evolution as an optimization procedure , 1990 .

[27]  R. E. Uhrig,et al.  A stochastic learning algorithm for layered neural networks , 1992 .