Evolutionary Induction of Sparse Neural Trees

This paper addresses the automatic induction of parsimonious neural networks. Unlike other program induction tasks, network induction entails parametric learning as well as structural adaptation. We present a novel representation scheme, called neural trees, that allows efficient learning of both network architectures and parameters by genetic search. A hybrid evolutionary method for neural tree induction is developed that combines genetic programming and the breeder genetic algorithm under the unified framework of the minimum description length principle. The method successfully induces higher-order neural trees while keeping the resulting structures sparse, ensuring good generalization performance. Empirical results are provided for two chaotic time-series prediction problems of practical interest.
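The abstract's combination of structure and parameter search under a minimum-description-length criterion can be illustrated with a minimal sketch. The tree encoding below (sigma units as weighted sums, pi units as weighted products, leaves as input variables) and the fitness form "error plus a size penalty weighted by `alpha`" are illustrative assumptions, not the paper's exact formulation:

```python
import math

# Illustrative neural-tree encoding: internal nodes are 'sigma' (weighted
# sum) or 'pi' (weighted product) units; 'var' leaves index the input vector.
class NeuralTree:
    def __init__(self, kind, children=None, weights=None, var=None):
        self.kind = kind              # 'sigma', 'pi', or 'var'
        self.children = children or []
        self.weights = weights or []  # one weight per child
        self.var = var                # input index for 'var' leaves

    def evaluate(self, x):
        if self.kind == 'var':
            return x[self.var]
        vals = [w * c.evaluate(x) for w, c in zip(self.weights, self.children)]
        if self.kind == 'sigma':
            return math.tanh(sum(vals))
        prod = 1.0                    # 'pi' unit: product of weighted inputs
        for v in vals:
            prod *= v
        return math.tanh(prod)

    def size(self):
        # Node count stands in for the description length of the structure.
        return 1 + sum(c.size() for c in self.children)

def mdl_fitness(tree, data, alpha=0.01):
    """MDL-style fitness: mean squared error plus a complexity penalty.
    The weighting constant alpha is an assumed, hand-picked value."""
    err = sum((y - tree.evaluate(x)) ** 2 for x, y in data) / len(data)
    return err + alpha * tree.size()
```

In an evolutionary loop, `mdl_fitness` would be minimized: genetic programming operators vary the tree structure, while breeder-GA-style recombination and mutation adjust the weights, and the size penalty biases the search toward sparse trees.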
