Non-redundant genetic coding of neural networks

Feedforward neural networks exhibit a number of functional symmetries: distinct weight configurations that implement exactly the same input-output mapping. These redundancies make the networks difficult to optimise with genetic recombination operators. Although this problem has received considerable attention in the past, the proposed solutions are all heuristic in nature. We discuss a neural network genotype representation that completely eliminates the functional redundancies by transforming each neural network into its canonical form. This transformation is computationally extremely simple, since it only requires flipping the sign of some of the weights, followed by sorting the hidden neurons according to their bias. We have compared the redundant and non-redundant representations on the basis of their crossover correlation coefficient. As expected, the redundancy elimination results in a much higher crossover correlation coefficient, which shows that more information is transmitted from the parents to the children. Finally, experimental results are given for the two-spirals classification problem.
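The canonicalisation described above can be sketched for a one-hidden-layer network with an odd activation such as tanh. The sign-flip symmetry comes from tanh(-x) = -tanh(x): negating a hidden unit's incoming weights and bias while also negating its outgoing weights leaves the network function unchanged. The permutation symmetry is removed by sorting the hidden units. The function name and weight layout below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def canonicalize(W_in, b_hid, W_out):
    """Map a one-hidden-layer tanh network to a canonical form.

    W_in:  (n_hidden, n_input)  incoming weights, one row per hidden unit
    b_hid: (n_hidden,)          hidden biases
    W_out: (n_output, n_hidden) outgoing weights, one column per hidden unit
    """
    W_in, b_hid, W_out = W_in.copy(), b_hid.copy(), W_out.copy()

    # Sign-flip symmetry: for units with a negative bias, negate the
    # incoming weights and bias, and compensate by negating the outgoing
    # weights. Because tanh is odd, the network function is unchanged.
    flip = b_hid < 0
    W_in[flip] *= -1
    b_hid[flip] *= -1
    W_out[:, flip] *= -1

    # Permutation symmetry: order the hidden units by (now nonnegative) bias.
    order = np.argsort(b_hid)
    return W_in[order], b_hid[order], W_out[:, order]
```

After this transformation, two functionally equivalent networks map to the same genotype, so crossover recombines corresponding hidden units rather than arbitrary ones.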
