Fast Evolutionary Learning with Batch-Type Self-Organizing Maps

Even when no distance function is definable over the input data, the self-organizing map (SOM) process can still be implemented using evolutionary-learning operations. The process converges more rapidly when the probabilistic trials of conventional evolutionary learning are replaced by averaging, using the so-called Batch Map version of the SOM. Although nothing other than a fitness function between the input samples and the models is assumed, an order that reflects the ‘functional similarity’ of the models emerges in the map. This new principle has two modes of use: representation of nonmetric input-data distributions by models that may have variable structures, and fast generation of evolutionary cycles that resemble those of genetic algorithms. The spatial order in the array of models can be exploited to find more uniform variations, such as crossovers between functionally similar models.
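To make the principle concrete, the following is a minimal sketch of a batch-type SOM in which the winner for each sample is selected by an arbitrary fitness function rather than by a distance on the inputs. The sketch assumes the models are real vectors so that the batch averaging step is defined; the function and parameter names (batch_fitness_som, sigma0, and so on) are illustrative, not taken from the paper.

```python
import numpy as np

def batch_fitness_som(samples, n_models, fitness, n_epochs=20,
                      sigma0=2.0, seed=0):
    """Batch-type SOM on a 1-D array of models.

    Matching uses only a fitness(sample, model) score, so no metric
    over the inputs is required; the batch update replaces the
    probabilistic trials of conventional evolutionary learning with
    a neighborhood-weighted average of the mapped samples.
    """
    rng = np.random.default_rng(seed)
    dim = samples.shape[1]
    models = rng.standard_normal((n_models, dim))   # random initial models
    grid = np.arange(n_models)                      # 1-D map topology

    for epoch in range(n_epochs):
        # Neighborhood width shrinks over the epochs.
        sigma = sigma0 * (1.0 - epoch / n_epochs) + 0.5
        # Winner search: the model with the HIGHEST fitness wins
        # each sample (not the smallest distance).
        winners = np.array([max(grid, key=lambda i: fitness(x, models[i]))
                            for x in samples])
        # Batch update: each model becomes the neighborhood-weighted
        # mean of all samples, computed in one step per epoch.
        h = np.exp(-0.5 * ((grid[:, None] - winners[None, :]) / sigma) ** 2)
        models = (h @ samples) / np.maximum(h.sum(axis=1, keepdims=True), 1e-12)
    return models

# Usage: with fitness defined as negative squared error, the sketch
# reduces to the ordinary Batch Map on vectorial data.
rng = np.random.default_rng(1)
samples = rng.standard_normal((500, 2))
fitness = lambda x, m: -float(np.sum((x - m) ** 2))
models = batch_fitness_som(samples, n_models=10, fitness=fitness)
```

Substituting any task-specific score for the fitness function, for example the measured performance of a parameterized controller, yields the evolutionary variant described above, with neighboring models in the array coming to represent functionally similar solutions.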