A generalized convergence theorem for neural networks

A neural network model is presented in which each neuron performs a threshold logic function. The model always converges to a stable state when operating in a serial mode and to a cycle of length at most 2 when operating in a fully parallel mode. This property is the basis for the potential applications of the model, such as associative memory devices and combinatorial optimization. The two convergence theorems (for serial and fully parallel modes of operation) are reviewed, and a general convergence theorem is presented that unifies the two known cases. New relations between the neural network model and the problem of finding a minimum cut in a graph are obtained.
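The serial-mode convergence described above can be illustrated with a minimal sketch. The function and variable names below are illustrative, not from the paper; the sketch assumes the standard setting for such results (a symmetric weight matrix with nonnegative diagonal, threshold updates applied to one neuron at a time), under which each serial update does not increase the network's energy, so the network must reach a stable state.

```python
import numpy as np

def serial_converge(W, T, s, max_sweeps=1000):
    """Apply serial (one-neuron-at-a-time) threshold updates until stable.

    W : symmetric weight matrix with nonnegative diagonal (assumption)
    T : threshold vector
    s : initial state vector with entries in {-1, +1}
    Returns a stable state: no single update changes any neuron.
    """
    s = s.copy()
    n = len(s)
    for _ in range(max_sweeps):
        changed = False
        for i in range(n):
            # Threshold logic: neuron i fires iff its net input meets T[i].
            new_si = 1 if W[i] @ s - T[i] >= 0 else -1
            if new_si != s[i]:
                s[i] = new_si
                changed = True
        if not changed:
            return s  # stable state reached
    raise RuntimeError("did not stabilize within max_sweeps")

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
W = A + A.T              # symmetric weights
np.fill_diagonal(W, 0)   # zero (hence nonnegative) diagonal
T = np.zeros(n)
s0 = rng.choice([-1, 1], size=n)

s = serial_converge(W, T, s0)
# Verify stability: every neuron already agrees with its threshold rule.
assert all((1 if W[i] @ s - T[i] >= 0 else -1) == s[i] for i in range(n))
```

In fully parallel mode, by contrast, all neurons update simultaneously from the same previous state, and the convergence result weakens to a cycle of length at most 2 rather than a fixed point.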
