Neural networks with nonlinear synapses and static noise.

The theory of neural networks is extended to include static noise as well as nonlinear updating of the synapses during learning. The noise enters either as spin-glass interactions, which are independent of the learning process, or as random decay of the synapses. In an unsaturated network, nonlinear learning algorithms may modify the energy surface and lead to new computational capabilities. Close to saturation, they act as an additional source of static noise. The effect of this noise on memory storage is calculated.
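
As a concrete illustration of the kind of model described above (a sketch, not the paper's calculation): a Hopfield-type network whose Hebbian couplings pass through a clipping nonlinearity, one possible choice of the nonlinear synaptic update, plus an independent symmetric Gaussian matrix standing in for the static spin-glass noise. The sizes N and p, the noise amplitude, the clipping choice, and the flip fraction are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500          # number of neurons (spins)
p = 10           # number of stored patterns; p << N, i.e. unsaturated
noise_amp = 0.5  # strength of the static spin-glass noise (assumed value)

# Random +/-1 memory patterns xi^mu.
xi = rng.choice([-1.0, 1.0], size=(p, N))

# Nonlinear updating of synapses: clip each Hebbian sum to its sign
# (one illustrative choice of nonlinearity; the theory treats a general f).
J = np.sign(xi.T @ xi)

# Static noise, independent of learning: a symmetric Gaussian matrix.
G = rng.normal(size=(N, N))
J += noise_amp * (G + G.T) / np.sqrt(2.0)
# Alternative noise source: random decay (dilution) of synapses, e.g.
# J *= mask, with a symmetric 0/1 mask that prunes a random fraction.
np.fill_diagonal(J, 0.0)

def retrieve(J, s, sweeps=20):
    """Zero-temperature asynchronous dynamics: s_i <- sign(sum_j J_ij s_j)."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1.0 if J[i] @ s >= 0.0 else -1.0
    return s

# Start from pattern 0 with 10% of the bits flipped and let the network relax.
flips = np.where(rng.random(N) < 0.1, -1.0, 1.0)
final = retrieve(J, xi[0] * flips)
print(f"overlap with stored pattern: {final @ xi[0] / N:.3f}")
```

Far below saturation the final overlap should stay close to 1; raising noise_amp or p pushes the network toward the noise-dominated regime near saturation whose storage properties the paper computes.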