Analytic study of the memory storage capacity of a neural network

Abstract Previously, we developed a model of short- and long-term memory in a neural network based on an analogy to the Ising spin system. We assumed that the modification of the synaptic strengths depends on the correlation of pre- and post-synaptic neuronal firing; we denote this assumption as the Hebb hypothesis. In this paper, we solve exactly a linearized version of the model and show explicitly that the capacity of the memory is related to the number of synapses rather than to the much smaller number of neurons. In addition, we show that in order to utilize this large capacity, the network must store the major part of the information in the capability to generate patterns which evolve with time. We are also led to a modified Hebb hypothesis.
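The Hebb hypothesis stated above can be illustrated with a minimal numerical sketch. The code below is not the paper's model; it merely shows, under assumed parameters (network size `N`, learning rate `eta`), how a correlation-based update couples pre- and post-synaptic firing in an Ising-like network of +/-1 units, and why the stored information lives in the N x N synaptic matrix rather than in the N neuron states.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50        # number of neurons (assumed value)
steps = 20    # number of discrete time steps
eta = 0.1     # learning rate (hypothetical)

# Random synaptic couplings, in analogy with Ising spin interactions
W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))

# Neuronal firing states as Ising spins: +1 (firing) or -1 (silent)
state = np.sign(rng.normal(size=N))
state[state == 0] = 1.0

for _ in range(steps):
    # Deterministic update: each neuron fires if its net input is positive
    new_state = np.sign(W @ state)
    new_state[new_state == 0] = 1.0
    # Hebb hypothesis: change each synapse in proportion to the
    # correlation of post-synaptic (new_state) and pre-synaptic (state) firing
    W += eta * np.outer(new_state, state) / N
    state = new_state

# The memory trace resides in the N*N synapses, not the N neuron states
print(W.shape)
```

The update traces out a sequence of firing patterns over time, consistent with the abstract's point that the network stores much of its information in its capability to generate time-evolving patterns.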