Epsilon capacity of neural networks

It is shown that the capacity of neural networks for storing associations under error-tolerant conditions is linear in n, where n is the number of neurons in the network. Error-tolerance is introduced into the retrieval mechanism by specifying components in the retrieved memory which are to be ignored (i.e., treated as don't-cares) by means of a binomial distribution of choice. The epsilon capacity C_ε(n) is defined to be the largest rate of growth of the number of associations that can be stored such that, with high probability, the retrieved memory after one synchronous step differs from the desired associated memory in no more than (essentially) a fraction ε of components. It is shown that for large n, and with 0 ≤ ε < 1/2, the epsilon capacity C_ε(n) is at most 2n/(1−2ε). The result is universal in the sense that any other mode of choosing don't-care components, with essentially a fraction ε of components being don't-cares, will have the same upper bound of 2n/(1−2ε) on capacity. The results are not tied to any particul...
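As a small numerical illustration (the function name and parameter checks below are mine, not from the paper), the stated upper bound 2n/(1−2ε) can be evaluated directly for given n and ε:

```python
def epsilon_capacity_upper_bound(n, eps):
    """Upper bound 2n/(1 - 2*eps) on the number of storable
    associations, valid for 0 <= eps < 1/2 per the abstract."""
    if not (0 <= eps < 0.5):
        raise ValueError("epsilon must satisfy 0 <= eps < 1/2")
    return 2 * n / (1 - 2 * eps)

# With no error tolerance (eps = 0) the bound is 2n; as eps
# approaches 1/2 the bound diverges, reflecting that tolerating
# nearly half the components as don't-cares removes the constraint.
print(epsilon_capacity_upper_bound(1000, 0.0))   # 2000.0
print(epsilon_capacity_upper_bound(1000, 0.25))  # 4000.0
```

Note how the bound grows monotonically in ε: allowing more don't-care components can only increase the number of associations that fit within the error budget.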