Information and pattern capacities in neural associative memories with feedback for sparse memory patterns

How should the performance of associative memories in applications be judged? Using information theory, we examine the static structure of memory states and spurious states of a recurrent associative memory after learning. In this framework we consider the critical pattern capacity often used in the literature and introduce the information capacity as a more relevant performance measure for pattern completion. For two types of local learning rule, the Hebb rule and the clipped Hebb rule, our method yields new asymptotic estimates for the information capacity.
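To make the two learning rules named above concrete, the following is a minimal autoassociative sketch of Hebbian versus clipped-Hebbian storage with one feedback retrieval step for pattern completion. The network size, the stored patterns, and the retrieval threshold (the number of active cue units) are illustrative assumptions, not parameters or results from the paper.

```python
import numpy as np

# Illustrative toy setting (our assumption, not the paper's regime):
# n binary neurons storing sparse patterns with k = 3 active units.
n = 9
patterns = np.array([
    [1, 1, 1, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 1, 1, 1],
    [1, 0, 0, 1, 0, 0, 1, 0, 0],  # overlaps the other patterns
])

# Hebb rule: each weight counts how often two units were co-active.
W_hebb = patterns.T @ patterns

# Clipped Hebb rule: binary synapses, recording only whether a
# co-activation occurred at least once.
W_clip = (W_hebb > 0).astype(int)

def complete(W, cue):
    """One feedback (retrieval) step: threshold the dendritic sums
    at the number of active units in the cue."""
    return (W @ cue >= cue.sum()).astype(int)

# Pattern completion: present the first pattern with one unit deleted.
cue = np.array([1, 1, 0, 0, 0, 0, 0, 0, 0])
print(np.array_equal(complete(W_clip, cue), patterns[0]))  # True
print(np.array_equal(complete(W_hebb, cue), patterns[0]))  # True
```

With many more stored patterns, overlaps between them produce the spurious states mentioned above; the paper's information-theoretic analysis quantifies how much retrievable information such a memory holds in the sparse, asymptotic limit.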