Constraints on learning in dynamic synapses

Hebbian-type learning is discussed in a network whose synapses are analogue, dynamic variables; their values must be periodically refreshed because of exponential decay, or other instabilities, of the continuous synaptic efficacies. It is shown that the end product of learning in such networks is very sensitive to the relation between the rate at which patterns are presented and the length of the refresh interval. In the limit of slow presentation, the network can learn at most O(ln N) patterns in N neurons and must learn each one in one shot, thus also learning any errors present in a corrupted stimulus presented for retrieval. As the presentation rate is increased, performance improves rapidly. We also investigate the case in which the refresh mechanism acts stochastically; in this case learning can be slowed down very significantly, but the number of stored patterns cannot surpass √N.
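The setting described above can be made concrete with a small numerical sketch. The Python fragment below is purely illustrative and rests on assumptions not stated in the abstract: a Hopfield-style network of ±1 neurons, outer-product Hebbian increments, a uniform exponential decay of the analogue efficacies, and a refresh step that resets each efficacy to the nearest of a few discrete levels. All names and parameter values (N, P, decay, refresh_every, levels) are hypothetical.

    import numpy as np

    # Illustrative sketch (assumed parameters throughout): N neurons, analogue
    # synaptic matrix J that decays exponentially between Hebbian updates and is
    # periodically "refreshed" by resetting each entry to the nearest discrete level.
    rng = np.random.default_rng(0)

    N = 200                 # number of neurons (assumed)
    P = 10                  # number of presented patterns (assumed)
    decay = 0.95            # exponential decay factor per presentation (assumed)
    refresh_every = 5       # refresh interval, in presentations (assumed)
    levels = np.array([-1.0, 0.0, 1.0])  # discrete refresh levels (assumed)

    patterns = rng.choice([-1, 1], size=(P, N)).astype(float)
    J = np.zeros((N, N))

    for t, xi in enumerate(patterns, start=1):
        # Hebbian outer-product increment from the presented pattern
        J += np.outer(xi, xi)
        np.fill_diagonal(J, 0.0)

        # Synaptic dynamics: exponential decay, then periodic refresh
        J *= decay
        if t % refresh_every == 0:
            # reset each efficacy to the nearest allowed level
            J = levels[np.abs(J[..., None] - levels).argmin(axis=-1)]

    # Retrieval test: present a corrupted copy of the most recent pattern
    probe = patterns[-1] * np.where(rng.random(N) < 0.1, -1, 1)  # ~10% flipped bits
    for _ in range(20):
        probe = np.sign(J @ probe)
        probe[probe == 0] = 1

    overlap = np.mean(probe * patterns[-1])
    print(f"overlap with most recent pattern: {overlap:.2f}")

Varying refresh_every relative to the presentation rate in such a sketch is one way to see the sensitivity the abstract refers to: frequent refreshes discard the small analogue increments accumulated between them, while rare refreshes let decay erase older traces.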