CAM Storage of Analog Patterns and Continuous Sequences with 3N² Weights

A simple architecture and algorithm for analytically guaranteed associative memory storage of analog patterns, continuous sequences, and chaotic attractors in the same network is described. A matrix inversion determines the network weights, given the prototype patterns to be stored. There are N units of capacity in an N node network with 3N² weights. It costs one unit per static attractor, two per Fourier component of each sequence, and four per chaotic attractor. There are no spurious attractors, and there is a Liapunov function in a special coordinate system which governs the approach of transient states to stored trajectories. Unsupervised or supervised incremental learning algorithms for pattern classification, such as competitive learning or bootstrap Widrow-Hoff, can easily be implemented. The architecture can be "folded" into a recurrent network with higher order weights that can be used as a model of cortex that stores oscillatory and chaotic attractors by a Hebb rule. Hierarchical sensory-motor control networks may be constructed of interconnected "cortical patches" of these network modules. Network performance is being investigated by application to the problem of real time handwritten digit recognition.
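The core claim that a single matrix inversion determines the weights from the prototype patterns can be illustrated with a minimal sketch. This is not the paper's full normal-form projection construction; it only shows the assumed basic step, that solving one pseudo-inverse problem yields a weight matrix under which every stored prototype is preserved as a fixed direction (all names and dimensions below are illustrative).

```python
import numpy as np

# Minimal sketch, assuming the storage step reduces to one linear solve:
# stack P prototype analog patterns as the columns of X (N x P, P <= N)
# and compute projection weights W with a single pseudo-inverse, so that
# W @ x_i = x_i for every stored prototype x_i.
rng = np.random.default_rng(0)
N, P = 8, 3                       # network size and number of prototypes
X = rng.standard_normal((N, P))   # prototype patterns (columns)
W = X @ np.linalg.pinv(X)         # weights from one matrix (pseudo-)inversion
print(np.allclose(W @ X, X))      # every prototype is mapped to itself
```

Because W is the orthogonal projector onto the span of the prototypes, it is idempotent, and any input is collapsed onto the stored pattern subspace in one application; the paper's full algorithm additionally programs the nonlinear dynamics so that the stored patterns become attractors rather than mere invariant directions.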
