Competitive learning is an unsupervised algorithm that classifies input patterns into mutually exclusive clusters. In a neural net framework, each cluster is represented by a processing unit that competes with others in a winner-take-all pool for an input pattern. I present a simple extension to the algorithm that allows it to construct discrete, distributed representations. Discrete representations are useful because they are relatively easy to analyze and their information content can readily be measured. Distributed representations are useful because they explicitly encode similarity. The basic idea is to apply competitive learning iteratively to an input pattern and, after each stage, to subtract from the input pattern the component that was captured in the representation at that stage. This component is simply the weight vector of the winning unit of the competitive pool. The subtraction procedure forces competitive pools at different stages to encode different aspects of the input. The algorithm is essentially the same as a traditional data-compression technique known as multistep vector quantization, although the neural net perspective suggests potentially powerful extensions to that approach.
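The iterative scheme described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the number of stages and units, and the simple winner-moves-toward-input update rule are my own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def winner(codebook, x):
    # Winner-take-all: index of the weight vector closest to x.
    return int(np.argmin(np.linalg.norm(codebook - x, axis=1)))

def train(data, n_stages=2, units_per_stage=4, epochs=20, lr=0.1):
    """Train one competitive pool (codebook) per stage. After each stage
    the winner's weight vector is subtracted from the input, so later
    pools are driven to encode aspects the earlier ones missed."""
    dim = data.shape[1]
    codebooks = [rng.normal(scale=0.1, size=(units_per_stage, dim))
                 for _ in range(n_stages)]
    for _ in range(epochs):
        for x in data:
            residual = x.copy()
            for codebook in codebooks:
                w = winner(codebook, residual)
                # Standard competitive learning: move the winning unit
                # toward the (residual) input pattern.
                codebook[w] += lr * (residual - codebook[w])
                residual = residual - codebook[w]
    return codebooks

def encode(codebooks, x):
    """Discrete, distributed code for x: one winning-unit index per stage,
    with the captured component subtracted out after each stage."""
    residual = x.copy()
    code = []
    for codebook in codebooks:
        w = winner(codebook, residual)
        code.append(w)
        residual = residual - codebook[w]
    return code, residual
```

The resulting code is discrete (a short tuple of unit indices, one per stage) yet distributed (each stage contributes part of the description), and reconstructing the input by summing the indexed weight vectors is exactly multistep vector quantization.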