Unsupervised and supervised classifications by rival penalized competitive learning

For the classical k-means clustering algorithm, selecting an appropriate k is a hard problem that strongly affects performance. When used for clustering analysis, conventional competitive learning (CL) algorithms share a similar crucial problem: the selection of an appropriate number of neural units. The performance of frequency sensitive competitive learning (FSCL), an improved variant of CL, also deteriorates significantly when the number of units is inappropriately chosen. This paper proposes a new algorithm called rival penalized competitive learning (RPCL), which automatically allocates an appropriate number of units for an input data set. The experimental results show that RPCL significantly outperforms FSCL both in unsupervised classification and in supervised classification through the radial basis function net.
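
To make the idea concrete, below is a minimal sketch of a rival-penalized update, assuming an FSCL-style frequency-weighted winner selection and a de-learning rate alpha_r much smaller than the learning rate alpha_c. The function name, parameter values, and stopping rule are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rpcl(X, k, alpha_c=0.05, alpha_r=0.002, epochs=50, seed=0):
    """Sketch of rival penalized competitive learning.

    X: (n_samples, n_features) data; k: number of candidate units,
    which may exceed the true number of clusters; alpha_c / alpha_r:
    learning and de-learning rates with alpha_r << alpha_c (assumed values).
    """
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), k, replace=False)].astype(float)  # initial unit weights
    wins = np.ones(k)  # win counts for FSCL-style frequency weighting

    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            gamma = wins / wins.sum()                  # relative winning frequencies
            d = gamma * np.sum((W - x) ** 2, axis=1)   # frequency-weighted distances
            order = np.argsort(d)
            c, r = order[0], order[1]                  # winner and rival (second winner)
            W[c] += alpha_c * (x - W[c])               # winner learns toward the input
            W[r] -= alpha_r * (x - W[r])               # rival is penalized (pushed away)
            wins[c] += 1
    return W
```

Under this scheme, surplus units keep losing and being pushed away from the data regions, so the number of units that actually settle on clusters tends to match the structure of the data, which is the automatic-allocation behavior the abstract describes.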