Boosting the HONG network

This paper briefly reviews the hierarchical overlapped neural gas (HONG) architecture introduced in Atukorale and Suganthan (Neurocomputing 35 (2000) 165). Its learning algorithm is a mixed unsupervised/supervised method, with most of the learning being unsupervised. The architecture generates multiple classifications for every data pattern presented and combines them to obtain the final classification. The main objective of this paper is to show how boosting can be used to improve the performance of the HONG classifier.
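
Boosting of this kind is typically applied as in AdaBoost.M1 (cf. [10]): the base classifier is retrained on reweighted data, misclassified patterns gain weight, and the trained members vote with weights derived from their errors. The sketch below illustrates that general procedure only; it is not the paper's implementation. The HONG base classifier is not reproduced here, so a scikit-learn decision tree serves as a hypothetical stand-in, and the function names `adaboost_m1` and `boosted_predict` are illustrative assumptions.

```python
# Minimal AdaBoost.M1-style sketch. The decision tree is a stand-in for the
# HONG base classifier; any learner that accepts per-pattern weights works.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_m1(X, y, n_rounds=10):
    """Train a weighted ensemble; returns (classifiers, vote weights)."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # uniform initial pattern weights
    classifiers, alphas = [], []
    for _ in range(n_rounds):
        clf = DecisionTreeClassifier(max_depth=3)      # stand-in base learner
        clf.fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = np.sum(w * (pred != y))                  # weighted training error
        if err == 0.0 or err >= 0.5:                   # stop: perfect or too weak
            break
        beta = err / (1.0 - err)
        w *= np.where(pred == y, beta, 1.0)            # shrink correct patterns
        w /= w.sum()                                   # renormalize weights
        classifiers.append(clf)
        alphas.append(np.log(1.0 / beta))              # member's vote weight
    return classifiers, alphas

def boosted_predict(classifiers, alphas, X):
    """Combine member predictions by weighted plurality vote."""
    labels = np.unique(np.concatenate([clf.classes_ for clf in classifiers]))
    scores = np.zeros((len(X), len(labels)))
    for clf, a in zip(classifiers, alphas):
        pred = clf.predict(X)
        for j, lab in enumerate(labels):
            scores[:, j] += a * (pred == lab)
    return labels[np.argmax(scores, axis=1)]
```

The paper's own combination of the overlapped HONG outputs (e.g. by fuzzy integral, see [5]) is a separate step from this boosting loop and is not shown above.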

[1] Catherine Blake, et al. UCI Repository of machine learning databases, 1998.

[2] Ponnuthurai N. Suganthan, et al. Hierarchical overlapped neural gas network with application to pattern classification, 2000, Neurocomputing.

[3] Gunnar Rätsch, et al. Regularizing AdaBoost, 1998, NIPS.

[4] Nikhil R. Pal, et al. Some novel classifiers designed using prototypes extracted by a new scheme based on self-organizing feature map, 2001, IEEE Trans. Syst. Man Cybern. Part B.

[5] Ponnuthurai Nagaratnam Suganthan, et al. Multiple HONG network fusion by fuzzy integral, 1999, Proceedings of the 6th International Conference on Neural Information Processing (ICONIP'99).

[6] Teuvo Kohonen, et al. The self-organizing map, 1990.

[7] Dale Schuurmans, et al. Boosting in the Limit: Maximizing the Margin of Learned Ensembles, 1998, AAAI/IAAI.

[8] Ronald R. Yager. Element selection from a fuzzy subset using the fuzzy integral, 1993, IEEE Trans. Syst. Man Cybern.

[9] Yoav Freund, et al. Boosting the margin: A new explanation for the effectiveness of voting methods, 1997, ICML.

[10] Yoav Freund, et al. Experiments with a New Boosting Algorithm, 1996, ICML.

[11] Adam Krzyżak, et al. Methods of combining multiple classifiers and their applications to handwriting recognition, 1992, IEEE Trans. Syst. Man Cybern.

[12] Bernd Fritzke, et al. A Growing Neural Gas Network Learns Topologies, 1994, NIPS.

[13] Nils J. Nilsson, et al. Learning Machines: Foundations of Trainable Pattern-Classifying Systems, 1965.

[14] Yoshua Bengio, et al. Boosting Neural Networks, 2000, Neural Computation.

[15] Thomas Martinetz, et al. 'Neural-gas' network for vector quantization and its application to time-series prediction, 1993, IEEE Trans. Neural Networks.

[16] Gunnar Rätsch, et al. Soft Margins for AdaBoost, 2001, Machine Learning.

[17] Michael I. Jordan, et al. Advances in Neural Information Processing Systems, 1995.

[18] Geoffrey I. Webb. MultiBoosting: A Technique for Combining Boosting and Wagging, 2000, Machine Learning.

[19] Peter L. Bartlett, et al. Improved Generalization Through Explicit Optimization of Margins, 2000, Machine Learning.

[20] Sargur N. Srihari, et al. Decision Combination in Multiple Classifier Systems, 1994, IEEE Trans. Pattern Anal. Mach. Intell.

[21] Thomas G. Dietterich. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization, 2000, Machine Learning.