Multiple Classifier Systems

The performance of a single weak classifier can be improved by combining techniques such as bagging, boosting, and the random subspace method. When these techniques are applied to linear discriminant analysis, they prove useful in different situations: their performance is strongly affected by the choice of the base classifier and the training sample size, and their usefulness also depends on the data distribution. In this paper, using the pseudo-Fisher linear classifier as an example, we study the effect of redundancy in the data feature set on the performance of the random subspace method and of bagging.
