An Ensemble Approach to Robust Biometrics Fusion

An effective information fusion algorithm is a key component of any robust multimodal biometrics system. We present a novel information fusion approach for multimodal biometrics learning. The proposed technique is a multiple-view generalization of AdaBoost: in each iteration, weak learners trained on the individual information sources compete, and the one with the lowest weighted error rate is selected. The weak learners selected in each iteration rectify the bias introduced by the learners of preceding iterations, yielding self-regularizing behavior. We compare the classification performance of the proposed technique with recent classifier fusion strategies on several tasks, including face detection, gender classification, and texture classification.
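
The following is a minimal sketch of the multi-view boosting idea described above: in each round a weak learner is trained on every view using a shared sample distribution, the learner with the lowest weighted error is added to the ensemble, and the weights are updated AdaBoost-style. The choice of weak learner (depth-1 decision trees) and the helper names are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def multiview_boost(views, y, n_rounds=50):
    """views: list of (n_samples, n_features_v) arrays, one per information source.
    y: labels in {-1, +1}. Returns a list of (view_index, learner, alpha) triples."""
    n = y.shape[0]
    w = np.full(n, 1.0 / n)                    # shared sampling distribution over examples
    ensemble = []
    for _ in range(n_rounds):
        best = None
        for v, X in enumerate(views):
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X, y, sample_weight=w)   # weak learner for this view
            pred = stump.predict(X)
            err = np.sum(w * (pred != y))      # weighted error on the shared distribution
            if best is None or err < best[0]:
                best = (err, v, stump, pred)
        err, v, stump, pred = best
        err = np.clip(err, 1e-10, 1 - 1e-10)   # guard against log(0) / division by zero
        alpha = 0.5 * np.log((1.0 - err) / err)
        ensemble.append((v, stump, alpha))
        # Re-weighting: examples the selected learner misclassifies gain mass, so later
        # rounds (possibly drawing on other views) focus on correcting its bias.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
    return ensemble

def predict(ensemble, views):
    """Weighted vote of the per-round learners, each applied to its own view."""
    score = np.zeros(views[0].shape[0])
    for v, stump, alpha in ensemble:
        score += alpha * stump.predict(views[v])
    return np.sign(score)
```

Because all views share one sample distribution, a round dominated by one modality shifts weight onto the examples it gets wrong, which is what allows learners from the other modalities to win subsequent rounds.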
