Kernel Uncorrelated and Orthogonal Discriminant Analysis: A Unified Approach

Several kernel algorithms have recently been proposed for nonlinear discriminant analysis. However, these methods mainly address the singularity problem in the high-dimensional feature space; less attention has been paid to the properties of the resulting discriminant vectors and feature vectors in the reduced-dimensional space. In this paper, we present a new formulation for kernel discriminant analysis. The proposed formulation includes, as special cases, kernel uncorrelated discriminant analysis (KUDA) and kernel orthogonal discriminant analysis (KODA). The feature vectors of KUDA are mutually uncorrelated, while the discriminant vectors of KODA are orthogonal to each other in the feature space. We present theoretical derivations of the proposed KUDA and KODA algorithms. The experimental results show that both KUDA and KODA are very competitive with other nonlinear discriminant algorithms in terms of classification accuracy.
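For readers unfamiliar with the kernel discriminant setting the abstract refers to, the following is a minimal two-class kernel Fisher discriminant sketch in Python. It is not the paper's KUDA/KODA formulation; the RBF kernel, the regularization constant, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel on pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfda_direction(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant (illustrative sketch).

    Returns coefficients alpha so a point x projects to
    sum_i alpha_i * k(x_i, x), i.e. proj = K @ alpha on training data.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)            # n x n Gram matrix
    idx0, idx1 = (y == 0), (y == 1)
    m0 = K[:, idx0].mean(axis=1)           # class means in kernel space
    m1 = K[:, idx1].mean(axis=1)
    # Within-class scatter N = sum_c K_c (I - 1/n_c) K_c^T, regularized
    # to handle the singularity the abstract mentions.
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        nc = idx.sum()
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    N += reg * np.eye(n)
    # Fisher direction: alpha proportional to N^{-1} (m0 - m1).
    alpha = np.linalg.solve(N, m0 - m1)
    return alpha, K

# Toy data: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alpha, K = kfda_direction(X, y)
proj = K @ alpha                           # 1-D discriminant scores
```

The regularized solve stands in for the more careful treatments of singular within-class scatter surveyed in the abstract; the projected scores `proj` are what a downstream classifier would consume.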
