Two realizations of a general feature extraction framework

Abstract: A general feature extraction framework is proposed as an extension of conventional linear discriminant analysis. Two nonlinear feature extraction algorithms based on this framework are investigated. The first is a kernel function feature extraction (KFFE) algorithm. A disturbance term is introduced to regularize the algorithm. Moreover, it is shown that several existing nonlinear feature extraction algorithms are special cases of this KFFE algorithm. The second algorithm, the mean-STD1-norm feature extraction algorithm, is also derived from the framework. Experiments on both synthetic and real data are presented to demonstrate the performance of both feature extraction algorithms.
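The abstract describes KFFE only at a high level (a kernelized extension of discriminant analysis with a regularizing disturbance term), so the sketch below is not the paper's exact algorithm. It is a minimal two-class kernel Fisher discriminant in NumPy, assuming an RBF kernel and a small ridge term `mu` standing in for the disturbance/regularization; all function names, parameters, and the toy data are illustrative assumptions.

```python
# Minimal two-class kernel Fisher discriminant sketch (illustrative only;
# not the paper's exact KFFE formulation). A small ridge term `mu` plays
# the role of the regularizing "disturbance" mentioned in the abstract.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def kernel_fda_fit(X, y, gamma=1.0, mu=1e-3):
    """Return expansion coefficients alpha for a 1-D kernel discriminant."""
    K = rbf_kernel(X, X, gamma)                  # n x n Gram matrix
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    m0 = K[:, idx0].mean(axis=1)                 # class-mean kernel vectors
    m1 = K[:, idx1].mean(axis=1)
    # Within-class scatter in the kernel-induced feature space
    N = np.zeros_like(K)
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        H = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ H @ Kc.T
    N += mu * np.eye(len(y))                     # regularizing disturbance term
    alpha = np.linalg.solve(N, m1 - m0)          # discriminant direction
    return alpha

def kernel_fda_project(X_train, alpha, X_new, gamma=1.0):
    """Project new samples onto the extracted nonlinear feature."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    alpha = kernel_fda_fit(X, y)
    z = kernel_fda_project(X, alpha, X)
    print("class-mean separation along extracted feature:",
          abs(z[y == 1].mean() - z[y == 0].mean()))
```

In this reading, the ridge term keeps the within-class scatter matrix invertible when the Gram matrix is rank-deficient, which is the usual motivation for a regularizer in kernel discriminant methods.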
