From Neural Principal Components to Neural Independent Components

Several neural network learning rules for linear Principal Component Analysis (PCA) have been shown to be closely related to classical PCA optimization criteria. These learning rules and the corresponding criteria are extended here to versions containing nonlinear functions. It can be shown that the resulting criteria and learning rules solve the blind source separation (BSS) problem for the linear memoryless mixture model, based on the statistical independence of the source signals. This bottom-up approach to the BSS and Independent Component Analysis (ICA) problems allows the nonlinear functions to be chosen so that the learning rules not only produce independent components but also have other desirable properties, such as robustness, in contrast to the polynomial functions that typically arise from cumulant expansions. Fast batch versions of the learning rules are also reviewed.
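As a minimal illustration of the ideas above, the sketch below applies a one-unit fixed-point iteration with a tanh nonlinearity (the fast batch form of the nonlinear Hebbian rules) to a whitened two-channel linear mixture of independent sources. The mixing matrix, source distributions, and iteration count are hypothetical choices for this example, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Two independent non-Gaussian sources: sub-Gaussian (uniform) and
# super-Gaussian (Laplacian). Non-Gaussianity is required for ICA.
s = np.vstack([rng.uniform(-1.0, 1.0, n), rng.laplace(0.0, 1.0, n)])

# Hypothetical linear memoryless mixing: x = A s
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
x = A @ s

# Whiten the mixtures: zero mean, identity covariance.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E / np.sqrt(d)) @ E.T @ x

# One-unit fixed-point iteration with g(y) = tanh(y):
#   w <- E[z g(w^T z)] - E[g'(w^T z)] w,  then normalize.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    y = w @ z
    g = np.tanh(y)
    gp = 1.0 - g ** 2                      # derivative of tanh
    w_new = (z * g).mean(axis=1) - gp.mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(abs(w_new @ w) - 1.0) < 1e-9
    w = w_new
    if converged:
        break

# The estimate should match one source up to sign and scale.
y = w @ z
corr = max(abs(np.corrcoef(y, s[0])[0, 1]),
           abs(np.corrcoef(y, s[1])[0, 1]))
print(f"max |correlation| with a true source: {corr:.3f}")
```

The tanh nonlinearity here plays the role the paper attributes to well-chosen nonlinear functions: it gives a robust contrast, unlike the cubic functions that follow from fourth-order cumulant expansions.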
