Redundancy Reduction and Independent Component Analysis: Conditions on Cumulants and Adaptive Approaches

In the context of both sensory coding and signal processing, building factorized codes has been shown to be an efficient strategy. In a wide variety of situations, the signal to be processed is a linear mixture of statistically independent sources, and building a factorized code is then equivalent to performing blind source separation. Thanks to the linear structure of the data, this can be done, in the language of signal processing, by finding an appropriate linear filter or, equivalently, in the language of neural modeling, by using a simple feedforward neural network. In this article we discuss several aspects of the source separation problem. We give simple conditions on the network output that, if satisfied, guarantee that source separation has been achieved. We then study adaptive approaches, in particular those based on redundancy reduction and on the maximization of mutual information, and show how the resulting update rules are related to the BCM theory of synaptic plasticity. Finally, we briefly discuss extensions to the case of nonlinear mixtures. Throughout the article, we take care to put our work into perspective with other studies on source separation and redundancy reduction; in particular, we review algebraic solutions, pointing out their simplicity but also their drawbacks.
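The linear setting described above can be illustrated with a minimal sketch: observed signals X = AS are an unknown linear mixture of independent sources, and an adaptive rule updates a separating matrix W until the outputs Y = WX are independent. The sketch below uses the natural-gradient infomax update of Amari, Cichocki, and Yang (one of several adaptive rules of the kind discussed in the article) with a cubic nonlinearity suited to sub-Gaussian sources; the specific sources, mixing matrix, learning rate, and iteration count are illustrative choices, not taken from the original.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent non-Gaussian sources (uniform, hence sub-Gaussian)
n = 5000
S = rng.uniform(-1.0, 1.0, size=(2, n))

# Unknown mixing matrix: the observed signals are the linear mixture X = A S
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Adaptive separation, natural-gradient infomax rule:
#   W <- W + eta * (I - E[phi(Y) Y^T]) W,  phi(y) = y^3 for sub-Gaussian sources
W = np.eye(2)
eta = 0.05
for _ in range(1000):
    Y = W @ X
    W += eta * (np.eye(2) - (Y**3 @ Y.T) / n) @ W

# On success, W A approximates a scaled permutation matrix: each recovered
# output matches one source up to scale and ordering (the usual BSS ambiguities)
print(np.round(W @ A, 2))
```

The fixed point E[phi(y_i) y_j] = delta_ij illustrates the kind of condition on output cumulants discussed in the article: independence is detected through higher-order statistics of the network output, since second-order decorrelation alone cannot identify the sources.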
