A "nonnegative PCA" algorithm for independent component analysis

We consider the task of independent component analysis when the independent sources are known to be nonnegative and well-grounded, so that they have a nonzero probability density function (pdf) in the region of zero. We propose the use of a "nonnegative principal component analysis (nonnegative PCA)" algorithm, which is a special case of the nonlinear PCA algorithm, but with a rectification nonlinearity, and we conjecture that this algorithm will find such nonnegative well-grounded independent sources, under reasonable initial conditions. While the algorithm has proved difficult to analyze in the general case, we give some analytical results that are consistent with this conjecture and some numerical simulations that illustrate its operation.
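To make the proposed update concrete, here is a minimal NumPy sketch of a rectified nonlinear-PCA subspace rule of the kind described above: the output is y = Wz with rectification g(y) = max(y, 0), and the weights follow the stochastic update ΔW ∝ g(y)(z − Wᵀg(y))ᵀ. It assumes the observations have already been prewhitened as in the nonnegative ICA setting; the function name, parameters, and learning-rate schedule are illustrative choices, not taken from the paper.

```python
import numpy as np

def nonnegative_pca_update(Z, n_sources, lr=0.01, n_iter=5000, seed=0):
    """Sketch of a rectified nonlinear-PCA ("nonnegative PCA") unmixing rule.

    Z : (n_obs, n_samples) array of prewhitened observations.
    Returns W (n_sources x n_obs) such that Y = W @ Z is approximately
    nonnegative when the sources are nonnegative and well-grounded.
    """
    rng = np.random.default_rng(seed)
    n_obs, n_samples = Z.shape

    # Initialise W with orthonormal rows (a random rotation of the whitened space).
    Q, _ = np.linalg.qr(rng.standard_normal((n_obs, n_obs)))
    W = Q[:n_sources].copy()

    for _ in range(n_iter):
        z = Z[:, rng.integers(n_samples)]   # draw one sample (stochastic update)
        y = W @ z                           # network output
        g = np.maximum(y, 0.0)              # rectification nonlinearity g(y) = max(y, 0)
        # Nonlinear PCA subspace rule with rectified output:
        #   dW = lr * g(y) (z - W^T g(y))^T
        W += lr * np.outer(g, z - W.T @ g)
    return W
```

As a usage sketch, one would mix nonnegative well-grounded sources S with a random matrix A, whiten X = A S, run `nonnegative_pca_update` on the whitened data, and check that W converges to a permutation of the whitening-adjusted inverse mixing, i.e. that the recovered outputs are nonnegative up to permutation and scale.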
