Blind Separation of Positive Sources by Globally Convergent Gradient Search

The instantaneous noise-free linear mixing model in independent component analysis is largely a solved problem under the usual assumptions of independent nongaussian sources and a full column rank mixing matrix. However, with some prior information on the sources, such as positivity, new analysis and perhaps simplified solution methods become possible. In this letter, we consider the task of independent component analysis when the independent sources are known to be nonnegative and well grounded, meaning that they have a nonzero pdf in the neighborhood of zero. In this case the solution is remarkably simple: an orthogonal rotation of the whitened observation vector into nonnegative outputs yields a positive permutation of the original sources. We propose a cost function whose minimum coincides with nonnegativity and derive a gradient algorithm under the whitening constraint, which renders the separating matrix orthogonal. We further prove that on the Stiefel manifold of orthogonal matrices, the cost function is a Lyapunov function for the matrix gradient flow, implying global convergence. The algorithm is therefore guaranteed to find the nonnegative well-grounded independent sources. The analysis is complemented by a numerical simulation illustrating the algorithm.
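
To make the procedure concrete, the following is a minimal sketch in Python/NumPy under the assumptions stated in the abstract (unit-variance, nonnegative, well-grounded sources and a full-rank mixing matrix). It whitens the observations while keeping their means, then minimizes a cost of the form J(W) = (1/2) E[ ||min(Wz, 0)||^2 ], whose minimum coincides with nonnegative outputs, by gradient steps along geodesics of the orthogonal group. The exact cost and update rule in the letter may differ in detail, and all function and parameter names here (nonneg_ica, eta, n_iter) are illustrative, not the authors'.

```python
# A minimal sketch (not the authors' reference implementation) of
# nonnegative ICA by gradient descent along geodesics of the
# orthogonal group.
import numpy as np
from scipy.linalg import expm


def nonneg_ica(X, n_iter=2000, eta=0.1):
    """Separate nonnegative well-grounded sources; X is (mixtures, samples)."""
    # Whitening: V is computed from the covariance of the data but applied
    # to the *uncentred* observations -- the nonnegativity of the sources
    # lives partly in their means, so the mean must be kept.
    evals, E = np.linalg.eigh(np.cov(X))
    V = E @ np.diag(evals ** -0.5) @ E.T
    Z = V @ X                                # whitened: Cov(Z) = I

    W = np.eye(Z.shape[0])                   # start from the identity rotation
    for _ in range(n_iter):
        Y = W @ Z
        # Euclidean gradient of J(W) = (1/2) E[ ||min(Wz, 0)||^2 ]
        G = np.minimum(Y, 0.0) @ Z.T / Z.shape[1]
        # Skew-symmetric tangent direction; expm of a skew-symmetric
        # matrix is orthogonal, so the update keeps W exactly orthogonal.
        A = G @ W.T - W @ G.T
        W = expm(-eta * A) @ W
    return W @ Z, W @ V                      # outputs, overall unmixing matrix


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    S = rng.exponential(size=(3, 5000))      # nonnegative, well-grounded sources
    A_mix = rng.normal(size=(3, 3))          # random full-rank mixing matrix
    Y, B = nonneg_ica(A_mix @ S)
    # Y should approximate a positive permutation of S; equivalently,
    # B @ A_mix should be close to a positive permutation matrix.
    print(np.round(B @ A_mix, 2))
```

The multiplicative update W ← expm(−ηA) W with skew-symmetric A keeps W on the manifold of orthogonal matrices at every step, which is the setting in which the abstract's Lyapunov argument guarantees global convergence of the continuous-time gradient flow.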
