Introduction to Special Issue on Independent Components Analysis

Independent Component Analysis (ICA) and Blind Source Separation (BSS) have become standard tools in multivariate data analysis. ICA continues to generate a flurry of research interest, resulting in increasing numbers of papers submitted to conferences and journals. Many workshops and special sessions at major conferences focus on recent research results, and the International Conference on ICA and BSS is a prime example of the attractiveness and diversity of this field. In many universities, ICA is now taught in the graduate curricula of electrical engineering, computer science, and statistics departments.

The goal of this special issue is to present the latest research in ICA. We received 43 papers, of which 14 were accepted for publication. The papers span a wide range of research areas, including ICA theory and algorithms, representations (such as nonlinear mixing, non-stationarity, and sparseness), and ICA applications.

Theory and Algorithms: In the first paper in this issue, Cardoso explores the space of multivariate random variables to elucidate the relations among mutual information, entropy, and non-Gaussianity. Mutual information is decomposed into the sum of a term due to correlation and a term due to non-Gaussianity. In the second paper, Bach and Jordan present a generalization of ICA in which, instead of looking for a linear transform that makes the data components independent, they look for a transform that makes the data components well fit by a tree-structured graphical model. This tree-dependent component analysis (TCA) provides a tractable and flexible approach to weakening the independence assumption of ICA. In the third paper, Teh et al. present a new way of extending ICA to overcomplete representations.
In contrast to the causal generative extensions of ICA, they propose an energy-based model in which features are defined as deterministic functions of the inputs, so that the features are conditionally independent given the inputs. One of the basic ICA contrasts, mutual information, leads to minimizing the entropies of the estimated sources.
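The link between the mutual-information contrast and source entropies can be made explicit. The following is a brief sketch from standard definitions (not taken from the papers themselves): let $y = Wx$ be the estimated sources for an invertible demixing matrix $W$, with $H$ denoting differential entropy. Since $H(y) = H(x) + \log|\det W|$,
\[
I(y) \;=\; \sum_i H(y_i) - H(y) \;=\; \sum_i H(y_i) - H(x) - \log|\det W|.
\]
Here $H(x)$ is fixed by the data, so when $W$ is constrained to be orthogonal after whitening (hence $\log|\det W| = 0$), minimizing $I(y)$ is equivalent to minimizing the sum of the marginal entropies $\sum_i H(y_i)$. A related standard identity, consistent with the correlation/non-Gaussianity decomposition discussed in Cardoso's paper, writes
\[
I(y) \;=\; C(y) + J(y) - \sum_i J(y_i),
\qquad
C(y) \;=\; \tfrac{1}{2}\log\frac{\prod_i \Sigma_{ii}}{\det \Sigma},
\]
where $J$ is negentropy, $\Sigma = \mathrm{Cov}(y)$, and $C(y)$ is the mutual information of a Gaussian vector with the same covariance: the purely correlational part of the dependence.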