Analysis of Interactions Among Hidden Components for Tucker Model

Tensor representations and tensor decompositions are natural approaches for dealing with the large, multi-aspect, high-dimensional data sets that arise in modern applications such as environmental analysis, chemometrics, pharmaceutical analysis, spectral analysis, and neuroscience. The two most popular decomposition/factorization models for N-th order tensors are the Tucker model and the more restricted PARAFAC model. The Tucker decomposition allows a different number of factors to be extracted in each mode and permits interactions within each modality, whereas PARAFAC does not. This flexibility, however, is also one of the weaknesses of the decomposition: the difficult problems are to identify the dominant relationships between components and to establish a unique representation. In this paper, we introduce a new measure, called the Joint Rate (JR) index, to evaluate interactions among the various components in the general Tucker decomposition. The Hinton diagram is also extended to 3-D visualization. The use of the JR index is illustrated with the analysis of EEG data for classification and BCI applications.

I. TENSOR DECOMPOSITIONS AND INTERACTIVE RELATIONS AMONG THEIR HIDDEN COMPONENTS

Standard matrix factorizations, such as PCA/SVD, ICA, NMF, and their variants, are invaluable tools for feature selection, dimensionality reduction, noise reduction, and data mining [1]. However, they operate on only two modes (2-way representations), and their use is therefore limited. In many applications, such as studies in neuroscience, the data structures often contain higher-order modes (ways) such as trials, task conditions, subjects, and groups, together with the intrinsic dimensions of space, time, and frequency. If the data for every subject were analyzed separately by extracting a matrix or slice from a data block, we would lose the covariance information among subjects.
To discover hidden components within the data and retain this integrative information, the analysis tools should reflect the multi-dimensional structure of the data [2], [3]. In this way all dimensions or modes are retained by virtue of multi-linear models, which often produce unique and physically meaningful components. The two most popular decompositions/factorizations for N-th order tensors are the Tucker model and the more restricted PARAFAC model. The Tucker decomposition is described as a decomposition of a given N-th order tensor Y ∈ R^(I1×I2×···×IN) into an unknown core tensor G ∈ R^(J1×J2×···×JN) multiplied by a set of N unknown component matrices A^(n) = [a_1^(n), a_2^(n), ..., a_Jn^(n)] ∈ R^(In×Jn) (n = 1, 2, ..., N):

Y = G ×_1 A^(1) ×_2 A^(2) ··· ×_N A^(N) + E,

where ×_n denotes the mode-n tensor-matrix product and E is the residual (error) tensor.
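The Tucker model above can be sketched with a minimal truncated-HOSVD implementation in numpy: each factor matrix A^(n) is taken from the leading left singular vectors of the mode-n unfolding, and the core G is obtained by mode products with the factor transposes. This is an illustrative sketch, not the paper's algorithm; the helper names (unfold, mode_product, hosvd) are assumptions, and production code would typically refine the HOSVD initialization with ALS/HOOI iterations.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold for a tensor of the given full shape."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def mode_product(T, A, mode):
    """Mode-n tensor-matrix product T x_n A."""
    M = A @ unfold(T, mode)
    shape = list(T.shape)
    shape[mode] = A.shape[0]
    return fold(M, mode, shape)

def hosvd(Y, ranks):
    """Truncated HOSVD: factors A(n) from the leading left singular
    vectors of each unfolding; core G = Y x_1 A1^T x_2 A2^T ..."""
    factors = []
    for n, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(Y, n), full_matrices=False)
        factors.append(U[:, :r])
    G = Y
    for n, A in enumerate(factors):
        G = mode_product(G, A.T, n)
    return G, factors

def reconstruct(G, factors):
    """Rebuild Y = G x_1 A(1) x_2 A(2) ... x_N A(N)."""
    Y = G
    for n, A in enumerate(factors):
        Y = mode_product(Y, A, n)
    return Y

# Example: a 4x5x6 tensor with exact multilinear rank (2, 3, 2)
rng = np.random.default_rng(0)
G0 = rng.standard_normal((2, 3, 2))
A0 = [np.linalg.qr(rng.standard_normal((d, r)))[0]
      for d, r in [(4, 2), (5, 3), (6, 2)]]
Y = reconstruct(G0, A0)
G, A = hosvd(Y, (2, 3, 2))
err = np.linalg.norm(reconstruct(G, A) - Y)  # essentially zero for exact rank
```

Because the Tucker core G couples every component of every mode, inspecting the magnitudes of its entries is what motivates interaction measures such as the JR index and the 3-D Hinton diagram discussed above.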

REFERENCES

[1] A. Cichocki et al., "Nonnegative Tucker decomposition with alpha-divergence," 2008 IEEE International Conference on Acoustics, Speech and Signal Processing, 2008.
[2] M. P. Friedlander et al., "Computing non-negative tensor factorizations," Optim. Methods Softw., 2008.
[3] A. Cichocki et al., "Nonnegative Matrix and Tensor Factorization," 2007.
[4] L. Tucker et al., "Some mathematical notes on three-mode factor analysis," Psychometrika, 1966.
[5] H. Liang et al., "Single-Trial Decoding of Bistable Perception Based on Sparse Nonnegative Tensor Decomposition," Comput. Intell. Neurosci., 2008.
[6] J. Vandewalle et al., "On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors," SIAM J. Matrix Anal. Appl., 2000.
[7] L. K. Hansen et al., "Algorithms for Sparse Nonnegative Tucker Decompositions," Neural Computation, 2008.
[8] T. G. Kolda et al., "Tensor Decompositions and Applications," SIAM Rev., 2009.
[9] S. Choi et al., "Nonnegative Tucker Decomposition," 2007 IEEE Conference on Computer Vision and Pattern Recognition, 2007.
[10] T. Kolda, "Multilinear operators for higher-order decompositions," 2006.
[11] L. Zhang et al., "Noninvasive BCIs: Multiway Signal-Processing Array Decompositions," Computer, 2008.
[12] P. A. Valdes-Sosa et al., "Penalized PARAFAC analysis of spontaneous EEG recordings," 2008.
[13] L. K. Hansen et al., "Parallel Factor Analysis as an exploratory tool for wavelet transformed event-related EEG," NeuroImage, 2006.
[14] A. Cichocki et al., "Local Learning Rules for Nonnegative Tucker Decomposition," ICONIP, 2009.
[15] M. Friedlander et al., "Computing non-negative tensor factorizations," Optim. Methods Softw., 2008.
[16] W. S. Rayens et al., "Structure-seeking multilinear methods for the analysis of fMRI data," NeuroImage, 2004.
[17] L. K. Hansen et al., "ERPWAVELAB: A toolbox for multi-channel analysis of time-frequency transformed event related potentials," Journal of Neuroscience Methods, 2007.
[18] F. Miwakeichi et al., "Decomposing EEG data into space-time-frequency components using Parallel Factor Analysis," NeuroImage, 2004.
[19] A. Delorme et al., "EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis," Journal of Neuroscience Methods, 2004.
[20] A. Cichocki et al., "Fast and Efficient Algorithms for Nonnegative Tucker Decomposition," ISNN, 2008.
[21] J. Vandewalle et al., "A Multilinear Singular Value Decomposition," SIAM J. Matrix Anal. Appl., 2000.
[22] A. Cichocki et al., "Extraction and classification of common independent components in single-trial crossmodal cortical responses," 2010.