Tensor Networks for Latent Variable Analysis: Higher Order Canonical Polyadic Decomposition

The canonical polyadic decomposition (CPD) is a convenient and intuitive tool for tensor factorization; however, for higher order tensors it often incurs a high computational cost and suffers from permutation of tensor entries, and these undesirable effects grow exponentially with the tensor order. Prior compression of the tensor at hand can reduce the computational cost of the CPD, but this is applicable only when the rank $R$ of the decomposition does not exceed the tensor dimensions. To resolve these issues, we present a novel method for the CPD of higher order tensors, which rests upon a simple tensor network of representative interconnected core tensors of order no higher than 3. For rigor, we develop an exact conversion scheme from the core tensors to the factor matrices of the CPD, together with an iterative algorithm of low complexity that estimates these factor matrices in the inexact case. Comprehensive simulations over a variety of scenarios support the proposed approach.
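The link between the two representations can be made concrete. A minimal NumPy sketch (not taken from the paper; the function names and the order-4 example are illustrative) shows how a rank-$R$ CPD is built from its factor matrices, and how the same tensor can be carried exactly by a chain of cores of order at most 3, with the interior cores holding the factor columns on their diagonal slices:

```python
import numpy as np

def cp_to_tensor(factors):
    """Reconstruct a tensor from CP factor matrices (each of shape I_n x R)."""
    R = factors[0].shape[1]
    shape = tuple(A.shape[0] for A in factors)
    T = np.zeros(shape)
    for r in range(R):
        # Accumulate the r-th rank-1 term: outer product of the r-th columns.
        outer = factors[0][:, r]
        for A in factors[1:]:
            outer = np.multiply.outer(outer, A[:, r])
        T += outer
    return T

def cp_to_tt_cores(factors):
    """Express an exact rank-R CP tensor as a chain of cores of order <= 3:
    G1 (I1 x R), interior cores Gn (R x I_n x R), last core (R x I_N).
    Each interior core is 'diagonal': G[r, :, r] = A_n[:, r]."""
    R = factors[0].shape[1]
    cores = [factors[0]]                      # first core: I1 x R
    for A in factors[1:-1]:
        G = np.zeros((R, A.shape[0], R))
        for r in range(R):
            G[r, :, r] = A[:, r]
        cores.append(G)
    cores.append(factors[-1].T)               # last core: R x I_N
    return cores

def tt_to_tensor(cores):
    """Contract a chain of cores along their connecting (rank) indices."""
    T = cores[0]
    for G in cores[1:]:
        T = np.tensordot(T, G, axes=([-1], [0]))
    return T

# Sanity check: both representations yield the same order-4 tensor.
rng = np.random.default_rng(0)
factors = [rng.standard_normal((I, 3)) for I in (4, 5, 6, 7)]
assert np.allclose(cp_to_tensor(factors), tt_to_tensor(cp_to_tt_cores(factors)))
```

In this exact case the conversion from cores back to factor matrices is immediate; the contribution of the paper is an exact conversion scheme and a low-complexity iterative algorithm for the general, inexact case, where the cores need not have this diagonal structure.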
