Tensor Ring Decomposition

Tensor networks have in recent years emerged as powerful tools for solving large-scale optimization problems. One of the most popular tensor networks is the tensor train (TT) decomposition, which acts as a building block for more complicated tensor networks. However, the TT decomposition depends strongly on the permutation of tensor dimensions, owing to its strictly sequential multilinear products over the latent cores, which makes it difficult to find the optimal TT representation. In this paper, we introduce a fundamental tensor decomposition model that represents a high-dimensional tensor by circular multilinear products over a sequence of low-dimensional cores. This model can be graphically interpreted as a cyclic interconnection of 3rd-order tensors and is therefore termed the tensor ring (TR) decomposition. The key advantage of the TR model is its invariance under circular permutation of dimensions, which is obtained by employing the trace operation and treating all latent cores equivalently. The TR model can be viewed as a linear combination of TT decompositions, which yields powerful and generalized representation abilities. For the optimization of the latent cores, we present four different algorithms based on sequential SVDs, the alternating least squares (ALS) scheme, and block-wise ALS techniques. Furthermore, we investigate the mathematical properties of the TR model, showing that basic multilinear algebra can be performed efficiently using TR representations and that classical tensor decompositions can be conveniently transformed into the TR representation. Finally, experiments on both synthetic signals and real-world datasets were conducted to evaluate the performance of the different algorithms.
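To make the "circular multilinear products with a trace" concrete, the following is a minimal sketch (not the authors' reference implementation) of reconstructing a full tensor from TR cores. It assumes the element-wise form T(i_1, ..., i_d) = Trace(G_1[:, i_1, :] G_2[:, i_2, :] ... G_d[:, i_d, :]), where each 3rd-order core G_k has shape (r_k, n_k, r_{k+1}) and the ring is closed by setting r_{d+1} = r_1; the function name, core shapes, and example sizes are illustrative assumptions.

import itertools
import numpy as np

def tr_to_full(cores):
    """Reconstruct the full tensor from a list of 3rd-order TR cores."""
    dims = [core.shape[1] for core in cores]
    full = np.empty(dims)
    for idx in itertools.product(*[range(n) for n in dims]):
        # Circular multilinear product: multiply one lateral slice per core,
        # then close the ring with the trace operation.
        prod = np.eye(cores[0].shape[0])
        for core, i in zip(cores, idx):
            prod = prod @ core[:, i, :]
        full[idx] = np.trace(prod)
    return full

# Hypothetical example: random TR cores with ranks (2, 3, 4) for a 5 x 6 x 7 tensor.
ranks = [2, 3, 4]
dims = [5, 6, 7]
cores = [np.random.randn(ranks[k], dims[k], ranks[(k + 1) % 3]) for k in range(3)]
T = tr_to_full(cores)
print(T.shape)  # (5, 6, 7)

Because the trace is invariant under cyclic permutation of its matrix factors, circularly shifting the list of cores (and the corresponding tensor dimensions) leaves the reconstructed values unchanged, which is the permutation-invariance property highlighted above.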
