Adaptive Rank Selection for Tensor Ring Decomposition

Optimal rank selection is an important issue in tensor decomposition problems, especially for the Tensor Train (TT) and Tensor Ring (TR) (also known as Tensor Chain) decompositions. In this paper, a new rank selection method for TR decomposition is proposed that automatically finds near-optimal TR ranks, which yield a lower storage cost, especially for tensors with inexact TT or TR structures. In many existing approaches, the TR ranks are fixed in advance or chosen via truncated Singular Value Decomposition (t-SVD); a few other approaches select the TR ranks adaptively. In our approach, the TR ranks are not determined in advance but are increased gradually at each iteration until the model reaches the desired approximation accuracy. To this end, at each iteration the sensitivity of the approximation error to each core tensor is measured, and the core tensors with the highest sensitivity are selected and their sizes are increased. Simulation results confirm that the proposed approach reduces the storage cost considerably and finds an optimal model in the TR format while preserving the desired approximation accuracy.

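To make the procedure described above more concrete, the following Python/NumPy sketch shows one plausible way such an adaptive scheme could be organized. It is an illustration under assumptions, not the paper's algorithm: the ALS fitting routine, the perturbation-based sensitivity proxy (the paper defines its own sensitivity measure), the bond-growing rule, and all names and parameter values (tr_reconstruct, als_sweep, sensitivity, grow_bond, adaptive_tr, the 1e-2 padding scale) are hypothetical choices made for illustration.

import numpy as np


def tr_reconstruct(cores):
    # Contract TR cores G_k of shape (r_k, n_k, r_{k+1}) into the full tensor,
    # closing the ring with a trace over the first and last bond dimensions.
    full = cores[0]
    for core in cores[1:]:
        full = np.einsum('a...b,bcd->a...cd', full, core)
    return np.einsum('a...a->...', full)


def _subchain(cores, k):
    # Contract every core except G_k, in the cyclic order k+1, ..., k-1.
    # Result has shape (r_{k+1}, n_{k+1}, ..., n_{k-1}, r_k).
    N = len(cores)
    order = [(k + j) % N for j in range(1, N)]
    S = cores[order[0]]
    for j in order[1:]:
        S = np.einsum('a...b,bcd->a...cd', S, cores[j])
    return S


def als_sweep(X, cores):
    # One alternating-least-squares sweep: update each core in turn by solving
    # a linear least-squares problem against the subchain of the other cores.
    N = len(cores)
    for k in range(N):
        perm = [k] + [(k + j) % N for j in range(1, N)]
        T = np.transpose(X, perm).reshape(X.shape[k], -1)        # (n_k, prod of other n_j)
        S = _subchain(cores, k)
        r_next, r_k = S.shape[0], S.shape[-1]
        A = np.moveaxis(S, 0, -2).reshape(-1, r_next * r_k)      # (prod of other n_j, r_{k+1}*r_k)
        Z, *_ = np.linalg.lstsq(A, T.T, rcond=None)              # (r_{k+1}*r_k, n_k)
        cores[k] = np.transpose(Z.T.reshape(X.shape[k], r_next, r_k), (2, 0, 1))
    return cores


def rel_error(X, cores):
    return np.linalg.norm(X - tr_reconstruct(cores)) / np.linalg.norm(X)


def sensitivity(X, cores, eps=1e-3, trials=5, rng=None):
    # Crude stand-in for the paper's sensitivity measure: how much the
    # approximation error changes under small random perturbations of each core.
    rng = np.random.default_rng() if rng is None else rng
    base = np.linalg.norm(X - tr_reconstruct(cores))
    sens = np.zeros(len(cores))
    for k, G in enumerate(cores):
        for _ in range(trials):
            perturbed = list(cores)
            perturbed[k] = G + eps * np.linalg.norm(G) * rng.standard_normal(G.shape) / np.sqrt(G.size)
            sens[k] += abs(np.linalg.norm(X - tr_reconstruct(perturbed)) - base)
    return sens / trials


def grow_bond(cores, k, rng):
    # Increase the TR rank on the bond between cores k and k+1 by one,
    # padding both cores with small random entries.
    N = len(cores)
    G, H = cores[k], cores[(k + 1) % N]
    cores[k] = np.concatenate([G, 1e-2 * rng.standard_normal((G.shape[0], G.shape[1], 1))], axis=2)
    cores[(k + 1) % N] = np.concatenate([H, 1e-2 * rng.standard_normal((1, H.shape[1], H.shape[2]))], axis=0)
    return cores


def adaptive_tr(X, tol=1e-2, max_rank=8, sweeps=10, seed=0):
    # Start from all TR ranks equal to one, fit by ALS, and repeatedly grow the
    # bond next to the most sensitive core until the target accuracy is reached.
    rng = np.random.default_rng(seed)
    cores = [rng.standard_normal((1, n, 1)) for n in X.shape]
    while True:
        for _ in range(sweeps):
            cores = als_sweep(X, cores)
        err = rel_error(X, cores)
        ranks = [G.shape[0] for G in cores]
        print(f'TR ranks {ranks}, relative error {err:.3e}')
        if err <= tol or max(ranks) >= max_rank:
            return cores, err
        k = int(np.argmax(sensitivity(X, cores, rng=rng)))
        cores = grow_bond(cores, k, rng)


if __name__ == '__main__':
    # Toy test: build a tensor with an exact TR structure and fit it adaptively.
    rng = np.random.default_rng(1)
    true_ranks, dims = [2, 3, 2, 3], [6, 7, 6, 7]
    true_cores = [rng.standard_normal((true_ranks[i], dims[i], true_ranks[(i + 1) % 4]))
                  for i in range(4)]
    X = tr_reconstruct(true_cores)
    cores, err = adaptive_tr(X, tol=1e-6, max_rank=6)

Starting from all TR ranks equal to one, the sketch grows one bond at a time next to the core whose perturbation changes the approximation error the most, which mirrors the idea of increasing the ranks gradually until the desired accuracy is reached.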