Randomized Algorithms for Computation of Tucker Decomposition and Higher Order SVD (HOSVD)

Big data analysis has become a crucial part of new emerging technologies such as the internet of things, cyber-physical analysis, deep learning, and anomaly detection. Among many other techniques, dimensionality reduction plays a key role in such analyses and facilitates feature selection and feature extraction. Randomized algorithms are efficient tools for handling big data tensors. They accelerate the decomposition of large-scale data tensors by reducing the computational complexity of deterministic algorithms and the communication among different levels of the memory hierarchy, which is the main bottleneck in modern computing environments and architectures. In this article, we review recent advances in randomization for the computation of the Tucker decomposition and the Higher Order SVD (HOSVD). We discuss random projection and sampling approaches, single-pass and multi-pass randomized algorithms, and how to utilize them in the computation of the Tucker decomposition and the HOSVD. Simulations on synthetic and real datasets are provided to compare the performance of some of the best and most promising algorithms.
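To make the random-projection approach mentioned above concrete, the following minimal NumPy sketch computes a truncated HOSVD by sketching each mode-n unfolding with a Gaussian test matrix, orthonormalizing the sketch to obtain the factor matrices, and then contracting the tensor to form the core. The function names, the Gaussian test matrices, and the oversampling parameter are illustrative assumptions for this sketch, not a specific algorithm from the surveyed works.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the remaining modes."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def randomized_hosvd(tensor, ranks, oversample=5, seed=0):
    """Truncated HOSVD via Gaussian random projection (a minimal multi-pass sketch).

    For each mode n, the mode-n unfolding is multiplied by a Gaussian test
    matrix, the result is orthonormalized by QR, and the leading columns give
    the mode-n factor. The core is the tensor contracted with all factor
    transposes.
    """
    rng = np.random.default_rng(seed)
    factors = []
    for n, r in enumerate(ranks):
        A = unfold(tensor, n)                                  # I_n x prod of remaining dims
        Omega = rng.standard_normal((A.shape[1], r + oversample))
        Y = A @ Omega                                          # random projection of the unfolding
        Q, _ = np.linalg.qr(Y)                                 # orthonormal basis for range(Y)
        factors.append(Q[:, :r])
    core = tensor
    for n, U in enumerate(factors):                            # core = tensor x_1 U_1^T ... x_N U_N^T
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, n)), 0, n)
    return core, factors

# Usage: recover a tensor with exact multilinear rank (5, 5, 5).
rng = np.random.default_rng(1)
X = rng.standard_normal((5, 5, 5))
for n, dim in enumerate((50, 60, 70)):
    X = np.moveaxis(np.tensordot(rng.standard_normal((dim, 5)), X, axes=(1, n)), 0, n)

G, Us = randomized_hosvd(X, ranks=(5, 5, 5))
X_hat = G
for n, U in enumerate(Us):                                     # reconstruct: G x_1 U_1 ... x_N U_N
    X_hat = np.moveaxis(np.tensordot(U, X_hat, axes=(1, n)), 0, n)
print("relative error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```

Single-pass and sampling-based variants discussed in the article replace the per-mode unfolding products with sketches that touch the data only once, or with sampled columns of the unfoldings; the structure of the computation, however, follows the same project-orthonormalize-contract pattern shown here.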
