Tensor completion via functional smooth component deflation

For matrix and tensor completion problems with a very high missing ratio, standard methods based only on local structure (e.g., patch, probabilistic, and smoothness priors) or only on global structure (e.g., low rank) do not work well. To address this issue, we propose to exploit local and global data structures simultaneously by applying a novel functional smooth PARAFAC decomposition model to tensor completion. The model is constructed as a sum of outer products of functional smooth component vectors, each represented as a linear combination of smooth basis functions. A new algorithm is developed by combining greedy deflation with smooth rank-one tensor decomposition. Extensive experiments demonstrate the high performance and advantages of our algorithm in comparison with existing state-of-the-art methods.
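To make the two ideas above concrete, the following is a minimal sketch (not the authors' implementation) of a functional smooth rank-one deflation loop in Python/NumPy: each component vector is a linear combination of smooth basis functions, and the completion is built greedily by fitting one smooth rank-one term at a time to the residual on the observed entries. It assumes a third-order tensor, a Gaussian RBF basis, and plain gradient descent on the basis weights; the function names (smooth_basis, fit_rank_one_smooth, complete_by_deflation), the basis choice, the rank, and the step size are all illustrative assumptions, not details taken from the paper.

    import numpy as np

    def smooth_basis(length, n_basis=10, width=None):
        # Gaussian RBF basis matrix B of shape (length, n_basis); its columns are
        # smooth bumps, so any factor vector B @ w is smooth by construction.
        centers = np.linspace(0, length - 1, n_basis)
        width = width if width is not None else length / n_basis
        t = np.arange(length)[:, None]
        return np.exp(-0.5 * ((t - centers[None, :]) / width) ** 2)

    def fit_rank_one_smooth(R, mask, bases, n_iter=300, lr=0.02):
        # Fit one smooth rank-one term to the observed entries of the residual R
        # by gradient descent on the basis weights (an illustrative solver choice).
        rng = np.random.default_rng(0)
        W = [0.01 * rng.standard_normal(B.shape[1]) for B in bases]
        for _ in range(n_iter):
            vecs = [B @ w for B, w in zip(bases, W)]
            err = (np.einsum('i,j,k->ijk', *vecs) - R) * mask  # error on observed entries only
            for n, (B, w) in enumerate(zip(bases, W)):
                others = [v for m, v in enumerate(vecs) if m != n]
                # gradient of 0.5*||err||^2 with respect to the n-th factor vector
                grad_vec = np.einsum('ijk,j,k->i', np.moveaxis(err, n, 0), *others)
                W[n] = w - lr * (B.T @ grad_vec)               # chain rule through the basis
        return [B @ w for B, w in zip(bases, W)]

    def complete_by_deflation(T, mask, rank=5, n_basis=10):
        # Greedy deflation: fit a smooth rank-one term to the current residual,
        # add it to the estimate, and repeat; the sum of terms is the completed tensor.
        bases = [smooth_basis(s, n_basis) for s in T.shape]
        X = np.zeros_like(T, dtype=float)
        for _ in range(rank):
            vecs = fit_rank_one_smooth((T - X) * mask, mask, bases)
            X += np.einsum('i,j,k->ijk', *vecs)
        return X

In this sketch the smoothness of each component is enforced structurally, by restricting every factor vector to the span of smooth basis columns, rather than by an explicit roughness penalty, and the outer loop corresponds to the greedy deflation strategy described in the abstract.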
