Affiliation:
1. Marchuk Institute of Numerical Mathematics, Russian Academy of Sciences, Moscow, Russia
Abstract
The projected gradient method for matrix completion is generalized to the higher-dimensional case of low-rank Tucker tensors. It is shown that rearranging the order of operations in the common projected gradient approach yields a complexity improvement. Even better complexity can be obtained by replacing the completion operator with a general operator that satisfies the restricted isometry property; however, such a replacement transforms the completion algorithm into an approximation algorithm.
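To illustrate the baseline that the abstract generalizes, the following is a minimal sketch of projected gradient matrix completion (the two-dimensional case): a gradient step on the observed entries followed by projection onto the set of rank-r matrices via truncated SVD. The function name, step size, and iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def projected_gradient_completion(M_obs, mask, rank, step=1.0, iters=300):
    """Sketch of projected gradient matrix completion.

    M_obs : observed matrix with unobserved entries set to zero
    mask  : 0/1 array marking observed entries
    rank  : target rank of the completed matrix
    """
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        # Gradient step: correct the residual on observed entries only.
        X = X + step * mask * (M_obs - X)
        # Projection: keep the leading `rank` singular triplets.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return X
```

The complexity point in the abstract concerns exactly this loop: the order in which the sampling (completion) operator and the low-rank projection are applied can be rearranged, and in the Tucker-tensor setting the projection becomes a higher-order SVD rather than a matrix SVD.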
References (22 articles)
1. D. Achlioptas, Database-friendly random projections: Johnson–Lindenstrauss with binary coins. Journal of Computer and System Sciences 66 (2003), No. 4, 671–687.
2. A. Ahmed and J. Romberg, Compressive multiplexing of correlated signals. IEEE Transactions on Information Theory 61 (2014), No. 1, 479–498.
3. N. Ailon and B. Chazelle, The fast Johnson–Lindenstrauss transform and approximate nearest neighbors. SIAM Journal on Computing 39 (2009), No. 1, 302–322.
4. A. L. de Almeida, Tensor modeling and signal processing for wireless communication systems. PhD thesis, Université de Nice Sophia Antipolis, 2007.
5. A. Argyriou, T. Evgeniou, and M. Pontil, Convex multi-task feature learning. Machine Learning 73 (2008), 243–272.