Authors:
Sugiyama Mahito, Nakahara Hiroyuki, Tsuda Koji
Abstract
We present a novel nonnegative tensor decomposition method, called Legendre decomposition, which factorizes an input tensor into a multiplicative combination of parameters. Thanks to the well-developed theory of information geometry, the reconstructed tensor is unique and always minimizes the Kullback–Leibler (KL) divergence from the input tensor. We empirically show that Legendre decomposition can reconstruct tensors more accurately than other nonnegative tensor decomposition methods.
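The abstract's reconstruction criterion is the KL divergence between the input tensor and its decomposition. As a toy illustration of that criterion (not the paper's Legendre decomposition algorithm), the sketch below builds a simple rank-1 multiplicative reconstruction from mode-wise marginals and measures its generalized KL divergence from a normalized input tensor; the helper `kl_divergence` and the rank-1 construction are assumptions for illustration only.

```python
import numpy as np

def kl_divergence(P, Q):
    """Generalized KL divergence D(P || Q) between two nonnegative
    tensors, treating entries as (unnormalized) probabilities."""
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])) - P.sum() + Q.sum())

# Toy nonnegative input tensor, normalized so its entries sum to 1.
rng = np.random.default_rng(0)
X = rng.random((3, 4, 5))
X /= X.sum()

# Hypothetical rank-1 multiplicative reconstruction: the outer product
# of the mode-wise marginals. This is NOT Legendre decomposition; it
# merely illustrates "multiplicative combination of parameters".
a = X.sum(axis=(1, 2))
b = X.sum(axis=(0, 2))
c = X.sum(axis=(0, 1))
R = np.einsum('i,j,k->ijk', a, b, c)

err = kl_divergence(X, R)  # reconstruction error under the KL criterion
```

A decomposition method of the kind the abstract describes would search over such multiplicative reconstructions to minimize this KL objective, with uniqueness of the minimizer guaranteed by the dually flat geometry.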
Subject
Statistics, Probability and Uncertainty; Statistics and Probability; Statistical and Nonlinear Physics
References (41 articles; first 4 listed)
1. Ackley. A learning algorithm for Boltzmann machines. Cogn. Sci., 1985.
2. Amari. Natural gradient works efficiently in learning. Neural Comput., 1998.
3. Amari. Information geometry on hierarchy of probability distributions. IEEE Trans. Inf. Theory, 2001.
4. Amari. Information geometry and its applications: convex function and dually flat manifold. 2009.
Cited by: 3 articles.