Abstract
We propose and investigate two new methods to approximate f(A)b for large, sparse, Hermitian matrices A. Computations of this form play an important role in numerous signal processing and machine learning tasks. The main idea behind both methods is to first estimate the spectral density of A, and then find polynomials of a fixed order that better approximate the function f on areas of the spectrum with a higher density of eigenvalues. Compared to state-of-the-art methods such as the Lanczos method and truncated Chebyshev expansion, the proposed methods tend to provide more accurate approximations of f(A)b at lower polynomial orders, and for matrices A with a large number of distinct interior eigenvalues and a small spectral width. We also explore the application of these techniques to (i) fast estimation of the norms of localized graph spectral filter dictionary atoms, and (ii) fast filtering of time-vertex signals.
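Below is a minimal illustrative sketch of the general idea described in the abstract, not the paper's exact algorithms: given samples that reflect where the eigenvalues of A are concentrated, fit a fixed-order Chebyshev-basis polynomial to f by least squares at those samples, so that the fit is most accurate where the spectral density is highest, and then apply the resulting polynomial to b using only matrix-vector products. The function name and its interface are hypothetical, and for simplicity the eigenvalue samples are obtained by diagonalizing a small test matrix rather than by a spectral density estimator.

```python
import numpy as np
from numpy.polynomial import chebyshev as C


def spectrum_adapted_apply(A, b, f, order=10, eig_samples=None):
    """Approximate f(A) @ b with a degree-`order` polynomial in the Chebyshev
    basis, fitted to f at `eig_samples` (assumed to reflect A's spectral
    density). Hypothetical helper for illustration only."""
    if eig_samples is None:
        # Assumption for this sketch: A is small enough to diagonalize.
        # In practice a spectral density estimator would be used instead.
        eig_samples = np.linalg.eigvalsh(A)
    lmin, lmax = eig_samples.min(), eig_samples.max()
    # Affine map of the spectral interval [lmin, lmax] onto [-1, 1].
    scale = 2.0 / (lmax - lmin)
    shift = (lmax + lmin) / (lmax - lmin)
    t = scale * eig_samples - shift
    # Least-squares Chebyshev fit at the sampled eigenvalue locations:
    # accuracy concentrates where eigenvalues are dense.
    c = C.chebfit(t, f(eig_samples), deg=order)

    def Bv(v):
        # Apply B = scale * A - shift * I, i.e., A mapped to [-1, 1].
        return scale * (A @ v) - shift * v

    # Three-term Chebyshev recurrence using only matrix-vector products.
    y_prev, y_curr = b, Bv(b)
    out = c[0] * y_prev + c[1] * y_curr
    for k in range(2, order + 1):
        y_prev, y_curr = y_curr, 2.0 * Bv(y_curr) - y_prev
        out = out + c[k] * y_curr
    return out


# Example: compare against the exact f(A)b for a small symmetric test matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((300, 300))
A = (M + M.T) / (2 * np.sqrt(300))   # eigenvalues roughly in [-1.4, 1.4]
b = rng.standard_normal(300)
approx = spectrum_adapted_apply(A, b, np.exp, order=10)
w, V = np.linalg.eigh(A)
exact = V @ (np.exp(w) * (V.T @ b))
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```

The design choice this sketch highlights is the fitting measure: a standard truncated Chebyshev expansion weights the whole interval [lmin, lmax] uniformly (up to the Chebyshev weight), whereas fitting at points drawn from the estimated spectral density spends the fixed polynomial budget where eigenvalues actually lie.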
Subject
Computational Mathematics, Computational Theory and Mathematics, Numerical Analysis, Theoretical Computer Science