Abstract
We define a novel type of ensemble graph convolutional network (GCN) model. Using optimized linear projection operators to map between spatial scales of a graph, this ensemble model learns to aggregate information from each scale for its final prediction. We calculate these linear projection operators as the infima of an objective function relating the structure matrices used for each GCN. Equipped with these projections, our model (a Graph Prolongation-Convolutional Network) outperforms other GCN ensemble models at predicting the potential energy of monomer subunits in a coarse-grained mechanochemical simulation of microtubule bending. We demonstrate these performance gains by measuring both an estimate of the floating point operations (FLOPs) spent to train each model and wall-clock time. Because our model learns at multiple scales, it can be trained at each scale according to a predetermined schedule of coarse vs. fine training. We examine several such schedules adapted from the algebraic multigrid literature and quantify the computational benefit of each. We also compare this model to another model that features an optimized coarsening of the input graph. Finally, we derive backpropagation rules for the gradient of our network's output with respect to its input, and discuss how our method may be extended to very large graphs.
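To make the architecture described in the abstract concrete, the following is a minimal NumPy sketch of a two-scale GCN ensemble: a prolongation matrix P maps the coarse-scale GCN's per-node output onto the fine graph, and the final prediction aggregates both scales. All names and choices here (gcn_layer, two_scale_forward, the random structure matrices, the averaging restriction, and the one-hot P) are illustrative assumptions, not the authors' implementation; in the actual model the projection operators are themselves optimized against the objective function relating the structure matrices.

import numpy as np

def gcn_layer(A, X, W):
    # One graph-convolution step: add self-loops, symmetrically normalize A,
    # then apply a ReLU-activated linear transform of the node features X.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

def two_scale_forward(A_fine, A_coarse, P, X_fine, params):
    # Ensemble prediction: a fine-scale GCN plus a coarse-scale GCN whose
    # output is prolonged back to the fine graph by P (shape n_fine x n_coarse);
    # the two per-node predictions are then summed.
    counts = np.maximum(P.sum(axis=0, keepdims=True).T, 1e-9)
    X_coarse = (P.T @ X_fine) / counts                     # crude feature restriction (an assumption)
    h_fine = gcn_layer(A_fine, X_fine, params["W_fine"])
    h_coarse = gcn_layer(A_coarse, X_coarse, params["W_coarse"])
    y_fine = h_fine @ params["w_out_fine"]
    y_coarse = (P @ h_coarse) @ params["w_out_coarse"]     # prolong coarse prediction
    return y_fine + y_coarse                               # aggregate the two scales

# Tiny usage example with random data: 8 fine nodes pooled into 3 coarse nodes.
rng = np.random.default_rng(0)
n_fine, n_coarse, n_feat, n_hidden = 8, 3, 4, 5
A_fine = (rng.random((n_fine, n_fine)) < 0.3).astype(float)
A_fine = np.maximum(A_fine, A_fine.T)
A_coarse = (rng.random((n_coarse, n_coarse)) < 0.6).astype(float)
A_coarse = np.maximum(A_coarse, A_coarse.T)
P = np.eye(n_coarse)[rng.integers(0, n_coarse, n_fine)]    # one-hot assignment of fine to coarse nodes
params = {
    "W_fine": rng.normal(size=(n_feat, n_hidden)),
    "W_coarse": rng.normal(size=(n_feat, n_hidden)),
    "w_out_fine": rng.normal(size=(n_hidden, 1)),
    "w_out_coarse": rng.normal(size=(n_hidden, 1)),
}
X_fine = rng.normal(size=(n_fine, n_feat))
print(two_scale_forward(A_fine, A_coarse, P, X_fine, params).shape)   # -> (8, 1)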
Funder
National Science Foundation
National Institute on Aging
Human Frontier Science Program
Subject
Artificial Intelligence, Human-Computer Interaction, Software
Cited by
3 articles.