Authors
Sami Abu-El-Haija, Nazanin Alipourfard, Hrayr Harutyunyan, Amol Kapoor, Bryan Perozzi
Publication date
2018/12
Journal
Proceedings of the 32nd Conference on Neural Information Processing Systems (NIPS 2018)
Description
Recent methods generalize convolutional layers from Euclidean domains to graph-structured data by approximating the eigenbasis of the graph Laplacian. The computationally efficient and broadly used Graph ConvNet of Kipf & Welling [11] over-simplifies this approximation, effectively rendering graph convolution as a neighborhood-averaging operator. This simplification restricts the model from learning delta operators, the very premise of the graph Laplacian. In this work, we propose a new Graph Convolutional layer which mixes multiple powers of the adjacency matrix, allowing it to learn delta operators. Our layer exhibits the same memory footprint and computational complexity as a GCN. We illustrate the strength of our proposed layer on both synthetic graph datasets and on several real-world citation graphs, setting a new state of the art on Pubmed.
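The description above explains the core idea: rather than averaging only over immediate neighbors, the layer propagates features through several powers of the normalized adjacency matrix and mixes the results, which lets it express difference (delta) operators between hops. Below is a minimal sketch of that idea, assuming plain NumPy; the function names (`mixhop_layer`, `normalize_adjacency`), the choice of powers, and the per-power weight matrices are illustrative placeholders, not the authors' released implementation.

```python
# Sketch (assumed, not the paper's code) of a graph convolutional layer that
# mixes several powers of a symmetrically normalized adjacency matrix A_hat:
# power j propagates node features j hops, and the per-power outputs are
# transformed and concatenated.
import numpy as np

def normalize_adjacency(A):
    """Symmetrically normalize A with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def mixhop_layer(A_hat, X, weights, powers=(0, 1, 2)):
    """Propagate X through each requested power of A_hat, apply a per-power
    linear map with ReLU, and concatenate the results along the feature axis.

    weights: dict mapping power j -> weight matrix of shape (d_in, d_out_j).
    """
    outputs = []
    H = X
    for j in range(max(powers) + 1):
        if j > 0:
            H = A_hat @ H                      # one more hop of propagation
        if j in powers:
            outputs.append(np.maximum(H @ weights[j], 0.0))
    return np.concatenate(outputs, axis=1)

# Toy usage: a 4-node path graph with 3-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
W = {j: rng.normal(size=(3, 2)) for j in (0, 1, 2)}
H = mixhop_layer(normalize_adjacency(A), X, W)
print(H.shape)  # (4, 6): three powers, two output features each
```

Because each power reuses the previous propagation (one sparse matrix-vector product per hop) and weights are applied per power, a layer like this keeps memory and compute on the same order as a standard GCN layer, which is the efficiency claim made in the description.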
Total citations
[Yearly citation histogram, 2019–2024; per-year counts not recoverable]
Scholar articles
S Abu-El-Haija, N Alipourfard, H Harutyunyan… - Proceedings of the 32nd Conference on Neural …, 2018