Linkless link prediction via relational distillation

Z Guo, W Shiao, S Zhang, Y Liu… - International …, 2023 - proceedings.mlr.press
Abstract Graph Neural Networks (GNNs) have shown exceptional performance in the task of
link prediction. Despite their effectiveness, the high latency brought by non-trivial …

Graph condensation for inductive node representation learning

X Gao, T Chen, Y Zang, W Zhang… - 2024 IEEE 40th …, 2024 - ieeexplore.ieee.org
Graph neural networks (GNNs) encounter significant computational challenges when
handling large-scale graphs, which severely restricts their efficacy across diverse …

Knowledge distillation on graphs: A survey

Y Tian, S Pei, X Zhang, C Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
Graph Neural Networks (GNNs) have attracted tremendous attention by demonstrating their
capability to handle graph data. However, they are difficult to deploy in resource …

Do we really need graph neural networks for traffic forecasting?

X Liu, Y Liang, C Huang, H Hu, Y Cao, B Hooi… - arXiv preprint arXiv …, 2023 - arxiv.org
Spatio-temporal graph neural networks (STGNNs) have become the most popular solution to
traffic forecasting. While successful, they rely on the message passing scheme of GNNs to …

UGMAE: A unified framework for graph masked autoencoders

Y Tian, C Zhang, Z Kou, Z Liu, X Zhang… - arXiv preprint arXiv …, 2024 - arxiv.org
Generative self-supervised learning on graphs, particularly graph masked autoencoders,
has emerged as a popular learning paradigm and demonstrated its efficacy in handling non …

Fine-grained learning behavior-oriented knowledge distillation for graph neural networks

K Liu, Z Huang, CD Wang, B Gao… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Knowledge distillation (KD), as an effective compression technology, is used to reduce the
resource consumption of graph neural networks (GNNs) and facilitate their deployment on …

Edge-free but structure-aware: Prototype-guided knowledge distillation from GNNs to MLPs

T Wu, Z Zhao, J Wang, X Bai, L Wang, N Wong… - arXiv preprint arXiv …, 2023 - arxiv.org
Distilling high-accuracy Graph Neural Networks (GNNs) to low-latency multilayer
perceptrons (MLPs) on graph tasks has become a hot research topic. However, MLPs rely …

Classifying Nodes in Graphs without GNNs

D Winter, N Cohen, Y Hoshen - arXiv preprint arXiv:2402.05934, 2024 - arxiv.org
Graph neural networks (GNNs) are the dominant paradigm for classifying nodes in a graph,
but they have several undesirable attributes stemming from their message passing …

Graph Knowledge Distillation to Mixture of Experts

P Rumiantsev, M Coates - arXiv preprint arXiv:2406.11919, 2024 - arxiv.org
In terms of accuracy, Graph Neural Networks (GNNs) are the best architectural choice for the
node classification task. Their drawback in real-world deployment is the latency that …

Enhancing the Resilience of Graph Neural Networks to Topological Perturbations in Sparse Graphs

S He, J Zhuang, D Wang, L Peng, J Song - arXiv preprint arXiv …, 2024 - arxiv.org
Graph neural networks (GNNs) have been extensively employed in node classification.
Nevertheless, recent studies indicate that GNNs are vulnerable to topological perturbations …