Linkless link prediction via relational distillation
Graph Neural Networks (GNNs) have shown exceptional performance in the task of
link prediction. Despite their effectiveness, the high latency brought by non-trivial …
Graph condensation for inductive node representation learning
Graph neural networks (GNNs) encounter significant computational challenges when
handling large-scale graphs, which severely restricts their efficacy across diverse …
Knowledge distillation on graphs: A survey
Graph Neural Networks (GNNs) have attracted tremendous attention by demonstrating their
capability to handle graph data. However, they are difficult to deploy in resource …
Do we really need graph neural networks for traffic forecasting?
Spatio-temporal graph neural networks (STGNN) have become the most popular solution to
traffic forecasting. While successful, they rely on the message passing scheme of GNNs to …
UGMAE: A unified framework for graph masked autoencoders
Generative self-supervised learning on graphs, particularly graph masked autoencoders,
has emerged as a popular learning paradigm and demonstrated its efficacy in handling non …
Fine-grained learning behavior-oriented knowledge distillation for graph neural networks
K Liu, Z Huang, CD Wang, B Gao… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Knowledge distillation (KD), as an effective compression technology, is used to reduce the
resource consumption of graph neural networks (GNNs) and facilitate their deployment on …
Edge-free but structure-aware: Prototype-guided knowledge distillation from gnns to mlps
Distilling high-accuracy Graph Neural Networks (GNNs) to low-latency multilayer
perceptrons (MLPs) on graph tasks has become a hot research topic. However, MLPs rely …
Classifying Nodes in Graphs without GNNs
Graph neural networks (GNNs) are the dominant paradigm for classifying nodes in a graph,
but they have several undesirable attributes stemming from their message passing …
Graph Knowledge Distillation to Mixture of Experts
P Rumiantsev, M Coates - arXiv preprint arXiv:2406.11919, 2024 - arxiv.org
In terms of accuracy, Graph Neural Networks (GNNs) are the best architectural choice for the
node classification task. Their drawback in real-world deployment is the latency that …
Enhancing the Resilience of Graph Neural Networks to Topological Perturbations in Sparse Graphs
S He, J Zhuang, D Wang, L Peng, J Song - arXiv preprint arXiv …, 2024 - arxiv.org
Graph neural networks (GNNs) have been extensively employed in node classification.
Nevertheless, recent studies indicate that GNNs are vulnerable to topological perturbations …