Graph neural networks

G Corso, H Stark, S Jegelka, T Jaakkola… - Nature Reviews …, 2024 - nature.com
Graphs are flexible mathematical objects that can represent many entities and knowledge
from different domains, including in the life sciences. Graph neural networks (GNNs) are …
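
For orientation, here is a minimal sketch of the message-passing idea that GNNs build on: each node averages its neighbours' features and applies a learned linear map with a nonlinearity. This is a generic illustration, not the primer's formulation; the names gnn_layer, adj, feats and weight are made up for the example.

```python
import numpy as np

def gnn_layer(adj: np.ndarray, feats: np.ndarray, weight: np.ndarray) -> np.ndarray:
    """One round of mean-neighbourhood aggregation followed by a ReLU."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)  # guard against isolated nodes
    aggregated = adj @ feats / deg                    # average over neighbours
    return np.maximum(aggregated @ weight, 0.0)       # linear map + ReLU

# Toy usage: a 4-node path graph with 3-dimensional node features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 3))
weight = rng.normal(size=(3, 3))
print(gnn_layer(adj, feats, weight).shape)  # (4, 3)
```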

A survey of graph meets large language model: Progress and future directions

Y Li, Z Li, P Wang, J Li, X Sun, H Cheng… - arXiv preprint arXiv …, 2023 - arxiv.org
Graphs play a significant role in representing and analyzing complex relationships in real-world
applications such as citation networks, social networks, and biological data. Recently …

Exploring the potential of large language models (LLMs) in learning on graphs

Z Chen, H Mao, H Li, W Jin, H Wen, X Wei… - ACM SIGKDD …, 2024 - dl.acm.org
Learning on Graphs has attracted immense attention due to its wide real-world applications.
The most popular pipeline for learning on graphs with textual node attributes primarily relies …
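
The "most popular pipeline" mentioned in the snippet is cut off here; as a hedged illustration only, a common setup is to encode each node's text attribute into a feature vector and feed those vectors to a GNN. The toy hashed bag-of-words below stands in for a real text encoder (in practice a pretrained language model); hashed_bow and the sample texts are invented for the example.

```python
import numpy as np

def hashed_bow(texts, dim=16):
    """Toy stand-in for a text encoder: hash tokens into a fixed-size vector."""
    feats = np.zeros((len(texts), dim))
    for i, text in enumerate(texts):
        for token in text.lower().split():
            feats[i, hash(token) % dim] += 1.0
    return feats

texts = ["graph neural networks survey", "large language models on graphs"]
node_feats = hashed_bow(texts)  # in practice: embeddings from a language model
print(node_feats.shape)         # (2, 16) -> used as GNN input node features
```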

Recipe for a general, powerful, scalable graph transformer

L Rampášek, M Galkin, VP Dwivedi… - Advances in …, 2022 - proceedings.neurips.cc
We propose a recipe for how to build a general, powerful, scalable (GPS) graph Transformer
with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …
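
At a very high level, the "recipe" combines a local message-passing branch with a global attention branch in every layer. The sketch below shows that combination in plain NumPy; it uses quadratic softmax attention and a simple sum of the two branches, whereas the paper targets linear-complexity attention, so treat this only as a schematic and not the GPS implementation. All function and variable names are made up.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def gps_style_layer(adj, feats, w_local, w_q, w_k, w_v):
    # Local branch: mean aggregation over graph neighbours.
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    local = np.maximum((adj @ feats / deg) @ w_local, 0.0)
    # Global branch: full self-attention over all nodes, ignoring edges.
    q, k, v = feats @ w_q, feats @ w_k, feats @ w_v
    attn = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    # Combine the two branches; a plain sum is one simple choice.
    return local + attn @ v

# Toy usage on a random 6-node graph with 8-dimensional features.
rng = np.random.default_rng(0)
adj = np.triu((rng.random((6, 6)) < 0.3).astype(float), 1)
adj = adj + adj.T
feats = rng.normal(size=(6, 8))
ws = [rng.normal(size=(8, 8)) for _ in range(4)]
print(gps_style_layer(adj, feats, *ws).shape)  # (6, 8)
```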

Graph inductive biases in transformers without message passing

L Ma, C Lin, D Lim, A Romero-Soriano… - International …, 2023 - proceedings.mlr.press
Transformers for graph data are increasingly widely studied and successful in numerous
learning tasks. Graph inductive biases are crucial for Graph Transformers, and previous …
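
One widely used way to give a transformer graph inductive bias without message passing is to attach structural positional encodings to the node features. The snippet below computes a simple random-walk return-probability encoding as a generic illustration; it is not the specific encoding proposed in the paper above, and random_walk_pe is an invented name.

```python
import numpy as np

def random_walk_pe(adj: np.ndarray, k_max: int) -> np.ndarray:
    """(n, k_max) matrix whose k-th column holds each node's probability of
    returning to itself after k steps of a random walk."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    walk = adj / deg                  # row-stochastic random-walk matrix
    power = np.eye(adj.shape[0])
    cols = []
    for _ in range(k_max):
        power = power @ walk
        cols.append(np.diag(power))   # return probabilities at this step
    return np.stack(cols, axis=1)

# These columns are typically concatenated to the node features that the
# transformer attends over.
```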

DRew: Dynamically rewired message passing with delay

B Gutteridge, X Dong, MM Bronstein… - International …, 2023 - proceedings.mlr.press
Message passing neural networks (MPNNs) have been shown to suffer from the
phenomenon of over-squashing that causes poor performance for tasks relying on long …
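
As a rough, hedged sketch of the idea in the title, the code below lets layer t aggregate not only from 1-hop neighbours but also from nodes k hops away (k <= t), and delays the k-hop message by reading an older state of the sender (the further away, the older). This is an illustration of the concept only, not the paper's architecture; every name here (hop_masks, drew_style_forward) is invented.

```python
import numpy as np

def hop_masks(adj, max_hops):
    """masks[k-1][i, j] is True iff the shortest-path distance between i and j is k."""
    n = adj.shape[0]
    reach = np.eye(n) > 0             # nodes already within a shorter distance
    power = np.eye(n)
    masks = []
    for _ in range(max_hops):
        power = power @ adj
        new = (power > 0) & ~reach
        masks.append(new)
        reach |= new
    return masks

def drew_style_forward(adj, feats, weights):
    """weights[t] mixes features at layer t; `history` keeps all past states."""
    history = [feats]
    masks = hop_masks(adj, len(weights))
    for t, w in enumerate(weights, start=1):
        h = history[-1] @ w
        for k in range(1, t + 1):
            m = masks[k - 1].astype(float)
            deg = m.sum(axis=1, keepdims=True).clip(min=1)
            # Delayed message: k-hop neighbours contribute an older state
            # of theirs (history index t - k).
            h = h + (m @ history[t - k]) @ w / deg
        history.append(np.maximum(h, 0.0))
    return history[-1]

# Toy usage: a 6-node path graph and 3 layers, so by the last layer each node
# also hears from nodes up to 3 hops away.
rng = np.random.default_rng(0)
adj = np.diag(np.ones(5), 1) + np.diag(np.ones(5), -1)
feats = rng.normal(size=(6, 4))
weights = [0.1 * rng.normal(size=(4, 4)) for _ in range(3)]
print(drew_style_forward(adj, feats, weights).shape)  # (6, 4)
```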

Exphormer: Sparse transformers for graphs

H Shirzad, A Velingker… - International …, 2023 - proceedings.mlr.press
Graph transformers have emerged as a promising architecture for a variety of graph learning
and representation tasks. Despite their successes, though, it remains challenging to scale …
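
In spirit, the sparsification means each node attends only over a small, fixed set of edges (its graph neighbours plus a few extra "expander-like" connections and a self-loop) rather than over all node pairs. The snippet below is a schematic of that restricted attention, not the paper's implementation; sparse_attention and the specific edge choices are made up for illustration.

```python
import numpy as np

def sparse_attention(feats, edges, w_q, w_k, w_v):
    """Attention where node `dst` may only attend to nodes `src` listed in `edges`."""
    q, k, v = feats @ w_q, feats @ w_k, feats @ w_v
    n, d = q.shape
    scores = np.full((n, n), -np.inf)
    src, dst = edges[:, 0], edges[:, 1]
    scores[dst, src] = (q[dst] * k[src]).sum(axis=-1) / np.sqrt(d)
    # Softmax restricted to each node's allowed sources (every row has a
    # finite entry because each node gets a self-loop below).
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights = weights / weights.sum(axis=1, keepdims=True)
    return weights @ v

# Toy usage: a 5-node cycle plus self-loops and two extra long-range edges
# standing in for expander edges.
rng = np.random.default_rng(0)
n, d = 5, 8
feats = rng.normal(size=(n, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))
edges = np.array([(i, (i + 1) % n) for i in range(n)]   # cycle edges
                 + [(i, i) for i in range(n)]            # self-loops
                 + [(0, 2), (1, 4)])                     # extra shortcuts
print(sparse_attention(feats, edges, w_q, w_k, w_v).shape)  # (5, 8)
```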

A generalization of ViT/MLP-Mixer to graphs

X He, B Hooi, T Laurent, A Perold… - International …, 2023 - proceedings.mlr.press
Graph Neural Networks (GNNs) have shown great potential in the field of graph
representation learning. Standard GNNs define a local message-passing mechanism which …

Graph Mamba: Towards learning on graphs with state space models

A Behrouz, F Hashemi - Proceedings of the 30th ACM SIGKDD …, 2024 - dl.acm.org
Graph Neural Networks (GNNs) have shown promising potential in graph representation
learning. The majority of GNNs define a local message-passing mechanism, propagating …

Understanding oversquashing in GNNs through the lens of effective resistance

M Black, Z Wan, A Nayyeri… - … Conference on Machine …, 2023 - proceedings.mlr.press
Message passing graph neural networks (GNNs) are a popular learning architecture for
graph-structured data. However, one problem GNNs experience is oversquashing, where a …
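
For reference, effective resistance treats the graph as an electrical network: the resistance between two nodes can be read off the pseudoinverse of the graph Laplacian, and large resistance flags node pairs between which information gets squashed. The snippet below only computes the quantity itself on a toy graph; it is not the paper's analysis or any rewiring method it may propose.

```python
import numpy as np

def effective_resistance(adj: np.ndarray) -> np.ndarray:
    """Pairwise effective resistances via the pseudoinverse of the Laplacian:
    R_uv = L+_uu + L+_vv - 2 * L+_uv."""
    laplacian = np.diag(adj.sum(axis=1)) - adj
    lp = np.linalg.pinv(laplacian)
    d = np.diag(lp)
    return d[:, None] + d[None, :] - 2 * lp

# On a 4-node path the endpoint-to-endpoint resistance equals the path
# length, reflecting how hard it is to communicate across long chains.
path = np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
print(round(effective_resistance(path)[0, 3], 6))  # 3.0
```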