Elastic graph neural networks

X Liu, W Jin, Y Ma, Y Li, H Liu, Y Wang… - International …, 2021 - proceedings.mlr.press
While many existing graph neural networks (GNNs) have been proven to perform $\ell_2$-based graph smoothing that enforces smoothness globally, in this work we aim to further …
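For orientation (this is standard notation, not part of the listed snippet), $\ell_2$-based graph smoothing is typically posed as minimizing an energy of the form below, where $X$ denotes the input node features, $F$ the smoothed representations, $L$ a (normalized) graph Laplacian, and $\lambda$ a trade-off weight; the "elastic" variant this entry refers to is generally described as adding a sparsity-promoting $\ell_1$ penalty on feature differences across edges, with $\Delta$ an incidence matrix:

$$\min_{F}\; \|F - X\|_F^2 + \lambda\,\operatorname{tr}\!\left(F^\top L F\right), \qquad \text{elastic: } +\; \lambda_1 \|\Delta F\|_1 .$$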

Graph neural networks with adaptive residual

X Liu, J Ding, W Jin, H Xu, Y Ma… - Advances in Neural …, 2021 - proceedings.neurips.cc
Graph neural networks (GNNs) have shown their power in graph representation learning for
numerous tasks. In this work, we discover an interesting phenomenon that although residual …

Graph neural networks inspired by classical iterative algorithms

Y Yang, T Liu, Y Wang, J Zhou, Q Gan… - International …, 2021 - proceedings.mlr.press
Despite the recent success of graph neural networks (GNN), common architectures often
exhibit significant limitations, including sensitivity to oversmoothing, long-range …

LazyGNN: Large-scale graph neural networks via lazy propagation

R Xue, H Han, MA Torkamani… - … on Machine Learning, 2023 - proceedings.mlr.press
Recent works have demonstrated the benefits of capturing long-distance dependencies in
graphs with deeper graph neural networks (GNNs). But deeper GNNs suffer from the long …

From hypergraph energy functions to hypergraph neural networks

Y Wang, Q Gan, X Qiu, X Huang… - … on Machine Learning, 2023 - proceedings.mlr.press
Hypergraphs are a powerful abstraction for representing higher-order interactions between
entities of interest. To exploit these relationships in making downstream predictions, a …

MuseGNN: Interpretable and convergent graph neural network layers at scale

H Jiang, R Liu, X Yan, Z Cai, M Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Among the many variants of graph neural network (GNN) architectures capable of modeling
data with cross-instance relations, an important subclass involves layers designed such that …

Implicit vs unfolded graph neural networks

Y Yang, T Liu, Y Wang, Z Huang, D Wipf - arXiv preprint arXiv:2111.06592, 2021 - arxiv.org
It has been observed that graph neural networks (GNN) sometimes struggle to maintain a
healthy balance between the efficient modeling of long-range dependencies across nodes …

Does your graph need a confidence boost? Convergent boosted smoothing on graphs with tabular node features

J Chen, J Mueller, VN Ioannidis, S Adeshina, Y Wang… - 2021 - amazon.science
For supervised learning with tabular data, decision tree ensembles produced via boosting
techniques generally dominate real-world applications involving iid training/test sets …

Efficient link prediction via GNN layers induced by negative sampling

Y Wang, X Hu, Q Gan, X Huang, X Qiu… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Graph neural networks (GNNs) for link prediction can loosely be divided into two broad
categories. First, node-wise architectures pre-compute individual embeddings for each node …

Graph Machine Learning through the Lens of Bilevel Optimization

AY Zheng, T He, Y Qiu, M Wang… - … Conference on Artificial …, 2024 - proceedings.mlr.press
Bilevel optimization refers to scenarios whereby the optimal solution of a lower-level energy
function serves as input features to an upper-level objective of interest. These optimal …
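As a sketch of the setup this entry describes (generic notation, not taken from the snippet), a bilevel formulation couples two problems: a lower-level energy whose minimizer supplies the features consumed by an upper-level training objective,

$$\min_{\theta}\; \mathcal{L}\big(Y^{*}(\theta)\big) \quad \text{s.t.} \quad Y^{*}(\theta) \in \arg\min_{Y}\; E(Y;\theta),$$

with $\theta$ the learnable parameters shared across both levels.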