A generalization of ViT/MLP-Mixer to graphs
Graph Neural Networks (GNNs) have shown great potential in the field of graph
representation learning. Standard GNNs define a local message-passing mechanism which …
How powerful are k-hop message passing graph neural networks
The most popular design paradigm for Graph Neural Networks (GNNs) is 1-hop message
passing---aggregating information from 1-hop neighbors repeatedly. However, the …
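The 1-hop message-passing paradigm described above reduces to a simple update rule: every node repeatedly replaces its representation with an aggregate of its direct neighbours' representations, and stacking k such rounds is the repetition this line of work revisits. A minimal illustrative sketch in plain Python follows (the mean aggregator and the fixed 0.5/0.5 combine step are arbitrary choices for the example, not anything taken from the paper):

def message_passing_round(features, adjacency):
    # features: dict node -> float, adjacency: dict node -> list of neighbour nodes
    updated = {}
    for node, neighbours in adjacency.items():
        incoming = [features[n] for n in neighbours]
        # aggregate the 1-hop neighbourhood; real GNN layers use learned
        # weight matrices and non-linearities instead of this fixed rule
        aggregated = sum(incoming) / len(incoming) if incoming else 0.0
        updated[node] = 0.5 * features[node] + 0.5 * aggregated
    return updated

features = {0: 1.0, 1: 0.0, 2: 0.0}
adjacency = {0: [1, 2], 1: [0], 2: [0]}
for _ in range(2):  # two rounds propagate information from 2-hop neighbours
    features = message_passing_round(features, adjacency)
print(features)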
Rethinking the expressive power of GNNs via graph biconnectivity
Designing expressive Graph Neural Networks (GNNs) is a central topic in learning graph-
structured data. While numerous approaches have been proposed to improve GNNs in …
Weisfeiler and Leman go machine learning: The story so far
In recent years, algorithms and neural architectures based on the Weisfeiler-Leman
algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a …
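The Weisfeiler-Leman heuristic referred to here is, in its 1-dimensional form, an iterative colour-refinement procedure: each node's colour is repeatedly re-hashed together with the multiset of its neighbours' colours, and differing colour histograms certify non-isomorphism (identical histograms are inconclusive). A small illustrative sketch, written here for clarity rather than taken from the survey:

def wl_refine(adjacency, rounds=3):
    # adjacency: dict node -> list of neighbours; returns the final colouring
    colours = {v: 0 for v in adjacency}  # start from a uniform colouring
    for _ in range(rounds):
        # a node's signature is its own colour plus the sorted multiset
        # of its neighbours' colours
        signatures = {
            v: (colours[v], tuple(sorted(colours[u] for u in adjacency[v])))
            for v in adjacency
        }
        relabel = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colours = {v: relabel[signatures[v]] for v in adjacency}
    return colours

triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
print(sorted(wl_refine(triangle).values()), sorted(wl_refine(path).values()))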
A complete expressiveness hierarchy for subgraph GNNs via subgraph Weisfeiler-Lehman tests
Recently, subgraph GNNs have emerged as an important direction for developing
expressive graph neural networks (GNNs). While numerous architectures have been …
Facilitating graph neural networks with random walk on simplicial complexes
Node-level random walk has been widely used to improve Graph Neural Networks.
However, there is limited attention to random walk on edges and, more generally, on $k$ …
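For contrast with the edge- and simplex-level walks this paper studies, the familiar node-level random walk amounts to repeatedly hopping to a uniformly chosen neighbour. A minimal illustrative sketch (not the paper's method):

import random

def random_walk(adjacency, start, length, seed=0):
    # adjacency: dict node -> list of neighbours; returns the visited node sequence
    rng = random.Random(seed)
    walk = [start]
    for _ in range(length):
        neighbours = adjacency[walk[-1]]
        if not neighbours:  # dead end: stop early
            break
        walk.append(rng.choice(neighbours))
    return walk

adjacency = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(random_walk(adjacency, start=0, length=5))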
Ordered subgraph aggregation networks
Numerous subgraph-enhanced graph neural networks (GNNs) have emerged recently,
provably boosting the expressive power of standard (message-passing) GNNs. However …
Equivariant polynomials for graph neural networks
Graph Neural Networks (GNN) are inherently limited in their expressive power.
Recent seminal works (Xu et al., 2019; Morris et al., 2019b) introduced the Weisfeiler …
Universal prompt tuning for graph neural networks
In recent years, prompt tuning has sparked a research surge in adapting pre-trained models.
Unlike the unified pre-training strategy employed in the language field, the graph field …
Rethinking tokenizer and decoder in masked graph modeling for molecules
Masked graph modeling excels in the self-supervised representation learning of molecular
graphs. Scrutinizing previous studies, we can reveal a common scheme consisting of three …