Representational strengths and limitations of transformers
Attention layers, as commonly used in transformers, form the backbone of modern deep
learning, yet there is no mathematical description of their benefits and deficiencies as …
Survey of local algorithms
J Suomela - ACM Computing Surveys (CSUR), 2013 - dl.acm.org
A local algorithm is a distributed algorithm that runs in constant time, independently of the
size of the network. Being highly scalable and fault tolerant, such algorithms are ideal in the …
Principal neighbourhood aggregation for graph nets
Abstract Graph Neural Networks (GNNs) have been shown to be effective models for
different predictive tasks on graph-structured data. Recent work on their expressive power …
Improving graph neural network expressivity via subgraph isomorphism counting
While Graph Neural Networks (GNNs) have achieved remarkable results in a variety of
applications, recent studies exposed important shortcomings in their ability to capture the …
Elements of the theory of dynamic networks
O Michail, PG Spirakis - Communications of the ACM, Feb. 2018, Vol. 61, No. 2 - dl.acm.org
What graph neural networks cannot learn: depth vs width
A Loukas - arXiv preprint arXiv:1907.03199, 2019 - arxiv.org
This paper studies the expressive power of graph neural networks falling within the
message-passing framework (GNNmp). Two results are presented. First, GNNmp are shown …
Random features strengthen graph neural networks
Graph neural networks (GNNs) are powerful machine learning models for various graph
learning tasks. Recently, the limitations of the expressive power of various GNN models …
Graph neural networks for wireless communications: From theory to practice
Deep learning-based approaches have been developed to solve challenging problems in
wireless communications, leading to promising results. Early attempts adopted neural …
[Book] Fundamentals of parameterized complexity
RG Downey, MR Fellows - 2013 - Springer
Parameterized complexity/multivariate complexity algorithmics is an exciting field of modern
algorithm design and analysis, with a broad range of theoretical and practical aspects that …
Theory of graph neural networks: Representation and learning
S Jegelka - The International Congress of Mathematicians, 2022 - ems.press
Abstract Graph Neural Networks (GNNs), neural network architectures targeted to learning
representations of graphs, have become a popular learning model for prediction tasks on …