Distributed graph neural network training: A survey
Graph neural networks (GNNs) are a class of deep learning models that are trained on
graphs and have been successfully applied in various domains. Despite the effectiveness of …
Federated learning over images: vertical decompositions and pre-trained backbones are difficult to beat
We carefully evaluate a number of algorithms for learning in a federated environment, and
test their utility for a variety of image classification tasks. We consider many issues that have …
Efficient and light-weight federated learning via asynchronous distributed dropout
C Dun, M Hipolito, C Jermaine… - International …, 2023 - proceedings.mlr.press
Asynchronous learning protocols have regained attention lately, especially in the Federated
Learning (FL) setup, where slower clients can severely impede the learning process. Herein …
Submodel partitioning in hierarchical federated learning: Algorithm design and convergence analysis
Hierarchical federated learning (HFL) has demonstrated promising scalability advantages
over the traditional "star-topology" architecture-based federated learning (FL). However, HFL …
On the convergence of shallow neural network training with randomly masked neurons
F Liao, A Kyrillidis - arXiv preprint arXiv:2112.02668, 2021 - arxiv.org
With the goal of training all the parameters of a neural network, we study why and when
one can achieve this by iteratively creating, training, and combining randomly selected …
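The idea of iteratively creating, training, and combining randomly selected subnetworks can be illustrated with a small sketch. The following is NOT the paper's code, only a minimal NumPy toy: each round samples a random mask over the hidden neurons of a shallow ReLU network, takes one full-batch gradient step through that masked subnetwork (with inverted-dropout scaling, an assumption here so the unmasked network matches the trained expectation), and writes the update back into the shared parameters. Names such as `keep_frac` and the toy regression target are illustrative choices, not from the paper.

```python
# Illustrative sketch only: training a shallow network via randomly
# masked subnetworks whose updates are merged into one parameter set.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task and a one-hidden-layer ReLU network.
d_in, d_hidden, n = 5, 16, 200
X = rng.normal(size=(n, d_in))
y = np.sin(X @ rng.normal(size=d_in))               # arbitrary smooth target
W1 = rng.normal(scale=0.5, size=(d_hidden, d_in))   # hidden-layer weights
w2 = rng.normal(scale=0.5, size=d_hidden)           # output weights

def full_loss(W1, w2):
    """Mean squared error of the full (unmasked) network."""
    h = np.maximum(X @ W1.T, 0.0)
    return np.mean((h @ w2 - y) ** 2)

lr, keep_frac = 0.01, 0.5
initial_loss = full_loss(W1, w2)

for _ in range(500):
    # Sample a random subnetwork; inverted-dropout scaling keeps the
    # expected masked prediction equal to the full network's prediction.
    m = (rng.random(d_hidden) < keep_frac).astype(float) / keep_frac
    z = X @ W1.T                          # pre-activations, shape (n, d_hidden)
    h = np.maximum(z, 0.0) * m            # forward pass through the subnetwork
    g = (2.0 / n) * (h @ w2 - y)          # dLoss/dPrediction
    grad_w2 = h.T @ g                     # zero for masked-out neurons
    dz = np.outer(g, w2 * m) * (z > 0.0)  # backprop through mask and ReLU
    W1 -= lr * (dz.T @ X)                 # merge subnetwork update back
    w2 -= lr * grad_w2

final_loss = full_loss(W1, w2)
```

Because only the sampled neurons receive gradients in a given round, each round touches a strict subset of the parameters, yet over many rounds the combined model improves on the full objective.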
FedP3: Federated Personalized and Privacy-friendly Network Pruning under Model Heterogeneity
The interest in federated learning has surged in recent research due to its unique ability to
train a global model using privacy-secured information held locally on each client. This …
Towards a better theoretical understanding of independent subnetwork training
E Shulgin, P Richtárik - arXiv preprint arXiv:2306.16484, 2023 - arxiv.org
Modern advancements in large-scale machine learning would be impossible without the
paradigm of data-parallel distributed computing. Since distributed computing with large …
Sugar: Efficient subgraph-level training via resource-aware graph partitioning
Graph Neural Networks (GNNs) have demonstrated great potential in a variety of graph-based applications, such as recommender systems, drug discovery, and object recognition …
Leveraging Sparse Input and Sparse Models: Efficient Distributed Learning in Resource-Constrained Environments
E Kariotakis, G Tsagkatakis… - … on Parsimony and …, 2024 - proceedings.mlr.press
Optimizing for reduced computational and bandwidth resources enables model training in
less-than-ideal environments and paves the way for practical and accessible AI solutions …
MAST: Model-Agnostic Sparsified Training
We introduce a novel optimization problem formulation that departs from the conventional
way of minimizing machine learning model loss as a black-box function. Unlike traditional …