Vertical federated learning: Concepts, advances, and challenges
Vertical Federated Learning (VFL) is a federated learning setting where multiple parties with
different features about the same set of users jointly train machine learning models without …
Demystifying parallel and distributed deep learning: An in-depth concurrency analysis
Deep Neural Networks (DNNs) are becoming an important tool in modern computing
applications. Accelerating their training is a major challenge and techniques range from …
Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
Flower: A friendly federated learning research framework
Federated Learning (FL) has emerged as a promising technique for edge devices to
collaboratively learn a shared prediction model, while keeping their training data on the …
Training neural networks with fixed sparse masks
During typical gradient-based training of deep neural networks, all of the model's
parameters are updated at each iteration. Recent work has shown that it is possible to …
Sparsified SGD with memory
SU Stich, JB Cordonnier… - Advances in neural …, 2018 - proceedings.neurips.cc
Huge scale machine learning problems are nowadays tackled by distributed optimization
algorithms, i.e., algorithms that leverage the compute power of many devices for training. The …
Model pruning enables efficient federated learning on edge devices
Federated learning (FL) allows model training from local data collected by edge/mobile
devices while preserving data privacy, which has wide applicability to image and vision …
A survey of federated learning for edge computing: Research problems and solutions
Federated Learning is a machine learning scheme in which a shared prediction model can
be collaboratively learned by a number of distributed nodes using their locally stored data. It …
Local SGD converges fast and communicates little
SU Stich - arXiv preprint arXiv:1805.09767, 2018 - arxiv.org
Mini-batch stochastic gradient descent (SGD) is state of the art in large scale distributed
training. The scheme can reach a linear speedup with respect to the number of workers, but …
Deep gradient compression: Reducing the communication bandwidth for distributed training
Large-scale distributed training requires significant communication bandwidth for gradient
exchange that limits the scalability of multi-node training, and requires expensive high …