Vertical federated learning: Concepts, advances, and challenges

Y Liu, Y Kang, T Zou, Y Pu, Y He, X Ye… - … on Knowledge and …, 2024 - ieeexplore.ieee.org
Vertical Federated Learning (VFL) is a federated learning setting where multiple parties with
different features about the same set of users jointly train machine learning models without …

Demystifying parallel and distributed deep learning: An in-depth concurrency analysis

T Ben-Nun, T Hoefler - ACM Computing Surveys (CSUR), 2019 - dl.acm.org
Deep Neural Networks (DNNs) are becoming an important tool in modern computing
applications. Accelerating their training is a major challenge and techniques range from …

Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks

T Hoefler, D Alistarh, T Ben-Nun, N Dryden… - Journal of Machine …, 2021 - jmlr.org
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …

Flower: A friendly federated learning research framework

DJ Beutel, T Topal, A Mathur, X Qiu… - arXiv preprint arXiv …, 2020 - arxiv.org
Federated Learning (FL) has emerged as a promising technique for edge devices to
collaboratively learn a shared prediction model, while keeping their training data on the …

Training neural networks with fixed sparse masks

YL Sung, V Nair, CA Raffel - Advances in Neural …, 2021 - proceedings.neurips.cc
During typical gradient-based training of deep neural networks, all of the model's
parameters are updated at each iteration. Recent work has shown that it is possible to …

Sparsified SGD with memory

SU Stich, JB Cordonnier… - Advances in neural …, 2018 - proceedings.neurips.cc
Huge-scale machine learning problems are nowadays tackled by distributed optimization
algorithms, i.e., algorithms that leverage the compute power of many devices for training. The …

Model pruning enables efficient federated learning on edge devices

Y Jiang, S Wang, V Valls, BJ Ko… - … on Neural Networks …, 2022 - ieeexplore.ieee.org
Federated learning (FL) allows model training from local data collected by edge/mobile
devices while preserving data privacy, which has wide applicability to image and vision …

A survey of federated learning for edge computing: Research problems and solutions

Q Xia, W Ye, Z Tao, J Wu, Q Li - High-Confidence Computing, 2021 - Elsevier
Federated Learning is a machine learning scheme in which a shared prediction model can
be collaboratively learned by a number of distributed nodes using their locally stored data. It …

Local SGD converges fast and communicates little

SU Stich - arXiv preprint arXiv:1805.09767, 2018 - arxiv.org
Mini-batch stochastic gradient descent (SGD) is the state of the art in large-scale distributed
training. The scheme can reach a linear speedup with respect to the number of workers, but …

Deep gradient compression: Reducing the communication bandwidth for distributed training

Y Lin, S Han, H Mao, Y Wang, WJ Dally - arXiv preprint arXiv:1712.01887, 2017 - arxiv.org
Large-scale distributed training requires significant communication bandwidth for gradient
exchange that limits the scalability of multi-node training, and requires expensive high …