Distributed learning in wireless networks: Recent progress and future challenges

M Chen, D Gündüz, K Huang, W Saad… - IEEE Journal on …, 2021 - ieeexplore.ieee.org
The next-generation of wireless networks will enable many machine learning (ML) tools and
applications to efficiently analyze various types of data collected by edge devices for …

Communication-efficient distributed deep learning: A comprehensive survey

Z Tang, S Shi, W Wang, B Li, X Chu - arXiv preprint arXiv:2003.06307, 2020 - arxiv.org
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …
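As background for the surveyed methods: the baseline they aim to make communication-efficient is synchronous data-parallel SGD, in which each device computes a gradient on its shard of the data and the devices average gradients before every update. A minimal sketch, assuming an illustrative least-squares objective (not taken from the survey):

    import numpy as np

    def local_grad(w, X, y):
        # Gradient of the least-squares loss on one worker's data shard.
        return 2 * X.T @ (X @ w - y) / len(y)

    def data_parallel_step(w, shards, lr=0.1):
        # Each worker computes a gradient on its own shard; the gradients
        # are averaged (the all-reduce) and the same update is applied everywhere.
        grads = [local_grad(w, X, y) for X, y in shards]
        return w - lr * np.mean(grads, axis=0)

    rng = np.random.default_rng(0)
    shards = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(4)]
    w = np.zeros(5)
    for _ in range(100):
        w = data_parallel_step(w, shards)

The communication-efficient methods such surveys cover compress, quantize, sparsify, or reschedule exactly this gradient exchange.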

Advances and open problems in federated learning

P Kairouz, HB McMahan, B Avent… - Foundations and Trends® in …, 2021 - nowpublishers.com
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …
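The orchestration pattern described here is commonly instantiated as FedAvg-style rounds: the server broadcasts the current model, a sample of clients runs a few local SGD steps on their private data, and the server averages the returned models. A minimal sketch with synthetic least-squares clients (the objective, client count, and hyperparameters are illustrative):

    import numpy as np

    def client_update(w, X, y, lr=0.05, local_steps=5):
        # Local training on one client's private data; the data never leaves.
        w = w.copy()
        for _ in range(local_steps):
            w -= lr * 2 * X.T @ (X @ w - y) / len(y)
        return w

    def fedavg_round(w, clients, sample_size=3, rng=None):
        rng = rng or np.random.default_rng()
        chosen = rng.choice(len(clients), size=sample_size, replace=False)
        # Server averages the locally updated models from the sampled clients.
        return np.mean([client_update(w, *clients[i]) for i in chosen], axis=0)

    rng = np.random.default_rng(1)
    clients = [(rng.normal(size=(30, 4)), rng.normal(size=30)) for _ in range(10)]
    w = np.zeros(4)
    for _ in range(50):
        w = fedavg_round(w, clients, rng=rng)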

FjORD: Fair and accurate federated learning under heterogeneous targets with ordered dropout

S Horváth, S Laskaridis, M Almeida… - Advances in …, 2021 - proceedings.neurips.cc
Federated Learning (FL) has been gaining significant traction across different ML tasks,
ranging from vision to keyboard predictions. In large-scale deployments, client heterogeneity …
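The ordered dropout of the title can be pictured as nested width sub-models: a client with capacity budget p trains only the first ceil(p * k) units of each layer, so every smaller sub-model is an exact prefix of the larger ones. A rough sketch of extracting such a prefix from a two-layer linear model (the shapes and the helper name prefix_submodel are illustrative, not FjORD's code):

    import math
    import numpy as np

    def prefix_submodel(W1, W2, p):
        # Ordered-dropout-style slicing: keep the first ceil(p * k) hidden
        # units, so each smaller sub-model nests inside the larger ones.
        k = W1.shape[0]
        keep = math.ceil(p * k)
        return W1[:keep, :], W2[:, :keep]

    rng = np.random.default_rng(2)
    W1 = rng.normal(size=(64, 10))   # hidden x input
    W2 = rng.normal(size=(1, 64))    # output x hidden
    W1_small, W2_small = prefix_submodel(W1, W2, p=0.25)  # 16 hidden units

Updates from clients running different p then aggregate coordinate-wise on the prefixes they share.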

Hermes: an efficient federated learning framework for heterogeneous mobile clients

A Li, J Sun, P Li, Y Pu, H Li, Y Chen - Proceedings of the 27th Annual …, 2021 - dl.acm.org
Federated learning (FL) has been a popular method to achieve distributed machine learning
among numerous devices without sharing their data to a cloud server. FL aims to learn a …

FedMask: Joint computation and communication-efficient personalized federated learning via heterogeneous masking

A Li, J Sun, X Zeng, M Zhang, H Li, Y Chen - Proceedings of the 19th …, 2021 - dl.acm.org
Recent advancements in deep neural networks (DNN) enabled various mobile deep
learning applications. However, it is technically challenging to locally train a DNN model due …
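The heterogeneous masking of the title can be sketched as each client learning a personalized binary mask over frozen shared weights, so only a sparse, one-bit-per-weight mask needs to be trained and communicated. A rough illustration with thresholded real-valued scores (a generic mask-training pattern, not FedMask's exact procedure):

    import numpy as np

    def masked_weights(w_frozen, scores, threshold=0.0):
        # Each client derives a binary mask from its own trainable scores;
        # the shared weights themselves are never updated locally.
        mask = (scores > threshold).astype(w_frozen.dtype)
        return w_frozen * mask

    rng = np.random.default_rng(7)
    w_frozen = rng.normal(size=(16, 8))   # shared across all clients
    scores = rng.normal(size=(16, 8))     # client-specific, trained locally
    w_client = masked_weights(w_frozen, scores)
    # Only the binary mask (one bit per weight) would be sent to the server.

Training the scores through the hard threshold typically relies on a straight-through gradient estimator.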

EF21: A new, simpler, theoretically better, and practically faster error feedback

P Richtárik, I Sokolov… - Advances in Neural …, 2021 - proceedings.neurips.cc
Error feedback (EF), also known as error compensation, is an immensely popular
convergence stabilization mechanism in the context of distributed training of supervised …
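For reference, the EF21 update keeps a per-node gradient estimate g_i and compresses only the innovation grad_i - g_i, while the server steps with the average of the estimates. A sketch of that update as I read it, using top-k as the contractive compressor and an illustrative quadratic objective:

    import numpy as np

    def top_k(v, k):
        # Contractive compressor: keep the k largest-magnitude coordinates.
        out = np.zeros_like(v)
        idx = np.argsort(np.abs(v))[-k:]
        out[idx] = v[idx]
        return out

    def ef21_step(x, g_list, grads, lr=0.05, k=2):
        # Each node compresses only the innovation (grad_i - g_i) and updates
        # its running estimate; the server steps with the average estimate.
        g_list = [g + top_k(grad - g, k) for g, grad in zip(g_list, grads)]
        return x - lr * np.mean(g_list, axis=0), g_list

    d, n = 10, 4
    rng = np.random.default_rng(3)
    A = [rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(n)]
    x = rng.normal(size=d)
    g_list = [np.zeros(d) for _ in range(n)]
    for _ in range(300):
        grads = [Ai.T @ (Ai @ x) for Ai in A]  # gradients of 0.5 * ||A_i x||^2
        x, g_list = ef21_step(x, g_list, grads)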

Optimal client sampling for federated learning

W Chen, S Horváth, P Richtárik - arXiv preprint arXiv:2010.13723, 2020 - arxiv.org
It is well understood that client-master communication can be a primary bottleneck in
Federated Learning. In this work, we address this issue with a novel client subsampling …
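The subsampling idea can be illustrated with importance sampling: each client transmits its update with an inclusion probability tied to the update's magnitude, and transmitted updates are reweighted by the inverse probability so the aggregate stays unbiased. A generic sketch (the proportional-to-norm rule here illustrates the idea and is not necessarily the paper's exact scheme):

    import numpy as np

    def sample_and_aggregate(updates, budget, rng):
        # Inclusion probabilities proportional to update norms, capped at 1;
        # `budget` is roughly the expected number of transmitting clients.
        norms = np.array([np.linalg.norm(u) for u in updates])
        p = np.minimum(1.0, budget * norms / norms.sum())
        agg = np.zeros_like(updates[0])
        for u, p_i in zip(updates, p):
            if rng.random() < p_i:      # only these clients transmit
                agg += u / p_i          # inverse-probability reweighting
        return agg / len(updates)       # unbiased estimate of the mean update

    rng = np.random.default_rng(4)
    updates = [rng.normal(size=8) * s for s in (0.1, 0.1, 1.0, 5.0)]
    estimate = sample_and_aggregate(updates, budget=2, rng=rng)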

UVeQFed: Universal vector quantization for federated learning

N Shlezinger, M Chen, YC Eldar… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Traditional deep learning models are trained at a centralized server using data samples
collected from users. Such data samples often include private information, which the users …
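The title's mechanism is quantizing model updates before they are sent to the server. A common building block is subtractive dithered quantization, where dither known to both encoder and decoder makes the quantization error behave like input-independent noise; the scalar sketch below is only a stand-in for UVeQFed's universal vector (lattice) quantizer:

    import numpy as np

    def dithered_quantize(v, step, rng):
        # Encoder: add dither known to both sides, round to the step grid.
        dither = rng.uniform(-step / 2, step / 2, size=v.shape)
        q = np.round((v + dither) / step)    # integers actually transmitted
        return q, dither

    def dithered_dequantize(q, dither, step):
        # Decoder: subtract the same dither (subtractive dithering), which
        # makes the error uniform and independent of the input signal.
        return q * step - dither

    rng = np.random.default_rng(5)   # in practice regenerated from a shared seed
    update = np.random.default_rng(6).normal(size=1000)
    q, dither = dithered_quantize(update, step=0.5, rng=rng)
    recovered = dithered_dequantize(q, dither, step=0.5)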

A guide through the zoo of biased SGD

Y Demidovich, G Malinovsky… - Advances in Neural …, 2023 - proceedings.neurips.cc
Stochastic Gradient Descent (SGD) is arguably the most important single algorithm
in modern machine learning. Although SGD with unbiased gradient estimators has been …
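A concrete biased estimator of the kind studied here is the top-k-compressed gradient: keeping only the largest coordinates is not unbiased, yet SGD driven by it can still converge under suitable assumptions. A toy sketch on a quadratic (the setup is illustrative):

    import numpy as np

    def top_k(v, k):
        # Biased estimator: E[top_k(g)] != g in general, since it always
        # discards the d - k smallest-magnitude coordinates.
        out = np.zeros_like(v)
        idx = np.argsort(np.abs(v))[-k:]
        out[idx] = v[idx]
        return out

    def biased_sgd(grad_fn, x0, lr=0.1, k=2, steps=500):
        x = x0.copy()
        for _ in range(steps):
            x -= lr * top_k(grad_fn(x), k)
        return x

    A = np.diag(np.arange(1.0, 9.0))
    x_min = biased_sgd(lambda x: A @ x, x0=np.ones(8))  # minimizes 0.5 * x^T A x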