Advances and open problems in federated learning

P Kairouz, HB McMahan, B Avent… - … and trends® in …, 2021 - nowpublishers.com
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …
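
To make the setting the survey describes concrete, here is a minimal federated-averaging (FedAvg-style) sketch: a server orchestrates rounds in which sampled clients train locally and only model updates, never raw data, are sent back. This is the canonical FL loop in simulated form, not code from the survey; the linear model and client simulation are illustrative assumptions.

```python
import numpy as np

# Minimal federated-averaging sketch: a server orchestrates rounds in which
# sampled clients train locally; only model updates (not raw data) are
# returned. Illustrative only, not the survey's code.

def local_sgd(weights, data, targets, lr=0.1, epochs=1):
    """One client's local training on a linear model with squared loss."""
    w = weights.copy()
    for _ in range(epochs):
        grad = data.T @ (data @ w - targets) / len(targets)
        w -= lr * grad
    return w

def federated_averaging(client_datasets, dim, rounds=20, clients_per_round=3):
    rng = np.random.default_rng(0)
    global_w = np.zeros(dim)
    for _ in range(rounds):
        chosen = rng.choice(len(client_datasets), clients_per_round, replace=False)
        updates, sizes = [], []
        for i in chosen:
            X, y = client_datasets[i]
            updates.append(local_sgd(global_w, X, y))
            sizes.append(len(y))
        # Weighted average of client models by local dataset size.
        coeffs = np.array(sizes) / sum(sizes)
        global_w = sum(c * u for c, u in zip(coeffs, updates))
    return global_w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_w = rng.normal(size=5)
    clients = []
    for _ in range(10):
        X = rng.normal(size=(50, 5))
        clients.append((X, X @ true_w + 0.01 * rng.normal(size=50)))
    print(federated_averaging(clients, dim=5))
```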

Timely communication in federated learning

B Buyukates, S Ulukus - IEEE INFOCOM 2021-IEEE …, 2021 - ieeexplore.ieee.org
We consider a federated learning framework in which a parameter server (PS) trains a
global model by using n clients without actually storing the client data centrally at a cloud …
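
The "timely" aspect concerns how fresh client updates are when the parameter server aggregates them. The sketch below only illustrates age-of-update bookkeeping at the PS; the stalest-first selection rule is an assumption made for illustration and is not the scheduling policy analyzed in the paper.

```python
# Illustrative age bookkeeping at a parameter server (PS). A client's "age"
# counts rounds since its update was last incorporated into the global
# model. The stalest-first selection rule below is an assumption for
# illustration, not the policy studied in the cited paper.

class ParameterServer:
    def __init__(self, num_clients, clients_per_round):
        self.age = {c: 0 for c in range(num_clients)}
        self.k = clients_per_round

    def select_clients(self):
        # Pick the k clients whose updates are currently the stalest.
        return sorted(self.age, key=self.age.get, reverse=True)[:self.k]

    def finish_round(self, participants):
        for c in self.age:
            self.age[c] = 0 if c in participants else self.age[c] + 1

ps = ParameterServer(num_clients=10, clients_per_round=3)
for rnd in range(5):
    chosen = ps.select_clients()
    ps.finish_round(set(chosen))
    print(f"round {rnd}: chose {chosen}, max age now {max(ps.age.values())}")
```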

Communication compression techniques in distributed deep learning: A survey

Z Wang, M Wen, Y Xu, Y Zhou, JH Wang… - Journal of Systems …, 2023 - Elsevier
Training data and neural network models are becoming increasingly large, so training deep
learning models on a single machine takes unbearably long. To reduce …
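
Two representative compressors that surveys of this kind typically cover are top-k sparsification and stochastic uniform quantization. The sketch below is a self-contained illustration of both (not code from the survey); the 4-bit setting and 1% sparsity are arbitrary choices.

```python
import numpy as np

# Two representative gradient compressors: top-k sparsification (keep only
# the k largest-magnitude entries) and stochastic uniform quantization
# (randomized rounding to 2^b levels, unbiased in expectation).

def top_k(grad, k):
    out = np.zeros_like(grad)
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    out[idx] = grad[idx]
    return out  # in practice only k values + their indices are transmitted

def stochastic_quantize(grad, bits=4):
    levels = 2 ** bits - 1
    lo, hi = grad.min(), grad.max()
    scale = (hi - lo) / levels if hi > lo else 1.0
    normalized = (grad - lo) / scale
    # Randomized rounding keeps the quantizer unbiased in expectation.
    floor = np.floor(normalized)
    q = floor + (np.random.rand(*grad.shape) < (normalized - floor))
    return q * scale + lo

g = np.random.randn(1000)
print("top-1% nonzeros:", np.count_nonzero(top_k(g, 10)))
print("4-bit quantization max error:", np.abs(stochastic_quantize(g) - g).max())
```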

Sparse random networks for communication-efficient federated learning

B Isik, F Pase, D Gunduz, T Weissman… - arXiv preprint arXiv …, 2022 - arxiv.org
One main challenge in federated learning is the large communication cost of exchanging
weight updates from clients to the server at each round. While prior work has made great …
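
The idea here, roughly, is that network weights stay frozen at a random initialization that the server can reproduce from a shared seed, so a client only needs to upload a binary mask saying which connections to keep, about one bit per parameter per round. The sketch below illustrates that accounting with a toy score-threshold mask; the paper's actual training of probabilistic masks is different.

```python
import numpy as np

# Toy illustration of mask-only communication: weights stay frozen at a
# random initialization shared via a common seed, and a client uploads only
# a binary mask over those weights. The score-thresholding rule is a
# simplification for illustration, not the cited paper's training rule.

DIM = 10_000
SEED = 42  # server and clients share the seed, so weights are never sent

frozen_weights = np.random.default_rng(SEED).normal(size=DIM)

def client_mask(local_scores, keep_fraction=0.3):
    """Keep the connections with the highest local importance scores."""
    k = int(keep_fraction * local_scores.size)
    threshold = np.partition(local_scores, -k)[-k]
    return (local_scores >= threshold).astype(np.uint8)  # 1 bit per weight

# Server side: average the client masks and apply them to the frozen weights.
masks = [client_mask(np.random.rand(DIM)) for _ in range(5)]
avg_mask = np.mean(masks, axis=0)
effective_model = frozen_weights * (avg_mask >= 0.5)

print("bits uploaded per client per round:", DIM)      # vs. 32 * DIM for float32 weights
print("kept parameters in aggregated model:", int((avg_mask >= 0.5).sum()))
```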

Time-correlated sparsification for communication-efficient federated learning

E Ozfatura, K Ozfatura, D Gündüz - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
Federated learning (FL) enables multiple clients to collaboratively train a shared model, with
the help of a parameter server (PS), without disclosing their local datasets. However, due to …
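
The time-correlated sparsification idea can be illustrated as follows: because the set of important coordinates changes slowly across rounds, a client can reuse most of last round's sparsity pattern (whose indices the server already knows) and only signal a small number of fresh coordinates. The 90%/10% split below is an arbitrary choice for illustration, not the paper's parameterization.

```python
import numpy as np

# Illustration of reusing last round's sparsity pattern and transmitting
# index information only for a small fraction of fresh coordinates. The
# reuse/fresh split is arbitrary, not the cited paper's setting.

def tcs_mask(grad, prev_idx, k, fresh_fraction=0.1):
    n_fresh = int(fresh_fraction * k)
    reused = prev_idx[:k - n_fresh]                    # indices the server already knows
    remaining = np.setdiff1d(np.arange(grad.size), reused)
    order = np.argsort(-np.abs(grad[remaining]))
    fresh = remaining[order[:n_fresh]]                 # only these indices need to be signaled
    return np.concatenate([reused, fresh])

dim, k = 10_000, 500
rng = np.random.default_rng(0)
prev_idx = rng.choice(dim, size=k, replace=False)
grad = rng.normal(size=dim)
idx = tcs_mask(grad, prev_idx, k)
print("values sent:", k, "| indices not in last round's mask:",
      int(np.setdiff1d(idx, prev_idx).size))
```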

Collaborative Learning over Wireless Networks: An Introductory Overview

E Ozfatura, D Gündüz, HV Poor - Machine Learning and Wireless …, 2022 - cambridge.org
The number of devices connected to the Internet has already surpassed 1 billion. With the
increasing proliferation of mobile devices, the amount of data collected and transmitted over …

ResFed: Communication efficient federated learning with deep compressed residuals

R Song, L Zhou, L Lyu, A Festag… - IEEE Internet of Things …, 2023 - ieeexplore.ieee.org
Federated learning allows for cooperative training among distributed clients by sharing their
locally learned model parameters, such as weights or gradients. However, as model size …
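
A sketch of residual-based communication: instead of uploading full model weights, a client uploads a compressed residual, i.e. the difference between its newly trained local model and a reference model the server can reproduce (here simply the last global model). The top-k compressor is a stand-in; the cited paper's residual construction and codec are more elaborate.

```python
import numpy as np

# Residual-based upload: transmit a compressed difference between the new
# local model and a shared reference model. Top-k here is a stand-in for
# the cited paper's more elaborate residual codec.

def compress_residual(local_model, reference_model, keep_fraction=0.01):
    residual = local_model - reference_model
    k = max(1, int(keep_fraction * residual.size))
    idx = np.argpartition(np.abs(residual), -k)[-k:]
    return idx, residual[idx]                  # sparse payload actually transmitted

def decompress_residual(reference_model, idx, values):
    recovered = reference_model.copy()
    recovered[idx] += values
    return recovered

rng = np.random.default_rng(0)
global_model = rng.normal(size=100_000)
local_model = global_model + 0.01 * rng.normal(size=100_000)   # small local drift

idx, vals = compress_residual(local_model, global_model)
approx = decompress_residual(global_model, idx, vals)
print("payload floats:", vals.size, "of", local_model.size)
print("reconstruction error:", float(np.abs(approx - local_model).max()))
```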

Exact optimality of communication-privacy-utility tradeoffs in distributed mean estimation

B Isik, WN Chen, A Ozgur… - Advances in Neural …, 2024 - proceedings.neurips.cc
We study the mean estimation problem under communication and local differential privacy
constraints. While previous work has proposed order-optimal algorithms for the same …
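
As a deliberately simple instance of the problem: each client holds a scalar in [0, 1], stochastically rounds it to one bit, privatizes the bit with randomized response, and the server debiases and averages. This one-bit baseline only makes the communication and local-DP constraints concrete; it is not the exactly optimal scheme derived in the paper.

```python
import numpy as np

# One-bit, epsilon-LDP baseline for distributed mean estimation: stochastic
# rounding + randomized response + server-side debiasing. Illustrative
# baseline only; NOT the exactly optimal scheme from the cited paper.

def privatize(x, eps, rng):
    bit = rng.random() < x                     # stochastic rounding: E[bit] = x
    p_flip = 1.0 / (1.0 + np.exp(eps))         # flip probability gives eps-LDP
    return bit ^ (rng.random() < p_flip)

def estimate_mean(reports, eps):
    p_flip = 1.0 / (1.0 + np.exp(eps))
    # Debias using E[report] = p_flip + (1 - 2 * p_flip) * x.
    return (np.mean(reports) - p_flip) / (1.0 - 2.0 * p_flip)

rng = np.random.default_rng(0)
values = rng.uniform(0, 1, size=100_000)
reports = [privatize(x, eps=1.0, rng=rng) for x in values]
print("true mean:", values.mean(), "estimate:", estimate_mean(reports, eps=1.0))
```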

FedLTN: Federated learning for sparse and personalized lottery ticket networks

V Mugunthan, E Lin, V Gokul, C Lau, L Kagal… - … on Computer Vision, 2022 - Springer
Federated learning (FL) enables clients to collaboratively train a model, while keeping their
local training data decentralized. However, high communication costs, data heterogeneity …
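
The lottery-ticket angle means each client looks for a sparse subnetwork so that only the surviving weights (plus a mask) need to be exchanged. The sketch below shows generic magnitude pruning, the shared primitive; everything FedLTN adds on top (its pruning schedule, personalization, aggregation rules) is not shown.

```python
import numpy as np

# Generic magnitude pruning, the basic ingredient behind lottery-ticket
# style sparsification: zero out the smallest-magnitude weights and keep a
# binary mask identifying the surviving subnetwork. Only this shared
# primitive is shown, not FedLTN's specific procedure.

def magnitude_prune(weights, sparsity=0.8):
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)              # number of weights to remove
    threshold = np.partition(flat, k - 1)[k - 1] if k > 0 else -np.inf
    mask = (np.abs(weights) > threshold).astype(weights.dtype)
    return weights * mask, mask

rng = np.random.default_rng(0)
layer = rng.normal(size=(256, 256))
pruned, mask = magnitude_prune(layer, sparsity=0.8)

# Only the surviving ~20% of weights (plus the mask) need to be exchanged.
print("density after pruning:", float(mask.mean()))
```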

EF-BV: A unified theory of error feedback and variance reduction mechanisms for biased and unbiased compression in distributed optimization

L Condat, K Yi, P Richtárik - Advances in Neural …, 2022 - proceedings.neurips.cc
In distributed or federated optimization and learning, communication between the different
computing units is often the bottleneck and gradient compression is widely used to reduce …
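
The mechanism that EF-BV unifies is easiest to see from the classic error-feedback loop: each worker compresses its gradient plus an accumulated error term and stores whatever the compressor threw away, so compression errors are corrected in later rounds instead of being lost. The sketch below is that classic loop with a biased top-k compressor on a toy quadratic, not the EF-BV algorithm itself.

```python
import numpy as np

# Classic error-feedback loop with a biased top-k compressor: compress
# (gradient + accumulated error), transmit only the compressed part, and
# carry the discarded part forward. EF-BV generalizes mechanisms of this
# kind; this is the classic variant, not the EF-BV algorithm itself.

def top_k(v, k):
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def ef_sgd(grad_fn, x0, num_workers=4, k=5, lr=0.1, steps=200):
    x = x0.copy()
    errors = [np.zeros_like(x0) for _ in range(num_workers)]
    for _ in range(steps):
        compressed = []
        for i in range(num_workers):
            g = grad_fn(x, worker=i)
            c = top_k(g + errors[i], k)          # only k coordinates are transmitted
            errors[i] = g + errors[i] - c        # keep what the compressor dropped
            compressed.append(c)
        x -= lr * np.mean(compressed, axis=0)
    return x

# Toy quadratic split across workers: f_i(x) = 0.5 * ||x - t_i||^2.
rng = np.random.default_rng(0)
targets = rng.normal(size=(4, 50))
grad = lambda x, worker: x - targets[worker]
x_final = ef_sgd(grad, x0=np.zeros(50))
print("distance to optimum:", float(np.linalg.norm(x_final - targets.mean(axis=0))))
```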