Advances and open problems in federated learning

P Kairouz, HB McMahan, B Avent… - … and trends® in …, 2021 - nowpublishers.com
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …
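As a concrete picture of the client-server orchestration the survey describes, here is a minimal federated averaging (FedAvg) loop in NumPy; the quadratic toy objective, client count, and step sizes are illustrative assumptions, not code from the survey:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each client holds a private least-squares problem (A_i, b_i);
# the goal is one global parameter vector w minimizing the average loss.
clients = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(10)]

def local_update(w, A, b, lr=0.01, steps=5):
    """A few steps of local gradient descent on one client's data."""
    for _ in range(steps):
        grad = A.T @ (A @ w - b) / len(b)
        w = w - lr * grad
    return w

w = np.zeros(5)
for _ in range(50):
    # Server broadcasts w; a sampled subset of clients trains locally
    # and returns model parameters, never raw data.
    sampled = rng.choice(len(clients), size=5, replace=False)
    updates = [local_update(w, *clients[i]) for i in sampled]
    # Server averages the returned models (the FedAvg step).
    w = np.mean(updates, axis=0)

print("final parameters:", w)
```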

Breaking the communication-privacy-accuracy trilemma

WN Chen, P Kairouz, A Ozgur - Advances in Neural …, 2020 - proceedings.neurips.cc
Two major challenges in distributed learning and estimation are (1) preserving the privacy of
the local samples and (2) communicating them efficiently to a central server, while achieving …
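The tension the abstract names can be made concrete with the classic one-bit approach to private mean estimation: stochastic rounding for compression, then binary randomized response for local differential privacy. This is a baseline in the same design space, not the scheme proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
eps = 1.0                      # local DP parameter
p = 1.0 / (1.0 + np.exp(eps))  # flip probability; keep prob e^eps/(1+e^eps) makes the bit eps-LDP

def privatize(x):
    """One private bit per client: stochastic rounding + randomized response."""
    b = rng.random() < x        # unbiased 1-bit quantization of x in [0, 1]
    flip = rng.random() < p     # eps-LDP bit flip
    return b ^ flip

xs = rng.random(100_000)        # each client's private scalar in [0, 1]
bits = np.array([privatize(x) for x in xs])

# Server-side debiasing: E[bit] = p + (1 - 2p) * mean(x).
est = (bits.mean() - p) / (1.0 - 2.0 * p)
print(f"true mean {xs.mean():.4f}  private 1-bit estimate {est:.4f}")
```

Each client sends exactly one bit and enjoys eps-LDP; the accuracy loss relative to sending raw samples is the third corner of the trilemma.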

Personalized federated learning by structured and unstructured pruning under data heterogeneity

S Vahidian, M Morafah, B Lin - 2021 IEEE 41st international …, 2021 - ieeexplore.ieee.org
The traditional approach in FL tries to learn a single global model collaboratively with the
help of many clients under the orchestration of a central server. However, learning a single …
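A minimal sketch of the unstructured-pruning building block such personalization methods rely on, assuming simple magnitude-based masking; the keep ratios and toy weights are illustrative, and this is not the paper's full algorithm:

```python
import numpy as np

def magnitude_mask(w, keep_ratio):
    """Binary mask keeping the largest-magnitude fraction of parameters."""
    k = max(1, int(keep_ratio * w.size))
    thresh = np.partition(np.abs(w), -k)[-k]   # k-th largest magnitude
    return np.abs(w) >= thresh

rng = np.random.default_rng(2)
w_global = rng.normal(size=100)

# Each client derives its own sparse personalized sub-network from the
# global weights; only the masked entries are trained and communicated.
for client_id, keep in enumerate([0.5, 0.3, 0.1]):
    mask = magnitude_mask(w_global, keep)
    w_personal = w_global * mask
    print(client_id, "kept", int(mask.sum()), "of", w_global.size, "weights")
```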

Privacy amplification via compression: Achieving the optimal privacy-accuracy-communication trade-off in distributed mean estimation

WN Chen, D Song, A Ozgur… - Advances in Neural …, 2024 - proceedings.neurips.cc
Privacy and communication constraints are two major bottlenecks in federated learning (FL)
and analytics (FA). We study the optimal accuracy of mean and frequency estimation …
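For orientation, here is the naive separate-privatization-then-quantization baseline for distributed mean estimation that work on joint privacy-communication trade-offs improves upon; the noise scale, clipping range, and bit width are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
d, n = 32, 1000
X = rng.uniform(-1, 1, size=(n, d))   # each row: one client's bounded vector

sigma = 2.0   # Gaussian-mechanism noise scale (privacy knob, assumed)
levels = 16   # 4-bit uniform quantizer per coordinate

def privatize_and_compress(x):
    noisy = x + rng.normal(scale=sigma, size=x.shape)   # local Gaussian noise
    clipped = np.clip(noisy, -8, 8)                     # bound range for the quantizer
    q = np.round((clipped + 8) / 16 * (levels - 1))     # quantize to 4 bits/coordinate
    return q / (levels - 1) * 16 - 8                    # dequantized message

msgs = np.array([privatize_and_compress(x) for x in X])
err = np.linalg.norm(msgs.mean(0) - X.mean(0)) ** 2
print(f"squared error of private, compressed mean: {err:.4f}")
```

Treating privacy and compression separately like this pays twice; the point of privacy amplification via compression is that the randomness in the compressor can itself contribute to the privacy guarantee.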

Communication-efficient federated learning with binary neural networks

Y Yang, Z Zhang, Q Yang - IEEE Journal on Selected Areas in …, 2021 - ieeexplore.ieee.org
Federated learning (FL) is a privacy-preserving machine learning setting that enables many
devices to jointly train a shared global model without the need to reveal their data to a …
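One common way to see the communication saving of binarization is a sign-plus-scale compressor for model updates (in the spirit of signSGD); this is a hedged stand-in illustrating the one-bit-per-weight regime, not the paper's binary-neural-network training procedure:

```python
import numpy as np

rng = np.random.default_rng(4)

def compress(update):
    """1-bit-per-weight message: signs plus one shared scale factor."""
    scale = np.mean(np.abs(update))
    return np.sign(update), scale

def decompress(signs, scale):
    return signs * scale

# Clients send sign(update) and one scalar; the server averages reconstructions.
updates = [rng.normal(size=1000) for _ in range(8)]
recon = np.mean([decompress(*compress(u)) for u in updates], axis=0)
true = np.mean(updates, axis=0)
print("cosine similarity:",
      true @ recon / (np.linalg.norm(true) * np.linalg.norm(recon)))
```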

Optimal compression of locally differentially private mechanisms

A Shah, WN Chen, J Balle… - International …, 2022 - proceedings.mlr.press
Compressing the output of ε-locally differentially private (LDP) randomizers
naively leads to suboptimal utility. In this work, we demonstrate the benefits of using …
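As a reference point, here is a plain k-ary randomized response mechanism, the kind of ε-LDP randomizer whose output (naively about log2 k bits) such compression schemes target; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def k_rr(x, k, eps):
    """k-ary randomized response: an eps-LDP randomizer over {0, ..., k-1}."""
    p_true = np.exp(eps) / (np.exp(eps) + k - 1)
    if rng.random() < p_true:
        return x
    other = rng.integers(k - 1)          # report one of the k-1 other symbols
    return other if other < x else other + 1

k, eps = 10, 2.0
data = rng.integers(k, size=50_000)
reports = np.array([k_rr(x, k, eps) for x in data])

# Debias the empirical frequencies: E[freq] = q + (p - q) * true_freq.
p = np.exp(eps) / (np.exp(eps) + k - 1)
q = (1 - p) / (k - 1)
freq = np.bincount(reports, minlength=k) / len(reports)
est = (freq - q) / (p - q)
print("estimated freq of symbol 0:", est[0], " true:", (data == 0).mean())
```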

Model compression for communication efficient federated learning

SM Shah, VKN Lau - IEEE Transactions on Neural Networks …, 2021 - ieeexplore.ieee.org
Despite the many advantages of using deep neural networks over shallow networks in
various machine learning tasks, their effectiveness is compromised in a federated learning …

Lower bounds for learning distributions under communication constraints via Fisher information

LP Barnes, Y Han, A Ozgur - Journal of Machine Learning Research, 2020 - jmlr.org
We consider the problem of learning high-dimensional, nonparametric and structured (e.g.,
Gaussian) distributions in distributed networks, where each node in the network observes an …
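The usual route from Fisher information to a minimax lower bound goes through the multivariate van Trees inequality. A sketch of that step, writing $Y_i$ for node $i$'s constrained message and $\pi$ for a prior with Fisher information $J(\pi)$ (precise constants and regularity conditions are in the paper, not reproduced here):

```latex
\[
  \mathbb{E}\,\lVert \hat{\theta} - \theta \rVert_2^2
  \;\ge\;
  \frac{d^{2}}{\sum_{i=1}^{n} \mathbb{E}_{\pi}\!\left[\operatorname{tr} I_{Y_i}(\theta)\right] + J(\pi)}
\]
```

The communication constraint then enters as an upper bound on $\operatorname{tr} I_{Y_i}(\theta)$ that scales with the message length rather than with the Fisher information of the raw sample.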

Inference under information constraints I: Lower bounds from chi-square contraction

J Acharya, CL Canonne, H Tyagi - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Multiple players are each given one independent sample, about which they can only provide
limited information to a central referee. Each player is allowed to describe its observed …

rTop-k: A Statistical Estimation Approach to Distributed SGD

LP Barnes, HA Inan, B Isik… - IEEE Journal on Selected …, 2020 - ieeexplore.ieee.org
The large communication cost for exchanging gradients between different nodes
significantly limits the scalability of distributed training for large-scale learning models …
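A sparsifier in the spirit of the title, selecting the top-k magnitude coordinates and then transmitting a uniformly random r of them; the k/r rescaling here is an assumption made for unbiasedness over the selected set, not necessarily the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(6)

def rtopk(grad, k, r):
    """Keep the top-k magnitude coordinates, then send a random r of them."""
    top = np.argpartition(np.abs(grad), -k)[-k:]    # indices of top-k entries
    sent = rng.choice(top, size=r, replace=False)   # random r of the top k
    sparse = np.zeros_like(grad)
    # Rescale by k/r so the top-k portion is estimated without bias
    # (an assumed design choice for this sketch).
    sparse[sent] = grad[sent] * (k / r)
    return sparse

g = rng.normal(size=10_000)
g_hat = rtopk(g, k=500, r=100)
print("nonzeros sent:", np.count_nonzero(g_hat))
```

Sending only r index-value pairs, rather than all k, is what cuts the per-round communication.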