Advances and open problems in federated learning
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …
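The orchestration pattern this survey describes is most commonly instantiated as federated averaging (FedAvg): each client runs a few local SGD steps on its own data and the server averages the resulting models, weighted by local dataset size. Below is a minimal sketch on a toy linear-regression task; the function name `fedavg_round` and all hyperparameters are illustrative choices, not from the paper.

```python
import numpy as np

def fedavg_round(global_w, client_data, lr=0.1, local_steps=5):
    """One illustrative round of FedAvg-style training: each client
    runs local SGD from the current global model, and the server
    averages the client models weighted by local dataset size."""
    updates, sizes = [], []
    for X, y in client_data:
        w = global_w.copy()
        for _ in range(local_steps):
            grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
            w -= lr * grad
        updates.append(w)
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, float))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = [(X, X @ true_w) for X in (rng.normal(size=(50, 2)) for _ in range(4))]
w = np.zeros(2)
for _ in range(30):
    w = fedavg_round(w, clients)   # converges toward true_w
```

The raw data never leaves the clients; only model parameters are exchanged, which is the defining property of the FL setting.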
Breaking the communication-privacy-accuracy trilemma
Two major challenges in distributed learning and estimation are 1) preserving the privacy of
the local samples; and 2) communicating them efficiently to a central server, while achieving …
Personalized federated learning by structured and unstructured pruning under data heterogeneity
The traditional approach in FL tries to learn a single global model collaboratively with the
help of many clients under the orchestration of a central server. However, learning a single …
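The pruning-based personalization this abstract refers to rests on each client keeping only a sub-network of the global model. A minimal sketch of unstructured magnitude pruning, the simplest building block of such schemes (the function name and sparsity level are illustrative, not the paper's algorithm):

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Unstructured magnitude pruning: zero out the smallest-magnitude
    fraction `sparsity` of the weights and return the binary mask a
    client could keep as its personalized sub-network."""
    k = int(sparsity * w.size)
    thresh = np.sort(np.abs(w).ravel())[k] if k > 0 else -np.inf
    mask = (np.abs(w) >= thresh).astype(w.dtype)
    return w * mask, mask

w = np.array([0.9, -0.05, 0.4, 0.01, -0.7, 0.2])
pruned, mask = magnitude_prune(w, sparsity=0.5)
# half the weights are zeroed; the mask defines the client's sub-network
```

Structured pruning works analogously but removes whole channels or neurons at a time, which maps better onto hardware speedups.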
Privacy amplification via compression: Achieving the optimal privacy-accuracy-communication trade-off in distributed mean estimation
Privacy and communication constraints are two major bottlenecks in federated learning (FL)
and analytics (FA). We study the optimal accuracy of mean and frequency estimation …
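Distributed mean estimation under joint privacy and communication constraints, as studied here, can be illustrated with a deliberately simple client-side pipeline: add noise for privacy, then apply unbiased 1-bit stochastic quantization so each coordinate costs a single bit. This sketch is a generic baseline, not the paper's optimal mechanism; the clipping bound `c` and noise scale `sigma` are assumed parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def privatize_and_quantize(x, c, sigma):
    """Client-side step: Gaussian noise for privacy, then unbiased
    stochastic 1-bit quantization of each coordinate to {-c, +c}."""
    z = np.clip(x + rng.normal(0, sigma, x.shape), -c, c)
    p = (z + c) / (2 * c)              # P[quantize up] makes it unbiased
    return np.where(rng.random(x.shape) < p, c, -c)

n, d = 200, 10
xs = rng.uniform(-0.5, 0.5, (n, d))    # one vector per client
msgs = [privatize_and_quantize(x, c=2.0, sigma=0.3) for x in xs]
est = np.mean(msgs, axis=0)            # server-side mean estimate
err = np.linalg.norm(est - xs.mean(axis=0))
```

Because each message is an unbiased estimate of the client's vector, the server's average concentrates around the true mean as the number of clients grows; the paper's contribution is characterizing how tight this accuracy can be made jointly with the privacy and bit budgets.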
Communication-efficient federated learning with binary neural networks
Federated learning (FL) is a privacy-preserving machine learning setting that enables many
devices to jointly train a shared global model without the need to reveal their data to a …
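The communication savings of binary networks come from transmitting one bit per weight instead of a 32-bit float. A minimal sketch of sign-based 1-bit compression with a single per-tensor scale (a generic scheme in the spirit of binary-weight / signSGD methods, not necessarily the exact encoding used in this paper):

```python
import numpy as np

def binarize(w):
    """1-bit compression: transmit only the sign of each entry plus
    one float scale (the mean magnitude) for the whole tensor."""
    scale = np.mean(np.abs(w))
    return np.sign(w).astype(np.int8), scale

def debinarize(signs, scale):
    """Server-side reconstruction of the compressed tensor."""
    return signs.astype(float) * scale

w = np.array([0.5, -1.2, 0.1, -0.4])
signs, scale = binarize(w)
w_hat = debinarize(signs, scale)   # coarse but ~32x smaller to send
```

The reconstruction is lossy, so such schemes typically rely on error feedback or on training the network with binary weights from the start.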
Optimal compression of locally differentially private mechanisms
Compressing the output of $\epsilon$-locally differentially private (LDP) randomizers
naively leads to suboptimal utility. In this work, we demonstrate the benefits of using …
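The canonical $\epsilon$-LDP randomizer whose output one might want to compress is binary randomized response, shown below together with the server-side debiasing step. This is the textbook mechanism, included for context; the paper's contribution is compressing such randomizers without the utility loss of naive compression.

```python
import math, random

def randomized_response(bit, eps, rng=random):
    """epsilon-LDP randomizer for one private bit: report the true
    bit with probability e^eps / (e^eps + 1), otherwise flip it."""
    p = math.exp(eps) / (math.exp(eps) + 1)
    return bit if rng.random() < p else 1 - bit

def debias(reports, eps):
    """Unbiased server-side estimate of the true fraction of 1s."""
    p = math.exp(eps) / (math.exp(eps) + 1)
    mean = sum(reports) / len(reports)
    return (mean - (1 - p)) / (2 * p - 1)

rng = random.Random(0)
truth = [1] * 300 + [0] * 700          # true frequency of 1s is 0.3
reports = [randomized_response(b, eps=1.0, rng=rng) for b in truth]
est = debias(reports, eps=1.0)          # close to 0.3
```

Each report already fits in one bit here; for high-dimensional LDP mechanisms the output can be large, which is where non-naive compression matters.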
Model compression for communication efficient federated learning
Despite the many advantages of using deep neural networks over shallow networks in
various machine learning tasks, their effectiveness is compromised in a federated learning …
Lower bounds for learning distributions under communication constraints via Fisher information
We consider the problem of learning high-dimensional, nonparametric and structured (e.g.,
Gaussian) distributions in distributed networks, where each node in the network observes an …
Inference under information constraints I: Lower bounds from chi-square contraction
Multiple players are each given one independent sample, about which they can only provide
limited information to a central referee. Each player is allowed to describe its observed …
rTop-k: A Statistical Estimation Approach to Distributed SGD
The large communication cost for exchanging gradients between different nodes
significantly limits the scalability of distributed training for large-scale learning models …
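The baseline that rTop-k builds on is plain top-k gradient sparsification: send only the k largest-magnitude entries as (index, value) pairs instead of the dense gradient. A minimal sketch of that baseline (rTop-k itself first subsamples a random subset of coordinates before taking the top k, which is not shown here):

```python
import numpy as np

def top_k_sparsify(grad, k):
    """Keep only the k largest-magnitude gradient entries; a worker
    would transmit the (index, value) pairs rather than the dense
    vector, cutting communication from d floats to k pairs."""
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return idx, grad[idx], sparse

g = np.array([0.1, -3.0, 0.05, 2.0, -0.2])
idx, vals, sparse = top_k_sparsify(g, k=2)
# only the entries -3.0 and 2.0 survive
```

In practice the dropped coordinates are usually accumulated locally (error feedback) so that small but persistent gradient components are not lost forever.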