Advances and open problems in federated learning
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …
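The setup described above can be sketched as a minimal FedAvg-style round. This is a hedged illustration with a toy linear model, not code from the survey; `local_update`, `fedavg_round`, and all data are assumptions made for the example:

```python
import numpy as np

def local_update(w, X, y, lr=0.1, steps=10):
    """One client's local gradient steps on a least-squares objective."""
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(w_global, client_data):
    """Server broadcasts w_global; clients train locally; the server
    averages the returned models, weighted by local sample counts."""
    updates, sizes = [], []
    for X, y in client_data:
        updates.append(local_update(w_global.copy(), X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Toy setup: 5 clients, noiseless linear data with a shared ground truth.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
clients = [(X, X @ w_true) for X in (rng.normal(size=(20, 2)) for _ in range(5))]
w = np.zeros(2)
for _ in range(30):
    w = fedavg_round(w, clients)
# w approaches w_true as rounds proceed; client data never leaves the clients.
```

Only model parameters cross the network here, which is the defining property of the FL setting.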
Timely communication in federated learning
B Buyukates, S Ulukus - IEEE INFOCOM 2021-IEEE …, 2021 - ieeexplore.ieee.org
We consider a federated learning framework in which a parameter server (PS) trains a
global model by using n clients without actually storing the client data centrally at a cloud …
Communication compression techniques in distributed deep learning: A survey
Nowadays, the training data and neural network models are getting increasingly large. The
training time of deep learning will become unbearably long on a single machine. To reduce …
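One family of techniques such surveys cover is quantization. The sketch below shows generic uniform quantization of a gradient vector to 8-bit codes (a quarter of float32 traffic); it is an illustrative example, not the survey's own code:

```python
import numpy as np

def quantize_uniform(g, bits=8):
    """Uniformly quantize gradient entries to 2**bits levels over [min, max].
    Only the uint codes plus two floats (offset, scale) are transmitted."""
    lo, hi = g.min(), g.max()
    levels = 2**bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    q = np.round((g - lo) / scale).astype(np.uint8)
    return q, lo, scale

def dequantize(q, lo, scale):
    """Receiver reconstructs an approximate gradient from the codes."""
    return q.astype(np.float64) * scale + lo

rng = np.random.default_rng(1)
g = rng.normal(size=10_000)
q, lo, scale = quantize_uniform(g)
g_hat = dequantize(q, lo, scale)
# Per-entry reconstruction error is bounded by half a quantization step.
```

The per-entry error is at most `scale / 2`, so accuracy degrades gracefully as `bits` shrinks.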
Sparse random networks for communication-efficient federated learning
One main challenge in federated learning is the large communication cost of exchanging
weight updates from clients to the server at each round. While prior work has made great …
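The core trick in this line of work is that random weights need not be communicated at all: server and clients regenerate them from a shared seed, so only a binary mask crosses the network. A hedged sketch of that idea (the mask here is a stand-in, not a learned one):

```python
import numpy as np

def shared_random_weights(seed, shape):
    """Server and clients regenerate identical random weights from a shared
    seed, so only a 1-bit-per-weight mask needs to be communicated."""
    return np.random.default_rng(seed).normal(size=shape)

w_server = shared_random_weights(42, (4, 4))
w_client = shared_random_weights(42, (4, 4))
mask = np.abs(w_client) > 0.5        # stand-in for a mask learned locally
# The client sends only `mask`; the server recovers the same subnetwork
# because w_server and w_client are bit-identical.
```

Communication drops from 32 bits per weight to 1, at the cost of learning which weights to keep rather than their values.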
Time-correlated sparsification for communication-efficient federated learning
Federated learning (FL) enables multiple clients to collaboratively train a shared model, with
the help of a parameter server (PS), without disclosing their local datasets. However, due to …
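The building block such sparsification schemes refine is plain top-k selection: each client transmits only the k largest-magnitude entries of its update as (index, value) pairs. A minimal sketch (illustrative, not the paper's time-correlated variant):

```python
import numpy as np

def top_k(update, k):
    """Keep only the k largest-magnitude entries; zero out the rest.
    The client sends just k (index, value) pairs instead of the full vector."""
    idx = np.argpartition(np.abs(update), -k)[-k:]
    sparse = np.zeros_like(update)
    sparse[idx] = update[idx]
    return sparse

u = np.array([0.1, -3.0, 0.05, 2.0, -0.2])
sparse_u = top_k(u, 2)   # only -3.0 and 2.0 survive
```

Time-correlated variants exploit the observation that the selected index set changes slowly across rounds, so it can itself be encoded cheaply.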
Collaborative Learning over Wireless Networks: An Introductory Overview
The number of devices connected to the Internet has already surpassed 1 billion. With the
increasing proliferation of mobile devices, the amount of data collected and transmitted over …
Resfed: Communication efficient federated learning with deep compressed residuals
Federated learning allows for cooperative training among distributed clients by sharing their
locally learned model parameters, such as weights or gradients. However, as model size …
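The residual idea is to transmit the change in the model since the last round, compressed, rather than the full weights; the receiver applies it to its cached copy. A hedged sketch of that pattern (top-k used as a stand-in compressor; not the paper's codec):

```python
import numpy as np

def residual_message(w_new, w_prev, k=2):
    """Send the sparsified model residual (change since last round)
    instead of the full weight vector."""
    r = w_new - w_prev
    idx = np.argpartition(np.abs(r), -k)[-k:]
    msg = np.zeros_like(r)
    msg[idx] = r[idx]
    return msg

w_prev = np.array([1.0, 2.0, 3.0, 4.0])
w_new  = np.array([1.0, 2.5, 3.0, 2.0])
w_recv = w_prev + residual_message(w_new, w_prev)
# w_recv equals w_new here: both nonzero residual entries were kept.
```

Residuals are typically far more compressible than raw weights because most entries barely change between rounds.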
Exact optimality of communication-privacy-utility tradeoffs in distributed mean estimation
We study the mean estimation problem under communication and local differential privacy
constraints. While previous work has proposed order-optimal algorithms for the same …
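To make the communication constraint concrete: an unbiased 1-bit stochastic quantizer lets each client send a single bit per coordinate while the server still recovers the mean in expectation. This is a generic sketch of the problem setting, not the paper's order-optimal scheme:

```python
import numpy as np

def one_bit_encode(x, lo=-1.0, hi=1.0, rng=None):
    """Unbiased 1-bit stochastic quantizer on [lo, hi]: send hi with
    probability (x - lo)/(hi - lo), else lo, so E[message] = x."""
    rng = rng or np.random.default_rng()
    p = (x - lo) / (hi - lo)
    return np.where(rng.random(x.shape) < p, hi, lo)

rng = np.random.default_rng(2)
clients = rng.uniform(-1, 1, size=(100_000, 4))   # one client vector per row
messages = one_bit_encode(clients, rng=rng)
est = messages.mean(axis=0)      # server-side estimate of the mean
true = clients.mean(axis=0)
# est concentrates around true as the number of clients grows.
```

Adding privacy constraints on top of the bit budget is what makes the joint tradeoff nontrivial.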
Fedltn: Federated learning for sparse and personalized lottery ticket networks
Federated learning (FL) enables clients to collaboratively train a model, while keeping their
local training data decentralized. However, high communication costs, data heterogeneity …
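Lottery-ticket approaches rest on magnitude pruning: mask out the smallest-magnitude weights so clients train and communicate only the surviving subnetwork. A minimal sketch of that primitive (illustrative; FedLTN's full procedure involves iterative pruning and personalization):

```python
import numpy as np

def prune_mask(weights, sparsity=0.8):
    """Magnitude pruning: boolean mask that zeroes out the smallest-magnitude
    `sparsity` fraction of weights."""
    k = int(weights.size * sparsity)
    thresh = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.abs(weights) > thresh

rng = np.random.default_rng(3)
W = rng.normal(size=(10, 10))
mask = prune_mask(W, sparsity=0.8)   # roughly 20% of weights survive
```

With 80% sparsity, both local compute and per-round communication shrink accordingly.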
EF-BV: A unified theory of error feedback and variance reduction mechanisms for biased and unbiased compression in distributed optimization
In distributed or federated optimization and learning, communication between the different
computing units is often the bottleneck and gradient compression is widely used to reduce …
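Error feedback, one of the two mechanisms this theory unifies, compensates for biased compressors by carrying the compression residual into the next round. A hedged sketch on a toy quadratic (top-1 used as the biased compressor; all names and the objective are assumptions for the example, not the paper's EF-BV algorithm):

```python
import numpy as np

def compress_top1(v):
    """A biased compressor: keep only the largest-magnitude entry."""
    out = np.zeros_like(v)
    i = np.argmax(np.abs(v))
    out[i] = v[i]
    return out

def gd_with_error_feedback(grad_fn, w, lr=0.1, steps=500):
    """Error feedback: the residual e of what the compressor dropped is
    added back to the next gradient, so no information is lost forever."""
    e = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        c = compress_top1(g + e)   # only c is transmitted
        e = (g + e) - c            # remember what was dropped
        w = w - lr * c
    return w

# f(w) = 0.5 * ||w - t||^2, so grad f(w) = w - t and the minimizer is t.
t = np.array([1.0, -2.0, 3.0])
w = gd_with_error_feedback(lambda w: w - t, np.zeros(3))
# w converges to t despite transmitting one coordinate per step.
```

Without the residual term `e`, the same biased compressor can stall far from the optimum; with it, convergence to the exact minimizer is recovered.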