Faster adaptive federated learning

X Wu, F Huang, Z Hu, H Huang - Proceedings of the AAAI Conference on Artificial Intelligence, 2023 - ojs.aaai.org
Federated learning has attracted increasing attention with the emergence of distributed data.
While many federated learning algorithms have been proposed for the non-convex …

MARINA: Faster non-convex distributed learning with compression

E Gorbunov, KP Burlachenko, Z Li… - International Conference on Machine Learning, 2021 - proceedings.mlr.press
We develop and analyze MARINA: a new communication-efficient method for non-convex
distributed learning over heterogeneous datasets. MARINA employs a novel communication …

Quasi-global momentum: Accelerating decentralized deep learning on heterogeneous data

T Lin, SP Karimireddy, SU Stich, M Jaggi - arXiv preprint arXiv:2102.04761, 2021 - arxiv.org
Decentralized training of deep learning models is a key element for enabling data privacy
and on-device learning over networks. In realistic learning scenarios, the presence of …

Towards efficient communications in federated learning: A contemporary survey

Z Zhao, Y Mao, Y Liu, L Song, Y Ouyang… - Journal of the Franklin Institute, 2023 - Elsevier
In the traditional distributed machine learning scenario, the user's private data is transmitted
between clients and a central server, which results in significant potential privacy risks. In …

Federated conditional stochastic optimization

X Wu, J Sun, Z Hu, J Li, A Zhang… - Advances in Neural Information Processing Systems, 2024 - proceedings.neurips.cc
Conditional stochastic optimization has found applications in a wide range of machine
learning tasks, such as invariant learning, AUPRC maximization, and meta-learning. As the …

Fedvarp: Tackling the variance due to partial client participation in federated learning

D Jhunjhunwala, P Sharma… - Uncertainty in Artificial Intelligence, 2022 - proceedings.mlr.press
Data-heterogeneous federated learning (FL) systems suffer from two significant sources of
convergence error: 1) client drift error caused by performing multiple local optimization steps …

Dynamic regularized sharpness aware minimization in federated learning: Approaching global consistency and smooth landscape

Y Sun, L Shen, S Chen, L Ding… - International Conference on Machine Learning, 2023 - proceedings.mlr.press
In federated learning (FL), a set of local clients is coordinated by a global server to
cooperatively train one model while preserving privacy. Due to the multiple …

Stem: A stochastic two-sided momentum algorithm achieving near-optimal sample and communication complexities for federated learning

P Khanduri, P Sharma, H Yang… - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Federated Learning (FL) refers to the paradigm where multiple worker nodes (WNs) build a
joint model by using local data. Despite extensive research, for a generic non-convex FL …

On the unreasonable effectiveness of federated averaging with heterogeneous data

J Wang, R Das, G Joshi, S Kale, Z Xu… - arXiv preprint arXiv …, 2022 - arxiv.org
Existing theory predicts that data heterogeneity will degrade the performance of the
Federated Averaging (FedAvg) algorithm in federated learning. However, in practice, the …
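
For reference alongside the FedAvg-related entries above, the following is a minimal sketch of plain Federated Averaging on a synthetic least-squares task with client-specific optima (to mimic data heterogeneity). It is a generic illustration, not the method of any cited paper; all function names and hyperparameters (local_sgd, steps, lr, batch) are assumptions chosen for the example.

import numpy as np

# Minimal FedAvg sketch: clients run a few local SGD steps from the current
# global model, and the server averages the returned models each round.
rng = np.random.default_rng(0)
d, n_clients = 10, 5

clients = []
for _ in range(n_clients):
    A = rng.normal(size=(50, d))
    x_star = rng.normal(size=d)              # client-specific optimum -> heterogeneity
    b = A @ x_star + 0.1 * rng.normal(size=50)
    clients.append((A, b))

def local_sgd(w, A, b, steps=10, lr=0.01, batch=8):
    """Run a few local SGD steps on one client's least-squares loss."""
    w = w.copy()
    for _ in range(steps):
        idx = rng.choice(len(b), size=batch, replace=False)
        grad = A[idx].T @ (A[idx] @ w - b[idx]) / batch
        w -= lr * grad
    return w

w_global = np.zeros(d)
for rnd in range(100):
    # Broadcast the global model, train locally, then average (uniform weights).
    local_models = [local_sgd(w_global, A, b) for A, b in clients]
    w_global = np.mean(local_models, axis=0)

loss = np.mean([np.mean((A @ w_global - b) ** 2) for A, b in clients])
print(f"average client loss after FedAvg: {loss:.4f}")

Because each client's optimum differs, the locally trained models drift apart before averaging; this "client drift" under heterogeneity is the issue the FedAvg, FedVarp, and bias-variance-reduced local SGD entries in this list analyze or address.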

Bias-variance reduced local SGD for less heterogeneous federated learning

T Murata, T Suzuki - arXiv preprint arXiv:2102.03198, 2021 - arxiv.org
Recently, local SGD has received much attention and has been extensively studied in the
distributed learning community as a way to overcome the communication bottleneck problem. However, the …