Communication-efficient distributed deep learning: A comprehensive survey

Z Tang, S Shi, W Wang, B Li, X Chu - arXiv preprint arXiv:2003.06307, 2020 - arxiv.org
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …
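
Among the communication-reduction techniques surveys like this one cover, gradient sparsification is representative: each worker ships only the largest-magnitude gradient entries instead of the dense tensor. A minimal top-k sketch in NumPy (illustrative only; topk_sparsify and desparsify are hypothetical names, not from the paper):

    import numpy as np

    def topk_sparsify(grad, k):
        # Keep the k largest-magnitude entries; send (indices, values)
        # instead of the dense gradient to cut communication volume.
        flat = grad.ravel()
        idx = np.argpartition(np.abs(flat), -k)[-k:]
        return idx, flat[idx]

    def desparsify(idx, values, shape):
        # Rebuild a dense gradient from the sparse message on the receiver.
        flat = np.zeros(int(np.prod(shape)))
        flat[idx] = values
        return flat.reshape(shape)

    g = np.random.randn(100, 100)
    idx, vals = topk_sparsify(g, k=100)      # transmit ~1% of the entries
    g_hat = desparsify(idx, vals, g.shape)

Real systems typically add error feedback (accumulating the dropped residual locally) so the sparsification error does not compound across iterations.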

A survey on optimization techniques for edge artificial intelligence (AI)

C Surianarayanan, JJ Lawrence, PR Chelliah… - Sensors, 2023 - mdpi.com
Artificial Intelligence (AI) models are being produced and used to solve a variety of current
and future business and technical problems. Therefore, AI model engineering processes …

Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks

T Hoefler, D Alistarh, T Ben-Nun, N Dryden… - Journal of Machine …, 2021 - jmlr.org
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
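
As a rough sketch of the unstructured magnitude pruning this literature starts from (a generic illustration, not Hoefler et al.'s specific criteria or schedule; prune_by_magnitude is a hypothetical helper):

    import numpy as np

    def prune_by_magnitude(weights, sparsity):
        # Zero out the smallest-magnitude fraction of weights, e.g.
        # sparsity=0.9 removes the 90% of entries closest to zero.
        flat = np.abs(weights).ravel()
        k = int(sparsity * flat.size)
        if k == 0:
            return weights.copy(), np.ones_like(weights, dtype=bool)
        threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
        mask = np.abs(weights) > threshold            # ties may zero slightly more
        return weights * mask, mask

    w = np.random.randn(256, 256)
    w_pruned, mask = prune_by_magnitude(w, sparsity=0.9)

In practice the mask is reapplied or regrown during further training; pruning once and stopping usually costs accuracy.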

Federated learning based on dynamic regularization

DAE Acar, Y Zhao, RM Navarro, M Mattina… - arXiv preprint arXiv …, 2021 - arxiv.org
We propose a novel federated learning method for distributed training of neural network
models, where the server orchestrates cooperation between a subset of randomly chosen …

Hermes: an efficient federated learning framework for heterogeneous mobile clients

A Li, J Sun, P Li, Y Pu, H Li, Y Chen - Proceedings of the 27th Annual …, 2021 - dl.acm.org
Federated learning (FL) has been a popular method to achieve distributed machine learning
among numerous devices without sharing their data with a cloud server. FL aims to learn a …
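
Several entries in this list build on the same federated averaging baseline: broadcast the global model, run local SGD on each client's private data, and average the returned models weighted by dataset size. A bare-bones sketch of that pattern (generic FedAvg, not Hermes itself; local_sgd and the synthetic client data are stand-ins):

    import numpy as np

    def local_sgd(w, data, lr=0.1, steps=5):
        # Stand-in for a client's local update; least-squares SGD here.
        X, y = data
        for _ in range(steps):
            w = w - lr * (X.T @ (X @ w - y)) / len(y)
        return w

    def fedavg_round(w_global, clients):
        # One round: clients train locally, server averages the results
        # weighted by local dataset size. Raw data never leaves a client.
        sizes = np.array([len(y) for _, y in clients], dtype=float)
        updates = [local_sgd(w_global.copy(), d) for d in clients]
        return sum(s * w for s, w in zip(sizes / sizes.sum(), updates))

    rng = np.random.default_rng(0)
    clients = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(4)]
    w = np.zeros(5)
    for _ in range(10):
        w = fedavg_round(w, clients)

Hermes and FedMask depart from this baseline by exchanging pruned or masked subnetworks rather than full dense models, which is where their communication and personalization gains come from.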

Bayesian model-agnostic meta-learning

J Yoon, T Kim, O Dia, S Kim… - Advances in neural …, 2018 - proceedings.neurips.cc
Due to the inherent model uncertainty, learning to infer the Bayesian posterior from a few-shot
dataset is an important step towards robust meta-learning. In this paper, we propose a novel …

Fedmask: Joint computation and communication-efficient personalized federated learning via heterogeneous masking

A Li, J Sun, X Zeng, M Zhang, H Li, Y Chen - Proceedings of the 19th …, 2021 - dl.acm.org
Recent advances in deep neural networks (DNNs) have enabled various mobile deep
learning applications. However, it is technically challenging to locally train a DNN model due …

Vulnerabilities in federated learning

N Bouacida, P Mohapatra - IEEE Access, 2021 - ieeexplore.ieee.org
With more regulations tackling the protection of users' privacy-sensitive data in recent years,
access to such data has become increasingly restricted. A new decentralized training …

A guide through the zoo of biased SGD

Y Demidovich, G Malinovsky… - Advances in Neural …, 2023 - proceedings.neurips.cc
Stochastic Gradient Descent (SGD) is arguably the most important single algorithm
in modern machine learning. Although SGD with unbiased gradient estimators has been …
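
For reference, the unbiased setting this abstract contrasts with is the standard SGD update with a stochastic gradient estimator g (textbook notation, not the paper's):

    x_{t+1} = x_t - \gamma_t \, g(x_t), \qquad \mathbb{E}[\, g(x_t) \mid x_t \,] = \nabla f(x_t).

Biased SGD drops the conditional-expectation equality: estimators produced by, e.g., top-k sparsification or gradient clipping generally satisfy \mathbb{E}[g(x_t) \mid x_t] \neq \nabla f(x_t), and the convergence analysis must account for that bias.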

Timely communication in federated learning

B Buyukates, S Ulukus - IEEE INFOCOM 2021-IEEE …, 2021 - ieeexplore.ieee.org
We consider a federated learning framework in which a parameter server (PS) trains a
global model by using n clients without actually storing the client data centrally at a cloud …