Distributed artificial intelligence empowered by end-edge-cloud computing: A survey

S Duan, D Wang, J Ren, F Lyu, Y Zhang… - … Surveys & Tutorials, 2022 - ieeexplore.ieee.org
As the computing paradigm shifts from cloud computing to end-edge-cloud computing, artificial
intelligence is likewise evolving from a centralized manner to a distributed one …

Communication-efficient distributed learning: An overview

X Cao, T Başar, S Diggavi, YC Eldar… - IEEE journal on …, 2023 - ieeexplore.ieee.org
Distributed learning is envisioned as the bedrock of next-generation intelligent networks,
where intelligent agents, such as mobile devices, robots, and sensors, exchange information …

Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks

T Hoefler, D Alistarh, T Ben-Nun, N Dryden… - Journal of Machine …, 2021 - jmlr.org
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …

Model pruning enables efficient federated learning on edge devices

Y Jiang, S Wang, V Valls, BJ Ko… - … on Neural Networks …, 2022 - ieeexplore.ieee.org
Federated learning (FL) allows model training from local data collected by edge/mobile
devices while preserving data privacy, which has wide applicability to image and vision …
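The pruning idea behind work like this can be illustrated with simple magnitude-based weight pruning: zero out the smallest-magnitude weights so each client stores and transmits a smaller model. A minimal sketch (the function name and threshold rule are illustrative, not the paper's algorithm):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights (illustrative sketch)."""
    w = weights.copy()
    k = int(sparsity * w.size)  # number of weights to remove
    if k == 0:
        return w
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(w), k - 1, axis=None)[k - 1]
    w[np.abs(w) <= threshold] = 0.0  # ties at the threshold are also pruned
    return w

w = np.array([[0.9, -0.05], [0.01, -0.7]])
pruned = magnitude_prune(w, 0.5)  # keeps only the two largest-magnitude weights
```

In a federated setting, only the surviving nonzero weights (plus their indices) need to be exchanged, which is where the communication savings come from.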

Adaptive gradient sparsification for efficient federated learning: An online learning approach

P Han, S Wang, KK Leung - 2020 IEEE 40th international …, 2020 - ieeexplore.ieee.org
Federated learning (FL) is an emerging technique for training machine learning models
using geographically dispersed data collected by local entities. It includes local computation …
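Gradient sparsification in this line of work typically means each client sends only its largest-magnitude gradient entries per round. A minimal top-k sketch, assuming a NumPy gradient vector (the helper names are illustrative; the paper's contribution is adapting the degree of sparsity online, which is not shown here):

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep only the k largest-magnitude gradient entries (top-k sparsification)."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of the k largest entries
    return idx, flat[idx]                          # what a client would transmit

def densify(idx, values, shape):
    """Server-side reconstruction of the sparse update as a dense tensor."""
    out = np.zeros(int(np.prod(shape)))
    out[idx] = values
    return out.reshape(shape)

g = np.array([0.02, -1.5, 0.3, 0.001, 0.8])
idx, vals = topk_sparsify(g, 2)
recon = densify(idx, vals, g.shape)  # only the two largest entries survive
```

Transmitting k (index, value) pairs instead of the full gradient reduces uplink traffic roughly in proportion to the sparsity level.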

Towards efficient communications in federated learning: A contemporary survey

Z Zhao, Y Mao, Y Liu, L Song, Y Ouyang… - Journal of the Franklin …, 2023 - Elsevier
In the traditional distributed machine learning scenario, the user's private data is transmitted
between clients and a central server, which results in significant potential privacy risks. In …

Communication-efficient distributed deep learning: A comprehensive survey

Z Tang, S Shi, W Wang, B Li, X Chu - arXiv preprint arXiv:2003.06307, 2020 - arxiv.org
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …

1-bit Adam: Communication-efficient large-scale training with Adam's convergence speed

H Tang, S Gan, AA Awan… - International …, 2021 - proceedings.mlr.press
Scalable training of large models (like BERT and GPT-3) requires careful optimization
rooted in model design, architecture, and system capabilities. From a system standpoint …
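The communication-saving core of 1-bit-style methods is sign compression with error feedback: each entry is reduced to one bit plus a shared scale, and the quantization residual is carried into the next round. A minimal sketch of that mechanism alone, not the paper's full algorithm (1-bit Adam additionally runs a full-precision warmup phase and compresses Adam's momentum):

```python
import numpy as np

class OneBitCompressor:
    """Sign-based 1-bit compression with error feedback (illustrative sketch)."""

    def __init__(self, dim):
        self.error = np.zeros(dim)  # residual kept locally between rounds

    def compress(self, update):
        corrected = update + self.error          # add back last round's residual
        scale = np.abs(corrected).mean()         # one shared magnitude per tensor
        compressed = scale * np.sign(corrected)  # 1 bit per entry + one float
        self.error = corrected - compressed      # remember what was lost
        return compressed

comp = OneBitCompressor(4)
u = np.array([0.4, -0.2, 0.1, -0.3])
sent = comp.compress(u)  # every entry becomes +/- mean(|u|)
```

Error feedback is what keeps the compressed updates unbiased over time: whatever a round rounds away is re-injected in the next round, which is key to matching the convergence speed of the uncompressed optimizer.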

Blockchain-supported federated learning for trustworthy vehicular networks

S Otoum, I Al Ridhawi… - GLOBECOM 2020-2020 …, 2020 - ieeexplore.ieee.org
The advances in today's IoT devices and machine learning methods have given rise to the
concept of Federated Learning. Through such a technique, a plethora of network devices …

A survey of what to share in federated learning: Perspectives on model utility, privacy leakage, and communication efficiency

J Shao, Z Li, W Sun, T Zhou, Y Sun, L Liu, Z Lin… - arXiv preprint arXiv …, 2023 - arxiv.org
Federated learning (FL) has emerged as a secure paradigm for collaborative training among
clients. Without data centralization, FL allows clients to share local information in a privacy …