Distributed artificial intelligence empowered by end-edge-cloud computing: A survey
As the computing paradigm shifts from cloud computing to end-edge-cloud computing, it
also supports artificial intelligence evolving from a centralized manner to a distributed one …
Communication-efficient distributed learning: An overview
Distributed learning is envisioned as the bedrock of next-generation intelligent networks,
where intelligent agents, such as mobile devices, robots, and sensors, exchange information …
Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
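The pruning approach surveyed above is most commonly instantiated as magnitude pruning: zero out the fraction of weights with the smallest absolute values. A minimal sketch (the function name and example values are illustrative, not from the survey):

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude `sparsity` fraction of weights."""
    k = int(weights.size * sparsity)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

w = np.array([0.9, -0.05, 0.4, 0.01, -0.7, 0.02])
pruned_w = magnitude_prune(w, 0.5)  # half the entries are zeroed
```

In practice the threshold is applied per layer or globally, and pruning is interleaved with fine-tuning; this sketch shows only the selection criterion.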
Model pruning enables efficient federated learning on edge devices
Federated learning (FL) allows model training from local data collected by edge/mobile
devices while preserving data privacy, which has wide applicability to image and vision …
Adaptive gradient sparsification for efficient federated learning: An online learning approach
Federated learning (FL) is an emerging technique for training machine learning models
using geographically dispersed data collected by local entities. It includes local computation …
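Gradient sparsification as studied in work like the above typically means top-k selection: each client transmits only its k largest-magnitude gradient entries. A minimal sketch of that primitive (names and values are illustrative; the cited paper's contribution is choosing k adaptively, which is not shown here):

```python
import numpy as np

def topk_sparsify(grad: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest-magnitude entries of a gradient vector."""
    idx = np.argpartition(np.abs(grad), -k)[-k:]  # indices of top-k by magnitude
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]  # everything else is dropped (not transmitted)
    return sparse

g = np.array([0.1, -2.0, 0.3, 1.5, -0.05])
sparse_g = topk_sparsify(g, 2)  # only the two dominant entries survive
```

Only the (index, value) pairs of the surviving entries need to be sent, reducing uplink traffic from d floats to k pairs per round.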
Towards efficient communications in federated learning: A contemporary survey
In the traditional distributed machine learning scenario, the user's private data is transmitted
between clients and a central server, which results in significant potential privacy risks. In …
Communication-efficient distributed deep learning: A comprehensive survey
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …
1-bit Adam: Communication efficient large-scale training with Adam's convergence speed
Scalable training of large models (like BERT and GPT-3) requires careful optimization
rooted in model design, architecture, and system capabilities. From a system standpoint …
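The core communication trick behind 1-bit compression schemes of this kind is sign quantization with error feedback: send only the sign of each update (plus one shared scale), and carry the quantization residual into the next round so the error does not accumulate. A minimal sketch under those assumptions (this is the generic error-feedback pattern, not the paper's exact Adam-specific algorithm):

```python
import numpy as np

def onebit_compress(update: np.ndarray, error: np.ndarray):
    """Sign-compress an update with error feedback.

    Returns the compressed update (1 bit per entry plus one float scale)
    and the residual to feed back into the next round.
    """
    corrected = update + error                # fold in last round's residual
    scale = np.mean(np.abs(corrected))        # single shared magnitude
    compressed = scale * np.sign(corrected)   # each entry reduced to +/- scale
    new_error = corrected - compressed        # residual carried forward
    return compressed, new_error

u = np.array([0.5, -1.0, 0.25, -0.25])
c, e = onebit_compress(u, np.zeros_like(u))
```

By construction `c + e` equals the error-corrected update exactly, which is what makes the compression error bounded across rounds instead of compounding.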
Blockchain-supported federated learning for trustworthy vehicular networks
S Otoum, I Al Ridhawi… - GLOBECOM 2020-2020 …, 2020 - ieeexplore.ieee.org
The advances in today's IoT devices and machine learning methods have given rise to the
concept of Federated Learning. Through such a technique, a plethora of network devices …
A survey of what to share in federated learning: Perspectives on model utility, privacy leakage, and communication efficiency
Federated learning (FL) has emerged as a secure paradigm for collaborative training among
clients. Without data centralization, FL allows clients to share local information in a privacy …