Communication-efficient distributed deep learning: A comprehensive survey
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …
Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
Model pruning enables efficient federated learning on edge devices
Federated learning (FL) allows model training from local data collected by edge/mobile
devices while preserving data privacy, which has wide applicability to image and vision …
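As a rough illustration of the model-pruning idea referred to above, here is a minimal magnitude-pruning sketch in NumPy. The function name and the `sparsity` parameter are illustrative assumptions, not taken from any of the cited papers:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude weights so that roughly a
    `sparsity` fraction of entries become zero.

    A generic magnitude-pruning sketch; real pruning systems typically
    prune iteratively and fine-tune the remaining weights afterwards."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask
```

A pruned model then only needs to store (and, in federated settings, communicate) the surviving nonzero weights.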
A comprehensive survey on training acceleration for large machine learning models in IoT
Ever-growing artificial intelligence (AI) applications have greatly reshaped our world in
many areas, such as smart home, computer vision, and natural language processing. Behind …
GossipFL: A decentralized federated learning framework with sparsified and adaptive communication
Recently, federated learning (FL) techniques have enabled multiple users to train machine
learning models collaboratively without data sharing. However, existing FL algorithms suffer …
Adaptive gradient sparsification for efficient federated learning: An online learning approach
Federated learning (FL) is an emerging technique for training machine learning models
using geographically dispersed data collected by local entities. It includes local computation …
Joint model pruning and device selection for communication-efficient federated edge learning
In recent years, wireless federated learning (FL) has been proposed to support mobile
intelligent applications over wireless networks, which protects data privacy and …
Near-optimal sparse allreduce for distributed deep learning
Communication overhead is one of the major obstacles to train large deep learning models
at scale. Gradient sparsification is a promising technique to reduce the communication …
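As a rough illustration of the gradient-sparsification idea these entries refer to, here is a minimal top-k sketch in NumPy. The function names and the `(indices, values)` wire format are assumptions for illustration, not the method of any cited paper:

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries of a gradient; drop the rest.

    Returns the (indices, values) pair that a worker would communicate
    instead of the full dense gradient."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def densify(idx, vals, shape):
    """Rebuild a dense gradient from a communicated sparse pair."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = vals
    return flat.reshape(shape)
```

With k much smaller than the gradient size, each worker sends far fewer values per step; practical systems usually accumulate the dropped residuals locally so that no gradient information is permanently lost.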
Compressed-VFL: Communication-efficient learning with vertically partitioned data
We propose Compressed Vertical Federated Learning (C-VFL) for communication-efficient
training on vertically partitioned data. In C-VFL, a server and multiple parties …
Towards scalable distributed training of deep learning on public cloud clusters
Distributed training techniques have been widely deployed in training large-scale deep
models on dense-GPU clusters. However, on public cloud clusters, due to the moderate …