A comprehensive survey on training acceleration for large machine learning models in IoT
Ever-growing artificial intelligence (AI) applications have greatly reshaped our world in
many areas, e.g., smart home, computer vision, natural language processing, etc. Behind …
Communication compression techniques in distributed deep learning: A survey
Nowadays, training data and neural network models are growing increasingly large, and the
training time of deep learning becomes unbearably long on a single machine. To reduce …
Federated learning over wireless device-to-device networks: Algorithms and convergence analysis
The proliferation of Internet-of-Things (IoT) devices and cloud-computing applications over
siloed data centers is motivating renewed interest in the collaborative training of a shared …
Quasi-global momentum: Accelerating decentralized deep learning on heterogeneous data
Decentralized training of deep learning models is a key element for enabling data privacy
and on-device learning over networks. In realistic learning scenarios, the presence of …
1% vs 100%: Parameter-efficient low rank adapter for dense predictions
Fine-tuning large-scale pre-trained vision models to downstream tasks is a standard
technique for achieving state-of-the-art performance on computer vision benchmarks …
Cross-gradient aggregation for decentralized learning from non-iid data
Decentralized learning enables a group of collaborative agents to learn models using a
distributed dataset without the need for a central parameter server. Recently, decentralized …
Datalens: Scalable privacy preserving training via gradient compression and aggregation
The recent success of deep neural networks (DNNs) hinges on the availability of large-scale
datasets; however, training on such datasets often poses privacy risks for sensitive training …
Rank diminishing in deep neural networks
The rank of neural networks measures information flowing across layers. It is an instance of
a key structural condition that applies across broad domains of machine learning. In …
Layer-wise adaptive model aggregation for scalable federated learning
In Federated Learning (FL), a common approach for aggregating local solutions
across clients is periodic full model averaging. It is, however, known that different layers of …
Secure decentralized image classification with multiparty homomorphic encryption
Decentralized image classification plays a key role in various scenarios due to its attractive
properties, including tolerance of high network latency and reduced susceptibility to single-point failures …