Communication-efficient distributed deep learning: A comprehensive survey
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …
A survey on optimization techniques for edge artificial intelligence (AI)
C Surianarayanan, JJ Lawrence, PR Chelliah… - Sensors, 2023 - mdpi.com
Artificial Intelligence (AI) models are being produced and used to solve a variety of current
and future business and technical problems. Therefore, AI model engineering processes …
Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
Federated learning based on dynamic regularization
We propose a novel federated learning method for distributively training neural network
models, where the server orchestrates cooperation between a subset of randomly chosen …
Hermes: an efficient federated learning framework for heterogeneous mobile clients
Federated learning (FL) has been a popular method to achieve distributed machine learning
among numerous devices without sharing their data with a cloud server. FL aims to learn a …
Bayesian model-agnostic meta-learning
Due to the inherent model uncertainty, learning to infer the Bayesian posterior from a few-shot
dataset is an important step towards robust meta-learning. In this paper, we propose a novel …
Fedmask: Joint computation and communication-efficient personalized federated learning via heterogeneous masking
Recent advancements in deep neural networks (DNN) enabled various mobile deep
learning applications. However, it is technically challenging to locally train a DNN model due …
Vulnerabilities in federated learning
N Bouacida, P Mohapatra - IEEE Access, 2021 - ieeexplore.ieee.org
With more regulations tackling the protection of users' privacy-sensitive data in recent years,
access to such data has become increasingly restricted. A new decentralized training …
A guide through the zoo of biased SGD
Y Demidovich, G Malinovsky… - Advances in Neural …, 2023 - proceedings.neurips.cc
Stochastic Gradient Descent (SGD) is arguably the most important single algorithm
in modern machine learning. Although SGD with unbiased gradient estimators has been …
Timely communication in federated learning
B Buyukates, S Ulukus - IEEE INFOCOM 2021-IEEE …, 2021 - ieeexplore.ieee.org
We consider a federated learning framework in which a parameter server (PS) trains a
global model by using n clients without actually storing the client data centrally at a cloud …