Distributed learning in wireless networks: Recent progress and future challenges

M Chen, D Gündüz, K Huang, W Saad… - IEEE Journal on …, 2021 - ieeexplore.ieee.org
The next generation of wireless networks will enable many machine learning (ML) tools and
applications to efficiently analyze various types of data collected by edge devices for …
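
A representative scheme behind this kind of edge-device learning is federated averaging, in which devices train locally on their own data and a server averages their models. The sketch below is a minimal NumPy illustration with a toy linear model and synthetic client data invented for the example; it is not an algorithm taken from the survey itself.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, steps=5):
    """One client's local training: full-batch gradient descent on a
    linear least-squares model (kept deliberately simple for the sketch)."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def federated_averaging(w, clients, rounds=20):
    """Each round: every client refines the global model locally, then the
    server averages the results weighted by local dataset size."""
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    for _ in range(rounds):
        local = [local_update(w, X, y) for X, y in clients]
        w = sum(n * wl for n, wl in zip(sizes, local)) / sizes.sum()
    return w

# Toy setup: three edge devices holding noisy samples of the same linear task.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=20)))

print(federated_averaging(np.zeros(2), clients))  # close to [2. -1.]
```

In a wireless setting, the averaging step is where communication constraints enter, which is where the survey's focus lies.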

A review of convolutional neural network architectures and their optimizations

S Cong, Y Zhou - Artificial Intelligence Review, 2023 - Springer
The research advances concerning the typical architectures of convolutional neural
networks (CNNs) as well as their optimizations are analyzed and elaborated in detail in this …

Beyond transmitting bits: Context, semantics, and task-oriented communications

D Gündüz, Z Qin, IE Aguerri, HS Dhillon… - IEEE Journal on …, 2022 - ieeexplore.ieee.org
Communication systems to date primarily aim at reliably communicating bit sequences.
Such an approach provides efficient engineering designs that are agnostic to the meanings …

Blueprint separable residual network for efficient image super-resolution

Z Li, Y Liu, X Chen, H Cai, J Gu… - Proceedings of the …, 2022 - openaccess.thecvf.com
Recent advances in single image super-resolution (SISR) have achieved extraordinary
performance, but the computational cost is too heavy to deploy on edge devices. To alleviate …
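
The efficiency in the title comes from building the network out of blueprint separable convolutions. As a hedged sketch (my reading of the building block, not the paper's exact layer), the unconstrained form is a 1x1 pointwise convolution that mixes channels, followed by a depthwise convolution applied per channel:

```python
import torch
import torch.nn as nn

class BlueprintSeparableConv(nn.Module):
    """Assumed unconstrained blueprint-separable convolution: a 1x1 pointwise
    conv that mixes channels, then a kxk depthwise conv per output channel."""
    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.depthwise = nn.Conv2d(out_ch, out_ch, kernel_size,
                                   padding=kernel_size // 2, groups=out_ch, bias=False)

    def forward(self, x):
        return self.depthwise(self.pointwise(x))

x = torch.randn(1, 64, 32, 32)
print(BlueprintSeparableConv(64, 64)(x).shape)  # torch.Size([1, 64, 32, 32])
```

Compared with a full 3x3 convolution over 64 channels (64*64*9 = 36,864 weights), this factorization needs 64*64 + 64*9 = 4,672 weights, which is where the savings come from.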

Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks

T Hoefler, D Alistarh, T Ben-Nun, N Dryden… - Journal of Machine …, 2021 - jmlr.org
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
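
To make "selectively pruning components" concrete, the sketch below applies plain magnitude pruning to a weight tensor; the 90% sparsity target and the random tensor are arbitrary choices for illustration, not a recommendation from the survey.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the smallest-magnitude entries so that roughly `sparsity`
    fraction is removed. Returns the pruned weights and a mask (1 = kept)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy(), np.ones_like(weights)
    threshold = np.partition(flat, k - 1)[k - 1]   # k-th smallest magnitude
    mask = (np.abs(weights) > threshold).astype(weights.dtype)
    return weights * mask, mask

rng = np.random.default_rng(0)
W = rng.normal(size=(256, 256))
W_pruned, mask = magnitude_prune(W, sparsity=0.9)
print(1.0 - mask.mean())  # ~0.9 of the entries are now zero
```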

Pruning and quantization for deep neural network acceleration: A survey

T Liang, J Glossner, L Wang, S Shi, X Zhang - Neurocomputing, 2021 - Elsevier
Deep neural networks have been applied in many domains, exhibiting extraordinary
abilities in the field of computer vision. However, complex network architectures challenge …
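
On the quantization side of the title, a minimal post-training symmetric int8 quantizer looks like the following; the scheme is the generic textbook one, not a method drawn from the survey.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric post-training quantization: one scale per tensor, ints in [-127, 127]."""
    scale = float(np.abs(w).max()) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(scale=0.05, size=(128, 128)).astype(np.float32)
q, scale = quantize_int8(w)
print(np.abs(w - dequantize(q, scale)).max() <= scale / 2 + 1e-8)  # True: error bounded by half a step
```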

Structured pruning for deep convolutional neural networks: A survey

Y He, L Xiao - IEEE Transactions on Pattern Analysis and …, 2023 - ieeexplore.ieee.org
The remarkable performance of deep convolutional neural networks (CNNs) is generally
attributed to their deeper and wider architectures, which can come with significant …

Binary neural networks: A survey

H Qin, R Gong, X Liu, X Bai, J Song, N Sebe - Pattern Recognition, 2020 - Elsevier
The binary neural network, which greatly reduces storage and computation, is a
promising technique for deploying deep models on resource-limited devices. However, the …
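
A common way to realize such binarization is to take the sign of the weights in the forward pass and propagate gradients with a straight-through estimator. The PyTorch sketch below illustrates that generic recipe (with a mean-absolute-value scale); it is an assumption-laden example, not the specific methods surveyed.

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    """Forward: scaled sign of the weights (1-bit values times a mean-|w| scale).
    Backward: straight-through estimator; the gradient passes where |w| <= 1."""
    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return w.abs().mean() * torch.sign(w)

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        return grad_out * (w.abs() <= 1).float()

w = torch.randn(4, 4, requires_grad=True)
w_bin = BinarizeSTE.apply(w)   # used in place of w during the forward pass
w_bin.sum().backward()         # gradients still flow back to the real-valued w
print(w_bin.unique().numel())  # at most 3 distinct values: {-a, 0, +a}
```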

Importance estimation for neural network pruning

P Molchanov, A Mallya, S Tyree… - Proceedings of the …, 2019 - openaccess.thecvf.com
Structural pruning of neural network parameters reduces computational, energy, and
memory transfer costs during inference. We propose a novel method that estimates the …
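
The snippet is cut off before the estimator itself, so the sketch below stands in with a generic first-order Taylor importance score (|weight × gradient| summed per filter) and zeroes the lowest-scoring filters; the criterion, toy layer, and dummy loss are assumptions for illustration, not a reproduction of the paper's method.

```python
import torch
import torch.nn as nn

def taylor_filter_importance(conv: nn.Conv2d):
    """Score each output filter by |w * dL/dw| summed over its weights
    (a first-order Taylor estimate of the loss change if the filter is removed)."""
    return (conv.weight * conv.weight.grad).abs().sum(dim=(1, 2, 3))

def prune_lowest_filters(conv: nn.Conv2d, n_prune: int):
    """Zero out the n_prune least important filters (mask-based structured pruning)."""
    scores = taylor_filter_importance(conv)
    idx = torch.argsort(scores)[:n_prune]
    with torch.no_grad():
        conv.weight[idx] = 0
        if conv.bias is not None:
            conv.bias[idx] = 0
    return idx

# Toy usage: one conv layer, a dummy loss to populate gradients, then prune 8 filters.
conv = nn.Conv2d(16, 32, 3, padding=1)
x = torch.randn(2, 16, 8, 8)
conv(x).pow(2).mean().backward()
print(prune_lowest_filters(conv, n_prune=8))
```

In practice the scores would be accumulated over many mini-batches before ranking the filters.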

Rethinking the value of network pruning

Z Liu, M Sun, T Zhou, G Huang, T Darrell - arXiv preprint arXiv:1810.05270, 2018 - arxiv.org
Network pruning is widely used for reducing the heavy inference cost of deep models in
low-resource settings. A typical pruning algorithm is a three-stage pipeline, i.e., training (a large …
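
The three-stage pipeline named in the snippet (train a large model, prune it, fine-tune the result) can be sketched as below; the model, data, and magnitude criterion are placeholders chosen for illustration, not the paper's experimental setup.

```python
import torch
import torch.nn as nn

def apply_masks(model, masks):
    """Keep pruned weights at exactly zero."""
    with torch.no_grad():
        for name, p in model.named_parameters():
            if name in masks:
                p.mul_(masks[name])

def train(model, data, epochs=5, lr=1e-2, masks=None):
    """Stages 1 and 3: plain SGD; during fine-tuning the masks are re-applied
    after every step so the sparsity pattern is preserved."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x, y in data:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
            if masks is not None:
                apply_masks(model, masks)

def magnitude_masks(model, sparsity=0.5):
    """Stage 2: per-layer masks that drop the smallest-magnitude weights."""
    masks = {}
    for name, p in model.named_parameters():
        if p.dim() > 1:                       # prune weight matrices, keep biases
            k = max(1, int(sparsity * p.numel()))
            thresh = p.abs().flatten().kthvalue(k).values
            masks[name] = (p.abs() > thresh).float()
    return masks

# Stage 1: train a (comparatively) large model.
data = [(torch.randn(32, 16), torch.randn(32, 1)) for _ in range(10)]
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
train(model, data)
# Stage 2: prune half of the weights by magnitude.
masks = magnitude_masks(model, sparsity=0.5)
apply_masks(model, masks)
# Stage 3: fine-tune the pruned model at a lower learning rate.
train(model, data, epochs=2, lr=1e-3, masks=masks)
```

As the title suggests, the paper questions whether the fine-tuned weights produced in stage 3 are what matters, as opposed to the pruned architecture itself.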