Structured pruning for deep convolutional neural networks: A survey

Y He, L Xiao - IEEE transactions on pattern analysis and …, 2023 - ieeexplore.ieee.org
The remarkable performance of deep convolutional neural networks (CNNs) is generally
attributed to their deeper and wider architectures, which can come with significant …
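
As a point of reference for the survey's subject, here is a minimal sketch of structured (filter) pruning using the common L1-norm criterion; the layer shapes and keep ratio are illustrative, and this is only one of the simplest criteria in the family the survey covers.

```python
import torch
import torch.nn as nn

def l1_filter_prune(conv: nn.Conv2d, next_conv: nn.Conv2d, keep_ratio: float = 0.5):
    """Structured pruning sketch: drop filters with the smallest L1 norm.

    Removing output channels of `conv` also removes the matching input
    channels of `next_conv`, so the network stays shape-consistent.
    """
    # L1 norm of each filter; weight shape is (out_channels, in_channels, kH, kW).
    norms = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    n_keep = max(1, int(keep_ratio * conv.out_channels))
    keep = torch.topk(norms, n_keep).indices.sort().values

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       conv.stride, conv.padding, bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()

    pruned_next = nn.Conv2d(n_keep, next_conv.out_channels, next_conv.kernel_size,
                            next_conv.stride, next_conv.padding,
                            bias=next_conv.bias is not None)
    pruned_next.weight.data = next_conv.weight.data[:, keep].clone()
    if next_conv.bias is not None:
        pruned_next.bias.data = next_conv.bias.data.clone()
    return pruned, pruned_next
```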

Survey on evolutionary deep learning: Principles, algorithms, applications, and open issues

N Li, L Ma, G Yu, B Xue, M Zhang, Y Jin - ACM Computing Surveys, 2023 - dl.acm.org
Over recent years, there has been rapid development of deep learning (DL) in both
industry and academia. However, finding the optimal hyperparameters of a DL model …
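
To make the evolutionary idea concrete, below is a toy (mu+lambda)-style search over two made-up hyperparameters; the fitness function is a stand-in for validation accuracy, and everything here is illustrative rather than taken from the survey.

```python
import random

def fitness(lr, width):
    """Stand-in for validation accuracy; a real run would train a model."""
    return -((lr - 0.01) ** 2) * 1e4 - ((width - 128) / 64) ** 2

def evolve(pop_size=10, generations=20, elite=3):
    # Each individual is a (learning_rate, layer_width) pair.
    pop = [(10 ** random.uniform(-4, -1), random.randint(16, 512))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        parents = pop[:elite]
        children = []
        while len(children) < pop_size - elite:
            lr, width = random.choice(parents)
            # Gaussian mutation on log-lr, integer jitter on width.
            lr = max(1e-5, lr * 10 ** random.gauss(0, 0.2))
            width = max(8, width + random.randint(-32, 32))
            children.append((lr, width))
        pop = parents + children
    return max(pop, key=lambda ind: fitness(*ind))

best = evolve()
print("best lr=%.4g width=%d" % best)
```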

Continual semantic segmentation with automatic memory sample selection

L Zhu, T Chen, J Yin, S See… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Continual Semantic Segmentation (CSS) extends static semantic segmentation by
incrementally introducing new classes for training. To alleviate the catastrophic forgetting …
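
The paper's contribution is replacing hand-designed selection heuristics with automatic selection; for context, a common rehearsal baseline keeps a fixed-size memory filled by reservoir sampling, sketched below (class and method names are illustrative).

```python
import random

class ReservoirMemory:
    """Fixed-size rehearsal buffer; every seen sample is retained with equal probability."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.samples = []
        self.n_seen = 0

    def add(self, sample):
        self.n_seen += 1
        if len(self.samples) < self.capacity:
            self.samples.append(sample)
        else:
            # Replace a stored sample with probability capacity / n_seen.
            j = random.randrange(self.n_seen)
            if j < self.capacity:
                self.samples[j] = sample
```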

Automatic network pruning via Hilbert-Schmidt independence criterion lasso under information bottleneck principle

S Guo, L Zhang, X Zheng, Y Wang… - Proceedings of the …, 2023 - openaccess.thecvf.com
Most existing neural network pruning methods hand-craft their importance criteria and
the structures to prune, which creates heavy and unintended dependencies on heuristics and …
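
For reference, the dependence measure named in the title can be estimated empirically in a few lines; this sketch shows the standard biased HSIC estimator with Gaussian kernels and does not reproduce the paper's lasso or information-bottleneck machinery.

```python
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC: trace(K H L H) / (n - 1)^2, H centering matrix."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    K = gaussian_gram(X, sigma)
    L = gaussian_gram(Y, sigma)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```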

Boosting zero-shot learning via contrastive optimization of attribute representations

Y Du, M Shi, F Wei, G Li - IEEE Transactions on Neural …, 2023 - ieeexplore.ieee.org
Zero-shot learning (ZSL) aims to recognize classes that do not have samples in the training
set. One representative solution is to directly learn an embedding function associating visual …
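
The embedding-based approach the snippet alludes to typically maps visual features into attribute space and scores each class by compatibility with its attribute vector; a minimal linear sketch follows, with all dimensions illustrative.

```python
import torch
import torch.nn as nn

class LinearCompat(nn.Module):
    """Map visual features into attribute space; score = dot with class attributes."""
    def __init__(self, feat_dim: int, attr_dim: int):
        super().__init__()
        self.embed = nn.Linear(feat_dim, attr_dim)

    def forward(self, feats, class_attrs):
        # feats: (batch, feat_dim); class_attrs: (num_classes, attr_dim)
        return self.embed(feats) @ class_attrs.T  # (batch, num_classes)

model = LinearCompat(feat_dim=2048, attr_dim=85)
scores = model(torch.randn(4, 2048), torch.randn(50, 85))
pred = scores.argmax(dim=1)  # unseen classes are scored via their attributes alone
```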

Compressing convolutional neural networks with hierarchical Tucker-2 decomposition

M Gabor, R Zdunek - Applied Soft Computing, 2023 - Elsevier
Convolutional neural networks (CNNs) play a crucial role and achieve top results in
computer vision tasks, but at the price of high computational and storage complexity. One …
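
For intuition, a plain (non-hierarchical) Tucker-2 step compresses only the two channel modes of a kernel W of shape (T, S, d, d) via truncated SVDs, turning one convolution into a 1x1, a small dxd, and another 1x1 convolution; the ranks below are illustrative, and the paper's hierarchical variant goes further.

```python
import numpy as np

def tucker2(W, r_out, r_in):
    """Tucker-2 of a conv kernel W (T, S, d, d) along the channel modes.

    Returns factors U (T, r_out), V (S, r_in) and core G (r_out, r_in, d, d)
    such that W is approximated by contracting G with U and V.
    """
    T, S, _, _ = W.shape
    # Mode-0 unfolding: rows are output channels.
    U, _, _ = np.linalg.svd(W.reshape(T, -1), full_matrices=False)
    U = U[:, :r_out]
    # Mode-1 unfolding: rows are input channels.
    W1 = np.transpose(W, (1, 0, 2, 3)).reshape(S, -1)
    V, _, _ = np.linalg.svd(W1, full_matrices=False)
    V = V[:, :r_in]
    # Core: project both channel modes onto the truncated bases.
    G = np.einsum('tsij,tp,sq->pqij', W, U, V)
    return U, V, G

W = np.random.randn(64, 32, 3, 3)
U, V, G = tucker2(W, r_out=16, r_in=8)
W_hat = np.einsum('pqij,tp,sq->tsij', G, U, V)
print(W_hat.shape, np.linalg.norm(W - W_hat) / np.linalg.norm(W))
```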

HALOC: hardware-aware automatic low-rank compression for compact neural networks

J Xiao, C Zhang, Y Gong, M Yin, Y Sui… - Proceedings of the …, 2023 - ojs.aaai.org
Low-rank compression is an important strategy for obtaining compact
neural network models. In general, because the rank values directly determine the model …
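
HALOC's hardware-aware rank search is not reproduced here, but the underlying low-rank step is simple: a truncated SVD splits one weight matrix into two thinner factors, and the per-layer rank is exactly what such methods choose automatically. A minimal sketch, with illustrative shapes:

```python
import numpy as np

def low_rank_factorize(W, rank):
    """Split W (out, in) into A (out, rank) @ B (rank, in) via truncated SVD.

    Parameter count drops from out*in to rank*(out + in); picking `rank`
    per layer is the search problem that automatic methods address.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * s[:rank]  # absorb singular values into the left factor
    B = Vt[:rank]
    return A, B

W = np.random.randn(512, 1024)
A, B = low_rank_factorize(W, rank=64)
print(A.shape, B.shape, np.linalg.norm(W - A @ B) / np.linalg.norm(W))
```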

MCMC: Multi-Constrained Model Compression via One-Stage Envelope Reinforcement Learning

S Li, J Chen, S Liu, C Zhu, G Tian… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Model compression methods are being developed to bridge the gap between the massive
scale of neural networks and the limited hardware resources on edge devices. Since most …

Enhanced network compression through tensor decompositions and pruning

Y Zniyed, TP Nguyen - IEEE Transactions on Neural …, 2024 - ieeexplore.ieee.org
Network compression techniques that combine tensor decompositions and pruning have
shown promise in leveraging the advantages of both strategies. In this work, we propose …

Consecutive layer collaborative filter similarity for differentiable neural network pruning

X Zu, Y Li, B Yin - Neurocomputing, 2023 - Elsevier
Filter pruning has proven to be an effective strategy for model compression. However,
convolutional filter pruning methods usually focus exclusively on evaluating filters' importance …
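
The paper's consecutive-layer collaborative criterion is not reproduced here; as a generic illustration of similarity-based redundancy, the sketch below flags filters within a single layer that are nearly duplicated under cosine similarity (the threshold is illustrative).

```python
import torch

def redundant_filters(weight: torch.Tensor, threshold: float = 0.9):
    """Flag filters nearly duplicated by another filter in the same layer.

    weight: (out_channels, in_channels, kH, kW). A filter whose cosine
    similarity with an earlier filter exceeds `threshold` is marked redundant.
    """
    flat = weight.detach().reshape(weight.shape[0], -1)
    flat = flat / flat.norm(dim=1, keepdim=True).clamp_min(1e-12)
    sim = flat @ flat.T
    redundant = []
    for i in range(1, sim.shape[0]):
        if sim[i, :i].max() > threshold:
            redundant.append(i)
    return redundant
```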