Structured pruning for deep convolutional neural networks: A survey
The remarkable performance of deep convolutional neural networks (CNNs) is generally
attributed to their deeper and wider architectures, which can come with significant …
Learning best combination for efficient N:M sparsity
By forcing N out of M consecutive weights to be non-zero, the recent N:M fine-grained
network sparsity has received increasing attention with its two attractive advantages over …
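The N:M constraint described above can be illustrated with a minimal NumPy sketch. It assumes magnitude-based selection (keep the N largest-magnitude weights in each group of M consecutive weights), which is a common baseline rather than the combination-learning method this paper proposes:

```python
import numpy as np

def nm_prune_mask(weights, n=2, m=4):
    """Magnitude-based N:M mask: keep the N largest-|w| entries in every
    group of M consecutive weights, zero the rest.
    Assumes weights.size is divisible by m."""
    flat = weights.reshape(-1, m)                       # groups of M weights
    # indices of the (M - N) smallest-magnitude entries in each group
    drop = np.argsort(np.abs(flat), axis=1)[:, : m - n]
    mask = np.ones_like(flat)
    np.put_along_axis(mask, drop, 0.0, axis=1)          # zero the dropped slots
    return mask.reshape(weights.shape)
```

Applying `weights * nm_prune_mask(weights)` yields exactly N non-zero values per group of M, the pattern that hardware-accelerated 2:4 sparsity exploits.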
RGP: Neural network pruning through regular graph with edges swapping
Deep learning technology has found a promising application in lightweight model design, for
which pruning is an effective means of achieving a large reduction in both model parameters …
Model compression of deep neural network architectures for visual pattern recognition: Current status and future directions
S Bhalgaonkar, M Munot - Computers and Electrical Engineering, 2024 - Elsevier
Visual Pattern Recognition Networks (VPRNs) are widely used in various visual
data-based applications such as computer vision and edge AI. VPRNs help to enhance a …
Multidimensional pruning and its extension: A unified framework for model compression
Observing that the existing model compression approaches only focus on reducing the
redundancies in convolutional neural networks (CNNs) along one particular dimension (e.g., …
EACP: An effective automatic channel pruning for neural networks
The large data scale and computational resources required by Convolutional Neural
Networks (CNNs) hinder their practical application on mobile devices. However, channel …
REAF: Remembering enhancement and entropy-based asymptotic forgetting for filter pruning
Neurologically, filter pruning is a procedure of forgetting followed by remembering (recovering).
Prevailing methods directly forget less important information from a non-robust baseline at …
FPFS: Filter-level pruning via distance weight measuring filter similarity
W Zhang, Z Wang - Neurocomputing, 2022 - Elsevier
Deep Neural Networks (DNNs) benefit greatly from convolution while also bearing
huge computational pressure. Therefore, model compression techniques are used to …
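The idea of pruning by filter similarity can be sketched in a few lines of NumPy. This is a minimal illustration of distance-weighted similarity, not the exact FPFS criterion: filters whose flattened weights lie within a (hypothetical) distance threshold are treated as redundant, so one filter of each close pair is a pruning candidate:

```python
import numpy as np

def similar_filter_pairs(conv_weight, threshold):
    """Flatten each output filter of a conv layer (shape:
    out_channels x in_channels x kH x kW) and return index pairs whose
    pairwise Euclidean distance falls below the threshold."""
    f = conv_weight.reshape(conv_weight.shape[0], -1)   # one row per filter
    # full pairwise distance matrix via broadcasting
    d = np.linalg.norm(f[:, None, :] - f[None, :, :], axis=-1)
    return [(i, j) for i in range(len(f))
            for j in range(i + 1, len(f)) if d[i, j] < threshold]
```

In a real pruning pipeline one would then remove one filter from each returned pair (and the corresponding input channel of the next layer) before fine-tuning.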
Accelerating convolutional neural networks via a 2D entropy-based adaptive filter search method for image recognition
The success of CNNs for various vision tasks has been accompanied by a significant
increase in required FLOPs and parameter quantities, which has impeded the deployment of …
Co-exploring structured sparsification and low-rank tensor decomposition for compact dnns
Sparsification and low-rank decomposition are two important techniques to compress deep
neural network (DNN) models. To date, these two popular yet distinct approaches are …
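The low-rank half of this combination can be illustrated with a truncated-SVD sketch. This is a generic factorization under an assumed target rank, not the paper's joint sparsification scheme: a weight matrix is replaced by two thinner factors, saving parameters whenever rank * (out + in) < out * in:

```python
import numpy as np

def low_rank_factorize(w, rank):
    """Approximate w (out x in) by the product of two thin matrices
    a (out x rank) and b (rank x in) via truncated SVD."""
    u, s, vt = np.linalg.svd(w, full_matrices=False)
    a = u[:, :rank] * s[:rank]      # fold singular values into the left factor
    b = vt[:rank, :]
    return a, b
```

For example, a 64x128 layer factored at rank 8 stores 64*8 + 8*128 = 1,536 parameters instead of 8,192; the reconstruction error shrinks as the rank grows.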