Structured pruning for deep convolutional neural networks: A survey
The remarkable performance of deep convolutional neural networks (CNNs) is generally
attributed to their deeper and wider architectures, which can come with significant …
Survey on evolutionary deep learning: Principles, algorithms, applications, and open issues
Over recent years, there has been a rapid development of deep learning (DL) in both
industry and academia. However, finding the optimal hyperparameters of a DL model …
Continual semantic segmentation with automatic memory sample selection
Abstract Continual Semantic Segmentation (CSS) extends static semantic segmentation by
incrementally introducing new classes for training. To alleviate the catastrophic forgetting …
Automatic network pruning via Hilbert-Schmidt independence criterion Lasso under information bottleneck principle
Most existing neural network pruning methods hand-craft their importance criteria and the
structures to prune, which creates heavy and unintended dependencies on heuristics and …
Boosting zero-shot learning via contrastive optimization of attribute representations
Zero-shot learning (ZSL) aims to recognize classes that do not have samples in the training
set. One representative solution is to directly learn an embedding function associating visual …
Compressing convolutional neural networks with hierarchical Tucker-2 decomposition
Convolutional neural networks (CNNs) play a crucial role and achieve top results in
computer vision tasks, but at the cost of high computational and storage complexity. One …
HALOC: hardware-aware automatic low-rank compression for compact neural networks
Low-rank compression is an important model compression strategy for obtaining compact
neural network models. In general, because the rank values directly determine the model …
MCMC: Multi-Constrained Model Compression via One-Stage Envelope Reinforcement Learning
Model compression methods are being developed to bridge the gap between the massive
scale of neural networks and the limited hardware resources on edge devices. Since most …
Enhanced network compression through tensor decompositions and pruning
Network compression techniques that combine tensor decompositions and pruning have
shown promise in leveraging the advantages of both strategies. In this work, we propose …
Consecutive layer collaborative filter similarity for differentiable neural network pruning
X Zu, Y Li, B Yin - Neurocomputing, 2023 - Elsevier
Filter pruning has proven to be an effective strategy for model compression. However,
convolutional filter pruning methods usually focus entirely on evaluating filters' importance …