Sparse training via boosting pruning plasticity with neuroregeneration
Works on the lottery ticket hypothesis (LTH) and single-shot network pruning (SNIP) have recently attracted much attention to post-training pruning (iterative magnitude pruning) and before …
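The magnitude-pruning idea referenced above can be sketched in a few lines. This is a hypothetical minimal one-shot version in NumPy (the function name and threshold logic are illustrative, not from any of the cited papers); iterative magnitude pruning would repeat this step with retraining in between.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of weights.

    One-shot sketch: iterative magnitude pruning alternates this
    prune step with further training of the surviving weights.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # Threshold at the k-th smallest magnitude.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

w = np.array([[0.9, -0.05], [0.02, -1.2]])
pruned = magnitude_prune(w, 0.5)  # half the entries zeroed
```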
Low-rank compression of neural nets: Learning the rank of each layer
Y Idelbayev… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
Neural net compression can be achieved by approximating each layer's weight matrix by a
low-rank matrix. The real difficulty in doing this is not in training the resulting neural net …
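The per-layer approximation described in this snippet can be sketched with a truncated SVD. This is a generic illustration, not the paper's rank-learning method; as the abstract notes, the real difficulty is choosing each layer's rank, which is simply given as an argument here.

```python
import numpy as np

def low_rank_approx(W, rank):
    """Approximate a weight matrix W (m x n) by a rank-`rank` product A @ B.

    Storage drops from m*n to rank*(m + n); picking `rank` per layer
    is the hard part that rank-learning methods address.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    # Keep only the top-`rank` singular triplets.
    return U[:, :rank] * s[:rank], Vt[:rank, :]

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 4)) @ rng.standard_normal((4, 16))  # true rank 4
A, B = low_rank_approx(W, 4)
err = np.linalg.norm(W - A @ B)  # near zero: rank 4 suffices here
```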
BSQ: Exploring bit-level sparsity for mixed-precision neural network quantization
Mixed-precision quantization can potentially achieve the optimal tradeoff between performance and compression rate of deep neural networks and has therefore been widely …
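The tradeoff this snippet refers to can be illustrated with plain symmetric uniform quantization at different bit-widths. This sketch only shows why per-layer bit-widths matter; it does not implement BSQ's bit-level sparsity training, and the function name is illustrative.

```python
import numpy as np

def quantize_uniform(x, bits):
    """Symmetric uniform quantization of x to signed `bits`-bit levels.

    In a mixed-precision scheme each layer could receive a different
    `bits`, trading accuracy against compression rate.
    """
    qmax = 2 ** (bits - 1) - 1
    scale = np.abs(x).max() / qmax
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)
    return q * scale  # dequantized tensor

x = np.array([0.5, -1.0, 0.25])
x8 = quantize_uniform(x, 8)  # 8-bit: small rounding error
x2 = quantize_uniform(x, 2)  # 2-bit: coarse, larger error
```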
TRP: Trained rank pruning for efficient deep neural networks
To enable DNNs on edge devices like mobile phones, low-rank approximation has been
widely adopted because of its solid theoretical rationale and efficient implementations …
Learning low-rank deep neural networks via singular vector orthogonality regularization and singular value sparsification
Modern deep neural networks (DNNs) often incur high memory consumption and heavy computational loads. In order to deploy DNN algorithms efficiently on edge or mobile …
An effective low-rank compression with a joint rank selection followed by a compression-friendly training
Low-rank compression of a neural network is one of the popular compression techniques and is known to have two main challenges. The first challenge is determining the …
Pruning by training: A novel deep neural network compression framework for image processing
Filter pruning for a pre-trained convolutional neural network is most commonly performed through human-made constraints or criteria such as norms, ranks, etc. Typically, the pruning …
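The norm criterion mentioned in this snippet can be sketched directly: rank each convolutional filter by its L1 norm and keep the largest ones. This is a generic illustration of the "human-made criterion" being criticized, with illustrative names; it is not the paper's pruning-by-training framework.

```python
import numpy as np

def prune_filters_by_norm(conv_weight, keep_ratio):
    """Keep the top `keep_ratio` fraction of conv filters by L1 norm.

    conv_weight: array of shape (out_channels, in_channels, kH, kW).
    Returns the kept filters and their original indices.
    """
    norms = np.abs(conv_weight).sum(axis=(1, 2, 3))  # L1 norm per filter
    n_keep = max(1, int(keep_ratio * conv_weight.shape[0]))
    # Indices of the largest-norm filters, restored to original order.
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])
    return conv_weight[keep], keep

rng = np.random.default_rng(1)
w = rng.standard_normal((16, 3, 3, 3))
kept, idx = prune_filters_by_norm(w, 0.5)  # keeps 8 of 16 filters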
CSTAR: towards compact and structured deep neural networks with adversarial robustness
Model compression and model defense for deep neural networks (DNNs) have been extensively and individually studied. Considering the co-importance of model …
[BOOK][B] Low-power computer vision: improve the efficiency of artificial intelligence
Energy efficiency is critical for running computer vision on battery-powered systems, such as
mobile phones or UAVs (unmanned aerial vehicles, or drones). This book collects the …