Literature review of deep network compression
Deep networks often possess a vast number of parameters, and their significant redundancy
in parameterization has become a widely-recognized property. This presents significant …
A critical review on the state-of-the-art and future prospects of Machine Learning for Earth Observation Operations
P Miralles, K Thangavel, AF Scannapieco… - Advances in Space …, 2023 - Elsevier
The continuing Machine Learning (ML) revolution indubitably has had a significant
positive impact on the analysis of downlinked satellite data. Other aspects of the Earth …
Sequence-level knowledge distillation
Neural machine translation (NMT) offers a novel alternative formulation of translation that is
potentially simpler than statistical approaches. However, to reach competitive performance …
On compressing deep models by low rank and sparse decomposition
Deep compression refers to removing the redundancy of parameters and feature maps for
deep learning models. Low-rank approximation and pruning for sparse structures play a vital …
Scalpel: Customizing dnn pruning to the underlying hardware parallelism
As the size of Deep Neural Networks (DNNs) continues to grow to increase accuracy and
solve more complex problems, their energy footprint also scales. Weight pruning reduces …
Deepx: A software accelerator for low-power deep learning inference on mobile devices
Breakthroughs from the field of deep learning are radically changing how sensor data are
interpreted to extract the high-level information needed by mobile apps. It is critical that the …
Sparsification and separation of deep learning layers for constrained resource inference on wearables
S Bhattacharya, ND Lane - Proceedings of the 14th ACM Conference on …, 2016 - dl.acm.org
Deep learning has revolutionized the way sensor data are analyzed and interpreted. The
accuracy gains these approaches offer make them attractive for the next generation of …
Cambricon-S: Addressing irregularity in sparse neural networks through a cooperative software/hardware approach
X Zhou, Z Du, Q Guo, S Liu, C Liu… - 2018 51st Annual …, 2018 - ieeexplore.ieee.org
Neural networks have rapidly become the dominant algorithms as they achieve state-of-the-art
performance in a broad range of applications such as image recognition, speech …
Compression of deep learning models for text: A survey
In recent years, the fields of natural language processing (NLP) and information retrieval (IR)
have made tremendous progress thanks to deep learning models like Recurrent Neural …
Learning representations for neural network-based classification using the information bottleneck principle
In this theory paper, we investigate training deep neural networks (DNNs) for classification
via minimizing the information bottleneck (IB) functional. We show that the resulting …