A comprehensive survey on model compression and acceleration

T Choudhary, V Mishra, A Goswami… - Artificial Intelligence …, 2020 - Springer
In recent years, machine learning (ML) and deep learning (DL) have shown remarkable
improvement in computer vision, natural language processing, stock prediction, forecasting …

Communication-efficient edge AI: Algorithms and systems

Y Shi, K Yang, T Jiang, J Zhang… - … Surveys & Tutorials, 2020 - ieeexplore.ieee.org
Artificial intelligence (AI) has achieved remarkable breakthroughs in a wide range of fields,
ranging from speech processing, image classification to drug discovery. This is driven by the …

Exploration by random network distillation

Y Burda, H Edwards, A Storkey, O Klimov - arXiv preprint arXiv …, 2018 - arxiv.org
We introduce an exploration bonus for deep reinforcement learning methods that is easy to
implement and adds minimal overhead to the computation performed. The bonus is the error …

Large-scale study of curiosity-driven learning

Y Burda, H Edwards, D Pathak, A Storkey… - arXiv preprint arXiv …, 2018 - arxiv.org
Reinforcement learning algorithms rely on carefully engineering environment rewards that
are extrinsic to the agent. However, annotating each environment with hand-designed …

A survey of model compression and acceleration for deep neural networks

Y Cheng, D Wang, P Zhou, T Zhang - arXiv preprint arXiv:1710.09282, 2017 - arxiv.org
Deep neural networks (DNNs) have recently achieved great success in many visual
recognition tasks. However, existing deep neural network models are computationally …

Deconstructing lottery tickets: Zeros, signs, and the supermask

H Zhou, J Lan, R Liu, J Yosinski - Advances in neural …, 2019 - proceedings.neurips.cc
The recent "Lottery Ticket Hypothesis" paper by Frankle & Carbin showed that a
simple approach to creating sparse networks (keep the large weights) results in models that …

NISP: Pruning networks using neuron importance score propagation

R Yu, A Li, CF Chen, JH Lai… - Proceedings of the …, 2018 - openaccess.thecvf.com
To reduce the significant redundancy in deep Convolutional Neural Networks (CNNs), most
existing methods prune neurons by only considering the statistics of an individual layer or …

NetAdapt: Platform-aware neural network adaptation for mobile applications

TJ Yang, A Howard, B Chen, X Zhang… - Proceedings of the …, 2018 - openaccess.thecvf.com
This work proposes an algorithm, called NetAdapt, that automatically adapts a pre-trained
deep neural network to a mobile platform given a resource budget. While many existing …

EEGNet: a compact convolutional neural network for EEG-based brain–computer interfaces

VJ Lawhern, AJ Solon, NR Waytowich… - Journal of neural …, 2018 - iopscience.iop.org
Objective. Brain–computer interfaces (BCI) enable direct communication with a computer,
using neural activity as the control signal. This neural signal is generally chosen from a …

Model compression and acceleration for deep neural networks: The principles, progress, and challenges

Y Cheng, D Wang, P Zhou… - IEEE Signal Processing …, 2018 - ieeexplore.ieee.org
In recent years, deep neural networks (DNNs) have received increased attention, have been
applied to different applications, and achieved dramatic accuracy improvements in many …