A comprehensive survey on model compression and acceleration
T Choudhary, V Mishra, A Goswami… - Artificial Intelligence …, 2020 - Springer
In recent years, machine learning (ML) and deep learning (DL) have shown remarkable
improvement in computer vision, natural language processing, stock prediction, forecasting …
Communication-efficient edge AI: Algorithms and systems
Artificial intelligence (AI) has achieved remarkable breakthroughs in a wide range of fields,
ranging from speech processing, image classification to drug discovery. This is driven by the …
Exploration by random network distillation
We introduce an exploration bonus for deep reinforcement learning methods that is easy to
implement and adds minimal overhead to the computation performed. The bonus is the error …
Large-scale study of curiosity-driven learning
Reinforcement learning algorithms rely on carefully engineering environment rewards that
are extrinsic to the agent. However, annotating each environment with hand-designed …
A survey of model compression and acceleration for deep neural networks
Deep neural networks (DNNs) have recently achieved great success in many visual
recognition tasks. However, existing deep neural network models are computationally …
Deconstructing lottery tickets: Zeros, signs, and the supermask
The recent "Lottery Ticket Hypothesis" paper by Frankle & Carbin showed that a
simple approach to creating sparse networks (keep the large weights) results in models that …
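The "keep the large weights" approach this snippet refers to is magnitude pruning: zero out every weight whose absolute value falls below a threshold chosen to hit a target sparsity. A minimal NumPy sketch, with the function name, array shapes, and sparsity level as illustrative assumptions:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Keep the largest-|w| weights, zero the rest; return pruned weights and mask."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * sparsity)              # number of weights to remove
    if k == 0:
        return weights.copy(), np.ones_like(weights, dtype=bool)
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold            # True = weight survives
    return weights * mask, mask

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 4))
W_sparse, mask = magnitude_prune(W, sparsity=0.75)  # keep top 25% by magnitude
```

The binary mask is the object the "supermask" line of work studies: applied to the original (or even re-randomized) weights, it defines the sparse subnetwork.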
NISP: Pruning networks using neuron importance score propagation
To reduce the significant redundancy in deep Convolutional Neural Networks (CNNs), most
existing methods prune neurons by only considering the statistics of an individual layer or …
NetAdapt: Platform-aware neural network adaptation for mobile applications
This work proposes an algorithm, called NetAdapt, that automatically adapts a pre-trained
deep neural network to a mobile platform given a resource budget. While many existing …
EEGNet: a compact convolutional neural network for EEG-based brain–computer interfaces
Objective. Brain–computer interfaces (BCI) enable direct communication with a computer,
using neural activity as the control signal. This neural signal is generally chosen from a …
Model compression and acceleration for deep neural networks: The principles, progress, and challenges
In recent years, deep neural networks (DNNs) have received increased attention, have been
applied to different applications, and achieved dramatic accuracy improvements in many …