Deep neural network–based enhancement for image and video streaming systems: A survey and future directions
Internet-enabled smartphones and ultra-wide displays are transforming a variety of visual
applications ranging from on-demand movies and 360° videos to video-conferencing and live …
Edge intelligence: The confluence of edge computing and artificial intelligence
Along with the rapid developments in communication technologies and the surge in the use
of mobile devices, a brand-new computation paradigm, edge computing, is surging in …
SPINN: synergistic progressive inference of neural networks over device and cloud
Despite the soaring use of convolutional neural networks (CNNs) in mobile applications,
uniformly sustaining high-performance inference on mobile has been elusive due to the …
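The synergistic device–cloud split that SPINN describes can be sketched in miniature: run the early layers on-device, exit locally when an early-exit head is confident, and otherwise offload the intermediate features. This is a minimal pure-Python sketch under assumed interfaces; `device_stage`, `device_head`, and `cloud_model` are hypothetical callables, not SPINN's actual API.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def synergistic_inference(x, device_stage, device_head, cloud_model, threshold=0.8):
    """Run the early layers on-device; if the early-exit head is confident,
    answer locally, otherwise hand the intermediate features to the cloud part.
    All four callables are hypothetical stand-ins for network components."""
    feats = device_stage(x)              # cheap on-device computation
    probs = softmax(device_head(feats))  # early-exit classifier
    if max(probs) >= threshold:
        return probs.index(max(probs)), "device"  # confident: exit early
    logits = cloud_model(feats)                   # offload the remainder
    return logits.index(max(logits)), "cloud"

# A confident early exit (softmax max ~0.95) stays on-device:
pred, where = synergistic_inference(
    [0.0], lambda x: x, lambda f: [3.0, 0.0], lambda f: [0.0, 1.0])
# where == "device", pred == 0
```

The confidence threshold is the knob that trades accuracy for latency and transfer cost: raising it sends more inputs to the cloud.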
Self-distillation: Towards efficient and compact neural networks
Remarkable achievements have been obtained by deep neural networks in the last several
years. However, breakthroughs in neural network accuracy are always accompanied by …
Wavelet knowledge distillation: Towards efficient image-to-image translation
Remarkable achievements have been attained with Generative Adversarial Networks
(GANs) in image-to-image translation. However, due to the tremendous number of parameters …
Improve object detection with feature-based knowledge distillation: Towards accurate and efficient detectors
Knowledge distillation, in which a student model is trained to mimic a teacher model, has
been proven to be an effective technique for model compression and model accuracy boosting …
Student customized knowledge distillation: Bridging the gap between student and teacher
Y Zhu, Y Wang - Proceedings of the IEEE/CVF International …, 2021 - openaccess.thecvf.com
Knowledge distillation (KD) transfers the dark knowledge from cumbersome
networks (teacher) to lightweight (student) networks and expects the student to achieve …
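The teacher-to-student transfer described above can be sketched as the classic softened-softmax distillation loss. This is a pure-Python sketch of the common Hinton-style formulation (KL divergence on temperature-softened distributions, scaled by T²), not the method of any single paper listed here; the temperature value is an illustrative assumption.

```python
import math

def softened_softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    z = [l / temperature for l in logits]
    m = max(z)                               # subtract max for stability
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """Soft-target loss: KL(teacher || student) on temperature-softened
    distributions, scaled by T^2 to keep gradient magnitudes comparable."""
    p = softened_softmax(teacher_logits, temperature)  # soft teacher targets
    q = softened_softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * (math.log(pi + 1e-12) - math.log(qi + 1e-12))
             for pi, qi in zip(p, q))
    return kl * temperature ** 2

# A student that matches the teacher exactly incurs zero loss:
assert distillation_loss([2.0, 0.5], [2.0, 0.5]) < 1e-6
```

In practice this soft-target term is mixed with the ordinary cross-entropy on hard labels; the student/teacher capacity gap that the entry above addresses shows up here as teacher distributions the small student cannot closely match.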
Adapting Neural Networks at Runtime: Current Trends in At-Runtime Optimizations for Deep Learning
Adaptive optimization methods for deep learning adjust the inference task to the current
circumstances at runtime to improve the resource footprint while maintaining the model's …
Fast and robust early-exiting framework for autoregressive language models with synchronized parallel decoding
To tackle the high inference latency exhibited by autoregressive language models, previous
studies have proposed an early-exiting framework that allocates adaptive computation paths …
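The adaptive computation paths that early-exiting frameworks allocate can be sketched as a per-layer confidence check: run blocks in order and stop as soon as an exit head is confident. This is a generic pure-Python sketch, not the cited paper's synchronized parallel decoding; `blocks` and `exit_heads` are hypothetical callables standing in for transformer layers and classifiers.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def early_exit_forward(x, blocks, exit_heads, threshold=0.9):
    """Run blocks in order; after each, an exit head predicts.
    Stop as soon as the max softmax probability clears the threshold,
    or at the final block. Returns (prediction, exit depth)."""
    h = x
    for depth, (block, head) in enumerate(zip(blocks, exit_heads)):
        h = block(h)
        probs = softmax(head(h))
        if max(probs) >= threshold or depth == len(blocks) - 1:
            return probs.index(max(probs)), depth

# Only the second head is confident, so the input exits at depth 1:
blocks = [lambda h: h, lambda h: h]
heads = [lambda h: [0.0, 0.1], lambda h: [5.0, 0.0]]
pred, depth = early_exit_forward([0.0], blocks, heads)
# depth == 1, pred == 0
```

For autoregressive models the complication the entry above targets is that each token may exit at a different depth, which disrupts batched decoding and KV-cache reuse; this sketch shows only the single-input case.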