Dynamic neural networks: A survey
Dynamic neural networks are an emerging research topic in deep learning. Compared to static
models which have fixed computational graphs and parameters at the inference stage …
Early-Exit Deep Neural Network-A Comprehensive Survey
Deep neural networks (DNNs) typically have a single exit point that makes predictions by
running the entire stack of neural layers. Since not all inputs require the same amount of …
Flow: per-instance personalized federated learning
Federated learning (FL) suffers from data heterogeneity, where the diverse data distributions
across clients make it challenging to train a single global model effectively. Existing …
Meta-GF: Training dynamic-depth neural networks harmoniously
Y Sun, J Li, X Xu - European Conference on Computer Vision, 2022 - Springer
Most state-of-the-art deep neural networks use static inference graphs, which makes it
impossible for such networks to dynamically adjust the depth or width of the network …
Tiny Models are the Computational Saver for Large Models
This paper introduces TinySaver, an early-exit-like dynamic model compression approach
which employs tiny models to substitute large models adaptively. Distinct from traditional …
Fiancee: Faster inference of adversarial networks via conditional early exits
P Karpikova, E Radionova… - Proceedings of the …, 2023 - openaccess.thecvf.com
Generative DNNs are a powerful tool for image synthesis, but they are limited by their
computational load. On the other hand, given a trained model and a task, e.g. faces …
When Neural Code Completion Models Size up the Situation: Attaining Cheaper and Faster Completion through Dynamic Model Inference
Leveraging recent advancements in large language models, modern neural code
completion models have demonstrated the capability to generate highly accurate code …
Neural network developments: A detailed survey from static to dynamic models
Dynamic Neural Networks (DNNs) are an evolving research field within deep
learning (DL), offering a robust, adaptable, and efficient alternative to the conventional Static …
Efficient edge inference by selective query
A Kag, I Fedorov - International Conference on Learning …, 2023 - par.nsf.gov
Edge devices provide inference on predictive tasks to many end-users. However, deploying
deep neural networks that achieve state-of-the-art accuracy on these devices is infeasible …
Semi-HFL: semi-supervised federated learning for heterogeneous devices
In the vanilla federated learning (FL) framework, the central server distributes a globally
unified model to each client and uses labeled samples for training. However, in most cases …