Embedded knowledge distillation in depth-level dynamic neural network

Q Zhao, S Lyu, Z Zhang, TB Xu, G Cheng - arXiv preprint arXiv …, 2021 - arxiv.org
In real applications, devices with different computation resources need networks of different depths
(e.g., ResNet-18/34/50) with high accuracy. Usually, existing methods either design multiple
networks and train them independently, or construct depth-level/width-level dynamic neural
networks in which it is hard to guarantee the accuracy of each sub-net. In this article, we propose an
elegant Depth-Level Dynamic Neural Network (DDNN) that integrates different-depth sub-nets of
similar architectures. To improve the generalization of the sub-nets, we design the Embedded …
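Based only on the title and the abstract snippet above, the following is a minimal PyTorch-style sketch of the general idea of a depth-level dynamic network with knowledge distillation from the deepest sub-net into the shallower ones. It is not the authors' code; the block/head structure, the deepest-sub-net-as-teacher choice, the temperature T, and the weight alpha are all assumptions made for illustration.

# Sketch (assumptions noted above): shallower sub-nets share the leading blocks
# of the full-depth net; each exit depth has its own classifier head, and the
# deepest sub-net's logits are distilled into every shallower sub-net.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepthDynamicNet(nn.Module):
    def __init__(self, num_blocks=4, width=64, num_classes=10):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(3, width, 3, padding=1),
                                  nn.BatchNorm2d(width), nn.ReLU(inplace=True))
        self.blocks = nn.ModuleList(
            nn.Sequential(nn.Conv2d(width, width, 3, padding=1),
                          nn.BatchNorm2d(width), nn.ReLU(inplace=True))
            for _ in range(num_blocks))
        # One classifier head per exit depth so each sub-net can predict on its own.
        self.heads = nn.ModuleList(nn.Linear(width, num_classes)
                                   for _ in range(num_blocks))

    def forward(self, x):
        x = self.stem(x)
        logits = []
        for block, head in zip(self.blocks, self.heads):
            x = block(x)  # shared trunk: sub-net d reuses blocks 0..d
            logits.append(head(F.adaptive_avg_pool2d(x, 1).flatten(1)))
        return logits  # logits[d] = prediction of the (d+1)-block sub-net

def ddnn_kd_loss(logits, target, T=4.0, alpha=0.5):
    """Cross-entropy on every sub-net plus KD from the deepest sub-net (teacher)."""
    teacher = logits[-1].detach()
    loss = sum(F.cross_entropy(l, target) for l in logits)
    for l in logits[:-1]:
        loss = loss + alpha * T * T * F.kl_div(
            F.log_softmax(l / T, dim=1), F.softmax(teacher / T, dim=1),
            reduction="batchmean")
    return loss

# Usage: one forward pass yields a prediction per depth; at deployment a device
# would keep only the first d blocks and head d.
# model = DepthDynamicNet()
# logits = model(torch.randn(8, 3, 32, 32))
# loss = ddnn_kd_loss(logits, torch.randint(0, 10, (8,)))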