Modeling teacher-student techniques in deep neural networks for knowledge distillation

S Abbasi, M Hajabdollahi, N Karimi… - 2020 International Conference on Machine Vision and Image Processing (MVIP), 2020 - ieeexplore.ieee.org
Knowledge distillation (KD) is a method for transferring the knowledge of one network to another network during training. In its conventional form, a small model (the student) is trained on the soft labels produced by a larger, more complex model (the teacher). Because of the generality of this idea, KD has recently been applied in a variety of settings, such as model compression and techniques aimed at improving model accuracy. Although many KD techniques have been proposed, a model that generalizes them has been lacking. In this paper, various studies in the scope of KD are investigated and analyzed in order to build a general model for KD, through which the existing methods and techniques can be summarized. Utilizing the proposed model, different KD methods can be investigated and explored more systematically, the advantages and disadvantages of different approaches can be better understood, and new KD strategies can be developed. Using the proposed model, different KD methods are represented in an abstract view.
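The abstract describes the conventional teacher-student setup (a student trained on the teacher's soft labels) but does not state a loss function. As a minimal sketch of that conventional setup, not of this paper's proposed model, the code below implements the standard KD objective of Hinton et al. (2015): a KL-divergence term on temperature-softened teacher outputs combined with ordinary cross-entropy on ground-truth labels. The names `teacher`, `student`, `x`, `y`, and the hyperparameters `T` and `alpha` are illustrative placeholders, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Conventional KD loss: soft-label (teacher) term plus hard-label term.

    T and alpha are placeholder hyperparameters, not values from the paper.
    """
    # Soft-label term: KL divergence between the temperature-softened
    # teacher distribution and the student's log-probabilities.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients, as in Hinton et al. (2015)
    # Hard-label term: standard cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage sketch: the teacher is frozen; only the student is trained.
# `teacher`, `student`, `x`, `y` are hypothetical models and batch tensors.
# with torch.no_grad():
#     teacher_logits = teacher(x)
# loss = distillation_loss(student(x), teacher_logits, y)
# loss.backward()
```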