Darkrank: Accelerating deep metric learning via cross sample similarities transfer

Y Chen, N Wang, Z Zhang - Proceedings of the AAAI conference on …, 2018 - ojs.aaai.org
Abstract
We have witnessed a rapid evolution of deep neural network architecture design in the past years. This progress greatly facilitates developments in various areas such as computer vision and natural language processing. However, along with their extraordinary performance, these state-of-the-art models also bring expensive computational costs. Directly deploying these models into applications with real-time requirements is still infeasible. Recently, Hinton et al. have shown that the dark knowledge within a powerful teacher model can significantly help the training of a smaller and faster student network. This knowledge is vastly beneficial for improving the generalization ability of the student model. Inspired by their work, we introduce a new type of knowledge---cross-sample similarities---for model compression and acceleration. This knowledge can be naturally derived from a deep metric learning model. To transfer it, we bring the "learning to rank" technique into the deep metric learning formulation. We test our proposed DarkRank method on various metric learning tasks including pedestrian re-identification, image retrieval, and image clustering. The results are quite encouraging: our method improves over the baseline method by a large margin. Moreover, it is fully compatible with other existing methods; when combined, the performance can be further boosted.
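The core idea, transferring cross-sample similarities by treating the teacher's similarity scores as a soft ranking target, can be illustrated with a minimal sketch. This is not the paper's exact formulation; it assumes a ListNet-style top-one probability loss over negative squared Euclidean distances between an anchor and the other samples in a batch, with all function names chosen here for illustration:

```python
import numpy as np

def anchor_similarities(embeddings):
    """Similarities between the anchor (row 0) and the remaining samples,
    measured as negative squared Euclidean distance (higher = more similar)."""
    anchor, rest = embeddings[0], embeddings[1:]
    return -np.sum((rest - anchor) ** 2, axis=1)

def listwise_rank_loss(teacher_emb, student_emb):
    """ListNet-style distillation loss (illustrative, not the paper's exact
    objective): cross-entropy between the teacher's and the student's
    softmax distributions over cross-sample similarities."""
    t = anchor_similarities(teacher_emb)
    s = anchor_similarities(student_emb)
    # Numerically stable softmax of teacher scores.
    p_t = np.exp(t - t.max())
    p_t /= p_t.sum()
    # Numerically stable log-softmax of student scores.
    log_p_s = (s - s.max()) - np.log(np.exp(s - s.max()).sum())
    return -np.sum(p_t * log_p_s)

# Example: the loss is minimized (down to the teacher's own entropy)
# when the student reproduces the teacher's similarity ranking.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(5, 8))   # 5 samples, 8-dim teacher embeddings
student = rng.normal(size=(5, 4))   # smaller 4-dim student embeddings
loss = listwise_rank_loss(teacher, student)
```

By Gibbs' inequality, this cross-entropy is bounded below by the entropy of the teacher's distribution, so a student whose embeddings induce the same similarity distribution as the teacher attains the minimum.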