Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks
In recent years, deep neural models have been successful in almost every field, even on
the most complex problem statements. However, these models are huge in size, with …
Factorizing knowledge in neural networks
In this paper, we explore a novel and ambitious knowledge-transfer task, termed Knowledge
Factorization (KF). The core idea of KF lies in the modularization and assemblability of …
Knowledge distillation: A survey
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …
One-for-all: Bridge the gap between heterogeneous architectures in knowledge distillation
Knowledge distillation (KD) has proven to be a highly effective approach for
enhancing model performance through a teacher-student training scheme. However, most …
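The teacher-student training scheme referred to above typically combines a temperature-softened KL term against the teacher's outputs with the usual supervised loss. The sketch below shows this canonical distillation loss; the temperature T and mixing weight alpha are illustrative hyperparameters, not values taken from any of the papers listed here.

```python
# Minimal sketch of a canonical teacher-student distillation loss:
# temperature-softened KL divergence plus supervised cross-entropy.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soften both distributions with temperature T; the KL term is scaled
    # by T^2 so its gradients stay comparable to the hard-label term.
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Usage: the teacher is frozen; only the student receives gradients.
# with torch.no_grad():
#     teacher_logits = teacher(x)
# loss = distillation_loss(student(x), teacher_logits, y)
```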
Distilling knowledge from graph convolutional networks
Existing knowledge distillation methods focus on convolutional neural networks (CNNs),
where the input samples like images lie in a grid domain, and have largely overlooked …
Explainable AI in deep reinforcement learning models for power system emergency control
Artificial intelligence (AI) techniques are increasingly used to support the analysis
and control of complex, time-varying power systems. Although deep reinforcement …
Explainable AI in deep reinforcement learning models: A shap method applied in power system emergency control
Applications of artificial intelligence (AI) systems are increasingly widespread, and
explainable AI (XAI) techniques are used to explain why machine learning (ML) models make certain …
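As a rough illustration of the SHAP workflow named in the title above, the following is a minimal, hypothetical sketch using the shap library's KernelExplainer; the background data, the stand-in prediction function, and the array shapes are placeholders and are not taken from the paper.

```python
# Hypothetical sketch: attributing a model's output to its input features with SHAP.
import numpy as np
import shap

background = np.random.rand(50, 8)   # placeholder background states
samples = np.random.rand(5, 8)       # placeholder states to explain

# Stand-in for a trained policy/value function; returns one scalar per sample.
model_predict = lambda X: X.sum(axis=1)

explainer = shap.KernelExplainer(model_predict, background)
shap_values = explainer.shap_values(samples)
# shap_values[i, j] attributes the output for sample i to input feature j.
```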
Industrial cyber-physical systems-based cloud IoT edge for federated heterogeneous distillation
Deep convolutional networks have been widely deployed in modern cyber-physical
systems performing different visual classification tasks. As the fog and edge devices have …
Transferring inductive biases through knowledge distillation
Having the right inductive biases can be crucial in many tasks or scenarios where data or
computing resources are a limiting factor, or where training data is not perfectly …
Customizing student networks from heterogeneous teachers via adaptive knowledge amalgamation
A massive number of well-trained deep networks have been released by developers online.
These networks may focus on different tasks and in many cases are optimized for different …