DE-DFKD: diversity-enhancing data-free knowledge distillation

Y Liu, A Ye, Q Chen, Y Zhang, J Chen - Multimedia Tools and Applications, 2024 - Springer
Abstract Data-Free Knowledge Distillation (DFKD) can train student networks on synthetic data when the original dataset of the teacher network is not accessible. However …

InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation

J Huang, W He, L Gou, L Ren, C Bryan - arXiv preprint arXiv:2406.17838, 2024 - arxiv.org
The emergence of large-scale pre-trained models has heightened their application in
various downstream tasks, yet deployment is a challenge in environments with limited …

[PDF][PDF] Sparse feature image classification network with spatial position correction
 
W Jiang, C Chen, S Zhang - Opto-Electronic Engineering, 2024 - researching.cn
To sparsify semantics, strengthen attention to key features, enhance the correlation between spatial positions and local features, and constrain the spatial positions of features, this paper proposes a sparse feature image classification network with spatial position correction (SSCNet). The network is based on ResNet-34 …

Sparse feature image classification network with spatial position correction

J Wentao, C Chen, Z Shengchong - Opto-Electronic Engineering, 2024 - oejournal.org
To sparsify semantics, strengthen attention to key features, enhance the correlation between spatial and local features, and constrain the spatial positions of features, this paper …

Explain, Simplify, and Integrate Artificial Intelligence with Visual Analytics

J Huang - 2024 - search.proquest.com
Artificial Intelligence (AI) technology has advanced significantly, enabling AI models to learn
from diverse data and automate tasks previously performed solely by humans. This …

DFED: Data-Free Ensemble Distillation with Multi-Source GANs for Heterogeneous Federated Learning

J Liu, Y Gao - openreview.net
Federated Learning (FL) is a decentralized machine learning paradigm that enables clients
to collaboratively train models while preserving data privacy. However, surmounting the …

Continual learning for recommender systems reflecting changes in user preferences

J Jung, H Kim, Y Seo - KIISE Transactions on Computing Practices, 2024 - dbpia.co.kr
Research on continual learning for recommender systems, like work in computer vision and natural language processing, has focused on improving stability with the goal of mitigating catastrophic forgetting. However, continual learning that focuses only on stability …