S. Zhang, H. Liu, J. E. Hopcroft, K. He. Class-aware Information for Logit-based Knowledge Distillation. arXiv preprint arXiv:2211.14773, 2022.
S. Zhang, H. Liu, K. He. Knowledge Distillation via Token-Level Relationship Graph Based on the Big Data Technologies. Big Data Research 36, 100438, 2024.
S. Zhang, H. Liu, K. He. Knowledge Distillation via Token-level Relationship Graph. arXiv preprint arXiv:2306.12442, 2023.
J. Chen, H. Liu, J. E. Hopcroft, K. He. Leveraging Contrastive Learning for Enhanced Node Representations in Tokenized Graph Transformers. arXiv preprint arXiv:2406.19258, 2024.
S. Zhang, H. Liu, S. Lin, K. He. You Only Need Less Attention at Each Stage in Vision Transformers. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024.