Hanpeng Liu
Verified email at hust.edu.cn
Title / Cited by / Year
Class-aware Information for Logit-based Knowledge Distillation
S Zhang, H Liu, JE Hopcroft, K He
arXiv preprint arXiv:2211.14773, 2022
Cited by: 2 · Year: 2022
Knowledge Distillation via Token-Level Relationship Graph Based on the Big Data Technologies
S Zhang, H Liu, K He
Big Data Research 36, 100438, 2024
Cited by: 1 · Year: 2024
Knowledge Distillation via Token-level Relationship Graph
S Zhang, H Liu, K He
arXiv preprint arXiv:2306.12442, 2023
Cited by: 1 · Year: 2023
Leveraging Contrastive Learning for Enhanced Node Representations in Tokenized Graph Transformers
J Chen, H Liu, JE Hopcroft, K He
arXiv preprint arXiv:2406.19258, 2024
Year: 2024
You Only Need Less Attention at Each Stage in Vision Transformers
S Zhang, H Liu, S Lin, K He
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2024
Year: 2024
Articles 1–5