Shitong Shao
The Hong Kong University of Science and Technology (Guangzhou)
Verified email at connect.hkust-gz.edu.cn
Title · Cited by · Year
Bootstrap Generalization Ability from Loss Landscape Perspective
H Chen, S Shao, Z Wang, Z Shang, J Chen, X Ji, X Wu
ECCV workshop, 500-517, 2022
Cited by 17 · 2022
Catch-up distillation: You only need to train once for accelerating sampling
S Shao, X Dai, S Yin, L Li, H Chen, Y Hu
arXiv preprint arXiv:2305.10769, 2023
Cited by 14 · 2023
A Bi-Stream hybrid model with MLPBlocks and self-attention mechanism for EEG-based emotion recognition
W Li, Y Tian, B Hou, J Dong, S Shao, A Song
Biomedical Signal Processing and Control 86, 105223, 2023
Cited by 11 · 2023
Generalized Large-Scale Data Condensation via Various Backbone and Statistical Matching
S Shao, Z Yin, M Zhou, X Zhang, Z Shen
CVPR highlight, 2023
Cited by 8 · 2023
DiffuseExpand: Expanding dataset for 2D medical image segmentation using diffusion models
S Shao, X Yuan, Z Huang, Z Qiu, S Wang, K Zhou
IJCAI workshop, 2023
Cited by 8 · 2023
Your diffusion model is secretly a certifiably robust classifier
H Chen, Y Dong, S Shao, Z Hao, X Yang, H Su, J Zhu
arXiv preprint arXiv:2402.02316, 2024
Cited by 7 · 2024
BiSMSM: A Hybrid MLP-Based Model of Global Self-Attention Processes for EEG-Based Emotion Recognition
W Li, Y Tian, B Hou, J Dong, S Shao
ICANN, 37-48, 2022
Cited by 7 · 2022
What Role Does Data Augmentation Play in Knowledge Distillation?
W Li, S Shao, W Liu, Z Qiu, Z Zhu, W Huan
ACCV, 2204-2220, 2022
Cited by 7 · 2022
Attention-based intrinsic reward mixing network for credit assignment in multi-agent reinforcement learning
W Li, W Liu, S Shao, S Huang, A Song
IEEE Transactions on Games, 2023
Cited by 6 · 2023
Hybrid knowledge distillation from intermediate layers for efficient Single Image Super-Resolution
J Xie, L Gong, S Shao, S Lin, L Luo
Neurocomputing 554, 126592, 2023
Cited by 4 · 2023
MS-FRAN: a novel multi-source domain adaptation method for EEG-based emotion recognition
W Li, W Huan, S Shao, B Hou, A Song
IEEE Journal of Biomedical and Health Informatics, 2023
Cited by 3 · 2023
AIIR-MIX: Multi-Agent Reinforcement Learning Meets Attention Individual Intrinsic Reward Mixing Network
W Li, W Liu, S Shao, S Huang
ACML, 579-594, 2023
Cited by 3 · 2023
Teaching What You Should Teach: A Data-Based Distillation Method
S Shao, H Chen, Z Huang, L Gong, S Wang, X Wu
IJCAI, 2022
Cited by 3 · 2022
Multi-perspective analysis on data augmentation in knowledge distillation
W Li, S Shao, Z Qiu, A Song
Neurocomputing 583, 127516, 2024
Cited by 2 · 2024
Self-supervised Dataset Distillation: A Good Compression Is All You Need
M Zhou, Z Yin, S Shao, Z Shen
arXiv preprint arXiv:2404.07976, 2024
Cited by 2 · 2024
Black-box Source-free Domain Adaptation via Two-stage Knowledge Distillation
S Wang, D Zhang, Z Yan, S Shao, R Li
IJCAI workshop, 2023
Cited by 2 · 2023
Elucidating the Design Space of Dataset Condensation
S Shao, Z Zhou, H Chen, Z Shen
arXiv preprint arXiv:2404.13733, 2024
Cited by 1 · 2024
Precise Knowledge Transfer via Flow Matching
S Shao, Z Shen, L Gong, H Chen, X Dai
arXiv preprint arXiv:2402.02012, 2024
Cited by 1 · 2024
Spatial-Temporal Constraint Learning for Cross-Subject EEG-Based Emotion Recognition
W Li, B Hou, S Shao, W Huan, Y Tian
IJCNN, 1-8, 2023
Cited by 1 · 2023
Generalized Contrastive Partial Label Learning for Cross-Subject EEG-Based Emotion Recognition
W Li, L Fan, S Shao, A Song
IEEE Transactions on Instrumentation and Measurement, 2024
2024
Articles 1–20