Kuluhan Binici
Verified email at comp.nus.edu.sg
Title
Cited by
Year
Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay
K Binici, S Aggarwal, NT Pham, K Leman, T Mitra
Proceedings of the AAAI Conference on Artificial Intelligence 36 (6), 6089-6096, 2022
49 · 2022
Preventing catastrophic forgetting and distribution mismatch in knowledge distillation via synthetic data
K Binici, NT Pham, T Mitra, K Leman
Proceedings of the IEEE/CVF winter conference on applications of computer …, 2022
44 · 2022
Chameleon: Dual memory replay for online continual learning on edge devices
S Aggarwal, K Binici, T Mitra
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 2023
6 · 2023
Visual-Policy Learning Through Multi-Camera View to Single-Camera View Knowledge Distillation for Robot Manipulation Tasks
C Acar, K Binici, A Tekirdağ, Y Wu
IEEE Robotics and Automation Letters 9 (1), 691-698, 2023
2 · 2023
Condensed Sample-Guided Model Inversion for Knowledge Distillation
K Binici, S Aggarwal, C Acar, NT Pham, K Leman, GH Lee, T Mitra
arXiv preprint arXiv:2408.13850, 2024
1 · 2024
Generalizing teacher networks for effective knowledge distillation across student architectures
K Binici, W Wu, T Mitra
arXiv preprint arXiv:2407.16040, 2024
1 · 2024
MEDSAGE: Enhancing Robustness of Medical Dialogue Summarization to ASR Errors with LLM-generated Synthetic Dialogues
K Binici, AR Kashyap, V Schlegel, AT Liu, VP Dwivedi, TT Nguyen, X Gao, ...
arXiv preprint arXiv:2408.14418, 2024
2024
LLMs are not Zero-Shot Reasoners for Biomedical Information Extraction
A Nagar, V Schlegel, TT Nguyen, H Li, Y Wu, K Binici, S Winkler
arXiv preprint arXiv:2408.12249, 2024
2024
CRISP: Hybrid Structured Sparsity for Class-aware Model Pruning
S Aggarwal, K Binici, T Mitra
2024 Design, Automation & Test in Europe Conference & Exhibition (DATE), 1-6, 2024
2024
Articles 1–9