Like Hui
CSE, UCSD
Verified email at ucsd.edu
Title · Cited by · Year
Evaluation of neural architectures trained with square loss vs cross-entropy in classification tasks
L Hui, M Belkin
arXiv preprint arXiv:2006.07322, 2020
Cited by 180 · 2020
Limitations of neural collapse for understanding generalization in deep learning
L Hui, M Belkin, P Nakkiran
arXiv preprint arXiv:2202.08384, 2022
Cited by 58 · 2022
Joint training of complex ratio mask based beamformer and acoustic model for noise robust asr
Y Xu, C Weng, L Hui, J Liu, M Yu, D Su, D Yu
ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and …, 2019
Cited by 44 · 2019
Convolutional maxout neural networks for speech separation
L Hui, M Cai, C Guo, L He, WQ Zhang, J Liu
2015 IEEE international symposium on signal processing and information …, 2015
Cited by 44 · 2015
High-performance Swahili keyword search with very limited language pack: The THUEE system for the OpenKWS15 evaluation
M Cai, Z Lv, C Lu, J Kang, L Hui, Z Zhang, J Liu
2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU …, 2015
Cited by 14 · 2015
Kernel Machines Beat Deep Neural Networks on Mask-based Single-channel Speech Enhancement
L Hui, S Ma, M Belkin
INTERSPEECH 2019, 2019
Cited by 12 · 2019
Cut your losses with squentropy
L Hui, M Belkin, S Wright
International Conference on Machine Learning, 14114-14131, 2023
Cited by 7 · 2023
Evaluation of neural architectures trained with square loss vs cross-entropy in classification tasks, 2020
L Hui, M Belkin
URL https://arxiv.org/abs, 2020
Cited by 7* · 2020
A speech enhancement algorithm using computational auditory scene analysis with spectral subtraction
C Guo, L Hui, WQ Zhang, J Liu
2016 IEEE International Symposium on Signal Processing and Information …, 2016
Cited by 5 · 2016
ReLU soothes the NTK condition number and accelerates optimization for wide neural networks
C Liu, L Hui
arXiv preprint arXiv:2305.08813, 2023
Cited by 4 · 2023
Improved system fusion for keyword search
Z Lv, M Cai, C Lu, J Kang, L Hui, WQ Zhang, J Liu
2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU …, 2015
Cited by 3 · 2015
Surprising Empirical Phenomena of Deep Learning and Kernel Machines
L Hui
University of California, San Diego, 2023
2023
ReLU soothes NTK conditioning and accelerates optimization for wide neural networks
C Liu, L Hui, X Liu
Articles 1–13