Lyan Verwimp
Verified email at esat.kuleuven.be - Homepage
Title · Cited by · Year
Character-Word LSTM Language Models
L Verwimp, J Pelemans, H Van hamme, P Wambacq
European Chapter of the Association for Computational Linguistics (EACL …, 2017
Cited by: 65 (2017)
Definite il y a-clefts in spoken French
L Verwimp, K Lahousse
Journal of French Language Studies, 2016
Cited by: 21 (2016)
Improving the translation environment for professional translators
V Vandeghinste, T Vanallemeersch, L Augustinus, B Bulté, F Van Eynde, ...
Informatics 6 (2), 24, 2019
Cited by: 11 (2019)
A comparison of different punctuation prediction approaches in a translation context
V Vandeghinste, L Verwimp, J Pelemans, P Wambacq
European Association for Machine Translation, 2018
Cited by: 11 (2018)
Analyzing the Contribution of Top-Down Lexical and Bottom-Up Acoustic Cues in the Detection of Sentence Prominence.
S Kakouros, J Pelemans, L Verwimp, P Wambacq, O Räsänen
Interspeech, 1074-1078, 2016
Cited by: 11 (2016)
Error-driven pruning of language models for virtual assistants
S Gondala, L Verwimp, E Pusateri, M Tsagkias, C Van Gysel
ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and …, 2021
Cited by: 10 (2021)
State Gradients for Analyzing Memory in LSTM Language Models
L Verwimp, H Van hamme, P Wambacq
Computer Speech & Language, 101034, 2019
Cited by: 9 (2019)
TF-LM: TensorFlow-based Language Modeling Toolkit
L Verwimp, H Van hamme, P Wambacq
Proceedings of the Language Resources and Evaluation Conference (LREC), 2018
Cited by: 9 (2018)
Van hamme, H., and Wambacq, P. (2019)
L Verwimp, J Pelemans
TF-LM: TensorFlow-based language modeling toolkit. In http://www.lrec-conf …
Cited by: 6
Reverse Transfer Learning: Can Word Embeddings Trained for Different NLP Tasks Improve Neural Language Models?
L Verwimp, JR Bellegarda
Interspeech 2019, 2019
Cited by: 5 (2019)
Optimizing bilingual neural transducer with synthetic code-switching text generation
T Nguyen, N Tran, L Deng, TF da Silva, M Radzihovsky, R Hsiao, ...
arXiv preprint arXiv:2210.12214, 2022
Cited by: 4 (2022)
State Gradients for RNN Memory Analysis
L Verwimp, H Van hamme, V Renkens, P Wambacq
Interspeech 2018, 1467-1471, 2018
Cited by: 4 (2018)
Language model adaptation for ASR of spoken translations using phrase-based translation models and named entity models
J Pelemans, T Vanallemeersch, K Demuynck, L Verwimp, P Wambacq
2016 IEEE International Conference on Acoustics, Speech and Signal …, 2016
Cited by: 4 (2016)
STON: Efficient Subtitling in Dutch Using State-of-the-Art Tools
L Verwimp, B Desplanques, K Demuynck, J Pelemans, M Lycke, ...
Interspeech 2016, 780-781, 2016
Cited by: 4 (2016)
Information-Weighted Neural Cache Language Models for ASR
L Verwimp, J Pelemans, H Van hamme, P Wambacq
IEEE Workshop on Spoken Language Technology (SLT), 2018
Cited by: 3 (2018)
Domain adaptation for LSTM language models
W Boes, R Van Rompaey, J Pelemans, L Verwimp, P Wambacq
Book of abstracts CLIN27, 2017
Cited by: 3 (2017)
Application-agnostic language modeling for on-device ASR
M Nußbaum-Thom, L Verwimp, Y Oualil
arXiv preprint arXiv:2305.09764, 2023
Cited by: 2 (2023)
Smart Computer-Aided Translation Environment (SCATE): Highlights
V Vandeghinste, T Vanallemeersch, B Bulté, L Augustinus, F Van Eynde, ...
Cited by: 2 (2018)
Expanding n-gram training data for language models based on morpho-syntactic transformations
L Verwimp, J Pelemans, H Van hamme, P Wambacq
Computational Linguistics in the Netherlands Journal 5, 49-64, 2015
Cited by: 2 (2015)
Language Models of Spoken Dutch
L Verwimp, J Pelemans, M Lycke, H Van hamme, P Wambacq
arXiv preprint arXiv:1709.03759, 2017
Cited by: 1 (2017)
Articles 1–20