Fast nearest neighbor machine translation

Y Meng, X Li, X Zheng, F Wu, X Sun, T Zhang… - arXiv preprint arXiv …, 2021 - arxiv.org
Though nearest neighbor Machine Translation ($k$NN-MT) \citep{khandelwal2020nearest}
has proved to introduce significant performance boosts over standard neural MT systems, it …

MetaMT, a meta learning method leveraging multiple domain data for low resource machine translation

R Li, X Wang, H Yu - Proceedings of the AAAI Conference on Artificial …, 2020 - aaai.org
Neural machine translation (NMT) models have achieved state-of-the-art translation quality
with a large quantity of parallel corpora available. However, their performance suffers …

The future of human-artificial intelligence nexus and its environmental costs

P Spelda, V Stritecky - Futures, 2020 - Elsevier
The environmental costs and energy constraints have become emerging issues for the
future development of Machine Learning (ML) and Artificial Intelligence (AI). So far, the …

Improving robustness and generality of NLP models using disentangled representations

J Wu, X Li, X Ao, Y Meng, F Wu, J Li - arXiv preprint arXiv:2009.09587, 2020 - arxiv.org
Supervised neural networks, which first map an input $x$ to a single representation $z$,
and then map $z$ to the output label $y$, have achieved remarkable success in a wide …

[CITATION][C] Ko-SenseBERT: A semantically enhanced language model based on the Korean lexical map (UWordMap)

이현민, 이건희, 나승훈, 옥철영 - Proceedings of the Korea Computer Congress (KIISE), 2022 - dbpia.co.kr
Abstract: The Transformer is a neural network that learns context and meaning by tracking
relationships in sequential data via the self-attention mechanism. In natural language processing, the Transformer has achieved groundbreaking performance …