ChatGPT beyond English: Towards a comprehensive evaluation of large language models in multilingual learning

VD Lai, NT Ngo, APB Veyseh, H Man… - arXiv preprint arXiv …, 2023 - arxiv.org
Over the last few years, large language models (LLMs) have emerged as the most important
breakthroughs in natural language processing (NLP) that fundamentally transform research …

Research progress in the application of generative adversarial networks across domains

刘建伟, 谢浩杰, 罗雄麟 - 自动化学报, 2020 - aas.net.cn
With the rapid development of deep learning, generative models have also made remarkable progress. The generative
adversarial network (GAN) is an unsupervised learning method built on the two-player zero-sum game from game theory …
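For reference, the two-player zero-sum game alluded to in the snippet is the standard GAN minimax objective (the textbook formulation, not a result specific to this survey):

    \min_G \max_D \; \mathbb{E}_{x \sim p_{\mathrm{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]

where the discriminator D tries to separate real samples x from generated samples G(z), and the generator G tries to make the two indistinguishable.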

XNLI: Evaluating cross-lingual sentence representations

A Conneau, G Lample, R Rinott, A Williams… - arXiv preprint arXiv …, 2018 - arxiv.org
State-of-the-art natural language processing systems rely on supervision in the form of
annotated data to learn competent models. These models are generally trained on data in a …

A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings

M Artetxe, G Labaka, E Agirre - arXiv preprint arXiv:1805.06297, 2018 - arxiv.org
Recent work has managed to learn cross-lingual word embeddings without parallel data by
mapping monolingual embeddings to a shared space through adversarial training …
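To make the mapping idea concrete, below is a minimal numpy sketch of the general map-then-induce self-learning loop: fit an orthogonal map on a seed dictionary (orthogonal Procrustes), induce a new dictionary by nearest neighbours, and repeat. This illustrates the general scheme rather than the paper's exact algorithm (which starts from a tiny numeral-based dictionary and adds several refinements), and it assumes the embedding matrices have unit-normalised rows.

    import numpy as np

    def procrustes(X, Y):
        # Orthogonal W minimising ||X @ W - Y||_F (rows are word vectors).
        U, _, Vt = np.linalg.svd(X.T @ Y)
        return U @ Vt

    def self_learning(src_emb, tgt_emb, seed_src, seed_tgt, n_iter=5):
        # seed_src / seed_tgt: index arrays forming an initial (possibly tiny) dictionary.
        src_idx, tgt_idx = np.asarray(seed_src), np.asarray(seed_tgt)
        for _ in range(n_iter):
            W = procrustes(src_emb[src_idx], tgt_emb[tgt_idx])  # re-fit the map on the current dictionary
            sims = (src_emb @ W) @ tgt_emb.T                    # cosine similarities (rows unit-normalised)
            tgt_idx = sims.argmax(axis=1)                       # nearest target word for every source word
            src_idx = np.arange(len(src_emb))                   # induced dictionary now covers the whole source vocab
        return W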

A survey of cross-lingual word embedding models

S Ruder, I Vulić, A Søgaard - Journal of Artificial Intelligence Research, 2019 - jair.org
Cross-lingual representations of words enable us to reason about word meaning in
multilingual contexts and are a key facilitator of cross-lingual transfer when developing …

Learning bilingual word embeddings with (almost) no bilingual data

M Artetxe, G Labaka, E Agirre - … of the 55th Annual Meeting of the …, 2017 - aclanthology.org
Most methods to learn bilingual word embeddings rely on large parallel corpora, which are
difficult to obtain for most language pairs. This has motivated an active research line to relax …

Gromov-Wasserstein alignment of word embedding spaces

D Alvarez-Melis, TS Jaakkola - arXiv preprint arXiv:1809.00013, 2018 - arxiv.org
Cross-lingual or cross-domain correspondences play key roles in tasks ranging from
machine translation to transfer learning. Recently, purely unsupervised methods operating …
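The core idea here, matching the internal geometry of two embedding spaces rather than the vectors themselves, can be sketched with the POT (Python Optimal Transport) library. This is a plain Gromov-Wasserstein solve over uniform word weights, not the paper's normalized or regularized formulation, and the dense coupling limits it to small vocabularies.

    import ot  # POT: Python Optimal Transport

    def gw_align(src_emb, tgt_emb):
        C1 = ot.dist(src_emb, src_emb)   # pairwise distances inside the source space
        C2 = ot.dist(tgt_emb, tgt_emb)   # pairwise distances inside the target space
        C1 /= C1.max()
        C2 /= C2.max()
        p = ot.unif(len(src_emb))        # uniform mass over source words
        q = ot.unif(len(tgt_emb))        # uniform mass over target words
        # coupling[i, j] is (soft) evidence that source word i corresponds to target word j
        return ot.gromov.gromov_wasserstein(C1, C2, p, q, 'square_loss')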

Generalizing and improving bilingual word embedding mappings with a multi-step framework of linear transformations

M Artetxe, G Labaka, E Agirre - Proceedings of the AAAI Conference on …, 2018 - ojs.aaai.org
Using a dictionary to map independently trained word embeddings to a shared space has
been shown to be an effective approach to learn bilingual word embeddings. In this work, we …

Adversarial training for unsupervised bilingual lexicon induction

M Zhang, Y Liu, H Luan, M Sun - … of the 55th Annual Meeting of …, 2017 - aclanthology.org
Word embeddings are well known to capture linguistic regularities of the language on which
they are trained. Researchers also observe that these regularities can transfer across …
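A generic adversarial-mapping skeleton along these lines, in PyTorch: a linear map W plays the generator, pushing source embeddings into the target space, while a small classifier D tries to tell mapped source vectors from real target vectors. The dimension, layer sizes and learning rates below are placeholders, not the authors' configuration.

    import torch
    import torch.nn as nn

    d = 300                                     # embedding dimension (placeholder)
    W = nn.Linear(d, d, bias=False)             # "generator": linear map, source -> target space
    D = nn.Sequential(nn.Linear(d, 512),        # discriminator: mapped-source vs. real-target
                      nn.LeakyReLU(0.2),
                      nn.Linear(512, 1))
    opt_w = torch.optim.SGD(W.parameters(), lr=0.1)
    opt_d = torch.optim.SGD(D.parameters(), lr=0.1)
    bce = nn.BCEWithLogitsLoss()

    def adversarial_step(src_batch, tgt_batch):
        # Discriminator update: label W(src) as fake (0) and tgt as real (1).
        opt_d.zero_grad()
        fake, real = W(src_batch).detach(), tgt_batch
        loss_d = bce(D(fake), torch.zeros(len(fake), 1)) + \
                 bce(D(real), torch.ones(len(real), 1))
        loss_d.backward()
        opt_d.step()
        # Mapping update: train W so that D labels W(src) as real.
        opt_w.zero_grad()
        loss_w = bce(D(W(src_batch)), torch.ones(len(src_batch), 1))
        loss_w.backward()
        opt_w.step()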

Neural cross-lingual named entity recognition with minimal resources

J Xie, Z Yang, G Neubig, NA Smith… - arXiv preprint arXiv …, 2018 - arxiv.org
For languages with no annotated resources, unsupervised transfer of natural language
processing models such as named-entity recognition (NER) from resource-rich languages …