Improving neural machine translation for low resource Algerian dialect by transductive transfer learning strategy
A Slim, A Melouah, U Faghihi, K Sahib - Arabian Journal for Science and …, 2022 - Springer
This study is the first work on a transductive transfer learning approach for low-resource neural machine translation applied to the Algerian Arabic dialect. The transductive approach …
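The snippet does not show the paper's exact pipeline, but the core transfer-learning recipe it builds on can be sketched as fine-tuning a model pretrained on a high-resource pair with a small dialect corpus. The checkpoint name and the toy sentence pairs below are assumptions for illustration only.

```python
# Minimal sketch of transfer learning for low-resource NMT: start from a model
# pretrained on a high-resource pair and continue training on a tiny dialect
# corpus. Checkpoint and data are illustrative assumptions, not the paper's setup.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "Helsinki-NLP/opus-mt-ar-fr"  # assumed parent model (MSA -> French)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Placeholder dialect sentence pairs.
pairs = [
    ("wech rak?", "Comment vas-tu ?"),
    ("saha ftourek", "Bon appétit"),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
model.train()
for epoch in range(3):
    for src, tgt in pairs:
        batch = tokenizer(src, text_target=tgt, return_tensors="pt")
        loss = model(**batch).loss          # cross-entropy on the dialect pair
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```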
LMSanitator: Defending prompt-tuning against task-agnostic backdoors
Prompt-tuning has emerged as an attractive paradigm for deploying large-scale language models due to its strong downstream task performance and efficient multitask serving ability …
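For context on the paradigm being defended here, a minimal soft prompt-tuning sketch is given below: the backbone model is frozen and only a small matrix of prompt embeddings is trained per task. The model name, prompt length, and helper function are illustrative assumptions.

```python
# Minimal sketch of soft prompt-tuning: freeze the shared LM and learn only a
# small set of prompt embeddings prepended to the input. Sizes are assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

backbone = AutoModel.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
for p in backbone.parameters():          # freeze the shared backbone
    p.requires_grad = False

n_prompt, hidden = 20, backbone.config.hidden_size
soft_prompt = nn.Parameter(torch.randn(n_prompt, hidden) * 0.02)  # trainable

def encode_with_prompt(texts):
    enc = tokenizer(texts, return_tensors="pt", padding=True)
    tok_emb = backbone.get_input_embeddings()(enc["input_ids"])    # (B, T, H)
    prompt = soft_prompt.unsqueeze(0).expand(tok_emb.size(0), -1, -1)
    inputs_embeds = torch.cat([prompt, tok_emb], dim=1)            # prepend prompts
    mask = torch.cat(
        [torch.ones(tok_emb.size(0), n_prompt, dtype=torch.long),
         enc["attention_mask"]], dim=1)
    return backbone(inputs_embeds=inputs_embeds, attention_mask=mask)

out = encode_with_prompt(["prompt-tuning keeps the backbone frozen"])
```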
Efficient joint learning for clinical named entity recognition and relation extraction using Fourier networks: a use case in adverse drug events
Current approaches for clinical information extraction are inefficient in terms of computational costs and memory consumption, hindering their application to process large …
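The efficiency gain in this line of work comes from replacing self-attention with Fourier token mixing (as in FNet-style encoders). A minimal sketch of such a mixing block follows; the layer sizes are illustrative, not the paper's configuration.

```python
# Minimal sketch of Fourier token mixing used in place of self-attention to
# reduce compute and memory. Dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class FourierMixerBlock(nn.Module):
    def __init__(self, hidden: int = 256, ff: int = 1024):
        super().__init__()
        self.norm1 = nn.LayerNorm(hidden)
        self.norm2 = nn.LayerNorm(hidden)
        self.ff = nn.Sequential(nn.Linear(hidden, ff), nn.GELU(), nn.Linear(ff, hidden))

    def forward(self, x):                        # x: (batch, seq_len, hidden)
        # 2-D FFT over sequence and hidden dims; keep the real part,
        # so no attention weights are learned or stored.
        mixed = torch.fft.fft2(x).real
        x = self.norm1(x + mixed)
        return self.norm2(x + self.ff(x))

tokens = torch.randn(2, 128, 256)                # e.g., clinical note word-pieces
print(FourierMixerBlock()(tokens).shape)         # torch.Size([2, 128, 256])
```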
Transferring Zero-shot Multilingual Chinese-Chinese Translation Model for Chinese Minority Language Translation
Z Yan, H Zan, Y Guo, H Xu - 2024 International Conference on …, 2024 - ieeexplore.ieee.org
Transfer learning is an effective method to improve the performance of low-resource translation, but its effectiveness heavily relies on specific languages, and transferring …
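A common mechanical step when transferring a multilingual NMT model to an unseen minority language is to inherit the parent's weights and extend its vocabulary for new subwords. The sketch below shows that step only; the parent checkpoint and the added tokens are assumptions, not the paper's method.

```python
# Minimal sketch of reusing a parent multilingual model for a new language pair:
# inherit all weights, add unseen subwords, then fine-tune on the child corpus.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

parent = "facebook/mbart-large-50-many-to-many-mmt"   # assumed parent model
tokenizer = AutoTokenizer.from_pretrained(parent)
model = AutoModelForSeq2SeqLM.from_pretrained(parent)

# Placeholder subwords the minority language needs but the parent never saw;
# their embeddings are newly initialised, everything else is inherited.
new_tokens = ["▁new_subword_a", "▁new_subword_b"]
tokenizer.add_tokens(new_tokens)
model.resize_token_embeddings(len(tokenizer))

# From here, fine-tune exactly as for any seq2seq model on the child-pair data.
```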
NAPG: Non-autoregressive program generation for hybrid tabular-textual question answering
Hybrid tabular-textual question answering (QA) requires reasoning from heterogeneous information, and the types of reasoning are mainly divided into numerical reasoning and …
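The contrast with standard program generation is that a non-autoregressive model predicts every program slot in one parallel pass instead of token by token. The sketch below illustrates that idea with a parallel slot-prediction head; the sizes, slot vocabulary, and module names are assumptions, not NAPG's architecture.

```python
# Minimal sketch of non-autoregressive program generation: predict all program
# slots in a single parallel pass over the encoded question/table states.
import torch
import torch.nn as nn

class ParallelProgramHead(nn.Module):
    def __init__(self, hidden=256, max_slots=8, vocab=128):
        super().__init__()
        self.slot_queries = nn.Parameter(torch.randn(max_slots, hidden))
        self.attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(hidden, vocab)    # operator / operand ids

    def forward(self, enc):                           # enc: (B, T, hidden)
        q = self.slot_queries.unsqueeze(0).expand(enc.size(0), -1, -1)
        slots, _ = self.attn(q, enc, enc)             # all slots attend at once
        return self.classifier(slots)                 # (B, max_slots, vocab)

enc = torch.randn(2, 64, 256)                         # encoded text+table tokens
logits = ParallelProgramHead()(enc)                   # one pass, no decoding loop
print(logits.shape)                                   # torch.Size([2, 8, 128])
```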
DSISA: A New Neural Machine Translation Combining Dependency Weight and Neighbors
L Li, A Zhang, MX Luo - ACM Transactions on Asian and Low-Resource …, 2024 - dl.acm.org
Most previous neural machine translation (NMT) systems rely on parallel corpora. Integrating explicit prior syntactic structure information can improve neural machine translation. In …
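One common way to inject prior syntactic structure into NMT, in the spirit of dependency-weighted approaches like this one, is to bias the attention logits with weights derived from the source dependency tree. The weighting scheme below is an illustrative assumption, not DSISA's exact formulation.

```python
# Minimal sketch of dependency-biased attention: closer words in the source
# dependency tree receive a larger additive bias before the softmax.
import torch

def dependency_biased_attention(q, k, v, dep_dist):
    """q, k, v: (B, T, d); dep_dist: (B, T, T) hop counts in the dependency tree."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5       # standard dot-product scores
    bias = -0.5 * dep_dist                            # assumed penalty per tree hop
    attn = torch.softmax(scores + bias, dim=-1)
    return attn @ v

B, T, d = 2, 6, 64
q = k = v = torch.randn(B, T, d)
dep_dist = torch.randint(0, 4, (B, T, T)).float()     # toy dependency distances
print(dependency_biased_attention(q, k, v, dep_dist).shape)   # (2, 6, 64)
```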
Attention Link: An Efficient Attention-Based Low Resource Machine Translation Architecture
Z Min - Procedia Computer Science, 2023 - Elsevier
Transformers have emerged as a pivotal tool in machine translation. Nonetheless, their effectiveness typically hinges on extensive training with millions of bilingual parallel corpora …
Improving Chinese-Centric Low-Resource Translation Using English-Centric Pivoted Parallel Data
The good performance of Neural Machine Translation (NMT) normally relies on a large amount of parallel data, while the bilingual data between languages is usually insufficient …
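The underlying pivoting idea can be sketched very simply: translate the source into English with one model and English into Chinese with another (the same chain can also be run offline to synthesise Chinese-centric training pairs). The checkpoints below are assumptions, not the paper's models.

```python
# Minimal sketch of pivot translation through English using two off-the-shelf
# translation models; checkpoints are illustrative assumptions.
from transformers import pipeline

to_english = pipeline("translation", model="Helsinki-NLP/opus-mt-de-en")
to_chinese = pipeline("translation", model="Helsinki-NLP/opus-mt-en-zh")

def pivot_translate(src_text: str) -> str:
    english = to_english(src_text)[0]["translation_text"]   # X -> en
    return to_chinese(english)[0]["translation_text"]       # en -> zh

print(pivot_translate("Maschinelle Übersetzung braucht viele Daten."))
```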
A Memory-Based Neural Network Model for English to Telugu Language Translation on Different Types of Sentences.
B Bataineh, B Vamsi, A Al Bataineh… - … Journal of Advanced …, 2024 - search.ebscohost.com
In India, regional languages play an important role in government-to-public communication, citizens' rights, weather forecasting, and farming. The language also changes depending on the state …
Rewiring the Transformer with Depth-Wise LSTMs
H Xu, Y Song, Q Liu, J van Genabith… - Proceedings of the 2024 …, 2024 - aclanthology.org
Stacking non-linear layers allows deep neural networks to model complicated functions, and including residual connections in Transformer layers is beneficial for convergence and …
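The rewiring idea can be illustrated by treating the stack of layers as a sequence and letting an LSTM cell carry information across depth in place of plain residual connections. The sketch below uses a simple feed-forward block as a stand-in for a full Transformer layer; it is an assumption-laden illustration, not the paper's exact rewiring.

```python
# Minimal sketch of a depth-wise LSTM: the LSTM's "time" axis is layer depth,
# and its gates replace the usual x + sublayer(x) residual connection.
import torch
import torch.nn as nn

class DepthWiseLSTMStack(nn.Module):
    def __init__(self, hidden=256, layers=6):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.Sequential(nn.Linear(hidden, hidden), nn.GELU()) for _ in range(layers)
        )
        self.depth_lstm = nn.LSTMCell(hidden, hidden)    # shared across depth

    def forward(self, x):                                # x: (batch*seq, hidden)
        h, c = torch.zeros_like(x), torch.zeros_like(x)
        for layer in self.layers:
            out = layer(x)
            # LSTM gates decide what to keep across layers instead of x + out.
            h, c = self.depth_lstm(out, (h, c))
            x = h
        return x

tokens = torch.randn(8, 256)                             # flattened token states
print(DepthWiseLSTMStack()(tokens).shape)                # torch.Size([8, 256])
```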