PEACH: Pre-Training Sequence-to-Sequence Multilingual Models for Translation with Semi-Supervised Pseudo-Parallel Document Generation

A Salemi, A Abaskohi, S Tavakoli… - arXiv preprint arXiv …, 2023 - arxiv.org
Multilingual pre-training significantly improves many multilingual NLP tasks, including
machine translation. Most existing methods are based on some variants of masked …

Survey of Afghan (Dari) Language NLP for Building Afghan NLIDB System

S Karimi - 2022 - dspace.bracu.ac.bd
Technology adoption is extremely limited in Afghanistan, especially since people have
limited access to the Internet, smartphones, and computers due to power limitations and the …

Enhancing Persian Text Summarization Using the mT5 Transformer Model: A Three-Phased Fine-Tuning Approach and Reinforcement Learning

VNM Abadi, F Ghasemian - 2023 - researchsquare.com
In the contemporary era, grappling with the vast expanse of big data presents a formidable
obstacle, particularly when it comes to extracting vital information from extensive textual …

Persian Text Summarization via Fine Tuning mT5 Transformer

VN Mahmoodabadi, F Ghasemian - language, 2023 - researchgate.net
Nowadays, one of the main challenges in the world of big data is finding important
information from a large text. At any moment, large volumes of news from different news …