AMMUS: A survey of transformer-based pretrained models in natural language processing
KS Kalyan, A Rajasekharan, S Sangeetha - arXiv preprint arXiv …, 2021 - arxiv.org
Transformer-based pretrained language models (T-PTLMs) have achieved great success in
almost every NLP task. The evolution of these models started with GPT and BERT. These …
Pre-trained models for natural language processing: A survey
Recently, the emergence of pre-trained models (PTMs) has brought natural language
processing (NLP) to a new era. In this survey, we provide a comprehensive review of PTMs …
mT5: A massively multilingual pre-trained text-to-text transformer
L Xue - arXiv preprint arXiv:2010.11934, 2020 - fq.pkwyx.com
The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified text-to-text format and
scale to attain state-of-the-art results on a wide variety of English-language NLP tasks. In this …
BERTimbau: pretrained BERT models for Brazilian Portuguese
Recent advances in language representation using neural networks have made it viable to
transfer the learned internal states of large pretrained language models (LMs) to …
ARBERT & MARBERT: Deep bidirectional transformers for Arabic
M Abdul-Mageed, AR Elmadany… - arXiv preprint arXiv …, 2020 - arxiv.org
Pre-trained language models (LMs) are currently integral to many natural language
processing systems. Although multilingual LMs were also introduced to serve many …
Spanish pre-trained BERT model and evaluation data
J Cañete, G Chaperon, R Fuentes, JH Ho… - arXiv preprint arXiv …, 2023 - arxiv.org
The Spanish language is one of the top 5 spoken languages in the world. Nevertheless,
finding resources to train or evaluate Spanish language models is not an easy task. In this …
Transfer learning for sentiment analysis using BERT based supervised fine-tuning
The growth of the Internet has expanded the amount of data expressed by users across
multiple platforms. The availability of these different worldviews and individuals' emotions …
AraBERT: Transformer-based model for Arabic language understanding
The Arabic language is a morphologically rich language with relatively few resources and a
less explored syntax compared to English. Given these limitations, Arabic Natural Language …
MarIA: Spanish language models
A Gutiérrez-Fandiño, J Armengol-Estapé… - arXiv preprint arXiv …, 2021 - arxiv.org
This work presents MarIA, a family of Spanish language models and associated resources
made available to the industry and the research community. Currently, MarIA includes …
A systematic review of hate speech automatic detection using natural language processing
MS Jahan, M Oussalah - Neurocomputing, 2023 - Elsevier
With the multiplication of social media platforms, which offer anonymity, easy access, online
community formation, and online debate, the issue of hate speech detection and …