Mukayese: Turkish NLP strikes back

A Safaya, E Kurtuluş, A Göktoğan, D Yuret - arXiv preprint arXiv …, 2022 - arxiv.org
Having sufficient resources for language X lifts it from the under-resourced languages class,
but not necessarily from the under-researched class. In this paper, we address the problem …

PoliBERTweet: a pre-trained language model for analyzing political content on Twitter

K Kawintiranon, L Singh - Proceedings of the Thirteenth Language …, 2022 - aclanthology.org
Transformer-based models have become the state-of-the-art for numerous natural language
processing (NLP) tasks, especially for noisy data sets, including social media posts. For …

BRDS: An FPGA-based LSTM accelerator with row-balanced dual-ratio sparsification

SA Ghasemzadeh, EB Tavakoli, M Kamal… - arXiv preprint arXiv …, 2021 - arxiv.org
In this paper, first, a hardware-friendly pruning algorithm for reducing energy consumption
and improving the speed of Long Short-Term Memory (LSTM) neural network accelerators is …

CzeGPT-2 – Training New Model for Czech Generative Text Processing Evaluated with the Summarization Task

A Hájek, A Horák - IEEE Access, 2024 - ieeexplore.ieee.org
Automatic text summarization (ATS), alongside neural machine translation or question
answering, is one of the leading tasks in Natural Language Processing (NLP). In recent …

Large-context question answering with cross-lingual transfer

M Sagen - 2021 - diva-portal.org
The transformer architecture has become one of the most prominent in natural language
processing (NLP) since its introduction in 2017 [65]. The primary component of the …

Predicting the Unpredictable – Using Language Models to Assess Literary Quality

Y Wu - 2023 - diva-portal.org
People read for various purposes, such as learning specific skills, acquiring foreign languages, or simply enjoying the reading experience. This kind of pure enjoyment may be credited to …

Notes towards infrastructure governance for large language models

L Dal Molin - First Monday, 2024 - firstmonday.org
This paper draws on information infrastructures (IIs) in science and technology studies
(STS), as well as on feminist STS scholarship and contemporary critical accounts of digital …

BureauBERTo: adapting UmBERTo to the Italian bureaucratic language.

S Auriemma, M Madeddu, M Miliani, A Bondielli… - Ital-IA, 2023 - ceur-ws.org
In this work, we introduce BureauBERTo, the first transformer-based language model
adapted to the Italian Public Administration (PA) and technical-bureaucratic domains. We …

Dreams Are More "Predictable" Than You Think

L Bertolini - arXiv preprint arXiv:2305.05054, 2023 - arxiv.org
A consistent body of evidence suggests that dream reports vary significantly from other types of textual transcripts with respect to semantic content. Furthermore, it appears to be a …

Algorithms based on amino-acid-level descriptors and predicted tertiary structures for predicting antimicrobial peptides via graph learning

GC Delgado - 2024 - cicese.repositorioinstitucional.mx
Antimicrobial resistance constitutes a serious threat to human health. The discovery of drugs based on antimicrobial peptides (AMPs) is one of the …