75 languages, 1 model: Parsing Universal Dependencies universally

D Kondratyuk, M Straka - arXiv preprint arXiv:1904.02099, 2019 - arxiv.org
We present UDify, a multilingual multi-task model capable of accurately predicting universal
part-of-speech, morphological features, lemmas, and dependency trees simultaneously for …

Machine learning for ancient languages: A survey

T Sommerschield, Y Assael, J Pavlopoulos… - Computational …, 2023 - direct.mit.edu
Ancient languages preserve the cultures and histories of the past. However, their study is
fraught with difficulties, and experts must tackle a range of challenging text-based tasks, from …

Latin BERT: A contextual language model for classical philology

D Bamman, PJ Burns - arXiv preprint arXiv:2009.10053, 2020 - arxiv.org
We present Latin BERT, a contextual language model for the Latin language, trained on
642.7 million words from a variety of sources spanning the Classical era to the 21st century …

MRP 2019: Cross-framework meaning representation parsing

S Oepen, O Abend, J Hajic, D Hershcovich… - Proceedings of the …, 2019 - aclanthology.org
The 2019 Shared Task at the Conference for Computational Language Learning
(CoNLL) was devoted to Meaning Representation Parsing (MRP) across frameworks. Five …

Exploring large language models for classical philology

F Riemenschneider, A Frank - arXiv preprint arXiv:2305.13698, 2023 - arxiv.org
Recent advances in NLP have led to the creation of powerful language models for many
languages including Ancient Greek and Latin. While prior work on Classical languages …

Indic-transformers: An analysis of transformer language models for Indian languages

K Jain, A Deshpande, K Shridhar, F Laumann… - arXiv preprint arXiv …, 2020 - arxiv.org
Language models based on the Transformer architecture have achieved state-of-the-art
performance on a wide range of NLP tasks such as text classification, question-answering …

Structure-level knowledge distillation for multilingual sequence labeling

X Wang, Y Jiang, N Bach, T Wang, F Huang… - arXiv preprint arXiv …, 2020 - arxiv.org
Multilingual sequence labeling is a task of predicting label sequences using a single unified
model for multiple languages. Compared with relying on multiple monolingual models, using …

A customizable framework for multimodal emotion recognition using ensemble of deep neural network models

C Dixit, SM Satapathy - Multimedia Systems, 2023 - Springer
Multimodal emotion recognition of videos of human oration, commonly called opinion
videos, has a wide scope of applications across all domains. Here, the speakers express …

Understanding model robustness to user-generated noisy texts

J Náplava, M Popel, M Straka, J Straková - arXiv preprint arXiv …, 2021 - arxiv.org
Sensitivity of deep-neural models to input noise is known to be a challenging problem. In
NLP, model performance often deteriorates with naturally occurring noise, such as spelling …

Accurate dependency parsing and tagging of Latin

S Nehrdich, O Hellwig - Proceedings of the Second Workshop on …, 2022 - aclanthology.org
Having access to high-quality grammatical annotations is important for downstream tasks in
NLP as well as for corpus-based research. In this paper, we describe experiments with the …