75 languages, 1 model: Parsing universal dependencies universally
D Kondratyuk, M Straka - arXiv preprint arXiv:1904.02099, 2019 - arxiv.org
We present UDify, a multilingual multi-task model capable of accurately predicting universal
part-of-speech, morphological features, lemmas, and dependency trees simultaneously for …
Machine learning for ancient languages: A survey
Ancient languages preserve the cultures and histories of the past. However, their study is
fraught with difficulties, and experts must tackle a range of challenging text-based tasks, from …
Latin BERT: A contextual language model for classical philology
We present Latin BERT, a contextual language model for the Latin language, trained on
642.7 million words from a variety of sources spanning the Classical era to the 21st century …
MRP 2019: Cross-framework meaning representation parsing
Abstract The 2019 Shared Task at the Conference for Computational Language Learning
(CoNLL) was devoted to Meaning Representation Parsing (MRP) across frameworks. Five …
Exploring large language models for classical philology
F Riemenschneider, A Frank - arXiv preprint arXiv:2305.13698, 2023 - arxiv.org
Recent advances in NLP have led to the creation of powerful language models for many
languages including Ancient Greek and Latin. While prior work on Classical languages …
Indic-transformers: An analysis of transformer language models for Indian languages
Language models based on the Transformer architecture have achieved state-of-the-art
performance on a wide range of NLP tasks such as text classification, question-answering …
Structure-level knowledge distillation for multilingual sequence labeling
Multilingual sequence labeling is a task of predicting label sequences using a single unified
model for multiple languages. Compared with relying on multiple monolingual models, using …
A customizable framework for multimodal emotion recognition using ensemble of deep neural network models
C Dixit, SM Satapathy - Multimedia Systems, 2023 - Springer
Multimodal emotion recognition of videos of human oration, commonly called opinion
videos, has a wide scope of applications across all domains. Here, the speakers express …
Understanding model robustness to user-generated noisy texts
Sensitivity of deep-neural models to input noise is known to be a challenging problem. In
NLP, model performance often deteriorates with naturally occurring noise, such as spelling …
Accurate dependency parsing and tagging of Latin
S Nehrdich, O Hellwig - Proceedings of the Second Workshop on …, 2022 - aclanthology.org
Having access to high-quality grammatical annotations is important for downstream tasks in
NLP as well as for corpus-based research. In this paper, we describe experiments with the …