RobBERT: a Dutch RoBERTa-based language model
P Delobelle, T Winters, B Berendt - arXiv preprint arXiv:2001.06286, 2020 - arxiv.org
Pre-trained language models have been dominating the field of natural language
processing in recent years, and have led to significant performance gains for various …
What the [mask]? making sense of language-specific BERT models
Recently, Natural Language Processing (NLP) has witnessed an impressive progress in
many areas, due to the advent of novel, pretrained contextual representation models. In …
A comprehensive review on feature set used for anaphora resolution
Abstract In linguistics, the Anaphora Resolution (AR) is the method of identifying the
antecedent for anaphora. In simple terms, this is the problem that helps to solve what the …
A deep neural network model for speakers coreference resolution in legal texts
Coreference resolution is one of the fundamental tasks in natural language processing
(NLP), and is of great significance to understand the semantics of texts. Meanwhile …
SICK-NL: a dataset for Dutch natural language inference
G Wijnholds, M Moortgat - arXiv preprint arXiv:2101.05716, 2021 - arxiv.org
We present SICK-NL (read: signal), a dataset targeting Natural Language Inference in
Dutch. SICK-NL is obtained by translating the SICK dataset of Marelli et al. (2014) from …
RobBERTje: A distilled Dutch BERT model
P Delobelle, T Winters, B Berendt - arXiv preprint arXiv:2204.13511, 2022 - arxiv.org
Pre-trained large-scale language models such as BERT have gained a lot of attention
thanks to their outstanding performance on a wide range of natural language tasks …
RobBERT-2022: Updating a Dutch language model to account for evolving language use
P Delobelle, T Winters, B Berendt - arXiv preprint arXiv:2211.08192, 2022 - arxiv.org
Large transformer-based language models, e.g. BERT and GPT-3, outperform previous
architectures on most natural language processing tasks. Such language models are first …
Investigating Cross-Document Event Coreference for Dutch
L De Langhe, O De Clercq, V Hoste - Proceedings of the Fifth …, 2022 - aclanthology.org
In this paper we present baseline results for Event Coreference Resolution (ECR) in Dutch
using gold-standard (i.e. non-predicted) event mentions. A newly developed benchmark …
Towards fine(r)-grained identification of event coreference resolution types
L De Langhe, O De Clercq, V Hoste - Computational Linguistics in …, 2022 - clinjournal.org
In this paper we present initial efforts to study complex event-event relations or event
coreference in the Dutch language. We are primarily interested in the event-subevent …
Structural ambiguity and its disambiguation in language model based parsers: the case of Dutch clause relativization
G Wijnholds, M Moortgat - arXiv preprint arXiv:2305.14917, 2023 - arxiv.org
This paper addresses structural ambiguity in Dutch relative clauses. By investigating the task
of disambiguation by grounding, we study how the presence of a prior sentence can resolve …