A survey on semantic processing techniques
Semantic processing is a fundamental research domain in computational linguistics. In the
era of powerful pre-trained language models and large language models, the advancement …
A neural entity coreference resolution review
N Stylianou, I Vlahavas - Expert Systems with Applications, 2021 - Elsevier
Abstract Entity Coreference Resolution is the task of resolving all mentions in a document
that refer to the same real-world entity and is considered one of the most difficult tasks in …
Quoref: A reading comprehension dataset with questions requiring coreferential reasoning
Machine comprehension of texts longer than a single sentence often requires coreference
resolution. However, most current reading comprehension benchmarks do not contain …
An annotated dataset of coreference in English literature
D Bamman, O Lewke, A Mansoor - arXiv preprint arXiv:1912.01140, 2019 - arxiv.org
We present in this work a new dataset of coreference annotations for works of literature in
English, covering 29,103 mentions in 210,532 tokens from 100 works of fiction. This dataset …
Modeling fine-grained entity types with box embeddings
Neural entity typing models typically represent fine-grained entity types as vectors in a
high-dimensional space, but such spaces are not well-suited to modeling these types' complex …
Entity tracking in language models
N Kim, S Schuster - arXiv preprint arXiv:2305.02363, 2023 - arxiv.org
Keeping track of how states of entities change as a text or dialog unfolds is a key
prerequisite to discourse understanding. Yet, there have been few systematic investigations …
On generalization in coreference resolution
While coreference resolution is defined independently of dataset domain, most models for
performing coreference resolution do not transfer well to unseen domains. We consolidate a …
Schema-learning and rebinding as mechanisms of in-context learning and emergence
S Swaminathan, A Dedieu… - Advances in …, 2024 - proceedings.neurips.cc
In-context learning (ICL) is one of the most powerful and most unexpected capabilities to
emerge in recent transformer-based large language models (LLMs). Yet the mechanisms …
A crowdsourced corpus of multiple judgments and disagreement on anaphoric interpretation
We present a corpus of anaphoric information (coreference) crowdsourced through a game-
with-a-purpose. The corpus, containing annotations for about 108,000 markables, is one of …
Moving on from OntoNotes: Coreference resolution model transfer
P Xia, B Van Durme - arXiv preprint arXiv:2104.08457, 2021 - arxiv.org
Academic neural models for coreference resolution (coref) are typically trained on a single
dataset, OntoNotes, and model improvements are benchmarked on that same dataset …