Modeling Structure‐Building in the Brain With CCG Parsing and Large Language Models
To model behavioral and neural correlates of language comprehension in naturalistic
environments, researchers have turned to broad‐coverage tools from natural‐language …
Supertagging Combinatory Categorial Grammar with attentive graph convolutional networks
Supertagging is conventionally regarded as an important task for combinatory categorial
grammar (CCG) parsing, where effective modeling of contextual information is highly …
Modality and negation in event extraction
Language provides speakers with a rich system of modality for expressing thoughts about
events, without being committed to their actual occurrence. Modality is commonly used in the …
Max-margin incremental CCG parsing
M Stanojević, M Steedman - 2020 Annual Conference of the …, 2020 - research.ed.ac.uk
Incremental syntactic parsing has been an active research area both for cognitive scientists
trying to model human sentence processing and for NLP researchers attempting to combine …
Modeling incremental language comprehension in the brain with Combinatory Categorial Grammar
Hierarchical sentence structure plays a role in word-by-word human sentence
comprehension, but it remains unclear how best to characterize this structure and unknown …
Multivalent entailment graphs for question answering
Drawing inferences between open-domain natural language predicates is a necessity for
true language understanding. There has been much progress in unsupervised learning of …
Something old, something new: Grammar-based CCG parsing with transformer models
S Clark - arXiv preprint arXiv:2109.10044, 2021 - arxiv.org
This report describes the parsing problem for Combinatory Categorial Grammar (CCG),
showing how a combination of Transformer-based neural models and a symbolic CCG …
On the challenges of fully incremental neural dependency parsing
Since the popularization of BiLSTMs and Transformer-based bidirectional encoders, state-of-
the-art syntactic parsers have lacked incrementality, requiring access to the whole sentence …
Incorporating temporal information in entailment graph mining
We present a novel method for injecting temporality into entailment graphs to address the
problem of spurious entailments, which may arise from similar but temporally distinct events …
Category Locality Theory: A unified account of locality effects in sentence comprehension
S Isono - Cognition, 2024 - Elsevier
In real-time sentence comprehension, the comprehender is often required to establish
syntactic dependencies between words that are linearly distant. Major models of sentence …