Constituency parsing with a self-attentive encoder
N Kitaev, D Klein - arXiv preprint arXiv:1805.01052, 2018 - arxiv.org
We demonstrate that replacing an LSTM encoder with a self-attentive architecture can lead
to improvements to a state-of-the-art discriminative constituency parser. The use of attention …
Multilingual constituency parsing with self-attention and pre-training
We show that constituency parsing benefits from unsupervised pre-training across a variety
of languages and a range of pre-training conditions. We first compare the benefits of no pre …
What's going on in neural constituency parsers? An analysis
A number of differences have emerged between modern and classic approaches to
constituency parsing in recent years, with structural components like grammars and feature …
What do character-level models learn about morphology? The case of dependency parsing
When parsing morphologically-rich languages with neural models, it is beneficial to model
input at the character level, and it has been claimed that this is because character-level …
Parsing as a cue-based retrieval model
J Dotlačil - Cognitive science, 2021 - Wiley Online Library
This paper develops a novel psycholinguistic parser and tests it against experimental and
corpus reading data. The parser builds on the recent research into memory structures, which …
Sequence labeling parsing by learning across representations
We use parsing as sequence labeling as a common framework to learn across constituency
and dependency syntactic abstractions. To do so, we cast the problem as multitask learning …
Unlexicalized transition-based discontinuous constituency parsing
Lexicalized parsing models are based on the assumptions that (i) constituents are organized
around a lexical head and (ii) bilexical statistics are crucial to solve ambiguities. In this …
Better, faster, stronger sequence tagging constituent parsers
Sequence tagging models for constituent parsing are faster but less accurate than other
types of parsers. In this work, we address the following weaknesses of such constituent …
Discontinuous constituency parsing with a stack-free transition system and a dynamic oracle
We introduce a novel transition system for discontinuous constituency parsing. Instead of
storing subtrees in a stack (i.e., a data structure with linear-time sequential access), the …
A conditional splitting framework for efficient constituency parsing
We introduce a generic seq2seq parsing framework that casts constituency parsing
problems (syntactic and discourse parsing) into a series of conditional splitting decisions …