Constituency parsing with a self-attentive encoder

N Kitaev, D Klein - arXiv preprint arXiv:1805.01052, 2018 - arxiv.org
We demonstrate that replacing an LSTM encoder with a self-attentive architecture can lead
to improvements to a state-of-the-art discriminative constituency parser. The use of attention …

Multilingual constituency parsing with self-attention and pre-training

N Kitaev, S Cao, D Klein - arXiv preprint arXiv:1812.11760, 2018 - arxiv.org
We show that constituency parsing benefits from unsupervised pre-training across a variety
of languages and a range of pre-training conditions. We first compare the benefits of no pre …

What's going on in neural constituency parsers? An analysis

D Gaddy, M Stern, D Klein - arXiv preprint arXiv:1804.07853, 2018 - arxiv.org
A number of differences have emerged between modern and classic approaches to
constituency parsing in recent years, with structural components like grammars and feature …

What do character-level models learn about morphology? The case of dependency parsing

C Vania, A Grivas, A Lopez - arXiv preprint arXiv:1808.09180, 2018 - arxiv.org
When parsing morphologically-rich languages with neural models, it is beneficial to model
input at the character level, and it has been claimed that this is because character-level …

Parsing as a cue-based retrieval model

J Dotlačil - Cognitive science, 2021 - Wiley Online Library
This paper develops a novel psycholinguistic parser and tests it against experimental and
corpus reading data. The parser builds on the recent research into memory structures, which …

Sequence labeling parsing by learning across representations

M Strzyz, D Vilares, C Gómez-Rodríguez - arXiv preprint arXiv:1907.01339, 2019 - arxiv.org
We use parsing as sequence labeling as a common framework to learn across constituency
and dependency syntactic abstractions. To do so, we cast the problem as multitask learning …

Unlexicalized transition-based discontinuous constituency parsing

M Coavoux, B Crabbé, SB Cohen - Transactions of the Association …, 2019 - direct.mit.edu
Lexicalized parsing models are based on the assumptions that (i) constituents are organized
around a lexical head and (ii) bilexical statistics are crucial to solve ambiguities. In this …

Better, faster, stronger sequence tagging constituent parsers

D Vilares, M Abdou, A Søgaard - arXiv preprint arXiv:1902.10985, 2019 - arxiv.org
Sequence tagging models for constituent parsing are faster, but less accurate, than other
types of parsers. In this work, we address the following weaknesses of such constituent …

Discontinuous constituency parsing with a stack-free transition system and a dynamic oracle

M Coavoux, SB Cohen - arXiv preprint arXiv:1904.00615, 2019 - arxiv.org
We introduce a novel transition system for discontinuous constituency parsing. Instead of
storing subtrees in a stack -- i.e., a data structure with linear-time sequential access -- the …

A conditional splitting framework for efficient constituency parsing

TT Nguyen, XP Nguyen, S Joty, X Li - arXiv preprint arXiv:2106.15760, 2021 - arxiv.org
We introduce a generic seq2seq parsing framework that casts constituency parsing
problems (syntactic and discourse parsing) into a series of conditional splitting decisions …