Analysis methods in neural language processing: A survey
Y Belinkov, J Glass - … of the Association for Computational Linguistics, 2019 - direct.mit.edu
The field of natural language processing has seen impressive progress in recent years, with
neural network models replacing many of the traditional systems. A plethora of new models …
Constituency parsing with a self-attentive encoder
N Kitaev, D Klein - arXiv preprint arXiv:1805.01052, 2018 - arxiv.org
We demonstrate that replacing an LSTM encoder with a self-attentive architecture can lead
to improvements in a state-of-the-art discriminative constituency parser. The use of attention …
Head-driven phrase structure grammar parsing on Penn treebank
Head-driven phrase structure grammar (HPSG) enjoys a uniform formalism representing rich
contextual syntactic and even semantic meanings. This paper makes the first attempt to …
A survey of syntactic-semantic parsing based on constituent and dependency structures
MS Zhang - Science China Technological Sciences, 2020 - Springer
Syntactic and semantic parsing, a primary topic in the natural language processing
community, has been investigated for decades. This article aims to provide a brief survey on …
Efficient second-order TreeCRF for neural dependency parsing
In the deep learning (DL) era, parsing models have been greatly simplified with little loss in
performance, thanks to the remarkable capability of multi-layer BiLSTMs in context …
Rethinking self-attention: Towards interpretability in neural parsing
Attention mechanisms have improved the performance of NLP tasks while allowing models
to remain explainable. Self-attention is now widely used; however, its interpretability is …
Fast and accurate neural CRF constituency parsing
Estimating probability distributions is one of the core issues in the NLP field. However, in both
the deep learning (DL) and pre-DL eras, unlike the vast applications of linear-chain CRFs in …
LIMIT-BERT: Linguistic Informed Multi-Task BERT
In this paper, we present a Linguistic Informed Multi-Task BERT (LIMIT-BERT) for learning
language representations across multiple linguistic tasks by Multi-Task Learning (MTL) …
Straight to the tree: Constituency parsing with neural syntactic distance
In this work, we propose a novel constituency parsing scheme. The model predicts a vector
of real-valued scalars, named syntactic distances, for each split position in the input …
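The abstract above describes predicting a real-valued syntactic distance for each split position; a binary tree can then be recovered by recursively splitting the sentence at the largest remaining distance. A minimal sketch of that greedy top-down decoding step (the function name and toy inputs are illustrative, not taken from the paper):

```python
def distances_to_tree(words, dists):
    """Greedily build a binary tree from per-split syntactic distances.

    words: list of n tokens.
    dists: list of n-1 distances, one per split position between tokens.
    Returns a nested tuple representing the binary parse tree.
    """
    if len(words) == 1:
        return words[0]
    # Split where the predicted syntactic distance is largest:
    # the highest distance marks the topmost constituent boundary.
    i = max(range(len(dists)), key=lambda k: dists[k])
    left = distances_to_tree(words[:i + 1], dists[:i])
    right = distances_to_tree(words[i + 1:], dists[i + 1:])
    return (left, right)


# Toy example: the largest distance (2.0) separates "the cat" from "sat".
tree = distances_to_tree(["the", "cat", "sat"], [1.0, 2.0])
# → (("the", "cat"), "sat")
```

Because every split depends only on an argmax over the remaining distances, decoding needs no chart or dynamic program, which is the efficiency argument such distance-based parsers make.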
Improving constituency parsing with span attention
Constituency parsing is a fundamental and important task for natural language
understanding, where a good representation of contextual information benefits the task. N …