Globally normalized transition-based neural networks
We introduce a globally normalized transition-based neural network model that achieves
state-of-the-art part-of-speech tagging, dependency parsing and sentence compression …
Sentence compression by deletion with LSTMs
K Filippova, E Alfonseca, CA Colmenares… - Proceedings of the …, 2015 - aclanthology.org
We present an LSTM approach to deletion-based sentence compression where the task is to
translate a sentence into a sequence of zeros and ones, corresponding to token deletion …
Stack-pointer networks for dependency parsing
We introduce a novel architecture for dependency parsing: stack-pointer networks (StackPtr). Combining pointer networks …
Seq2seq dependency parsing
This paper presents a sequence to sequence (seq2seq) dependency parser by directly
predicting the relative position of head for each given word, which therefore results in a truly …
Transforming dependency structures to logical forms for semantic parsing
The strongly typed syntax of grammar formalisms such as CCG, TAG, LFG and HPSG offers
a synchronous framework for deriving syntactic structures and semantic logical forms. In …
Graph-based dependency parsing with bidirectional LSTM
In this paper, we propose a neural network model for graph-based dependency parsing
which utilizes Bidirectional LSTM (BLSTM) to capture richer contextual information instead of …
Structured training for neural network transition-based parsing
We present structured perceptron training for neural network transition-based dependency
parsing. We learn the neural network representation using a gold corpus augmented by a …
Semantic role labeling with neural network factors
We present a new method for semantic role labeling in which arguments and semantic roles
are jointly embedded in a shared vector space for a given predicate. These embeddings …
Yara parser: A fast and accurate dependency parser
MS Rasooli, J Tetreault - arXiv preprint arXiv:1503.06733, 2015 - arxiv.org
Dependency parsers are among the most crucial tools in natural language processing as
they have many important applications in downstream tasks such as information retrieval …
Dependency parsing as head selection
Conventional graph-based dependency parsers guarantee a tree structure both during
training and inference. Instead, we formalize dependency parsing as the problem of …