Graph neural networks for natural language processing: A survey
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …
Graph pre-training for AMR parsing and generation
Abstract meaning representation (AMR) highlights the core semantic information of text in a
graph structure. Recently, pre-trained language models (PLMs) have advanced tasks of …
AMR-based network for aspect-based sentiment analysis
Aspect-based sentiment analysis (ABSA) is a fine-grained sentiment classification task.
Many recent works have used dependency trees to extract the relation between aspects and …
Structural adapters in pretrained language models for amr-to-text generation
Pretrained language models (PLMs) have recently advanced graph-to-text generation, where
the input graph is linearized into a sequence and fed into the PLM to obtain its …
Semantic representation for dialogue modeling
Although neural models have achieved competitive results in dialogue systems, they have
shown limited ability to represent core semantics, for example by ignoring important entities. To …
De-confounded variational encoder-decoder for logical table-to-text generation
Logical table-to-text generation aims to automatically generate fluent and logically faithful
text from tables. The task remains challenging, as deep learning models have often generated …
Promoting graph awareness in linearized graph-to-text generation
Generating text from structured inputs, such as meaning representations or RDF triples, has
often involved the use of specialized graph-encoding neural networks. However, recent …
Abstract meaning representation of Turkish
Abstract meaning representation (AMR) is a graph-based sentence-level meaning
representation that has become highly popular in recent years. AMR is a knowledge-based …
Semantic-based pre-training for dialogue understanding
Pre-trained language models have made great progress on dialogue tasks. However, these
models are typically trained on surface dialogue text and have thus proven weak in …
Neural Methods for Data-to-text Generation
The neural boom that has sparked natural language processing (NLP) research throughout
the last decade has similarly led to significant innovations in data-to-text generation (D2T) …