Graph convolutional networks in language and vision: A survey

H Ren, W Lu, Y Xiao, X Chang, X Wang, Z Dong… - Knowledge-Based …, 2022 - Elsevier
Graph convolutional networks (GCNs) have a strong ability to learn graph representation
and have achieved good performance in a range of applications, including social …

Graph neural networks for natural language processing: A survey

L Wu, Y Chen, K Shen, X Guo, H Gao… - … and Trends® in …, 2023 - nowpublishers.com
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …

[CITATION][C] Introduction to natural language processing

J Eisenstein - 2019 - books.google.com
A survey of computational methods for understanding, generating, and manipulating human
language, which offers a synthesis of classical representations and algorithms with …

One SPRING to rule them both: Symmetric AMR semantic parsing and generation without a complex pipeline

M Bevilacqua, R Blloshmi, R Navigli - Proceedings of the AAAI …, 2021 - ojs.aaai.org
In Text-to-AMR parsing, current state-of-the-art semantic parsers use cumbersome pipelines
integrating several different modules or components, and exploit graph recategorization, i.e. …

Spice: Semantic propositional image caption evaluation

P Anderson, B Fernando, M Johnson… - Computer Vision–ECCV …, 2016 - Springer
There is considerable interest in the task of automatically generating image captions.
However, evaluation is challenging. Existing automatic evaluation metrics are primarily …

Paraphrase identification with deep learning: A review of datasets and methods

C Zhou, C Qiu, L Liang, DE Acuna - arXiv preprint arXiv:2212.06933, 2022 - arxiv.org
The rapid progress of Natural Language Processing (NLP) technologies has led to the
widespread availability and effectiveness of text generation tools such as ChatGPT and …

Graph-to-sequence learning using gated graph neural networks

D Beck, G Haffari, T Cohn - arXiv preprint arXiv:1806.09835, 2018 - arxiv.org
Many NLP applications can be framed as a graph-to-sequence learning problem. Previous
work proposing neural architectures in this setting obtained promising results compared to …

Graph pre-training for AMR parsing and generation

X Bai, Y Chen, Y Zhang - arXiv preprint arXiv:2203.07836, 2022 - arxiv.org
Abstract meaning representation (AMR) highlights the core semantic information of text in a
graph structure. Recently, pre-trained language models (PLMs) have advanced tasks of …

Bridging knowledge graphs to generate scene graphs

A Zareian, S Karaman, SF Chang - … , Glasgow, UK, August 23–28, 2020 …, 2020 - Springer
Scene graphs are powerful representations that parse images into their abstract semantic
elements, i.e., objects and their interactions, which facilitates visual comprehension and …

Structured pruning of large language models

Z Wang, J Wohlwend, T Lei - arXiv preprint arXiv:1910.04732, 2019 - arxiv.org
Large language models have recently achieved state of the art performance across a wide
variety of natural language tasks. Meanwhile, the size of these models and their latency …