A review on the attention mechanism of deep learning

Z Niu, G Zhong, H Yu - Neurocomputing, 2021 - Elsevier
Attention has arguably become one of the most important concepts in the deep learning
field. It is inspired by the biological systems of humans that tend to focus on the distinctive …

Permutation invariant graph-to-sequence model for template-free retrosynthesis and reaction prediction

Z Tu, CW Coley - Journal of chemical information and modeling, 2022 - ACS Publications
Synthesis planning and reaction outcome prediction are two fundamental problems in
computer-aided organic chemistry for which a variety of data-driven approaches have …

Graph neural networks for natural language processing: A survey

L Wu, Y Chen, K Shen, X Guo, H Gao… - … and Trends® in …, 2023 - nowpublishers.com
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …

Iterative deep graph learning for graph neural networks: Better and robust node embeddings

Y Chen, L Wu, M Zaki - Advances in neural information …, 2020 - proceedings.neurips.cc
In this paper, we propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL), for jointly and iteratively …

[Book][B] Deep learning on graphs

Y Ma, J Tang - 2021 - books.google.com
Deep learning on graphs has become one of the hottest topics in machine learning. The
book consists of four parts to best accommodate our readers with diverse backgrounds and …

Attention, please! A survey of neural attention models in deep learning

A de Santana Correia, EL Colombini - Artificial Intelligence Review, 2022 - Springer
In humans, attention is a core property of all perceptual and cognitive operations. Given our
limited ability to process competing sources, attention mechanisms select, modulate, and …

Improved code summarization via a graph neural network

A LeClair, S Haque, L Wu, C McMillan - Proceedings of the 28th …, 2020 - dl.acm.org
Automatic source code summarization is the task of generating natural language
descriptions for source code. Automatic code summarization is a rapidly expanding research …

Heterogeneous global graph neural networks for personalized session-based recommendation

Y Pang, L Wu, Q Shen, Y Zhang, Z Wei, F Xu… - Proceedings of the …, 2022 - dl.acm.org
Predicting the next interaction of a short-term interaction session is a challenging task in
session-based recommendation. Almost all existing works rely on item transition patterns …

Graph transformer for graph-to-sequence learning

D Cai, W Lam - Proceedings of the AAAI conference on artificial …, 2020 - ojs.aaai.org
The dominant graph-to-sequence transduction models employ graph neural networks for
graph representation learning, where the structural information is reflected by the receptive …

Attention models in graphs: A survey

JB Lee, RA Rossi, S Kim, NK Ahmed… - ACM Transactions on …, 2019 - dl.acm.org
Graph-structured data arise naturally in many different application domains. By representing
data as graphs, we can capture entities (i.e., nodes) as well as their relationships (i.e., edges) …