A survey on knowledge graphs: Representation, acquisition, and applications

S Ji, S Pan, E Cambria, P Marttinen… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Human knowledge provides a formal understanding of the world. Knowledge graphs that
represent structural relations between entities have become an increasingly popular …

A comprehensive survey on automatic knowledge graph construction

L Zhong, J Wu, Q Li, H Peng, X Wu - ACM Computing Surveys, 2023 - dl.acm.org
Automatic knowledge graph construction aims at manufacturing structured human
knowledge. To this end, much effort has historically been spent extracting informative fact …

PTR: Prompt tuning with rules for text classification

X Han, W Zhao, N Ding, Z Liu, M Sun - AI Open, 2022 - Elsevier
Recently, prompt tuning has been widely applied to stimulate the rich knowledge in pre-trained
language models (PLMs) to serve NLP tasks. Although prompt tuning has achieved …

Differentiable prompt makes pre-trained language models better few-shot learners

N Zhang, L Li, X Chen, S Deng, Z Bi, C Tan… - arXiv preprint arXiv …, 2021 - arxiv.org
Large-scale pre-trained language models have contributed significantly to natural language
processing by demonstrating remarkable abilities as few-shot learners. However, their …

Learning from context or names? an empirical study on neural relation extraction

H Peng, T Gao, X Han, Y Lin, P Li, Z Liu, M Sun… - arXiv preprint arXiv …, 2020 - arxiv.org
Neural models have achieved remarkable success on relation extraction (RE) benchmarks.
However, there is no clear understanding of which type of information affects existing RE …

ERICA: Improving entity and relation understanding for pre-trained language models via contrastive learning

Y Qin, Y Lin, R Takanobu, Z Liu, P Li, H Ji… - arXiv preprint arXiv …, 2020 - arxiv.org
Pre-trained Language Models (PLMs) have shown superior performance on various
downstream Natural Language Processing (NLP) tasks. However, conventional pre-training …

More data, more relations, more context and more openness: A review and outlook for relation extraction

X Han, T Gao, Y Lin, H Peng, Y Yang, C Xiao… - arXiv preprint arXiv …, 2020 - arxiv.org
Relational facts are an important component of human knowledge, which are hidden in vast
amounts of text. In order to extract these facts from text, people have been working on …

Multimodal relation extraction with efficient graph alignment

C Zheng, J Feng, Z Fu, Y Cai, Q Li, T Wang - Proceedings of the 29th …, 2021 - dl.acm.org
Relation extraction (RE) is a fundamental process in constructing knowledge graphs.
However, previous methods on relation extraction suffer a sharp performance decline in short …

A survey on narrative extraction from textual data

B Santana, R Campos, E Amorim, A Jorge… - Artificial Intelligence …, 2023 - Springer
Narratives are present in many forms of human expression and can be understood as a
fundamental way of communication between people. Computational understanding of the …

Visualizing transformers for NLP: a brief survey

AMP Braşoveanu, R Andonie - 2020 24th International …, 2020 - ieeexplore.ieee.org
The introduction of Transformer neural networks has changed the landscape of Natural
Language Processing during the last three years. While models inspired by it have …