A comprehensive survey of AI-generated content (AIGC): A history of generative AI from GAN to ChatGPT

Y Cao, S Li, Y Liu, Z Yan, Y Dai, PS Yu… - arXiv preprint arXiv …, 2023 - arxiv.org
Recently, ChatGPT, along with DALL-E-2 and Codex, has been gaining significant attention
from society. As a result, many individuals have become interested in related resources and …

Pre-trained language models and their applications

H Wang, J Li, H Wu, E Hovy, Y Sun - Engineering, 2023 - Elsevier
Pre-trained language models have achieved striking success in natural language
processing (NLP), leading to a paradigm shift from supervised learning to pre-training …

Investigating pretrained language models for graph-to-text generation

LFR Ribeiro, M Schmitt, H Schütze… - arXiv preprint arXiv …, 2020 - arxiv.org
Graph-to-text generation aims to generate fluent texts from graph-based data. In this paper,
we investigate two recently proposed pretrained language models (PLMs) and analyze the …

JointGT: Graph-text joint representation learning for text generation from knowledge graphs

P Ke, H Ji, Y Ran, X Cui, L Wang, L Song, X Zhu… - arXiv preprint arXiv …, 2021 - arxiv.org
Existing pre-trained models for knowledge-graph-to-text (KG-to-text) generation simply fine-
tune text-to-text pre-trained models such as BART or T5 on KG-to-text datasets, which …

FactGraph: Evaluating factuality in summarization with semantic graph representations

LFR Ribeiro, M Liu, I Gurevych, M Dreyer… - arXiv preprint arXiv …, 2022 - arxiv.org
Despite recent improvements in abstractive summarization, most current approaches
generate summaries that are not factually consistent with the source document, severely …

Incorporating context graph with logical reasoning for inductive relation prediction

Q Lin, J Liu, F Xu, Y Pan, Y Zhu, L Zhang… - Proceedings of the 45th …, 2022 - dl.acm.org
Relation prediction on knowledge graphs (KGs) aims to infer missing valid triples from
observed ones. Although this task has been deeply studied, most previous studies are …

Structural adapters in pretrained language models for AMR-to-text generation

LFR Ribeiro, Y Zhang, I Gurevych - arXiv preprint arXiv:2103.09120, 2021 - arxiv.org
Pretrained language models (PLM) have recently advanced graph-to-text generation, where
the input graph is linearized into a sequence and fed into the PLM to obtain its …

Generative de novo protein design with global context

C Tan, Z Gao, J Xia, B Hu, SZ Li - arXiv preprint arXiv:2204.10673, 2022 - arxiv.org
The linear sequence of amino acids determines protein structure and function. Protein
design, known as the inverse of protein structure prediction, aims to obtain a novel protein …

Few-shot knowledge graph-to-text generation with pretrained language models

J Li, T Tang, WX Zhao, Z Wei, NJ Yuan… - arXiv preprint arXiv …, 2021 - arxiv.org
This paper studies how to automatically generate a natural language text that describes the
facts in a knowledge graph (KG). Considering the few-shot setting, we leverage the excellent …

Evaluating generative models for graph-to-text generation

S Yuan, M Färber - arXiv preprint arXiv:2307.14712, 2023 - arxiv.org
Large language models (LLMs) have been widely employed for graph-to-text generation
tasks. However, the process of finetuning LLMs requires significant training resources and …