A comprehensive survey of AI-generated content (AIGC): A history of generative AI from GAN to ChatGPT
Recently, ChatGPT, along with DALL-E-2 and Codex, has been gaining significant attention
from society. As a result, many individuals have become interested in related resources and …
Pre-trained language models and their applications
Pre-trained language models have achieved striking success in natural language
processing (NLP), leading to a paradigm shift from supervised learning to pre-training …
Investigating pretrained language models for graph-to-text generation
Graph-to-text generation aims to generate fluent texts from graph-based data. In this paper,
we investigate two recently proposed pretrained language models (PLMs) and analyze the …
JointGT: Graph-text joint representation learning for text generation from knowledge graphs
Existing pre-trained models for knowledge-graph-to-text (KG-to-text) generation simply fine-
tune text-to-text pre-trained models such as BART or T5 on KG-to-text datasets, which …
FactGraph: Evaluating factuality in summarization with semantic graph representations
Despite recent improvements in abstractive summarization, most current approaches
generate summaries that are not factually consistent with the source document, severely …
Incorporating context graph with logical reasoning for inductive relation prediction
Relation prediction on knowledge graphs (KGs) aims to infer missing valid triples from
observed ones. Although this task has been deeply studied, most previous studies are …
Structural adapters in pretrained language models for AMR-to-text generation
Pretrained language models (PLM) have recently advanced graph-to-text generation, where
the input graph is linearized into a sequence and fed into the PLM to obtain its …
Generative de novo protein design with global context
The linear sequence of amino acids determines protein structure and function. Protein
design, known as the inverse of protein structure prediction, aims to obtain a novel protein …
Few-shot knowledge graph-to-text generation with pretrained language models
This paper studies how to automatically generate a natural language text that describes the
facts in a knowledge graph (KG). Considering the few-shot setting, we leverage the excellent …
Evaluating generative models for graph-to-text generation
Large language models (LLMs) have been widely employed for graph-to-text generation
tasks. However, the process of finetuning LLMs requires significant training resources and …