A survey of controllable text generation using transformer-based pre-trained language models

H Zhang, H Song, S Li, M Zhou, D Song - ACM Computing Surveys, 2023 - dl.acm.org
Controllable Text Generation (CTG) is an emerging area in the field of natural language
generation (NLG). It is regarded as crucial for the development of advanced text generation …

AMMUS: A survey of transformer-based pretrained models in natural language processing

KS Kalyan, A Rajasekharan, S Sangeetha - arXiv preprint arXiv …, 2021 - arxiv.org
Transformer-based pretrained language models (T-PTLMs) have achieved great success in
almost every NLP task. The evolution of these models started with GPT and BERT. These …

Pre-trained language models for text generation: A survey

J Li, T Tang, WX Zhao, JY Nie, JR Wen - ACM Computing Surveys, 2024 - dl.acm.org
Text Generation aims to produce plausible and readable text in human language from input
data. The resurgence of deep learning has greatly advanced this field, in particular, with the …

A survey of knowledge enhanced pre-trained language models

L Hu, Z Liu, Z Zhao, L Hou, L Nie… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-
supervised learning methods, have yielded promising performance on various tasks in …

A survey of knowledge-enhanced text generation

W Yu, C Zhu, Z Li, Z Hu, Q Wang, H Ji… - ACM Computing …, 2022 - dl.acm.org
The goal of text-to-text generation is to enable machines to express themselves like humans in many
applications such as conversation, summarization, and translation. It is one of the most …

The GEM benchmark: Natural language generation, its evaluation and metrics

S Gehrmann, T Adewumi, K Aggarwal… - arXiv preprint arXiv …, 2021 - arxiv.org
We introduce GEM, a living benchmark for Natural Language Generation (NLG), its
Evaluation, and Metrics. Measuring progress in NLG relies on a constantly evolving …

JGLUE: Japanese general language understanding evaluation

K Kurihara, D Kawahara, T Shibata - Proceedings of the Thirteenth …, 2022 - aclanthology.org
To develop high-performance natural language understanding (NLU) models, it is
necessary to have a benchmark to evaluate and analyze NLU ability from various …

AR-Diffusion: Auto-regressive diffusion model for text generation

T Wu, Z Fan, X Liu, HT Zheng, Y Gong… - Advances in …, 2023 - proceedings.neurips.cc
Diffusion models have gained significant attention in the realm of image generation due to
their exceptional performance. Their success has been recently expanded to text generation …

Compression, transduction, and creation: A unified framework for evaluating natural language generation

M Deng, B Tan, Z Liu, EP Xing, Z Hu - arXiv preprint arXiv:2109.06379, 2021 - arxiv.org
Natural language generation (NLG) spans a broad range of tasks, each of which serves
specific objectives and demands different properties of the generated text. The complexity makes …

ProphetNet-X: Large-scale pre-training models for English, Chinese, multi-lingual, dialog, and code generation

W Qi, Y Gong, Y Yan, C Xu, B Yao, B Zhou… - arXiv preprint arXiv …, 2021 - arxiv.org
Pre-training techniques are now ubiquitous in the natural language processing field.
ProphetNet is a pre-training-based natural language generation method which shows …