A survey of controllable text generation using transformer-based pre-trained language models

H Zhang, H Song, S Li, M Zhou, D Song - ACM Computing Surveys, 2023 - dl.acm.org
Controllable Text Generation (CTG) is an emerging area in the field of natural language
generation (NLG). It is regarded as crucial for the development of advanced text generation …

A survey of knowledge-enhanced text generation

W Yu, C Zhu, Z Li, Z Hu, Q Wang, H Ji… - ACM Computing …, 2022 - dl.acm.org
The goal of text-to-text generation is to enable machines to express themselves like humans in many
applications such as conversation, summarization, and translation. It is one of the most …

A survey of the usages of deep learning for natural language processing

DW Otter, JR Medina, JK Kalita - IEEE transactions on neural …, 2020 - ieeexplore.ieee.org
Over the last several years, the field of natural language processing has been propelled
forward by an explosion in the use of deep learning models. This article provides a brief …

Topic modelling meets deep neural networks: A survey

H Zhao, D Phung, V Huynh, Y Jin, L Du… - arXiv preprint arXiv …, 2021 - arxiv.org
Topic modelling has been a successful technique for text analysis for almost twenty years.
When topic modelling met deep neural networks, there emerged a new and increasingly …

The survey: Text generation models in deep learning

T Iqbal, S Qureshi - Journal of King Saud University-Computer and …, 2022 - Elsevier
Deep learning methods use many processing layers to learn stratified representations of
data and have achieved state-of-the-art results in several domains. Recently …

Generative AI-driven semantic communication networks: Architecture, technologies and applications

C Liang, H Du, Y Sun, D Niyato, J Kang… - IEEE Transactions …, 2024 - ieeexplore.ieee.org
Generative artificial intelligence (GAI) has emerged as a burgeoning field with significant
potential for creating diverse content intelligently and automatically …

ERNIE-GEN: An enhanced multi-flow pre-training and fine-tuning framework for natural language generation

D Xiao, H Zhang, Y Li, Y Sun, H Tian, H Wu… - arXiv preprint arXiv …, 2020 - arxiv.org
Current pre-training work in natural language generation pays little attention to the problem
of exposure bias on downstream tasks. To address this issue, we propose an enhanced …
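
The exposure bias mentioned here is the train/test mismatch of autoregressive decoders: during training they condition on gold tokens, while at inference they condition on their own, possibly erroneous, predictions. ERNIE-GEN's multi-flow mechanism is not reproduced here; the sketch below only illustrates the mismatch using scheduled sampling, a generic mitigation in which gold inputs are sometimes replaced by the model's own predictions during training. The toy decoder and all names are illustrative.

```python
import random
import torch
import torch.nn as nn

# Toy autoregressive decoder: embedding -> GRU cell -> vocabulary logits.
VOCAB, HIDDEN = 50, 32
embed = nn.Embedding(VOCAB, HIDDEN)
rnn = nn.GRUCell(HIDDEN, HIDDEN)
out = nn.Linear(HIDDEN, VOCAB)
loss_fn = nn.CrossEntropyLoss()

def decode_step(token, h):
    h = rnn(embed(token), h)
    return out(h), h

def train_step(target, sampling_prob):
    """One scheduled-sampling step on a single target sequence (1-D LongTensor).

    With probability `sampling_prob`, the decoder is fed its own previous
    prediction instead of the gold token, exposing it during training to the
    kind of errors it will make at inference time.
    """
    h = torch.zeros(1, HIDDEN)
    inp = target[:1]                      # start from the first gold token
    loss = 0.0
    for t in range(1, len(target)):
        logits, h = decode_step(inp, h)
        loss = loss + loss_fn(logits, target[t:t + 1])
        pred = logits.argmax(dim=-1)
        # Scheduled sampling: sometimes condition on the model's own output.
        inp = pred.detach() if random.random() < sampling_prob else target[t:t + 1]
    return loss / (len(target) - 1)

# Example: a random "sentence" of 10 token ids, fed back 25% of the time.
loss = train_step(torch.randint(0, VOCAB, (10,)), sampling_prob=0.25)
loss.backward()
```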

Exploring chemical space using natural language processing methodologies for drug discovery

H Öztürk, A Özgür, P Schwaller, T Laino… - Drug Discovery Today, 2020 - Elsevier
Highlights: • Biochemical data can be represented with text-based languages codified by
humans. • Natural language processing (NLP) can be applied to textual biochemical …
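
The "text-based languages codified by humans" referred to here include line notations such as SMILES, which encode molecules as character strings and can therefore be processed with ordinary NLP tooling. A minimal sketch of treating SMILES as text follows; the tokenization pattern and the choice of molecule are illustrative, not taken from the paper.

```python
import re

# Simple SMILES tokenizer: bracketed atoms and two-letter element symbols
# (Cl, Br) must stay intact, so a regex is used instead of splitting the
# string into individual characters.
SMILES_TOKEN = re.compile(r"\[[^\]]+\]|Br|Cl|[A-Za-z]|\d|[=#()+\-@/\\%]")

def tokenize_smiles(smiles: str) -> list[str]:
    """Split a SMILES string into NLP-style tokens."""
    return SMILES_TOKEN.findall(smiles)

aspirin = "CC(=O)Oc1ccccc1C(=O)O"         # SMILES for acetylsalicylic acid
tokens = tokenize_smiles(aspirin)
vocab = {tok: i for i, tok in enumerate(sorted(set(tokens)))}
ids = [vocab[t] for t in tokens]          # integer ids, ready for a language model
print(tokens)
print(ids)
```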

Text-based interactive recommendation via constraint-augmented reinforcement learning

R Zhang, T Yu, Y Shen, H Jin… - Advances in neural …, 2019 - proceedings.neurips.cc
Text-based interactive recommendation provides richer user preferences and has
demonstrated advantages over traditional interactive recommender systems. However …

ControlVAE: Controllable variational autoencoder

H Shao, S Yao, D Sun, A Zhang, S Liu… - International …, 2020 - proceedings.mlr.press
Variational Autoencoders (VAEs) and their variants have been widely used in a variety of
applications, such as dialog generation, image generation, and disentangled representation …
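
ControlVAE's central idea is to treat the weight on the KL term of the VAE objective as the output of a feedback controller that tracks a user-specified target KL value, instead of fixing it by hand. Below is a minimal sketch of such a PI-style controller; the gains, bounds, and exact update rule are illustrative assumptions rather than the paper's precise formulation.

```python
import math

class KLWeightController:
    """PI-style feedback controller that adjusts the KL weight (beta) so the
    observed KL divergence tracks a target value, in the spirit of ControlVAE.
    Gains and bounds are illustrative, not the paper's exact settings."""

    def __init__(self, target_kl, kp=0.01, ki=0.0001, beta_min=0.0, beta_max=1.0):
        self.target_kl = target_kl
        self.kp, self.ki = kp, ki
        self.beta_min, self.beta_max = beta_min, beta_max
        self.integral = 0.0

    def step(self, observed_kl):
        # Positive error => KL below target => lower beta so the KL can grow.
        error = self.target_kl - observed_kl
        p_term = self.kp / (1.0 + math.exp(error))   # bounded, nonlinear P term
        self.integral -= self.ki * error
        beta = p_term + self.integral
        return min(max(beta, self.beta_min), self.beta_max)

# Usage inside a training loop (kl_value would come from the VAE's posterior):
controller = KLWeightController(target_kl=3.0)
for kl_value in [0.5, 1.2, 2.4, 3.1, 3.0]:           # stand-in KL readings
    beta = controller.step(kl_value)
    # total_loss = reconstruction_loss + beta * kl_value
    print(f"KL={kl_value:.1f} -> beta={beta:.4f}")
```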