Pre-trained language models and their applications

H Wang, J Li, H Wu, E Hovy, Y Sun - Engineering, 2023 - Elsevier
Pre-trained language models have achieved striking success in natural language
processing (NLP), leading to a paradigm shift from supervised learning to pre-training …

A survey of natural language generation

C Dong, Y Li, H Gong, M Chen, J Li, Y Shen… - ACM Computing …, 2022 - dl.acm.org
This article offers a comprehensive review of the research on Natural Language Generation
(NLG) over the past two decades, especially in relation to data-to-text generation and text-to …

Pre-trained models: Past, present and future

X Han, Z Zhang, N Ding, Y Gu, X Liu, Y Huo, J Qiu… - AI Open, 2021 - Elsevier
Large-scale pre-trained models (PTMs) such as BERT and GPT have recently achieved
great success and become a milestone in the field of artificial intelligence (AI). Owing to …

Pre-trained language models for text generation: A survey

J Li, T Tang, WX Zhao, JY Nie, JR Wen - ACM Computing Surveys, 2024 - dl.acm.org
Text Generation aims to produce plausible and readable text in human language from input
data. The resurgence of deep learning has greatly advanced this field, in particular, with the …

Recent advances in deep learning based dialogue systems: A systematic survey

J Ni, T Young, V Pandelea, F Xue… - Artificial intelligence review, 2023 - Springer
Dialogue systems are a popular natural language processing (NLP) task, as they are promising in
real-life applications. They are also complicated, since many NLP tasks deserving study are …

A simple language model for task-oriented dialogue

E Hosseini-Asl, B McCann, CS Wu… - Advances in Neural …, 2020 - proceedings.neurips.cc
Task-oriented dialogue is often decomposed into three tasks: understanding user input,
deciding actions, and generating a response. While such decomposition might suggest a …

Multi-task pre-training for plug-and-play task-oriented dialogue system

Y Su, L Shu, E Mansimov, A Gupta, D Cai… - arXiv preprint arXiv …, 2021 - arxiv.org
Pre-trained language models have been recently shown to benefit task-oriented dialogue
(TOD) systems. Despite their success, existing methods often formulate this task as a …

Galaxy: A generative pre-trained model for task-oriented dialog with semi-supervised learning and explicit policy injection

W He, Y Dai, Y Zheng, Y Wu, Z Cao, D Liu… - Proceedings of the …, 2022 - ojs.aaai.org
Pre-trained models have proved to be powerful in enhancing task-oriented dialog systems.
However, current pre-training methods mainly focus on enhancing dialog understanding …

Conversational agents: Goals, technologies, vision and challenges

M Allouch, A Azaria, R Azoulay - Sensors, 2021 - mdpi.com
In recent years, conversational agents (CAs) have become ubiquitous and are a presence in
our daily routines. It seems that the technology has finally ripened to advance the use of CAs …

TOD-BERT: Pre-trained natural language understanding for task-oriented dialogue

CS Wu, S Hoi, R Socher, C Xiong - arXiv preprint arXiv:2004.06871, 2020 - arxiv.org
The underlying difference in linguistic patterns between general text and task-oriented
dialogue makes existing pre-trained language models less useful in practice. In this work …