Survey of hallucination in natural language generation

Z Ji, N Lee, R Frieske, T Yu, D Su, Y Xu, E Ishii… - ACM Computing …, 2023 - dl.acm.org
Natural Language Generation (NLG) has improved exponentially in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …

Pre-trained language models for text generation: A survey

J Li, T Tang, WX Zhao, JY Nie, JR Wen - ACM Computing Surveys, 2024 - dl.acm.org
Text Generation aims to produce plausible and readable text in human language from input
data. The resurgence of deep learning has greatly advanced this field, in particular, with the …

LIFT: Language-interfaced fine-tuning for non-language machine learning tasks

T Dinh, Y Zeng, R Zhang, Z Lin… - Advances in …, 2022 - proceedings.neurips.cc
Fine-tuning pretrained language models (LMs) without making any architectural changes
has become a norm for learning various language downstream tasks. However, for non …

MultiHiertt: Numerical reasoning over multi hierarchical tabular and textual data

Y Zhao, Y Li, C Li, R Zhang - arXiv preprint arXiv:2206.01347, 2022 - arxiv.org
Numerical reasoning over hybrid data containing both textual and tabular content (e.g.,
financial reports) has recently attracted much attention in the NLP community. However …

Transformers for tabular data representation: A survey of models and applications

G Badaro, M Saeed, P Papotti - Transactions of the Association for …, 2023 - direct.mit.edu
In the last few years, the natural language processing community has witnessed advances
in neural representations of free texts with transformer-based language models (LMs). Given …

Table pre-training: A survey on model architectures, pre-training objectives, and downstream tasks

H Dong, Z Cheng, X He, M Zhou, A Zhou… - arXiv preprint arXiv …, 2022 - arxiv.org
Since a vast number of tables can be easily collected from web pages, spreadsheets, PDFs,
and various other document types, a flurry of table pre-training frameworks have been …

A systematic literature review on text generation using deep neural network models

N Fatima, AS Imran, Z Kastrati, SM Daudpota… - IEEE …, 2022 - ieeexplore.ieee.org
In recent years, significant progress has been made in text generation. The latest text
generation models are revolutionizing the domain by generating human-like text. It has …

HiTab: A hierarchical table dataset for question answering and natural language generation

Z Cheng, H Dong, Z Wang, R Jia, J Guo, Y Gao… - arXiv preprint arXiv …, 2021 - arxiv.org
Tables are often created with hierarchies, but existing works on table reasoning mainly focus
on flat tables and neglect hierarchical tables. Hierarchical tables challenge existing methods …

PACIFIC: Towards proactive conversational question answering over tabular and textual data in finance

Y Deng, W Lei, W Zhang, W Lam, TS Chua - arXiv preprint arXiv …, 2022 - arxiv.org
To facilitate conversational question answering (CQA) over hybrid contexts in finance, we
present a new dataset, named PACIFIC. Compared with existing CQA datasets, PACIFIC …

PLOG: Table-to-logic pretraining for logical table-to-text generation

A Liu, H Dong, N Okazaki, S Han, D Zhang - arXiv preprint arXiv …, 2022 - arxiv.org
Logical table-to-text generation is a task that involves generating logically faithful sentences
from tables, which requires models to derive logical-level facts from table records via logical …