Survey of hallucination in natural language generation
Natural Language Generation (NLG) has improved exponentially in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …
Pre-trained language models for text generation: A survey
Text Generation aims to produce plausible and readable text in human language from input
data. The resurgence of deep learning has greatly advanced this field, in particular, with the …
Lift: Language-interfaced fine-tuning for non-language machine learning tasks
Fine-tuning pretrained language models (LMs) without making any architectural changes
has become a norm for learning various language downstream tasks. However, for non …
MultiHiertt: Numerical reasoning over multi hierarchical tabular and textual data
Numerical reasoning over hybrid data containing both textual and tabular content (e.g.,
financial reports) has recently attracted much attention in the NLP community. However …
Transformers for tabular data representation: A survey of models and applications
In the last few years, the natural language processing community has witnessed advances
in neural representations of free texts with transformer-based language models (LMs). Given …
Table pre-training: A survey on model architectures, pre-training objectives, and downstream tasks
Since a vast number of tables can be easily collected from web pages, spreadsheets, PDFs,
and various other document types, a flurry of table pre-training frameworks have been …
A systematic literature review on text generation using deep neural network models
In recent years, significant progress has been made in text generation. The latest text
generation models are revolutionizing the domain by generating human-like text. It has …
Hitab: A hierarchical table dataset for question answering and natural language generation
Tables are often created with hierarchies, but existing works on table reasoning mainly focus
on flat tables and neglect hierarchical tables. Hierarchical tables challenge existing methods …
PACIFIC: towards proactive conversational question answering over tabular and textual data in finance
To facilitate conversational question answering (CQA) over hybrid contexts in finance, we
present a new dataset, named PACIFIC. Compared with existing CQA datasets, PACIFIC …
PLOG: Table-to-logic pretraining for logical table-to-text generation
Logical table-to-text generation is a task that involves generating logically faithful sentences
from tables, which requires models to derive logical level facts from table records via logical …