The WebNLG challenge: Generating text from RDF data

C Gardent, A Shimorina, S Narayan… - 10th International …, 2017 - research.ed.ac.uk
The WebNLG challenge consists in mapping sets of RDF triples to text. It provides a
common benchmark on which to train, evaluate and compare “microplanners”, i.e. generation …
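To make the triple-to-text setup concrete, here is a minimal sketch of the task format: a set of RDF (subject, predicate, object) triples verbalised by naive per-predicate templates. The entities and templates are invented for illustration, not taken from the dataset, and a real microplanner would jointly aggregate, order, and lexicalise the triples rather than emit one sentence each.

```python
# Toy illustration of the WebNLG-style task: verbalise a set of
# RDF (subject, predicate, object) triples as a short text.
# All triples and templates here are made up for illustration.

triples = [
    ("John_E_Blaha", "birthPlace", "San_Antonio"),
    ("John_E_Blaha", "occupation", "Fighter_pilot"),
]

# One naive template per predicate (hypothetical; a real system
# learns or plans these jointly instead of using fixed strings).
templates = {
    "birthPlace": "{s} was born in {o}.",
    "occupation": "{s} worked as a {o}.",
}

def clean(entity: str) -> str:
    # WebNLG-style entity names use underscores between words.
    return entity.replace("_", " ")

def verbalise(triples):
    # Map each triple through its predicate's template and join.
    return " ".join(
        templates[p].format(s=clean(s), o=clean(o)) for s, p, o in triples
    )

print(verbalise(triples))
# → John E Blaha was born in San Antonio. John E Blaha worked as a Fighter pilot.
```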

The E2E dataset: New challenges for end-to-end generation

J Novikova, O Dušek, V Rieser - arXiv preprint arXiv:1706.09254, 2017 - arxiv.org
This paper describes the E2E data, a new dataset for training end-to-end, data-driven
natural language generation systems in the restaurant domain, which is ten times bigger …
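For orientation, the E2E inputs are flat meaning representations of attribute[value] pairs in the restaurant domain. The sketch below parses one such MR string into a dict; the MR mirrors the dataset's surface format, but the parser itself is only an illustration, not code from the challenge.

```python
# Toy sketch of the E2E-style input format: a flat meaning
# representation (MR) of attribute[value] pairs.

def parse_mr(mr: str) -> dict:
    """Parse 'attr1[val1], attr2[val2], ...' into a dict."""
    pairs = {}
    for chunk in mr.split("], "):
        attr, _, value = chunk.partition("[")
        pairs[attr.strip()] = value.rstrip("]")
    return pairs

mr = "name[The Eagle], eatType[coffee shop], food[French], area[riverside]"
print(parse_mr(mr))
# → {'name': 'The Eagle', 'eatType': 'coffee shop', 'food': 'French', 'area': 'riverside'}
```

A generation system would then map such a dict to a restaurant description ("The Eagle is a French coffee shop by the riverside…"), which is the end-to-end task the dataset benchmarks.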

The GEM benchmark: Natural language generation, its evaluation and metrics

S Gehrmann, T Adewumi, K Aggarwal… - arXiv preprint arXiv …, 2021 - arxiv.org
We introduce GEM, a living benchmark for natural language Generation (NLG), its
Evaluation, and Metrics. Measuring progress in NLG relies on a constantly evolving …

Evaluating the state-of-the-art of end-to-end natural language generation: The E2E NLG challenge

O Dušek, J Novikova, V Rieser - Computer Speech & Language, 2020 - Elsevier
This paper provides a comprehensive analysis of the first shared task on End-to-End Natural
Language Generation (NLG) and identifies avenues for future research based on the results …

A survey on neural data-to-text generation

Y Lin, T Ruan, J Liu, H Wang - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Data-to-text Generation (D2T) aims to generate textual natural language statements that can
fluently and precisely describe the structured data such as graphs, tables, and meaning …

Controlling hallucinations at word level in data-to-text generation

C Rebuffel, M Roberti, L Soulier, G Scoutheeten… - Data Mining and …, 2022 - Springer
Data-to-Text Generation (DTG) is a subfield of Natural Language Generation
aiming at transcribing structured data into natural language descriptions. The field has been …

Controlled hallucinations: Learning to generate faithfully from noisy data

K Filippova - arXiv preprint arXiv:2010.05873, 2020 - arxiv.org
Neural text generation (data- or text-to-text) demonstrates remarkable performance when
training data is abundant, which for many applications is not the case. To collect a large …

A simple recipe towards reducing hallucination in neural surface realisation

F Nie, JG Yao, J Wang, R Pan… - Proceedings of the 57th …, 2019 - aclanthology.org
Recent neural language generation systems often hallucinate contents (i.e., producing
irrelevant or contradicted facts), especially when trained on loosely corresponding pairs of …

Operations guided neural networks for high fidelity data-to-text generation

F Nie, J Wang, JG Yao, R Pan, CY Lin - arXiv preprint arXiv:1809.02735, 2018 - arxiv.org
Recent neural models for data-to-text generation are mostly based on data-driven
end-to-end training over encoder-decoder networks. Even though the generated texts are mostly …

How do seq2seq models perform on end-to-end data-to-text generation?

X Yin, X Wan - Proceedings of the 60th Annual Meeting of the …, 2022 - aclanthology.org
With the rapid development of deep learning, the Seq2Seq paradigm has become prevalent for
end-to-end data-to-text generation, and BLEU scores have been increasing in recent …