The WebNLG challenge: Generating text from RDF data
C Gardent, A Shimorina, S Narayan… - 10th International …, 2017 - research.ed.ac.uk
The WebNLG challenge consists in mapping sets of RDF triples to text. It provides a
common benchmark on which to train, evaluate and compare “microplanners”, i.e. generation …
The E2E dataset: New challenges for end-to-end generation
This paper describes the E2E data, a new dataset for training end-to-end, data-driven
natural language generation systems in the restaurant domain, which is ten times bigger …
The gem benchmark: Natural language generation, its evaluation and metrics
We introduce GEM, a living benchmark for natural language Generation (NLG), its
Evaluation, and Metrics. Measuring progress in NLG relies on a constantly evolving …
Evaluating the state-of-the-art of end-to-end natural language generation: The E2E NLG challenge
This paper provides a comprehensive analysis of the first shared task on End-to-End Natural
Language Generation (NLG) and identifies avenues for future research based on the results …
A survey on neural data-to-text generation
Data-to-text Generation (D2T) aims to generate textual natural language statements that can
fluently and precisely describe structured data such as graphs, tables, and meaning …
Controlling hallucinations at word level in data-to-text generation
Abstract Data-to-Text Generation (DTG) is a subfield of Natural Language Generation
aiming at transcribing structured data into natural language descriptions. The field has been …
Controlled hallucinations: Learning to generate faithfully from noisy data
K Filippova - arXiv preprint arXiv:2010.05873, 2020 - arxiv.org
Neural text generation (data- or text-to-text) demonstrates remarkable performance when
training data is abundant, which for many applications is not the case. To collect a large …
A simple recipe towards reducing hallucination in neural surface realisation
Recent neural language generation systems often hallucinate content (i.e., produce
irrelevant or contradicted facts), especially when trained on loosely corresponding pairs of …
Operations guided neural networks for high fidelity data-to-text generation
Recent neural models for data-to-text generation are mostly based on data-driven end-to-end
training over encoder-decoder networks. Even though the generated texts are mostly …
How do seq2seq models perform on end-to-end data-to-text generation?
With the rapid development of deep learning, the Seq2Seq paradigm has become prevalent for
end-to-end data-to-text generation, and BLEU scores have been increasing in recent …