Survey of hallucination in natural language generation
Natural Language Generation (NLG) has improved exponentially in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …
A survey of evaluation metrics used for NLG systems
In the last few years, a large number of automatic evaluation metrics have been proposed for
evaluating Natural Language Generation (NLG) systems. The rapid development and …
Siren's song in the AI ocean: a survey on hallucination in large language models
While large language models (LLMs) have demonstrated remarkable capabilities across a
range of downstream tasks, a significant concern revolves around their propensity to exhibit …
PaLM: Scaling language modeling with pathways
Large language models have been shown to achieve remarkable performance across a
variety of natural language tasks using few-shot learning, which drastically reduces the …
Graph neural networks for natural language processing: A survey
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …
SPoT: Better frozen model adaptation through soft prompt transfer
There has been growing interest in parameter-efficient methods to apply pre-trained
language models to downstream tasks. Building on the Prompt Tuning approach of Lester et …
ExT5: Towards extreme multi-task scaling for transfer learning
Despite the recent success of multi-task learning and transfer learning for natural language
processing (NLP), few works have systematically studied the effect of scaling up the number …
TPLinker: Single-stage joint extraction of entities and relations through token pair linking
Extracting entities and relations from unstructured text has attracted increasing attention in
recent years but remains challenging, due to the intrinsic difficulty in identifying overlapping …
A novel cascade binary tagging framework for relational triple extraction
Extracting relational triples from unstructured text is crucial for large-scale knowledge graph
construction. However, few existing works excel in solving the overlapping triple problem …
OneRel: Joint entity and relation extraction with one module in one step
Joint entity and relation extraction is an essential task in natural language processing and
knowledge graph construction. Existing approaches usually decompose the joint extraction …