Survey of hallucination in natural language generation
Natural Language Generation (NLG) has improved exponentially in recent years thanks to
the development of sequence-to-sequence deep learning technologies such as Transformer …
A survey of deep learning for mathematical reasoning
Mathematical reasoning is a fundamental aspect of human intelligence and is applicable in
various fields, including science, engineering, finance, and everyday life. The development …
Measuring and improving consistency in pretrained language models
Consistency of a model—that is, the invariance of its behavior under meaning-preserving
alternations in its input—is a highly desirable property in natural language processing. In …
Reasoning or reciting? Exploring the capabilities and limitations of language models through counterfactual tasks
The impressive performance of recent language models across a wide range of tasks
suggests that they possess a degree of abstract reasoning skills. Are these skills general …
LIFT: Language-interfaced fine-tuning for non-language machine learning tasks
Fine-tuning pretrained language models (LMs) without making any architectural changes
has become a norm for learning various language downstream tasks. However, for non …
mGPT: Few-shot learners go multilingual
Recent studies report that autoregressive language models can successfully solve many
NLP tasks via zero- and few-shot learning paradigms, which opens up new possibilities for …
Language models: past, present, and future
H Li - Communications of the ACM, 2022 - dl.acm.org
Natural language processing (NLP) …
Representing numbers in NLP: a survey and a vision
NLP systems rarely give special consideration to numbers found in text. This starkly
contrasts with the consensus in neuroscience that, in the brain, numbers are represented …
GeoLLM: Extracting geospatial knowledge from large language models
The application of machine learning (ML) in a range of geospatial tasks is increasingly
common but often relies on globally available covariates such as satellite imagery that can …
FiNER: Financial numeric entity recognition for XBRL tagging
Publicly traded companies are required to submit periodic reports with eXtensible Business
Reporting Language (XBRL) word-level tags. Manually tagging the reports is tedious and …