Reasoning with language model prompting: A survey

S Qiao, Y Ou, N Zhang, X Chen, Y Yao, S Deng… - arXiv preprint arXiv …, 2022 - arxiv.org
Reasoning, as an essential ability for complex problem-solving, can provide back-end
support for various real-world applications, such as medical diagnosis, negotiation, etc. This …

Thought propagation: An analogical approach to complex reasoning with large language models

J Yu, R He, R Ying - arXiv preprint arXiv:2310.03965, 2023 - arxiv.org
Large Language Models (LLMs) have achieved remarkable success in reasoning tasks with
the development of prompting methods. However, existing prompting approaches cannot …

Say what you mean! large language models speak too positively about negative commonsense knowledge

J Chen, W Shi, Z Fu, S Cheng, L Li, Y Xiao - arXiv preprint arXiv …, 2023 - arxiv.org
Large language models (LLMs) have been widely studied for their ability to store and utilize
positive knowledge. However, negative knowledge, such as "lions don't live in the ocean", is …

Open-Ethical AI: Advancements in Open-Source Human-Centric Neural Language Models

S Sicari, JF Cevallos M, A Rizzardi… - ACM Computing …, 2024 - dl.acm.org
This survey summarises the most recent methods for building and assessing helpful, honest,
and harmless neural language models, considering small, medium, and large-size models …

In-context analogical reasoning with pre-trained language models

X Hu, S Storks, RL Lewis, J Chai - arXiv preprint arXiv:2305.17626, 2023 - arxiv.org
Analogical reasoning is a fundamental capacity of human cognition that allows us to reason
abstractly about novel situations by relating them to past experiences. While it is thought to …

BertNet: Harvesting knowledge graphs with arbitrary relations from pretrained language models

S Hao, B Tan, K Tang, B Ni, X Shao, H Zhang… - arXiv preprint arXiv …, 2022 - arxiv.org
It is crucial to automatically construct knowledge graphs (KGs) of diverse new relations to
support knowledge discovery and broad applications. Previous KG construction methods …

AnalogyKB: Unlocking analogical reasoning of language models with a million-scale knowledge base

S Yuan, J Chen, C Sun, J Liang, Y Xiao… - arXiv preprint arXiv …, 2023 - arxiv.org
Analogical reasoning is a fundamental cognitive ability of humans. However, current
language models (LMs) still struggle to achieve human-like performance in analogical …

This is the way: designing and compiling LEPISZCZE, a comprehensive NLP benchmark for Polish

L Augustyniak, K Tagowski… - Advances in …, 2022 - proceedings.neurips.cc
The availability of compute and data to train larger and larger language models increases
the demand for robust methods of benchmarking the true progress of LM training. Recent …

Harnessing Knowledge and Reasoning for Human-Like Natural Language Generation: A Brief Review

J Chen, Y Xiao - arXiv preprint arXiv:2212.03747, 2022 - arxiv.org
The rapid development and application of natural language generation (NLG) techniques
have revolutionized the field of automatic text production. However, these techniques are still …

Buffer of Thoughts: Thought-Augmented Reasoning with Large Language Models

L Yang, Z Yu, T Zhang, S Cao, M Xu, W Zhang… - arXiv preprint arXiv …, 2024 - arxiv.org
We introduce Buffer of Thoughts (BoT), a novel and versatile thought-augmented reasoning
approach for enhancing accuracy, efficiency and robustness of large language models …