Towards reasoning in large language models: A survey
Reasoning is a fundamental aspect of human intelligence that plays a crucial role in
activities such as problem solving, decision making, and critical thinking. In recent years …
Metamath: Bootstrap your own mathematical questions for large language models
Large language models (LLMs) have pushed the limits of natural language understanding
and exhibited excellent problem-solving ability. Despite the great success, most existing …
Specializing smaller language models towards multi-step reasoning
The surprising ability of Large Language Models (LLMs) to perform well on complex
reasoning with only few-shot chain-of-thought prompts is believed to emerge only in very …
Reasoning with language model prompting: A survey
Reasoning, as an essential ability for complex problem-solving, can provide back-end
support for various real-world applications, such as medical diagnosis, negotiation, etc. This …
Large language models are reasoning teachers
Recent works have shown that chain-of-thought (CoT) prompting can elicit language models
to solve complex reasoning tasks, step-by-step. However, prompt-based CoT methods are …
Teaching small language models to reason
Chain of thought prompting successfully improves the reasoning capabilities of large
language models, achieving state-of-the-art results on a range of datasets. However, these …
A survey on model compression for large language models
Large Language Models (LLMs) have revolutionized natural language processing tasks with
remarkable success. However, their formidable size and computational demands present …
A survey on transformer compression
Large models based on the Transformer architecture play increasingly vital roles in artificial
intelligence, particularly within the realms of natural language processing (NLP) and …
Distilling reasoning capabilities into smaller language models
Step-by-step reasoning approaches like chain of thought (CoT) have proved to be very
effective in inducing reasoning capabilities in large language models. However, the success …
Symbolic chain-of-thought distillation: Small models can also "think" step-by-step
Chain-of-thought prompting (e.g., "Let's think step-by-step") primes large language models to
verbalize rationalization for their predictions. While chain-of-thought can lead to dramatic …