Larger language models do in-context learning differently
We study how in-context learning (ICL) in language models is affected by semantic priors
versus input-label mappings. We investigate two setups: ICL with flipped labels and ICL with …
Testing the general deductive reasoning capacity of large language models using OOD examples
Given the intractably large size of the space of proofs, any model that is capable of general
deductive reasoning must generalize to proofs of greater complexity. Recent studies have …
In-context learning with iterative demonstration selection
Spurred by advancements in scale, large language models (LLMs) have demonstrated
strong few-shot learning ability via in-context learning (ICL). However, the performance of …
Open-Ethical AI: Advancements in Open-Source Human-Centric Neural Language Models
This survey summarizes the most recent methods for building and assessing helpful, honest,
and harmless neural language models, considering small, medium, and large-size models …
Instruct me more! Random prompting for visual in-context learning
Large-scale models trained on extensive datasets have emerged as the preferred approach
due to their high generalizability across various tasks. In-context learning (ICL), a popular …
Skill-based few-shot selection for in-context learning
In-context learning is the paradigm that adapts large language models to downstream tasks
by providing a few examples. Few-shot selection, i.e., selecting appropriate examples for each …
Magnifico: Evaluating the in-context learning ability of large language models to generalize to novel interpretations
Humans possess a remarkable ability to assign novel interpretations to linguistic
expressions, enabling them to learn new words and understand community-specific …
Towards General Industrial Intelligence: A Survey on IIoT-Enhanced Continual Large Models
Currently, most applications in the Industrial Internet of Things (IIoT) still rely on CNN-based
neural networks. Although Transformer-based large models (LMs), including language …
Do large language models have compositional ability? an investigation into limitations and scalability
Large language models (LLMs) have emerged as a powerful tool exhibiting remarkable in-
context learning (ICL) capabilities. In this study, we delve into the ICL capabilities of LLMs on …
Leveraging code to improve in-context learning for semantic parsing
In-context learning (ICL) is an appealing approach for semantic parsing due to its few-shot
nature and improved generalization. However, learning to parse to rare domain-specific …