Human-like systematic generalization through a meta-learning neural network
The power of human language and thought arises from systematic compositionality—the
algebraic ability to understand and produce novel combinations from known components …
Least-to-most prompting enables complex reasoning in large language models
Chain-of-thought prompting has demonstrated remarkable performance on various natural
language reasoning tasks. However, it tends to perform poorly on tasks which require …
Compositional semantic parsing with large language models
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …
Randomized positional encodings boost length generalization of transformers
Transformers have impressive generalization capabilities on tasks with a fixed context
length. However, they fail to generalize to sequences of arbitrary length, even for seemingly …
How to reuse and compose knowledge for a lifetime of tasks: A survey on continual learning and functional composition
A major goal of artificial intelligence (AI) is to create an agent capable of acquiring a general
understanding of the world. Such an agent would require the ability to continually …
Compositional generalization and natural language variation: Can a semantic parsing approach handle both?
Sequence-to-sequence models excel at handling natural language variation, but have been
shown to struggle with out-of-distribution compositional generalization. This has motivated …
The devil is in the detail: Simple tricks improve systematic generalization of transformers
Recently, many datasets have been proposed to test the systematic generalization ability of
neural networks. The companion baseline Transformers, typically trained with default hyper …
Improving compositional generalization with latent structure and data augmentation
Generic unstructured neural networks have been shown to struggle on out-of-distribution
compositional generalization. Compositional data augmentation via example recombination …
Span-based semantic parsing for compositional generalization
Despite the success of sequence-to-sequence (seq2seq) models in semantic parsing,
recent work has shown that they fail in compositional generalization, i.e., the ability to …
Making transformers solve compositional tasks
Several studies have reported the inability of Transformer models to generalize
compositionally, a key type of generalization in many NLP tasks such as semantic parsing …