Compositional semantic parsing with large language models
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …
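The snippet cuts off, but as a rough illustration of the decomposition-based prompting this entry alludes to (in the spirit of least-to-most prompting; the prompt wording and two-stage structure below are assumptions for the sketch, not the paper's exact prompts, and call_llm is a placeholder for any text-completion function):

    from typing import Callable, List

    def decompose(question: str, call_llm: Callable[[str], str]) -> List[str]:
        """Ask the model to split a question into simpler sub-questions."""
        prompt = (
            "Decompose the question into a numbered list of simpler sub-questions.\n"
            f"Question: {question}\nSub-questions:"
        )
        reply = call_llm(prompt)
        return [line.split(".", 1)[1].strip()
                for line in reply.splitlines() if "." in line]

    def parse_compositionally(question: str, call_llm: Callable[[str], str]) -> str:
        """Solve sub-questions in order, feeding earlier answers back as context."""
        context = ""
        for sub in decompose(question, call_llm):
            answer = call_llm(f"{context}Q: {sub}\nA:")
            context += f"Q: {sub}\nA: {answer}\n"
        # Final pass maps the accumulated reasoning to a target parse.
        return call_llm(f"{context}Translate the answer into a logical form:")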
How to reuse and compose knowledge for a lifetime of tasks: A survey on continual learning and functional composition
A major goal of artificial intelligence (AI) is to create an agent capable of acquiring a general
understanding of the world. Such an agent would require the ability to continually …
The devil is in the detail: Simple tricks improve systematic generalization of transformers
Recently, many datasets have been proposed to test the systematic generalization ability of
neural networks. The companion baseline Transformers, typically trained with default hyper …
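For flavor, the kinds of "simple tricks" this paper discusses include swapping in relative positional encodings, scaling embeddings carefully, and not early-stopping on an IID validation set; the concrete values below are illustrative assumptions, not the paper's verbatim settings:

    # Illustrative configuration, not the published one.
    transformer_config = {
        "positional_encoding": "relative",   # instead of absolute
        "embedding_scaling": "noam",         # scale token embeddings before adding PE
        "early_stopping_on_iid_dev": False,  # IID dev accuracy is misleading here
        "layers": 6,
        "d_model": 512,
    }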
Complex knowledge base question answering: A survey
Knowledge base question answering (KBQA) aims to answer a question over a knowledge
base (KB). Early studies mainly focused on answering simple questions over KBs and …
Unlocking compositional generalization in pre-trained models using intermediate representations
Sequence-to-sequence (seq2seq) models are prevalent in semantic parsing, but have been
found to struggle at out-of-distribution compositional generalization. While specialized …
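A toy illustration of the intermediate-representation idea: rather than training utterance-to-program directly, train the model to emit an intermediate form closer to natural language, then expand it deterministically. The mini-grammar here is invented for the example:

    INTERMEDIATE_TO_SQL = {
        "count river": "SELECT COUNT(*) FROM river",
        "longest river": "SELECT name FROM river ORDER BY length DESC LIMIT 1",
    }

    def expand(intermediate: str) -> str:
        """Deterministic post-processing from intermediate form to target program."""
        return INTERMEDIATE_TO_SQL[intermediate]

    # A seq2seq model is trained to predict the *intermediate* string:
    assert expand("longest river") == "SELECT name FROM river ORDER BY length DESC LIMIT 1"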
Systematic generalization with edge transformers
Recent research suggests that systematic generalization in natural language understanding
remains a challenge for state-of-the-art neural models such as Transformers and Graph …
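A minimal numpy sketch of the "triangular" update at the heart of the Edge Transformer: the state is an n-by-n grid of edge embeddings, and edge (i, j) is updated by attending over intermediate nodes k while composing e[i, k] with e[k, j]. Single head, no learned projections or residuals; the elementwise composition is a simplification of the published attention rule:

    import numpy as np

    def triangular_update(e: np.ndarray) -> np.ndarray:
        n, _, d = e.shape                      # e[i, j] is a d-dim edge embedding
        scores = np.einsum("ikd,kjd->ijk", e, e) / np.sqrt(d)
        att = np.exp(scores - scores.max(-1, keepdims=True))
        att /= att.sum(-1, keepdims=True)      # softmax over intermediate node k
        values = np.einsum("ikd,kjd->ijkd", e, e)  # compose the two edge legs
        return np.einsum("ijk,ijkd->ijd", att, values)

    e = np.random.randn(5, 5, 8)
    print(triangular_update(e).shape)          # (5, 5, 8)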
Consistency regularization training for compositional generalization
Existing neural models have difficulty generalizing to unseen combinations of seen
components. To achieve compositional generalization, models are required to consistently …
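A generic sketch of consistency regularization, not necessarily this paper's exact formulation: run the model on two perturbed views of the same input and penalize divergence between the two output distributions, added to the usual supervised loss:

    import torch
    import torch.nn.functional as F

    def consistency_loss(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
        """Symmetric KL between two predicted distributions over output tokens."""
        p, q = F.log_softmax(logits_a, -1), F.log_softmax(logits_b, -1)
        return 0.5 * (F.kl_div(p, q, log_target=True, reduction="batchmean")
                      + F.kl_div(q, p, log_target=True, reduction="batchmean"))

    # total_loss = cross_entropy + lambda_consistency * consistency_loss(a, b)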
Disentangled sequence to sequence learning for compositional generalization
There is mounting evidence that existing neural network models, in particular the very
popular sequence-to-sequence architecture, struggle to systematically generalize to unseen …
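One way to read the disentangling idea, sketched under the assumption that the key move is adaptive re-encoding (the encoder and decoder_step callables are placeholders): instead of encoding the source once, re-encode source plus generated prefix before every prediction step, so source representations are not entangled with a fixed decoding context:

    def disentangled_decode(encoder, decoder_step, src_tokens, max_len=50):
        """Re-encode (source + generated prefix) before each step."""
        prefix = []
        for _ in range(max_len):
            states = encoder(src_tokens + prefix)  # fresh encoding every step
            token = decoder_step(states)
            if token == "<eos>":
                break
            prefix.append(token)
        return prefix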
Finding needles in a haystack: Sampling structurally-diverse training sets from synthetic data for compositional generalization
Modern semantic parsers suffer from two principal limitations. First, training requires
expensive collection of utterance-program pairs. Second, semantic parsers fail to generalize …
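A sketch of one plausible structural-diversity heuristic: anonymize each synthetic program into a template by masking literals, then greedily keep examples whose templates are not yet covered. The anonymization rule is a stand-in, not the paper's exact procedure:

    import re
    from typing import List, Tuple

    def template(program: str) -> str:
        """Mask quoted literals and numbers so only structure remains."""
        return re.sub(r"'[^']*'|\d+", "<v>", program)

    def diverse_sample(pool: List[Tuple[str, str]], k: int) -> List[Tuple[str, str]]:
        seen, chosen = set(), []
        for utterance, program in pool:   # synthetic (utterance, program) pairs
            t = template(program)
            if t not in seen:
                seen.add(t)
                chosen.append((utterance, program))
            if len(chosen) == k:
                break
        return chosen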
Revisiting iterative back-translation from the perspective of compositional generalization
Human intelligence exhibits compositional generalization (i.e., the capacity to understand
and produce unseen combinations of seen components), but current neural seq2seq …
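A high-level sketch of the iterative back-translation loop for semantic parsing: a parser (utterance to program) and a generator (program to utterance) take turns pseudo-labelling each other's unpaired data. The object methods here are placeholders for whatever training API is in use:

    def iterative_back_translation(parser, generator, utterances, programs, rounds=3):
        for _ in range(rounds):
            # Parser pseudo-labels unpaired utterances; the resulting
            # (pseudo-program, utterance) pairs supervise the generator.
            generator.train_on([(parser.predict(u), u) for u in utterances])
            # Generator back-translates unpaired programs; the resulting
            # (pseudo-utterance, program) pairs supervise the parser.
            parser.train_on([(generator.predict(p), p) for p in programs])
        return parser, generator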