Compositional semantic parsing with large language models

A Drozdov, N Schärli, E Akyürek, N Scales… - The Eleventh …, 2022 - openreview.net
Humans can reason compositionally when presented with new tasks. Previous research
shows that appropriate prompting techniques enable large language models (LLMs) to …

How to reuse and compose knowledge for a lifetime of tasks: A survey on continual learning and functional composition

JA Mendez, E Eaton - arXiv preprint arXiv:2207.07730, 2022 - arxiv.org
A major goal of artificial intelligence (AI) is to create an agent capable of acquiring a general
understanding of the world. Such an agent would require the ability to continually …

The devil is in the detail: Simple tricks improve systematic generalization of transformers

R Csordás, K Irie, J Schmidhuber - arXiv preprint arXiv:2108.12284, 2021 - arxiv.org
Recently, many datasets have been proposed to test the systematic generalization ability of
neural networks. The companion baseline Transformers, typically trained with default hyper …

Complex knowledge base question answering: A survey

Y Lan, G He, J Jiang, J Jiang… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Knowledge base question answering (KBQA) aims to answer a question over a knowledge
base (KB). Early studies mainly focused on answering simple questions over KBs and …

Unlocking compositional generalization in pre-trained models using intermediate representations

J Herzig, P Shaw, MW Chang, K Guu… - arXiv preprint arXiv …, 2021 - arxiv.org
Sequence-to-sequence (seq2seq) models are prevalent in semantic parsing, but have been
found to struggle at out-of-distribution compositional generalization. While specialized …

Systematic generalization with edge transformers

L Bergen, T O'Donnell… - Advances in Neural …, 2021 - proceedings.neurips.cc
Recent research suggests that systematic generalization in natural language understanding
remains a challenge for state-of-the-art neural models such as Transformers and Graph …

Consistency regularization training for compositional generalization

Y Yin, J Zeng, Y Li, F Meng, J Zhou… - Proceedings of the 61st …, 2023 - aclanthology.org
Existing neural models have difficulty generalizing to unseen combinations of seen
components. To achieve compositional generalization, models are required to consistently …

Disentangled sequence to sequence learning for compositional generalization

H Zheng, M Lapata - arXiv preprint arXiv:2110.04655, 2021 - arxiv.org
There is mounting evidence that existing neural network models, in particular the very
popular sequence-to-sequence architecture, struggle to systematically generalize to unseen …

Finding needles in a haystack: Sampling structurally-diverse training sets from synthetic data for compositional generalization

I Oren, J Herzig, J Berant - arXiv preprint arXiv:2109.02575, 2021 - arxiv.org
Modern semantic parsers suffer from two principal limitations. First, training requires
expensive collection of utterance-program pairs. Second, semantic parsers fail to generalize …

Revisiting iterative back-translation from the perspective of compositional generalization

Y Guo, H Zhu, Z Lin, B Chen, JG Lou… - Proceedings of the AAAI …, 2021 - ojs.aaai.org
Human intelligence exhibits compositional generalization (i.e., the capacity to understand
and produce unseen combinations of seen components), but current neural seq2seq …