Improving compositional generalization with latent structure and data augmentation
Generic unstructured neural networks have been shown to struggle on out-of-distribution
compositional generalization. Compositional data augmentation via example recombination …
How Do In-Context Examples Affect Compositional Generalization?
Compositional generalization--understanding unseen combinations of seen primitives--is an
essential reasoning capability in human intelligence. The AI community mainly studies this …
Unlocking compositional generalization in pre-trained models using intermediate representations
Sequence-to-sequence (seq2seq) models are prevalent in semantic parsing, but have been
found to struggle at out-of-distribution compositional generalization. While specialized …
Sequence-to-sequence learning with latent neural grammars
Y Kim - Advances in Neural Information Processing …, 2021 - proceedings.neurips.cc
Sequence-to-sequence learning with neural networks has become the de facto standard for
sequence modeling. This approach typically models the local distribution over the next …
Generating Data for Symbolic Language with Large Language Models
While large language models (LLMs) bring not only performance but also complexity, recent
work has started to turn LLMs into data generators rather than task inferencers, where …
Unobserved local structures make compositional generalization hard
While recent work has convincingly shown that sequence-to-sequence models struggle to
generalize to new compositions (termed compositional generalization), little is known about …
Finding needles in a haystack: Sampling structurally-diverse training sets from synthetic data for compositional generalization
Modern semantic parsers suffer from two principal limitations. First, training requires
expensive collection of utterance-program pairs. Second, semantic parsers fail to generalize …
Inducing Transformer's Compositional Generalization Ability via Auxiliary Sequence Prediction Tasks
Systematic compositionality is an essential mechanism in human language, allowing the
recombination of known parts to create novel expressions. However, existing neural models …
Compositional generalization in multilingual semantic parsing over Wikidata
Semantic parsing (SP) allows humans to leverage vast knowledge resources through
natural interaction. However, parsers are mostly designed for and evaluated on English …
Learning to substitute spans towards improving compositional generalization
Despite the rising prevalence of neural sequence models, recent empirical evidence
suggests their deficiency in compositional generalization. One of the current de-facto …