Grammar Prompting for Domain-Specific Language Generation with Large Language Models
Large language models (LLMs) can learn to perform a wide range of natural language tasks
from just a handful of in-context examples. However, for generating strings from highly …
Tree-Averaging Algorithms for Ensemble-Based Unsupervised Discontinuous Constituency Parsing
We address unsupervised discontinuous constituency parsing, where we observe a high
variance in the performance of the only previous model in the literature. We propose to build …
Simple Hardware-Efficient PCFGs with Independent Left and Right Productions
Scaling dense PCFGs to thousands of nonterminals via a low-rank parameterization of the
rule probability tensor has been shown to be beneficial for unsupervised parsing. However …