A survey on non-autoregressive generation for neural machine translation and beyond
Non-autoregressive (NAR) generation, which was first proposed in neural machine translation
(NMT) to speed up inference, has attracted much attention in both machine learning and …
Prompt-based editing for text style transfer
Prompting approaches have been recently explored in text style transfer, where a textual
prompt is used to query a pretrained language model to generate style-transferred texts …
SLOG: A structural generalization benchmark for semantic parsing
The goal of compositional generalization benchmarks is to evaluate how well models
generalize to new complex linguistic expressions. Existing benchmarks often focus on …
Referee: Reference-free sentence summarization with sharper controllability through symbolic knowledge distillation
We present Referee, a novel framework for sentence summarization that can be trained
reference-free (i.e., requiring no gold summaries for supervision), while allowing direct control …
Screening through a broad pool: Towards better diversity for lexically constrained text generation
C Yuan, H Huang, Y Cao, Q Cao - Information Processing & Management, 2024 - Elsevier
Lexically constrained text generation (CTG) aims to generate text that contains given
keywords. However, the text diversity of existing models is still unsatisfactory. In …
Unsupervised chunking with hierarchical RNN
In Natural Language Processing (NLP), predicting linguistic structures, such as parsing and
chunking, has mostly relied on manual annotations of syntactic structures. This paper …
From Lengthy to Lucid: A Systematic Literature Review on NLP Techniques for Taming Long Sentences
T Passali, E Chatzikyriakidis, S Andreadis… - arXiv preprint arXiv …, 2023 - arxiv.org
Long sentences have been a persistent issue in written communication, since
they make it challenging for readers to grasp the main points or follow the initial intention of …
Generating multiple-length summaries via reinforcement learning for unsupervised sentence summarization
Sentence summarization shortens given texts while maintaining core contents of the texts.
Unsupervised approaches have been studied to summarize texts without human-written …
A character-level length-control algorithm for non-autoregressive sentence summarization
Sentence summarization aims at compressing a long sentence into a short one that keeps
the main gist, and has extensive real-world applications such as headline generation. In …
RenewNAT: renewing potential translation for non-autoregressive transformer
P Guo, Y Xiao, J Li, M Zhang - Proceedings of the AAAI Conference on …, 2023 - ojs.aaai.org
Non-autoregressive neural machine translation (NAT) models are proposed to accelerate
the inference process while maintaining relatively high performance. However, existing NAT …