A survey on non-autoregressive generation for neural machine translation and beyond

Y Xiao, L Wu, J Guo, J Li, M Zhang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Non-autoregressive (NAR) generation, which was first proposed in neural machine translation
(NMT) to speed up inference, has attracted much attention in both machine learning and …

Prompt-based editing for text style transfer

G Luo, YT Han, L Mou, M Firdaus - arXiv preprint arXiv:2301.11997, 2023 - arxiv.org
Prompting approaches have been recently explored in text style transfer, where a textual
prompt is used to query a pretrained language model to generate style-transferred texts …

SLOG: A structural generalization benchmark for semantic parsing

B Li, L Donatelli, A Koller, T Linzen, Y Yao… - arXiv preprint arXiv …, 2023 - arxiv.org
The goal of compositional generalization benchmarks is to evaluate how well models
generalize to new complex linguistic expressions. Existing benchmarks often focus on …

Referee: Reference-free sentence summarization with sharper controllability through symbolic knowledge distillation

M Sclar, P West, S Kumar, Y Tsvetkov… - arXiv preprint arXiv …, 2022 - arxiv.org
We present Referee, a novel framework for sentence summarization that can be trained
reference-free (i.e., requiring no gold summaries for supervision), while allowing direct control …

Screening through a broad pool: Towards better diversity for lexically constrained text generation

C Yuan, H Huang, Y Cao, Q Cao - Information Processing & Management, 2024 - Elsevier
Lexically constrained text generation (CTG) aims to generate text that contains given
constraint keywords. However, the text diversity of existing models is still unsatisfactory. In …

Unsupervised chunking with hierarchical RNN

Z Wu, AA Deshmukh, Y Wu, J Lin, L Mou - arXiv preprint arXiv:2309.04919, 2023 - arxiv.org
In Natural Language Processing (NLP), predicting linguistic structures, such as parsing and
chunking, has mostly relied on manual annotations of syntactic structures. This paper …

From Lengthy to Lucid: A Systematic Literature Review on NLP Techniques for Taming Long Sentences

T Passali, E Chatzikyriakidis, S Andreadis… - arXiv preprint arXiv …, 2023 - arxiv.org
Long sentences have long been a persistent issue in written communication, since they
make it challenging for readers to grasp the main points or follow the initial intention of …

Generating multiple-length summaries via reinforcement learning for unsupervised sentence summarization

D Hyun, X Wang, C Park, X Xie, H Yu - arXiv preprint arXiv:2212.10843, 2022 - arxiv.org
Sentence summarization shortens given texts while maintaining their core content.
Unsupervised approaches have been studied to summarize texts without human-written …

A character-level length-control algorithm for non-autoregressive sentence summarization

P Liu, X Zhang, L Mou - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Sentence summarization aims at compressing a long sentence into a short one that keeps
the main gist, and has extensive real-world applications such as headline generation. In …

RenewNAT: renewing potential translation for non-autoregressive transformer

P Guo, Y Xiao, J Li, M Zhang - Proceedings of the AAAI Conference on …, 2023 - ojs.aaai.org
Non-autoregressive neural machine translation (NAT) models are proposed to accelerate
the inference process while maintaining relatively high performance. However, existing NAT …