Towards facilitating empathic conversations in online mental health support: A reinforcement learning approach

A Sharma, IW Lin, AS Miner, DC Atkins… - Proceedings of the Web …, 2021 - dl.acm.org
Online peer-to-peer support platforms enable conversations between millions of people who
seek and provide mental health support. If successful, web-based mental health …

Mix and match: Learning-free controllable text generation using energy language models

F Mireshghallah, K Goyal, T Berg-Kirkpatrick - arXiv preprint arXiv …, 2022 - arxiv.org
Recent work on controlled text generation has either required attribute-based fine-tuning of
the base language model (LM), or has restricted the parameterization of the attribute …

Extracting latent steering vectors from pretrained language models

N Subramani, N Suresh, ME Peters - arXiv preprint arXiv:2205.05124, 2022 - arxiv.org
Prior work on controllable text generation has focused on learning how to control language
models through trainable decoding, smart-prompt design, or fine-tuning based on a desired …

Towards practical plug-and-play diffusion models

H Go, Y Lee, JY Kim, S Lee, M Jeong… - Proceedings of the …, 2023 - openaccess.thecvf.com
Diffusion-based generative models have achieved remarkable success in image generation.
Their guidance formulation allows an external model to plug-and-play control the generation …

A distributional lens for multi-aspect controllable text generation

Y Gu, X Feng, S Ma, L Zhang, H Gong, B Qin - arXiv preprint arXiv …, 2022 - arxiv.org
Multi-aspect controllable text generation is a more challenging and practical task than single-
aspect control. Existing methods achieve complex multi-aspect control by fusing multiple …

Plug-and-blend: a framework for plug-and-play controllable story generation with sketches

Z Lin, MO Riedl - Proceedings of the AAAI Conference on Artificial …, 2021 - ojs.aaai.org
Large pre-trained neural language models (LM) have very powerful text generation
capabilities. However, in practice, they are hard to control for creative purposes. We …

Composable text controls in latent space with ODEs

G Liu, Z Feng, Y Gao, Z Yang, X Liang, J Bao… - arXiv preprint arXiv …, 2022 - arxiv.org
Real-world text applications often involve composing a wide range of text control operations,
such as editing the text with respect to an attribute, manipulating keywords and structure, and …

Sentence bottleneck autoencoders from transformer language models

I Montero, N Pappas, NA Smith - arXiv preprint arXiv:2109.00055, 2021 - arxiv.org
Representation learning for text via pretraining a language model on a large corpus has
become a standard starting point for building NLP systems. This approach stands in contrast …

Transductive learning for unsupervised text style transfer

F Xiao, L Pang, Y Lan, Y Wang, H Shen… - arXiv preprint arXiv …, 2021 - arxiv.org
Unsupervised style transfer models are mainly based on an inductive learning approach,
which represents the style as embeddings, decoder parameters, or discriminator parameters …

Feature-aware conditional GAN for category text generation

X Li, K Mao, F Lin, Z Feng - Neurocomputing, 2023 - Elsevier
Category text generation has received considerable attention since it is beneficial for various
natural language processing tasks. Recently, the generative adversarial network (GAN) has …