A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

Continual learning of natural language processing tasks: A survey

Z Ke, B Liu - arXiv preprint arXiv:2211.12701, 2022 - arxiv.org
Continual learning (CL) is a learning paradigm that emulates the human capability of
learning and accumulating knowledge continually without forgetting the previously learned …

Multi-concept customization of text-to-image diffusion

N Kumari, B Zhang, R Zhang… - Proceedings of the …, 2023 - openaccess.thecvf.com
While generative models produce high-quality images of concepts learned from a large-
scale database, a user often wishes to synthesize instantiations of their own concepts (for …

An image is worth one word: Personalizing text-to-image generation using textual inversion

R Gal, Y Alaluf, Y Atzmon, O Patashnik… - arXiv preprint arXiv …, 2022 - arxiv.org
Text-to-image models offer unprecedented freedom to guide creation through natural
language. Yet, it is unclear how such freedom can be exercised to generate images of …
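
The core idea named in this title can be sketched minimally as follows. This is a toy stand-in, not the paper's actual diffusion pipeline: the tiny linear `generator` and the random `target_images` are invented for illustration. Textual inversion freezes the generative model and optimizes a single new token embedding so that prompts containing the pseudo-token reconstruct the user's concept.

```python
# Hypothetical, minimal sketch of textual inversion: the generator stays
# frozen; only one new "word" embedding is learned from example images.
import torch
from torch import nn

embed_dim, img_dim = 32, 64
generator = nn.Linear(embed_dim, img_dim)  # stand-in for a frozen text-to-image model
for p in generator.parameters():
    p.requires_grad_(False)                # the model itself is never updated

concept_vec = torch.randn(embed_dim, requires_grad=True)  # the one learnable embedding
optimizer = torch.optim.Adam([concept_vec], lr=1e-2)

target_images = torch.randn(8, img_dim)    # a few example images of the concept
for step in range(200):
    optimizer.zero_grad()
    out = generator(concept_vec).expand_as(target_images)
    loss = nn.functional.mse_loss(out, target_images)  # reconstruction objective
    loss.backward()
    optimizer.step()                       # gradient flows only into concept_vec
```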

Interactive natural language processing

Z Wang, G Zhang, K Yang, N Shi, W Zhou… - arXiv preprint arXiv …, 2023 - arxiv.org
Interactive Natural Language Processing (iNLP) has emerged as a novel paradigm within
the field of NLP, aimed at addressing limitations in existing frameworks while aligning with …

Reliable, adaptable, and attributable language models with retrieval

A Asai, Z Zhong, D Chen, PW Koh… - arXiv preprint arXiv …, 2024 - arxiv.org
Parametric language models (LMs), which are trained on vast amounts of web data, exhibit
remarkable flexibility and capability. However, they still face practical challenges such as …

Controllable textual inversion for personalized text-to-image generation

J Yang, H Wang, Y Zhang, R Xiao, S Wu… - arXiv preprint arXiv …, 2023 - arxiv.org
The recent large-scale generative modeling has attained unprecedented performance
especially in producing high-fidelity images driven by text prompts. Textual inversion (TI) …

Mitigating catastrophic forgetting in large language models with self-synthesized rehearsal

J Huang, L Cui, A Wang, C Yang, X Liao… - arXiv preprint arXiv …, 2024 - arxiv.org
Large language models (LLMs) suffer from catastrophic forgetting during continual learning.
Conventional rehearsal-based methods rely on previous training data to retain the model's …
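
For contrast with the paper's self-synthesized approach, here is a minimal sketch of the conventional rehearsal baseline the snippet refers to. All names (`train_task`, `buffer`) are illustrative, not from the paper: a bounded buffer of stored real examples from earlier tasks is interleaved with each new-task batch.

```python
# Hypothetical sketch of conventional rehearsal-based continual learning:
# replay stored examples from earlier tasks alongside new-task batches.
import random
import torch
from torch import nn

def train_task(model, optimizer, new_batches, buffer, buffer_cap=1000):
    loss_fn = nn.CrossEntropyLoss()
    for x_new, y_new in new_batches:
        x, y = x_new, y_new
        if buffer:  # mix in one stored batch from earlier tasks
            x_old, y_old = random.choice(buffer)
            x = torch.cat([x_new, x_old])
            y = torch.cat([y_new, y_old])
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()
        if len(buffer) < buffer_cap:       # keep a bounded sample of real data
            buffer.append((x_new.detach(), y_new.detach()))
```

Per the title, the paper's method instead replaces this stored real data with examples the model generates itself, which is what "self-synthesized rehearsal" refers to.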

Continual learning for natural language generations with transformer calibration

P Yang, D Li, P Li - Proceedings of the 26th Conference on …, 2022 - aclanthology.org
Conventional natural language processing (NLP) generation models are trained offline with a
given dataset for a particular task, which is referred to as isolated learning. Research on …

Power Norm Based Lifelong Learning for Paraphrase Generations

D Li, P Yang, Y Zhang, P Li - Proceedings of the 46th International ACM …, 2023 - dl.acm.org
Lifelong seq2seq language generation models are trained with multiple domains in a
lifelong learning manner, with data from each domain being observed in an online fashion. It …
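
The training protocol this snippet describes can be sketched minimally as below, assuming a Hugging Face-style seq2seq model whose forward pass returns a `.loss`; the power-norm calibration itself is not reproduced here, and the names are illustrative.

```python
# Hypothetical sketch of the lifelong seq2seq setting: one model visits a
# stream of domains sequentially, seeing each domain's data once, online.
import torch

def lifelong_train(model, domain_streams, lr=1e-4):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for domain in domain_streams:          # domains arrive one after another
        for batch in domain:               # each batch is observed once (online)
            optimizer.zero_grad()
            loss = model(**batch).loss     # assumes a HF-style model returning .loss
            loss.backward()
            optimizer.step()
```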