A comprehensive survey of continual learning: theory, method and application
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …
Continual learning of natural language processing tasks: A survey
Continual learning (CL) is a learning paradigm that emulates the human capability of
learning and accumulating knowledge continually without forgetting the previously learned …
Multi-concept customization of text-to-image diffusion
While generative models produce high-quality images of concepts learned from a large-
scale database, a user often wishes to synthesize instantiations of their own concepts (for …
An image is worth one word: Personalizing text-to-image generation using textual inversion
Text-to-image models offer unprecedented freedom to guide creation through natural
language. Yet, it is unclear how such freedom can be exercised to generate images of …
Interactive natural language processing
Interactive Natural Language Processing (iNLP) has emerged as a novel paradigm within
the field of NLP, aimed at addressing limitations in existing frameworks while aligning with …
Reliable, adaptable, and attributable language models with retrieval
Parametric language models (LMs), which are trained on vast amounts of web data, exhibit
remarkable flexibility and capability. However, they still face practical challenges such as …
Controllable textual inversion for personalized text-to-image generation
The recent large-scale generative modeling has attained unprecedented performance
especially in producing high-fidelity images driven by text prompts. Text inversion (TI) …
Mitigating catastrophic forgetting in large language models with self-synthesized rehearsal
Large language models (LLMs) suffer from catastrophic forgetting during continual learning.
Conventional rehearsal-based methods rely on previous training data to retain the model's …
Continual learning for natural language generations with transformer calibration
Conventional natural language processing (NLP) generation models are trained offline with a
given dataset for a particular task, which is referred to as isolated learning. Research on …
Power Norm Based Lifelong Learning for Paraphrase Generations
Lifelong seq2seq language generation models are trained with multiple domains in a
lifelong learning manner, with data from each domain being observed in an online fashion. It …