A comprehensive survey of continual learning: theory, method and application

L Wang, X Zhang, H Su, J Zhu - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
To cope with real-world dynamics, an intelligent system needs to incrementally acquire,
update, accumulate, and exploit knowledge throughout its lifetime. This ability, known as …

Structured pruning for deep convolutional neural networks: A survey

Y He, L Xiao - IEEE Transactions on Pattern Analysis and …, 2023 - ieeexplore.ieee.org
The remarkable performance of deep convolutional neural networks (CNNs) is generally
attributed to their deeper and wider architectures, which can come with significant …
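
As a rough illustration of the structured-pruning idea the survey covers, the sketch below removes whole convolutional filters ranked by L1 norm; the layer, keep ratio, and helper name are illustrative assumptions, not anything taken from the paper.

```python
# Hypothetical sketch: structured (filter-level) pruning by L1 norm.
# The layer choice and keep_ratio are illustrative assumptions.
import torch
import torch.nn as nn

def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Return a new Conv2d that keeps only the filters with the largest L1 norms."""
    with torch.no_grad():
        scores = conv.weight.abs().sum(dim=(1, 2, 3))          # one score per output filter
        n_keep = max(1, int(keep_ratio * conv.out_channels))
        keep = torch.topk(scores, n_keep).indices.sort().values
        pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                           stride=conv.stride, padding=conv.padding,
                           bias=conv.bias is not None)
        pruned.weight.copy_(conv.weight[keep])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep])
    return pruned

conv = nn.Conv2d(16, 32, 3, padding=1)
print(prune_conv_filters(conv, keep_ratio=0.25))  # Conv2d(16, 8, kernel_size=(3, 3), ...)
```

In a full network the next layer's input channels would also have to shrink to match, which is exactly the bookkeeping that structured-pruning methods automate.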

Adding conditional control to text-to-image diffusion models

L Zhang, A Rao, M Agrawala - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
We present ControlNet, a neural network architecture to add spatial conditioning controls to
large, pretrained text-to-image diffusion models. ControlNet locks the production-ready large …
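
The snippet below is a hedged sketch of the general pattern this abstract points to: a locked pretrained block paired with a trainable copy whose output re-enters through a zero-initialized layer, so training starts from the unchanged pretrained behavior. The block, sizes, and class name are assumptions for illustration, not the paper's actual architecture.

```python
# Hypothetical sketch of ControlNet-style conditioning (block, sizes, and names are assumptions).
import copy
import torch
import torch.nn as nn

class ControlledBlock(nn.Module):
    def __init__(self, pretrained_block: nn.Module, channels: int):
        super().__init__()
        self.trainable = copy.deepcopy(pretrained_block)     # trainable copy of the block
        self.frozen = pretrained_block                       # "locked" pretrained weights
        for p in self.frozen.parameters():
            p.requires_grad_(False)
        self.zero_conv = nn.Conv2d(channels, channels, 1)    # zero-initialized connection
        nn.init.zeros_(self.zero_conv.weight)
        nn.init.zeros_(self.zero_conv.bias)

    def forward(self, x: torch.Tensor, condition: torch.Tensor) -> torch.Tensor:
        # At initialization the zero conv contributes nothing, so the frozen
        # model's behavior is preserved; gradients still flow to the copy.
        return self.frozen(x) + self.zero_conv(self.trainable(x + condition))

block = nn.Conv2d(8, 8, 3, padding=1)
layer = ControlledBlock(block, channels=8)
out = layer(torch.randn(1, 8, 32, 32), torch.randn(1, 8, 32, 32))
print(out.shape)  # torch.Size([1, 8, 32, 32])
```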

DualPrompt: Complementary prompting for rehearsal-free continual learning

Z Wang, Z Zhang, S Ebrahimi, R Sun, H Zhang… - … on Computer Vision, 2022 - Springer
Continual learning aims to enable a single model to learn a sequence of tasks without
catastrophic forgetting. Top-performing methods usually require a rehearsal buffer to store …

Learn from others and be yourself in heterogeneous federated learning

W Huang, M Ye, B Du - … of the IEEE/CVF Conference on …, 2022 - openaccess.thecvf.com
Federated learning has emerged as an important distributed learning paradigm, which
normally involves collaborative updating with others and local updating on private data …
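
For orientation only, here is a minimal sketch of the plain federated-averaging loop that "collaborative updating with others and local updating on private data" describes; it is not this paper's heterogeneity-aware method, and the model, data, and hyperparameters are assumptions.

```python
# Hypothetical sketch of a basic federated-averaging loop (not the paper's method).
import copy
import torch
import torch.nn as nn

def local_update(model: nn.Module, data, steps: int = 1, lr: float = 0.01) -> dict:
    """One client's local training on its private data; returns its weights."""
    model = copy.deepcopy(model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        for x, y in data:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict()

def aggregate(global_model: nn.Module, client_states: list) -> None:
    """Server step: average the client weights into the global model."""
    avg = {k: torch.stack([s[k] for s in client_states]).mean(0)
           for k in client_states[0]}
    global_model.load_state_dict(avg)

global_model = nn.Linear(4, 1)
clients = [[(torch.randn(8, 4), torch.randn(8, 1))] for _ in range(3)]
for _ in range(2):                                    # two communication rounds
    states = [local_update(global_model, d) for d in clients]
    aggregate(global_model, states)
```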

Learning to prompt for continual learning

Z Wang, Z Zhang, CY Lee, H Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
The mainstream paradigm behind continual learning has been to adapt the model
parameters to non-stationary data distributions, where catastrophic forgetting is the central …
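
As a hedged sketch of the prompting alternative named in the title, the code below keeps a small pool of learnable prompts and selects a few by query-key similarity to prepend to a frozen backbone's input; the pool size, dimensions, and matching rule are illustrative assumptions rather than the paper's exact design.

```python
# Hypothetical sketch of a prompt pool with query-key matching
# (pool size, dimensions, and top-k are illustrative assumptions).
import torch
import torch.nn as nn

class PromptPool(nn.Module):
    def __init__(self, pool_size: int = 10, prompt_len: int = 5,
                 dim: int = 64, top_k: int = 3):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(pool_size, dim))
        self.prompts = nn.Parameter(torch.randn(pool_size, prompt_len, dim))
        self.top_k = top_k

    def forward(self, query: torch.Tensor) -> torch.Tensor:
        # query: [batch, dim], e.g. a frozen backbone's class-token feature.
        sim = torch.cosine_similarity(query.unsqueeze(1), self.keys.unsqueeze(0), dim=-1)
        idx = sim.topk(self.top_k, dim=-1).indices        # [batch, top_k]
        selected = self.prompts[idx]                      # [batch, top_k, prompt_len, dim]
        return selected.flatten(1, 2)                     # prompt tokens to prepend

pool = PromptPool()
tokens = pool(torch.randn(4, 64))
print(tokens.shape)  # torch.Size([4, 15, 64]) -- prepended to the frozen transformer's input
```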

Edge learning using a fully integrated neuro-inspired memristor chip

W Zhang, P Yao, B Gao, Q Liu, D Wu, Q Zhang, Y Li… - Science, 2023 - science.org
Learning is highly important for edge intelligence devices to adapt to different application
scenarios and owners. Current technologies for training neural networks require moving …

DER: Dynamically expandable representation for class incremental learning

S Yan, J Xie, X He - … of the IEEE/CVF conference on …, 2021 - openaccess.thecvf.com
We address the problem of class incremental learning, which is a core step towards
achieving adaptive vision intelligence. In particular, we consider the task setting of …
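
As a rough sketch of the "dynamically expandable representation" idea in the title: each new task gets a fresh feature extractor, earlier extractors are frozen, and their concatenated features feed a classifier that grows with the class count. The extractor architecture and dimensions here are assumptions for illustration, not the paper's design.

```python
# Hypothetical sketch of an expandable representation for class-incremental
# learning (extractor design and feature sizes are illustrative assumptions).
import torch
import torch.nn as nn

class ExpandableNet(nn.Module):
    def __init__(self, in_dim: int = 32, feat_dim: int = 16):
        super().__init__()
        self.in_dim, self.feat_dim = in_dim, feat_dim
        self.extractors = nn.ModuleList()
        self.classifier = None
        self.num_classes = 0

    def add_task(self, new_classes: int):
        # Freeze previously learned extractors, add a new one, rebuild the head.
        for old in self.extractors:
            for p in old.parameters():
                p.requires_grad_(False)
        self.extractors.append(nn.Sequential(nn.Linear(self.in_dim, self.feat_dim), nn.ReLU()))
        self.num_classes += new_classes
        self.classifier = nn.Linear(self.feat_dim * len(self.extractors), self.num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([f(x) for f in self.extractors], dim=-1)
        return self.classifier(feats)

net = ExpandableNet()
net.add_task(10)                       # task 1: 10 classes, first extractor
net.add_task(10)                       # task 2: 10 more classes, one more extractor
print(net(torch.randn(2, 32)).shape)   # torch.Size([2, 20])
```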

Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks

T Hoefler, D Alistarh, T Ben-Nun, N Dryden… - Journal of Machine …, 2021 - jmlr.org
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …

Towards open world object detection

KJ Joseph, S Khan, FS Khan… - Proceedings of the …, 2021 - openaccess.thecvf.com
Humans have a natural instinct to identify unknown object instances in their environments.
The intrinsic curiosity about these unknown instances aids in learning about them, when the …