The neural process family: Survey, applications and perspectives

S Jha, D Gong, X Wang, RE Turner, L Yao - arXiv preprint arXiv …, 2022 - arxiv.org
The standard approaches to neural network implementation yield powerful function
approximation capabilities but are limited in their ability to learn meta representations and …

Beyond unimodal: Generalising neural processes for multimodal uncertainty estimation

MC Jung, H Zhao, J Dipnall… - Advances in Neural Information Processing Systems, 2024 - proceedings.neurips.cc
Uncertainty estimation is an important research area for making deep neural networks (DNNs)
more trustworthy. While extensive research on uncertainty estimation has been conducted …

CLAP4CLIP: Continual Learning with Probabilistic Finetuning for Vision-Language Models

S Jha, D Gong, L Yao - arXiv preprint arXiv:2403.19137, 2024 - arxiv.org
Continual learning (CL) aims to help deep neural networks learn new knowledge while
retaining what has been learned. Recently, pre-trained vision-language models such as …

Self-Expansion of Pre-trained Models with Mixture of Adapters for Continual Learning

H Wang, H Lu, L Yao, D Gong - arXiv preprint arXiv:2403.18886, 2024 - arxiv.org
Continual learning aims to learn from a stream of continuously arriving data with minimal
forgetting of previously learned knowledge. While previous works have explored the …

Adaptive Rank, Reduced Forgetting: Knowledge Retention in Continual Learning Vision-Language Models with Dynamic Rank-Selective LoRA

H Lu, C Zhao, J Xue, L Yao, K Moore… - arXiv preprint arXiv …, 2024 - arxiv.org
We investigate whether the pre-trained knowledge of vision-language models (VLMs), such
as CLIP, can be retained or even enhanced during continual learning (CL) while absorbing …